Archive for June, 2010

Winnipeg Free Press Article

Saturday, June 19th, 2010

Here is an article from the Winnipeg Free Press, published on June 17th, 2010.

A cubicle as big as the world – Web conferencing quickly catching on (except in Winnipeg)


Cottage Life Magazine Article

Saturday, June 19th, 2010

Here’s a bit of a mention in the Summer issue of Cottage Life magazine.

How to work from the cottage—successfully


Addressing Multiple Classrooms (part 3)

Friday, June 18th, 2010

On April 9th, 2002, audiences in Ottawa and Kanata, Ontario, were connected with St. John’s, Nfld. (Atlantic time plus one half-hour) in a large-scale interactive performance event. Carleton University’s new media and sonic design students presented creative projects in digital media and performance, and collaborated with musicians at Holy Heart High School in St. John’s for what is believed to be the first musical work composed for a multimedia broadband event. Of all the events in the series, April 9th was by far the most elaborate in terms of programming and technology.

The event was co-produced with the National Research Council of Canada (NRC) and the Virtual Classroom of the Communications Research Centre (CRC). Dr. Martin Brooks of the NRC’s Institute of Information Technology led the technical team. The CRC “BADLAB” was set up as the master site, with Carleton and St. John’s connecting to CRC as interactive sites. The event’s backbone was CA*net3: the BADLAB, St. John’s, and Carleton were connected to CA*net3 via the CRC GigaPOP, the Memorial University of Newfoundland GigaPOP, and the ONet GigaPOP, respectively. Three high-end Pentium boxes running Linux ran ISABEL, a conferencing application designed to create multimedia, multi-point distributed events. (See Figure 1.)

Three workstations were set up at Carleton: two ran as separate interactive sites, equipped with cameras to capture the audience and the performers, while the third acted as a flowserver to stream the multimedia. This flowserver in turn connected to the CRC flowserver to compensate for limited bandwidth to the Architecture building. CA*net3 was accessed via the university Ethernet, which nominally provided 10 Mbps of transfer, although actual speeds were affected by general network activity on the campus at the time of the event. The single workstation in St. John’s connected directly to the CRC flowserver.
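The cascaded flowserver arrangement described above can be sketched as a simple relay model. To be clear, this is an illustrative sketch only: the class and site names below are invented, and ISABEL’s real flowservers relay live multimedia streams rather than Python objects. The sketch models only which sites a frame reaches as it fans out through the cascade.

```python
# Illustrative model of a cascaded flowserver topology (hypothetical names).
# Each Flowserver node relays a frame to every reachable site except the
# one the frame originated from.

class Flowserver:
    def __init__(self, name):
        self.name = name
        self.peers = []  # directly connected sites or flowservers

    def connect(self, peer):
        # Links are bidirectional.
        self.peers.append(peer)
        peer.peers.append(self)

    def forward(self, frame, origin, seen=None):
        """Relay a frame through the cascade; return the sites it reaches."""
        if seen is None:
            seen = set()
        seen.add(self.name)
        delivered = []
        for peer in self.peers:
            if peer.name in seen or peer.name == origin:
                continue
            delivered.append(peer.name)
            delivered.extend(peer.forward(frame, origin, seen))
        return delivered

# Topology as described: CRC hosts the master flowserver; the Carleton
# flowserver aggregates the two campus sites; St. John's connects directly.
crc = Flowserver("CRC")
carleton_fs = Flowserver("Carleton-flowserver")
site_a = Flowserver("Carleton-audience")
site_b = Flowserver("Carleton-performers")
stjohns = Flowserver("StJohns")

carleton_fs.connect(site_a)
carleton_fs.connect(site_b)
crc.connect(carleton_fs)
crc.connect(stjohns)

# A frame from the Carleton audience site reaches every other node.
print(site_a.forward("frame-1", origin="Carleton-audience"))
```

The point of the cascade is visible in the model: the campus sites talk only to their local flowserver, so a single upstream link to CRC carries their traffic onward, which is how the limited bandwidth to the Architecture building was worked around.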

A principal goal of this event was the merging of interactive cyberspace with a public performance forum. Large projections were used to create a sense of presence in the main audience venue at Carleton. Data projectors cast light onto adjoining wall surfaces, offset at a 90-degree angle. (See Figures 2 and 3.) ISABEL provides multiple programmable window sets within each projection. This made possible, for example, a kind of theatrical depiction of actors facing one another in a naturalistic conversational style. However, the scale of the projections produced a kind of cinematic amplification. Furthermore, roving cameras allowed me (as director and animator located across town at the CRC) to remotely “reach” into the locations and provoke participation from the audience. Indeed, the traditional distinction between audience and performers was deliberately blurred through a variety of engagement devices.

The main venue on the Carleton campus was not an electronic presentation space. The closest Ethernet port was several meters outside of the site, in a small meeting room. The 10 Mbps nominal connection speed at this port was seriously hampered by general network activity on the campus, resulting in a recurring freeze effect to and from Carleton, although the CRC Kanata to St. John’s connection was unaffected. A fundamental challenge exists in locating a public performance facility that is directly wired to broadband. Such a facility would also require the professional-caliber presentation systems found in conventional theatre, stage, or television production.

Two responses were particularly noteworthy. The usual shyness of being on-camera was very evident among the students, and the default posture was to attempt eye contact via the videoconference. Participants tended to address the projections rather than the cameras, gravitating towards eye contact and facial images, and occasionally had to be directed away from the virtual facial image of a conversational partner and towards the camera lens. The makeshift venue also produced clumsiness with respect to more normal videoconference interaction. Separate monitor kiosks would allow an individual to engage in cyberspatial conversations on a more human scale, with split video feeds generating the large-scale audience depiction. Issues relating to theatrical lighting, audience illumination, and large-scale projections also require some kind of solution. The challenge in using Linux-based computers was the non-commercial, not-thoroughly-tested nature of the operating system. Applications, drivers, etc. were rife with incompatibilities, with devices such as the data projectors failing without warning.


Addressing Multiple Classrooms (part 2)

Wednesday, June 16th, 2010

I returned to the Banff Centre on November 27th, 2002, for the final event of phase one. As was the case at the start, this event connected Banff with a class in Ottawa for a studio tour with demonstrations. The CLE was again deployed, this time with extended audio, VGA, and NTSC video inputs via a mixer. Cabling allowed for a walk-about from room to room, with a floor crew including a switcher/director, two camera operators, and a floor director. On this occasion, students in Ottawa were located off-campus in an auditorium of the National Research Council. While the NRC is a primary node of CA*net3, Banff’s connectivity was again influenced by local network activity. However, the point-to-point connection exhibited dramatically improved audio and video quality.

Students were again given a demo/tour of Luscar, plus the Rice Studio television and video production facility. Rice includes a fully-equipped 2500-square-foot studio space with cyclorama and computerized lighting board. The production complement includes a Panther dolly, portable crane, and an extensive lighting package with both Tungsten and HMI lamps. Visually, the project now began to approach broadcast quality.

A number of significant production issues emerged from this early event, all of which relate to the merging of public performance space with cyberspace. By introducing production techniques derived from television and cinema, as well as developing original solutions, phase two of this project will address the following objectives:

  • Lighting. A challenge exists when illuminating an audience/participant group for a broadband performance. Large projections, monitors, performers, and a participatory on-camera audience all require specific solutions that can be reconciled within the same physical space and in cyberspace.

  • Audio. Increasing the capacity of audio transmission to a stereo (and possibly surround) format, to better suit musical events, and to create an ambient envelope that more readily expresses the feeling of a physical space that has been virtually “transposed” from one location to another.

  • Classroom/Performance Space Design. Human nature draws us to eye contact in any interactive relationship. It is therefore necessary to devise a solution that allows large-scale presentations and human-scale interaction to coexist within the same space and within the same virtual event.

  • Audience/Participant Engagement. When does a group of active participants become an audience? The issue of encouraging and maintaining involvement is affected by the size of an audience. Through an evaluation process, this project will seek to determine when that threshold has been crossed, and will also experiment with various creative methods of “reaching through” a portal to engage participation at another location. Focus groups will vary in size and complexity in order to gather data across a continuum of engagement.

  • Remote Control Options. To what degree can a remote participant group/audience influence capture technology at another location? Allowing a degree of remote technological intervention not only permits refinements, but also further encourages participation.


Experiential cyberspace, and indeed all computer-generated realities, can be defined simply as a light source with associated audio elements. Comparisons with other light-source media are therefore unavoidable. Interactive teleconferencing requires sophisticated production values to hold its own against the formalism and refinements of conventional television, cinema, gaming, etc.
Finally, there is the issue of public deployment, and the eventual opportunity to test a streaming method to make such an event readily available to a larger audience on the Web. In this instance, new issues such as content and participation controls, bandwidth management, and so forth, enter the equation.


Connecting Rural Schools

Wednesday, June 2nd, 2010

I served as a CODEC evaluator and team member on an incredible project that connected five schools in a sparsely populated portion of northern Alberta.

The following is taken from RACOL: Rural Advanced Community of Learners

“One of the major challenges to rural communities in Alberta is to provide high-quality education for their inhabitants. With the evolution of broadband networks, it is now possible to facilitate even more effective learning for distant students.

The Rural Advanced Community of Learners Project (RACOL) is developing a model of teaching and learning that exploits the potential of broadband networks and advanced digital technologies. Rather than falling into either of the synchronous or asynchronous distance learning camps, RACOL exploits the best of each. Capabilities such as broadcast-quality digital video, streaming media, electronic whiteboards and educational objects will aid in the facilitation of effective learning and address the needs of students in rural and remote school districts.

The Fort Vermilion School Division (FVSD) is the focal point of the RACOL project. FVSD is located in the northwestern corner of Alberta, a very rural area. The most serious educational challenge for the Division is the delivery of a quality and equitable high school program. There are 6 small high schools in the Fort Vermilion jurisdiction, some as small as twelve students. The schools are so geographically separated that there is no opportunity to combine them into one or two larger facilities. For the past 6 years the jurisdiction has been using audio graphics to synchronously deliver 8 academic courses to all high schools. Although this technology has been fairly successful, teachers and students have indicated some dissatisfaction with this learning environment.

Students have said that they feel isolated and have indicated that they would like to see what their teachers and the other students look like. “Teachers feel disconnected from their students because they cannot see their faces and judge their reactions,” says Superintendent Ken Dropko. “Because the audio graphics only facilitates voice communication, teachers can’t gauge if students are lost or following along on a topic.” The Fort Vermilion teachers often find themselves falling into “presentation mode” because of the lack of feedback. Also, due to very limited bandwidth (soon to be fixed by the implementation of the Alberta SuperNet) there has been limited ability to develop digital presentations (e.g., PowerPoint) and to share digital resources with students.

How does RACOL address these concerns? Each high school is being equipped with a Virtual Presence Learning Environment (VPLE), each of which can originate and receive broadcast-quality video and audio. Students or teachers at each location see the teacher/presenter on one large monitor and the students on a second large monitor in “split screen” mode. Two smaller monitors also display these images at the back of each room. Each location also has a SMART Board™ 3000i electronic whiteboard, a visualizer and a CD-ROM/DVD/videotape player. Anything displayed at one location is automatically displayed at all. Each student has a question button and an “I’m lost” button. Each VPLE also contains 4 Polycom Via Video™ units that enable students at different locations to work together in small groups. Everything that happens synchronously is stored and made available to members of the class asynchronously via streaming video. A special application has been developed to allow a student to switch between the image of the instructor, the students or the electronic whiteboard while the sound continues, and to bookmark locations in the stream for later review. One of the major tasks, of course, is to work with the teachers to help them use this technology effectively.
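The stream-switching and bookmarking behaviour described for the RACOL playback application can be modelled roughly as follows. This is a hypothetical sketch of the described control logic, not the actual RACOL software; all class, method, and view names here are invented, and the real system operates on streaming video rather than in-memory state.

```python
# Hypothetical sketch of the described playback controls: switching the
# displayed image while audio continues, and bookmarking stream positions
# for later review.

class LecturePlayer:
    VIEWS = ("instructor", "students", "whiteboard")  # assumed view names

    def __init__(self):
        self.view = "instructor"   # audio runs continuously regardless of view
        self.position = 0.0        # playback position, in seconds
        self.bookmarks = []

    def switch_view(self, view):
        """Change the displayed image; playback position is untouched,
        modelling audio that continues uninterrupted."""
        if view not in self.VIEWS:
            raise ValueError(f"unknown view: {view}")
        self.view = view

    def advance(self, seconds):
        self.position += seconds

    def bookmark(self, label):
        """Mark the current position and view for later review."""
        self.bookmarks.append((label, self.position, self.view))

player = LecturePlayer()
player.advance(90)
player.switch_view("whiteboard")
player.bookmark("quadratic example")
print(player.bookmarks)  # [('quadratic example', 90.0, 'whiteboard')]
```

Recording the active view alongside the timestamp is one plausible way a bookmark could bring a student back not just to a moment in the lecture but to the image (instructor, classmates, or whiteboard) they were watching at the time.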

Dr. Craig Montgomerie, RACOL project leader and a professor in instructional technology at the University of Alberta says, “Through this project, we want to provide the best possible learning experience for students. We are starting with students in northern Alberta and hope to eventually expand to students in remote schools across Canada and abroad. We expect this project will set a new standard for distance education.”

The two major partners in RACOL are the Fort Vermilion School Division No. 52 and the University of Alberta. Other partners include the University of Calgary, the Banff Centre, Sonic Design Interactive Inc., the Northern Alberta Institute of Technology and Netera Alliance. This project would not have been possible without the tremendous financial, technical and in-kind support from CANARIE Inc., Alberta Infrastructure, Alberta Learning, Alberta Innovation and Science, Smart Technologies and Apple Canada.”
