Here is an article from the Winnipeg Free Press, published on June 17th, 2010.
On April 9th, 2002, audiences in Ottawa and Kanata, Ontario were connected with St. John’s, Nfld. (Atlantic time plus one half-hour) in a large-scale interactive performance event. Carleton University’s new media and sonic design students presented creative projects in digital media and performance, and collaborated with musicians at Holy Heart High School in St. John’s for what is believed to be the first musical work composed for a multimedia broadband event. Of all the events in the series, April 9th was by far the most elaborate in terms of programming and technology.
The event was co-produced with the National Research Council of Canada (NRC) and the Virtual Classroom of the Communications Research Centre (CRC). Dr. Martin Brooks of the NRC’s Institute of Information Technology led the technical team. The CRC “BADLAB” was set up as the master site, with Carleton and St. John’s connecting to CRC as interactive sites. The event’s backbone was CA*net3 (http://www.canet3.net). The BADLAB, St. John’s and Carleton U were all connected to CA*net3 via the CRC GigaPOP, Memorial University of Newfoundland GigaPOP, and ONet GigaPOP, respectively. Three high-end Pentium boxes running Linux utilized ISABEL, a conferencing application designed to create multimedia, multi-point distributed events. (See Figure 1)
Three workstations were set up at Carleton: two ran as separate interactive sites equipped with cameras to capture the audience and the performers, and the third acted as a flowserver to stream the multimedia. This flowserver in turn connected to the CRC flowserver to compensate for limited bandwidth to the Architecture building. CA*net3 was accessed via the university Ethernet, which provided a nominal 10 Mbps of transfer, although actual speeds were affected by general network activity on campus at the time of the event. The single workstation in St. John’s connected directly to the CRC flowserver.
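The relay topology described above — a master flowserver at CRC, a local flowserver at Carleton aggregating two campus sites, and St. John’s attached directly to the master — can be sketched as a small in-memory simulation. This is purely illustrative: the class names and the `broadcast` method are assumptions for the example, not part of ISABEL’s actual software.

```python
# Minimal in-memory sketch of the multipoint relay topology described
# above. All class names and method names are illustrative assumptions,
# not ISABEL's actual API.

class FlowServer:
    """Relays media frames between locally attached sites and an
    optional upstream flowserver (the master site)."""

    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream   # parent FlowServer, or None (master)
        self.sites = []            # locally attached site names
        self.children = []         # downstream FlowServers
        if upstream is not None:
            upstream.children.append(self)

    def attach(self, site):
        self.sites.append(site)

    def broadcast(self, frame, origin, skip=None):
        """Deliver a frame to every attached site except the origin,
        then forward it through the tree, avoiding the sender's branch."""
        delivered = [s for s in self.sites if s != origin]
        for child in self.children:
            if child is not skip:
                delivered += child.broadcast(frame, origin, skip=self)
        if self.upstream is not None and self.upstream is not skip:
            delivered += self.upstream.broadcast(frame, origin, skip=self)
        return delivered

# The April 9th topology: CRC "BADLAB" as master, with Carleton's local
# flowserver aggregating two campus sites over a constrained link.
crc = FlowServer("CRC BADLAB")
carleton = FlowServer("Carleton", upstream=crc)
crc.attach("St. John's")
carleton.attach("Carleton audience")
carleton.attach("Carleton performers")

# A frame originating in St. John's reaches both Carleton sites
# through the CRC master and the Carleton flowserver.
print(crc.broadcast("video-frame", origin="St. John's"))
```

The point of the intermediate Carleton flowserver in this arrangement is that only one stream crosses the constrained link to the Architecture building, with local fan-out handled on campus.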
A principal goal of this event was the merging of interactive cyberspace with a public performance forum. Large projections were utilized to create a sense of presence in the main audience venue at Carleton. Data projectors cast light on adjoining wall surfaces, offset by a 90-degree angle. (See Figures 2 and 3). ISABEL provides multiple programmable window sets within each projection. This, for example, made possible a kind of theatrical depiction of actors facing one another in a naturalistic conversational style. However, the scale of the projections produced a kind of cinematic amplification. Furthermore, roving cameras allowed me (as director and animator located across town at the CRC) to remotely “reach” into the locations and provoke participation from the audience. Indeed, the traditional distinction between audience and performers was deliberately blurred through a variety of engagement devices. The main venue on the Carleton campus was not an electronic presentation space. The closest Ethernet port was several meters outside of the site in a small meeting room. The 10 Mbps nominal connection speed at this port was seriously hampered by general network activity on the campus. This resulted in a recurring freeze effect to and from Carleton, although the CRC Kanata to St. John’s connection was unaffected. A fundamental challenge exists in locating a public performance facility that is directly wired to broadband. Such a facility would also require the professional-caliber presentation systems found in conventional theatre, stage or television production.
Two responses were particularly noteworthy. The usual shyness of being on-camera was very evident among the students, and the default posture was to attempt eye contact via the videoconference. Participants tended to address projections rather than cameras, drawn towards eye contact and facial images. They occasionally had to be directed away from the virtual facial image of a conversational partner and towards the camera lens. The makeshift venue also produced clumsiness with respect to more normal videoconference interaction. Separate monitor kiosks would allow an individual to engage in cyberspatial conversations on a more human scale, with split video feeds generating the large-scale audience depiction. Issues relating to theatrical lighting, audience illumination and large-scale projections also require some kind of solution. The challenge in using Linux-based computers was the non-commercial, not-thoroughly-tested nature of the operating system. Applications, drivers and the like were rife with incompatibilities, with devices such as the data projectors failing without warning.
I returned to the Banff Centre on November 27th, 2002 for the final event of phase one. As was the case at the start, this event connected Banff with a class in Ottawa for a studio tour with demonstrations. The CLE was again deployed, this time with extended audio, VGA and NTSC video inputs via a mixer. Cabling allowed for a walk-about from room to room, with a floor crew including a switcher/director, two camera operators, and a floor director. On this occasion, students in Ottawa were located off-campus in an auditorium of the National Research Council. While the NRC is a primary node of CA*net3, Banff’s connectivity was again influenced by local network activity. However, the point-to-point connection exhibited dramatically improved audio and video quality.
Students were again given a demo/tour of Luscar, plus the Rice Studio television and video production facility. Rice includes a fully-equipped 2500-square-foot studio space with cyclorama and computerized lighting board. The production complement includes a Panther dolly, portable crane, and an extensive lighting package with both Tungsten and HMI lamps. Visually, the project now began to approach broadcast quality.
A number of significant production issues emerged from this early event, all of which relate to the merging of public performance space with cyberspace. By introducing production techniques derived from television and cinema, as well as developing original solutions, phase two of this project will address the following objectives:
Lighting. A challenge exists when illuminating an audience/participant group for a broadband performance. Large projections, monitors, performers and a participatory on-camera audience all require specific solutions within the same physical space, and within cyberspace.
Audio. Increasing the capacity of audio transmission to a stereo (and possibly surround) format, to better suit musical events, and to create an ambient envelope that more readily expresses the feeling of a physical space that has been virtually “transposed” from one location to another.
Classroom/Performance Space Design. Human nature draws us to eye contact in any interactive relationship. It is therefore necessary to devise a solution that allows for large-scale presentations and human-scale interaction to coexist within the same space and within the same virtual event.
Audience/Participant Engagement. When does a group of active participants become an audience? The issue of encouraging and maintaining involvement is affected by the size of an audience. Through an evaluation process, this project will seek to determine when that threshold has been crossed, and will also experiment with various creative methods of “reaching through” a portal to engage participation at another location. Focus groups will vary in size and complexity in order to gather data across a continuum of engagement.
Remote Control Options. To what degree can a remote participant group/audience influence capture technology at another location? Allowing a degree of remote technological intervention not only permits refinements, but also further encourages participation.
Experiential cyberspace, and indeed all computer-generated realities, can be defined simply as a light source with associated audio elements. Comparisons with other light-source media are therefore unavoidable. Interactive teleconferencing requires sophisticated production values to hold its own against the formalism and refinements of conventional television, cinema, gaming, etc.
Finally, there is the issue of public deployment, and the eventual opportunity to test a streaming method to make such an event readily available to a larger audience on the Web. In this instance, new issues such as content and participation controls, bandwidth management, and so forth, enter the equation.
I served as a CODEC evaluator and team member on a remarkable project that connected five schools in a sparsely populated portion of northern Alberta.
The following is taken from RACOL: Rural Advanced Community of Learners
“One of the major challenges to rural communities in Alberta is to provide high quality education for their inhabitants. With the evolution of broadband networks, it is now possible to facilitate even more effective learning for distanced students.
The Rural Advanced Community of Learners Project (RACOL) is developing a model of teaching and learning that exploits the potential of broadband networks and advanced digital technologies. Rather than falling into either of the synchronous or asynchronous distance learning camps, RACOL exploits the best of each. Capabilities such as broadcast-quality digital video, streaming media, electronic whiteboards and educational objects will aid in the facilitation of effective learning and address the needs of students in rural and remote school districts.
The Fort Vermilion School Division (FVSD) is the focal point of the RACOL project. FVSD is located in the northwestern corner of Alberta, a very rural area. The most serious educational challenge for the Division is the delivery of a quality and equitable high school program. There are 6 small high schools in the Fort Vermilion jurisdiction, some as small as twelve students. The schools are so geographically separated that there is no opportunity to combine them into one or two larger facilities. For the past 6 years the jurisdiction has been using audio graphics to synchronously deliver 8 academic courses to all high schools. Although this technology has been fairly successful, teachers and students have indicated some dissatisfaction with this learning environment.
Students have said that they feel isolated and have indicated that they would like to see what their teachers and the other students look like. “Teachers feel disconnected from their students because they cannot see their faces and judge their reactions,” says Superintendent Ken Dropko. “Because the audio graphics only facilitates voice communication, teachers can’t gauge if students are lost or following along on a topic.” The Fort Vermilion teachers often find themselves falling into “presentation mode” because of the lack of feedback. Also, due to very limited bandwidth (soon to be fixed by the implementation of the Alberta SuperNet) there has been limited ability to develop digital presentations (e.g., PowerPoint) and to share digital resources with students.
How does RACOL address these concerns? Each high school is being equipped with a Virtual Presence Learning Environment (VPLE), each of which can originate and receive broadcast-quality video and audio. Students or teachers at each location see the teacher/presenter on one large monitor and the students on a second large monitor in “split screen” mode. Two smaller monitors also display these images at the back of each room. Each location also has a SMART Board™ 3000i electronic whiteboard, a visualizer and a CD-ROM/DVD/videotape player. Anything displayed at one location is automatically displayed at all. Each student has a question button and an “I’m lost” button. Each VPLE also contains 4 Polycom Via Video™ units that enable students at different locations to work together in small groups. Everything that happens synchronously is stored and made available to members of the class asynchronously via streaming video. A special application has been developed to allow a student to switch between the image of the instructor, the students or the electronic whiteboard while the sound continues, and to bookmark locations in the stream for later review. One of the major tasks, of course, is to work with the teachers to help them use this technology effectively.
Dr. Craig Montgomerie, RACOL project leader and a professor in instructional technology at the University of Alberta says, “Through this project, we want to provide the best possible learning experience for students. We are starting with students in northern Alberta and hope to eventually expand to students in remote schools across Canada and abroad. We expect this project will set a new standard for distance education.”
The two major partners in RACOL are the Fort Vermilion School Division No. 52 and the University of Alberta. Other partners include the University of Calgary, the Banff Centre, Sonic Design Interactive Inc., the Northern Alberta Institute of Technology and Netera Alliance. This project would not have been possible without the tremendous financial, technical and in-kind support from CANARIE Inc., Alberta Infrastructure, Alberta Learning, Alberta Innovation and Science, Smart Technologies and Apple Canada.”
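The stream-switching and bookmarking behaviour described in the RACOL excerpt — switching the video source while audio continues, and marking positions in the archived stream for later review — can be sketched roughly as follows. The class name, stream names and method names here are assumptions for the example; the actual RACOL application is not publicly documented.

```python
# Illustrative sketch of the RACOL playback behaviour described above:
# the viewer switches between video sources while audio continues, and
# can bookmark positions in the archived stream. Names are assumed.

class LecturePlayer:
    """Plays back an archived class session."""

    STREAMS = ("instructor", "students", "whiteboard")

    def __init__(self):
        self.active_stream = "instructor"  # default video source
        self.position = 0.0                # seconds into the archive
        self.bookmarks = []                # (seconds, label) pairs

    def switch(self, stream):
        """Change the video source; audio playback is unaffected."""
        if stream not in self.STREAMS:
            raise ValueError(f"unknown stream: {stream}")
        self.active_stream = stream

    def bookmark(self, label):
        """Mark the current position for later review."""
        self.bookmarks.append((self.position, label))

player = LecturePlayer()
player.position = 754.0            # ~12.5 minutes into the lesson
player.switch("whiteboard")        # follow the worked example
player.bookmark("worked example on the whiteboard")
print(player.active_stream, player.bookmarks)
```

The design choice worth noting is that audio is the continuity anchor: video sources are interchangeable views onto a single timeline, which is why bookmarks index the timeline rather than any particular stream.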
These videos present a series of virtual classroom examples set among locations across Canada. Classrooms and studios are connected over thousands of kilometres using a variety of networks, engagement methods, and studio technology. The intention is to provide an equal learning experience to students both in the room and at the remote location. It was also a goal to improve the AV quality of the experience by employing television studio technology and production techniques.
First example here:
The first event took place on October 25th, 2001 and connected the Luscar Digital Recording Studio of the Banff Centre for Continuing Education, Banff, Alberta (Mountain Time) with Ottawa (ET). The studio falls under the jurisdiction of the Creative Electronic Environment (CEE), a unit of the Banff Centre for the Arts that supports the utilization of electronic media by artists and faculty. As the first event in the series, my goals were simple: to experience a connection with a classroom from a remote location, and to develop a feel for the enhancements and inhibitors inherent to this particular method of communication.
The participant group was an undergraduate course in computer music at Carleton University, with approximately 45 students concentrating on a specialized diploma in sonic design and media art. The video conferencing classroom was furnished with desk mics, 6 large suspended monitors, front and rear cameras, and a control lectern. Clearly retrofitted from a conventional classroom, the space is an excellent example of how not to design a cyberspace portal. All capture and presentation technology was oriented from the ceiling, discouraging a natural feeling of penetration into, and from, a remote location.
Carleton’s facility utilized a V-Tel H.320-based video conferencing unit served by 6 ISDN lines leased from Bell Canada. The Banff system, known as the Client Learning Environment (CLE), is built on a VCON ViGO H.323-based CODEC. This portable unit was developed as part of the BELLE (Broadband Enabled Lifelong Learning Environment) partnership led by the Alberta-based Netera Alliance, with shared funding under the CANARIE Learning Program. BELLE’s objective was to develop a prototype educational object repository. The CLE was installed in Banff’s Luscar Studio for this event. The incompatibility with H.320 was resolved via the University of Ottawa’s Accord gateway, which served as the multipoint control unit.
The primary content set for this trial included standard audio and video, but also extended audio inputs and data. The class lesson included a discussion of the video conferencing setup itself, with numerous questions from students regarding its configuration. A Banff associate audio engineer provided a tour of the Luscar studio and fed the CLE a variety of musical projects and test materials. Luscar is built around a Euphonix CS-3000, a 56-channel digitally controlled analogue console with a 24-track digital recorder. A demonstration of leading-edge computer music software was provided by its developer. The lack of VGA inputs prevented the connection of a second computer to the CLE. A workaround was found by simply pointing a camera at a second monitor; an LCD monitor could be used to avoid the scanning artifacts inherent in such a setup.
The event was characterized by a perceivable 1-2 second delay that made conversation clumsy, although overall stability meant a natural continuity in the lesson plan. There were occasional freezes and pixelated video effects. The audio suffered when a signal was rich enough to saturate the bandwidth. A test from Luscar’s console sent a variety of instrumental (mono) mixes to the CLE. It was discovered that an overly demanding audio track would cause sound to deteriorate into an indiscernible garble. In addition, inexplicably, the “density” of the mix (number of tracks, signal characteristics) also provoked this breakdown. Microphones left open at both locations also led to intolerable audio.
Perhaps the most interesting aspect of the experience was the feel of the communication itself. The delay and lack of open microphones created the biggest feeling of detachment. I was unaware of any student reaction during the event, although it was later expressed to me via email (from the proctor) that the class was very engaged throughout. Except for the occasional conversation, which was shut down by neighbouring students, the class was attentive. I occasionally prompted a reaction to ensure that participation was holding.