International Research Seminars through Multimedia Conferencing: Experiences from the MICE project[*]

Ulf Bilting[a], Martina-Angela Sasse[b], Claus-Dieter Schulz[c] & Thierry Turletti[d]

1. Introduction

In 1992, the CEC agreed to finance a one-year piloting project called Multimedia Integrated Conferencing for European Researchers (MICE). The aim of the project was to enable interworking between European researchers, using a heterogeneous hardware platform, existing software tools as far as possible, and the emerging 2 Mbit/s European network infrastructure. The technology which the MICE partners adopted and developed allows multimedia conferencing - audio, video and shared workspace - between conference rooms and workstation-based facilities, hardware and software codecs, packet-switched networks and ISDN, using both uni- and multicast technology. A detailed rationale for adopting these technologies, and the technical effort required to integrate them, is given in [Kir]. The project was conducted in three overlapping phases: definition, trial and evaluation. During the definition phase, a multimedia conferencing reference architecture was defined, and the facilities required in conference rooms, conferencing workstations and the Conference Management and Multiplexing Centre (CMMC) [Han] at UCL were specified. During the trial phase, the partners worked to develop and improve the facilities provided in all three areas. In 1993, multiway interworking between the partners, and some sites in the US, was demonstrated in public events at the 4th Joint European Networking Conference (JENC4) in Trondheim, the 27th Internet Engineering Task Force (IETF) meeting in Amsterdam, and Interop '93 in Paris. During the evaluation phase, the cost and benefit of providing a regular multimedia conferencing service for research collaboration had to be assessed, and recommendations drawn up for the development of future systems and services. To this end, the MICE partners decided to run an International Research Seminar Series between some of the partners.
The aim was to gain hands-on experience as operators and users of a regular service, as a basis for our assessment and recommendations for the development of future technology and services.

Since the overall aim of the project was to further remote cooperation between researchers, a series of International Research Seminars seemed an appropriate application of the technology. Two of the MICE partners, University College London (UCL) and the Swedish Institute of Computer Science/Royal Institute of Technology (SICS/KTH) in Stockholm, organised a joint series of 11 weekly seminars on Multimedia, Communications and Networks, Distributed Systems and CSCW, starting in October 1993. The seminars were attended by computer science researchers and students in the conference rooms in London and Stockholm; MICE partners in Germany and France, as well as other researchers in Europe and the US, followed them from their conference rooms or workstations. The speakers were mostly invited from outside the MICE project and came from a variety of backgrounds. Seminar topics ranged from high-speed networking architecture and congestion control schemes to distance education and CSCW implementations.

2. Description of typical session

Since the seminars were multicast using the Mbone [Cas], on which bandwidth is limited, each seminar was announced on the mailing list.

The seminars were also announced using the Session Directory (sd) tool [Jac3]. Before each seminar, the speaker provided PostScript or ASCII files of his/her slides to the moderator at the transmitting site. The moderator placed these in the shared whiteboard tool wb [Jac2], through which they were displayed on the workstation screens of the remote audience.

Some remote attendees are allowed to send video at a very low rate to provide feedback to the speaker. Video parameters can be changed during the conference according to the feedback received through the control whiteboard (see section 3.3). After the talk, questions are taken from local attendees and remote participants.

2.1 Multimedia conferencing tools

All the multimedia conferencing software tools used to broadcast the MICE seminars are available in the public domain. The shared whiteboard tool, wb [Jac2] from Lawrence Berkeley Laboratory (LBL), is used as a shared drawing surface into which the operator can load all the slides needed for the talk. To broadcast audio, MICE currently uses the LBL Visual Audio Tool (vat) [Jac1]. To receive video, and to send it when no hardware codec is available, we use the INRIA Videoconferencing System (ivs) [Tur], which includes an H.261 [H261] software codec. A packetisation scheme for H.261 has been designed and specified in an Internet draft [Hui]; it defines how H.261 video streams can be carried over the Internet using the RTP protocol [Sch]. For floor control - more precisely, for video floor control - we use the speakers tool [Hed].
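To illustrate the packetisation idea, the sketch below builds an RTP packet carrying an H.261 fragment. It follows the field layout of the published versions of the RTP and H.261 packetisation drafts; the payload type value and all parameter names are illustrative, not necessarily those used by ivs at the time.

```python
import struct

RTP_VERSION = 2
PT_H261 = 31  # payload type later assigned to H.261 in the RTP A/V profile (assumed here)

def rtp_h261_packet(payload: bytes, seq: int, timestamp: int, ssrc: int,
                    marker: bool = False, sbit: int = 0, ebit: int = 0,
                    intra: bool = False, gobn: int = 0) -> bytes:
    """Prefix an H.261 fragment with a fixed RTP header and the 32-bit
    H.261 payload header (a sketch, not the exact 1993 draft encoding)."""
    # Fixed 12-byte RTP header: V=2, P=0, X=0, CC=0 | M, PT | seq | timestamp | SSRC
    first = RTP_VERSION << 6
    second = (int(marker) << 7) | PT_H261
    rtp = struct.pack("!BBHII", first, second, seq & 0xFFFF,
                      timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    # H.261 payload header: SBIT(3) EBIT(3) I(1) V(1) GOBN(4) MBAP(5) QUANT(5) HMVD(5) VMVD(5)
    h261 = ((sbit & 0x7) << 29) | ((ebit & 0x7) << 26) | (int(intra) << 25) \
         | (1 << 24) | ((gobn & 0xF) << 20)  # V=1: motion vectors may be present
    return rtp + struct.pack("!I", h261) + payload
```

The receiver uses the sequence number to detect loss and the timestamp to reconstruct frame timing, which is why video degrades gracefully (dropped frames) rather than stalling when packets are lost.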

2.2 Participating from conference rooms

At all the seminars there have been participants in remote conference rooms as well as individuals sitting at their desktop workstations. A conference room first of all needs good audio facilities: the optimal types and placement of microphones need the attention of an experienced expert, and the MICE partners have invested considerable effort in finding adequate audio configurations. To display video and whiteboard output to the audience, either a large monitor or a wall display is needed. The MICE partners have used different approaches in their conference rooms: back-projected screens with video projectors and light pens or digitiser boards, front-projected overhead projector LCD displays, and TV monitors for local and remote video. Unfortunately, there are today simply no inexpensive solutions for large, good-quality, high-resolution, high-contrast, mouse-enabled screen projection.

2.3 Participating from workstations

The software tools needed by a desktop participant are the same as those used in the conference room. The hardware needs to support audio, which most workstations already do, plus a microphone. Shared whiteboard and video decoding are provided by the wb and ivs tools, neither of which requires special hardware. A user who wants to transmit video also needs a video capture card (e.g. VideoPix from Sun) and a camera.

3. Experiences

The main purpose of the seminar series was to get hands-on experience in conducting networked seminars with the multimedia technology available to us. A summary of the eleven seminars clearly shows that seminars of this kind are a feasible way of gathering geographically distant people in a rewarding discussion. In our view, the current technical quality of the sound and video is just above what is acceptable; the next generation of hardware and network technology will make it fully useful even for the non-enthusiast. Below we report the experiences of the various types of participants in the seminars.

3.1 Speakers' view

In all our seminars, the speaker gave a 30-45 minute talk on his/her topic and was naturally the focus of attention for both local and remote audiences, as well as for the local technician in charge of the system. Many of the speakers were academics from the US visiting either UCL or SICS/KTH; multicasting gave their seminars an audience in many European countries.

It requires skill to give a lecture to a remote audience - especially if the audience gives little feedback. To restrict bandwidth, we kept the video rate from remote sites very low (typically 0.2 frames per second at low resolution), or even switched it off when network load caused large packet loss. Contact with an audience is very important unless the speaker is specially trained (e.g. for television). The speaker normally has the slides projected through the whiteboard program, shown either on a workstation screen in front of him/her or projected onto a large wall screen. Pointing and further drawing on the slides can be done with the mouse and keyboard of the workstation or, if available, with a light pen on the projected image.

3.2 Audiences' view

The seminar series has been run with various types of remote audiences at the different sites, ranging from single persons at their desktop workstations to a traditional seminar audience of some ten persons. Using the distributed whiteboard to display the outline of the talk is a great help, especially if audio quality drops; otherwise the audience quickly loses interest.

3.2.1 Local conference room audiences' view

The greatest procedural difference from a normal seminar concerns when and how to interrupt the speaker with questions. If the transmitted video image does not include the local audience, it may be hard for the remote audience to follow what is happening. In our experience, the best solution is to show locally both some of the remote audience and the image being transmitted; this increases the feeling of presence and encourages "normal" seminar behaviour. Lighting requirements also conflict: the whiteboard, used to show slides and for the speaker to draw on, is usually a projected screen image, which with current technology is never very bright. Careful attention and testing are required to get the lighting the video cameras need to produce a good image of the speaker - who is often close to the projected whiteboard - without losing contrast in the projected image.

3.2.2 Remote conference room audiences' view

The main challenge is to turn remote audiences into participants rather than passive recipients of a seminar. Receiving good video and audio is essential: participants will endure blurred and slow video images (in the worst case they ignore them), but bad audio quality drives audiences away. During some seminars we decided to drop video to free bandwidth for audio; together with the whiteboard, the seminar was still informative and judged a success.

We show the video image transmitted from the audience site to increase the feeling of presence. We have mostly run the seminars without interruptions for questions. This is probably due to audiences' normal shyness, here increased by the remoteness; our tools have also not so far been very good at supporting interruptions, and the inevitable audio delay makes them a bit awkward. Instead, we organised a question session after the talk. With the number of listeners we have had (up to 60), normal social contention - whoever talks first gets the floor - has proved to work.

3.2.3 Remote desktop participants' view

One of the greatest advantages of multicast is its scalability: an added listener consumes very little extra network capacity. This makes listening to a seminar on your desktop system a normal thing to do. Since most workstations today are equipped with audio input, it is easy for a desktop participant to ask questions. Desktop users have also been able to decode the video transmission with the ivs software decoder without any additional hardware. Setting up reception of a seminar advertised via sd is very easy.
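The scalability comes from the fact that a listener joins a multicast group rather than opening a connection to the sender: the sender transmits one stream regardless of audience size. A minimal sketch of what a receiving tool does, using standard IP multicast sockets (the group address and port are hypothetical; real values came from the sd announcement):

```python
import socket
import struct

def join_mbone_session(group: str, port: int) -> socket.socket:
    """Join an IP multicast group and return a socket ready to receive
    the session's packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # listen on the session's port
    # IP_ADD_MEMBERSHIP tells the kernel (and, via IGMP, the routers)
    # to deliver traffic for this group; each extra listener adds no
    # load at the sender.
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Example (hypothetical session parameters):
# sock = join_mbone_session("224.2.0.1", 4567)
# data, sender = sock.recvfrom(2048)   # blocks until a session packet arrives
```

Leaving the session is equally cheap: dropping the membership (or closing the socket) prunes delivery back at the nearest router.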

3.3 Local operator's view

Running a successful seminar currently requires a lot of preparation and testing. For presenting slides during a seminar we use the shared whiteboard rather than sending video of projected slides; this both reduces the load on the network and provides more readable output. All slides have to be loaded into the shared whiteboard tool before the seminar, so that all participants have them available when the seminar starts. They were also stored on a server. We ran a second instance of wb to exchange information between the technicians in charge at each site; this keeps the main wb free for slides. The best solution is to have a second workstation, or at least an extra screen, to keep this control information out of sight of the audience. Since we currently have no tool to monitor video/audio quality at the remote sites, the control wb is used to provide feedback on quality, which lets us adapt to changing network conditions. If the remote sites have trouble receiving the audio, they report it to the sending site. The technician first reduces the data rate of the video; if the audio does not improve, we usually stop sending video altogether; if problems persist, we try switching audio coding schemes (PCM, IDVI, GSM).
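The fallback procedure above is essentially a fixed ladder of remedies, applied in order until reported audio loss subsides. The sketch below captures that logic; the loss threshold and rate values are illustrative, not the figures MICE used.

```python
# Operator's fallback ladder: throttle video, then stop it, then switch
# to a cheaper audio codec. Thresholds and rates are assumptions.
AUDIO_CODECS = ["PCM", "IDVI", "GSM"]  # roughly decreasing bandwidth

def next_action(audio_loss: float, video_kbps: int, codec_index: int):
    """Given the audio packet-loss fraction reported over the control wb,
    return (action, new video rate, new codec index)."""
    if audio_loss < 0.05:                      # quality acceptable: no change
        return ("ok", video_kbps, codec_index)
    if video_kbps > 16:                        # 1. reduce the video data rate
        return ("reduce_video", video_kbps // 2, codec_index)
    if video_kbps > 0:                         # 2. stop sending video entirely
        return ("stop_video", 0, codec_index)
    if codec_index < len(AUDIO_CODECS) - 1:    # 3. try a cheaper audio coding
        return ("switch_codec", 0, codec_index + 1)
    return ("give_up", 0, codec_index)         # nothing left to shed
```

For example, starting at 128 kbit/s video with PCM audio, sustained loss would drive the session through 64 and 32 kbit/s video, then video off, then IDVI and finally GSM audio.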

To summarise, a technician is needed at the sending site. One technician can support transmission of reasonable quality - a professional media production would require more staff. Broadcast quality is not a standard we can aspire to if we want to retain the wide access and low cost of these new media.

4. Conclusions

The series showed that regular distributed multimedia seminars are currently possible, using available hardware and software tools, and - to a certain extent - the current network infrastructure. Most of the tools and hardware we have used are prototypes or of the first generation, and, whilst their usefulness is beyond any doubt, there is considerable scope for development.

The participants have been either passive or active, gathered in a remote seminar room or sitting alone at their desktop workstations. Interaction has caused very few problems in our seminars. A multimedia seminar or similar event needs to be coordinated, both to perform a chair's function and for technical supervision - the latter in particular with today's technology. The greatest single problem encountered has been congestion of the Internet, causing unacceptable audio quality due to packet loss; packet loss quickly makes audio unbearable. Video transmission is also affected by packet loss: compression techniques that yield good compression also degrade reception quality more severely when network data is lost. In a seminar situation, however, low video quality is easier to tolerate than bad audio quality.

To stage a successful seminar, the speaker, moderator and technical support people at all sites need to be well prepared: testing of equipment and settings is essential. Speakers need to prepare visual material in advance, and need some practice to drive the audio and shared workspace tools.

5. Future work

The MICE project will continue in 1994, as will the Seminars. Current research topics include:

Workstation Components - codecs, protocols, audio compression, network adaptation

Workstation Platforms and Conference Rooms - flexible configurations

Conference Control and Management - protocols and tools development

Support and Shrinkwrapping - to get experience from a larger user body

Multimedia Servers - storage, retrieval, indexing, synchronisation for audio/video and other data

Security - access control, encryption

Traffic measurement, Analysis and Congestion Control

Applications - trials in external research and administrative environments

High Speed Networks - use of evolving infrastructures and recent developments

6. References

[Cas] S. Casner, "Frequently Asked Questions (FAQ) on the Multicast Backbone (MBONE)", available by anonymous ftp as mbone/faq.txt, May 6th 1993.

[Han] M. J. Handley, P. T. Kirstein & M. A. Sasse, "Multimedia Integrated Conferencing for European Researchers (MICE): piloting activities and the Conference Management and Multiplexing Centre", Computer Networks and ISDN Systems, 26, 275-290, 1993.

[Hed] A. Hedstrom, "SPEAKERS" manual pages, Swedish Institute of Computer Science (SICS), November 9th.

[Hui] C. Huitema, T. Turletti, "Packetization of H.261 video streams", INTERNET-DRAFT, December 5, 1993.

[H261] "Video codec for audiovisual services at p x 64 kbit/s", CCITT Recommendation H.261, 1990.

[Jac1] V. Jacobson, "VAT" manual pages, Lawrence Berkeley Laboratory (LBL), February 17th 1993.

[Jac2] V. Jacobson, "WB" README file, Lawrence Berkeley Laboratory (LBL), August 12th 1993.

[Jac3] V. Jacobson, "SD" README file, Lawrence Berkeley Laboratory (LBL), March 30th 1993.

[Kir] P. T. Kirstein, M. J. Handley, M. A. Sasse, "Piloting of Multimedia Integrated Communications for European Researchers (MICE)", Proc. INET '93.

[Mar] H. Martinsen, "MICE Evaluation Report", Deliverable, ESPRIT Project MICE, 1993.

[Sch] H. Schulzrinne, S. Casner, "RTP: A Transport Protocol for Real-Time Applications", INTERNET-DRAFT, October 20, 1993.

[Tur] T. Turletti, "H.261 Software Codec for Videoconferencing Over the Internet", Research report No 1834, INRIA, January 1993.

[*] This is a heavily condensed version of the original paper which is available from the authors.

[a] <>, Department of Teleinformatics, Royal Institute of Technology (KTH), Stockholm, Sweden

[b] <>, Department of Computer Science, University College London (UCL), UK

[c] <>, Rechenzentrum der Universität Stuttgart (RUS), Germany

[d] <>, INRIA Sophia Antipolis, France