
CONTENTS


1 Introduction

2 What is Multimedia?
2.1 Definitions
2.2 The analogue legacy networks
2.3 Digital systems
2.4 Compression standards
2.5 CD-ROM formats
2.6 Scanners
2.7 Video compression
2.8 Real-time or off-line

3 Pedagogy and technology
4 Networks
5 Future Work
Glossary
Appendices
Bibliography



Multimedia in the Teaching Space

2 WHAT IS MULTIMEDIA?

2.1 DEFINITIONS

The title of this document contains two terms which need to be defined. The term `multimedia' means different things to different people, and the following section defines it as it is used in this document.

The term `teaching space' is largely self-explanatory but is not in common use, so a definition is also given.

2.1.1. Multimedia

There is common confusion over the different meanings of the term "media", which can refer to the communications industry (the press, newspapers and television), or to the plurality of media by which information is conveyed to and between humans. In the latter case information is conveyed by text, sound, vision, touch and smell. The computer has made it possible to handle several of these media simultaneously, in particular those which can be digitised easily.

Documents can be scanned or typed into electronic formats which can be manipulated in sophisticated ways, altering the layout, font styles and sizes etc. with ease. These documents can include pictorial and graphic information. Pictures and graphics can be digitised and handled either as bitmaps or as vector drawings, and the JPEG standard exists for compressing bitmap images. Animations can also be produced to support text information.
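
The distinction between bitmaps and vector drawings can be sketched in a few lines of code. This is an illustrative toy, not any real graphics format: a bitmap stores every pixel, so its size grows with resolution, while a vector drawing stores only the drawing command and its endpoints, so it scales without loss.

```python
# Illustrative sketch: the same diagonal line stored two ways.
# Neither representation corresponds to a real file format.

def line_as_bitmap(size):
    """Return a size x size grid with a diagonal line of set pixels."""
    return [[1 if x == y else 0 for x in range(size)] for y in range(size)]

def line_as_vector():
    """The same line as one drawing command with two endpoints."""
    return ("line", (0, 0), (1.0, 1.0))   # coordinates in abstract units

bitmap = line_as_bitmap(8)
pixels_stored = sum(sum(row) for row in bitmap)  # grows with resolution
vector = line_as_vector()                        # constant size at any scale
print(pixels_stored)  # 8 set pixels in an 8x8 bitmap
```

Doubling the bitmap's resolution quadruples its storage, whereas the vector form is unchanged; this is why line drawings are usually kept as vectors and photographs as (compressed) bitmaps.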

Visual information is very flexible, and if the time factor is taken into account moving images can be handled as well as still pictures. Much recent research has been carried out into the handling of moving images, and when this information is synchronised with digitised audio, digital video becomes available. This technology is covered by a number of MPEG standards, by hybrids such as Motion-JPEG, and by the H.26x standards used in video-conferencing.

Multimedia has become a generic term for "multimedia computing" or "interactive multimedia". The computer and its software are used to control and navigate through the communications media, not only one at a time but several simultaneously, which simulates the real world and presents unique and innovative opportunities to engage the human senses. Computer systems are most developed in using vision and hearing to interface between the digital and analogue worlds: still and moving images, text and graphics use the visual senses, while audio uses hearing. In this paper "multimedia" is defined as visual, audio and textual information which can be presented separately or simultaneously to convey information interactively to users. Multimedia is now practical because it is technically easy to digitise the analogue forms of these common media and to handle the result on computers which are readily available and small enough to be used on the desk-top.

The applications of multimedia are varied, and include virtual reality and 3-D presentations.

2.1.2. Teaching Space

Teaching space is space within a teaching environment where teaching and learning can take place. It includes lecture theatres, laboratories, seminar rooms and tutors' rooms where tutorials might take place. The size of these rooms can vary considerably, and size can pose problems when presenting multimedia materials, especially in large lecture theatres. Teaching space also includes learning centres and libraries where the student may be working alone, because these locations are within institutional space and under some kind of supervised control. A liberal definition of teaching space would include the student's own living quarters, whether in a home or a student residence; this discussion deliberately does not include that situation.

If multimedia materials are to be available in the teaching space then in most cases a computer, lap-top or desk-top, should be available to be operated in the space. If legacy audio-visual networks are available then the computer is not necessary, as the information is delivered in analogue format and can be displayed without having to be digitised. The analogue signals are capable of producing high-resolution images and do not suffer from the delays introduced by compression and encoding procedures.

However, the signals cannot be manipulated easily in analogue form, usually requiring dedicated hardware. If the signals are digitised then a large variety of digital tools can be applied, and a more flexible system is made available to the teacher. One problem which arises from introducing greater flexibility of presentation into the teaching space is that the teacher's attention is diverted away from teaching into managing the various devices and facilities. Experience in the Teaching and Learning Technology Project INSURRECT (http://www.mmscc.ucl.ac.uk/INSURRECT/), which used the SuperJANET ATM video network for teaching undergraduate surgery, has shown that students are quickly aware when the teacher's attention is divided, and would rather have that attention devoted to them than to the technology.

As will be indicated elsewhere, multimedia material can be presented in standalone mode or over a network. The standalone mode offers limited opportunities, generally due to the limitations in the storage capacity of the computer and the speed with which information can be retrieved from floppy disk or CD-ROM. Greater flexibility comes with the use of networks: the higher the bandwidth of the network, the more sophisticated the multimedia presentation can be and the higher the resolution of the images.
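
The effect of bandwidth on what can sensibly be delivered is simple arithmetic. The following sketch is illustrative only: the file size and the link speeds are assumed figures chosen as typical of the period, not values taken from this report, and protocol overhead is ignored.

```python
# Illustrative: time to transfer one multimedia file over links of
# different bandwidths, ignoring protocol overhead. All figures are
# assumptions for the sake of the example.
FILE_BYTES = 50 * 1024 * 1024          # a hypothetical 50 Mbyte video clip
LINKS_BPS = {
    "ISDN (128 kbit/s)": 128_000,
    "Ethernet (10 Mbit/s)": 10_000_000,
    "ATM (155 Mbit/s)": 155_000_000,
}

for name, bps in LINKS_BPS.items():
    seconds = FILE_BYTES * 8 / bps     # bits to send / bits per second
    print(f"{name}: {seconds:.0f} s")
```

On the slowest link the clip takes the best part of an hour to move, while on a broadband ATM link it arrives in seconds, which is the practical meaning of the bandwidth claim above.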

2.2 THE ANALOGUE LEGACY NETWORKS

Most people assume that a computer is essential to introduce multimedia information into teaching material, but moving and still images and audio have been delivered over analogue audio-visual networks direct into lecture theatres since the mid 1970s. These are sometimes called legacy networks, and work is being carried out to link them with new high-speed broadband networks. As these legacy networks were analogue they supplied high-resolution images of near broadcast quality.

Examples of these analogue video networks are LIVENET (London University Interactive Video Network for Education, established in the early 1980s) and the Charing Cross Hospital Video Teaching Network. These networks used optical fibre to carry video and audio signals over limited distances of up to about 25 miles, and have been used for teaching for 10-15 years. Their use of analogue technology in fact gave better resolution than the early ISDN networks, and for that reason they were favoured for medical applications.

In the late 1980s Sony and other video disk manufacturers produced systems that did not require pre-mastering. The Sony LVR laser video recording system, available in 1989, was used to store and replay still and moving images (with audio) into lecture theatres under the control of the teacher. The LVR was located outside the lecture theatre and remotely controlled; using the Internet, control signals were sent from other parts of Europe to control video being played over satellite. This system is still in use on the LIVENET network at University College London, which handles up to 1000 hours of teaching per annum in a variety of subjects including Medicine (Surgery), Classics and Physics. Likewise, in the SuperJANET video demonstrator and teaching programmes, video was transferred over SuperJANET to a number of remote sites, with the teacher at one of the sites controlling the system.

These systems used special video disk cartridges which allowed images, slides etc. to be laid down on the disc in real-time. The output from a video camera could be fed directly to the LVR recording unit and recordings made onto pre-determined frames on the disk. The playback unit could then be set to the requisite frame numbers and the video played back to a monitor. This was a WORM (Write Once Read Many times) system, so recordings remained available for some time; however, it was not a digital system, and the quality of recordings did deteriorate over time as the surface of the disks degraded. It was possible to record video directly from a camera, or from previous recordings on cassette or other video disks. The system was also able to input still images to a specified frame number and replay them as selected. Thus a mixture of still and moving images could be stored on the disk cartridge, to a maximum of 54,000 frames, corresponding to 36 minutes of video material. The LVR video disk system could be controlled remotely using the RS232 protocol, which could be delivered over the Internet. The command set permitted the following commands:-
START
STOP
PAUSE
PLAY FORWARD
PLAY REVERSE
PLAY frame number XXX to YYY (Video sequence)
PLAY FAST
PLAY SLOW
SCAN FORWARD
SCAN REVERSE
DISPLAY frame number (Still image)
This command structure permits interaction between the user and the images being displayed from the video disk. The system recorded video in both PAL and NTSC formats.
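
A remote-control session built from this command set could look something like the following sketch. The command strings, the class name and the transport are all invented for illustration; the actual Sony RS232 command syntax is not documented in this report.

```python
# Hypothetical sketch of driving an LVR-style player over a serial link.
# The wire format shown here is invented; only the command vocabulary
# (PLAY between frames, DISPLAY a still frame) comes from the text above.

class LVRController:
    """Builds command strings and hands them to a transport."""

    def __init__(self, transport):
        self.transport = transport   # stand-in for an RS232 port or socket

    def send(self, command):
        self.transport.append(command)
        return command

    def play_sequence(self, start, end):
        # "PLAY frame number XXX to YYY" - a video sequence
        return self.send(f"PLAY {start} TO {end}")

    def display_frame(self, frame):
        # "DISPLAY frame number" - a still image
        return self.send(f"DISPLAY {frame}")

log = []                     # a list stands in for the serial line
lvr = LVRController(log)
lvr.play_sequence(100, 250)  # play a short video clip
lvr.display_frame(54000)     # show the last frame on a 54,000-frame disk
print(log)
```

The point of such a controller is that a teacher (or a remote script arriving over the Internet) can interleave still frames and video sequences from the same disk during a lecture.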

2.3 DIGITAL SYSTEMS

2.3.1. The Multimedia Computer

It is now possible to digitise video and audio with desk-top computers such as the PC or Macintosh, making it possible to handle multimedia material in support of teaching. Video and still images are now available at SVGA and even XGA resolution, giving images of very high quality, better than those seen on domestic television. If this multimedia material is to be displayed in the teaching space, often to several students at once, then large displays are necessary. This is achieved by using video projectors with computer input, LCD projection displays, or large television monitors connected to the computer through scanning devices. These scanning devices can handle resolutions higher than those produced by PCs, and their cost is affordable. Until recently this hardware was expensive and the results were barely acceptable at VGA level, e.g. the Mediator.

As multimedia computer systems handle digitised audio and video they need to be powerful, and will commonly use a fast Pentium processor. A multimedia computer system will comprise the following components:-
Powerful high-speed central processor with a clock speed of over 200 MHz
Random access memory of 32 Mbytes
Hard disk drive of 2-5 Gbytes
Input through floppy disk drive or CD-ROM
Audio input and output devices, e.g. microphones and loudspeakers
Video input and output devices, e.g. VCRs, video cameras, video disks etc.
Still image input and output, e.g. slide scanner, monitors etc.
Graphics capability to handle both bitmap and vector graphics
Display of visual output on SVGA/XGA-quality monitors
User input through mouse, tracker ball etc.
CODECs - currently hardware, but these may become software in the near future
The essential capability of any multimedia computer system is the ability to convert analogue signals to digital format and to compress this information using standard algorithms. The power of the CPU determines whether this process can be carried out in real-time or must be done off-line. Originally this power only existed in UNIX systems, but more recently it has become available in PCs and Macintoshes. Compression is necessary because otherwise the quantity of data to be stored and transmitted would be excessive. Until recently most PCs depended upon hardware encoding/decoding systems (CODECs), but the speed of current processors is such that software compression systems have been developed. The advantage of software systems is that compatibility and interoperability issues can be handled more easily, and the cost of the equipment is not raised by the need to purchase expensive hardware devices.
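
The scale of the problem compression solves is easy to work out. The sketch below uses standard full-frame PAL figures (768 x 576 pixels, 24-bit colour, 25 frames per second) and the well-known MPEG-1 target rate of about 1.5 Mbit/s; the comparison itself is illustrative rather than a claim about any particular system.

```python
# Rough arithmetic behind the need for compression.
WIDTH, HEIGHT, FPS = 768, 576, 25      # full PAL frame, 25 frames/s
BYTES_PER_PIXEL = 3                    # 24-bit colour

uncompressed_bps = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 8
mpeg1_bps = 1_500_000                  # MPEG-1 target: ~1.5 Mbit/s

ratio = uncompressed_bps / mpeg1_bps
minute_uncompressed_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 60 / 1e6

print(f"Uncompressed: {uncompressed_bps / 1e6:.0f} Mbit/s")
print(f"One minute uncompressed: {minute_uncompressed_mb:.0f} Mbytes")
print(f"Compression needed for MPEG-1: about {ratio:.0f}:1")
```

A single minute of raw full-frame video runs to roughly two gigabytes, far beyond the 2-5 Gbyte disks listed above, which is why a compression ratio in the hundreds is unavoidable (in practice MPEG-1 also reduces the frame size before coding).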

2.3.2. Audio Compressed File Formats

The computer system must handle compressed file formats. For audio, common formats include:

  • WAV- a digital sound file for Windows.
  • MIDI (Musical Instrument Digital Interface) - audio files created by connecting the computer to musical instruments and control devices.
  • Real-Time Audio - heavily compressed digital audio information.
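
The gap between a plain WAV file and "supercompressed" real-time audio follows from the data rate of uncompressed PCM. The sketch below uses CD-quality figures (44.1 kHz, 16-bit, stereo) as an assumed example of what a WAV file might hold.

```python
# Data rate of uncompressed CD-quality stereo PCM, the kind of audio a
# plain WAV file typically stores. Figures are the standard CD values.
SAMPLE_RATE = 44_100      # samples per second
BITS_PER_SAMPLE = 16
CHANNELS = 2

bits_per_second = SAMPLE_RATE * BITS_PER_SAMPLE * CHANNELS
mbytes_per_minute = bits_per_second * 60 / 8 / 1e6

print(f"{bits_per_second} bit/s")                 # 1411200 bit/s
print(f"{mbytes_per_minute:.1f} Mbytes per minute")
```

At roughly 1.4 Mbit/s, uncompressed audio alone would saturate a modem or ISDN link many times over, hence the need for the heavily compressed formats used for real-time delivery.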

2.3.3. Image and Video Compressed File Formats

Two types of file are used to store still images:-

JPEG - designed for compressing either full-colour or grey-scale images.

GIF - designed for compressing images with a few distinct colours and for line drawing and simple cartoons. GIF files can only handle 8-bit colour.

In the video domain it is desirable to have full-screen, full-motion video straight from the video source, but uncompressed this would require a data stream of about 27 Mbytes/s (216 Mbit/s), significantly more than most PCs can handle.
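
That figure for an uncompressed studio-quality stream can be derived from the CCIR-601 digital video sampling structure: luminance sampled at 13.5 MHz and each of the two chrominance channels at 6.75 MHz, all at 8 bits per sample.

```python
# Deriving the uncompressed CCIR-601 video data rate from its sampling
# structure (13.5 MHz luma + 2 x 6.75 MHz chroma, 8 bits per sample).
LUMA_HZ = 13_500_000
CHROMA_HZ = 6_750_000
BYTES_PER_SAMPLE = 1      # 8-bit samples

bytes_per_second = (LUMA_HZ + 2 * CHROMA_HZ) * BYTES_PER_SAMPLE

print(bytes_per_second / 1e6)       # 27.0 (Mbytes/s)
print(bytes_per_second * 8 / 1e6)   # 216.0 (Mbit/s)
```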

Popular video file formats are QuickTime (.MOV) for the Macintosh and Video for Windows (.AVI) for PC systems. Both are proprietary systems, and it is more advisable to use common standards such as H.261, H.263, MPEG-1, MPEG-2, MPEG-4 and Motion-JPEG (see below for a more detailed discussion of these standards).
