Evaluation of the Suitability of
Distributed Interactive Videoconferencing for use in Higher Education
APPENDIX 5: Experimenter checklist of issues
NB: Juan (Madrid) to log the following (if technically possible):
- What site(s) the questions came from in each session, e.g. Naples, Berlin (including secondary sites).
- Technical problems, e.g. when did they occur, how long did they last, which site(s) went down, what was the nature of the problem, what kind of disturbance (if any) was there at other sites, did both audio and video break down?
- Quality of audio/visual links, e.g. video frame rates. Are there any other measures we could take to quantify the quality of the audio/visual links during the conference sessions?
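The logging points above could be captured in a simple per-event record. A minimal sketch, assuming a Python representation; all field names, values, and the example data are illustrative, not part of the actual logging procedure:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SessionEvent:
    """One logged event during a conference session (hypothetical schema)."""
    session_id: str                             # which session the event belongs to
    timestamp: datetime                         # when the event occurred
    kind: str                                   # "question", "technical_problem", "quality_note"
    sites: list = field(default_factory=list)   # site(s) involved, e.g. ["Naples", "Berlin"]
    duration_s: float = 0.0                     # how long a problem lasted, in seconds
    audio_affected: bool = False                # did the audio break down?
    video_affected: bool = False                # did the video break down?
    notes: str = ""                             # free text, e.g. frame-rate observations

# Illustrative data: a question from Naples, then a two-minute audio drop at Berlin.
events = [
    SessionEvent("S1", datetime(1997, 5, 12, 10, 5), "question", sites=["Naples"]),
    SessionEvent("S1", datetime(1997, 5, 12, 10, 20), "technical_problem",
                 sites=["Berlin"], duration_s=120.0, audio_affected=True,
                 notes="audio dropped; video continued"),
]

# Sites that experienced technical problems in this session.
problem_sites = {s for e in events if e.kind == "technical_problem" for s in e.sites}
```

A record like this would make it straightforward to answer the checklist questions after the fact (which sites raised questions, how long each failure lasted, whether audio and video failed together).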
The following to be done per session (NB: for designated sessions only, or every session? Probably every session.)
Details of any technical problems that occurred during the session, e.g. network failure, breakdown in audio/visual.
- What was nature of problem?
- How long did it last?
- Which sites were affected (if known)?
- What happened during network failure?
- What did the presenter (at Brussels only?) do to address the problem?
- How disruptive was each interruption?
Quality of comms link
- Notes about audio: intelligible? volume levels acceptable? any feedback? echo? (only for the Brussels site)
- Notes about video: quality acceptable? frame rates acceptable? any disruptions?
How well did it work? Ask organisers, floor managers and operators what they felt about it, what problems they had with it, how appropriate it was for this type of event, whether adequate training was given, etc.
A brief expert evaluation could be carried out if time and access to the system allow.
- Type of screen display adopted during the session (although this will probably be the same in most sessions, e.g. two windows with presenter and lecturer). Was anything other than the standard layout used?
- Presentation material: what material was used? when in the presentation? how? was it legible for all of the audience?
- Did this session use the facilities appropriately?
- Co-ordination (between floor manager, presenter and speaker): what type of interaction took place? how did control pass between them? etc.
- Were there any problems of a control nature (e.g. problems during hand-overs from the presenter)? What was the nature of the problem? Which sites were involved? How disruptive was the control problem?
- What was planned level of interactivity? (AMC also to look at this, if time)
- How much interactivity was there? (AMC also to look at this, if time)
- Who was involved in questions at the end of the session? which site? etc.
- Were questions understood by speaker? Were there any difficulties during the questioning?
Virtual Environments Visualisation