Exploiting Virtual Reality Techniques in Education and Training: Technological Issues
4. Developments in VR Technology (cont)
4.2 Functional Characterisation of a VR System
There is a tendency to think of head-mounted displays when considering VR systems. Unfortunately, this narrow perspective leads to confusion when people talk of other forms of VR system. The solution is to use 'VR' as an all-embracing term covering every form of VR system, and to use other terms to describe the different approaches. To add further confusion, some people use the term virtual environments instead of VR. However, it is better to think of a virtual environment as a computer representation of a synthetic world. This means a virtual environment can be defined irrespective of the delivery technology. If we take the broadest possible definition then it is feasible to characterise any VR system in a consistent manner. It is important to take a systems perspective of a VR system in its entirety and include the components that help construct the environment; refer to Figure 2. The jagged edges of the ellipses bounding the components in the diagram illustrate that there is poor integration between the component parts. It is not sufficient to produce a virtual environment alone - there is a need to control the component parts of the environment in a way that makes the user believe that they are actually immersed in a real environment. Therefore, it is necessary to provide some form of simulation that interacts with the virtual environment.
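The coupling between the simulation and the virtual environment described above can be sketched as a simple update loop. This is a minimal illustration only; all class and attribute names are hypothetical, and a real system would apply proper physics rather than the forward-Euler step shown here.

```python
class VirtualEnvironment:
    """Computer representation of a synthetic world: named objects
    with position and velocity (a hypothetical minimal model)."""
    def __init__(self):
        self.objects = {}  # name -> {"pos": [x, y, z], "vel": [x, y, z]}

class Simulation:
    """Drives the virtual environment between frames so that it
    behaves believably for the user."""
    def __init__(self, env):
        self.env = env

    def step(self, dt):
        # Simple forward-Euler integration of each object's motion.
        for obj in self.env.objects.values():
            obj["pos"] = [p + v * dt for p, v in zip(obj["pos"], obj["vel"])]

env = VirtualEnvironment()
env.objects["ball"] = {"pos": [0.0, 0.0, 0.0], "vel": [1.0, 0.0, 0.0]}
sim = Simulation(env)
for _ in range(60):      # one second of simulation at 60 Hz
    sim.step(1 / 60)
print(env.objects["ball"]["pos"][0])  # approximately 1.0
```

The point of the sketch is the separation of concerns: the environment holds state, while the simulation updates that state each frame in response to time (and, in a full system, user input).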
Figure 2 Systems Perspective of a Generic VR System
Neglecting the issues of system performance for the moment, the peripheral technologies define the different forms of VR system. For example, the term 'desk-top VR' has emerged which does not relate so much to the virtual environment but instead refers to the delivery technology used. Since a desk-top VR system is generally based on a PC platform, there is a tendency to assume that a desk-top system offers inferior performance. This is not quite true, since it is perfectly feasible to employ the very best graphics systems in a desk-top VR system. However, it is better not to think of the performance of the underlying technology at this stage because this will be dictated by the application.
If a generic VR system is defined as shown in Appendix A then it becomes possible to provide a consistent description for all VR systems. This approach is significantly clearer when it comes to determining which form of VR system to use for a particular application.
Referring to the diagram given in Appendix A, the most fundamental feature of any VR system is that it represents a human-computer interface. Expanding the features of this interface:
- Direct man-machine interface: The direct man-machine interface refers to the user and the functional interface/devices that he or she would use to experience and control a virtual environment.
- User: The user is defined in terms of perceptual processes (vision, touch and hearing) as well as actions that can be initiated (by voice, hands, head and eyes). It is possible to define human capabilities at an empirical level in these terms. However, it must be stressed that this does not provide a complete description of human performance and other factors must be considered.
- Output interface: The output interface refers to techniques that can be used to provide information for the human perceptual system.
- Input interface: The input interface refers to the means by which human initiated actions can be converted into appropriate information for use in the virtual environment.
- Information processing: The information processing section is the equivalent of the virtual environment where data is processed for delivery by the output interfaces. Correspondingly, data from the input interface are processed and used to control the virtual environment. This section also has links to the application environment which governs what is actually undertaken by the overall VR system.
- Application environment: The application environment is essentially the simulation software that dictates what the VR system will do in accordance with external input from the external environment or that initiated by the user. There is a very close relationship between this section and the information processing section. Example application environments include: engine simulations, flight simulations, molecular modelling, assembly plants etc. It is feasible for the application environment to be networked to other local or remote applications.
- External environment: The external environment represents the real (physical) world which may be linked to the VR system. For example, in a medical application it is feasible to overlay a virtual image onto a patient via an optical system. To achieve accurate registration of the real and virtual environments it is important to provide a link between the display technology, application environment and information processing sections. It should be noted that not all VR systems need to relate to the external environment. In fact in applications where abstract virtual environments are required it may not be appropriate to link the real and virtual environments together.
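The functional decomposition above can be expressed as data, which makes it easy to reason about which sections a given configuration needs. This is an illustrative sketch only; the section names follow the text, but the class, the example functions and the dictionary keys are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Section:
    """One functional section of the generic VR system (illustrative)."""
    name: str
    functions: list = field(default_factory=list)

# The six sections described above, expressed as data.
system = {
    "user": Section("User", ["vision", "touch", "hearing", "voice", "hands", "head", "eyes"]),
    "output_interface": Section("Output interface", ["image display", "audio production"]),
    "input_interface": Section("Input interface", ["hand controls", "head sensing"]),
    "information_processing": Section("Information processing", ["image generation"]),
    "application_environment": Section("Application environment", ["flight simulation"]),
    "external_environment": Section("External environment", ["overlay on patient"]),
}

# As noted above, the external environment is optional: a VR system
# delivering a purely abstract virtual environment simply omits it.
abstract_system = {k: v for k, v in system.items() if k != "external_environment"}
print(len(system), len(abstract_system))  # 6 5
```

Treating the decomposition as data rather than prose makes the point that every VR system, whatever its delivery technology, can be described by the same six sections, with the external environment present or absent as the application demands.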
Having provided a basic framework for a functional decomposition of a VR system it is possible to identify a number of categories for each section. Table 1 shows a top level breakdown of each functional category into specific functions according to the functional decomposition given in Appendix A. The naming given to each specific function is broad enough to provide an overall category in which to lodge relevant VR technologies. Of special note is the decision to keep tactile and kinaesthetic stimulus generation together. There is an important link between these two processes in terms of the human perceptual system which makes it difficult to isolate one from the other.
Against each function is an index that is used later to point into the more detailed breakdown given in Appendix H. Particular care has been taken to ensure that a consistent approach was taken in the preparation of the functional analysis.
Table 1 Functional Characterisation of VR System Functions
| A Information Processing | Index |
| --- | --- |
| Cognitive Agents | A1 |
| Data Management | A2 |
| Control-display Coordination | A3 |
| Data Storage and Recording | A4 |
| Image Generation | A5 |
| Tactile Stimulus Generation | A6 |
| Kinaesthetic Stimulus | A7 |
| Auditory Signal Generation | A8 |
| Speech Processing | A9 |
| Switch Processing | A10 |
| Virtual Hand Controller | A11 |
| Head Sensor Processing | A12 |
| Eye Sensor Processing | A13 |
| Physiological Sensor Processing | A14 |

| B Direct Man-Machine Interface | Index |
| --- | --- |
| Image Display | B1 |
| Tactile Feedback | B2 |
| Audio Production | B3 |
| Speech Transduction | B4 |
| Hand Operated Controls | B5 |
| Head Sensing | B6 |
| Eye Sensing | B7 |
| Physiological Sensing | B8 |
| External Environment Viewing | B9 |
| Visual Defect Correction | B10 |
4.2.1 Functional Categories
A Information Processing
The information processing section is the equivalent of the virtual environment where data is processed for delivery to the output interfaces. Correspondingly, data from the input interface are processed and used to control the virtual environment. This section also has links to the application environment which governs what is actually undertaken by the overall VR system.
- Cognitive Agents: Cognitive agents refer to the aspects of the system that provide intelligent support to the user, either as a result of a direct request or in response to some erroneous operation. An example of a cognitive agent is a help system. In the future it is likely that this area will develop into intelligent advisors or even on-line personal tutors.
- Data Management: It is assumed that a virtual environment will employ considerable amounts of data in the creation of the environment, including the objects in the virtual environment. Given the size of some virtual environments this is a complex task, since the system has to determine which data is required at a particular moment. For low cost VR systems this element of the system is very important. If the VR system is part of a wide area network then the data management section will also have to handle data consistency between the client and server systems.
- Control-display Coordination: One of the most important tasks that a VR system has to perform is the coordination of control inputs with the various display outputs in such a way that the user still believes that they are interacting with a real system.
- Data storage and Recording: This function provides data storage for the VR system and includes the database for the virtual environment.
- Image Generation: This part of the system is responsible for generating the graphics for display to the user. This system will probably be the most expensive part of the overall VR system and will tend to act as the host for the overall system. Today it is common to find powerful graphics systems supported by multi-processor architectures.
- Tactile Stimulus Generation: In order to create a sense of touch in a virtual environment it is important to produce a sensation of surface texture. Whichever technology is used to create a tactile sensation, it is necessary to implement a biomechanical model of the skin.
- Kinaesthetic Stimulus: Force feedback systems will need to model the kinematics of the human being as well as ensuring safe limits for force interaction with the user are not exceeded. This aspect of the VR system is likely to be safety critical since it is feasible for force feedback technology to cause serious injury to the user.
- Auditory Signal Generation: This function is responsible for creating a digital representation of an analogue acoustic signal. Depending on the functionality required, considerable processing may be needed, as in the case of 3D localised sound.
- Speech Processing: This function performs speech recognition, converting the user's spoken input into a form the system can act upon.
- Switch and Inceptor Processing: This function receives input from conventional input devices such as keyboards, switches and analogue joysticks.
- Virtual Hand Controller: The virtual hand controller function processes data from various 3D interaction devices such as glove devices and 3D joysticks. These devices will tend to provide an indication of hand orientation as well as information relating to finger position.
- Head Sensor Processing: This function takes head position data and converts this to a form that can be used by the graphics system. Depending on the tracking technology used the nature of this function will change. This function is extremely time critical.
- Eye Sensor Processing: It is possible to detect eye position and use this for hands free selection or designation purposes.
- Physiological Sensor Processing: This function is included for completeness and performs the task of processing data from various physiological sensors such as heart beat rate, blood pressure etc. It is unlikely that this function will be required in general educational systems. However, if sports applications are considered then the ability to accommodate physiological data into a training system might be very important.
B Direct Man-Machine Interface
The direct man-machine interface refers to the actual devices that are attached to or used by the user. It is assumed that if any processing is part of the particular interface then it is covered in the information processing section.
- Image Display: This function creates the image for the user and includes devices such as computer monitors and various visual display units.
- Tactile Feedback: Provision of electromechanically actuated devices to permit a sensation of touch to be achieved.
- Kinaesthetic Feedback: This function employs a range of actuators to couple force feedback to the user.
- Audio Production: This function provides the means to listen to auditory signals through either loudspeakers or headphones.
- Speech Transduction: Refers to the means of converting speech into electrical signals and, of course, relates to microphones.
- Hand Operated Controls: Refers to any device that the user operates by hand in order to make an input into the system. The many options here range from conventional keyboards through to gesture recognition systems.
- Head Sensing: This function refers to the devices that are used to obtain head line of sight information from the user. These can range from active systems, such as electromagnetic tracking systems, to passive systems where the user's head line of sight is obtained from an image processing system.
- Eye Sensing: This refers to the actual process of determining eye point of regard.
- Physiological Sensing: Physiological sensing can be accomplished by many techniques. It is assumed that non-intrusive techniques will be used for heart rate, etc.
- Visual Defect Correction: This function has been included for completeness to remind system designers that many users of VR systems rely on some form of correction for defective vision. In the case of spectacles, designers of head mounted displays will need to incorporate greater eye relief to accommodate spectacle frames.
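As noted under Kinaesthetic Stimulus above, force feedback is potentially safety critical, so the output stage driving the actuators would typically clamp every commanded force to a safe envelope before it reaches the hardware. The sketch below shows the idea; the function name and the 20 N limit are illustrative assumptions, not recommendations.

```python
def clamp_force(commanded_newtons, limit=20.0):
    """Limit a commanded actuator force to +/- limit newtons before it
    is sent to the force-feedback hardware. The 20 N default is an
    illustrative value only; real limits depend on the actuator and
    the part of the body it acts upon."""
    return max(-limit, min(limit, commanded_newtons))

print(clamp_force(35.0))   # 20.0
print(clamp_force(-50.0))  # -20.0
print(clamp_force(5.0))    # 5.0
```

A real system would layer further protection on top of a clamp like this, such as rate limiting and an independent hardware cut-out, since software checks alone are not sufficient for a safety-critical interface.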
Virtual Environments Visualisation