


Pergamon

Comput. & Graphics, Vol. 21, No. 4, pp. 459-468, 1997
© 1997 Elsevier Science Ltd. All rights reserved. Printed in Great Britain
0097-8493/97 $17.00 + 0.00
PII: S0097-8493(97)00029

Haptic Displays in Virtual Environments

THE VIRTUAL TOUCH: HAPTIC INTERFACES IN VIRTUAL ENVIRONMENTS

JOSE DIONISIO†, VOLKER HENRICH, UDO JAKOB, ALEXANDER RETTIG and ROLF ZIEGLER

Visualization and Virtual Reality Department, Fraunhofer IGD, Wilhelminenstr. 7, D-64283 Darmstadt, Germany

e-mail: [email protected]

Abstract - To meet the demands of complex VR applications, state-of-the-art systems generate visual representations of virtual environments combined with acoustical feedback. But in some application areas, for example medicine or virtual prototyping, 'seeing' and 'hearing' are not sufficient. Thus, additional human perceptual channels have to be stimulated: the haptic senses. Current research at the Fraunhofer IGD addresses concepts for the interfacing and integration of haptic displays into virtual environments. These are able to feed back reaction forces and tactile stimuli as well as temperature and motion to the user. Examples include a training simulator for arthroscopy, and ThermoPad, a glove-like output device for thermal feedback based on Peltier technology. The additional feedback improves both the impression of reality and the capability to orientate oneself in virtual worlds. Further work addresses the integration of commercially available libraries and devices, namely the PHANTOM and a motion platform, within VirtualDesign II (VDII), a VR toolkit developed in-house. © 1997 Elsevier Science Ltd

1. INTRODUCTION

Some of our projects with industrial partners strongly require the integration of haptic displays. To satisfy these requirements, a general approach to integrating haptic displays into our VR system 'Virtual Design' is described in the following sections. At present we are working on adding force feedback to a particular VR application, the 'Virtual Reality Arthroscopy Training Simulator' (VRATS).

Temperature feedback is used to simulate a local temperature feeling (grasping and collision detection) as well as an environmental temperature feeling, e.g. heat radiation from a virtual fireplace. The three thermal phenomena of conduction, convection, and radiation form the physical basis for the simulation of temperature feedback. The simulation processes as well as the development of a device, the ThermoPad, will be described in more detail.

The third category of haptic feedback which will be addressed in this paper is motion feedback. Motion feedback, or motion simulation, is typically used in driving or flight simulators. The first application we have developed was also a flight simulator, not for the training of pilots, but for therapy against fear of flying. The integration of motion feedback and this first application is highlighted.

To fulfil these demands, an overview of the implementation of software for the development of VR applications, as well as of a generic module for hardware interfaces (drivers), among others, will be presented and discussed.

† Author for correspondence.

2. VIRTUALDESIGN II - IGD'S VR SYSTEM

The aim of the research at our lab is the integration of haptic displays into Virtual Reality. The VR system we are working with is 'VirtualDesign II' (VDII) [1], which has been developed at the Fraunhofer Institute for Computer Graphics in Darmstadt. First, we give a brief introduction to the components of the system: the rendering library, the interaction toolkit and the object handler.

The rendering library is based on OpenGL and provides a highly developed programming interface for the implementation of realtime 3-D graphics applications. It contains functions for the construction and handling of polygonal objects in hierarchical scene graphs, fast vector and matrix operations to transform single objects or subtrees of the scene graph, and methods to control the actual rendering using different illumination models. Furthermore, it includes a fast collision detection module and a file handling system to load complete VR scenes or parts of them.

The interaction toolkit is a device-independent system which establishes logical input classes. This allows applications to communicate with physical input devices in an abstract way without considering their specific properties. The physical devices are controlled by autonomous processes (servers) which map their input data onto the logical input devices seen by the main application. The main application and the servers may run on different workstations connected via LAN.




The object handler, which is built using both systems introduced above, is a basic application which integrates loading of a VR scene, management of its graphic objects and input devices, echo generation and collision detection. These actions are stated in a configuration file, which contains information about, for example, which graphical output device is requested (e.g. HMD or stereoscopic large screen projection), the input devices to be used for navigation and interaction (e.g. spacemouse, dataglove, PHANTOM, ThermoPad), and the files which describe the scene. Moreover, the system provides a flexible interface to add special properties to the scene, which can be programmed in separate modules and linked to the system dynamically according to the configuration file.
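As a sketch, such a configuration file might look like the following; the keywords and file names are illustrative only, not the actual VDII syntax:

    # graphical output device
    display      stereo_projection          # alternative: hmd
    # logical input devices for navigation and interaction
    navigation   spacemouse
    interaction  dataglove phantom thermopad
    # files describing the scene
    scene        knee_joint.geo instruments.geo
    # property modules linked dynamically at start-up
    module       libdeform.so libthermal.so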

3. INTEGRATION OF FORCE, TEMPERATURE AND MOTION FEEDBACK

The configuration of an integrated VR system is illustrated in Fig. 1. The system consists of the graphic subsystem and the haptic subsystem. The interaction toolkit, described in the last section, is responsible for the communication between both subsystems. The haptic subsystem computes the corresponding haptic simulation (force, tactile, temperature, motion feedback) and controls the haptic displays (PHANTOM, Darmstadt Device, ThermoPad, motion base).

3.1. Force feedback
The 'Virtual Reality Arthroscopy Training Simulator' (VRATS) was developed in 1993 in cooperation with the BG-Unfallklinik Frankfurt am Main [2, 3] to train the surgical skills of arthroscopy (the investigation of a knee joint). The main drawback of this system is the lack of haptic feedback, and force feedback in particular. To add force feedback we have attached the tip of one of the instruments to the PHANTOM [4, 5].


By touching an anatomical structure of the knee joint, a reaction force is computed to simulate particular object properties (rigid or smooth). Based upon the current position, a collision algorithm [6] determines the surface point, the surface normal and the penetration vector. In order to perceive the various properties of different anatomical structures, we have implemented the friction model described (and implemented) by Mark et al. [7].
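The following C++ fragment sketches how such a reaction force could be derived from the collision data; the names and the simple penalty (spring) model are our illustration, not the exact VRATS code:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Penalty-based reaction force: the deeper the instrument tip
    // penetrates the anatomical structure, the stronger the restoring
    // force along the surface normal. The stiffness constant encodes
    // the object property (rigid structures get a high value).
    Vec3 reactionForce(const Vec3& surfaceNormal,  // unit normal at the surface point
                       const Vec3& penetration,    // penetration vector from the collision algorithm
                       float stiffness)            // object property (assumed units N/m)
    {
        float depth = std::sqrt(penetration.x * penetration.x +
                                penetration.y * penetration.y +
                                penetration.z * penetration.z);
        return { surfaceNormal.x * stiffness * depth,
                 surfaceNormal.y * stiffness * depth,
                 surfaceNormal.z * stiffness * depth };
    }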

As the reaction forces depend on the surface normal, distorted perception appears at sharp edges. This effect was described by Zilles and Salisbury [8], who proposed a linear interpolation between two surface normals (a haptic Gouraud shading). At present we are working on the implementation of this linear interpolation method. Furthermore, we will integrate the 'Darmstadt Device' (see Fig. 2) into the VRATS. This tool-handling device is being developed by the Institute for Mechatronics at the Darmstadt Technical University in cooperation with IGD. The device is still under construction; it will simulate the handling of an exploratory probe which is used during a real arthroscopy. This requires four degrees of freedom (DoF's): the three possible rotations of the tool-handle (x-, y-, and z-axis), and additionally one translational DoF along the instrument axis [9, 10].
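A minimal sketch of the interpolation step, assuming the blending weight comes from the contact point's position between the two normals (Vec3 as in the previous sketch):

    #include <cmath>

    // 'Haptic Gouraud shading' in the sense of [8]: linear interpolation
    // between two surface normals n0 and n1, renormalized. t in [0,1]
    // encodes how far the contact point has moved from the region of n0
    // towards the region of n1.
    Vec3 shadedNormal(const Vec3& n0, const Vec3& n1, float t)
    {
        Vec3 n = { n0.x + t * (n1.x - n0.x),
                   n0.y + t * (n1.y - n0.y),
                   n0.z + t * (n1.z - n0.z) };
        float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
        return { n.x / len, n.y / len, n.z / len };
    }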

The task of adding force feedback to the VR arthroscopy training simulator was solved with the intention of extending IGD's VR system 'Virtual Design' [1] by a subsystem which simulates haptic feedback, in particular force feedback [11, 12]. Therefore we also discuss a general approach to the integration of haptic displays. Similar to the Graphical Reference Model [13], the haptic subsystem can be embedded into an overall structure (see Fig. 3). We call the interface between the VR system and the haptic subsystem the 'Haptic Programming Interface' (HPI). The HPI allows device-independent programming considering different device categories [14]: exoskeleton and tool-handling devices; force feedback, tactile feedback, and shape-forming devices.


Fig. 1. Haptic and graphic subsystem of the current configuration.



Fig. 2. The Darmstadt Device (photograph courtesy of C. Kunstmann, Institute for Mechatronics).

The 'Haptic Device Interface' (HDI) is responsible for the communication with the devices. The kernel system, the haptic subsystem, has the following tasks:

• It has to act as a simulation system to simulate haptic feedback.

• It has to be an interface to the VR system (synchronization of visual, acoustical, and haptic feedback) and to the haptic displays (haptic rendering).

• It must be capable of multi-tasking and multi-processing.

The simulation of haptic feedback has been described by several researchers [5, 15]. Some of them have implemented software libraries as functional interfaces to a certain number of haptic displays (e.g. Armlib [7, 16], or the PHANTOM library from SensAble Devices). In our approach we will include these software libraries and will implement additional ones. Thus the whole system can be extended by new haptic displays in a modular and configurable way.

One important aspect is to separate the haptic rendering from the graphic rendering: whereas a graphic update rate of 15-25 Hz is sufficient to give the impression of continuous motion, haptic rendering using devices like the PHANTOM needs an update rate of 500-1000 Hz. Otherwise it is not possible to create, for example, the feeling of stiff surfaces and to prevent unwanted sudden forces. Therefore we will realize a system where an object handler controlling a 'haptic scene' (the haptic object handler) runs separately from the graphic object handler (see Fig. 4).
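The decoupling can be sketched as two independent loops sharing a mutex-protected haptic scene; this is our illustration (using modern C++ threads), not the actual VDII process structure:

    #include <atomic>
    #include <chrono>
    #include <mutex>
    #include <thread>

    std::mutex sceneMutex;            // protects the shared haptic scene
    std::atomic<bool> running{true};

    void hapticLoop()                 // ~1000 Hz: stiff surfaces need fast updates
    {
        while (running) {
            {
                std::lock_guard<std::mutex> lock(sceneMutex);
                // read device position, compute reaction force, send it to the device
            }
            std::this_thread::sleep_for(std::chrono::microseconds(1000));
        }
    }

    void graphicLoop()                // 15-25 Hz suffices for continuous motion
    {
        while (running) {
            {
                std::lock_guard<std::mutex> lock(sceneMutex);
                // copy the scene state needed for drawing this frame
            }
            // render the frame outside the lock
            std::this_thread::sleep_for(std::chrono::milliseconds(40));
        }
    }

    int main()
    {
        std::thread haptic(hapticLoop), graphic(graphicLoop);
        haptic.join();
        graphic.join();
    }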

One part of the system provides 'logical haptic input devices' and 'logical haptic output devices' according to the concept of 'logical devices' of the interaction toolkit.

Fig. 3. Interface structure of an integrated system.



Fig. 4. Haptic object handler embedded within the whole system structure.

This concept is necessary because haptic devices often have more passive DoF's than active ones. For example, the PHANTOM with encoder gimbal has seven passive DoF's (three for location, three for orientation and one switch) but only three active DoF's. Suggested logical haptic devices are, e.g., 'location', 'orientation', 'space' (a combination of them), 'hand' and 'arm'. Using this interface, a haptic application communicates with an autonomous haptic server implemented for a specific device. This server initializes the device, checks its correct operating state and is responsible for secure operation. Furthermore, the server sends the tracking data to the application and converts the received feedback force data. In future extensions, higher functionality may be added to the server, like the capability to render single surfaces as suggested by Mark et al. [16]. On the other hand, a data structure for polygonal and other objects suitable for haptic rendering has to be developed, as well as functions to organize the scene in a hierarchical scene graph, and algorithms for haptic rendering. Library functions for friction, roughness ('haptic texture'), rendering of polygonal objects, smoothing (haptic Gouraud shading), etc. are desirable. To decide about useful simulation models, our Experimentation Tool can be used. The function library also contains fast collision detection algorithms.
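As an illustration of this abstraction (the names are ours, Vec3 as in the earlier sketches), a logical haptic device could be modelled as follows:

    // Abstract 'logical haptic device': the application sees only this
    // interface; a device-specific server, e.g. for the PHANTOM, maps
    // it onto the physical hardware.
    class LogicalHapticDevice {
    public:
        virtual ~LogicalHapticDevice() {}

        // Passive DoF's: tracking data sent by the device server.
        virtual Vec3 position() const = 0;       // 'location'
        virtual Vec3 orientation() const = 0;    // 'orientation'
        virtual bool switchPressed() const = 0;  // the stylus switch

        // Active DoF's: feedback forces converted by the server.
        virtual void setForce(const Vec3& force) = 0;
    };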

On this basis, a haptic object handler has to be implemented which works similarly to the object handler of VDII: according to a configuration file, the haptic object handler loads a haptic scene represented in an extended file format which both the graphic and the haptic object handler can interpret. The haptic device servers are started and special modules are linked as specified in the configuration file. This works well as long as the haptic simulation runs standalone. When haptic feedback as well as visualisation is demanded, the two object handlers have to be synchronized. In a static virtual scene it is sufficient that the graphic object handler interprets the haptic devices simply as another input device, like a 6-D tracker. To achieve this, the logical haptic input device represents a physical device to the graphic object handler and is mapped onto a logical device; the haptic object handler then has the status of a server for the graphic object handler. But if there is the possibility of complex interaction in the virtual environment, for example deformation or the removing and adding of objects, both representations of the scene have to be adapted. For these purposes, special modules must be linked to the object handlers. They have to communicate with each other and have to modify the respective scene representations. In this concept it is also possible to run an additional external simulation using the same mechanism to communicate its results to the VR system.

3.2. Temperature feedback
The studies currently performed, covering what we call in a global sense Haptic Displays, aim at realistic interactive procedures. Thermal phenomena should be included since they are part of our daily reality. On the other hand, if the effects of presence and interactive experience are to be emphasized, then temperature feedback is a parameter to be included towards full engagement of the senses.

Heat transfer is the flow of heat from a region at a higher temperature to one at a lower temperature. It takes place in almost every phase of scientific and engineering work. There are many processes which describe the transmission of heat either within a body or between a body and its surroundings, and they are all subject to the laws of thermodynamics. The existence of a temperature difference is a distinctive feature of thermal energy which governs the rate of heat transfer.




3.2.1. Physically-based modelling and the three heat transfer modes. Heat transfer analysis is based on the equations of mass, momentum and energy conservation, the second law of thermodynamics and on the three phenomenological laws which describe the transfer rate in conduction, convection and radiation. These phenomenological laws are the mathematical expressions of the models which describe the heat transfer processes.

Heat transfer within a solid body is called conduction, and deals with a heat transfer process at the molecular level. Since grasping procedures in Virtual Environments mean simulating the behaviour of the virtual object as well as the actions exerted on its surfaces by the virtual hand, the basis of realistic grasping requires the introduction of physically based modelling. In terms of tactile feedback this means a representation of surface temperature both in space and time, i.e. haptic rendering upon object collision. At this level a complete thermal feedback, e.g. a body suit, is unnecessary (as well as expensive). A fairly good compromise can be achieved if only the hand (or hands) participates as an interaction agent. Radiation, in turn, is a mechanism of heat transfer by means of the propagation of a photon gas from one surface to another, with no need of a participating medium between both surfaces. Another difference to the previous phenomena is that the amount of heat transferred is not linear in the temperature difference. This leads us to the formalism of the basic laws of radiation. For flat surfaces with ideal diffuse behaviour, the formalism used in radiosity-like algorithms (applied for photorealistic rendering in Computer Graphics), as well as the thermal formalism on which they are based, reduces to an equation of energy balance applied to all the surfaces present (Equation (1)):

B_i = E_i + \rho_i H_i    (1)

where B_i stands for the radiant flux of surface i, E_i is the emitted flux of the same surface, \rho_i represents the reflectivity, and finally H_i is the incident flux from all neighbouring surfaces. This radiant incident flux is computed as follows (Equation (2)):

H_i = \sum_{j=1}^{N} B_j F_{ij}    (2)

where F_{ij} is called the form factor between surfaces i and j. Its value depends on the surfaces' dimensions and their relative location (distance and visibility) (Equation (3)):

F_{ij} = \frac{1}{A_i} \int_{A_i} \int_{A_j} \frac{\cos\theta_i \cos\theta_j}{\pi r^2} \, dA_j \, dA_i    (3)

This leads to a set of equations in the form of

B_i = E_i + \rho_i \sum_{j=1}^{N} B_j F_{ij}    (4)

which are well known from the radiosity literature. However, for thermal radiation the right-hand side term E_i adds some special difficulties. Since it represents the emission of a surface, and all surfaces have a temperature that must be computed for the thermal balance, all surfaces have a value different from zero. For light computation, only the emitters (lights) would have a non-null value and the surrounding surfaces would only reflect the luminous flux. The above would not imply increased difficulty if, according to heat radiation theory, a linear dependence between reflectivity (\rho) and temperature (T) took place. Unfortunately, the radiative thermal emission of a surface is non-linear: it depends on the fourth power of the corresponding temperature (Equation (5)):

E_i = \epsilon_i \sigma T_i^4    (5)

Regarding the definition of a black body (an ideal body which absorbs all radiation impinging on it, reflecting none), its total emissive power is described by the Stefan-Boltzmann law, while the directional distribution of the radiation intensity follows Lambert's cosine law. The spectral distribution of the radiation intensity could be given by the Planck distribution law, but such computations are beyond the goals of this work. Summarizing, for two Lambertian surfaces with a constant proportionality factor called emissivity (\epsilon), considering the radiation between them at different temperatures and situated in a non-absorbing medium, the net exchange of energy between both is given by Equation (6):

Q_{dif} = q_{1 \to 2} - q_{2 \to 1} = \sigma \left( \epsilon_1 T_1^4 A_1 F_{A_1 A_2} - \epsilon_2 T_2^4 A_2 F_{A_2 A_1} \right)    (6)
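In code, Equation (6) is a direct transcription; the function name and units are ours (temperatures in Kelvin, areas in square metres):

    #include <cmath>

    const double SIGMA = 5.67e-8;  // Stefan-Boltzmann constant, W/(m^2 K^4)

    // Net radiative exchange between two Lambertian surfaces in a
    // non-absorbing medium, following Equation (6).
    double netRadiativeExchange(double eps1, double T1, double A1, double F12,
                                double eps2, double T2, double A2, double F21)
    {
        double q12 = SIGMA * eps1 * std::pow(T1, 4.0) * A1 * F12;
        double q21 = SIGMA * eps2 * std::pow(T2, 4.0) * A2 * F21;
        return q12 - q21;  // positive: surface 1 loses energy to surface 2
    }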

On the other hand, heat transfer by convection depends on the process of conduction in the region of the fluid near the solid's surface and on the fluid's velocity. Its formalism is still rather empirical. The three mechanisms are distinctly different: they differ in their analysis and are distinctly perceived by humans. They do, however, retain common characteristics like the dependency on the temperature and the dimensions of the objects in consideration. The proposed concept for a thermal-haptic display, due to the interdisciplinarity of the domains involved, brought together constraints and guidelines from human physiology to the specialized electronics. The surface temperature and directional heat fluxes should be taken into account, which means



that the output devices should simulate the three heat transfer modes. On the other hand, on-line control and regulation of the power units for the temperature sources is demanded (Fig. 5).

In order to test the environment with respect to thermal properties, a set of distinct objects with distinct semantics, i.e. with different physically-based behaviour, was developed. Such objects stand for the heat sources representing the three heat transfer modes.

3.2.2. ThermoPad: a haptic-temperature display. The global integration is reached with the help of a special hardware output device [17]. This is composed of a control system (a SAB 80C535 microcontroller) and different instruments for temperature perception, namely infrared bulb lamps, small fans, and Peltier elements. The infrared lamps serve to warm a large area, the ventilators serve to cool it. The Peltier elements are placed on the skin surface; their purpose is the local warming and cooling of the skin. For the production of a spatial temperature feeling, distinct types of devices were coupled, and each module was repeated in distinct directions. Ventilators and infrared lamps are joined into units. The Peltier elements can be placed at different places on the body surface. Given these attributes, the Peltier elements seem to be highly suitable for cooling tasks where space, performance and a natural environmental range of temperatures play an important role. Thus, on a surface of 15 mm x 15 mm x 4.6 mm (in our case), each of the Peltier elements used attains a cooling capability of up to approximately 20 W or a temperature difference of up to 60 K. Furthermore, unlike other well-known types of cooling units, the output of these devices can be regulated continuously and relatively simply. Basically, a Peltier element consists of two semiconductor limbs, one being n-type, the other p-type. The two limbs are linked at one end by a copper bridge. If a DC current flows through this element, one side cools and consequently extracts thermal energy from its surroundings. The thermal energy absorbed by the cold side is given off again into the surroundings of the warm side. The principle consists in 'pumping' thermal energy from the cold side to the warm side. By changing the polarity of the current it is possible to reverse the direction of the heat flow.
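A minimal control sketch, assuming a simple proportional law and an invented current limit (this is our illustration, not the ThermoPad firmware):

    // Proportional controller driving one Peltier element towards a
    // target skin temperature. The sign of the returned current selects
    // the polarity, and thus the direction of the heat flow.
    double peltierCurrent(double targetTempC, double skinTempC, double gain)
    {
        double error = targetTempC - skinTempC;  // in Kelvin
        double current = gain * error;           // signed: sign selects polarity
        const double maxCurrent = 2.0;           // assumed element limit, amperes
        if (current >  maxCurrent) current =  maxCurrent;
        if (current < -maxCurrent) current = -maxCurrent;
        return current;
    }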

3.2.3. Sensorial perception and interaction in virtual scenarios. Infrared lamps and distinct ventilators, positioned normal to the 3-D cursor in the virtual scene (the one which represents the human sensor), are able to simulate distributed sources of heat and cold with different intensities. For each (lamp, ventilator) set, distinct temperature levels can be used, depending on the variation of the power supply intensity. For the case of radiative heat transfer, the cursor is a virtual pyramid with three faces, each representing a directional stimulus from the thermal environment towards the virtual user. The resulting on-line computed heat flux activates the lamps at distinct levels of intensity. This scheme is possible because of their directional behaviour. If more directions are to be obtained, then the cursor has to be correspondingly refined to provide the same number of faces. Computed heat values are visualized as pseudo-colour textures mapped on the object's surface. A three-dimensional representation is also possible by means of a virtual plane which can be interactively moved (see Fig. 6).

Fig. 5. Concept for heat sources distribution.



A slightly different case is the convection simulation by means of a toaster. This time the particles follow a random path within a preferential direction, and haptic values are obtained upon collision of the hand with the particles through a bounding-box approach. Different particles have different colours depending on their radial position and distance from the toaster. Colours are assigned to temperatures, and when a collision occurs the hand takes the colour of the particle. A callback action updates the status of the haptic devices (see Fig. 7). Changes in the devices' activation and the regulation of intensities, among others, are tasks that an intelligent control device can perform. In this case, a microcontroller is the usual and suitable solution, since it allows direct hardware communication with the Virtual Reality system, and the connection of peripherals (heat and wind sources, sensors). The connection to the Virtual Reality system is made over a serial interface (RS232).
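In outline, the callback that updates the device status could pack the computed intensity levels into a small command and push it over the serial line; POSIX I/O is shown, and the framing bytes are our invention, not the actual ThermoPad protocol:

    #include <cstdint>
    #include <unistd.h>  // POSIX write()

    // One status update for the microcontroller over the RS232 line.
    // 'fd' is an already opened and configured serial port; the three
    // intensity values (0-255) for lamp, fan and Peltier element follow
    // a one-byte header.
    void sendThermalUpdate(int fd, uint8_t lamp, uint8_t fan, uint8_t peltier)
    {
        uint8_t frame[4] = { 0xA5, lamp, fan, peltier };
        write(fd, frame, sizeof(frame));
    }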

3.3. Motion feedback
Properties of the vestibular system in the inner ear, which perceives translational and angular stimuli, form the basis of motion simulation. Because of its physiological properties, the vestibular system is not able to distinguish whether it is accelerated translationally or just slightly rotated. So it is possible to produce a sustained impression of acceleration (sustained motion cue) by rotating the user and presenting him/her with visual feedback of translational acceleration in conformance with the degree of rotation. In this context the conformity of haptic and visual stimuli is of prime importance. To avoid motion sickness it is important to beware of contradicting haptic and visual stimuli.

A VR environment makes special demands on a motion simulation system. The following points should be considered:

• proper dimension and type of motion base;

• no distortion of tracking values through platform motions;

Fig. 7. Convective heat distribution: a case for collision detection.



• quick procedures to calculate the position and motion of the system;

• no decrease of navigation facilities due to motion feedback.

There are many possibilities for integrating a motion base into a VR environment [19]. To evaluate the different configurations, the installation effort, the difficulties that arise and especially the degree of immersion should be considered. The tracking system of an HMD or dataglove, if used, should always be fixed directly on the motion base to avoid distorted tracking values.

Since February 1996 the Fraunhofer IGD has owned a motion base (see Fig. 8) to simulate vestibular feedback. Based on a special construction, first described by Stewart in 1965, this hydraulically powered, silent-running motion base is able to translate (up to 0.25 m) and to rotate (up to 45°) its payload (which may be up to 300 kg) in all DoF's with short accelerations of up to 5 g.

A first prototype application was developed in order to support psychologists in helping people with fear of flying. International studies state that 25-50% of all airline passengers suffer from fear of flying. Therefore a small team of psychologists and psychiatrists, together with the IGD VR team, came up with the idea of using VR for therapy by confronting patients with their fears. Thereby the feeling of motion is, besides the visual and acoustic cues, a main trigger of this fear.

The process which approximates the user-demanded acceleration with the possible movements and rotations of the motion base is called the washout process. To separate this process from the rendering loop, we developed a 'motion simulation server'. It runs on the computer to which the motion base is physically connected and is controlled by the VR application, which communicates with the server using the interaction toolkit. Depending on the chosen washout process, the VR application has to send a combination of accelerations, velocities or orientation values.
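A classical washout combines a high-pass filtered onset cue with tilt coordination for the sustained part. In skeletal form, for one axis (our simplified sketch, not the server's actual filter):

    #include <cmath>

    // One axis of a simplified classical washout. The onset of the
    // demanded acceleration is reproduced as a transient platform
    // translation (high-pass part); the sustained remainder is
    // converted into a tilt angle so that gravity supplies the
    // acceleration cue ('tilt coordination').
    struct Washout {
        double prevIn  = 0.0;   // previous demanded acceleration
        double prevOut = 0.0;   // previous high-pass output
        double alpha   = 0.95;  // high-pass coefficient (depends on the time step)

        void step(double demandedAccel,    // m/s^2, from the VR application
                  double& platformAccel,   // transient translational cue
                  double& tiltAngle)       // sustained cue, radians
        {
            // discrete first-order high-pass: keeps only the onset
            double hp = alpha * (prevOut + demandedAccel - prevIn);
            prevIn  = demandedAccel;
            prevOut = hp;
            platformAccel = hp;

            // the low-frequency remainder is simulated by tilting the user:
            // a tilt of theta yields g*sin(theta) of perceived acceleration
            const double g = 9.81;
            double s = (demandedAccel - hp) / g;
            if (s >  1.0) s =  1.0;
            if (s < -1.0) s = -1.0;
            tiltAngle = std::asin(s);
        }
    };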

The resulting VR scenario makes it possible to experience a virtual flight using visual, aural and haptic stimuli. Starting in the waiting room of the airport, the user (patient), who can freely look around, is led to his seat in the airplane. The typical narrowness of this place is reinforced by additional virtual passengers nearby (see Fig. 9).

Besides the specific properties of the motion base, like range, velocities and accelerations, it is also possible to adjust the physiological properties (like perception and tolerance thresholds for translations and rotations, respectively) for the user. This is important to obtain an effective simulation of accelerations, because of the different perceptual capabilities of individuals.

The operator (psychiatrist) starts the simulated flight, which begins with a greeting from the captain and some safety announcements. Meanwhile the virtual airplane taxis to the runway and takes off. Now the user perceives some curves and turbulence. The operator is able to start the landing process at any time. During all this time the operator is able to scale the strength of the motion rendering, which affects the intensity of the accelerations occurring during take-off, as well as during turbulence and landing. Thus, it is possible to lead the patient slowly to more and more extreme situations. A sample testing of persons suffering from fear of flying showed that they felt present in the virtual scene [18].

4. DISCUSSION OF ACHIEVED RESULTS


The inclusion of commercially available devices, namely the PHANTOM, into the VR Arthroscopy Training Simulator is emphasized. The VR Arthroscopy Training Simulator was the first application where we integrated the PHANTOM. The controlling update rate of this force feedback device is very high (500-1000 Hz) compared to the graphics update rate (10-60 Hz). To guarantee both update rates and to synchronize both simulations (force feedback and visual feedback), the whole system is distributed. This allows the processing of the simulations on a multiprocessor workstation or on different workstations. In the current configuration the force feedback simulation is running on a mid-sized graphics workstation, whereas the visualization is running on a high-end graphics workstation.

Fig. 8. IGD’s motion platform.




Fig. 9. Virtual passengers.


We have given a description of a complete thermal feedback system consisting of different components devoted to modelling, controlling and physically generating the adequate stimuli on the operator's skin (hand and exposed parts of the body). The so-called VR-thermal demonstrator, although still needing thorough technical improvement, offers a good starting point for broader investigations. Within a computer simulation, the prototype can be used for ergonomics in the normal comfort temperature range (e.g. the simulation of an office occupation), telepresence (signal transfer from a remote site), chemical reactions, as well as for a warning variable (proximity), collision recognition or just sensory substitution. In the simulation of environmental temperatures, a directional feeling, as well as a perception of different levels of intensity, was achieved by the devices used (infrared lamps and small fans). Variables like room temperature or occasional local changes were very difficult to control, and only a qualitative analysis could be performed. In the case of the Peltier elements, the dimensioning had some practical problems too. One calculates, of course, in principle, which performance is required, although the data about heat capacity and operational losses in contact with the skin surface are not constant. Especially interesting is how quickly the temperature of the skin can be changed. These characteristics are typical for heat phenomena, since time is a fundamental element of heat propagation within a material. The molecular processes involved are not instantaneous (or even approximately so) and therefore distinguish themselves from the usual approximations of visual or acoustic simulation, which can be treated as propagating at nearly the speed of light. These considerations apply to heat radiation through the air (which can be approximated as being transparent to radiation) but not within surfaces. Last but not least, motion simulation is highlighted as a major contribution for navigation towards an immersive experience.


Acknowledgements - We wish to thank Prof. Dr.-Ing. José L. Encarnação for providing the environment in which this work was possible. Furthermore, we wish to thank Prof. Dr.-Ing. H. Weißmantel and Christian Kunstmann from the Institute for Mechatronics for their support of the project. We also thank all our colleagues and the students at our labs. Without their work we would not have been able to achieve the results presented in this paper.

REFERENCES

1. Astheimer, P., Dai, F., Felger, W., Göbel, M., Haase, H., Müller, S. and Ziegler, R., Virtual Design II - an advanced VR system for industrial applications. In Proceedings Virtual Reality World '95, Stuttgart, 1995, pp. 337-363.

2. Bauer, A., Soldner, E. H., Ziegler, R. and Müller, W., Virtual reality in the surgical arthroscopical training. Presented at the Second International Symposium on Medical Robotics and Computer Assisted Surgery (MRCAS), November 5-7, 1995.

3. Ziegler, R., Müller, W., Fischer, G. and Göbel, M., Virtual reality medical training system. In Computer Vision, Virtual Reality and Robotics in Medicine, ed. N. Ayache. Springer-Verlag, Berlin, 1995, pp. 282-286.

4. Massie, T. and Salisbury, J., The PHANTOM haptic interface: a device for probing virtual objects. In Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Chicago, IL, November 1994.

5. Massie, T., Initial haptic explorations with the PHANTOM: virtual touch through point interaction. MSc Thesis, submitted to the Department of Mechanical Engineering, February 1996.

6. Bockholt, U., Deformation von anatomischen Strukturen in virtuellen Umgebungen. Thesis, Fachbereich Mathematik, Johannes Gutenberg Universität Mainz, Mainz, January 1997 (in German).

7. Mark, W. R., Randolph, S. C., Finch, M., Van Verth, J. M. and Taylor II, R. M., Adding force feedback to graphics systems: issues and solutions. In Computer Graphics Proc. SIGGRAPH '96, Annual Conference Series, Addison-Wesley, Reading, MA, 1996.

8. Zilles, C. B. and Salisbury, J. K., A constraint-based god-object method for haptic display. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, August 5-9, 1995.

9. Buss, W., Kraftregelung für haptisches Display eines VR-Arthroskopiesimulators. Report, Institute for Mechatronics, TH Darmstadt, 1996 (in German).

10. Ziegler, R., Brandt, C., Kunstmann, C., Müller, W. and Werkhäuser, H., Haptic display for the VR Arthroscopy Training Simulator. To be published in Proc. of IS&T/SPIE's 9th Annual Symposium EI'97 - Electronic Imaging: Science and Technology, San Jose, CA, 1997.

11. Brandt, C., Konzept zur Integration eines haptischen Displays in ein VR System. Thesis, Fachgebiet Graphisch-Interaktive Systeme, Fachbereich Informatik, TH Darmstadt, Darmstadt, 1995 (in German).

12. Henrich, V., Integration eines haptischen Displays in eine VR-Applikation. Thesis, Fachbereich Informatik, FH Darmstadt, Darmstadt, January 1997 (in German).

13. Alheit, B., Göbel, M., Mehl, M. and Ziegler, R., CGI und CGM - Graphische Standards für die Praxis. Springer-Verlag, Berlin, 1991.



14. Ziegler, R., Haptic displays - how can we feel virtual environments? STAR Report, in Proc. Eurographics '96, Poitiers, France, August 26-30, 1996.

15. Burdea, G., Research on portable force feedback masters for virtual reality. In Proceedings Virtual Reality World '95, 1995, pp. 317-324.

16. Mark, W. R., Randolph, S. C., Finch, M. and Van Verth, J. M., UNC-CH force-feedback library, revision C.0. Computer Science Department, University of North Carolina at Chapel Hill, Chapel Hill, NC, February 1996.

17. Dionisio, J., Virtual hell: a trip through the flames. To appear in IEEE Computer Graphics and Applications.

18. Douloumi, E., Einsatz von virtueller Realität in der Behandlung von Flugangst. Thesis, Fachgebiet Graphisch-Interaktive Systeme, Fachbereich Informatik, TH Darmstadt, Darmstadt, June 1996 (in German).

19. Jakob, U. and Douloumi, E., Let's move - on the integration of motion-rendering in VR. To be published in Proceedings of IS&T/SPIE's 9th Annual Symposium EI'97 - Electronic Imaging: Science and Technology, San Jose, CA, 1997.