
Immersive Escape-Route Scenario with Locomotion Devices

Thomas Jung, Ivo Haulsen
Institute for Computer Architecture and Software Technology
German National Research Center for Information Technology

Kekulestr. 7, D-12489 Berlin, GERMANY

+49-30-6392-1800, tj,[email protected]

ABSTRACT

The simulation of damaged buildings is of great interest. Virtual-reality technologies allow us to experience such a situation. A main concern here is escaping from the building. To simulate this scenario, a model of the environment and an interface for locomotion are needed. Locomotion inside virtual buildings requires special input devices. This paper discusses several simulation methods for human locomotion in an escape-route scenario. We present some results for a new locomotion device for escaping from a virtual house that is on fire.

KEYWORDS: Locomotion, full-motion interface, escape-route scenario, cost-efficient device

INTRODUCTION

Technological advances in the areas of computer-graphics hardware and virtual-reality technology have made it possible to develop virtual environments for a wide class of applications. One of these application areas is architectural design.

Escape Route Scenario

Buildings and architectural sites can be simulated visually to enable future users to inspect buildings during the planning phase. If users detect weak points during a virtual walk-through, the design of the building can be changed, thus avoiding costly reconstruction measures.

To detect weak points, the relevant properties of the building have to be simulated. For instance, to check the lighting conditions at a particular workplace, all the characteristics that influence illumination (daylight, artificial lighting, transparency of building materials, diffuse interreflections, etc.) have to be properly simulated.

A key concern when planning a new building is user safety. Escape routes have to be carefully planned to ensure that visitors can leave the building quickly in case of an emergency. One such situation is a building that is on fire. Burning buildings can be visualized and inspected interactively (e.g. [1]).

In what follows we discuss how escaping from such a building can be simulated by means of immersive visualization and interaction technology. Users are placed in a virtual burning building with which they have no previous acquaintance. They are instructed to leave the building as quickly as possible, following the escape-route signs.

If the perception of users in the virtual environment and in reality were identical, users would not be able to distinguish between the simulation and reality, and their behavior could be transferred to a real-life situation. To enable users to forget that they are moving in a virtual environment, the degree of immersion must be as high as possible. One main concern, then, is to simulate the visual (e.g., impairment of vision by smoke) and auditive stimuli as realistically as possible. Another concern is supporting the realistic impression of locomotion.

Locomotion scenarios using vehicles (e.g., flight simulation, driving simulation) can be simulated in acceptable quality using an existing human-machine interface. Since there is a real cockpit in most flight-simulation systems, cutaneous stimuli do not have to be simulated. The system only simulates vision, audio and aircraft dynamics.

There is no consensus on how to technically realize natural human locomotion (e.g., walking, running). One approach designed to enhance the quality of perception is intersensory integration. The human sensory apparatus is able to combine information incoming from several sense modalities to create a single, coherent sensation [2]. Even where the same environmental status is channeled across multiple modalities, the redundancy is not wasteful [3, 2].

Virtual Reality Technology

In multimodal virtual environments, not only the visual sense but also the auditive, cutaneous, kinesthetic or vestibular senses may be stimulated to enhance immersion. The visual, auditive and cutaneous senses can be stimulated directly by means of technical devices (display, headphones, haptic devices, etc.), but the kinesthetic and vestibular stimuli can only be achieved indirectly because the receptors are inside the human body.


Figure 1: Full-motion interfaces: (a) Spatial Navigator, (b) Omni-directional treadmill [15] (copyright by Darken et al.), (c) Cybersphere [16] (copyright by Findlay Publications)


There are two main approaches for implementing immersive virtual environments: spatially-immersive displays (SIDs) and head-mounted displays (HMDs). A SID is a display that physically surrounds the viewer with a panorama of imagery [4] (e.g., the Cave Automated Virtual Environment [5]). Auditive stimuli may be supplied via multiple loudspeakers. An HMD [6] consists of two small displays mounted on the head in front of the viewer's eyes. Auditive stimuli may be supplied via headphones. In both approaches, head tracking is used to determine the viewer's location and orientation for image generation.

Some research is currently being done on olfactory stimulation for virtual environments [7], but the realistic simulation of odors for virtual buildings that are on fire has not yet been achieved.

The cutaneous senses may be stimulated via a haptic interface, which applies force feedback to the user's skin while they are interacting in these environments. There are some approaches that apply force feedback to the user's hand or arm (e.g., [8, 9]), because most manipulations of the environment involve use of the hand.

Burdea and Coiffet [10] propose using a light-weight force-amplifier suit called "Jedi" to apply force feedback to the whole body. However, there is currently no implementation of whole-body force feedback in virtual environments. For navigation tasks, force feedback may be neglected. For example, doors can be opened automatically, and feedback in collision situations can be rendered by sound alone.

During a virtual walk-through, it is the kinesthetic and vestibular senses that are addressed. To properly stimulate these senses, users have to perform motions similar to those in a natural environment. Human beings are able to perceive positions and accelerations of their limbs as well as acceleration in space, rotation and orientation in the gravitational field. To achieve realistic simulation, the exertion caused by movements should be similar to natural locomotion.

LOCOMOTION IN VIRTUAL ENVIRONMENTS

In many applications, the virtual environment has larger dimensions than the physical workspace. A device has to transform physical motions into motions in a virtual environment. Manual navigational control (e.g., using the mouse, keyboard, or joystick) is provided for many applications. Motion interfaces that involve other parts of the human body relieve the hands of the task of navigational control, leaving them free for other more natural manual manipulations.

Sufficient-motion interfaces activate the kinesthetic and vestibular senses [11] such that self-motion cues are sent to the brain through the initiation of the appropriate muscular contractions and translational and rotational accelerations [12, 13]. One such interface is the Virtual Motion Controller (VMC) [11]. Participants stand on a platform, controlling the motion direction by changing their center of gravity. Head orientation is tracked by a magnetic tracking system, enabling independent control of gaze and travel direction. Peterson showed that wayfinding is influenced by the motion interface: users performing a survey-related wayfinding task with the VMC in a spatially complex environment are faster than those using a joystick device [14]. Peterson suggests incorporating the natural walking motion to reduce participants' fatigue and increase the naturalness of interaction.
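
To make the control principle of the VMC concrete, the following sketch shows one plausible mapping from a sensed center-of-gravity offset on the platform to a travel velocity. The class, the dead zone and the gain are our own illustration, not Peterson's implementation.

/* Illustrative sketch (not the actual VMC code): maps the user's
 * center-of-pressure offset on a sensing platform to a travel velocity. */
public class CenterOfGravityControl {
    private static final double DEAD_ZONE = 0.05; // meters; ignore small postural sway
    private static final double GAIN = 4.0;       // (m/s) per meter of lean

    /* dx, dy: center-of-pressure offset from the platform center in meters.
     * Returns a 2D travel velocity in the horizontal plane. */
    public double[] travelVelocity(double dx, double dy) {
        double r = Math.hypot(dx, dy);
        if (r < DEAD_ZONE)
            return new double[] { 0.0, 0.0 };     // standing still
        double speed = GAIN * (r - DEAD_ZONE);    // leaning further means moving faster
        return new double[] { speed * dx / r, speed * dy / r };
    }
}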

Full-motion interfaces

Full-motion interfaces promise a more natural involvement of most of the integrated modalities. The implementation of the interface depends on the desired motion. In fact, there are four types of full-motion interfaces for human walking: treadmills, foot followers, cyberspheres and sliding devices.

The simplest type of treadmill, which is used for navigating in virtual environments, is a user-driven uni-directional treadmill (e.g., [17, 18], see Fig. 1(a)). The user holds on to a steering bar while walking on the treadmill.


Figure 2: Architecture of the FACIT system (HMD with tracker, VRML-based visualization environment, Virtual Walker I and II, position server, fire simulation, FM system and VRML file, connected to the browser via the EAI)

If the position of the user is tracked, the treadmill can be driven by a motor. It is then not necessary to hold on to a bar. The system must always bring the user back to the center of the device. If the user changes speed, the speed of the belt has to be adapted accordingly. Abrupt changes in the speed of the belt result in unnatural forces, which may cause the user to stumble. Users must learn to handle this device. Treadmills can also be used to simulate sloping ground and force feedback [19].
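
The recentering logic described above can be sketched as a proportional controller whose output is rate-limited so that the belt speed never changes abruptly. The controller structure, names and constants are illustrative assumptions, not taken from any of the cited systems.

/* Illustrative controller for a motor-driven treadmill: keeps the tracked
 * user near the center of the belt while limiting belt acceleration. */
public class TreadmillController {
    private static final double GAIN = 1.5;      // belt speed (m/s) per meter of offset
    private static final double MAX_ACCEL = 0.8; // m/s^2; limits abrupt speed changes
    private double beltSpeed = 0.0;              // current belt speed in m/s

    /* userOffset: user's position relative to the belt center in meters
     * (positive = in front of the center); dt: control period in seconds. */
    public double update(double userOffset, double dt) {
        double target = GAIN * userOffset;       // drive the user back to the center
        double delta = target - beltSpeed;
        double maxDelta = MAX_ACCEL * dt;        // rate limit to avoid stumbling
        if (delta > maxDelta) delta = maxDelta;
        if (delta < -maxDelta) delta = -maxDelta;
        beltSpeed += delta;
        return beltSpeed;
    }
}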

Walking in any direction is supported by omni-directional treadmills (ODTs). Darken et al. built an ODT using two perpendicular treadmills [15] (see Fig. 1(b)). The top belt consists of freely rotating rollers which are driven by another, orthogonally oriented belt below. Maximum user velocity is 2 m/sec. Darken et al. observed that users turn slowly and do not suddenly accelerate or decelerate when using the ODT in order to avoid unexpected system responses. Darken et al. state that "skill level plays too important a role in determining the usability of the system" [15].

Foot followers consist of two small movable platforms. In simple types of foot-following systems, the platforms are foot-driven. Examples of such devices, called "Virtual Walker", are given in the next section.

Latham suggests using a more sophisticated foot-following device in which small platforms move automatically to the positions users want to step on [20]. Systran Corporation proposes a device called LocoSim. Boot plates, which are connected to the users' feet, can be moved individually to various positions. The device is able to support walking, running and crawling over different terrains as well as ascending or descending stairs [21]. There are, to our knowledge, no publications so far on the implementation of these two proposals.

A cybersphere is a large rotating sphere driven by a user walking on the inside. Images of the environment are projected onto the surface of the sphere. The user can walk in any direction. Movement is detected by another ball pressed against the underside of the sphere. An implementation of the cybersphere, with a diameter of 3.5 m, was developed by J. Eyre (see Fig. 1(c)). It weighs 270 kg and presents about twice the inertia of a man starting or stopping walking. Abrupt changes of speed are therefore not possible with the current demonstrator. Eyre plans to integrate a servo drive system; the ball would then be rotated by the system according to the movements made by the user [16].

Iwata and Fujii developed an omni-directional sliding device called the "Virtual Perambulator" [22]. Users have to wear special shoes with a low-friction film on the sole. Rubber soles attached to the top of the shoes are used as brake pads. Users can physically walk and turn inside a hoop with a diameter of 70 cm. Users must hold the hoop to keep their balance. Trained users may push their waists against the hoop while walking [22].

Relation between gaze direction and motion direction

One basic requirement of motion interfaces is that users are able to control the motion direction. Most of the described interfaces (e.g., [15, 16, 22]) allow users to look in any direction while moving forward.

Independent control of gaze and travel direction was a basic design goal of the VMC [14]. Peterson asserts that people frequently walk along a straight path while rotating their heads to the left and right to scan the environment. However, he observed that participants did not attempt to use the independent control of gaze and travel direction provided by the VMC. Participants actually became disoriented if gaze and travel direction were misaligned while they were moving [14].

Pausch et al. fielded a virtual environment at the EPCOT center for a period of fourteen months, during which they observed the reactions of over 45,000 guests. Most users refrained from turning their heads much while flying a virtual magic carpet [23]. Pausch et al. assume that users did not exploit this possibility because head tracking is far enough removed from most guests' experience with film and television. Another explanation is that users piloting a vehicle usually look in the direction they are moving in.


Figure 3: Bird's-eye view of the spread of temperature in a simulated floor


We claim there is a dependence between velocity and the tendency to look in the direction of motion. Users who are standing still may look around for orientation; if they are running, they always look in the direction of motion. Often, the environment is scanned by simply moving the eyes to the left or right, while the orientation of the head remains aligned to the motion direction. Following this assumption for an escape-route scenario, head tracking can be used for controlling motion direction, resulting in a much simpler system design.

THE FACIT SYSTEM

The FACIT project ("Extending Facility Management Systems by Immersive Rendering and Interaction Techniques") sets out to examine how tasks from the area of facility management (FM) can be performed by using an immersive user interface. Since many FM tasks can be satisfactorily performed by using simple user interfaces, there is no point in developing a new immersive FM system. Instead, an independent immersive user interface is being developed which can be connected to various commercial FM systems.

The architecture of the FACIT system is shown in Figure 2. The immersive user interface is based on a VRML97 (Virtual Reality Modeling Language) [24] browser. Various FM systems as well as a fire-simulation system can be connected via the external authoring interface (EAI) [25]. Immersive input devices modify the viewer's position in the VRML scene via the EAI, too. The browser is executed in full-screen mode. The output is displayed on an HMD.
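
As an illustration of this coupling, the following sketch shows how a position update could be pushed into the VRML scene through the Java EAI bindings [25]. The DEF name "UserView" and the surrounding class are hypothetical; only the Browser, Node and EventInSFVec3f calls belong to the EAI itself.

import vrml.external.Browser;
import vrml.external.Node;
import vrml.external.field.EventInSFVec3f;

/* Sketch: an input device moves the viewer by sending position events
 * to a DEF'ed Viewpoint in the VRML file. */
public class ViewerPositionUpdater {
    private final EventInSFVec3f setPosition;

    public ViewerPositionUpdater(Browser browser) {
        Node viewpoint = browser.getNode("UserView"); // hypothetical DEF name
        setPosition = (EventInSFVec3f) viewpoint.getEventIn("set_position");
    }

    /* Called by the position server with the user's new location. */
    public void moveTo(float x, float y, float z) {
        setPosition.setValue(new float[] { x, y, z }); // routed into the scene
    }
}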

Escape-route scenario

We tested several FM applications with our immersive user interface. One of these was the escape-route scenario: a user has to escape from an unknown virtual building that is on fire as quickly as possible, guided by escape-route signs. The system allows us to study the impact of modifying escape-route design (from changing the positions of escape-route signs to a complete redesign of the floor architecture) to reduce the times needed to reach the exit (RE times). If a reduction in RE times in the virtual environment could be transferred to reality, we would have a powerful tool for planning and validating escape routes.

Figure 4: Smoke-filled floor after (a) 170 sec. and (b) 480 sec.

The CFAST system [26] is used to compute the spread of fire and smoke. It provides a simulation of the impact of fire and its byproducts inside a building. (Figure 3 shows the spread of temperature in a virtual test floor after some minutes.)

As the user walks through the building, visibility is changed according to the simulation results. We implemented smoke-filled floors by inserting semi-transparent polygons with animated textures into the scene (see Figure 4), because volume rendering is currently not supported in VRML. To increase immersion and give the user some idea of the source of the fire, a position-dependent stereo sound of crackling is supplied via the headphones of the HMD.
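
A sketch of how such a smoke polygon could be created and inserted at run time through the EAI is given below. The VRML fragment, the movie file name smoke.mpg and the DEF name "SmokeGroup" are placeholders, not the actual scene content.

import vrml.external.Browser;
import vrml.external.Node;
import vrml.external.field.EventInMFNode;

/* Sketch: insert a semi-transparent polygon with an animated texture
 * into the scene to suggest a smoke layer at a given height. */
public class SmokeLayer {
    public static void addSmoke(Browser browser, float height, float transparency) {
        String vrml =
            "Transform { translation 0 " + height + " 0 children Shape {\n" +
            "  appearance Appearance {\n" +
            "    material Material { transparency " + transparency + " }\n" +
            "    texture MovieTexture { url \"smoke.mpg\" loop TRUE }\n" + // animated texture
            "  }\n" +
            "  geometry IndexedFaceSet {\n" +                              // one horizontal polygon
            "    coord Coordinate { point [ -10 0 -10, 10 0 -10, 10 0 10, -10 0 10 ] }\n" +
            "    coordIndex [ 0 1 2 3 -1 ] solid FALSE\n" +
            "  }\n" +
            "} }";
        Node[] smoke = browser.createVrmlFromString(vrml);
        Node group = browser.getNode("SmokeGroup");  // hypothetical DEF'ed Group
        ((EventInMFNode) group.getEventIn("addChildren")).setValue(smoke);
    }
}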

Locomotion devices

After conducting a few experiments, we realized that a simple hand-controlled navigation metaphor would not be realistic enough to transfer RE times to reality. We therefore decided to integrate a full-motion interface into our virtual environment. Our first attempt was to integrate a uni-directional treadmill called the "Spatial Navigator" [18] into our system. However, tests with several users demonstrated that the approximation of human locomotion was not realistic. Since users had to hold on to a steering bar, they said they felt as if they were pushing a perambulator or a trolley.


Figure 5: Virtual Walker I in (a) walking mode and (b) orientation mode

Later, we considered using an ODT or a cybersphere. However, neither of these systems supports running and sudden changes of speed [15, 16]. Locomotion with the sliding device [22] did not appear realistic enough because users have to hold on to a hoop. We therefore began developing a new device meant to be suitable for the escape-route scenario.

The device has to be safe, easy to use and suitable for high velocities and abrupt changes of velocity. These requirements are met by fitness-training devices. We implemented two locomotion devices using mechanical parts from an "air walker" and an "elliptical trainer".

The first system, called "Virtual Walker I", was built from components of an "air walker". It was enhanced by a tracking system that measures the excursion of the platforms. The captured data is filtered and transformed into a linear movement. Users have to swing their legs to move forward. Given the device's symmetry of motion, it is not possible to distinguish between walking forward and backward. Walking backward is therefore not supported by the device. The system supports a natural walking rhythm (step size of 73 cm, 107 steps/min.). Since the resistance of the device is low, velocity can easily be changed. As the pelvis is nearly stable while walking, hands-free locomotion is possible. While the motion paths of the thighs and feet are quite realistic, many users do not bend their knees when walking on the device.
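
The following sketch shows one plausible version of this signal path: the raw excursion signal is low-pass filtered, and the step frequency estimated from its zero crossings is combined with the known step size to yield a forward speed. The filter coefficient and the zero-crossing scheme are illustrative assumptions rather than the device's actual processing.

/* Illustrative sketch of the Virtual Walker I signal path: platform
 * excursion is filtered and transformed into a linear forward speed. */
public class ExcursionToSpeed {
    private static final double ALPHA = 0.2;      // low-pass filter coefficient
    private static final double STEP_SIZE = 0.73; // meters per step (from the device)
    private double filtered = 0.0;                // filtered excursion signal
    private double lastSign = 1.0;
    private long lastCrossing = System.currentTimeMillis();
    private double speed = 0.0;                   // estimated forward speed in m/s

    /* excursion: signed platform deflection; call once per tracker sample. */
    public double update(double excursion) {
        filtered += ALPHA * (excursion - filtered);   // smooth the raw tracker data
        double sign = Math.signum(filtered);
        if (sign != 0.0 && sign != lastSign) {        // zero crossing = half a step
            long now = System.currentTimeMillis();
            double halfStep = (now - lastCrossing) / 1000.0;
            if (halfStep > 0.05)                      // debounce spurious crossings
                speed = (STEP_SIZE / 2.0) / halfStep;
            lastCrossing = now;
            lastSign = sign;
        }
        return speed;
    }
}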

The second device, called "Virtual Walker II", supports more realistic motion paths. The elliptical motion path of the platforms approximates the natural motion path of human feet when walking. The platforms serve as pedals, driving a flywheel. Unlike the first device, Virtual Walker II allows a distinction to be made between forward and backward locomotion. The flywheel's inertia impedes acceleration and deceleration. It takes some practice to use the device without holding on.

Figure 6: Virtual Walker II


Controlling motion direction

Both the air walker and the elliptical trainer are designed for walking in a straight line. To control motion direction, we first tried adding tracking components to the user's body (e.g., at the hip and shoulder). As the test persons using our device were unable, while walking, to rotate parts of their body in one direction while looking in another, we decided to use head tracking to control the motion direction.


Device                     | State      | Motion paths | Acting forces | Hands free | Velocity | Changes in velocity | Subjective assessment
Hand-controlled navigation | tested     | - - -        | - - -         | -          | +++      | +++                 | not immersive
Spatial Navigator [18]     | tested     | ++           | -             | - -        | ++       | ++                  | like pushing a perambulator
CyberSphere [16]           | not tested | +++          | -             | +          | ++ (?)   | - -                 |
ODT [15]                   | not tested | +++          | -             | -          | +        | - - -               |
Virtual Perambulator [22]  | not tested | ++           | -             | - (?)      | +        | + (?)               |
Virtual Walker I           | tested     | +            | +             | +          | ++       | ++                  | like "leg in plaster"
Virtual Walker II          | tested     | ++           | -             | +          | ++       | ++                  | like riding a bicycle

(Motion paths, acting forces and hands free rate immersion; velocity and changes in velocity rate performance.)

Table 1: Comparison of locomotion devices for the escape-route scenario


We distinguish between a walking mode and an orientation mode. When users stand still, regular head tracking is executed. This allows them to look around for orientation. When users move their legs, gaze direction is employed to control motion direction. The gaze direction can be split into yaw, pitch and roll. Yaw is used to determine walking direction. As pitch and roll are still used for changing orientation, users are able to look up or down while walking through a virtual building. Yaw is mapped to an angular velocity which is combined with the linear velocity of the user according to the following equations:

    ω = α · ψ_yaw

    θ(t + Δt) = θ(t) + ω · Δt,    p(t + Δt) = p(t) + v · Δt · (cos θ(t), sin θ(t))

where ψ_yaw is the yaw angle of the head relative to the current motion direction θ, ω is the resulting angular velocity, v is the linear velocity derived from the locomotion device, and p is the viewer's position in the floor plane.

In our current implementation, α is set to one. A user that looks straight to the left rotates at an angular velocity of 90 degrees per second to the left, for example. When maneuvering a tight turn, users have to move their legs slowly while looking straight to the left or the right. We found that users learn to solve maneuvering tasks (e.g., turning round and walking away from a wall after a collision) very quickly.
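
A compact sketch of the resulting control loop, combining the walking/orientation mode switch with the yaw mapping above (α = 1), might look as follows; the class and its state layout are illustrative.

/* Sketch of the two-mode heading control: when the user stands still,
 * only head tracking is applied; when the legs move, head yaw steers. */
public class HeadingController {
    private static final double ALPHA = 1.0; // angular velocity per unit of head yaw
    private double theta = 0.0;              // current motion direction in radians
    private double x = 0.0, z = 0.0;         // viewer position in the floor plane

    /* headYaw: head yaw relative to the motion direction in radians;
     * v: linear speed from the locomotion device (m/s); dt: frame time (s). */
    public void update(double headYaw, double v, double dt) {
        if (v == 0.0)
            return;                          // orientation mode: look around freely
        double omega = ALPHA * headYaw;      // walking mode: yaw steers the motion
        theta += omega * dt;
        x += v * dt * Math.cos(theta);       // advance along the motion direction
        z += v * dt * Math.sin(theta);
    }
}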

RESULTS AND FUTURE WORK

We developed safe, cost-efficient and easy-to-use locomotion devices for walking through virtual buildings at high speed.

We tried out our devices on a large number of test persons. Nearly every visitor to our institute was asked to try out the system. The participants were asked informally what they thought of the system. Table 1 gives a summary and evaluation of our observations (+++ for best assessment, - - - for worst assessment). By way of comparison, we included in the table the locomotion approaches that were not tested in our escape-route scenario, reflecting our assessment of their suitability.

As almost all the participants were able to use our full-motion interfaces after a few seconds' practice, the turning interface would appear to be intuitive. Many participants were immediately able to move through the building at high speed. To accelerate and decelerate, users have to exert extra force using their leg muscles. Although the force pattern is not realistic, this may enhance immersion.

Our devices are still under development. One of our next steps will be to add a motor to our second device to drive the flywheel. This should help overcome some of the problems caused by the flywheel's inertia. Also, it should enable us to add force feedback.

After completing the motion interfaces, we plan to conduct systematic tests. By comparing motion paths through a virtual floor with paths captured by a tracking system on a real floor, we hope to prove that reductions in RE times due to changes in escape-route architecture can be transferred to reality.


REFERENCES

1. Bukowski, R. W., Sequin, C.: Interactive Simulation of Fire in Virtual Building Environments, Proc. of SIGGRAPH 1997, 35–44

2. Welch, R. B., Warren, D. H.: Intersensory Interactions, In K. Boff, L. Kaufman, J. Thomas (Eds.), Handbook of Perception and Performance: Vol. 1. Sensory Processes and Perception, 25-1–25-36, Wiley, New York, 1986

3. Welch, R. B., Warren, D. H.: Overview, Section 3: Basic Sensory Processes II, In K. Boff, L. Kaufman, J. Thomas (Eds.), Handbook of Perception and Performance: Vol. 1. Sensory Processes and Perception, III-3–III-7, Wiley, New York, 1986

4. Bryson, S., Zeltzer, D., Bolas, M. T., De La Chapelle, B., Bennett, D.: The Future of Virtual Reality: Head Mounted Displays Versus Spatially Immersive Displays, Proc. of SIGGRAPH 1997, 485–486

5. Cruz-Neira, C., Sandin, D. J., DeFanti, T. A.: Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE, Proc. of SIGGRAPH 1993, 134–142

6. Sutherland, I. E.: A Head-Mounted Three-Dimensional Display, AFIPS Conference Proceedings 1968, vol. 33, no. 1, 757–764

7. Youngblut, C., Johnson, R. E., Nash, S. H., Wienclaw, R. A., Will, C. A.: Review of Virtual Environment Interface Technology, IDA Paper P-3186, Institute for Defense Analysis, 1996, 209–216, also http://www.hitl.washington.edu/scivw/IDA/

8. Massie, T. H., Salisbury, J. K.: The Phantom Haptic Interface: A Device for Probing Virtual Objects, Proc. of the 1994 ASME International Mechanical Engineering Congress and Exhibition, Chicago, Vol. DSC 55-1, 295–302

9. Noma, H., Miyasato, T., Kishino, F.: A Palmtop Display for Dextrous Manipulation with Haptic Sensation, Proc. of CHI '96, 216–233, 1996

10. Burdea, G., Coiffet, P.: Virtual Reality Technology, John Wiley and Sons, Inc., New York, 1994, 338–340

11. Wells, M., Peterson, B., Aten, J.: The Virtual Motion Controller: A Sufficient-Motion Walking Simulator, HITL Technical Report R-96-4, Seattle, WA: Human Interface Technology Laboratory, 1996

12. Durlach, N. I., Mavor, A. S. (Eds.): Virtual Reality: Scientific and Technological Challenges, Washington, D.C.: National Academy Press, 1995

13. Lampton, D. R., Knerr, B. W., Goldberg, S. L., Bliss, J. P., Moshell, J. M., Blau, B. S.: The Virtual Environment Performance Assessment Battery (VEPAB): Development and Evaluation, Presence, 3, 145–157

14. Peterson, B.: The Influence of Whole-Body Interaction on Wayfinding in Virtual Reality, Master's thesis, University of Washington, 1998

15. Darken, R. P., Cockayne, W. R., Carmein, D.: The Omni-Directional Treadmill: A Locomotion Device for Virtual Worlds, Proc. of UIST '97, 213–221

16. VR Is Having a Ball, Eureka magazine, April 1998 Cover Story, also http://www.shelleys.demon.co.uk/fea-apr8.htm

17. Brooks, F. P., Jr.: Walkthrough: A Dynamic Graphics System for Simulating Virtual Buildings, In Pizer, S. and Crow, F. (Eds.), Proc. 1986 Workshop on Interactive 3D Graphics, 9–21, New York

18. Fleischmann, M.: Imagination Systems: Interface Design for Science, Art and Education, Proc. IMAGINA '95, Monte Carlo

19. Christensen, R., Hollerbach, J. M., Xu, Y., Meek, S.: Inertial Force Feedback for a Locomotion Device, Symp. on Haptic Interfaces, Proc. ASME Dynamic Systems and Control Division, DSC-Vol. 64, Anaheim, CA, Nov. 1998, 119–126

20. Latham, R.: Device for Three Dimensional Walking in Virtual Space, http://www.cgsd.com/OmniTrek.html

21. Youngblut, C., Johnson, R. E., Nash, S. H., Wienclaw, R. A., Will, C. A.: Review of Virtual Environment Interface Technology, IDA Paper P-3186, Institute for Defense Analysis, 1996, 197–198, also http://www.hitl.washington.edu/scivw/IDA/

22. Iwata, H., Fujii, T.: Virtual Perambulator: A Novel Interface for Locomotion in Virtual Environments, Proc. of VRAIS '96, 60–65

23. Pausch, R., Snoddy, J., Taylor, R., Watson, S., Haseltine, E.: Disney's Aladdin: First Steps Toward Storytelling in Virtual Reality, Proc. of SIGGRAPH 1996, 193–203

24. Hartman, J., Wernecke, J.: The VRML 2.0 Handbook: Building Moving Worlds on the Web, Addison-Wesley Developers Press, 1996

25. Marrin, C.: Proposal for a VRML 2.0 Informative Annex, External Authoring Interface Reference, 1997, http://www.cosmosoftware.com/developer/moving-worlds/spec/ExternalInterface.html

26. Peacock, R. D., Forney, G. P., Reneke, P. et al.: CFAST, the Consolidated Model of Fire Growth and Smoke Transport, NIST Technical Note 1299, U.S. Department of Commerce, Feb. 1993
