Virtual Reality - Sagarika Mehta

Contents

• Virtual Reality
• Need of Virtual Reality
• Virtual Reality Systems
• Hardware Used In Virtual Reality
• Virtual Reality Developing Tools
• The Future of Virtual Reality

Virtual Reality

Virtual reality is seeing an imaginary world rather than the real one: seeing, hearing, smelling, tasting, and feeling it. The imaginary world is a simulation running on a computer, and the sense data is fed to our brain by some system. In short, it is a medium composed of interactive computer simulations that give users the feeling of being present in the simulation.

Need Of Virtual Reality

1. Operations in dangerous environments

• There are still many examples of people working in dangerous or hardship environments who could benefit from the use of VR-mediated tele-operation.

• Workers in radioactive, space, or toxic environments could be relocated to the safety of a VR environment, where they could 'handle' hazardous materials without any real danger using tele-operation or tele-presence.

In the future - Tele-presence

2. Scientific Visualization

• Scientific Visualization provides the researcher with immediate graphical feedback during the course of the computations and gives him/her the ability to 'steer' the solution process.

• An application at NASA Ames Research Center is Virtual Planetary Exploration, which helps planetary geologists remotely analyze the surface of a planet. They use VR techniques to roam planetary terrains.

NASA VR Mars Navigation Simulation

Geologists remotely analyzing the surface of Mars at NASA.

3. Medicine

• Until now, experimental research and education in medicine have mainly been based on dissection and the study of plastic models. Computerized 3D human models provide a new approach to research and education in medicine, and carrying out medical research on virtual patients will become a reality.

• We will be able to create not only realistic-looking virtual patients but also their histological and bone structures, together with a simulation of the entire physiology of the human body.

Real 3D Ultrasound Experiment

4. Education and training

• The most common example is the flight simulator. This type of simulator has shown the benefits of simulation environments for training. They have lower operating costs and are safer to use than real aircraft.

• They also allow the simulation of dangerous scenarios that could not be practiced safely with real aircraft.

Virtual Reality Systems

VR systems can be divided into three groups:

– Non-immersive systems (like workstations): see information about the real world presented via computer, e.g. location-based services and GIS.

– Hybrid systems (graphics on top of the real world), also called augmented reality systems: stay in the real world, but see simulated objects.

– Immersive systems (like an HMD or a CAVE): see a simulated world and "be" in that simulated world.

Non-Immersive Systems

"Through-the-window": a large display, but one that does not surround the user.

Augmented Reality

Stay in the real world, but see simulated objects.

Information Visualization

More Augmented Reality

Stay in the real world, but see simulated objects.


Augmented reality can be used for training as well as for assembly purposes.

Immersive Systems (CAVE)

See a simulated world and "be" in that simulated world.

• The CAVE (Cave Automatic Virtual Environment) provides the illusion of immersion by projecting stereo images on the walls and floor of a room-sized cube (see the projection sketch after this list).

• Several persons wearing lightweight stereo glasses can enter and walk freely inside the CAVE.
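Since the CAVE's walls are fixed but the viewer moves, each wall needs an off-axis (asymmetric) projection recomputed every frame from the tracked eye position. The sketch below shows the idea for a single front wall; the wall size, the coordinate convention (wall in the plane z = 0, viewer at z > 0), and all numbers are illustrative assumptions, not the parameters of any particular CAVE.

```python
# A sketch of the per-wall off-axis frustum a CAVE needs: the extents are
# recomputed each frame from the tracked eye position so the image on the
# fixed wall stays perspective-correct. Wall size and coordinates are
# illustrative assumptions.
def wall_frustum(eye, near=0.1, wall=(-1.5, 1.5, -1.5, 1.5)):
    """Return (left, right, bottom, top) frustum extents at the near plane."""
    ex, ey, ez = eye                      # tracked eye position, metres
    xl, xr, yb, yt = wall                 # wall corners in the z = 0 plane
    scale = near / ez                     # similar triangles: wall -> near plane
    return ((xl - ex) * scale, (xr - ex) * scale,
            (yb - ey) * scale, (yt - ey) * scale)

# Viewer standing 2 m from the front wall, slightly right of and above centre.
print(wall_frustum((0.3, 1.1, 2.0)))
```

For stereo, the same calculation is simply repeated for the left-eye and right-eye positions.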

Hardware Used In VR

Input Devices

A variety of input devices allow the user to navigate through a virtual environment and to interact with virtual objects. Directional sound, tactile and force feedback devices, voice recognition and other technologies are being employed to enrich the immersive experience and to create more "sensualized" interfaces. Some input devices used in virtual reality are:

1. The Data Glove

The sensors measure the bending angles of the joints of the thumb and of the lower and middle knuckles of the other fingers. Attached to the back of the glove is a Polhemus sensor that measures the orientation and position of the gloved hand. This information, along with the ten flex angles for the knuckles, is transmitted through a serial communication line to the host computer.
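As a rough illustration of the host side of that serial link, the sketch below parses one glove sample into the ten flex angles plus the Polhemus position and orientation. The 16-float frame layout, byte order, and field order are hypothetical, chosen for illustration; the real DataGlove wire protocol is not reproduced here.

```python
# Parsing a hypothetical glove data frame: ten knuckle flex angles plus a
# 6-DOF hand pose. The frame layout is an assumption for illustration,
# not the actual DataGlove wire format.
import struct
from dataclasses import dataclass

@dataclass
class GloveSample:
    flex_angles: tuple   # ten bend angles (degrees)
    position: tuple      # hand position (x, y, z) from the Polhemus sensor
    orientation: tuple   # hand orientation (yaw, pitch, roll)

FRAME_FORMAT = "<16f"                        # 16 little-endian floats
FRAME_SIZE = struct.calcsize(FRAME_FORMAT)   # 64 bytes per sample

def parse_glove_frame(frame: bytes) -> GloveSample:
    values = struct.unpack(FRAME_FORMAT, frame)
    return GloveSample(flex_angles=values[0:10],
                       position=values[10:13],
                       orientation=values[13:16])

# Demo with a synthetic frame standing in for one serial-line sample.
sample = parse_glove_frame(struct.pack(FRAME_FORMAT, *range(16)))
print(sample.flex_angles, sample.position, sample.orientation)
```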

2. 3D Mouse and Space Ball

The Logitech 3D mouse is based on an ultrasonic position reference array: a tripod of three ultrasonic speakers set in a triangular arrangement, which emits ultrasonic signals from each of the three transmitters. These signals are used to track the receiver's position, orientation, and movement. The device provides proportional output in all six degrees of freedom: X, Y, Z, pitch, yaw, and roll.
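The position part of that tracking can be pictured as trilateration: given the known speaker positions and the three measured distances to the receiver, solve for the receiver location. The sketch below shows only that geometry; the speaker spacing and distances are made-up numbers, and the real device also resolves orientation by using several receivers.

```python
# Trilateration sketch: recover a receiver position from its distances to
# three ultrasonic speakers at known positions. Coordinates and distances
# are illustrative, not Logitech tracker parameters.
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Return the receiver position in front of the speaker plane."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)     # local x axis
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex                        # local y axis
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)                        # local z axis
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)

    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))   # keep the front solution
    return p1 + x * ex + y * ey + z * ez

# Speakers 0.3 m apart in a triangle; distances to the receiver in metres.
print(trilaterate((0, 0, 0), (0.3, 0, 0), (0.15, 0.26, 0),
                  0.55, 0.55, 0.50))
```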

3. Motion Trackers

The motion-tracking system is based on magnetic sensors attached to the user. Most common are sensors that measure the intensity of a magnetic field generated at a reference point. The motion of the different body segments is tracked with these sensors, which return raw data (e.g. positions and orientations) expressed in a single reference frame.
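Because every sensor reports its pose in that one reference frame, turning raw samples into body-segment poses is mostly a matter of composing homogeneous transforms. The sketch below assumes a Z-Y-X Euler convention and an invented sensor-to-segment mounting offset purely for illustration.

```python
# A sketch of turning raw tracker samples (position + orientation in the
# transmitter's single reference frame) into 4x4 segment poses. The Euler
# order and the sensor-to-segment offset are assumptions.
import numpy as np

def pose_matrix(position, yaw, pitch, roll):
    """Homogeneous transform for a pose (angles in radians, Z-Y-X order)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = position
    return T

# Raw sample for a forearm sensor, expressed in the transmitter frame.
ref_to_sensor = pose_matrix((0.4, 0.1, 1.2), 0.3, -0.1, 0.05)
# Fixed mounting offset from the sensor to the segment it is strapped to.
sensor_to_segment = pose_matrix((0.0, 0.0, 0.03), 0.0, 0.0, 0.0)
# Segment pose in the common reference frame: compose the two transforms.
ref_to_segment = ref_to_sensor @ sensor_to_segment
print(ref_to_segment)
```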

Output Devices

1. Head-Mounted Displays (HMDs)

The head-mounted display (HMD) was the first device to provide its wearer with an immersive experience. A typical HMD houses two miniature display screens and an optical system that channels the images from the screens to the eyes, thereby presenting a stereo view of a virtual world. As a result, the viewer can look around and walk through the surrounding virtual environment.
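The stereo view comes from rendering the scene twice, once per eye, with the two cameras offset from the tracked head pose by half the interpupillary distance. The sketch below assumes a 4x4 head pose matrix and a typical 64 mm IPD; neither is tied to a specific HMD.

```python
# A sketch of deriving per-eye view matrices from a tracked head pose:
# shift the head pose left and right by half the interpupillary distance.
# The 0.064 m IPD is a typical assumed value, not a device specification.
import numpy as np

IPD = 0.064  # metres

def eye_view_matrices(head_to_world: np.ndarray):
    """Return (left, right) world-to-eye view matrices from a 4x4 head pose."""
    views = []
    for sign in (-1.0, +1.0):                  # -1 = left eye, +1 = right eye
        eye_offset = np.eye(4)
        eye_offset[0, 3] = sign * IPD / 2.0    # offset along the head's x axis
        eye_to_world = head_to_world @ eye_offset
        views.append(np.linalg.inv(eye_to_world))  # view matrix = inverse pose
    return views

left_view, right_view = eye_view_matrices(np.eye(4))
print(left_view[0, 3], right_view[0, 3])
```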

2. BOOM (Binocular Omni-Orientation Monitor)

The BOOM (Binocular Omni-Orientation Monitor) from Fakespace is a head-coupled stereoscopic display device. The screens and optical system are housed in a box attached to a multi-link arm. The user looks into the box through two holes, sees the virtual world, and can guide the box to any position within the operational volume of the device.
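An articulated arm like this typically lets the system recover the viewing box's pose from the arm's joint angles rather than from an external tracker. The sketch below is only a planar two-link version of that forward-kinematics idea; the link lengths and joint layout are invented, and the real BOOM arm has more joints.

```python
# Forward-kinematics sketch: position of the display box at the end of a
# planar two-link arm, given its joint angles. Link lengths are assumed
# values for illustration only.
import math

def boom_position(theta1, theta2, l1=0.6, l2=0.6):
    """Return the (x, y) position of the box for two joint angles (radians)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

print(boom_position(math.radians(30), math.radians(45)))
```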

3. Haptic Interfaces And Tactile Feedback

A haptic feedback interface enables the user to actually "touch" computer-generated objects and experience force feedback via the human hand. The CyberGrasp is a lightweight, unencumbering force-reflecting exoskeleton that fits over a CyberGlove and adds resistive force feedback to each finger. With the CyberGrasp force feedback system, users are able to explore the physical properties of computer-generated 3D objects they manipulate in a simulated 'virtual world.'
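One common way such per-finger resistive forces are computed is a simple spring model: the deeper a fingertip penetrates the virtual object's surface, the harder the exoskeleton pulls back, up to a safety cap. The stiffness and force cap below are assumed values, not CyberGrasp specifications.

```python
# Spring-model sketch of per-finger resistive force feedback. The gains
# are illustrative assumptions, not CyberGrasp firmware values.
STIFFNESS = 300.0   # newtons per metre of penetration (assumed)
MAX_FORCE = 12.0    # per-finger force cap in newtons (assumed)

def finger_forces(penetration_depths):
    """Map fingertip penetration depths (metres) to resistive forces (N)."""
    return [min(STIFFNESS * max(d, 0.0), MAX_FORCE) for d in penetration_depths]

# Thumb through little finger, penetration into a grasped virtual ball.
print(finger_forces([0.004, 0.006, 0.005, 0.002, 0.0]))
```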

CyberGrasp

Virtual Reality Developing Tools

VRML (Virtual Reality Modeling Language)

• In addition to HTML (Hypertext Markup Language), which has become the standard authoring tool for creating home pages, VRML provides three-dimensional worlds with integrated hyperlinks on the Web.

• The viewing of VRML models via a VRML plug-in for Web browsers is usually done on a graphics monitor under mouse control and is therefore not fully immersive.

• However, the syntax and data structure of VRML provide an excellent tool for modeling three-dimensional worlds that are functional and interactive and that can ultimately be transferred into fully immersive viewing systems (a minimal example follows this list).

• The current version, VRML 2.0, has become an international ISO/IEC standard under the name VRML97.
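As a small taste of the language, the sketch below writes a minimal VRML97 world to a .wrl file: a red sphere plus a box wrapped in an Anchor node, which is how VRML attaches a hyperlink to 3D geometry. The output file name and linked URL are placeholders.

```python
# Write a minimal VRML97 world that a VRML browser plug-in can open:
# a red sphere and a box that acts as a hyperlink via an Anchor node.
# File name and linked URL are illustrative placeholders.
WORLD = """#VRML V2.0 utf8
Shape {
  appearance Appearance { material Material { diffuseColor 1 0 0 } }
  geometry Sphere { radius 1 }
}
Anchor {
  url "next_world.wrl"
  children [
    Transform {
      translation 3 0 0
      children [ Shape { geometry Box { size 1 2 0.2 } } ]
    }
  ]
}
"""

with open("hello_world.wrl", "w") as f:
    f.write(WORLD)
```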

The Future Of Virtual Reality

Virtual reality is a growing industry.

• PCs and specialized hardware are getting better, faster, and cheaper because of developments in VR.
• 3D user interfaces will replace the current window-based ones.
• There will be huge demand for VRML programmers in the near future.
• A revolution is coming in the gaming industry.

Thank You

Sagarika Mehta