Immersive Virtual Reality Research at Calit2
Jürgen P. Schulze, Ph.D.
California Institute for Telecommunications and Information Technology
University of California San Diego
MURPA Seminar – June 17, 2009
Overview
• Introduction to IVL
• Mission
• Hardware Infrastructure
• Research Projects
UCSD
• Founded 1960
• Jacobs School of Engineering ranks 10th internationally, according to the 2008 Academic Rankings of World Universities by Subject Field conducted by Shanghai Jiao Tong University in China
UC San Diego
California Institute for Telecommunications and Information Technology (Calit2)
• New Laboratory Facilities
– Nanotech, Chips, Radio, Photonics, Grid, Data, Applications
– Virtual Reality, Digital Cinema, HDTV, Synthesis
• Over 1000 Researchers in Two Buildings
– Linked via Dedicated Optical Networks
– International Conferences and Testbeds
UC Irvine
Calit2's Mission: Preparing for a world in which distance has been eliminated
Immersive Visualization Lab (IVL)
• 3 staff programmers
• 3 PhD students
• 5 undergraduate students
• research focus on virtual reality (VR) applications
Overview
• Introduction to IVL
• Mission
• Hardware Infrastructure
• Research Projects
Towards the Holodeck
• The Holodeck is a simulated reality facility located on starships and star bases in Star Trek
Overview
• Introduction to IVL
• Mission
• Hardware Infrastructure
• Research Projects
Calit2 Digital Cinema Theater: 4k Video
• 200 seats, 8.2 sound, Sony SRX-R110 projector
• SGI Prism with 21 TB storage, 10GbE to computers/data
Stereo Wall (C-Wall)
• Dual HD resolution (1920 x 2048 pixels) with JVC HD2k projectors
• 2 Linux PCs
• Passive stereo
• Ascension Flock of Birds tracking system
• Software framework: COVISE
• Programming language: C++
• APIs: OpenGL, OpenSceneGraph, OpenAL
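As a rough sketch of how a head-tracked passive-stereo system like the C-Wall derives its two camera positions: the scene is rendered twice, from eye positions offset along the head's right axis by the interocular distance. This is illustrative Python, not the lab's COVISE/C++ code; the helper name and the 6.5 cm interocular distance are assumptions.

```python
# Sketch of deriving left/right eye positions from a tracked head
# pose for passive stereo. Names are illustrative, not COVISE API.

def eye_positions(head_pos, right_dir, ipd=0.065):
    """Return (left_eye, right_eye) world positions for a head at
    head_pos whose +x (right) axis is right_dir.
    ipd is the interocular distance in meters (~6.5 cm average)."""
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

# Viewer standing 2 m from the screen, eyes 1.7 m above the floor.
left, right = eye_positions((0.0, 1.7, 2.0), (1.0, 0.0, 0.0))
```

Each of the two projectors then displays the image rendered from its eye position, with polarizing filters keeping the images separate.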
OptIPuter Scalable Display Systems
• Partner sites: NCMIR, SIO, UIC, USGS EDC, TAMU, UCI, SARA, KISTI, NCSA & TRECC, CALIT2, AIST, RINCON & Nortel
• www.optiputer.net
• + Dr. David Abramson's tiled display walls!
Calit2's 287 Megapixel HiPerSpace Tiled Display (70 tiled 30-inch displays)
Tiled Display Wall with Tracked Input Device
The Varrier Wall
• 12 x 5 array of autostereoscopic displays
• 15 Linux PCs with dual Nvidia GeForce 7900 graphics cards
• 1600 x 1200 pixels on each display
• 115 Megapixels total
• Optical tracking
The StarCAVE
• 18 Dell XPS PCs with quad-core Intel CPUs
• Dual Nvidia Quadro 5600 graphics cards per node
• CentOS 5.2 Linux
• 34 JVC HD2k projectors (1920x1080 pixels each), ~34 megapixels per eye
• Stereo: passive with circular polarization
• 15 screens on 5 walls, ~8 x 4 feet each, plus floor projection
• Optical tracking system with four cameras by ART
• Fifth wall on rails for full 360-degree immersion
34-Megapixel, 17-Screen StarCAVE
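A quick back-of-the-envelope check of the per-eye resolution: 34 projectors paired for passive stereo give 17 full-HD images per eye.

```python
# Back-of-the-envelope check of the StarCAVE's per-eye resolution:
# 34 projectors, paired for passive stereo -> 17 screens per eye.
projectors = 34
width, height = 1920, 1080          # JVC HD2k native resolution
screens_per_eye = projectors // 2   # one projector per eye per screen
pixels_per_eye = screens_per_eye * width * height
megapixels_per_eye = pixels_per_eye / 1e6   # ~35.3, i.e. "~34 MP per eye"
```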
4k Movie Research
• 4k = 4096 pixels horizontally, 2160-2400 pixels vertically (~8 megapixels)
• Intensive use of 10Gbps networks: uncompressed 4k video is >600 MB/sec
• CineGrid Exchange: serves terabytes of 4k video and multi-channel audio to collaborators worldwide
• Collaborating with the iRODS group at SDSC
• www.cinegrid.org
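The ">600 MB/sec" figure can be verified with simple arithmetic, assuming 24-bit RGB at 24 frames per second:

```python
# Sanity check of the ">600 MB/sec" figure for uncompressed 4k video,
# assuming 24-bit RGB at 24 frames per second.
width, height = 4096, 2160
bytes_per_pixel = 3        # 8 bits per RGB channel
fps = 24
bytes_per_frame = width * height * bytes_per_pixel   # ~26.5 MB per frame
bytes_per_second = bytes_per_frame * fps             # ~637 MB/s
```

At that rate a single uncompressed 4k stream already saturates more than half of a 10 Gbps link, which is why the work leans so heavily on dedicated optical networks.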
4k Telepresence
• Between Keio University and UCSD
Dozens of 10Gb Ethernet Links for R&E
Overview
• Introduction to IVL
• Mission
• Hardware Infrastructure
• Research Projects
Automated Building Extraction
• Automatically reconstruct building geometry
• Useful for:
– Visualization, Simulation
• Used by:
– City Planning, Emergency Services, Tourism, Transport, etc.
Gulch 1997; Suveg and Vosselman 2004; Vosselman 1999
Automated Building Extraction
• Input:
– Aerial images
– Digital Surface Models
• Output:
– Building geometry
Segmentation / Classification / Building Reconstruction
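A toy version of the segmentation step might threshold the Digital Surface Model against a bare-earth terrain model, labeling cells that rise well above the ground as candidate building pixels. This is purely illustrative; the actual pipeline also draws on the aerial imagery.

```python
# Toy segmentation step: label Digital Surface Model (DSM) cells as
# candidate building pixels where the surface rises more than a
# threshold above the bare-earth terrain (DTM). Illustrative only.

def segment_buildings(dsm, dtm, min_height=2.5):
    """dsm, dtm: 2D lists of elevations in meters. Returns a 2D
    boolean mask that is True where dsm - dtm exceeds min_height."""
    return [[(s - t) > min_height for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# Tiny example: flat 10 m terrain with a ~6 m structure on the right.
dsm = [[10.0, 15.5], [10.2, 16.0]]
dtm = [[10.0, 10.0], [10.0, 10.0]]
mask = segment_buildings(dsm, dtm)
```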
“How Much Information?” Project
• A study of how much information is being produced worldwide by businesses and consumers
• Visualization of large datasets from prominent IT driven companies (AT&T, Cisco, IBM, etc.)
• Microsoft dataset: 70,000+ hard drive scans containing 500+ GB of file system information taken from 2000 - 2004
ORION Visualization Pipeline
• Interactive real-time modeling and visualization of oceanic data sets
• Pipeline spans data acquisition to 3D real-time visualization
• Continuous real-time data collection from ocean sensors (e.g. temperature, salinity and current)
• Isosurface and streamline creation via user-defined parameters
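Streamline creation boils down to integrating a particle path through the current field. A minimal forward-Euler sketch (the rotational field here is synthetic; real ORION input comes from the ocean sensors):

```python
# Minimal sketch of streamline creation: trace a particle through a
# 2D current field by forward-Euler steps.

def streamline(start, field, step=0.1, n_steps=100):
    """Integrate a path from 'start' through vector field 'field',
    which maps (x, y) to a velocity (vx, vy)."""
    path = [start]
    x, y = start
    for _ in range(n_steps):
        vx, vy = field(x, y)
        x, y = x + step * vx, y + step * vy
        path.append((x, y))
    return path

# Example: a solid-body rotation field, v = (-y, x).
path = streamline((1.0, 0.0), lambda x, y: (-y, x))
```

A production pipeline would use a higher-order integrator (e.g. RK4) and interpolate the field from gridded sensor data, but the control flow is the same.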
Tower Footing
• Structural steel built in China, transported and assembled in Corpus Christi, Texas
• Massive section of the tower footing crossed the Panama Canal on 22 Feb. 2007
• Arrived in San Francisco Bay on 7 Mar. 2007
The Virtual Reality Application
Collision Detection - Constructability
Hot Spot Mitigation
• In projection-screen environments like the StarCAVE or the Osaka CAVE, the point where the line from the projector to the viewer intersects the screen is the brightest point on the screen.
• Can we mitigate this effect with software?
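One way to frame the software approach: locate the hotspot as the intersection of the projector-to-viewer line with the screen plane, then attenuate pixels near it with an inverse falloff. A hedged sketch; the geometry and falloff parameters are made up, not the lab's actual method.

```python
# Locate the hotspot by intersecting the projector->viewer line with
# the screen plane (z = 0), then dim nearby pixels. Illustrative only.

def hotspot_on_screen(projector, viewer, screen_z=0.0):
    """Points are (x, y, z) tuples; returns the (x, y) hit point of
    the line through projector and viewer on the plane z = screen_z."""
    px, py, pz = projector
    vx, vy, vz = viewer
    t = (screen_z - pz) / (vz - pz)
    return (px + t * (vx - px), py + t * (vy - py))

def mitigation_gain(pixel, hotspot, strength=0.4, radius=0.5):
    """Gain applied to a pixel; 1.0 means unmodified, lower = dimmed."""
    dx, dy = pixel[0] - hotspot[0], pixel[1] - hotspot[1]
    d2 = dx * dx + dy * dy
    return 1.0 - strength / (1.0 + d2 / (radius * radius))

# Projector mounted high behind a viewer standing in front of the screen.
hs = hotspot_on_screen((0.0, 2.5, 3.0), (0.0, 1.7, 1.5))
```

Rendering this gain as a multiplicative pattern produces the inverse mitigation image: darkest at the hotspot centers, fading to full intensity away from them.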
Hot Spot Mitigation - Results
• This shows the inverse mitigation pattern rendered at full intensity.
• The red dots mark the centers of the hotspots.
Left: original; right: mitigated.
CAMERA Meta-Data Visualization
• Uses GOS (Global Ocean Sampling) data set
• Can show CAMERA server activity
• Visualize GOS meta-data on globe
• Offers various visualization and statistics modes
PDB and TOPSAN Databank Access
• Protein structure visualization based on PyMOL
• On-line access to a wiki with protein meta-data
Archaeology
• Area M excavation site in Jordan
• Users can interactively walk or fly through a real-time, to-scale rendition of the excavation site
• High-precision radiocarbon dates (blue, red and yellow spheres) link the site to the 10th and 9th centuries B.C.E.
Multi-Spectral Artwork Visualization
• Interactive visualization of gigapixel images and beyond
Calit2 Dynamic Building Model
Sensors detect the user's interactions with the elevator and control its operation.
• Built using 3ds Max, VRML, JavaScript
• Allows users to see the Calit2 building's architectural structure in VR
• Interact with automatic doors, elevators, light switches
• Conduct neuroscientific experiments
The sunlight implementation allows the user to change the light direction according to the hour of the day.
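A much-simplified hour-to-sun-direction mapping can illustrate the idea: the sun sweeps from east to west, rising at 6:00 and setting at 18:00. This toy model ignores latitude and season and is not the building model's actual lighting code.

```python
import math

# Simplified sun-direction model for a building walkthrough: the sun
# sweeps east to west, rising at 6:00 and setting at 18:00.
# Ignores latitude and season; purely illustrative.

def sun_direction(hour):
    """Return a unit vector (east, up, north) pointing toward the sun."""
    angle = math.pi * (hour - 6.0) / 12.0   # 0 at 6:00, pi/2 at noon
    return (math.cos(angle), math.sin(angle), 0.0)

noon = sun_direction(12.0)   # straight overhead
```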
Neuroscience in Architecture
• EEG with 256 electrodes
• 3D model of Calit2 building
• Wayfinding task
Spatialized Sound
• Audio system developed at Calit2
• Uses Dolby 5.1 speaker system or 42-speaker Yamaha YSP-4000 Digital Sound Projector
• System supports reverb, attenuation, Doppler effect
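Two of those effects follow standard acoustics and are easy to sketch: inverse-distance attenuation and the Doppler shift. These are the textbook formulas, not the lab's actual implementation.

```python
# Sketch of two supported audio effects: inverse-distance attenuation
# and Doppler shift. Standard physics, not the lab's implementation.

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def attenuation(distance, ref_distance=1.0):
    """Inverse-distance gain, clamped to 1.0 inside the reference
    distance so nearby sources are not amplified."""
    return ref_distance / max(distance, ref_distance)

def doppler_frequency(freq, source_speed, listener_speed):
    """Observed frequency in Hz; speeds are positive when source and
    listener move toward each other along the line between them."""
    return freq * (SPEED_OF_SOUND + listener_speed) / (SPEED_OF_SOUND - source_speed)
```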
Biological Volume Rendering
• Rendering large confocal image stacks
• Superimpose multiple volume data sets
• Supports data sizes greater than system RAM
• Supports time series
Confocal and electron microscope image stacks superimposed
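The core of any volume renderer is compositing samples along each view ray; superimposing multiple data sets amounts to interleaving samples from both volumes along the same ray. A front-to-back compositing sketch (illustrative, not the lab's renderer):

```python
# Front-to-back alpha compositing along one view ray, the core loop
# of a volume renderer. Each sample is an (intensity, opacity) pair.

def composite(samples):
    """samples: list of (intensity, alpha) pairs ordered front to
    back. Returns (accumulated_intensity, accumulated_alpha)."""
    acc_c, acc_a = 0.0, 0.0
    for c, a in samples:
        acc_c += (1.0 - acc_a) * a * c
        acc_a += (1.0 - acc_a) * a
        if acc_a >= 0.99:       # early ray termination
            break
    return acc_c, acc_a

# Two semi-transparent samples: a bright one in front of a dim one.
color, alpha = composite([(1.0, 0.5), (0.5, 0.5)])
```

Front-to-back order enables early ray termination, which matters when the data set is larger than system RAM and every skipped sample saves a fetch.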
Medical Volume Rendering
• Collaboration with Children's Hospital in San Diego
• Developing software tools to facilitate work with CT and MRI image stacks
Collaborative VR Applications
• Collaborative work in a single VR environment
• Collaborative work in networked VR environments
3D Teleconferencing
• Collaboration with Osaka University (Drs. Date and Takemura) and NCHC (Dr. Lin)
• Driven by PRIME Students at Osaka U. and NCHC in summer 2008 and 2009
• Rendering of time series of 3D point clouds, scanned by array of video cameras
• Focus on user interface in VR environment (StarCAVE)
Data Flow
Our Camera Capturing Setup
Real-Time Point Rendering
• Implemented for COVISE
• Based on Stanford's QSplat algorithm
• Major contributions from PRIME students 2008
• Runs in StarCAVE and Osaka University’s CAVE
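QSplat-style rendering organizes the points into a bounding-sphere hierarchy and traverses it each frame, drawing a node as a single splat once its projected screen size drops below a pixel threshold. A toy traversal; the data structure and projection model are illustrative only, not the actual implementation.

```python
# Toy QSplat-style traversal: descend a bounding-sphere hierarchy and
# emit a node as one splat once its projected size is small enough.

class SphereNode:
    def __init__(self, center, radius, children=()):
        self.center, self.radius, self.children = center, radius, children

def collect_splats(node, viewer_dist, focal=1000.0, max_px=2.0, out=None):
    """Gather splats small enough to draw at this viewing distance.
    Projected size ~ focal * radius / distance (pinhole model)."""
    if out is None:
        out = []
    projected_px = focal * node.radius / viewer_dist
    if projected_px <= max_px or not node.children:
        out.append(node)            # draw this sphere as one splat
    else:
        for child in node.children:
            collect_splats(child, viewer_dist, focal, max_px, out)
    return out

# Tiny hierarchy: one root sphere bounding four leaf points.
leafs = [SphereNode((i, 0, 0), 0.001) for i in range(4)]
root = SphereNode((0, 0, 0), 1.0, leafs)
```

This distance-dependent cutoff is what makes the rendering real-time: distant portions of the point cloud collapse to a handful of splats instead of millions of points.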
ATLAS in Silico: Multimedia Art
• Visualization of CAMERA data
• Exhibited at SIGGRAPH 2007 (on Varrier), Ingenuity Festival Cleveland 2008 (on C-Wall)
• Funded by the U.S. National Science Foundation (NSF)
Thank You!
More information:
• http://www.calit2.net/~jschulze/
• http://ivl.calit2.net/wiki/
• [email protected]