

A Hybrid Reality Environment and its Application to Earthquake Engineering

Tara C. Hutchinson, Falko Kuester, Tung-Ju Hsieh, and Rebecca Chadwick
University of California, Irvine

Abstract

In this paper, an interactive 2D and 3D (hybrid) environment, which facilitates collaborative learning and research and utilizes techniques in visualization and VR, is described. The new environment incorporates both a specially designed lecture room and a laboratory, integrating 2- and 3-dimensional spatial learning by coupling a series of interactive projection display boards (touch-sensitive whiteboards) and a semi-immersive 3D wall display. The hardware architecture designed for this new hybrid environment, as well as an initial application within the environment to the study of a case history building subjected to a variety of earthquakes, is discussed in the paper.

1 Introduction

Though the field of engineering has changed dramatically in the last 20 years, the study of engineering has changed relatively little. Students continue to passively listen to lectures in chalkboard-based environments that afford them little opportunity for visualization, hands-on manipulation, interaction, or creative design. Indeed, almost all of these skills, which are vital for engineering, are learned through on-the-job practice rather than at universities. A continuation of this trend threatens to weaken the relevancy of engineering and computer science study at the university. It is our contention that visualization can provide the much-needed computer-assisted design and analysis environment to foster problem-based learning, while Virtual Reality (VR) can provide the environment for hands-on manipulation, stimulating interactive learning and helping to enhance the potential for success of engineering education.

At the University of California, Irvine, a new interactive 2D and 3D (hybrid) reality environment, termed VizClass, is being developed with these objectives in mind. VizClass incorporates both a specially designed lecture room and a laboratory, integrating 2- and 3-dimensional spatial learning by coupling a series of interactive projection display boards (touch-sensitive whiteboards) and a semi-immersive 3D wall display. This paper describes the hardware architecture designed for this new hybrid environment, as well as an initial application within the environment to the study of a real case history building subjected to a variety of earthquakes.

Figure 1. Schematic plan illustrating VizClass layout.

2 Implementation of VizClass

The new IT-Classroom being developed at the University of California, Irvine (UCI) provides a completely digital, interactive workspace for research and education in the areas of Computer and Civil Engineering. The layout of VizClass, showing the 2D active display area and the 3D stereo working area that comprise the 'hybrid' environment, is shown in Figure 1.

The 2D display component in VizClass consists of three interactive 1.8 meter diagonal digital whiteboards (SmartBoards™) connected to individual rendering nodes. The digital SmartBoards have multiple input/output sources, with the primary source being an optical on-screen interface, providing hands-on input for annotating and controlling the environment. The 3D display component is provided by passive stereo, with two projectors vertically stacked and mounted at floor level, projecting from the rear onto a large 3 meter diagonal screen. The entire environment (2D/3D displays, touch-sensitive LCD display, cameras, sound system) is controlled by an 8-node PC cluster connected via Gigabit Ethernet, housed in the 3D projection room.
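A control scheme of this kind can be sketched as a simple command broadcast to the rendering nodes. The sketch below is purely illustrative and makes assumptions not stated in the paper: the node host names (node01 through node08), the port, and the line-based text protocol are all hypothetical.

```python
import socket

# Hypothetical addresses of the 8 rendering nodes on the Gigabit LAN.
RENDER_NODES = [(f"node{i:02d}", 9000) for i in range(1, 9)]

def broadcast(command: str, nodes=RENDER_NODES, timeout=2.0):
    """Send one line-based control command (e.g. 'LOAD model.obj')
    to every rendering node and collect its short text reply."""
    replies = {}
    for host, port in nodes:
        try:
            with socket.create_connection((host, port), timeout=timeout) as s:
                s.sendall((command + "\n").encode())
                replies[host] = s.recv(64).decode().strip()
        except OSError as exc:
            replies[host] = f"unreachable ({exc})"
    return replies
```

In practice a clustered display wall also needs frame synchronization (e.g. swap-lock across nodes), which a command channel like this does not provide by itself.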

3 Case Study Building: Van Nuys, California

Temporal loading plays a large role in the extreme loading conditions civil engineers must design for. New visual tools are desirable to allow articulation of these extreme movements, particularly considering temporal variations. In


IEEE Virtual Reality 2004 March 27-31, Chicago, IL USA 0-7803-8415-6/04/$20.00©2004 IEEE.

Proceedings of the 2004 Virtual Reality (VR’04) 1087-8270/04 $ 20.00 IEEE


this section, we describe visualization and VR techniques used to enhance our perception of the time-varying movements of a case study building subjected to several earthquake records. The studied building is a 7-story reinforced concrete structure located in the central San Fernando Valley, California. The California Strong Motion Instrumentation Program (CSMIP)¹ manages the seismic instrumentation installed throughout this building. By the end of 1994, this instrumentation system had captured strong motion records from a total of nine earthquakes and three aftershocks. The Mw 6.7 Northridge earthquake in 1994 caused significant damage to this building, both structural and nonstructural, rendering the building unsafe for occupancy.

Visualizing the Case Study Building – Since the greatest damage to the building was observed during the 1994 Northridge earthquake, the example results presented focus on this event. To facilitate a realistic representation of the structure, a detailed 3-dimensional model of the Van Nuys building was constructed using the 3DStudioMax² platform. The model was exported to Wavefront OBJ format for subsequent visualization in our VR framework. Different texture skins were collected and used to create a photorealistic representation of the undamaged, damaged, and repaired structure. At run-time, local damage texture skins can be triggered appropriately (to blend from undamaged to damaged at the correct time during the motion), expressing a more realistic sense of the state of the entire structure relative to this damage accumulation.

Field Measured Displacement Data – The field measured datasets are discretely mapped onto the geometric model and used to generate temporal simulations reproducing the observed deformation patterns. Records from the 1994 Northridge earthquake consist of 16 channels of seismograms measuring acceleration in both horizontal directions, and select measurements in the vertical direction. Since sensors are not placed at each floor level, interpolation was applied to fully describe the 3-dimensional movement of the structure.

Figure 2 shows a sequence of time steps and snapshots of

the fully deformed building model, using color coding applied to the components of the model. Red is used to represent the maximum deformations (over the entire record), while blue represents minimum values. Interpolation is applied across the color spectrum and mapped onto the model. Color provides an intuitive, rapid assessment of the current state of the structure over time; for example, one can quickly observe from Figure 2 that the maximum deformation occurs at time t = 7.38 seconds.

Path History using 3D Sensor Traces – Discrete point-based displacement histories can be articulated using the concept of 3D sensor traces. In this case, we render a spherical proxy and use it to represent the original position of the sensor. A second spherical proxy is rendered and used to

¹ CSMIP: http://www.consrv.ca.gov/cgs/smip/about.htm
² 3DStudioMax: http://www.3dmax.com

(a) 6.96 s (b) 7.02 s (c) 7.08 s (d) 7.14 s

(e) 7.20 s (f) 7.26 s (g) 7.32 s (h) 7.38 s

Figure 2. Image sequence showing building deformations (color coded) between time t = 6.96 - 7.38 seconds (subsampled to a Δt = 0.06 seconds), 1994 Northridge earthquake.

represent the current time location of the sensor (during the animation). Figure 3 shows a sequence of screen shots of the 3D sensor traces at the roof of the building, illustrating the path traces from time t = 4.0 - 10.0 seconds.
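The interpolation and color-coding steps described above can be sketched as follows. This is a minimal illustration only: the sensor heights and displacement values are made up (the paper does not give the channel layout), interpolation is assumed linear along the building height, and the blue-to-red ramp is one simple choice of color mapping.

```python
import numpy as np

# Hypothetical example: sensors at ground, 2nd floor, and roof of the
# 7-story building; one horizontal displacement sample per sensor (cm).
sensor_heights = np.array([0.0, 5.4, 19.8])   # meters (illustrative)
sensor_disp    = np.array([0.0, 2.1, 8.7])    # cm (illustrative)

# Interpolate to every floor level so the whole model can be deformed,
# since not every floor carries an instrument.
floor_heights = np.linspace(0.0, 19.8, 8)     # ground + 7 floors
floor_disp = np.interp(floor_heights, sensor_heights, sensor_disp)

def deformation_color(value, vmin, vmax):
    """Map a deformation to an RGB triple: blue at the minimum value,
    red at the maximum, linearly interpolated in between."""
    t = (value - vmin) / (vmax - vmin)
    return (t, 0.0, 1.0 - t)

colors = [deformation_color(d, floor_disp.min(), floor_disp.max())
          for d in floor_disp]
```

In a real pipeline the same mapping would be applied per vertex (or per structural component) at each time step of the record, rather than per floor as shown here.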

4 Summary Remarks

In this paper, we describe a new interactive learning and research environment, which we have termed VizClass. VizClass is a hybrid reality environment in the sense that it incorporates a specially designed lecture room and laboratory, integrating both 2- and 3-dimensional spatial activities by coupling a series of interactive projection display boards (touch-sensitive whiteboards) and a semi-immersive 3D wall display. We also illustrate an example of the effective use of VizClass through a case study building, where notable dynamic movements (from induced earthquake loading) were measured and associated damage to the structure was observed.

5 Acknowledgements

This research is supported by NSF, under Grant Number EIA-0203528, the Holmes Fellowship Foundation, and the PEER Internship program. We also appreciate the assistance of the Van Nuys building owners in this work.

(a) 4.0 s (b) 10.0 s

Figure 3. 3D sensor traces at time t = 4.0 and 10.0 seconds (red sphere represents the original sensor location, blue sphere represents the current sensor location).
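The sensor-trace concept of Figure 3 can be sketched in the same spirit: keep the original sensor position as one fixed proxy, and accumulate the displaced positions over the animation window into a path whose last point is where the current-position proxy would be drawn. The displacement history below is synthetic (two sinusoids), not the actual Van Nuys record, and the sensor coordinates and time step are assumptions for illustration.

```python
import numpy as np

dt = 0.02                                 # illustrative sample interval (s)
t = np.arange(4.0, 10.0 + dt, dt)         # animation window, t = 4.0 - 10.0 s

# Hypothetical roof-sensor displacement history (cm) in plan (x, y).
ux = 8.7 * np.sin(2 * np.pi * 0.8 * t)
uy = 5.2 * np.sin(2 * np.pi * 1.1 * t + 0.5)

origin = np.array([12.0, 6.0, 19.8])      # assumed original sensor position (m)
scale = 0.01                              # cm -> m

# Path trace: one displaced 3D position per time step; the renderer would
# draw the trace as a polyline, with spheres at origin and trace[-1].
trace = origin + scale * np.column_stack([ux, uy, np.zeros_like(t)])
current = trace[-1]
```

A displacement scale factor larger than the physical one is often used in such animations so that centimeter-level motion remains visible at building scale.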
