EUROGRAPHICS’2000 / M. Gross and F.R.A. Hopgood, Volume 19 (2000), Number 3 (Guest Editors)

Augmented Reality with Back-Projection Systems using Transflective Surfaces

Oliver Bimber*, L. Miguel Encarnação# and Dieter Schmalstieg+

*Fraunhofer Institute for Computer Graphics, Rostock, Germany, email: [email protected]
#Fraunhofer Center for Research in Computer Graphics, Providence, RI, USA, email: [email protected]

+Vienna University of Technology, Vienna, Austria, email: [email protected]

Abstract

In this paper, we introduce the concept of Extended VR (extending the viewing space and interaction space of back-projection VR systems) by describing the use of a hand-held semi-transparent mirror to support augmented reality tasks with back-projection systems. This setup overcomes the problem of occlusion of virtual objects by real ones that is linked with such display systems. The presented approach allows an intuitive and effective application of immersive or semi-immersive virtual reality tasks and interaction techniques to an augmented surrounding space. Thereby, we use the tracked mirror as an interactive image plane that merges the reflected graphics, which are displayed on the projection plane, with the transmitted image of the real environment. In our implementation, we also address traditional augmented reality problems, such as real-object registration and virtual-object occlusion. The presentation is complemented by a hypothesis of conceivable further setups that apply transflective surfaces to support an Extended VR environment.

1. Introduction

Augmented Reality [3] (AR) superimposes graphical representations of virtual objects onto the user’s view of the real world. To achieve this, two main categories of display systems are commonly applied: semi-transparent head-mounted displays [9] (HMDs) and video composition systems [4].

Semi-transparent HMDs, such as Sony’s Glasstron [24] or Virtual I/O’s i-Glasses [28], project the produced graphics directly in front of the viewer’s eyes. Such devices always overlay the computer graphics onto the image of the real environment that is transmitted through the HMD’s projection planes. The ratio of intensity of the transmitted light and the projected images can usually be adjusted electronically to integrate the images more effectively.

Video composition systems use cameras to continuously capture images of the real environment. Video mixing is employed in a post-process to merge the video image with graphics on an opaque display. Immersive HMDs, such as nVision’s Datavisor family [18], generally serve as output devices to stereoscopically project the augmented images in front of the viewer’s eyes.

AR-specific problems, such as the occlusion of virtual objects by real ones [7], the registration of virtual objects to real ones [29], or the calibration of the display system with the real environment [11], are solved differently in both scenarios, thus revealing some of the distinctive advantages and disadvantages of each approach.

Large projection systems that make use of back-projection technology are becoming more and more common within the Virtual Reality (VR) community. Systems such as the Virtual Table [5] and Responsive Workbench [12] both employ a horizontal projection philosophy, whereas other devices such as

© The Eurographics Association and Blackwell Publishers 2000. Published by Blackwell Publishers, 108 Cowley Road, Oxford OX4 1JF, UK and 350 Main Street, Malden, MA 02148, USA.


Bimber, Encarnação and Schmalstieg / Augmented Reality with Back-Projection Systems using Transflective Surfaces


the Powerwall [23] and surround screen projection systems (SSPS) such as the CAVE [8] also employ vertical back-projection. The introduction of real objects into virtual environments created by such systems, however, is much more difficult than the integration of virtual objects with an augmented real environment. This is due to the fact that real objects are always located between the viewer and the projection plane(s), thus always occluding the projected graphics and consequently the virtual objects. In comparison, the projection planes of see-through HMDs, for instance, are always located between the viewer’s eyes and the real environment, and their projected graphics always occlude the real environment. An example of a VR system supporting real-object occlusion is the projection-based virtual office [21], where front-projection is used.

The occlusion problem is the main reason for the common belief that back-projection systems are impractical for traditional augmented reality tasks. This paper introduces a possible solution to the occlusion problem using a semi-transparent mirror that allows back-projection systems to superimpose the displayed graphical elements onto the surrounding real environment.

With the term Extended VR, we introduce the concept of extending the viewing space and the interaction space of back-projection virtual reality systems. The technique described below offers an extension to the surrounding real environment.

2. Related Work

After the presentation of his ultimate display [26] in 1965, Ivan Sutherland developed the first augmented reality system, which consisted of two head-tracked CRTs that were worn on the sides of the user’s head. The optical path was folded using small half-silvered mirrors that were located in front of the user’s eyes. The physical characteristics of the mirrors allowed him to overlay the transmitted image of the real world with the graphics produced by the CRTs. Optical combiners, such as small half-silvered mirrors, led to the development of many see-through HMDs over the last three decades and can still be found in many systems, such as Virtual I/O’s i-Glasses [28]. Due to an ongoing decrease in size as well as technological advances of recent devices, half-silvered mirrors are often replaced by fully electronic solutions, such as the see-through LCDs used in Sony’s Glasstron [24].

In 1993, Fitzmaurice [10] introduced Chameleon, a hand-held LCD display. Equipped with a six-degrees-of-freedom (6DOF) tracker, he used it as a movable window into a 3D information space. MacIntyre [14] notes that the techniques used by Chameleon could be applied quite naturally to a see-through hand-held display to create an even richer information space.

In 1999, we presented the transflective pad [6], a semi-transparent hand-held mirror. Applied in its reflective mode, we use the 6DOF-tracked pad to increase the viewing volume and interaction space of a workbench-like projection system (supporting Extended VR in an exclusively virtual environment without involving the surrounding real space) and to approximate multi-user viewing. In its transparent mode, we apply the techniques for transparent props that are introduced by Schmalstieg et al. [22]: this work uses transparent props (a pad and a pen) to interact with a Virtual Table. Due to the transparency characteristics of the real pad and pen, this device overlays the projected graphics without causing any occlusion problems. Thus the graphics that are projected onto the Virtual Table can be used to augment the props.

Obviously, the material property (e.g. transparency or reflection) is a significant issue for the seamless integration of real objects in a virtual environment that is displayed using back-projection technology.

Inspired by Sutherland’s half-silvered mirror, we extended the transparent pad’s scope to make it applicable to traditional augmented reality tasks in combination with back-projection systems.

3. The Transflective Pad

To create the pad, we used a 15” x 11” sheet of Plexiglas laminated with a semi-transparent foil, such as 3M’s Scotchtint P-18 [1], which is normally used to reduce window glare. The foil’s surface is identical on both sides and either reflects or transmits light, depending on its orientation to the light source. If viewer and light source are located on opposite sides of the plane that is spanned by the pad, the foil transmits the light, and the environment behind the pad is visible to the observer by looking through the pad. If they are located on the same side, the foil reflects the light and the viewer can observe the reflection of the environment in front of the pad.

However, if both sides of the pad are properly illuminated, the pad simultaneously transmits and


reflects the light; thus both images (the reflected image and the transmitted one) are visible to the observer by looking at the pad (cf. Figure 1). Due to these characteristics, we named our semi-transparent mirror device the transflective pad.

We also attached a tracking emitter to the pad and used an electromagnetic tracking device to accurately determine the position and orientation of the plane defined by the pad.

Figure 1: How the transflective pad integrates real and virtual images.

4. Augmenting the Surrounding Environment

A BARCO Baron Virtual Table [5] serves as the projection system in our current setup. It consists of an integrated RGB projector that displays a 54” x 40” image on the back side of a ground glass screen. The projected image of the virtual scene is reflected in the mirror, while at the same time the image of the real world is transmitted through the mirror (cf. Figure 5* for our experimental setup).

In the following, we show how to apply the transflective pad to fold the optical path between the observer and the projection plane. Furthermore, we describe how to transform the displayed image in a way that allows us to use the tracked pad as an interactive image plane that merges the reflected graphics with the transmitted image of the surrounding real world.

To gain the correct three-dimensional impression of the projected scene, we have to apply head-tracking and stereoscopic viewing in combination with shutter glasses, such as StereoGraphics’ CrystalEyes [25] or NuVision3D’s 60GX [17].

By performing the computations described below, the stereoscopically projected images are reflected by the pad in such a way that all graphical elements satisfy the viewer’s binocular depth cue and appear to be three-dimensional when looking at the pad.

4.1 Using the Transflective Pad as an Interactive Image Plane

The tracked transflective pad divides the environment into two subspaces (cf. Figure 2). We call the subspace that contains the viewer and the projection plane (or an appropriate portion of it) the projection space (PRS), and the subspace that contains the physical (real) objects and additional physical light sources the physical space (PHS).

As in the immersive case, graphical elements (such as geometry, normals, textures, clipping planes, virtual light sources, etc.) are defined within the 3D free space (i.e. within the global coordinate system of the virtual environment). This coordinate system actually exceeds the boundaries of the projection space and extends into the surrounding physical space.

The challenge is to complement the physical objects contained by the PHS with additional virtual augmentations. To achieve this, the graphical elements of the virtual objects are either defined directly within the PHS, or they are transformed to the PHS during an object registration process. In both cases, the graphical elements are contained by the PHS and, due to the lack of projection possibilities in the physical space, are not visible to the observer without additional aids.

In the following explanation, we will call the fully reflective surface of the transflective pad the mirror.

We now consider the mirror and compute the reflection of the viewer’s physical eye locations (as well as possible virtual headlights). We then apply the inverse reflection to every graphical element that is contained by the PHS. This way, these graphical elements are transformed and can be projected at their corresponding inverse-reflected positions within the PRS. Thus, they are physically reflected back by the mirror into the reflection space, or RES (the space that appears behind the mirror by looking at it).


Figure 2: Transflective Image Plane Scenario.

When the mirror and the head-tracker are sufficiently calibrated, the PHS and the RES overlay exactly. The graphical elements appear in the same position within the RES as they would within the PHS without the mirror (if a projection possibility were given within the PHS).

To compute the needed transformations, we have to determine the reflection of the viewer’s eye positions and headlights, as well as the inverse reflection of the graphical elements contained by the PHS, with respect to the mirror plane. Since the mirror is tracked with 6DOF, we can derive the plane defined by the pad from the spatial tracking information.
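Deriving the plane parameters from the tracking data can be sketched as follows. This is an illustrative fragment, not the paper's implementation: the struct and function names are our own, and we assume the tracker delivers a point on the pad together with the pad's normal vector.

```cpp
#include <array>

// Mirror plane f(x, y, z) = ax + by + cz + d = 0.
struct Plane { double a, b, c, d; };

// Derive the plane parameters from one tracked point on the pad
// and the pad's normal vector (taken from the sensor's orientation).
// Any point X on the plane satisfies N . X + d = 0, hence d = -(N . P).
Plane planeFromTracker(const std::array<double, 3>& pointOnPad,
                       const std::array<double, 3>& normal)
{
    double d = -(normal[0] * pointOnPad[0] +
                 normal[1] * pointOnPad[1] +
                 normal[2] * pointOnPad[2]);
    return { normal[0], normal[1], normal[2], d };
}
```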

If the mirror’s plane is represented as

f(x, y, z) = ax + by + cz + d = 0,

with its normal vector N = [a, b, c], then P′ is the reflection of P with respect to the mirror plane:

P′ = P − 2 ((N · P + d) / |N|²) N

To make use of the binocular parallax, we have to compute the reflection of both eyes; thus the physical eye locations are transformed from the PRS to their corresponding reflected eye locations within the RES.

The inverse reflection of a point that is contained by the PHS is simply computed from the reflection with respect to the mirror plane. Since we assume that the PHS and the RES exactly overlay, we can also assume that the reflection of the graphical elements contained by the PHS results in the inverse reflection of the RES; that is, they are transformed to their corresponding positions within the PRS and can be displayed on the projection plane.

If we properly illuminate the real objects contained by the PHS using the real light sources, the transflective pad transmits their images. The transflective pad also physically reflects the image of the inverse-reflected graphical elements, which is projected within the PRS. Thus both the transmitted image of the real objects and the reflection of the displayed graphics are simultaneously visible to the observer by looking at the transflective pad.

The transflective pad is then used as a hand-held interactive image plane, whereby (in contrast to traditional back-projection) the virtual objects always overlay the real ones. If we continuously apply the computations described above, we can reorient the transflective pad while the registered virtual objects appear to remain in the same positions relative to the physical objects (when looking at the transflective pad).

The ratio of intensity of the transmitted light (the image of the PHS) and the reflected light (the image of the RES) can intuitively be adjusted by changing the angle between the transflective pad and the projection plane. While acute angles highlight the virtual augmentation, obtuse angles let the physical objects shine through more brightly.

4.2 The Reflection Matrix

An easy way of applying the reflection computations is to express them as a transformation matrix that can be set before rendering. Note that the following 4x4 reflection matrix is affine.

With reference to the mirror plane f(x, y, z) and the reflection of a point P′(P, f):

M = (1/|N|²) ·
    | b²+c²−a²    −2ab         −2ac         −2ad  |
    | −2ab        a²+c²−b²     −2bc         −2bd  |
    | −2ac        −2bc         a²+b²−c²     −2cd  |
    |  0           0            0           |N|²  |
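Assembling the reflection matrix for use with glMultMatrixd can be sketched as follows (an illustrative helper, not the paper's code). OpenGL expects column-major storage; since the upper-left 3x3 block of the reflection matrix is symmetric, only the translation entries are affected by that ordering and go into elements 12 to 14.

```cpp
#include <array>

// Build the affine 4x4 reflection matrix for the mirror plane
// ax + by + cz + d = 0, in OpenGL column-major order (ready for
// glMultMatrixd). Here n2 = |N|^2 = a^2 + b^2 + c^2.
std::array<double, 16> reflectionMatrix(double a, double b,
                                        double c, double d)
{
    double n2 = a * a + b * b + c * c;
    std::array<double, 16> m{};
    // upper-left 3x3 block (symmetric, so row/column order agree)
    m[0] = (b * b + c * c - a * a) / n2;
    m[5] = (a * a + c * c - b * b) / n2;
    m[10] = (a * a + b * b - c * c) / n2;
    m[1] = m[4] = -2.0 * a * b / n2;
    m[2] = m[8] = -2.0 * a * c / n2;
    m[6] = m[9] = -2.0 * b * c / n2;
    // translation column
    m[12] = -2.0 * a * d / n2;
    m[13] = -2.0 * b * d / n2;
    m[14] = -2.0 * c * d / n2;
    m[15] = 1.0;
    return m;
}
```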

By applying the reflection matrix, every graphical element (i.e. geometry, normals, textures, clipping planes, virtual light sources, etc.) is reflected with respect to the mirror plane. A side effect of this is that the order of reflected polygons is also reversed (e.g. from counterclockwise to clockwise), which, due to the wrong front-face determination, results in wrong rendering (e.g. lighting, culling, etc.). This


can easily be solved by explicitly reversing the polygon order.

The following OpenGL pseudo-code example illustrates the affected part of the rendering process:

...
// set polygon order to clockwise
// (OpenGL default: counterclockwise)
glFrontFace(GL_CW);

// back up the current transformation matrix
glPushMatrix();

// apply the reflection matrix
glMultMatrixd(M);

// render all graphical elements that have to be reflected
// (with respect to the reflected eye position and
// reflected headlights)
renderEverything();

// restore the transformation matrix
glPopMatrix();

// set polygon order back to default (counterclockwise)
glFrontFace(GL_CCW);
...

5. Implementation

Our software system has been implemented in C++ using OpenGL [30] and additional utility libraries. It runs under IRIX 6.5 on a Silicon Graphics Indigo2 High Impact and applies an Ascension Flock of Birds [2] electromagnetic tracking device.

We use an admeasured wooden frame to calibrate the tracker’s emitters to a predefined position and orientation within our global coordinate system. In addition, we apply smoothing operators and magnetic field distortion correction to the tracking samples.

While one emitter is attached to the viewer’s shutter glasses to support head-tracking, a second one is connected to the transflective pad to continuously determine its plane parameters. A third emitter is used to track a pen-like input device (cf. Figure 5*). Furthermore, non-stationary real objects are continuously tracked with additional emitters.

To register the virtual representations to their corresponding real objects, we apply the pointer-based object registration method introduced by Whitaker et al. [29]. Triggered by an attached button, our pen is used to capture predefined landmark points on the real object’s surface within the global coordinate system. The problem is computing the rigid transformation between the set of predefined 3D points on the virtual representation’s surface and their measured counterparts. We apply Powell’s direction set method [20] to solve this minimization problem and to compute the required object-to-world transformation.
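The objective that Powell's method minimizes can be pictured as the sum of squared distances between the measured landmarks and the transformed model points. Below is a deliberately simplified sketch (single-axis rotation plus translation, with illustrative names); the actual implementation optimizes a full 6DOF rigid transform.

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

using Vec3 = std::array<double, 3>;

// Simplified rigid transform: rotation about the z axis by `angle`,
// followed by translation t (illustrative single-axis case).
Vec3 rigid(const Vec3& p, double angle, const Vec3& t)
{
    double cs = std::cos(angle), sn = std::sin(angle);
    return { cs * p[0] - sn * p[1] + t[0],
             sn * p[0] + cs * p[1] + t[1],
             p[2] + t[2] };
}

// Objective for the minimization: sum of squared distances between
// the measured landmark points and the transformed model landmarks.
// Powell's direction set method would search over (angle, t).
double registrationError(const std::vector<Vec3>& model,
                         const std::vector<Vec3>& measured,
                         double angle, const Vec3& t)
{
    double err = 0.0;
    for (std::size_t i = 0; i < model.size(); ++i) {
        Vec3 q = rigid(model[i], angle, t);
        for (int k = 0; k < 3; ++k) {
            double dk = q[k] - measured[i][k];
            err += dk * dk;
        }
    }
    return err;
}
```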

Occlusion caused by real objects (static or non-stationary ones) is handled by rendering corresponding invisible (black) geometry phantoms, as described by Breen et al. [7]. This model-based approach registers geometric virtual representations to real objects and renders them in black, which makes them invisible within the augmented space. If the virtual models accurately represent their real counterparts, they can be used to produce occlusion with visible virtual objects. Since both the invisible phantoms and the visible virtual objects are reflected by the pad and overlay the real environment, the observer gets the impression that real objects, which are fully visible through their phantoms, can occlude virtual objects.
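The phantom trick amounts to a depth-only pre-pass: the black phantom fills the depth buffer, so virtual fragments behind it fail the depth test, and the transmitted image of the real object remains visible there. The following CPU-side model of that per-pixel logic is an illustrative sketch, not the paper's renderer.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Tiny depth-buffer model of phantom-based occlusion. The black
// phantom is rendered first and only writes depth; virtual fragments
// then pass the standard depth test against it. Where the phantom is
// closer, the framebuffer keeps the (black) phantom, so the real
// object transmitted through the pad stays visible at that pixel.
struct DepthBuffer {
    std::vector<double> depth;
    explicit DepthBuffer(std::size_t n)
        : depth(n, std::numeric_limits<double>::infinity()) {}

    // Phantom pass: depth-only write (smaller depth = closer).
    void renderPhantom(std::size_t px, double z) {
        if (z < depth[px]) depth[px] = z;
    }

    // Virtual pass: the fragment survives only if it is closer
    // than everything rendered so far. Returns true if drawn.
    bool renderVirtual(std::size_t px, double z) {
        if (z < depth[px]) { depth[px] = z; return true; }
        return false;
    }
};
```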

Physical objects are either placed directly on the Virtual Table’s projection plane or on additional tables that surround the projection plane.

The transflective pad is held with one hand, while the other hand can be used to interact within the augmented physical space (e.g. with augmented physical objects, as illustrated in Figure 9*, or with additional interaction devices, such as the pen).

We use speech commands to switch between an AR mode that is supported by the transflective pad and an immersive VR mode that makes use of the Virtual Table’s projection plane only. In the immersive VR mode, however, the transflective pad supports interaction techniques for transparent props [22] (i.e. object palette, magic lenses, through-the-plane tools, etc.) as well as transflective pad techniques [6] (i.e. difficult-to-reach viewing, multiple-observer viewing, clipping-plane-in-hand, etc.).

Figures 6–9* show photographs of the transmitted physical space and the reflected projection space. Note that the photographs are not embellished. They were taken as seen from the viewer’s perspective. However, the printouts appear darker and with less luminance than in reality.

6. Conclusions

We have introduced the concept of Extended VR and described the possibility of supporting augmented reality tasks with back-projection systems by avoiding the occlusion problem that is linked with such display systems. A semi-transparent hand-held mirror is applied to fold the optical path between the projection plane and the observer and to merge the reflected graphics with the transmitted image of the surrounding real environment. Using the so-called transflective pad as an interactive image plane that is located between the observer and the real environment, the reflected graphics always overlay the real space and make traditional augmented reality tasks possible with back-projection systems.

The transflective pad offers a cheap and easily applicable way of combining the advantages of immersive and semi-immersive VR environments with less obtrusive AR techniques. It extends the viewing space and interaction space of such environments towards the surrounding real world; those spaces would otherwise be limited by the corresponding projection planes. Especially in combination with workbench-like projection systems, in which the user normally focuses on the workbench’s surface, the surrounding environment can be intuitively incorporated into the interaction. As an example, consider mechanical design tasks, which can be performed on the immersive workbench and can then be evaluated by projecting the virtual model or part onto the corresponding real assembly.

However, there are several limitations compared to traditional AR systems. One of them is the indirect line-of-sight problem with the projection plane (i.e. a portion of the projection plane must be visible by looking at the mirror) that influences the application range of the transflective pad. This, however, is well manageable in combination with workbench-like display systems that are used to augment the close-by surrounding environment.

Another drawback is the fact that the reflection computations (i.e. the augmentation) have to rely on at least two tracker sensors: the one attached to the pad and the one attached to the observer’s head. In contrast to single-sensor setups (e.g. see-through HMDs), time lag, nonlinear distortions and noise are slightly higher in our approach. These problems have to be compensated for (e.g. by using precise tracking technology or by applying additional software solutions) to support a high-quality augmented reality system.

Note that, compared to traditional AR setups, Extended VR systems lack mobility, which, however, does not conflict with their VR-focused applications.

While some shortcomings must be addressed in future work, the transflective pad offers a unique way of extending back-projection-based VR systems towards augmented reality, thus paving the path towards Extended VR environments.

The proposed setup is especially versatile when interaction techniques for transparent and transflective tools [6, 22] are complemented by the possibility of supporting augmented reality for real objects.

7. Future Work

Our immediate work on the transflective pad will focus on applying more reliable tracking technology and on replacing the currently applied semi-transparent mirror foil (3M’s P-18 [1]) with a half-silvered mirror of the kind commonly used for optical devices. A further disadvantage of the P-18 foil is that its integrated sun protection falsifies the color transmission of the real environment.

In addition, we intend to integrate collision detection algorithms that can be used to detect collisions between virtual objects and real ones. Conventional object-space collision detection methods [13, 16] can be used to detect collisions between the 3D geometric models that represent the virtual objects and the real ones (i.e. their black phantoms) in the augmented environment.

Once the fundamental collision detection is provided, we can extend our system to support model-based automatic object placement methods, such as the one described by Breen et al. [7]. Collision detection and automatic object placement will definitely improve the interaction capabilities of our system.


Furthermore, we want to adapt existing two-handed interaction techniques (e.g. pen-and-pad approaches) to simultaneously offer an intuitive interaction with both the immersive VR environment and the augmented surrounding space, and to consider multiple modalities to support a human-centered interaction.

With reliable tracking technology, for instance, a remote image-based object registration method, such as the one described by Whitaker et al. [29], could also be implemented. Thereby, the locations of the landmark points could be marked on our interactive image plane (i.e. the pad) using the pen. With known camera parameters, Whitaker uses the 2D landmark points and solves a linear equation system to compute the object-to-camera transformation of the real objects with known geometric models.

Figure 3: Conceivable future setup, using a mounted transflective surface in combination with a workbench-like back-projection system.

Further setups that apply larger mounted transflective surfaces are conceivable. In both setups, the surface would be adjustable in terms of providing the user with a good interaction-viewing ratio.

In combination with workbench-like projection systems, for instance, a transflective surface can be used to extend the application range of the immersive desk to the surrounding real environment (cf. Figure 3). This scenario, however, must be realized at the expense of interactivity, since the user cannot directly interact with the augmented space behind the mirror. Here, only indirect manipulation [27] (e.g. ray casting, gaze-directed interaction and pointing) would be supported.

Another possibility (cf. Figure 4) applies a more horizontally aligned transflective surface in combination with an overhead projection plane (e.g. a front-projected ceiling or a back-projected canvas). This scenario supports direct interaction with the augmented space below the transflective surface and offers the possibility of transmuting into a semi-immersive VR environment by not illuminating the area below the surface.

Figure 4: Conceivable future setup, using a mounted transflective surface in combination with an overhead projection plane.

Similar desktop-based reach-in systems, such as Poston’s Virtual Workbench [19] and related work, already exist to support in-place manipulation of virtual objects. They usually superimpose a virtual visual space onto a virtual tactile space (featured by force-feedback devices such as PHANToMs [15]). However, the current setups are limited in their field of view and barely address the integration of real visual and real tactile feedback.

References

1. 3M, Corp. P-18 is the most efficient Scotchtint Sun Control film. URL: http://www.mmm.com/market/construction/html/products/product73_p.html, 1999.

2. Ascension Technology, Corp. Flock of Birds. URL: http://www.ascension-tech.com/products/flockofbirds/flockofbirds.htm, 2000.

3. R. Azuma. A Survey of Augmented Reality. Presence, vol. 6, no. 4, pp. 355-385, 1997.

4. M. Bajura, H. Fuchs, and R. Ohbuchi. Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient. Computer Graphics (Proceedings of SIGGRAPH’92), vol. 26, no. 2, pp. 203-210, 1992.


5. Barco, Inc. BARON. URL: http://www.barco.com/projecti/products/bsp/baron.html, 1997.

6. O. Bimber, L.M. Encarnação, and D. Schmalstieg. Real Mirrors Reflecting Virtual Worlds. Accepted to IEEE VR2000 Conference, New Brunswick, NJ, USA, March 2000.

7. D.E. Breen, R.T. Whitaker, E. Rose, and M. Tuceryan. Interactive Occlusion and Automatic Object Placement for Augmented Reality. Computer Graphics Forum (Proceedings of EUROGRAPHICS'96), vol. 15, no. 3, pp. C11-C22, 1996.

8. C. Cruz-Neira, D. Sandin, and T. DeFanti. Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE. Computer Graphics (Proceedings of SIGGRAPH'93), pp. 135-142, 1993.

9. S. Feiner, B. MacIntyre, and D. Seligmann. Knowledge-Based Augmented Reality. Communications of the ACM, vol. 36, no. 7, pp. 53-62, 1993.

10. G.W. Fitzmaurice. Situated Information Spaces and Spatially Aware Palmtop Computers. CACM, vol. 36, no. 7, pp. 38-49, 1993.

11. A. Janin, D. Mizell, and T. Caudell. Calibration of head-mounted displays for augmented reality applications. Proceedings of the Virtual Reality Annual International Symposium (VRAIS'93), pp. 246-255, 1993.

12. W. Krüger, C. Bohn, B. Fröhlich, H. Schüth, W. Strauss, and G. Wesche. The Responsive Workbench: A Virtual Work Environment. IEEE Computer, vol. 28, no. 7, pp. 42-48, 1995.

13. M. Lin and J. Canny. Efficient collision detection for animation. Proceedings of the Third Eurographics Workshop on Animation and Simulation, Cambridge (UK), 1992.

14. B. MacIntyre and S. Feiner. Future Multimedia User Interfaces. Multimedia Systems, vol. 4, pp. 250-268, 1996.

15. T.H. Massie and J.K. Salisbury. The PHANTOM Haptic Interface: A Device for Probing Virtual Objects. Proceedings of the ASME Dynamic Systems and Control Division, vol. 55-1, pp. 295-301, 1994.

16. M. Moore and J. Wilhelms. Collision detection and response for computer animation. Computer Graphics (Proceedings of SIGGRAPH'88), vol. 22, no. 4, pp. 289-298, 1988.

17. NuVision Technologies, Inc. 60GX. URL: http://www.nuvision3d.com/the60gx.html, 1997.

18. N-Vision, Inc. Datavisor. URL: http://www.nvis.com, 1999.

19. T. Poston and L. Serra. The virtual workbench: Dextrous VR. Proceedings of Virtual Reality Software and Technology (VRST'94), vol. 1, pp. 111-121, 1994.

20. W.H. Press, S.A. Teukolsky, W.T. Vetterling, and B.P. Flannery. Numerical Recipes in C: The Art of Scientific Computing (2nd edition), ISBN 0-521-43108-5, pp. 412-420, Cambridge University Press, 1992.

21. R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin, and H. Fuchs. The Office of the Future: A Unified Approach to Image-Based Modeling and Spatially Immersive Displays. Computer Graphics (Proceedings of SIGGRAPH'98), Annual Conference Series, pp. 179-188, 1998.

22. D. Schmalstieg, L.M. Encarnação, and Zs. Szalavári. Using Transparent Props for Interacting with the Virtual Table. Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics (I3DG'99), pp. 147-153, 1999.

23. Silicon Graphics, Inc. Powerwall. URL: http://www.stereographics.com/html/body_powerwall.html, 1997.

24. Sony, Corp. Glasstron. URL: http://www.ita.sel.sony.com/products/av/glasstron, 1999.

25. StereoGraphics Corp. CrystalEyes. URL: http://www.stereographics.com/html/-body_crystaleyes.html, 1998.

26. I. Sutherland. The Ultimate Display. Proceedings of IFIP'65, pp. 506-508, 1965.

27. R. Van de Pol, W. Ribarsky, and L. Hodges. Interaction Techniques on the Virtual Workbench. Proceedings of Virtual Environments'99, pp. 157-168, 1999.

28. Virtual I/O Display Systems, LLC. i-Glasses. URL: http://www.vio.com/html/body_i-glasses.php3, 1999.

29. R. Whitaker, C. Crampton, D. Breen, M. Tuceryan, and E. Rose. Object Calibration for Augmented Reality. Computer Graphics Forum (Proceedings of EUROGRAPHICS'95), vol. 14, no. 3, pp. 15-27, 1995.

30. M. Woo, J. Neider, and T. Davis. OpenGL Programming Guide. Addison-Wesley Developers Press, Reading, Massachusetts, 1997.


Figure 5: Experimental setup.

Figure 6: X-Ray: Overlaying forearm bones.

Figure 7: Real printer augmented with a virtual cartridge.

Figure 8: Scenario of Figure 7, observed from a different point of view.

Figure 9: Real workpiece complemented with a measured construction drawing of an extension.