
0018-9162/02/$17.00 © 2002 IEEE

Developing a Generic Augmented-Reality Interface

An augmented-reality (AR) interface dynamically superimposes interactive computer graphics images onto objects in the real world.1 Although the technology has come a long way from rendering simple wireframes in the 1960s,2 AR interface design and interaction space development remain largely unexplored. Researchers and developers have made great advances in display and tracking technologies, but interaction with AR environments has been largely limited to passive viewing or simple browsing of virtual information registered to the real world.3

To overcome these limitations, we seek to design an AR interface that provides users with interactivity so rich it would merge the physical space in which we live and work with the virtual space in which we store and interact with digital information. In this single augmented space, computer-generated entities would become first-class citizens of the physical environment. We would use these entities just as we use physical objects, selecting and manipulating them with our hands instead of with a special-purpose device such as a mouse or joystick. Interaction would then be intuitive and seamless because we would use the same tools to work with digital and real objects.

Tiles is an AR interface that moves one step closer to this vision. It allows effective spatial composition, layout, and arrangement of digital objects in the physical environment. The system facilitates seamless two-handed, three-dimensional interaction with both virtual and physical objects, without requiring any special-purpose input devices. Unlike some popular AR interfaces that constrain virtual objects to a 2D tabletop surface,4 Tiles allows full 3D spatial interaction with virtual objects anywhere in the physical workspace. A tangible interface, it lets users work with combinations of digital and conventional tools so that, for example, they can use sticky notes to annotate both physical and virtual objects. Because our approach combines tangible interaction with a 3D AR display, we refer to it as tangible augmented reality.

We do not suggest that tangible AR interfaces are perfect for all conceivable applications. Therefore, although our interface techniques can be applied broadly, we ground our design in a real-world application: the rapid prototyping and collaborative evaluation of aircraft instrument panels.

INTERFACE DESIGN SPACE DICHOTOMY

As the "Short History of Augmented Reality Interface Design" sidebar indicates, interface design space has been divided along two orthogonal approaches.

A 3D AR interface provides a spatially seamless augmented space where the user, wearing a head-mounted display, can effectively interact with both 2D and 3D virtual objects anywhere in the working environment. To interact with virtual content, however, the user must rely on special-purpose input devices that would not normally be used in real-world interactions. Thus, interaction discontinuities break the workflow, forcing the user to switch between virtual and real interaction modes.

The Tiles system seamlessly blends virtual and physical objects to create a workspace that combines the power and flexibility of computing environments with the comfort and familiarity of the traditional workplace.

Ivan Poupyrev, Sony CSL
Desney S. Tan, Carnegie Mellon University
Mark Billinghurst, University of Washington
Hirokazu Kato, Hiroshima City University
Holger Regenbrecht, DaimlerChrysler AG
Nobuji Tetsutani, ATR MIC Labs

Tangible interfaces, on the other hand, propose using multiple physical objects tracked on the augmented surface as physical handles or containers for interacting with virtual objects projected onto the surface. Tangible interfaces do not require any special-purpose input devices and thus provide intuitive and seamless interaction with digital and physical objects. However, spatial discontinuities break the interaction flow because the interface is localized on augmented surfaces and cannot be extended beyond them. Further, tangible interfaces offer limited support for interacting with 3D virtual objects.

We believe that these opposing approaches also complement each other. In Tiles, we design interfaces that bridge 3D AR and tangible interactions and thus produce a spatially and interactively seamless augmented workspace.

Short History of Augmented Reality Interface Design

In 1965, Ivan Sutherland built the first head-mounted display and used it to show a simple wireframe cube overlaid on the real world, creating the first augmented-reality interface. Developers of early AR interfaces who followed Sutherland similarly designed them mostly to view 3D virtual models in real-world contexts for applications such as medicine,1 machine maintenance,2 or personal information systems.3 Although these interfaces provided an intuitive method for viewing 3D data, they typically offered little support for creating or modifying the augmented-reality content.

Content Modification

Recently, researchers have begun to address this deficiency. Kiyoshi Kiyokawa4 uses a magnetic tracker, while the Studierstube5 project uses a tracked pen and tablet to select and modify augmented-reality objects. More traditional input techniques, such as handheld mice6 and intelligent agents,7 have also been investigated.

However, in all these cases the user must work with special-purpose input devices, separate from the tools used to interact with the real world. This limitation effectively results in two different interfaces: one for the physical workspace and another for the virtual one. Consequently, interaction discontinuities, or seams, disrupt the natural workflow, forcing the user to switch between virtual and real operation modes.

Tangible Interfaces

For more than a decade, researchers have been investigating an alternative approach: computer interfaces based on physical objects. The Digital Desk project8 uses computer vision techniques to track the position of paper documents and the user's hands on an augmented table. The user can seamlessly arrange and annotate both real paper and virtual documents using the same physical tools: a pen and a finger. Graspable9 and tangible user interfaces further explore the connection between virtual objects and the physical properties of input devices, using simple wooden blocks to manipulate virtual objects projected on a table's surface, for example.

Most importantly, tangible interfaces allow for seamless interaction because a single physical device represents each interface function or object. The device occupies its own location on an augmented surface, and users can access interface functions and use traditional tools in the same manner: through manual manipulation of physical objects.

Information display in tangible interfaces can be a challenge, however. Because changing an object's physical properties dynamically is difficult, these interfaces usually project virtual objects onto 2D surfaces.10 Users therefore cannot pick virtual objects off the augmented surface and manipulate them in 3D space as they would a real object; the system localizes the interaction to the augmented surface and cannot extend beyond it. Given that the tangible interface user cannot seamlessly interact with virtual objects anywhere in space, by, for example, moving a virtual object between augmented and nonaugmented workspaces, the tangible interface introduces a spatial seam into interaction.

References

1. M. Bajura, H. Fuchs, and R. Ohbuchi, "Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery Within the Patient," Proc. SIGGRAPH 92, ACM Press, New York, 1992, pp. 203-210.
2. S. Feiner, B. MacIntyre, and D. Seligmann, "Knowledge-Based Augmented Reality," Comm. ACM, vol. 36, no. 7, 1993, pp. 53-62.
3. J. Rekimoto and K. Nagao, "The World through Computer: Computer Augmented Interaction with Real World Environments," Proc. UIST 95, ACM Press, New York, 1995, pp. 29-36.
4. K. Kiyokawa, H. Takemura, and N. Yokoya, "Seamless Design for 3D Object Creation," IEEE Multimedia, vol. 7, no. 1, 2000, pp. 22-33.
5. D. Schmalstieg, A. Fuhrmann, and G. Hesina, "Bridging Multiple User Interface Dimensions with Augmented Reality Systems," Proc. ISAR 2000, IEEE CS Press, Los Alamitos, Calif., 2000, pp. 20-29.
6. T. Hollerer et al., "Exploring MARS: Developing Indoor and Outdoor User Interfaces to a Mobile Augmented Reality System," Computers & Graphics, vol. 23, 1999, pp. 779-785.
7. M. Anabuki et al., "Welbo: An Embodied Conversational Agent Living in Mixed Reality Spaces," Proc. CHI 2000, ACM Press, New York, 2000, pp. 10-11.
8. P. Wellner, "Interaction with Paper on the Digital Desk," Comm. ACM, vol. 36, no. 7, 1993, pp. 87-96.
9. G. Fitzmaurice and W. Buxton, "An Empirical Evaluation of Graspable User Interfaces: Towards Specialized, Space-Multiplexed Input," Proc. CHI 97, ACM Press, New York, 1997, pp. 43-50.
10. J. Rekimoto and M. Saitoh, "Augmented Surfaces: A Spatially Continuous Work Space for Hybrid Computing Environments," Proc. CHI 99, ACM Press, New York, 1999, pp. 378-385.

DESIGNING TILES

In Tiles, the user wears a lightweight head-mounted display with a small camera attached, both of which connect to a computer. The video capture subsystem uses the camera's output to overlay virtual images onto the video in real time, as described in the "Tracking and Registration in Tiles" sidebar. The system then shows the resulting augmented view of the real world on the head-mounted display so that the user sees virtual objects embedded in the physical workspace, as Figures 1 and 2 show. Computer-vision tracking techniques determine the 3D position and orientation of marked real objects so that virtual models can be exactly overlaid on them.5 By manipulating these objects, the user can control the virtual objects associated with them anywhere in space, without using any additional input devices.
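The paragraph above describes a per-frame pipeline: capture video, detect marked objects, recover their 3D pose, and render the associated virtual models. A minimal sketch of that loop might look like the following; all names are invented for illustration, and a stub stands in for the real computer-vision tracker (ARToolKit in Tiles).

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    position: tuple   # (x, y, z) of the marker in camera coordinates
    rotation: tuple   # orientation, e.g. Euler angles in degrees

@dataclass
class Marker:
    marker_id: str
    pose: Pose

@dataclass
class Scene:
    # maps a marker id to the virtual model currently bound to that tile
    bindings: dict = field(default_factory=dict)

def detect_markers(frame):
    """Stand-in for the vision tracker; a real system would process pixels."""
    return frame.get("visible_markers", [])

def render_frame(frame, scene):
    """Return (model, pose) draw calls for one augmented frame."""
    draw_calls = []
    for marker in detect_markers(frame):
        model = scene.bindings.get(marker.marker_id)
        if model is not None:            # empty data tiles draw nothing
            draw_calls.append((model, marker.pose))
    return draw_calls

scene = Scene(bindings={"tile-3": "altimeter"})
frame = {"visible_markers": [Marker("tile-3", Pose((0, 0, 50), (0, 0, 0))),
                             Marker("tile-7", Pose((10, 0, 50), (0, 0, 0)))]}
print(render_frame(frame, scene))        # only tile-3 has a model bound
```

The key property of the design survives even in this toy form: the marker's pose drives the virtual object directly, so moving the physical card is the entire input mechanism.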

Design requirements

Although we developed the Tiles system specifically for rapid prototyping and evaluation of aircraft instrument panels, we believe that this task's requirements have broad application to many common AR interface designs.

Aircraft instrument panel design requires the collaborative efforts of engineers, human factors specialists, electronics designers, and pilots. Designers and engineers always look for new technologies that can reduce the cost of developing instrument panels without compromising design quality. Ideally, they would like to evaluate instrument prototypes relative to existing instrument panels without physically rebuilding them. This inherently collaborative design activity involves heavy use of existing physical plans, documents, and tools.

Using observations of current instrument panel design, DASA/EADS Airbus and DaimlerChrysler engineers produced a set of interface requirements.6 They envisioned an AR interface that would let developers collaboratively outline and lay out a set of virtual aircraft instruments on a board that simulates an airplane cockpit. Designers could easily add and remove virtual instruments from the board using an instrument catalog. After placing the instruments on the board, they could evaluate and rearrange the instruments' positions. The interface should also allow the use of conventional tools, such as whiteboard markers, with physical schemes and documents so that participants can document problems and solutions.

Tracking and Registration in Tiles

An augmented-reality system's fundamental elements include techniques for tracking user position and viewpoint direction, registering virtual objects relative to the physical environment, and then rendering and presenting them to the user. We implemented the Tiles system with our ARToolKit, an open source library for developing computer-vision-based AR applications.

To create the physical tiles, we mark paper cards measuring 15 cm × 15 cm with simple square patterns consisting of a thick black border and unique symbols in the middle. We can use any symbol for identification as long as each symbol is asymmetrical enough to distinguish among the square border's four possible orientations.

The user wears a Sony Glasstron PLMS700 headset. Lightweight and comfortable, the headset provides an 800 × 600-pixel VGA image. A miniature NTSC Toshiba camera with a wide-angle lens attaches to the headset. The system captures the camera's video stream at 640 × 240-pixel resolution to avoid interlacing problems and then scales the image back to 640 × 480 pixels using a line-doubling technique.

By tracking rectangular markers of known size, the system can find the relative camera position and orientation in real time and can then correctly render virtual objects on the physical cards, as Figure A shows. Although the wide-angle lens distorts the video image, our tracking techniques are robust enough to track patterns correctly without losing performance. We implemented our current system on a 700-MHz Pentium III PC running Linux, which allows updates at 30 frames per second. For more information on the ARToolKit, see http://www.hitl.washington.edu/artoolkit/.

Figure A. The three-step process of mapping virtual objects onto physical tiles so that the user can view them with a head-mounted display.
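The "Tracking and Registration in Tiles" sidebar requires each marker symbol to be "asymmetrical enough" to disambiguate the square border's four orientations. That requirement can be made concrete: compare the detected interior pattern against each stored template in all four rotations, and accept a marker only when exactly one rotation matches. A toy sketch follows; the 3 × 3 binary patterns are illustrative, not actual Tiles markers.

```python
def rotate90(pattern):
    """Rotate a square binary pattern (list of lists) 90 degrees clockwise."""
    return [list(row) for row in zip(*pattern[::-1])]

def identify(detected, templates):
    """Return (marker name, clockwise rotation count) or None if no match."""
    for name, template in templates.items():
        candidate = detected
        for turns in range(4):
            if candidate == template:
                return name, turns
            candidate = rotate90(candidate)
    return None

# This template differs from itself under every nontrivial rotation,
# so both identity and orientation are recoverable.
templates = {"copy": [[1, 0, 0],
                      [1, 1, 0],
                      [1, 1, 1]]}

seen = rotate90(rotate90(templates["copy"]))   # marker viewed upside down
print(identify(seen, templates))               # ('copy', 2)
```

A rotationally symmetric symbol (say, a plain cross) would match in more than one rotation, which is exactly why the sidebar demands asymmetry.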

Basic concepts

We designed the Tiles interface around a set of simple interface principles that produce a generic and consistent AR interface.

The tile is a small cardboard card with a marker. It serves as a physical handle for interacting with virtual objects. Conceptually similar to icons in a graphical user interface, the tile acts as a tangible interface control that lets users interact with virtual objects just as they would with real objects, by physically manipulating the corresponding tiles. The resulting seamless interface requires no additional input devices to interact with virtual objects.

Although the tiles resemble physical icons, or phicons,3 they exhibit important differences. Phicons propose a close coupling between physical and virtual properties so that their shape and appearance mirror their corresponding virtual object or functionality.

The Tiles interface decouples the physical properties of interface controls from the data to create universal and generic data containers that can hold any digital data or no data at all: The tile is a blank object that has no associated data until the user creates an association at run time. Hence, techniques for performing basic operations, such as attaching data to tiles, remain the same for all tiles, resulting in a consistent and streamlined user interface.

We use two separate tile classes. Operation tiles define the interface's functionality and provide tools to perform operations on data tiles. The Tiles system always attaches animated 3D icons to operation tiles so that the user can identify them. Data tiles act as generic data containers. The user can associate or disassociate any virtual objects with data tiles at any time using operation tiles.

The user physically manipulates tiles to invoke operations between them, such as controlling the proximity between tiles, their distance from the user, or their orientation. The augmented working space is spatially seamless: aside from cable length, the user faces no restriction on interacting with the tiles.
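The two tile classes described above can be summarized in code. This is an illustrative sketch with assumed names, not the actual Tiles implementation: data tiles start blank and acquire content only through run-time association, while each operation tile carries one fixed function that acts on data tiles.

```python
class DataTile:
    """A generic container: blank until a virtual object is bound to it."""
    def __init__(self, tile_id):
        self.tile_id = tile_id
        self.content = None          # no associated data until run time

class OperationTile:
    """Carries a single fixed function, e.g. delete or copy."""
    def __init__(self, name, operation):
        self.name = name
        self.operation = operation

    def apply(self, data_tile):
        self.operation(data_tile)

def delete_op(tile):
    """The trash-can behavior: clear whatever the data tile holds."""
    tile.content = None

trash = OperationTile("trash", delete_op)
tile = DataTile("tile-1")
tile.content = "altimeter"           # association created at run time
trash.apply(tile)                    # bringing the trash tile near clears it
print(tile.content)                  # None
```

Because every data tile behaves identically regardless of its content, the interaction technique for attaching, copying, or deleting data is the same for all tiles, which is the consistency property the text emphasizes.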

Tiles interface

The Tiles interface consists of a metal whiteboard, a book, and two stacks of magnetic tiles that each measure approximately 15 cm × 15 cm. Sitting in front of the whiteboard, the user wears a lightweight, high-resolution Sony Glasstron head-mounted display with a video camera attached, as Figure 1 shows.

Figure 1. Tiles users collaboratively arrange tangible data containers and data tiles on the whiteboard and use traditional tools to add notes and annotations.

Figure 2. The user, wearing a lightweight head-mounted display with an attached camera, can see real objects and the virtual images registered on tiles.

The various tangible interface elements serve different purposes. The whiteboard provides the workspace where users can lay out virtual aircraft instruments. The book serves as a catalog, or menu object, that shows a different virtual instrument model on each page. One tile stack stores blank data containers, which show no content until users copy virtual objects onto them. The remaining tiles function as operation tiles that perform basic operations on the data tiles. Each operation has a unique tile associated with it. Currently supported operations include deletion, copying, and a help function. Each operation tile bears a different 3D virtual icon that shows its function and differentiates it from the data tiles, as Figure 3 shows.

Invoking operations. All tiles can be freely manipulated in space and arranged on the whiteboard. The user simply picks up a tile, examines its contents, and places it on the whiteboard. The user can invoke operations between two tiles by moving them next to each other.

For example, to copy an instrument to a data tile, the user first finds the desired virtual instrument in the menu book, then places any empty data tile next to it. After a one-second delay, a copy of the instrument smoothly slides from the menu page to the tile and can be arranged on the whiteboard. Similarly, the user can remove data from a tile by moving the trash can tile close to the data tile, which removes the instrument from it, as Figure 4 shows.

Using the same technique, we can implement copy and paste operations with a copy operation tile. The user can copy data from any data tile to the clipboard, then from the clipboard to any number of empty data tiles by moving empty tiles next to a virtual clipboard that has data in it, as Figure 5 shows. The clipboard's current contents can always be seen on the virtual clipboard icon. Users can display as many clipboards as they need; the current implementation has two independent clipboards.
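The proximity-with-delay trigger described above (two tiles brought together, operation firing after one second) reduces to a small dwell-timer state machine. A sketch, with an assumed 5-cm threshold; the constants and class name are invented for illustration.

```python
import math

PROXIMITY_CM = 5.0   # assumed trigger distance, not from the Tiles paper
DWELL_SEC = 1.0      # the one-second delay described in the text

class ProximityTrigger:
    def __init__(self):
        self.near_since = None       # when the pair first came within range

    def update(self, pos_a, pos_b, now):
        """Feed per-frame tile positions; return True once the dwell elapses."""
        dist = math.dist(pos_a, pos_b)
        if dist > PROXIMITY_CM:
            self.near_since = None   # tiles moved apart: reset the timer
            return False
        if self.near_since is None:
            self.near_since = now
        return now - self.near_since >= DWELL_SEC

trigger = ProximityTrigger()
print(trigger.update((0, 0, 0), (3, 0, 0), now=0.0))   # near, timer starts
print(trigger.update((0, 0, 0), (3, 0, 0), now=1.1))   # dwell elapsed
```

The dwell delay matters for a tangible interface: without it, merely carrying a tile past the trash tile would delete its contents.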

Getting help. The Tiles interface provides a help system that lets the user request assistance without shifting focus from the main task. This approach is more suitable for AR interfaces than traditional desktop help systems, which either distract users with a constant barrage of help messages or interrupt their work by making them search explicitly for help.

Figure 6 shows the two AR help techniques that Tiles provides. With Tangible Bubble Help, simply placing the help tile beside the tile the user requires help with brings up a text bubble next to the help icon, as Figure 6a shows. In some cases, however, users only need short reminders, or tips, about a particular tile's functionality. For these, the Tangible ToolTips technique triggers the display of a short text description associated with a tile when the user moves the tile within arm's reach and tilts it more than 30 degrees away from the body, as Figure 6b shows.
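The Tangible ToolTips condition is a conjunction of two tests on the tracked tile pose: within arm's reach, and tilted more than 30 degrees. A sketch of that check; the arm-reach distance and the use of the tile's normal vector relative to the camera's vertical axis are assumptions for this illustration, not values from the Tiles source.

```python
import math

ARM_REACH_CM = 60.0   # assumed "arm's reach" distance
TILT_DEG = 30.0       # the threshold named in the text

def tilt_angle(normal):
    """Angle in degrees between the tile's normal and the reference axis."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return math.degrees(math.acos(nz / length))

def show_tooltip(tile_pos, tile_normal):
    """True when the tile is close enough and tilted past the threshold."""
    distance = math.dist((0.0, 0.0, 0.0), tile_pos)   # camera at the origin
    return distance <= ARM_REACH_CM and tilt_angle(tile_normal) > TILT_DEG

print(show_tooltip((0, 0, 40), (0.0, 0.0, 1.0)))      # flat tile: no tip
print(show_tooltip((0, 0, 40), (0.0, math.sin(math.radians(45)),
                                math.cos(math.radians(45)))))  # tilted 45°
```

Requiring both conditions keeps tips from firing on tiles that merely sit on the whiteboard, which is what lets the user stay focused on the main task.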

Mixing physical and virtual tools. The Tiles interface lets users seamlessly combine conventional physical and virtual tools. For example, the user can physically annotate a virtual aircraft instrument using a standard whiteboard pen or sticky note, as Figure 7 shows.

Multiple users. We designed Tiles with collaboration in mind. Thus, the system lets several users interact in the same augmented workspace. All users can be equipped with head-mounted displays and can directly interact with virtual objects. Alternatively, users who do not wear head-mounted displays can collaborate with immersed users via an additional monitor that presents the augmented-reality view. Because all interface components consist of simple physical objects, both nonimmersed and immersed users can perform the same authoring tasks.

Figure 3. Operation tiles. (a) The printed design of the physical tiles that represent the delete, copy, and help operations, and (b) the virtual icons that represent the same three operations in the augmented-reality interface.

Figure 4. The user cleans a data tile by moving the trash can operation tile next to it.

Figure 5. Copying data from the clipboard to an empty data tile. The user moves the tile close to the virtual clipboard and, after a one-second delay, the virtual instrument slides smoothly into the data tile.

TILES IN OTHER APPLICATIONS

Our initial user observations showed that developers of tangible AR interfaces must focus on both the interface's physical design and the virtual icons' computer graphics design. Physical component designs can convey additional interface semantics. For example, the physical cards can be designed to snap together like pieces in a jigsaw puzzle, resulting in different functionality profiles depending on their physical configuration.
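One way to read the jigsaw idea above is that a chain of snapped-together operation cards could behave as the composition of each card's operation, applied in snap order. The following is a hypothetical sketch of that notion; the cards and their operations are invented examples, not part of the Tiles system.

```python
from functools import reduce

class SnapCard:
    """An operation card that can physically snap to its neighbors."""
    def __init__(self, name, operation):
        self.name = name
        self.operation = operation

def profile(chain):
    """Compose a snapped chain of cards into a single combined operation."""
    return lambda value: reduce(lambda v, card: card.operation(v), chain, value)

double = SnapCard("double", lambda v: v * 2)
increment = SnapCard("increment", lambda v: v + 1)

# The same cards in different physical configurations yield different profiles.
print(profile([double, increment])(10))   # (10 * 2) + 1 = 21
print(profile([increment, double])(10))   # (10 + 1) * 2 = 22
```

Because composition order matters, the physical arrangement itself carries interface semantics, which is exactly the point the paragraph makes.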

The interface model and interaction techniques introduced in Tiles can be extended easily to other applications that require AR interfaces. Object modification techniques, for example, can be introduced into Tiles by developing additional operation cards that would let the user dynamically modify objects through scaling, color changes, hand gestures, and so on. The interaction techniques we present here will remain applicable, however.

We found that the tight coupling of 3D input and display in a single interface component, the tile, let users perform complex functions through essentially simple spatial manipulation and physical arrangement of these tangible interface components. Thus, Tiles provides an application-independent interface that could lead to the development of generic AR interface models based on tangible augmented-reality concepts.

Although developing additional interaction techniques would let users apply Tiles to many different application scenarios, in AR environments the user can already switch easily between the AR workspace and a traditional environment. Some tools and techniques better suit augmented reality, while others work best in traditional form. Therefore, we believe that development of AR interfaces should not focus on bringing every possible interaction tool and technique into the AR workspace. Instead, it should focus on balancing and distributing features between the AR interface and other interactive media so that they all can be used within a single seamless augmented workspace.

AR interfaces also offer an ad hoc, highly reconfigurable environment. Unlike traditional GUI and 3D VR interfaces, in which the designer determines most of the interface layout in advance, in Tiles users can freely place interface elements anywhere they want: on tables or whiteboards, in boxes and folders, arranged in stacks, or grouped together. The interface configuration and layout of its elements emerge spontaneously as the result of users' work activity and evolve together with it. How interface components should be designed for such environments, and whether these systems should be aware of dynamic changes in their configuration, pose important research questions.

Figure 6. Tangible help in Tiles. (a) To access Tangible Bubble Help, users place the help tile next to the tile they need assistance with, which causes textual annotations to appear within a bubble next to the tile. (b) For less detailed explanations, Tangible ToolTips displays an associated short text description when the user moves the tile closer and tilts it.

Figure 7. Physically annotating virtual objects in Tiles. Because the printed tiles offer a physical anchor for virtual objects, users can make notes adjacent to them using marking pens and sticky notes.

Acknowledgments

This work represents a joint research initiative carried out with support from DASA/EADS Airbus, DaimlerChrysler AG, and ATR International. We thank Keiko Nakao for designing the 3D models and animations used in Tiles.

References

1. R. Azuma, "A Survey of Augmented Reality," Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, 1997, pp. 355-385.
2. I. Sutherland, "The Ultimate Display," Proc. Int'l Federation of Information Processing, Spartan Books, Washington, D.C., 1965, pp. 506-508.
3. H. Ishii and B. Ullmer, "Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms," Proc. CHI 97, ACM Press, New York, 1997, pp. 234-241.
4. B. Ullmer and H. Ishii, "The MetaDesk: Models and Prototypes for Tangible User Interfaces," Proc. UIST 97, ACM Press, New York, 1997, pp. 223-232.
5. H. Kato and M. Billinghurst, "Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System," Proc. 2nd Int'l Workshop on Augmented Reality, IEEE Press, Los Alamitos, Calif., 1999, pp. 85-94.
6. I. Poupyrev et al., "Tiles: A Mixed Reality Authoring Interface," Proc. Interact 2001, IOS Press, Netherlands, 2001, pp. 334-341.

Ivan Poupyrev is an associate researcher in Sony Computer Science Labs' Interaction Lab, Tokyo, where he investigates the interaction implications of computer-augmented environments. Poupyrev received a PhD in computer science from Hiroshima University. Contact him at poup@csl.sony.co.jp.

Desney S. Tan is a PhD candidate at Carnegie Mellon University. His research interests include designing human-computer interfaces that augment human cognition, specifically leveraging existing psychology principles on human memory and spatial cognition in exploring multimodal, multidisplay information systems. He received a BS in computer engineering from the University of Notre Dame. Contact him at [email protected] or see http://www.cs.cmu.edu/~desney.

Mark Billinghurst is a researcher at the University of Washington, Seattle's Human Interface Technology Laboratory. His research interests include augmented and virtual reality, conversational computer interfaces, and wearable computers, with his most recent work centering on using augmented reality to enhance face-to-face and remote conferencing. Billinghurst received a PhD in electrical engineering from the University of Washington. Contact him at [email protected].

Hirokazu Kato is an associate professor in the Faculty of Information Sciences, Hiroshima City University. His research interests include augmented reality, computer vision, pattern recognition, and human-computer interaction. Kato received a PhD in engineering from Osaka University. Contact him at [email protected].

Holger Regenbrecht is a scientist at the DaimlerChrysler Research Center in Ulm, Germany. His research interests include interfaces for virtual and augmented environments, virtual-reality-aided design, perception of virtual reality, and AR/VR in the automotive and aerospace industries. Regenbrecht received a doctoral degree from the Bauhaus University Weimar, Germany. Contact him at [email protected].

Nobuji Tetsutani is the head of Department 3 in ATR Media Information Science Laboratories. His research interests include application development for high-speed networks. Tetsutani received a PhD in electrical engineering from Hokkaido University, Hokkaido, Japan. Contact him at [email protected].
