
Massively Multiplayer Online Worlds as a Platform for Augmented Reality Experiences

Tobias Lang∗

Ludwig-Maximilians-Universität München

Blair MacIntyre†

School of Interactive Computing, GVU Center,

Georgia Institute of Technology

Iker Jamardo Zugaza‡

University of Deusto

ABSTRACT

Massively Multiplayer Online Worlds (MMOs) are persistent virtual environments where people play, experiment and socially interact. In this paper, we demonstrate that MMOs also provide a powerful platform for Augmented Reality (AR) applications, where we blend together locations in physical space with corresponding places in the virtual world. We introduce the notion of AR stages, which are persistent, evolving spaces that encapsulate AR experiences in online three-dimensional virtual worlds. We discuss the concepts and technology necessary to use an MMO for AR, including a novel set of design concepts aimed at keeping such a system easy to learn and use. By leveraging the features of the commercial MMO Second Life, we have created a powerful AR authoring environment accessible to a large, diverse set of users.

Keywords: Augmented Reality, Virtual Reality, Massively Multiplayer Online Worlds, User Interfaces, Second Life

Index Terms: I.3.7 [Three-Dimensional Graphics and Realism]: Virtual reality—; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities—;

1 INTRODUCTION

Despite years of research and the availability of many tools, Augmented Reality (AR) is not widely used. The limitations (and expense) of current tracking and display systems are partially to blame, particularly for casual and home use. Yet, AR has not managed to attract a base of users even among those who would be willing to work within the limitations of the current technology. Existing dedicated frameworks for AR development aim to address many challenges faced by AR developers. Unfortunately, they are typically rooted in academic environments and require strong technical knowledge, with only a small amount of support. Likewise, these tools are usually based on custom rendering, interaction and scripting libraries. This complicates collaborative development and efficient asset exchange among different applications.

The work reported here was originally motivated by the desire to explore the idea of AR machinima¹, where physical and virtual content can be easily mixed in real-time. Machinima productions often rely on multiplayer gaming platforms to combine the efforts of several participants. Each of them controls parts of the movie, such as the camera or virtual actors. Given the growing success of machinima [12], we began exploring how to use similar tools to incorporate compelling AR content rather than the purely virtual films associated with the machinima movement.

MMOs are particularly intriguing as a platform for AR due to their impressive feature sets. The always-available and persistent virtual space usually is a complete synthetic world, offering a rich source of artifacts that can be used to augment physical environments. In combination with real-time text and voice chat facilities, MMOs also enable distributed, collaborative work. Remote users can join together, author and perform in high-quality virtual sets. The typically huge user base of MMOs ensures a rapidly evolving platform and a potentially large community unrivaled by existing AR development tools. By maintaining compatibility with an existing MMO infrastructure, an AR editor benefits from the existing support facilities and assets, such as 3D models, animations and behaviors.

∗e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]

An MMO-based AR authoring tool is not a panacea for the problem of AR authoring. While MMOs cannot be used to develop every AR application, this approach can be very successful for appropriate experiences. MMOs are rooted in entertainment, one of the major application areas we foresee for MMO-based AR. For that reason, our particular implementation offers novices a unified environment to augment physical locations with arbitrary graphics. Possible applications range from pure visual augmentation, such as a furniture design tool, to AR machinima and sophisticated games similar to ARQuake. Limitations arise when spatial AR techniques or mobile setups, such as handheld AR, are used, or when multiple users need to see variations of the same graphical world.

The contribution of this paper is to demonstrate how an MMO can be used as a platform for AR experiences, and the reasoning behind our approach. To illustrate how MMOs can provide a kind of collaborative WYSIWYG² authoring environment for AR, we have built a custom AR interface for the MMO Second Life. Associated three-dimensional in-world tools and a novel design metaphor called the AR Stage are used to design and control an AR application. AR Stages encapsulate an AR experience in a defined area in virtual space. Our approach goes beyond current AR tools in terms of its capabilities and ease of use, since users can craft an AR stage collaboratively and intuitively in three-dimensional space.

2 RELATED WORK

Existing authoring frameworks for Augmented Reality take a variety of approaches to address the complexity of fusing virtual and physical worlds. The widely used ARToolkit [6] provides C/C++ developers with fiducial-based real-time tracking, as well as simple video capture and rendering support. More recent developments like OSGART [10] leverage open source graphics engines to provide a more streamlined and powerful system architecture, but are still aimed at programmers instead of designers.

Other research projects have tried to provide a data-driven development model. Systems such as APRIL [8] are organized in a component-based structure. They aim to be flexible and to abstract away the underlying low-level APIs. Most of these authoring tools still require considerable technical knowledge to create an AR application. The Designer's AR Toolkit (DART) [11] attempts

¹ Machinima is a production technique for creating videos using real-time graphics and game engines.

² What you see is what you get.


IEEE Virtual Reality 2008, 8-12 March, Reno, Nevada, USA. 978-1-4244-1971-5/08/$25.00 ©2008 IEEE


to circumvent this problem by leveraging Macromedia Director, an existing media production package. A familiar multimedia authoring tool opens up AR authoring to a broader user group. Thomas et al. [14] integrated AR technology into the commercial video game Quake. Their ARQuake application immersed its players in a mixed reality indoor/outdoor environment perceived from a first-person perspective. However, the system was not designed to be an AR authoring tool.

In so-called immersive authoring environments, a Virtual Reality experience is created while in the running system, supporting intuitive and collaborative design coupled with instant testing and feedback. The idea of immersive 3D authoring has gained wide attention in the research community (e.g., [9]) and in commercial systems such as Virtools. Freeman et al. [4] apply immersive authoring principles to AR in order to interactively model virtual objects resembling their physical counterparts. The physical environment is perceived via video with virtual graphics modeled on top. Once the relationship between the virtual and physical objects is established, the information can also be used for tracking purposes.

Shared augmented environments present an augmented view to several participants at the same time. All participants perceive the same augmented artifacts inhabiting the same position in the real world, enabling collaborative tasks (compare Billinghurst et al. [2]). Benford et al. [1] examined the nature of shared spaces, such as collaborative virtual environments, and developed classifications for those shared spaces. They also took a mixed-reality approach into account and staged a poetry performance that occurred simultaneously in physical and virtual space. Their findings regarding the social interactions between both spaces are of interest for future mixed-reality performances.

3 LEVERAGING MMOS FOR AUGMENTED REALITY

Massively Multiplayer Online Worlds aim to provide a single virtual world that is shared among all users at the same time. MMOs are often used by their players as social spaces for experimentation, exploration and self-expression. MMOs are designed to be always online and persistent. Changes made to the environment remain until they are explicitly modified.

While many MMOs provide mechanisms suitable for AR authoring, only a few are suitable for use as general purpose AR authoring environments. At a practical level, the source code of the client must be available to implement the required AR-related modifications. More generally, users must be able to freely create and modify 3D models, textures, sounds and animations, without needing special permissions from those running the MMO. The commercial MMO Second Life provided an ideal mix of capabilities, combined with a huge user base.

3.1 Benefits for Augmented Reality

Two key features of an MMO that simplify AR authoring are the persistent 3D world and support for a virtual economy.

The persistent 3D world at the heart of MMOs like Second Life is a key enabler for authoring. The entire description of the virtual world is not stored locally on one computer, but hosted on remote servers which run and simulate the world continuously, whether a client is connected or not. The mechanisms of the virtual space are completely transparent to the user, who simply visits the appropriate 3D locations with his avatar. All content is cached transparently; no files have to be loaded or updated manually. We leverage the persistent nature of MMOs for collaboration, setup and publishing of an AR experience. Every alteration is immediately sent to other collaborators, and conflicts are handled by the MMO infrastructure. Persistence is also useful for runtime control of an AR experience and its distribution. In our approach, an AR experience is completely encapsulated in a 3D space. We call this design concept AR Stages and will describe it later in more detail.

Pervasive support for user-created content, accompanied by economic and copy protection mechanisms, fosters the creation of an in-world economy. In Second Life in particular, a vast virtual marketplace of rich and varied content exists, simplifying the creation of the necessary content for AR experiences. Many AR experiences rely on rich 3D content, and creating these 3D models from scratch, with realistic textures and animations, is a time-consuming process that requires expertise and artistic skills. The virtual marketplace provides an easy way to obtain pre-made 3D content, or to hire distributed experts to create custom content, and all of its mechanisms are integrated into the virtual world.

These features point to the value of a future "3D web" for both AR and VR experience and application development. A more general, world-wide and open MMO would likely have both of these features, which will be key to its success.

3.2 AR Components

None of the existing Massively Multiplayer Online Worlds were designed with Augmented Reality in mind. Hence, all AR-related functionality has to be integrated before an MMO can be used for AR authoring. This section presents these AR components and provides an abstract view of their integration into Second Life.

3.2.1 Custom In-World Commands

MMOs are based on network messages to display and control the virtual world. Often, new messages can only be introduced by the original developer of the MMO. However, AR requires a specific set of messages to function. For example, messages are necessary to control the spatial relationships between physical and virtual space. If custom messages are not supported a priori, they must be integrated by leveraging other mechanisms such as instant messages (IM). Custom commands can then be encoded in a short text message and sent to other clients. There, these strings can be intercepted in the client API to control AR functionality at the source code level. In our Second Life client, the user sends specially formatted messages via the in-world scripting language LSL. Our software checks every incoming IM to see if its syntax resembles a custom function call by using a regular expression. If such a test is successful, the message is parsed and split into the actual command and an arbitrary number of function parameters.
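
As an illustration, this screening step can be sketched as follows. The `@ar` prefix and the `command(arg, ...)` syntax are invented for the example; the exact message format used by our client is not specified here.

```python
import re

# Hypothetical wire format for custom AR commands embedded in instant
# messages, e.g. "@ar set_clip_plane(0.0, 1.5, -2.0)".  The prefix and
# argument syntax are illustrative assumptions, not the client's format.
COMMAND_RE = re.compile(r"^@ar\s+(\w+)\((.*)\)$")

def parse_ar_command(message):
    """Return (command, [args]) if the IM encodes an AR command, else None."""
    match = COMMAND_RE.match(message.strip())
    if match is None:
        return None                          # ordinary chat message: ignore
    command, arg_string = match.groups()
    args = [a.strip() for a in arg_string.split(",")] if arg_string.strip() else []
    return command, args

print(parse_ar_command("@ar set_clip_plane(0.0, 1.5, -2.0)"))
print(parse_ar_command("hello there"))   # ordinary chat falls through: None
```

Every incoming IM is screened this way before normal chat handling; anything that fails the pattern is passed through untouched.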

3.2.2 Real-Time Video-Mixed AR

Video-mixed augmentation, the superimposition of virtual graphics on top of a video image, is a common approach to AR. The implementations range from simple AR applications that display augmented video streams on a computer screen, to immersive experiences using video-mixed head-mounted displays. In our implementation, the camera images are acquired using an extensible video capture library which abstracts platform, hardware and API-dependent differences. Inside Second Life, a custom user interface panel allows the user to control one or more cameras. As in most video-mixed AR applications, we display the video in the background, superimposed by virtual graphics. Existing background graphics that might occlude the video, such as sky and water, can be disabled in our viewer. Since some AR experiences might still want to use these graphics, we provide programmers with flexible run-time control of their display via LSL.
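
The background/overlay ordering can be illustrated with a toy per-pixel composite: the video frame is the base layer, and virtual pixels replace it wherever the renderer produced opaque output. The flat RGBA-tuple frame representation is purely illustrative.

```python
# Toy composite of a video background with a rendered virtual layer,
# mirroring the draw order of video-mixed AR: video first, virtual
# graphics on top wherever they produced visible pixels.
def composite(video, virtual):
    out = []
    for bg, fg in zip(video, virtual):
        out.append(fg if fg[3] > 0 else bg)   # alpha > 0: virtual pixel wins
    return out

video = [(10, 10, 10, 255)] * 4                        # dark camera image
virtual = [(0, 0, 0, 0), (200, 50, 50, 255),
           (0, 0, 0, 0), (0, 0, 0, 0)]                 # one augmented pixel
print(composite(video, virtual))
```

In the real viewer this happens on the GPU, with the video drawn as a full-screen background before the 3D scene is rendered over it.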

3.2.3 Spatial Registration

Spatial registration encompasses the accurate acquisition of position and orientation data of physical entities and is needed for two reasons. First, the pose of the virtual camera must be matched with the physical video camera. Only then can virtual graphics be perceived from the same point of view as the physical environment, leading to a perspectively-correct graphics overlay. Second, virtual objects should be able to be controlled by real-time tracking of



objects in the world. The movement of a physical prop should be reflected by some action in the virtual world.

In our implementation, we incorporate the open source library VRPN³ to support a wide range of tracking hardware. Careful consideration has been given to the variety of Cartesian coordinate systems that are inevitable when combining different devices into a 3D graphical world. We provide the user with a 2D user interface to specify the necessary coordinate transformations, along with the possibility to store these values persistently in the virtual world for later retrieval.
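
A minimal sketch of such a coordinate transformation, reduced to a yaw rotation plus a translation; the calibration values stand in for what a user would enter in the dialog, and the numbers are made up.

```python
import math

def tracker_to_world(p, yaw_deg, offset):
    """Rotate a tracker-space point about the vertical axis, then translate.

    yaw_deg and offset stand in for the user-specified calibration between
    tracker and world coordinates; a full solution would use a 4x4 matrix.
    """
    yaw = math.radians(yaw_deg)
    x, y, z = p
    wx = math.cos(yaw) * x - math.sin(yaw) * z
    wz = math.sin(yaw) * x + math.cos(yaw) * z
    ox, oy, oz = offset
    return (wx + ox, y + oy, wz + oz)

# A pose reported by the tracking library, re-expressed in world space:
print(tracker_to_world((1.0, 0.0, 0.0), 90.0, (10.0, 0.0, 5.0)))
```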

In most MMOs, the virtual camera can be controlled locally in real-time, independent from the user's avatar position. No information has to be shared over the network, since the personal view is not shared with other participants. In contrast, controlling 3D objects via tracking requires constant synchronization with the servers, which distribute the updated pose of an object to other users, and perform additional calculations such as collision detection and physics. This behavior is an inherent limitation of MMOs, which tend not to be optimized to send high frequency real-time edits to other clients. Thus, changes will not be perceivable by other users immediately, due to architectural optimizations (e.g., buffering updates) and network delays.
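
One pragmatic mitigation, sketched below, is to throttle tracker-driven object updates on the client before submitting them to the servers. The 10 Hz cap is an assumed figure, not one measured against Second Life.

```python
import time

class PoseThrottle:
    """Forward at most `max_hz` pose updates per second; keep the newest."""

    def __init__(self, max_hz=10.0, clock=time.monotonic):
        self.min_interval = 1.0 / max_hz
        self.clock = clock
        self.last_sent = float("-inf")
        self.pending = None              # newest pose not yet sent

    def offer(self, pose):
        """Record a pose; return it if it should be sent now, else None."""
        self.pending = pose
        now = self.clock()
        if now - self.last_sent >= self.min_interval:
            self.last_sent = now
            self.pending = None
            return pose
        return None
```

A real client would also flush `pending` on a timer, so the final pose of a motion is never lost even when it arrives between send slots.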

3.2.4 Occlusion of Graphics by Physical Objects

Occlusion between objects provides important cues to human depth perception. Given the importance of this phenomenon, it is very desirable to account for occlusion in Augmented Reality as well. To decide the spatial order between a physical and a virtual object, depth information about the physical scene is necessary, but difficult to acquire. The captured video is just a 2D projection of the environment, and any information besides color is lost. To circumvent this problem, the shapes of fixed physical objects can be rebuilt in the virtual space and the "phantom objects" placed accordingly [5]. The immersive authoring capabilities of MMOs tremendously simplify this time-consuming process, since the result can be evaluated instantly and the phantom objects modified if necessary.

The phantom objects have to be treated specially by the rendering pipeline, which requires them to be distinguished from normal geometry. In MMOs that do not allow users to attach custom data to entities, such as Second Life, the property needs to be encoded in the object in some other way. In our implementation, we use a special object color. Our software recognizes this specific RGB value and treats the objects accordingly.
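
For example, the renderer's dispatch can be reduced to a lookup on the object color. The sentinel RGB value below is invented; the only assumption from the text is that one specific color is reserved for phantoms.

```python
# Phantom geometry is drawn into the depth buffer only: it occludes
# virtual graphics behind it, while the video remains visible where the
# physical object stands.  The sentinel color is an assumed value.
PHANTOM_RGB = (255, 0, 255)

def render_state_for(obj_rgb):
    """Map an object's color to the render state applied by the pipeline."""
    if obj_rgb == PHANTOM_RGB:
        return {"write_color": False, "write_depth": True}   # phantom
    return {"write_color": True, "write_depth": True}        # normal geometry
```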

3.2.5 Limitation of Viewable Space

Most of the virtual land in an MMO such as Second Life already contains virtual content, or has nearby content created by other users. Virtual graphics which are not intended for the augmentation have to be prevented from rendering (see Figure 1(a)). In our implementation, we typically want to hide graphics outside of a certain area in the virtual space, the aforementioned AR Stage. We achieve this by leveraging OpenGL clipping planes. In-world, we allow the specification of certain planes in 3D which mark the boundary between visible and invisible space. Users can intuitively set these clipping planes, represented by thin 3D cuboids, and enable or disable them at any time.
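
The visibility test behind such clipping planes amounts to a half-space check per plane, the same computation OpenGL user clip planes perform in hardware. The plane positions below are illustrative.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def visible(point, planes):
    """planes: list of (normal, d); a point survives if n.p + d >= 0 for all."""
    return all(dot(n, point) + d >= 0.0 for n, d in planes)

# An AR Stage bounded at x >= 0 and x <= 10 by two opposing planes:
stage = [((1.0, 0.0, 0.0), 0.0), ((-1.0, 0.0, 0.0), 10.0)]
print(visible((5.0, 1.0, 1.0), stage))    # inside the stage: True
print(visible((12.0, 1.0, 1.0), stage))   # outside, would be clipped: False
```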

3.2.6 Post-Processing Effects

The visual quality of the superimposed graphics influences the overall realism of an Augmented Reality experience. Today, real-time computer graphics achieve a very high visual quality, but artificial graphics are still easily distinguishable from physical artifacts. Fischer et al. [3] proposed the use of post-processing effects to equalize the fidelity of both the video image and the virtual graphics. We support such effects by providing a flexible compositing mechanism inside Second Life. The rendered 2D output can be processed by programmable pixel shaders in one or several rendering passes, and the post-processing functionality is fully controllable during runtime by using text-based effect specifications stored persistently inside the virtual world.

³ http://www.cs.unc.edu/Research/vrpn/
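
As a sketch of how a text-based effect specification can drive the compositor, consider the line-oriented `pass name key=value` format below; it is an invented stand-in for the actual syntax stored in-world.

```python
def parse_effect_spec(spec):
    """Parse a toy effect description into an ordered list of shader passes."""
    passes = []
    for line in spec.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                                 # skip blanks and comments
        tokens = line.split()
        if tokens[0] != "pass":
            raise ValueError("unknown directive: " + tokens[0])
        params = dict(t.split("=", 1) for t in tokens[2:])
        passes.append((tokens[1], params))
    return passes

spec = """
# two-pass stylization, applied to the composited 2D output
pass edge_detect threshold=0.2
pass posterize levels=4
"""
print(parse_effect_spec(spec))
```

Each named pass would select a pixel shader and feed it the parsed parameters, with the output of one pass becoming the input of the next.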

4 THE AR STAGE DESIGN CONCEPT

The previous section identified the components and functionality that have to be integrated into an MMO in order to use it for AR purposes. To leverage the immersive authoring, persistence and collaboration capabilities of the MMO, it is important to build an AR design process around these features. Our proposed paradigm for Augmented Reality authoring in MMOs is called the AR Stage. An AR Stage is a defined area in virtual space that encapsulates all relevant information and controls to perform the AR experience. Using this paradigm, AR experiences are less like "applications" that are designed, loaded and executed, and more like "installations" which are built and evolve in a distinct area in the virtual world. Users, embodied through their avatars, visit an AR Stage to design and participate in an AR experience.

The notion of AR Stages is inspired by theatrical stages or movie sets, which also create a different reality. This reality is limited to the area a stage encompasses and is entirely defined by its contents. Like its physical counterpart, an AR Stage has an actual performance area, along with a "backstage" holding all supporting equipment (see Figure 1(b)). This equipment is represented in the form of interactive 3D objects. The purpose of these "AR control objects" can be marked with visual cues, encoding so-called affordances in their appearance [7] (see Figure 1(c)). 3D control objects tremendously simplify the creation of AR experiences. Users can experiment with, and design for, AR in an immediate manner, without relying on offline input or external configuration files.

AR Stages are created with one particular AR experience in mind. However, many of these stages can exist in parallel, each representing a variation of the same experience, a completely different AR experience, or a distinct part of a larger combined one. Given a physical room, this room can easily be augmented using different AR stages.

4.1 Performance Area

The unique feature of an AR Stage is its actual performance space, which is mapped to the physical world. Hence, every action in the virtual world is reflected in the real world. This uni-directional mapping can be extended to a bi-directional one by sensing mechanisms in the real world, such as a tracking sensor which sends its pose information to the virtual world. This pose can be used to control the position and orientation of virtual objects, bridging virtual and physical reality.

As soon as a designated area is chosen in the virtual space, the virtual content can be created and placed using the immersive modeling tools the MMO provides. This process can be done collaboratively, by several users at the same time. To evaluate the appearance of the virtual graphics in the final augmented view, it is possible to switch to the Augmented Reality mode at any time and perceive the virtual graphics superimposed on the live video feed. Features like the MMO scripting language and animations can be used to automate the AR experience and incorporate interactive behavior. AR stages also support acting in mixed space, delineating the space in which other avatars can enter the performance area to become part of the augmentation. This allows scenarios in which real people can interact and perform with remotely-controlled avatars (see Figure 1(d)).



4.2 Backstage Area

While the performance area entails all assets that appear in the augmented view, the backstage area is used to control the AR experience. Various aspects of an AR experience require input from the user to specify certain configurations or to change the experience during runtime. These settings are stored persistently in an AR Stage and can be retrieved by anyone at any time.

In the AR Stage paradigm, most of the controls are actual 3D objects themselves, in contrast to 2D user dialogs or text files. The 3D control objects are programmed and created in the same way as other virtual objects of the MMO. Hence they can be created and changed quickly and easily, using AR-related extensions to the MMO scripting language. The purpose of the control objects relates to the AR components mentioned earlier. They are used to control the appearance of background graphics and clipping planes, store tracking transformations, depict post-effect descriptions and specify the mapping between physical and virtual space.

5 CASE STUDIES

So far, we have used our AR interface for Second Life to build several AR experiences, focusing on rapid prototyping, AR machinima and live mixed-reality performance involving actors and bystanders.

To test the prototyping capabilities of MMO-based AR authoring, we re-created another research project from our lab. There, we are adapting the UNC Pit experiment [13] to study the concept of presence in AR environments. Our Second Life based prototype was used to quickly test depth perception and different sound effects, and to change the virtual content in combination with various post-processing effects (see Figure 1(e)). Our setup utilized a 6-DOF tracker from InterSense in combination with a video-mixed HMD. It took the first author less than one hour to build a working prototype and try a range of design ideas with the experiment team.

Our authoring tool was designed for ease of use. We have tested the system in the hands of undergraduate media design students who are not experienced in AR, or even in 3D authoring. Students created an eight-minute-long AR machinima piece (see Figure 1(f)) involving real and virtual actors, as well as working with local acting companies to explore the possibilities of live AR performance. Interviews showed an overall satisfaction with our system and the appeal of the AR Stage design concept.

6 DISCUSSION AND CONCLUSIONS

The major strength we foresee in applying MMOs to AR lies in the immersive, intuitive and collaborative nature of MMO-based AR design; the same can also be said for their potential for VR experience design. The evolving AR experiences are always online, persistent and open to changes. The AR Stage concept allows simultaneous design and encapsulates the configuration of an AR application in an intuitive and flexible way. By incorporating and extending the MMO scripting language, there is no shortage of experienced programmers who can join the effort from all over the world.

In the future, we will use our AR interface for Second Life to create additional AR experiences, and refine and expand our software according to emerging needs. Special attention will be given to user interaction, because existing mechanisms already allow the creation of tangible user interfaces based on, for example, ARToolkit markers. Finally, we will assess the limits of our approach by creating a broader range of AR experiences, such as games or mobile setups.

REFERENCES

[1] S. Benford, C. Greenhalgh, G. Reynard, C. Brown, and B. Koleva. Understanding and constructing shared spaces with mixed-reality boundaries. ACM Trans. Comput.-Hum. Interact., 5(3):185–223, 1998.

[2] M. Billinghurst, S. Weghorst, and T. Furness. Shared Space: An augmented reality interface for computer supported collaborative work. In Proc. Collaborative Virtual Environments '96, 1996.

Figure 1: Augmented Reality with MMOs.
(a) Existing graphics disturb the augmented view.
(b) An AR Stage consists of a performance area and a backstage area.
(c) Interactive 3D objects control an AR experience.
(d) Real people can interact with virtual avatars.
(e) A viewer perceives an artificial pit by augmentation.
(f) An AR machinima piece involving virtual and real actors.

[3] J. Fischer and D. Bartz. Stylized augmented reality for improved immersion. In VR '05, pages 195–202, 325, Washington, DC, USA, 2005. IEEE Computer Society.

[4] R. Freeman and A. Steed. Interactive modelling and tracking for mixed and augmented reality. In VRST '06, pages 61–64, New York, NY, USA, 2006. ACM.

[5] A. Fuhrmann, G. Hesina, F. Faure, and M. Gervautz. Occlusion in collaborative augmented environments. Computers and Graphics, 23(6):809–819, 1999.

[6] H. Kato and M. Billinghurst. Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In IWAR '99, page 85, Washington, DC, USA, 1999. IEEE Computer Society.

[7] D. Kirsh. The intelligent use of space. Artif. Intell., 73(1-2), 1995.

[8] F. Ledermann and D. Schmalstieg. APRIL: A high-level framework for creating augmented reality presentations. In VR '05, pages 187–194, Washington, DC, USA, 2005. IEEE Computer Society.

[9] G. A. Lee, G. J. Kim, and C.-M. Park. Modeling virtual object behavior within virtual environment. In VRST '02, pages 41–48, New York, NY, USA, 2002. ACM Press.

[10] J. Looser, R. Grasset, H. Seichter, and M. Billinghurst. OSGART - a pragmatic approach to MR. New Zealand, 2006.

[11] B. MacIntyre, M. Gandy, S. Dow, and J. D. Bolter. DART: a toolkit for rapid design exploration of augmented reality experiences. In UIST '04, pages 197–206, New York, NY, USA, 2004. ACM Press.

[12] A. Mazalek and M. Nitsche. Tangible interfaces for real-time 3D virtual environments. In ACE '07, pages 155–162, New York, NY, USA, 2007. ACM Press.

[13] M. Meehan, B. Insko, M. Whitton, and F. P. Brooks, Jr. Physiological measures of presence in stressful virtual environments. In SIGGRAPH '02, pages 645–652, New York, NY, USA, 2002. ACM Press.

[14] B. Thomas, B. Close, J. Donoghue, J. Squires, P. D. Bondi, and W. Piekarski. First person indoor/outdoor augmented reality application: ARQuake. Personal Ubiquitous Comput., 6(1):75–86, 2002.
