
Journal of Virtual Reality and Broadcasting, Volume 3 (2006), no. 1

Playing with the Real World

Paul Holleis, Matthias Kranz, Anneke Winter, Albrecht Schmidt

University of Munich, Research Group Embedded Interaction

Amalienstraße 17, 80333 Munich, Germany

{paul,matthias,anneke,albrecht}@hcilab.org, www.hcilab.org

Abstract

In this paper we provide a framework that enables the rapid development of applications using non-standard input devices. Flash is chosen as the programming language since it can be used for quickly assembling graphical applications. We overcome the difficulties Flash has in accessing external devices by introducing a very generic concept: the state information generated by input devices is transferred to a PC, where a program collects it, interprets it and makes it available on a web server. Application developers can then integrate a Flash component that accesses the data stored in XML format and use it directly in their application. We show two examples, one from a pervasive gaming background and one from an installation in an office setting.

Keywords: Design, Experimentation, Human Factors, Pervasive Computing, Input Device, Output Device, Sensors, Game Controller, Playful Computing

Digital Peer Publishing Licence
Any party may pass on this Work by electronic means and make it available for download under the terms and conditions of the current version of the Digital Peer Publishing Licence (DPPL). The text of the licence may be accessed and retrieved via Internet at http://www.dipp.nrw.de/.
First presented at the Workshop PerGames 2005 held at PERVASIVE 2005, extended and revised for JVRB.

1 Introduction

Gaming is and always has been ubiquitous and pervasive: children play card games on long travels in the back seat of the car, teenagers use their high-end personal computers, and business people play with their highly sophisticated mobile devices. As stated in [Nie05], killing time is the killer application, and gaming is certainly well suited to this purpose.

In this paper we present a general concept for rapidly prototyping games that reach out into the physical world. Our focus is on non-standard input techniques using physically embedded controllers. We concentrate on prototyping devices beyond mouse and keyboard, as we think that special-purpose devices are much more interesting and better suited to gaming.

We provide a general architecture for communication between input and output devices and applications using existing standards and protocols. The data representing the physical state of the input device, especially sensor data, is transferred to a server where it is made available to applications. The actual sensor data is provided in human-readable form, coded in XML. The server is a lightweight web server that can be accessed via the HTTP protocol, which is available in a great variety of programming platforms and languages.

On the gaming application side, we demonstrate the integration of novel input devices in Macromedia Flash. This multimedia authoring tool and its programming language ActionScript offer great flexibility and the possibility to integrate all kinds of media (sound, graphics and movies). The platform has also been widely accepted for small-scale applications, especially games.


We therefore provide a Flash component that entirely hides the complexity of data retrieval, pre-processing and transfer to the application. The Flash programmer can easily and directly access the values delivered by the input device as local variables. No special knowledge of the input device, such as its design, electronic layout or the hardware used, is needed by the application programmer. These details are completely hidden and shielded.

2 Programming Beyond the Desktop

The focus of traditional programs, and especially games, has primarily been on running these applications on standard desktop computers, mobile devices, and game consoles. Such devices unify input and output (e.g., in the case of a standard PC: mouse, keyboard and screen). Even with the emergence of multi-player games and, later on, internet-based online gaming platforms, the computer screen and speakers remain the only output devices (besides force feedback controls), and mouse, keyboard and joystick or conceptually similar devices like rumble pads or steering wheels are still the dominant input devices.

From this perspective, we see a large potential for the development of ubiquitous gaming devices. We think that this area is another fertile field of research which can benefit from rapid prototyping. In recent work on ubiquitous gaming, novel interaction devices like a torch [GSK+02], a flying jacket [RSH+02] or a cushion [SHK04] have been proposed.

We contribute to this by providing a generic architecture and implementation for connecting novel and non-standard input devices with applications.

2.1 New Input and Output Devices

The computer domain has largely been dominated by systems with a relatively large display, capable of showing (fast-moving) high-resolution images in full color, and high-quality spatial sound output. On the input side, a wide range of game controllers and pointing devices is available. Most of them are very similar in function and handling. The keyboard is still the standard way to enter text, and the mouse plays a major role in interacting with the different parts displayed on the monitor.

More recently, camera-based approaches have been introduced as generic controllers for games. The Sony EyeToy, e.g., allows interacting in physical space with games on a Playstation 2 [Son05].

Input Devices Since the invention of the computer mouse, a great variety of input devices has been developed. Most input and interaction devices are not as general as the mouse and hence are of great value in specific domains.

Recently, research and industry have focused on interactions that directly observe movements or gestures of users and translate them into interaction events. However, this kind of processing requires an augmentation of the users' environment or of the users themselves. Examples are presented in [VRC99], [MAS04] and [Rek01].

Our approach differs from this in that we augment existing objects or appliances that people are already used to and whose affordances they know. We then aim for an easily understandable way of translating interaction with these devices into actions and events interpretable by applications. By matching the affordance of the object with the interaction to perform, we can be sure that users will quickly be able to start using the application in the intended way.

For enhancing existing appliances, we identified several different types of sensors that can be bought cheaply and integrated easily:

• Accelerometers: Can be used to detect absolute angles with respect to the earth's gravity field as well as dynamic acceleration along their respective axes.

• Magnetoscopes: Can be used to detect the absolute angle with respect to the earth's magnetic field and relative rotational changes.

• Pressure Sensors: Can be used to sense whether users hold or squeeze a device in their hands, put it into a pocket, or exert pressure to initiate some action, etc.

• Light Sensors: Can be used to decide whether a device is put in some bag or left outside; can also give information on the time of day and the type of environment the user is in.

• Distance Sensors: Can be used to measure the space between two objects or between the user and an object.

This is by no means an exhaustive list, as there exist many more types of sensors (temperature sensors, microphones or, more generally, sonic sensors, etc.). However, the sensors listed above proved to be interesting in our experiments for creating engaging and animating input devices.


Figure 1: Visualization of the basic architecture.

Users can interact in many ways with devices like a cube [KSHS05] or a chair [Coh03]; this can easily be detected and interpreted.

We will show in Section 3.1 how we use a subset of the mentioned sensors to capture the movements of interest of a user on an augmented IKEA balance cushion [SHK04].

Output Devices On the other end of the application, we observe the trend to use more and different types of screens and displays for output, depending on the type of data that is to be visualized.

We also do not narrow the term output device to mere visual screens, but include various other means of output, e.g., tactile or haptic output. Simple ways of showing a one-bit state include, e.g., LEDs or switching any appropriate appliance on and off.

Particularly interesting seems to be the combination of distributed large public displays placed at various points of interest in the environment with small private displays visible only to a specific user or group of users. Such forms of multi-display game environments have been suggested in [MMES04].

From an architectural point of view, we treat sensors and actuators similarly, as will be shown in Section 2.2.

2.2 Basic Architecture

As shown in Figure 1, the application is strictly separated from any issues regarding sensor or actuator hardware. There is a clear interface to access the different devices. A realization of that interface (indicated by the cloud in Figure 1) will be presented later in Section 3. An application needs to send information to output devices and receive information from input devices. Communication in these two directions is aided by two helpers, namely the Sensor Server and the Actuator Server. They are similar in structure in that each has one or several registered devices with which it communicates.

The Sensor Server, for example, collects data from all sensors known to the component. This information can then be queried and used by the application. The Actuator Server, on the other hand, has a list of actuators, i.e., displays, lights, etc. The application is then granted access to those via the communication layer according to the capabilities of the respective device.

Of course, it is neither necessary nor sensible that every sensor sends its data to the Sensor Server by itself. On most hardware platforms, the data will be collected and sent by a central component. On the other hand, this approach allows an arbitrary number of sensors / actuators in the environment to be used without requiring any sophisticated knowledge of hardware or communication protocols.

2.3 Abstraction from the Hardware

To accomplish the design goal described in the last section, we must provide an abstraction from the available hardware, especially to enable application developers and device builders to easily integrate their products into the architecture. In particular, we want to ease the job for people building and integrating hardware components and for game application developers. The field of virtual reality has brought with it some platforms and architectures that use device abstraction. Open Tracker [RS01], e.g., bases its tracking framework on XML and defines interfaces to easily support different devices. The Virtual-Reality Peripheral Network (VRPN) [THS+01] includes a set of servers that are designed to implement a network-transparent interface between application programs and a set of physical devices. In the following we describe how our concept can simplify the development process of a real-world computer application.

2.3.1 Support for Device Developers

The first step in building a new interaction method, in the sense we use it, is to search for a suitable device and then decide which sensors can be integrated. Subsequently these sensors have to be connected to a hardware platform like Smart-Its [GKBS04] or Particles [DKBZ05] that supports retrieving the data, possibly combining it, and sending out the information.

The task of the device developer is then to enhance the Sensor Server to receive the data and make it available through a clean interface. Similar actions apply to new actuators.


Basic functionality must be provided to be able to control the device. For displays, this may include writing text, drawing lines and displaying images; for others it might only mean specifying some color or switching them on or off. The implementation of these methods will of course again benefit from an abstraction mechanism that hides the need to send sequences of high and low voltages and offers, e.g., access to each pixel.

2.3.2 Application Programmer

Somebody who wants to develop a game or some other kind of application does not want to care about hardware details, communication protocols, etc. Ideally, he or she does not even need to pay attention to the type of input or output device. Much research is currently being done on automatically adapting content so that it can be displayed equally well on a desktop PC, a smaller PDA display and a mobile phone. Similarly, different kinds of input devices that provide the same amount of information, and can therefore be interpreted in a similar way, should be interchangeable. In our example application (see Section 4.1), the game can be controlled by a new interaction device or by a standard keyboard.

We therefore hide as much of the hardware details as possible from developers, providing them with an already abstracted interpretation of the raw sensor values. As an example, consider a simple ball switch and an accelerometer. The first is either 0 or 1, depending on the way it is placed. The accelerometer can convey exactly the same information when used appropriately. However, the developer will probably not be able to make that translation quickly. Therefore, there is an abstraction, and the developer can build on a set of small events and does not need to cope with raw sensor values unless needed. A sketch of such a mapping is shown below.
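As a minimal illustration of this abstraction, the following ActionScript 2.0 sketch (our own example, not part of the framework; variable names and the threshold are hypothetical) turns a raw accelerometer reading into the same 0/1 tilt events a ball switch would deliver:

// ActionScript 2.0 sketch: deriving ball-switch-like 0/1 events
// from a raw accelerometer value (threshold chosen arbitrarily).
var TILT_THRESHOLD:Number = 0.5; // acceleration in g along one axis

function tiltEvents(accelX:Number):Object {
    // Returns the same kind of binary information a ball switch provides.
    return {
        left:  (accelX < -TILT_THRESHOLD) ? 1 : 0,
        right: (accelX >  TILT_THRESHOLD) ? 1 : 0
    };
}

// Example: a raw reading of -0.7 g maps to left = 1, right = 0.
var events:Object = tiltEvents(-0.7);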

2.4 Middle Layer

To achieve this kind of architecture, a middle layer is needed that is responsible for providing easily accessible interfaces to the application and sensor sides as well as for managing the communication between them.

We draw heavily on available standards to ensure that the largest possible number of applications can be used and that the learning effort for developers is minimized.

As described in more detail in Section 3.2, where our implementation of the layer is presented, we currently favor XML as the data wrapping format: it can easily be validated, it is human readable, and parsers exist for most applications and programming languages.

As the protocol to access the data, one of the most widely supported choices is HTTP. Nearly all applications or languages that allow some kind of external access are capable of reading web pages. The needed infrastructure can be found nearly everywhere, and it is particularly easy to create small viewers for prototyping and testing devices.
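To illustrate how small such a viewer can be, the following ActionScript 2.0 sketch (a hypothetical example; the URI is an assumption, the file format is the one described in Section 3.3) fetches the current sensor data over HTTP and dumps it for inspection:

// ActionScript 2.0: a minimal test viewer that dumps the current sensor data.
var viewer:XML = new XML();
viewer.onLoad = function(success:Boolean):Void {
    trace(success ? this.toString() : "could not load sensor data");
};
viewer.load("http://localhost/sensors/vars.xml"); // assumed location of the variables file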

3 Real-World-Interaction Architecture and Implementation

Our experience from previous projects showed that, for application developers and designers, the integration of non-standard hardware is extremely difficult. To open up design options for interaction with the physical world, we looked for a solution that easily integrates novel input and output devices into a programming and authoring environment.

We chose Flash MX because it is commonly used by developers and designers. It is suitable for easily and quickly creating games and other small applications that are accessible using web technology. Beginning with the 2004 version, Flash provides an object-oriented programming language called ActionScript 2.0. We therefore created an ActionScript 2.0 component that can be incorporated into any Flash application by simple drag and drop. After dragging the component onto the Flash 'stage', one has to configure it by setting two parameters: a link to the configuration URI and a link to the variables URI on the server; both are described later in more detail. Afterwards the developer has access to all variables made available by the component on the main timeline (called 'root timeline' in Flash).

From there, these can be used like any other global variable for the application semantics. As has been shown in the previous sections, all kinds of input devices can be imagined. These produce sensor data in completely different ways. The Sensor Server is responsible for converting this data into a specific XML format (see Section 3.2). The converted data is then available to the Flash application. The same applies to actuators, where the flow of actions is reversed. The application need not run on the same machine or server; it only needs to know the URI where the XML file is hosted or generated.


Thus, it can read and understand the data from the server. The information flow is illustrated in Figure 2. This is a very powerful mechanism that enables communication between a hardware device and a user interface even when they are not in the same room, building or even country, enabling all sorts of remotely controlled applications.

Figure 2: Setup of the Virrig game application.

To make sure that only one device (or, more generally, the correct number of devices) is connected to the software, each device is assigned a specific unique ID. At the start of the program, a small dialog lets the user choose which of the devices found in the environment should be used for a particular application. Of course, a fixed coupling can also be programmed into the application by removing the dialog and filtering only those messages which originate from the device with the specified ID. Since the device IDs could be faked and the RF transmission currently is not encrypted, this does not prevent malicious interference. Systems which rely on tight security would need more sophisticated authentication and authorization mechanisms.
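A fixed coupling of this kind amounts to one extra check when parsing the variables file; the following ActionScript 2.0 fragment (our own sketch, with a made-up device ID) accepts only updates whose senderId attribute matches the expected device:

// ActionScript 2.0 sketch: accept updates only from one specific device.
// "virrig-01" is a hypothetical ID; real IDs depend on the hardware setup.
var EXPECTED_ID:String = "virrig-01";

function acceptUpdate(changedVars:XMLNode):Boolean {
    // The root element <changedVars> carries the senderId attribute
    // (see the variables file format in Section 3.3).
    return changedVars.attributes.senderId == EXPECTED_ID;
}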

It makes sense to enable the coupling of one device with several programs. This provides different views on one and the same set of sensors. It is also allowed to connect several software programs with one and the same output device. This implies that changes might be overwritten by later messages. Although this might not be desirable in some cases (e.g., when controlling a single light), there are many application areas, such as competitive or collaborative settings, where such behavior is wanted. A display used stand-alone or integrated into an output device, for example, can easily cope with input from many sources by listing them one after the other and perhaps tagging them.

In this section, we describe the implementation and focus on sensors. The connection of actuator systems is similar.

3.1 Sensor Hardware Architecture

We have implemented several sensing devices which share the same basic hardware architecture. The sensors are connected to a micro-controller. The micro-controller is responsible for basic data acquisition and processing. Via RF, the data is then sent to a base unit that is connected to a computer in the network.

In one implementation we used the Smart-Its platform [GKBS04] and attached a custom sensor board. The Smart-Its provide a programmable micro-controller (PIC 18F452) and several analog as well as digital inputs and outputs. Sensor data can be sampled at frequencies of several hundred Hertz (if supported by the sensor). The RF sender can send data at a maximum rate of 14400 bps (including overhead for control, etc.), enabling even those applications that rely on quick updates. This data is transferred wirelessly using a transceiver of the type Radiometrix SPM 2-433-28. At the PC side, a similar construct is used: another SPM module receives the data and communicates it to the PC via a serial line. From there, it can be processed by software, as described in further detail in Section 3.2.

In a newer implementation we used Particles, devices similar to Smart-Its. Particles are developed at TecO, Karlsruhe [DKBZ05]. They are smaller and have more sophisticated ways of transmitting data (including acknowledgments, etc.). This change demonstrated one of the strengths of our architecture: since input devices are separated from processing and the final application, it was very easy to switch to the new platform. Only the receiving part of the communication server had to be adjusted; the data is no longer sent over the serial line but in UDP packets.

Apart from the loose coupling to the sensor and actuator hardware used, the framework also abstracts from the actual object that is used as the input device and into which we embed the sensors. We deliberately searched for objects that are known to most people but are not yet used as input devices. In this section, two out of the many possibilities we found are presented. First, we used an IKEA balance cushion named Virrig, shown in Figure 3 [SHK04]. It is a flat cushion mounted on a robust hemisphere. Thus, it can be rotated and tilted in all directions. It is very flexible in use, as the user can sit, stand, kneel or lie on it, and it is also very robust, as it is designed for use by children. It can be seen as a regular cushion or as a toy to practice balance.


Figure 3: The Virrig input device shown from the side.

The digital device we attached inside the hemisphere does not change the affordance or the physicality of the cushion. The user can still sit or stand on it as before, and since we use radio technology for data transmission there are no cables leaving it, so it can still be tilted and rotated as before. We show how to use the cushion in an edutainment application in Section 4.1.

Figure 4: Opened Virrig with the integrated Smart-Its.

Inside the cushion, attached to a wooden plate, there is a Smart-Its (see Figure 4) with the following components:

• 4 large batteries are used as power supply; this ensures that the device need not be opened even for long-term user studies and during everyday use

• 4 ball switches indicate the tilt of the cushion in 8 directions

• a compass that shows relative rotational movements as well as the absolute rotation of the cushion

• a pressure sensor is used to sense whether a user is currently sitting on the cushion and to detect his or her movements

• a radio transmitter sends the data to a receiver connected to the PC

The overall hardware architecture is depicted in Figure 5.

Figure 5: Hardware architecture of the input device.

As another input device, we chose a small appliance that is mounted at the doors of many offices, lecture halls or assembly rooms. It shows the current state of the room or its occupant. A magnetic pin is placed on certain spots of a ferromagnetic plate to indicate whether the person working in that room is in, busy, out for lunch, etc. We enhanced such boards with magnetic switches that recognize where the button / pin is placed and put a Smart-Its into each of them that communicates this state to a central receiver. The application is briefly described in Section 4.2.

3.2 Sensor Server / Communication Server

Having described the parts connected to the input device, this section goes into the details of the PC side of the system. A Smart-Its equipped with a radio receiver is attached to the PC. It receives sensor data from the cushion and forwards it to a program called the Serial Server over an RS232 serial line. The Serial Server interprets all received signals and transforms them into XML format. This data is then stored on a web server, making it available to any application capable of using the HTTP protocol.

The architecture of the receiver is outlined in Figure 6.


As has already been stated, we believe that providing information via the HTTP protocol is one of the best methods to give a very large number of different applications and programming languages easy access to this data. The choice of XML as the storage and wrapping format was made in the same spirit. A large number of applications and programming languages inherently support reading and writing data coded in XML structures. The fact that the data is human readable and conveys some semantics in its structure and named tags helps to implement applications using devices for which there is no internal knowledge. It also enables the specification of the content structure and easy validation of incoming data. The part that generates sensor values as well as the application can therefore rely on a specific DTD or schema being followed by the transmitted sensor data. This dramatically reduces the complexity of implementing both sides. In our prototypes we did not experience any significant slow-down from using XML instead of, for example, a file with comma-separated values (CSV). Since, however, some system resources are needed for parsing and processing XML data, this is not the optimal solution for devices with very low processing power. The first step toward tackling this problem is to send only the part of the data that actually changed (an event-based mechanism). Additionally, we are planning to make the XML protocol exchangeable with other protocols such as CSV or binary formats.

3.3 Component and Test Application

To explain how to use the Flash component and how to build applications on top of it, we provide a small sample application that outlines the basic concepts in more detail. After that, a more sophisticated program is described for which we plan to undertake user studies to evaluate some assumptions on the impact of physical interaction in learning applications.

Figure 6: Hardware architecture of the receiver.

The component itself requires two input parameters: the locations (URIs) of the XML configuration file and the XML variables file.

The XML configuration file specifies the variables, allowing faster parsing in the Flash application. The component reads this file so that it can name and initialize the variables and set an interval for the reload rate.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE eventlist SYSTEM "config.dtd">
<config>
  <interval>
    <min unit="sec" value="1"/>
    <max unit="sec" value="3"/>
  </interval>
  <vars>
    <var name="rotation" startvalue="0"/>
    <var name="left" startvalue="0"/>
    <var name="right" startvalue="0"/>
    <var name="up" startvalue="0"/>
    <var name="down" startvalue="0"/>
  </vars>
</config>

Tag           Explanation
<interval>    minimal and maximal sensor refresh rate in seconds
<vars>        block with the available variables
<var>         names, start values and types of the variables delivered by the sensor input device
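As a sketch of how such a configuration file might be evaluated, the following ActionScript 2.0 fragment (our own illustration, not the published component code; the configuration URI is an assumption, and <min> is taken to precede <max> as in the example above) reads the interval bounds and creates the variables with their start values on the root timeline:

// ActionScript 2.0 sketch: evaluating the configuration file.
var configXml:XML = new XML();
configXml.ignoreWhite = true;

configXml.onLoad = function(success:Boolean):Void {
    if (!success) return;
    var config:XMLNode = this.firstChild;        // the <config> element
    var children:Array = config.childNodes;
    for (var i:Number = 0; i < children.length; i++) {
        var node:XMLNode = children[i];
        if (node.nodeName == "interval") {
            // reload rate bounds in seconds (<min> first, then <max>)
            _root.minInterval = Number(node.childNodes[0].attributes.value);
            _root.maxInterval = Number(node.childNodes[1].attributes.value);
        } else if (node.nodeName == "vars") {
            // create and initialize the sensor variables on the root timeline
            for (var j:Number = 0; j < node.childNodes.length; j++) {
                var v:XMLNode = node.childNodes[j];
                _root[v.attributes.name] = Number(v.attributes.startvalue);
            }
        }
    }
};

configXml.load("http://localhost/sensors/config.xml"); // configuration URI (assumed)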

XML Variables File: This file delivers the sensor data to the application. Only the variables whose values have changed since the last update appear, so that the component does not need to read all variables each time. This renders the Flash application notably faster and also reduces traffic.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE eventlist SYSTEM "vars.dtd">
<changedVars senderId="...">
  <var name="rotation" value="30"/>
  <var name="left" value="1"/>
</changedVars>

Tag             Explanation
<changedVars>   variables whose value changed
<var>           names and values of the variables delivered by the sensor device

When the application starts, the component first analyzes the configuration file: it sets up and initializes the variables and sets the minimum and maximum interval for reading the XML variables file.


The application then starts reading the variables for the first time. From then on, it reads the variables file as often as defined by the minimum interval in the configuration file and sets the received variables on the root timeline. This process is repeated as long as the application is running; it is also shown in the activity diagram in Figure 7. For the rare case of sensors dynamically changing their update interval, the configuration file could be re-read from time to time to update the minimum and maximum rate.
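The reload loop itself can be very small. The following ActionScript 2.0 sketch (our own illustration of the behavior shown in Figure 7, not the published component code; URI and interval are assumptions taken from the example files) polls the variables file at the configured minimum interval and copies every changed value onto the root timeline:

// ActionScript 2.0 sketch: periodically reloading the variables file
// and applying only the changed values to the root timeline.
var varsXml:XML = new XML();
varsXml.ignoreWhite = true;

varsXml.onLoad = function(success:Boolean):Void {
    if (!success) return;
    var changed:Array = this.firstChild.childNodes;  // children of <changedVars>
    for (var i:Number = 0; i < changed.length; i++) {
        var v:XMLNode = changed[i];
        _root[v.attributes.name] = Number(v.attributes.value);
    }
};

function reload():Void {
    varsXml.load("http://localhost/sensors/vars.xml"); // variables URI (assumed)
}

setInterval(reload, 1000); // minimum interval from the configuration (here 1 s, as in the example file)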

Figure 7: Activity diagram of the Flash component.

After programming the component, we implemented a first test application. This application visualizes the movements made with the Virrig cushion. It shows the cushion in the center of the window and indicates tilt and rotation. All possible movements of the input device are shown on the screen. A screen dump of the application is shown in Figure 8. The green ball in the middle represents the Virrig cushion. The arrow (the black triangle in the top right of Figure 8) indicates the rotation sent by the sensor device. The tiny black lines around the ball show, when painted red, the activated ball sensors. The text window on the right is an output window for testing and tracing the correct functionality of the component.

Figure 8: Screen dump of the test application.

4 Sample Applications

In this section we introduce some applications based on the presented architecture, demonstrating the feasibility of our approach.

4.1 Virrig Race Game

The Virrig Race Game is a game application with the sensor cushion as its physical controller. It is a mixture of car race and learning program, i.e., an edutainment application. The car is controlled with the cushion. Since the expected audience is primary school children, the whole design is aimed at being interesting for smaller kids. Depending on the complexity and type of the questions, the game is also fun and a nice learning experience for people of all ages.

The first step in integrating the cushion into the framework is to augment the Sensor Server so that it converts the sensor values it receives into an XML tree. Second, the Flash component is used as described in Section 3.3. The Flash application can now use the given sensor values to define its behavior. In one implementation, we designed it the following way: the user controls the car by tilting the cushion forward or backward to accelerate or brake, and left or right to steer in that direction (see Figure 10). There are stop signs at the crossings where a question is displayed (see Figure 11). In this case, the rotational sensor (compass) is used to implement a simple selection mechanism by turning the Virrig. The user then commits the choice by clicking, i.e., tilting forward. One could also realize this by hopping on the cushion, as a load sensor is placed directly below the user in the inner part of the cushion. A sketch of such a control mapping is given below.
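The following ActionScript 2.0 fragment is a minimal sketch, not the actual game code, of how the tilt variables delivered by the component (named as in the configuration example of Section 3.3) could be mapped onto the car movement; the movie clip name, the constants and the frame rate are hypothetical:

// ActionScript 2.0 sketch: mapping the cushion's tilt events to car control.
var speed:Number = 0;
var heading:Number = 0; // degrees

function updateCar():Void {
    if (_root.up == 1)    { speed += 0.5; }  // tilt forward: accelerate
    if (_root.down == 1)  { speed -= 0.5; }  // tilt backward: brake
    if (_root.left == 1)  { heading -= 5; }  // tilt left: steer left
    if (_root.right == 1) { heading += 5; }  // tilt right: steer right

    car_mc._rotation = heading;
    car_mc._x += speed * Math.cos(heading * Math.PI / 180);
    car_mc._y += speed * Math.sin(heading * Math.PI / 180);
}

setInterval(updateCar, 40); // update roughly 25 times per second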


Figure 9: Start screen of the game application. The user can choose to control the game with a standard keyboard or with the Virrig.

Figure 10: Screen dump of the game application.

The user can continue driving after giving a (potentially wrong) answer. If the answer was wrong, the sign becomes transparent until he or she has answered another question. The memory effect is therefore not that serious, but it imposes at least some pressure on the user, who can now try to answer the question again. If the question is answered correctly this time, the sign disappears. After all signs have disappeared (i.e., all correct answers have been given), the race has been completed successfully. One could also imagine displaying a score board showing the 10 fastest users or those with the fewest wrong answers.

The framework makes it particularly easy for developers to quickly change the interpretation of the sensor values.

The compass could, for example, be used for steering instead of the tilt sensors. The framework also makes it possible to exchange the particular device that produces the sensor values, or that consumes the application's output, for another device, possibly with a different set of sensors / actuators.

Figure 11: Screen dump of the quiz window showing a question on top and three possible answers below. The one in the center has been selected by the user.

4.2 Room Occupancy System

This application visualizes the occupancy of rooms in a building, e.g., whether a room is used for a meeting or a lesson, or is empty (see Figure 13). The natural choice for controlling such an application is an input device placed at the entrance to the rooms. This device is very intuitive to use, since a simple version of it can already be found at many office doors. It is a board with switches that are activated by magnets (see Figure 12). The state of a room is set by putting a magnet on the sensitive area of the Hall-effect switch. This data is transmitted wirelessly, per room, to the Serial Server. The server collects the information for each room, and the application gets this information from a specific IP address and port of the HTTP server.

As can be seen in Figure 12, the device is also equipped with four LEDs that serve as feedback when setting the magnets but can also be controlled remotely. Thus, one could replace the magnets with simple buttons, and the LEDs could show the current state of the office. This state can then be changed by, e.g., the same Flash application that shows the state of the room.


It simply needs to write a specific value coupled with those LEDs, and the Actuator Server takes over to pass the commands on to the hardware component.
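The paper does not fix a write protocol for actuators, but under the assumption of a simple HTTP endpoint on the Actuator Server that accepts the new state as form variables, such a write could look like the following ActionScript 2.0 sketch (endpoint URI, room identifier and variable names are all hypothetical):

// ActionScript 2.0 sketch: setting an actuator (LED) value from the Flash application.
// Assumes a hypothetical HTTP endpoint on the Actuator Server.
var ledState:LoadVars = new LoadVars();
ledState.room = "e101";   // hypothetical room identifier
ledState.led = "busy";    // hypothetical LED / state name
ledState.send("http://localhost/actuators/set", "_self", "POST");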

Figure 12: Picture of the Room Occupation System input device.

Figure 13: Screen dump of the Room Occupation System user interface.

5 Related Work

Tangible User Interfaces (TUIs) have been the subject of scientific research for a long time. The initial works by Ullmer and Ishii [IU97] and others show the high potential of intuitive, physical interfaces for human-computer interaction. Holmquist et al. [HSU04] present a comprehensive overview of recent tangible user interfaces.

ENI (used, for example, in [Gro02]) is an event and notification infrastructure based on a client-server environment.

It allows attaching sensors and indicators directly to the client or enables them to communicate with the server via a web-based interface. Adding functionality for new devices requires writing plug-ins and changing existing ENI clients or creating new ones. The programming language needed to support all features is fixed to Java. The aim of the proposed architecture is similar to our work, but it is oriented towards a collaborative scenario targeted at group awareness, not towards making sensor data of physical and tangible user interfaces available to application programmers, as presented in this work.

With iStuff, Ballagas et al. [BRSB03] present another flexible prototyping platform for explicit, post-desktop interaction. iStuff allows multiple, co-located users to interact with applications and displays in so-called augmented environments such as the Stanford iRoom. Like our approach, their system is event-based and allows dynamic re-mapping of devices to events.

Greenberg [Gre02] discusses the problems of making physical user interfaces available to application programmers, which is still a challenging task. He compares the difficulties of accessing physical user interfaces to those the first GUI developers had: building everything completely new from scratch. With the proposed architecture we show application programmers a simple way of managing this problem.

Collabolla, presented by Bove et al. [BPW05], is a physical interface for arcade games. Players sit on inflatable balls, so-called Space Hoppers, and control their game characters by hopping, jumping and rolling. As with the Virrig, the playfulness and the affordances of the physical input device are exploited for the game controller.

6 Conclusions

In this paper we have introduced an approach that helps in prototyping physical interaction. Our focus is on support for games. The examples given concentrate on physical input devices and their integration into Macromedia Flash MX. The abstractions provided aim at easing the task for hardware developers and game developers alike by providing a suitable middleware.

By encapsulating the access to physical devices in a Flash component, we see that implementations become much simpler. Providing sensors and actuators as variables makes their physical distribution transparent to the developer.


The simple way of exchanging the information via XML and HTTP makes new developments very simple, as developers can use libraries that are already available. For some gaming domains the proposed solution has restrictions, as the time delay between a manipulation in the real world and the availability of the data in Flash can be up to 200 ms. But even given this time delay, games that need 'immediate' reaction can still be prototyped with the described infrastructure. Since some applications are time critical and cannot tolerate such a delay, we are currently optimizing the data transfer, e.g., by letting the Flash application query for its data directly via TCP, skipping the HTTP server.

We are using this setup in other projects we are running. Especially colleagues and students who are new to embedded device programming are very pleased to be able to use devices and sensors without having to acquire deeper knowledge about controlling them.

Currently we are preparing a version of the Virrig Race Game for a user study with children. In future work we want to qualitatively and quantitatively assess the advantages of physical controls for edutainment systems.

Another project we are currently working on uses the same technology as the Virrig cushion. A similar device, called a therapy top, is used in professional sport schools for training and convalescence purposes. Athletes and patients have to perform specific exercises defined by professional therapists. It is very important that these exercises be done correctly. To support trainees without much supervision effort, we are developing a training application where people get immediate audio-visual feedback on their actions. With the concept we presented, the underlying logic can be realized very quickly.

7 Acknowledgments

The work has been conducted in the context of the research project Embedded Interaction ('Eingebettete Interaktion') and was funded by the DFG ('Deutsche Forschungsgemeinschaft').

References

[BPW05] Jennifer L. Bove, Simone Pia, and Nathan Waterhouse, Collabolla, http://people.interaction-ivrea.it/s.pia/collabo_1.htm, 2003, visited 21/10/2005.

[BRSB03] Rafael Ballagas, Meredith Ringel, Maureen Stone, and Jan Borchers, iStuff: a Physical User Interface Toolkit for Ubiquitous Computing Environments, CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (New York, NY, USA), ACM Press, ISBN 1-58113-630-7, 2003, pp. 537–544.

[Coh03] Michael Cohen, The Internet Chair, International Journal of Human-Computer Interaction 15 (2003), no. 2, pp. 297–311.

[DKBZ05] Christian Decker, Albert Krohn, Michael Beigl, and Tobias Zimmer, The Particle Computer System, Proceedings of the Fourth International Symposium on Information Processing in Sensor Networks (IPSN), April 2005, pp. 443–448.

[GKBS04] Hans-Werner Gellersen, Gerd Kortuem, Michael Beigl, and Albrecht Schmidt, Physical Prototyping with Smart-Its, IEEE Pervasive Computing Magazine 3 (2004), no. 3, pp. 74–82, ISSN 1536-1268.

[Gre02] Saul Greenberg, Rapid Prototyping of Physical User Interfaces, Proc. Graphics Interface, invited talk, May 2002.

[Gro02] Tom Gross, Ambient Interfaces in a Web-based Theatre of Work, Proceedings of the 10th Euromicro Workshop on Parallel, Distributed and Network-based Processing, 2002, pp. 55–62.

[GSK+02] Jonathan Green, Holger Schnädelbach, Boriana Koleva, Steve Benford, Tony Pridmore, Karen Medina, Eric Harris, and Hilary Smith, Camping in the Digital Wilderness: Tents and Flashlights as Interfaces to Virtual Worlds, Conference on Human Factors in Computing Systems CHI '02, Extended Abstracts on Human Factors in Computing Systems, ACM Press, ISBN 1-58113-454-1, 2002, pp. 780–781.


[HSU04] Lars Erik Holmquist, Albrecht Schmidt, and Brygg Ullmer, Tangible Interfaces in Perspective: Guest Editors' Introduction, Personal Ubiquitous Computing 8 (2004), no. 5, pp. 291–293.

[IU97] Hiroshi Ishii and Brygg Ullmer, Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms, CHI '97: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM Press, 1997, pp. 234–241.

[KSHS05] Matthias Kranz, Dominik Schmidt, Paul Holleis, and Albrecht Schmidt, A Display Cube as a Tangible User Interface, Adjunct Proceedings of the Seventh International Conference on Ubiquitous Computing (Demo 22), September 2005.

[MAS04] Christian Metzger, Matt Anderson, and Thad Starner, FreeDigiter: A Contact-Free Device for Gesture Control, ISWC '04: Proceedings of the Eighth International Symposium on Wearable Computers, IEEE Computer Society, 2004, pp. 18–21.

[MMES04] Carsten Magerkurth, Maral Memisoglu, Timo Engelke, and Norbert Streitz, Towards the Next Generation of Tabletop Gaming Experiences, Proceedings of the 2004 Conference on Graphics Interface GI '04, Canadian Human-Computer Communications Society, ISBN 1-56881-227-2, 2004, pp. 73–80.

[Nie05] Jakob Nielsen, Killing Time is the Killer Application, TheFeature: It's all about the mobile internet, http://www.thefeature.com/article?articleid=8183, visited 21/10/2005.

[Rek01] Jun Rekimoto, GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices, ISWC '01: Proceedings of the 5th IEEE International Symposium on Wearable Computers, IEEE Computer Society, 2001, p. 21.

[RS01] Gerhard Reitmayr and Dieter Schmalstieg, An Open Software Architecture for Virtual Reality Interaction, VRST '01: Proceedings of the ACM Symposium on Virtual Reality Software and Technology (New York, NY, USA), ACM Press, 2001, pp. 47–54.

[RSH+02] Yvonne Rogers, Mike Scaife, Eric Harris, Ted Phelps, Sara Price, Hilary Smith, Henk Muller, Cliff Randell, Andrew Moss, Ian Taylor, Danae Stanton, Claire O'Malley, Greta Corke, and Silvia Gabrielli, Things aren't What They Seem to Be: Innovation Through Technology Inspiration, Proceedings of the Conference on Designing Interactive Systems DIS '02, ACM Press, ISBN 1-58113-515-7, 2002, pp. 373–378.

[SHK04] Albrecht Schmidt, Paul Holleis, and Matthias Kranz, Sensor Virrig - A Balance Cushion as Controller, Workshop Playing with Sensors at UbiComp 2004, September 2004.

[Son05] Sony, EyeToy, http://www.eyetoy.com, visited 21/10/2005.

[THS+01] Russell M. Taylor, Thomas C. Hudson, Adam Seeger, Hans Weber, Jeffrey Juliano, and Aron T. Helser, VRPN: a Device-independent, Network-transparent VR Peripheral System, VRST '01: Proceedings of the ACM Symposium on Virtual Reality Software and Technology (New York, NY, USA), ACM Press, 2001, pp. 55–61.

[VRC99] Andrew Vardy, John Robinson, and Li-Te Cheng, The WristCam as Input Device, Proceedings of the 3rd IEEE International Symposium on Wearable Computers ISWC '99, IEEE Computer Society, ISBN 0-7695-0428-0, 1999, pp. 199–202.

Citation
Paul Holleis, Matthias Kranz, Anneke Winter, and Albrecht Schmidt, Playing with the Real World, Journal of Virtual Reality and Broadcasting, 3 (2006), no. 1, April 2006, urn:nbn:de:0009-6-2948, ISSN 1860-2037.
