
Wii Sensor Bar Positioning in 3D Space
Computational Geometry Course Project

Tomer Cagan and Yoav Francis, The Interdisciplinary Center, Herzliya

Abstract

Once introduced, the WiiMote and Sensor Bar became a very common form of user input for many applications, mainly in entertainment but also in medical and other fields. Most commonly the technology is used for control in 2D space (up/down and left/right), relying on the WiiMote camera tracking mechanism, and offers limited support for the “Z” axis (front/back) by means of accelerometer measurements. In our project we explore the possibility of tracking the Sensor Bar in 3D space with a setup that includes two WiiMotes.

Introduction

The WiiMote was introduced during 2006 along with the Wii system from Nintendo. In addition to the common buttons and sticks, the WiiMote provides motion sensing with two main mechanisms: an accelerometer, which we will not discuss here, and an optical sensor, which is the main interest of this small research.

The optical sensor is a simple grayscale digital camera installed behind an IR filter, with a built-in chip for tracking up to 4 IR lights. Along with the WiiMote comes a Sensor Bar, a simple construction with several IR lamps positioned a known distance apart that can be tracked by the WiiMote. The camera itself runs at 100 Hz with a 128x96 pixel sensor; it tracks the lights and interpolates the readings to give an effective 1024x768 pixel resolution.

The WiiMote communicates via a Bluetooth interface which, along with the many available libraries/APIs, makes it a very convenient and popular user input device.

Foremost among the developers of applications for the WiiMote tracking capabilities is Johnny Chung Lee from Google, who researches Human-Computer Interfaces (HCI) [3] and developed several interesting applications, including a whiteboard, finger tracking, surface identification, and head tracking.

While many of the applications use the WiiMote as an input device to operate a computer or control specific programs [7], other works with the WiiMote include a physics teaching kit that uses the WiiMote and Sensor Bar [1] and research on rehabilitation that uses the WiiMote to track limbs and give feedback [8].


Related Work

As we have seen in the previous chapter, while there are many applications built on a WiiMote and Sensor Bar setup, most of them deal only with 2D space and may also use the accelerometer to give a notion of back/front movement.

A “3D orientation” is demonstrated in Johnny Chung Lee’s project for head tracking [4] in which the Sensor Bar is fixed to the user’s head and tracked. This allows rendering a 3D scene with orientation relative to the user’s position and thus achieving a 3 dimensional virtual space that changes with the user’s perspective.

An application that addresses a real 3D space uses basic trigonometric calculations to determine the distance of the Sensor Bar from the WiiMote [1].

The Wii Physics project uses the WiiMote to teach physics. In one of its applications the WiiMote is suspended from a spring, facing the Sensor Bar which is positioned statically below it (on the floor). In this setup, since the Sensor Bar is parallel to the camera field of view (FOV), one can derive a calculation for the distance of the WiiMote from the Sensor Bar [2]:

Figure 1: Basic setup for “Z” (Distance) measurement

The calculation is based on the setup depicted above. Define HFOV = 41°, VFOV = 31°.

Angular field of view per pixel:

\Theta_{FOV} = \frac{1}{2} \left( \frac{HFOV}{1024} + \frac{VFOV}{768} \right)

Distance between the two dots on the camera image:

r = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}

Total angle subtended by the two LEDs at the WiiMote:

2\alpha = r \, \Theta_{FOV} = \frac{1}{2} \left( \frac{HFOV}{1024} + \frac{VFOV}{768} \right) \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}
\;\Rightarrow\;
\alpha = \frac{r \, \Theta_{FOV}}{2} = \frac{1}{4} \left( \frac{HFOV}{1024} + \frac{VFOV}{768} \right) \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}

Given d, the actual distance between the LEDs (i.e. the size of the Sensor Bar):

z = \frac{d}{2 \tan(\alpha)}
\;\Rightarrow\;
z = \frac{d}{2 \tan\left( \frac{1}{4} \left( \frac{HFOV}{1024} + \frac{VFOV}{768} \right) \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2} \right)}

For a given Sensor Bar (or other similar light source), the distance between the two extreme lights (d in the calculation above) is known, and thus one can find the distance z between the WiiMote and the Sensor Bar.
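To make the formula concrete, the following C# sketch (a minimal illustration with hypothetical names, not taken from our demo code; it assumes raw 1024x768 camera coordinates and a Sensor Bar width d given in centimeters) computes z from the two tracked dots of a single WiiMote. For a standard Sensor Bar the extreme lights are roughly 20 cm apart, so Distance(20, x1, y1, x2, y2) would return z in centimeters.

using System;

static class SingleWiimoteDistance
{
    // Camera field of view, converted to radians.
    static readonly double HFov = 41.0 * Math.PI / 180.0;
    static readonly double VFov = 31.0 * Math.PI / 180.0;

    // Angular field of view per pixel (average of the horizontal and vertical terms).
    static readonly double ThetaFovPerPixel = (HFov / 1024.0 + VFov / 768.0) / 2.0;

    // d: real distance between the two IR lights (the Sensor Bar width), e.g. in cm.
    // (x1, y1), (x2, y2): raw camera coordinates of the two tracked dots.
    public static double Distance(double d, double x1, double y1, double x2, double y2)
    {
        double r = Math.Sqrt((x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2));
        double alpha = r * ThetaFovPerPixel / 2.0;   // half the subtended angle
        return d / (2.0 * Math.Tan(alpha));          // z, in the same units as d
    }
}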

This approach is limited to the situation where the Sensor Bar is parallel to the camera’s field of view. As we will see in the next chapter, a problem arises when the Sensor Bar is not parallel to the camera’s field of view, in which case d is not a known quantity.

Our Project

In this chapter we present our project. We first introduce the difficulty faced when using only one remote, and then present the calculations and the actual work we did in building a small demo.

The Problem

As we saw above, with a predetermined (parallel) orientation of the Sensor Bar relative to the WiiMote camera one can calculate the distance between the two. The calculation is based on knowing the camera field of view and the actual size of the Sensor Bar, i.e. the distance between its IR lamps.

The problem arises when the orientation of the Sensor Bar is not fixed as in the above calculation, in which case using one WiiMote will not suffice. In the following images we can see some cases in which the Sensor Bar is not oriented parallel to the camera’s field of view.

Figure 2: Sensor Bar in Diagonal Orientation (top view and camera POV)


In Figure 2 the Sensor Bar is some distance away from the WiiMote and is positioned at some angle relative to the camera FOV, giving a reading {{x1, y1}, {x2, y2}}. In Figure 3 the Sensor Bar is at the same distance from the camera but at the reverse orientation; still, since the WiiMote merely tracks the lights on the Sensor Bar, the same reading {{x1, y1}, {x2, y2}} is observed, and the rotation cannot be inferred. The same happens in Figure 4: here the Sensor Bar is actually further away and at a different angle relative to the WiiMote camera, but the same reading {{x1, y1}, {x2, y2}} is still observed.

In any of the setups depicted in the figures above one cannot use the known distance (d) between the Sensor Bar’s lights, as the bar does not lie parallel to the camera’s FOV. This voids the calculation as presented in [2].

In our project we wanted to see how a setup that includes two WiiMotes can be used to overcome this problem and whether it can give an accurate 3D space mapping.

Basic Setup

For the project we decided to start with a basic setup as depicted in the following diagram:

Figure 3: Sensor Bar in Reverse Diagonal Orientation

Figure 4: Sensor Bar Further Away

Figure 5: Basic Setup (legend: world x, y and z axes; WiiMote center lines; R1 and R2 FOV axis projections)

As we can see, the two remotes lie perpendicular to one another and each imposes a side, or wall, of a 3-dimensional cube. The Sensor Bar is moved within this space created by the WiiMote cameras’ FOVs. The reading of each WiiMote corresponds to the position of the Sensor Bar in this space. Note that this reading is in an arbitrary space imposed by the FOVs; we will discuss its meaning later.

Development Phases

In the following subsections we gradually explore the setup and arrive at the required calculations.

Point in 3D

The first phase was merely a simple “test” of seeing a point at the location of the Sensor Bar by combining the readings from the two WiiMotes.

Figure 6: Axes and Interpreting Readings

In Figure 6 we can see the three axes x, y and z (green, blue and red) imposed by the two WiiMotes’ FOVs. Each reading gives x and y coordinates that should be interpreted as a reading on one of the axes. The actual readings from remotes R1 and R2 (cyan and magenta, respectively) are interpreted as lying in the x,y (w1) and y,z (w2) planes, respectively. Thus, if we have readings {x1, y1} and {x2, y2} from the two WiiMotes, they are interpreted in 3D coordinates as {x, y, z} with x = x1, y = y1, z = x2. Note that we arbitrarily selected y1 as the y-coordinate reading; y2 could have been used instead, or an average of both.

As we can see in Figure 7, the reading from each WiiMote is the dark square on the corresponding wall. The black box in the center is the position of the Sensor Bar (averaging all the separate {x, y} readings and interpreting them as above).
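A minimal sketch of this interpretation (hypothetical types and helper, not taken from the demo source) would average whatever dots each WiiMote currently reports and combine them into a single 3D point:

using System.Collections.Generic;
using System.Linq;

struct Dot { public double X, Y; }
struct Point3 { public double X, Y, Z; }

static class ReadingMapper
{
    // dots1 and dots2 are the raw {x, y} readings currently tracked by R1 and R2.
    // R1 is interpreted as the x/y wall and R2 as the z/y wall; y is taken from R1 (arbitrarily).
    public static Point3 Combine(IList<Dot> dots1, IList<Dot> dots2)
    {
        double x1 = dots1.Average(d => d.X);
        double y1 = dots1.Average(d => d.Y);
        double x2 = dots2.Average(d => d.X);
        return new Point3 { X = x1, Y = y1, Z = x2 };
    }
}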

Sensor Bar in Arbitrary 3D

Once we had a basic setup and readings, we went on to show the actual Sensor Bar within our (arbitrary) 3D space. Given two sets of up to 4 {x, y} measurements, corresponding to the 4 lights a WiiMote can track, we needed to draw the position and orientation of the Sensor Bar within our space.

Pitch

Calculating the pitch is not possible: the WiiMote sees the light sources in exactly the same way when the pitch changes (up to the point where it does not see them at all). Having two WiiMotes does not help in this case.

Roll

Calculating roll is straightforward and can actually be achieved with just one WiiMote. The roll of the Sensor Bar is the angle between the Sensor Bar’s vector and the “floor” imposed by the remote’s FOV.
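Concretely (a hypothetical helper, assuming the two dots (x1, y1) and (x2, y2) reported by a single WiiMote), the roll can be taken as the angle of the dot-to-dot vector against the camera’s horizontal axis:

using System;

static class RollCalculation
{
    // Roll: angle between the vector connecting the two dots seen by one WiiMote
    // and the horizontal ("floor") axis of that WiiMote's image, in radians.
    public static double Roll(double x1, double y1, double x2, double y2)
    {
        return Math.Atan2(y2 - y1, x2 - x1);
    }
}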

Figure 7: Point in 3D Demo


Yaw

To calculate yaw one must have readings from two WiiMotes. Once we have readings of at least 2 lights from the 2 WiiMotes, the two sets of readings give deltas dx1 and dx2 along the world x and z axes (see Figure 8).

From these quantities and the relation between them we can deduce the angle relative to the world, α, as the inverse tangent of the ratio of dx1 and dx2 (α = arctan(dx1 / dx2), consistent with the projections d1 = d sin α and d2 = d cos α used below).
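A corresponding sketch (hypothetical helper; dx1 and dx2 are the pixel deltas observed by R1 and R2 as in Figure 8):

using System;

static class YawCalculation
{
    // Yaw of the Sensor Bar relative to the world, from the delta seen by R1
    // (along world x) and the delta seen by R2 (along world z), in radians.
    public static double Yaw(double dx1, double dx2)
    {
        return Math.Atan2(dx1, dx2);   // equals arctan(dx1 / dx2) for positive deltas
    }
}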

Having the roll and yaw, along with the position, we can now draw the actual position and orientation of the Sensor Bar within the virtual space imposed by the FOVs of the WiiMotes.

Figure 8: Relations for Finding Yaw


In Figure 9 we can see a screenshot from the demo showing the Sensor Bar position and orientation. Note the Roll and Yaw values displayed, corresponding to the orientation of the Sensor Bar.

Sensor Bar in 3D Space

So far we have dealt only with an arbitrary position and with the orientation of the Sensor Bar. In a sense we were working in a 3D space that is a cube of 1024 x 768 x 1024 pixels. The actual dimension of a pixel is not known and depends on how far the Sensor Bar is from each WiiMote: the closer it is, the smaller the physical movement a one-pixel change corresponds to.

In order to give a pixel an actual physical interpretation we have to determine how far away each WiiMote is. Then a movement of some number of pixels in the virtual world corresponds to a distance in the real world (and vice versa).
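As an illustrative approximation (not part of the original derivation): at distance z the camera’s horizontal FOV covers a width of about 2 z \tan(HFOV/2), so one horizontal pixel corresponds to roughly

w_{pixel} \approx \frac{2 z \tan(HFOV / 2)}{1024}

which at z = 50 cm gives about 2 · 50 · tan(20.5°) / 1024 ≈ 0.037 cm per pixel.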

Figure 9: Sensor Bar in Arbitrary 3D Space Demo

[Diagram: the two WiiMotes R1 and R2 with their HFOVs, the yaw angle α and its complement 90° − α, and the projected distances d1 and d2.]

Having calculated the actual orientation of the Sensor Bar, we can return to the calculations described in [2] and listed above. Given the readings of the two WiiMotes we can see the relation between them, and we can also determine the actual distance between the lights as exposed to each of the WiiMotes.

It is important to note the inverse relation between the two FOVs. Thus, to calculate the projected distance d_i exposed to each of the remotes (R1 and R2) we have to use a different trigonometric function for each.

Projected distance on WiiMote 1: d_1 = d \sin\alpha

Projected distance on WiiMote 2: d_2 = d \cos\alpha

Having these actual (projected) distances we can plug them back into the equation above and obtain the distances z1 and z2 from R1 and R2, respectively.
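Putting the pieces together, a sketch of the two-WiiMote distance computation (hypothetical helper names; the constants and formula follow the derivation above, with r1 and r2 the pixel separations of the dots seen by R1 and R2):

using System;

static class DualWiimoteDistance
{
    // Angular field of view per pixel, in radians (HFOV = 41 deg, VFOV = 31 deg).
    static readonly double ThetaFovPerPixel =
        (41.0 / 1024.0 + 31.0 / 768.0) / 2.0 * Math.PI / 180.0;

    // Single-camera distance, given the projected bar length dProj and the
    // pixel separation r of the two dots on that camera.
    static double Distance(double dProj, double r)
    {
        return dProj / (2.0 * Math.Tan(r * ThetaFovPerPixel / 2.0));
    }

    // d     : real distance between the Sensor Bar's extreme lights (e.g. in cm).
    // alpha : yaw of the bar relative to the world, in radians.
    // r1, r2: pixel separation of the two dots as seen by R1 and R2.
    public static void Distances(double d, double alpha, double r1, double r2,
                                 out double z1, out double z2)
    {
        double d1 = d * Math.Sin(alpha);   // projection of the bar exposed to R1
        double d2 = d * Math.Cos(alpha);   // projection of the bar exposed to R2
        z1 = Distance(d1, r1);
        z2 = Distance(d2, r2);
    }
}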


Calculation Accuracy

The following is a simple reading-to-real-world comparison, in centimeters. Z1 is the distance from the first WiiMote and Z2 the distance from the second WiiMote.

Z1 Real    Z1 Reading    Z1 Diff %    Z2 Real    Z2 Reading    Z2 Diff %
37 cm      50 cm         35%          42 cm      53 cm         26%
22 cm      30 cm         36%          22 cm      30 cm         36%
50 cm      66 cm         32%          40 cm      62 cm         47%
60 cm      72 cm         20%          60 cm      73 cm         21%


As we can see, we get an average offset of around 30% in the calculation, which is rather poor. This is probably due to lighting interference, frequent and non-smooth changes in the readings, and the use of a rather simple Sensor Bar that is not bright enough.

The Demo Program

As part of our project we created a simple demo program that demonstrates the different phases and their related goals. Following is a brief discussion of the libraries we used and the code we wrote.

Development Environment

The mechanics of working with the WiiMote Bluetooth protocol, and explanations of what it provides and how it works, can be found in [9]. There are many implementations of a WiiMote API. These libraries provide a simple yet complete mechanism to communicate with the WiiMote and read sensor and button information from it.

For convenience and ease of development we selected to use WiimoteLib [5], a managed code library, which seems to be complete and easy to use. In addition to the availability of the library we have found several resources that help with starting Wii development. Foremost of these are the resources found in [6].
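For reference, reading the IR dots with WiimoteLib looks roughly like the sketch below (written from memory of the library’s API; exact type and property names such as IRState and IRSensors may differ between WiimoteLib versions):

using System;
using WiimoteLib;

class IrDump
{
    static void Main()
    {
        Wiimote wm = new Wiimote();
        wm.WiimoteChanged += OnWiimoteChanged;
        wm.Connect();
        wm.SetReportType(InputReport.IRAccel, true);   // enable IR (and accelerometer) reports

        Console.ReadLine();                            // dump readings until Enter is pressed
        wm.Disconnect();
    }

    static void OnWiimoteChanged(object sender, WiimoteChangedEventArgs e)
    {
        // The camera reports up to four dots; only those marked Found are currently tracked.
        foreach (IRSensor dot in e.WiimoteState.IRState.IRSensors)
        {
            if (dot.Found)
                Console.WriteLine("dot at ({0}, {1})", dot.RawPosition.X, dot.RawPosition.Y);
        }
    }
}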

Due to the nature of the demos (a continuously updating rendered scene) we selected XNA as our development platform. XNA is a managed code wrapper around DirectX, mostly used as a game development framework; it supports 3D and 2D and provides simple mechanisms for game development that lent themselves easily to our purpose. In addition we used [11] to get up to speed with developing 3D scenes.

Development Project

We created a simple multi-screen XNA project to demonstrate the above phases. This section lists the important components:

IWiiMotesService.cs and WiiMotesServiceImpl.cs – an interface and implementation of the functions required for interacting with the WiiMotes.
WsbpDemo.cs (and Program.cs) – entry point to the demo.
Demos folder – the actual demo code for each of the demos.
ScreenManager folder – screen management facilities, developed by Microsoft.
Screens folder – miscellaneous menu screens and assets.

The project is available for general use at http://code.google.com/p/wsbp/

Future Work

In our project we only demonstrated what can be done with a setup of two WiiMotes and a light source similar to the Sensor Bar. The demos we created are relatively crude, but we believe they capture the essence of the work that can be done with a similar setup.


The foremost issue that needs to be addressed is the stability of the readings. This can be solved by smoothing the readings with an appropriate digital filter. It remains to determine the best-fitting filter (a simple running average? a more complex digital (DSP) filter?).
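For instance, a simple running average over the last N samples of a single channel (a hypothetical sketch; more elaborate filters such as exponential smoothing could be substituted behind the same interface):

using System.Collections.Generic;
using System.Linq;

// Smooths one scalar channel (e.g. the x coordinate of one tracked dot)
// with a running average over the last 'windowSize' samples.
class RunningAverage
{
    private readonly int windowSize;
    private readonly Queue<double> samples = new Queue<double>();

    public RunningAverage(int windowSize)
    {
        this.windowSize = windowSize;
    }

    // Adds a raw sample and returns the current smoothed value.
    public double Add(double value)
    {
        samples.Enqueue(value);
        if (samples.Count > windowSize)
            samples.Dequeue();
        return samples.Average();
    }
}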

In addition to selecting a filter, the strategy of applying it should be determined: the WiiMote tracks up to four lights and reports them in some order. Moving the Sensor Bar around shows and hides lights, and further away only two lights are perceived. This implies that the input to the filter should be chosen carefully.

In this project we only explored the use of the IR camera and its tracking capabilities. Since it is the Sensor Bar that actually moves within the covered space, it is possible to mount an IR light on an additional WiiMote and then also obtain inputs from its additional built-in sensors. In such a setup one can use these extra inputs for more accurate tracking and manipulation of the data.

Once the above modifications are implemented, it remains to be seen what actual applications can be developed using this setup. We believe this can be taken in two directions: manipulating a physical object in the real world based on accurate movement and orientation tracking (and other inputs) within the virtual space (e.g. flying a drone in a room according to the movement of a WiiMote in the imposed VR space), and mapping the physical world to VR, such as games that take into account the volume of a room or the physical movement of a tracked object relative to some virtual object (augmented reality).

Conclusion

In this project we demonstrated the ability to provide a 3-dimensional human-computer input/interaction mechanism using a simple setup comprised of two fixed WiiMotes and a moving light source. This can be further improved by adding additional sensors to the moving light source to give a rich input mechanism within a virtual or real 3D space. While our work does not constitute a robust implementation, with relatively simple techniques it can be extended into an accurate and responsive 3D input setup at a relatively low cost (about $21 per WiiMote). It remains to be seen what applications could be created for such a setup utilizing the technique discussed here.

Bibliography

[1] WiiPhysics, Physics with a WiiMote, http://wiiphysics.site88.net/
[2] WiiPhysics, Distance Measurement with the WiiMote, http://wiiphysics.site88.net/physics.html
[3] Johnny Chung Lee, Wii Projects, http://johnnylee.net/projects/wii/
[4] Johnny Chung Lee, Head Tracking for Desktop VR Displays using the WiiRemote, http://www.youtube.com/watch?v=Jd3-eiid-Uw
[5] WiimoteLib, Managed Library for Nintendo’s WiiMote, http://wiimotelib.codeplex.com/
[6] Wii@ESU, http://brannigan.emporia.edu/projects/WII/
[7] Wii@ESU Projects page, http://brannigan.emporia.edu/projects/WII/wiiprojects/index.htm
[8] Bert Bongers and Stuart Smith, Interactive Rehabilitation through Active Multimodal Feedback and Guidance, University of Technology, Sydney, http://bertbon.home.xs4all.nl/downloads/Interactive%20Rehabilitation%20print.pdf
[9] WiiBrew, http://wiibrew.org/wiki/Wiimote
[10] XNA development, http://msdn.microsoft.com/en-us/centrum-xna.aspx
[11] Riemer XNA tutorials, http://www.riemers.net/