
Figure 1: The top figure shows a puppeteer wearing an elephant puppet with VRSurus playing a VR game. The bottom figure shows the stereo rendering results in the virtual reality headset (Oculus Rift DK2).

VRSurus: Enhancing Interactivity and Tangibility of Puppets in Virtual Reality

Ruofei Du*

Augmentarium Lab | UMIACS
Computer Science Department
University of Maryland
College Park, MD, USA
[email protected]

Liang He*

Makeability Lab | HCIL
Computer Science Department
University of Maryland
College Park, MD, USA
[email protected]

* indicates equal contribution.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s).
CHI'16 Extended Abstracts, May 7–12, 2016, San Jose, CA, USA.
ACM 978-1-4503-4082-3/16/05.
http://dx.doi.org/10.1145/2851581.2892290

Abstract
We present VRSurus, a smart device designed to recognize the puppeteer's gestures and render tactile feedback to enhance the interactivity of physical puppets in virtual reality (VR). VRSurus is wireless, self-contained, and small enough to be mounted upon any physical puppet. Using machine-learning techniques, VRSurus is able to recognize three gestures: swiping, shaking and thrusting. Actuators (e.g., solenoids, servos and vibration motors) animate the puppet in ways visible to the audience and provide tactile feedback on the puppeteer's forearm. As a proof of concept, we implemented a tangible serious VR game using VRSurus that aimed at inspiring children to protect the environment and demonstrated it at the ACM UIST 2015 Student Innovation Contest. Our 3D models, circuitry and source code are publicly available at www.vrsurus.com.

Author Keywords
Virtual Reality; Tangible User Interface; Haptics; Gesture Recognition; Head-Mounted Display

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces: Input devices and strategies; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism: Virtual reality


Introduction

Figure 2: Closer view of an elephant puppet with VRSurus. The elephant puppet wears a custom 3D-printed cap that encapsulates the Arduino microcontroller, an inertial measurement unit and other electronic modules.

Puppets are widely used in storytelling, stage puppetry, and children's play. However, as inanimate physical objects, conventional puppets can hardly provide immersive auditory and haptic feedback. Previous research such as 3D Puppetry [6] and Video Puppetry [1] used a Microsoft Kinect or an overhead camera to create digital animation with rigid puppets. Nevertheless, they did not allow users to wear a flexible puppet in an immersive virtual reality environment. As exploratory work, we started with the following questions: "Could we endow puppets with more interactivity in both the real and the virtual world? Could we make traditional puppetry more immersive with virtual reality? Could we allow puppeteers to feel the life of puppets via haptic feedback?"

To answer these questions, we designed and developed VRSurus (Figures 1 and 2), a smart device that enhances the interactivity and tangibility of puppets using gesture recognition and tactile feedback in a virtual reality environment. VRSurus uses simple machine-learning algorithms and an accelerometer to recognize three gestures. It is also equipped with servo motors, solenoids and vibration motors to render haptic sensations. VRSurus is designed as a "hat" so that it can be mounted upon any puppet. We have also implemented a serious VR game in which the puppeteer manipulates a virtual elephant puppet (corresponding to the physical puppet) to clear litter in the forest, splash water onto lumbermen and destroy factories that pollute the air. The game is designed to educate children on environmental protection.

Our main contribution is the concept of an interactive puppet for virtual reality powered by tactile feedback and gesture recognition, as well as our specific mechanical and software design. The main benefit of our approach is its applicability to VR games and puppetry performance. We have presented this game at the ACM UIST 2015 Student Innovation Contest in Charlotte, North Carolina, USA. Please watch our supplementary video for a demonstration at www.vrsurus.com.

Related Work
VRSurus was built upon previous work in smart puppetry, gesture recognition, tangible interfaces and virtual reality.

Smart Puppetry
In this paper we use the term smart puppetry to refer to novel approaches that combine puppets and electronics. Our work is largely inspired by 3D Puppetry [6] by Held et al., which uses puppets as input via a Microsoft Kinect depth sensor to create virtual animations. Their system allows a puppeteer to manipulate multiple rigid puppets simultaneously. However, it requires pre-building a puppet database with visual features such as SIFT. Similarly, Video Puppetry [1] by Barnes et al. allows puppeteers to perform puppetry with paper cut-out characters and an overhead camera, but it also demands a pre-trained SIFT database. Unlike previous work, VRSurus does not require such visual databases. Thus, it can be easily adapted to other puppets without extra training effort.

Our work is also related to an embodied smart puppetry system called 3DTUI [12], developed by Mazalek et al. 3DTUI allows users to pull strings above a cactus marionette to control a virtual character. Their follow-up work, I'm in the Game [11], manipulated a virtual character using a puppeteer's full-body movement by strapping a wooden puppet to the puppeteer's body. In VRSurus, the animation of the virtual character is controlled by both the head-mounted display and the inertial measurement unit (IMU) in VRSurus. We create a ride-on-puppet perspective by synchronizing the puppeteer's viewpoint and the orientation of the puppet in real time, driven by the quaternion rotation reported by the head-mounted display (HMD). We use the IMU in VRSurus to recognize gestures such as swiping, thrusting and shaking and to trigger the corresponding actions.
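A minimal sketch of this viewpoint synchronization, using Three.js (the rendering library used by our game), is shown below: the HMD's yaw is copied onto the virtual elephant each frame so the puppet stays aligned with the puppeteer's body. The names `elephant` and `hmdQuaternion` are illustrative assumptions, not taken from the actual implementation.

// Sketch: keep the virtual elephant's heading aligned with the HMD's yaw.
// Assumes the global THREE object from three.js; `elephant` is the puppet's
// Object3D and `hmdQuaternion` is the head orientation reported by the HMD.
function syncPuppetWithHMD(elephant, hmdQuaternion) {
  // Extract yaw (rotation about the vertical axis) so the elephant turns with
  // the puppeteer's body but does not pitch or roll with the head.
  const euler = new THREE.Euler().setFromQuaternion(hmdQuaternion, 'YXZ');
  elephant.quaternion.setFromEuler(new THREE.Euler(0, euler.y, 0, 'YXZ'));
}

// Called once per rendered frame, e.g.:
//   syncPuppetWithHMD(elephantMesh, camera.quaternion);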

Previous research also investigates novel approaches that use data gloves [15], smartphones and craft supplies [10], or markers and robots [18] to perform virtual or physical puppetry. In contrast, our system preserves the natural user experience of wearing an off-the-shelf puppet. VRSurus serves as an attachment that enhances the interactivity of puppets instead of replacing them.

Figure 3: Sketches of our conception of VRSurus. (a) shows a puppeteer controlling a giraffe puppet to play a platform VR game; (b) shows a puppeteer controlling an elephant puppet to play a first-person VR game. Initially, we planned to sew sensors directly onto the puppet. In subsequent iterations, we created a hat-like attachment which allows VRSurus to be mounted onto any puppet.

Tangible Interfaces for Virtual Reality
The tangible interface of VRSurus is most closely related to Impacto [9] by Lopes et al. Impacto decomposes physical impact into a tactile aspect, rendered with a solenoid, and an impulse response, rendered with electrical muscle stimulation. Similar to Impacto, we use two solenoids to tap the puppeteer's skin and vibration motors for impulse feedback. To create physical animations, we use servos to actuate parts of the puppet (e.g., the elephant's ears). Tapping is also used by Li et al. to convey messages [8].

Besides Impacto, previous research on tangible interfaces for virtual reality also inspired our design process and future plans. For example, Kron and Schmidt [7] developed a haptic glove that provides heat, vibration and kinesthetic feedback via miniature tactile fingertip devices for virtual and remote environments. Bau and Poupyrev invented REVEL [2], which uses reverse electrovibration to create an oscillating electrical field around the user's fingers for augmented reality use cases. Schmidt et al. presented Level-Ups [17], a pair of motorized stilts that simulate the haptics of walking up and down steps in virtual reality.

Gesture Recognition for Virtual Reality
Gesture recognition has been well explored in the past couple of decades [13]. Previous work usually involves external devices such as color or infrared cameras [14], depth sensors [16], data or conductive gloves [19, 20], or inertial sensors [3, 4]. For puppetry in a virtual reality environment, we employ a low-cost, reliable IMU, which can be integrated within the puppet, and train our own classifier for gesture recognition.

System Design
We present the iterative design process of VRSurus along with the lessons learned throughout the process.

Ideation
In the initial ideation phase, we sketched two scenarios (Figure 3). The first, "giraffe" scenario is similar to many platform games with a side-scrolling view, such as Super Mario World. However, during initial trials among members of our lab, we found that the 2D side-scroller design was not preferred because it did not fully utilize the 360-degree virtual world. Several lab members soon lost interest because of the weak connection between themselves and the virtual character.

Therefore, we turned to the second, "elephant" scenario, which is analogous to playing a first-person shooter game. When the puppeteer rotates his or her body, the virtual elephant moves together with the viewpoint as if the puppeteer were "riding" on the elephant, thus enhancing the connection between the puppeteer and the virtual character. VRSurus is designed to recognize three gestures, mapped to three different animations as shown in Figure 6.

Instead of sewing all the sensors inside the puppet, as we did at an earlier design stage, we fabricated a smart hat encapsulating all external sensors as shown in Figure 5, which makes VRSurus usable with more puppets.


Figure 4: Mechanical design of VRSurus in detail: (a) shows how two servo motors are seated inside the 3D-printed flexible cable chain, (b) shows how the servo motors pull the forelegs of the puppet via threads and elastic bands, and (c) shows how two solenoids are embedded inside 3D-printed cases around the puppeteer's forearm.

We did not employ external sensors such as the Microsoft Kinect, Intel RealSense, or Leap Motion Controller to recognize gestures because they usually have a narrow field of view and limit the interaction space; we would like puppeteers to interact freely in the 360-degree virtual environment.

Figure 5: (a) shows VRSurus's profile view, (b) shows the orthogonal view with the cap removed, (c) shows the close-up view with the battery removed, and (d) shows each individual component.

Hardware Design
To empower the puppet's I/O abilities, we leverage a Pololu MinIMU-9¹ (only the accelerometer is used in the current prototype; the gyroscope and compass are reserved for future use) to capture the puppeteer's gestures (input), and small push-pull solenoids, servo motors, and vibration motors to provide tactile feedback and tangible animations on the arm (output). Two solenoids, one on either side of the forearm, indicate the direction of targets. Two servo motors pull parts of the puppet via strings to create physical animations. Additionally, vibration motors generate vibration feedback across the entire puppet.

¹ Pololu MinIMU-9: https://www.pololu.com/product/1265

Besides the basic I/O parts, VRSurus communicates with the software on a laptop via a wireless connection. All electronic components are connected to an Arduino Pro Mini. Two LiPo batteries power the microcontroller and the wireless module separately.

Since all sensors and actuators must be mounted on the puppet, we fabricated a flexible cable chain because (a) it can adapt to various sizes of puppeteers' arms, and (b) the hollow structure of each cable-chain unit is well suited to housing larger components (e.g., motors). Both the hat-like case and the cable chain are 3D printed in PLA on an Ultimaker printer². All component specifications, circuitry schematics and 3D print files are available at https://github.com/ruofeidu/ninjaterp.

Software Design
The software part of VRSurus consists of a gesture recognizer, a proof-of-concept VR game and a web server.

² Ultimaker printer: https://ultimaker.com/


Figure 6: The three gestures we designed to control the virtual character: (a) swiping from left to right triggers the character to swing its nose and blow wind, (b) shaking up and down triggers the character to splash water, and (c) thrusting from inward to outward triggers the character to howl, trample the ground and throw fireballs. A live demonstration is included in the supplementary video.

Gesture Recognizer
We trained our classifier using the decision tree algorithm in Weka [5]. In total, we used the following sixteen features: the sum of the mean values of all axes, the differences between each pair of axes, the power of each axis, the range of each axis, the cross product between each pair of axes, and the standard deviation of values along each axis. To recognize the four gestures (idle, swiping, shaking and thrusting), we collected 240 sets of raw accelerometer values for each gesture from 4 lab members (60 sets per gesture per person). We achieved an average of 97% accuracy using 5-fold cross validation. The classifier and all signal-processing programs are written in Java.
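To make the feature set concrete, the sketch below illustrates the extraction step, written in JavaScript purely for illustration (the actual classifier and signal processing are implemented in Java with Weka). The helper names are hypothetical, and "cross product between each pair of axes" is interpreted here as the mean element-wise product of two axis signals, which is an assumption.

// Minimal sketch of the sixteen-feature extraction (illustrative only).
// `samples` is an array of {x, y, z} accelerometer readings for one gesture segment.
function extractFeatures(samples) {
  const series = ['x', 'y', 'z'].map(a => samples.map(s => s[a]));

  const mean = v => v.reduce((s, x) => s + x, 0) / v.length;
  const std = v => {
    const m = mean(v);
    return Math.sqrt(v.reduce((s, x) => s + (x - m) * (x - m), 0) / v.length);
  };
  const power = v => v.reduce((s, x) => s + x * x, 0) / v.length;   // mean squared value
  const range = v => Math.max(...v) - Math.min(...v);
  // "Cross product" between two axes, read here as the mean element-wise
  // product of the two axis signals (an assumption, not the authors' definition).
  const cross = (a, b) => mean(a.map((x, i) => x * b[i]));

  const [mx, my, mz] = series.map(mean);
  const features = [];
  features.push(mx + my + mz);                                // 1: sum of axis means
  features.push(mx - my, my - mz, mx - mz);                   // 3: pairwise mean differences
  series.forEach(v => features.push(power(v)));               // 3: power per axis
  series.forEach(v => features.push(range(v)));               // 3: range per axis
  features.push(cross(series[0], series[1]),
                cross(series[1], series[2]),
                cross(series[0], series[2]));                 // 3: pairwise cross terms
  series.forEach(v => features.push(std(v)));                 // 3: standard deviation per axis
  return features;                                            // 16 features in total
}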

Figure 7: Virtual characters in the VR game. (a) shows the hero Surus controlled by the puppeteer, (b) shows the enemies, and (c) shows the Oculus Rift HMD rendering of a lumberman being defeated by Surus.

VR Game
We designed and implemented a proof-of-concept serious VR game based on WebGL and WebVR, written in JavaScript and PHP. We use Three.js³ for cross-platform rendering and Sea3D Studio⁴ for rendering the animation in WebGL. In the VR game, the player acts as a little elephant called Surus to prevent evil humans from invading the forest. The player uses the defined gestures to attack, as described in the Ideation section. A target can be litter, a lumberman or a polluting factory. The goal of the game is to defeat as many enemies as possible within a limited time. We designed a tutorial session before the game, encouraging players to familiarize themselves with each gesture and all the enemies. When a lumberman appears, the game plays 3D audio of axe chopping, positioned according to the orientation of the lumberman relative to the player. Meanwhile, the game sends signals to VRSurus to activate the left or right solenoid to tap on the user's arm. When a factory is destroyed, the game outputs vibration feedback along with the 3D audio of an explosion. After each gesture is successfully performed, the servo motors are signaled to pull the elephant's ears and forelegs backward and forward, simulating physical movements of the puppet. We also wrote a web server in PHP that exchanges signals between the gesture recognizer and the VR game every 10 ms. The game runs at 60 fps.

³ Three.js: http://www.threejs.org
⁴ Sea3D: http://sea3d.poonya.com
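As an illustration of this exchange loop, the sketch below shows how the browser-side game might poll the relay server for recognized gestures and push actuator commands back. The endpoint names (/state.php, /actuate.php) and message fields are hypothetical assumptions; the paper only specifies that a PHP web server relays signals every 10 ms.

// Illustrative sketch of the 10 ms exchange loop between the browser game and
// the relay server (endpoint names and fields are assumptions).
function startSignalLoop(onGesture) {
  setInterval(async () => {
    const res = await fetch('/state.php');      // poll the latest recognized gesture
    const state = await res.json();             // e.g. { gesture: "swipe" }
    if (state.gesture && state.gesture !== 'idle') {
      onGesture(state.gesture);                 // trigger the matching attack animation
    }
  }, 10);
}

// Push actuator commands (solenoid taps, vibration, servo pulls) back to the device,
// e.g. sendHaptics({ solenoid: 'left' }) when a lumberman appears on the player's left.
function sendHaptics(command) {
  fetch('/actuate.php', { method: 'POST', body: JSON.stringify(command) });
}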


Preliminary Deployment
VRSurus was initially deployed at the ACM UIST 2015 Student Innovation Contest in Charlotte. It was demonstrated live throughout the entire 2.5-hour contest. Each game session lasted about two minutes (exactly one minute for the VR game and about one minute for the introduction and hardware preparation). In total, 63 participants from the UIST community, including two invited K-12 specialists, tried out our device, and more than a hundred people watched the device and the gameplay. Overall, the audience's reactions were positive. After successfully destroying an enemy in the VR environment, many participants cheered and some even laughed out loud. Some people mentioned that the tutorial greatly helped them learn the gestures and that the virtual reality environment was really impressive. People were also greatly encouraged when, at the end, the game told them how many trees they had "saved with their own hands (gestures)".

Figure 8: During the preliminary deployment, participants from the UIST community interacted with our prototype of VRSurus at the ACM UIST 2015 Student Innovation Contest.

We also received some criticisms and suggestions for future work. One major concern was that the current prototype is somewhat fragile and heavy to wear over an extended period, although the device is small enough to be mounted on the puppet. Some people also mentioned that the gesture recognizer failed to recognize their gestures when they moved slowly, and that the tutorial did not give them any feedback. During the deployment we therefore sometimes instructed people to move faster so that the accelerometer generated more distinguishable data. In our next iteration, we plan to train the recognizer with more gestures from more volunteers and at varying rates of movement.

Conclusion and Future Work
In this paper, we have described a prototype device that enhances the interactivity of puppets with gesture recognition and haptic feedback for virtual reality gaming. We received initial feedback from participants at the ACM UIST 2015 Student Innovation Contest in Charlotte.

We have identified several directions for future work. One is to add more input modalities for richer animations. For example, by embedding flex sensors in different parts of the puppet (e.g., ears and nose), the virtual character could animate the corresponding parts when the puppeteer manipulates the physical puppet in the real world. By adding a microphone, we could support performances that kinetic input alone cannot, such as conversations with other virtual characters. Additionally, VRSurus could become a promising storytelling tool for expressing children's creativity. For instance, a child could experience others' stories with recorded sound, tactile feedback and physical movements all at the same time.

We also intend to conduct a long-term case study to examine how children would like to interact with smart puppets and what they can learn from immersive gaming experiences.

Acknowledgements
We would like to thank Sai Yuan from the Department of Animal and Avian Sciences at the University of Maryland for helping sew the puppets and for the vocal and audio production in our demonstration software and video. We are also grateful to Prof. Amitabh Varshney for his suggestions that have helped improve the paper. We also thank the organizers of the UIST Student Innovation Contest for providing the servo motors, as well as the UIST participants for their valuable feedback.

We appreciate the support of NSF grant 14-29404 and the NVIDIA CUDA Center of Excellence. Any opinions, findings, conclusions, or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the research sponsors.


REFERENCES

1. Connelly Barnes, David E Jacobs, Jason Sanders, Dan B Goldman, Szymon Rusinkiewicz, Adam Finkelstein, and Maneesh Agrawala. 2008. Video Puppetry: A Performative Interface for Cutout Animation. ACM Transactions on Graphics (TOG) 27, 5 (2008), 124:1–124:10.

2. Olivier Bau and Ivan Poupyrev. 2012. REVEL: Tactile Feedback Technology for Augmented Reality. ACM Transactions on Graphics (TOG) 31, 4 (2012), 1–11.

3. Ari Y Benbasat and Joseph A Paradiso. 2002. An Inertial Measurement Framework for Gesture Recognition and Applications. In Gesture and Sign Language in Human-Computer Interaction. Springer, 9–20.

4. Keywon Chung, Michael Shilman, Chris Merrill, and Hiroshi Ishii. 2010. OnObject: Gestural Play With Tagged Everyday Objects. In Adjunct Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST). ACM, 379–380.

5. Mark Hall, Eibe Frank, Geoffrey Holmes, Bernhard Pfahringer, Peter Reutemann, and Ian H Witten. 2009. The WEKA Data Mining Software: An Update. ACM SIGKDD Explorations Newsletter 11, 1 (2009), 10–18.

6. Robert Held, Ankit Gupta, Brian Curless, and Maneesh Agrawala. 2012. 3D Puppetry: A Kinect-Based Interface for 3D Animation. In Proceedings of the 25th Annual ACM Symposium on User Interface Software & Technology (UIST). ACM, 423–434.

7. Alexander Kron and Günther Schmidt. 2003. Multi-Fingered Tactile Feedback From Virtual and Remote Environments. In 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. 16–23.

8. Kevin A Li, Patrick Baudisch, William G Griswold, and James D Hollan. 2008. Tapping and Rubbing: Exploring New Dimensions of Tactile Feedback With Voice Coil Motors. In Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology (UIST). ACM, 181–190.

9. Pedro Lopes, Alexandra Ion, and Patrick Baudisch. 2015. Impacto: Simulating Physical Impact by Combining Tactile Stimulation with Electrical Muscle Stimulation. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST). ACM, 11–19.

10. Jesús Ibáñez Martínez. 2014. emoPuppet: Low-Cost Interactive Digital-Physical Puppets with Emotional Expression. In Proceedings of the 11th Conference on Advances in Computer Entertainment Technology. ACM, 44:1–44:4.

11. Ali Mazalek, Sanjay Chandrasekharan, Michael Nitsche, Tim Welsh, Paul Clifton, Andrew Quitmeyer, Firaz Peer, Friedrich Kirschner, and Dilip Athreya. 2011. I'm in the Game: Embodied Puppet Interface Improves Avatar Control. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (TEI). ACM, 129–136.

12. Ali Mazalek and Michael Nitsche. 2007. Tangible Interfaces for Real-time 3D Virtual Environments. In Proceedings of the International Conference on Advances in Computer Entertainment Technology. ACM, 155–162.

13. Sushmita Mitra and Tinku Acharya. 2007. Gesture Recognition: A Survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 37, 3 (2007), 311–324.

14. Kouichi Murakami and Hitomi Taguchi. 1991. Gesture Recognition Using Recurrent Neural Networks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 237–242.

15. Emil Polyak. 2012. Virtual Impersonation Using Interactive Glove Puppets. In SIGGRAPH Asia 2012 Posters. ACM, 31.

16. Zhou Ren, Junsong Yuan, and Zhengyou Zhang. 2011. Robust Hand Gesture Recognition Based on Finger-Earth Mover's Distance with a Commodity Depth Camera. In Proceedings of the 19th ACM International Conference on Multimedia (MM). ACM, 1093–1096.

17. Dominik Schmidt, Robert Kovacs, Vikram Mehta, Udayan Umapathi, Sven Köhler, Lung-Pan Cheng, and Patrick Baudisch. 2015. Level-Ups: Motorized Stilts that Simulate Stair Steps in Virtual Reality. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI). ACM, 2157–2160.

18. Ronit Slyper, Guy Hoffman, and Ariel Shamir. 2015. Mirror Puppeteering: Animating Toy Robots in Front of a Webcam. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI). 241–248.

19. John Weissmann and Ralf Salomon. 1999. Gesture Recognition for Virtual Reality Applications Using Data Gloves and Neural Networks. In International Joint Conference on Neural Networks (IJCNN), Vol. 3. IEEE, 2043–2046.

20. Yupeng Zhang, Teng Han, Zhimin Ren, Nobuyuki Umetani, Xin Tong, Yang Liu, Takaaki Shiratori, and Xiang Cao. 2013. BodyAvatar: Creating Freeform 3D Avatars Using First-person Body Gestures. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (UIST). ACM, 387–396.