


MIMIC: Multi-user Interaction with MultICopters

(System overview diagram: tracking and multicopter control exchange data via UDP with gesture recognition and the readout of Kinect data.)
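
As a rough illustration of the kind of UDP message exchange such an architecture might use, the following minimal Python sketch sends and receives copter poses as JSON datagrams. The port number, address, and message layout are assumptions for illustration, not the actual MIMIC protocol.

    # Minimal sketch of UDP-based data exchange between components.
    # Port number and message layout are assumptions, not the MIMIC protocol.
    import json
    import socket

    POSE_PORT = 5005  # hypothetical port for pose messages

    def send_pose(copter_id, position, orientation, host="127.0.0.1"):
        """Send a copter pose (position plus orientation quaternion) as a JSON datagram."""
        msg = json.dumps({
            "copter": copter_id,
            "position": position,        # [x, y, z]
            "orientation": orientation,  # [qx, qy, qz, qw]
        }).encode("utf-8")
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(msg, (host, POSE_PORT))

    def receive_poses():
        """Listen on the pose port and yield decoded pose messages."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind(("0.0.0.0", POSE_PORT))
            while True:
                data, _addr = sock.recvfrom(4096)
                yield json.loads(data.decode("utf-8"))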

Prof. Dr. Ulrich Schwanecke, [email protected], T 0611 9495-1254

Annalena Gutheil, B.Sc., [email protected]

Hochschule RheinMain, Fachbereich Design Informatik Medien, Computer Vision & Mixed Reality Group, Campus Unter den Eichen 5, 65195 Wiesbaden

http://cvmr.mi.hs-rm.de/icarus

While controlling multicopters with a remote control requires skill and practice, MIMIC allows natural and intuitive interaction between humans and multicopters.

In addition to gestures for takeoff and landing, the user can freely control the copter by hand so that it mimics the movements of the hand. This makes it particularly easy to prescribe the copter's movements.
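
One plausible way to make a copter mimic the hand, assuming hand and copter poses are known in the same world frame, is to apply the hand's displacement since selection to the copter's position at selection time. The sketch below is an illustrative assumption, not the MIMIC implementation.

    # Illustrative sketch: the copter position setpoint reproduces the
    # hand's displacement since the moment of selection.
    import numpy as np

    class HandFollower:
        def __init__(self, hand_at_selection, copter_at_selection):
            self._hand0 = np.asarray(hand_at_selection, dtype=float)
            self._copter0 = np.asarray(copter_at_selection, dtype=float)

        def setpoint(self, hand_now):
            """New copter setpoint = copter start position + hand displacement."""
            return self._copter0 + (np.asarray(hand_now, dtype=float) - self._hand0)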

To select a copter, the user points at it with his hand. The copter is then controlled by hand: it follows the hand's movements until the user clenches his fist, which ends the hand control.

This enables other users to take control of the copter and also ensures that each copter is assigned to at most one user at any time.
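
The selection and hand-off behaviour described above amounts to a small ownership model: a copter follows exactly one user's hand until that user clenches a fist. The following simplified Python sketch shows such logic under that assumption; the class and method names are illustrative and not taken from the MIMIC code base.

    # Simplified sketch of the one-copter-per-user ownership logic.
    # Names are illustrative assumptions, not the actual MIMIC implementation.
    class CopterOwnership:
        def __init__(self):
            self._owner_of = {}  # copter_id -> user_id

        def select(self, user_id, copter_id):
            """A user points at a copter: grant control only if the copter is free."""
            if copter_id in self._owner_of:
                return False  # already controlled by another user
            self._owner_of[copter_id] = user_id
            return True

        def release(self, user_id, copter_id):
            """A clenched fist ends hand control and frees the copter for other users."""
            if self._owner_of.get(copter_id) == user_id:
                del self._owner_of[copter_id]

        def owner(self, copter_id):
            return self._owner_of.get(copter_id)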

MIMIC is integrated into the ICARUS project (Infrastructure for Compact Aerial Robots Under Supervision) of RheinMain University. The positions and orientations (poses) of the copters are determined using the active optical HSRM (high-speed and robust monocular) tracking developed by the CVMR (Computer Vision and Mixed Reality) working group.

A Microsoft Kinect provides information about the positions and movements of the users. Integrating this information into the common world coordinate system in which the multicopters move allows direct interaction between users and copters.
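
Bringing Kinect measurements into the copters' world coordinate system essentially means applying a calibrated rigid transform to each measured point. The sketch below assumes such a 4x4 homogeneous transform is already known from an extrinsic calibration; obtaining it is outside this example.

    # Sketch of mapping a Kinect skeleton point into the common world frame.
    # The calibration matrix T_world_from_kinect is an assumed input.
    import numpy as np

    def kinect_to_world(point_kinect, T_world_from_kinect):
        """Transform a 3D point from the Kinect camera frame to world coordinates.

        point_kinect        -- (x, y, z) in the Kinect's camera frame
        T_world_from_kinect -- 4x4 homogeneous transform from calibration
        """
        p = np.append(np.asarray(point_kinect, dtype=float), 1.0)  # homogeneous coords
        return (T_world_from_kinect @ p)[:3]

    # Example: a hand position reported by the Kinect, mapped into the world frame
    # (an identity transform stands in for a real calibration result here).
    hand_world = kinect_to_world((0.2, -0.1, 1.5), np.eye(4))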