Robotic Application of Human Hand Gestures
Ali El-Gabri, Al-Noor Academy
Nathaniel Mahowald, Mass Academy
Grad Students: Dimitri Kanoulas and Andreas Ten Pas
PI: Robert Platt
Introduction
The fundamental goals of this project are:
1. Recognizing gestures that make the robot pick up objects
2. Pointing at the object the user wants the robot to pick up
How?
3. Creating interfaces between the computer and the sensor
4. Creating interfaces between the sensor and the robot
Materials
The materials used were:
1. ROS Hydro Medusa
• Robot Operating System
• Provides several helpful tools
• Hydro Medusa is the 7th ROS release
2. Xtion Pro
• A depth camera, like a Kinect
• Makes gesture tracking precise
3. Baxter
• A robot with two arm manipulators
Methods
1. Install ROS Hydro Medusa
2. Install openni_launch
• Camera driver
3. Install openni_tracker
• Creates a skeleton for any person in front of the camera
4. Set up a catkin workspace
5. Write Python code to communicate between the sensor and the robot
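The install steps above can be sketched as shell commands. This is a sketch for Ubuntu with the standard ROS apt repositories configured; the exact package names (and whether openni_tracker needed a source build on Hydro) are assumptions.

```shell
# Install the OpenNI camera driver and skeleton tracker for ROS Hydro
sudo apt-get install ros-hydro-openni-launch ros-hydro-openni-tracker

# Create and initialize a catkin workspace
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace
cd ~/catkin_ws
catkin_make
source devel/setup.bash
```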
Sub-project 1: Directional Pointer
1. Work with Rviz
• ROS Visualization; visualizes camera feed
2. Set up the Transforms (TF) in Rviz
• Keeps track of 3D frames’ change over time
• Operates in a distributed system
3. Create a TF listener in Python
• Receives coordinate frames
• Query for specific TFs between frames
4. Functional code
• Informs the user of the direction the arm points along in x, y, z
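A minimal sketch of step 4: in the real system the shoulder and hand positions would come from the TF listener (e.g. a query between the tracker's shoulder and hand frames); here they are plain tuples, and the function name is hypothetical.

```python
def arm_direction(shoulder, hand):
    """Given shoulder and hand positions (x, y, z), report the axis
    the arm points along most strongly, with its sign."""
    v = [h - s for s, h in zip(shoulder, hand)]        # shoulder -> hand vector
    axis = max(range(3), key=lambda i: abs(v[i]))      # dominant component
    sign = "+" if v[axis] >= 0 else "-"
    return sign + "xyz"[axis]

# Hand is mostly to the +x side of the shoulder
print(arm_direction((0.0, 1.4, 2.0), (0.6, 1.3, 2.1)))  # → +x
```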
Sub-project 2: Body Part Pointer
1. Have two users on screen
2. Point with left hand
• Display the body part being pointed at for either user
• Display which user is pointing at which body part
3. How this helps:
• More work with Rviz
• Rviz already recognized human bodies
• Experimented with dot products, matrices, and the creation of vectors
• First step to pointing at other things
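The dot-product experiment above can be sketched as follows: build a pointing ray from the elbow to the hand, then pick the target joint whose direction from the hand best aligns with that ray (largest dot product of unit vectors). The function names and joint positions are illustrative, not the project's actual code.

```python
def normalize(v):
    """Scale a 3D vector to unit length."""
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def pointed_body_part(hand, elbow, targets):
    """Return the name of the target joint whose direction from the hand
    best aligns with the elbow->hand pointing ray."""
    ray = normalize(tuple(h - e for e, h in zip(elbow, hand)))
    def alignment(item):
        name, pos = item
        to_target = normalize(tuple(p - h for h, p in zip(hand, pos)))
        return sum(a * b for a, b in zip(ray, to_target))  # cosine similarity
    return max(targets.items(), key=alignment)[0]

# The other user's joints (positions made up for illustration)
targets = {"head": (1.0, 1.7, 2.0), "torso": (1.0, 1.0, 2.0)}
print(pointed_body_part(hand=(0.3, 1.7, 2.0), elbow=(0.0, 1.7, 2.0), targets=targets))  # → head
```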
Sub-project 3: Gesture Control
1. A method of gesture-based control without a fixed frame
2. The first place where we fixed our user buildup problem
3. Went through a few drafts of which positions worked
4. The first sub-project we ran on the robot
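Control without a fixed frame can be sketched by defining gestures relative to the user's own joints, so only the skeleton matters, not where the user stands. The specific poses and command names below are assumptions; the poster does not record which position drafts were finally used.

```python
def detect_gesture(head, left_hand, right_hand):
    """Classify a simple pose using only the user's own joints
    (positions are (x, y, z) with y up), so no fixed world frame
    is needed."""
    if right_hand[1] > head[1]:   # right hand raised above the head
        return "pick_up"
    if left_hand[1] > head[1]:    # left hand raised above the head
        return "release"
    return "none"

print(detect_gesture(head=(0.0, 1.7, 2.0),
                     left_hand=(0.3, 1.0, 2.0),
                     right_hand=(-0.3, 1.9, 2.0)))  # → pick_up
```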
Final Project: Any Frame Pointer
1. We couldn’t get a “true” pointer without creating a fixed frame; our solution was calibration
2. Extremely accurate
3. Uses left hand as a signal that user is pointing
4. Potential extensions
Video of Final Product
Next Steps
1. Use voice recognition software to interact with the user
2. Create a pointer that does not require calibration to establish a fixed frame
3. Compile all the code onto a usable device, so that a disabled person could use a robotic arm to pick up objects they need
Acknowledgements
A special thanks to our very helpful Grad students, Dimitri Kanoulas and Andreas Ten Pas.
A very warm appreciation to Robert Platt, our ever wise PI.
And, of course, to those who made it possible and walked with us every step of the way:
Claire Duggan, Program Director
Kassi Stein, Program Coordinator
Chi-Yin Tse, Program Coordinator