
Evaluation of recent game interfaces for the command of a humanoid robot

Duarte Cordeiro Ferreira Osório de Aragão

Dissertation for obtaining the Master's Degree in

Engenharia Informática e de Computadores

Jury

President: Prof. José Delgado
Supervisor: Prof. José Santos-Victor
Co-supervisor: Prof. Alexandre Bernardino
Members: Prof. Francisco S. Melo

June 22, 2011


Acknowledgments

First, I would like to thank Professor José Santos-Victor, my supervisor, and Professor Alexandre Bernardino, my co-supervisor, for their support throughout the development of this dissertation.

I also want to thank all of the RobotCub community, and the Vislab workers and students with whom I discussed and learned many of the things that made this work possible. Special thanks to Ricardo Nunes, Nuno Conraria, Giovanni Saponaro, and Ashish Jain, for repairing the damage I did to the robot, proofreading early versions of this dissertation, and constantly challenging me to do just one more thing.

I extend my thanks to all my friends for their support and friendship during this period, and to Mafalda Fernandes for her great help with the images in this work.

Finally, and foremost, I would like to thank my family, parents and siblings, for their endless support and encouragement throughout my academic path, and particularly during the development of this dissertation.


Resumo

This dissertation explores the use of new and traditional interfaces for the control of a humanoid robot. The new interfaces used were the remote control of the Wii console and the Kinect sensor of the Xbox 360 console; the traditional interfaces used were a computer graphical interface and the directional pad of a remote control.

The complexity of humanoid robots such as the iCub, and the need for interaction by humans, whether to control their pose or for learning by demonstration, led to the study of new forms of interaction with humanoids. These new forms of interaction are expected to be as simple and clear as the traditional ones, but also to allow more interactivity than the older solutions. The new interfaces introduced by the gaming industry are a good way to achieve this end.

To support the comparison between these interfaces, a study was carried out. This study consisted of tests with simple tasks of touching objects placed within the iCub's reach. The comments, involvement, and ease of use were recorded and compared across the several users, associating each with the weight of the interface's characteristics in the performance of the evaluated tasks.

The test results suggest that different users adapt in distinct ways to the same interface; bearing this in mind, it is shown that some of the new interfaces can be successfully adapted, with higher performance, to the control of a humanoid robot.

Keywords: interaction, humanoid, comparison, Kinect sensor and Wii remote


Abstract

This dissertation explores novel and traditional interaction devices as interfaces for humanoid robot control. The novel devices used were the remote control of the Wii gaming console and the Kinect sensor for the Xbox 360 gaming console; the traditional devices used were a computer graphical interface and a remote control directional pad.

The complexity of humanoid robots such as the iCub, and the need for people to interact with them, either simply to control their pose or for imitation learning, led to interest in new forms of interaction with the robot. The new forms of control are expected to be as precise as the previous ones, but to allow a more natural and intuitive interaction. Recent interaction devices released by the gaming industry are a good way to achieve this goal.

To understand how these interfaces compare, an evaluation was carried out, consisting of tests with objects for the iCub to reach while being controlled with one of the proposed interaction systems. Differences between users, their comments, their involvement, and ease of use were compared and associated with user profiles and interface characteristics.

The test results suggest that different users adapt differently to the same interface; with this in mind, it was successfully shown that the new interaction devices can be adapted to obtain better performance when controlling a humanoid robot.

Keywords: interaction, humanoid, comparison, Kinect sensor and Wii remote


Contents

Resumo
Abstract
List of Figures
List of Tables
List of Abbreviations

1 Introduction
  1.1 Motivation
  1.2 Contribution
  1.3 Outline

2 Related Work
  2.1 Human-Robot Interaction (HRI)
  2.2 Wii remote
    2.2.1 Specifications
    2.2.2 Wii Motion Plus (WM+)
    2.2.3 Wii Remote (Wiimote) interaction
  2.3 Kinect
    2.3.1 Specifications
    2.3.2 Application Programming Interface (API)
    2.3.3 Kinect interaction
  2.4 iCub
    2.4.1 iCub Robot
    2.4.2 Yet Another Robot Platform (Yarp)
  2.5 Concluding remarks

3 Proposed Interface Systems
  3.1 Concept
  3.2 iCub
  3.3 Graphical User Interface (GUI)
  3.4 Wiimote
    3.4.1 Motor control
    3.4.2 Kinematic control
  3.5 Kinect
    3.5.1 Hand kinematic control
    3.5.2 Skeleton motor control
  3.6 Concluding remarks

4 Applications
  4.1 Application architecture
  4.2 Main libraries used
    4.2.1 Yarp
    4.2.2 iCub
  4.3 Wiimote modules
  4.4 Kinect modules
  4.5 Concluding remarks

5 Evaluation
  5.1 Tests description
  5.2 Tests results
  5.3 Questionnaire results
  5.4 User comments
  5.5 Important issues
  5.6 Concluding remarks

6 Conclusion
  6.1 Research summary
  6.2 Contributions
  6.3 Thesis statement
  6.4 Future work

A Related work Appendix
  A.1 Wiimote manual

B Evaluation Appendix
  B.1 Tests presentation
  B.2 Questionnaire form
  B.3 Questionnaire results

Bibliography


List of Figures

2.1 Wiimote with the Motion Plus extension attached
2.2 Microsoft Kinect sensor
2.3 Kinect projects examples
2.4 iCub robot
3.1 Computer motor interface
3.2 Wii remote axes
3.3 Wiimote controlled motors in the iCub arm
3.4 Wiimote motor control of the iCub
3.5 Wiimote kinematic control of the iCub
3.6 Wiimote cursor control of the iCub
3.7 Kinect detection made by sample programs
4.1 iCub Network Diagram
4.2 Simple representation of Yarp Device Driver main classes
4.3 Simple representation of the Interaction Module main classes
4.4 Wiimote modules diagram
4.5 Kinect modules diagram
5.1 Interfaces test scenery
5.2 Task time comparison
5.3 Task error comparison
5.4 Tests steps time
5.5 Error types for the touch object task
5.6 Error types for the avoid object task


List of Tables

5.1 User tests per control and device
5.2 Interaction control ranking


List of Abbreviations

HCI Human-Computer Interaction

HRI Human-Robot Interaction

TUI Tangible User Interface

Wiimote Wii Remote

WM+ Wii Motion Plus

IR Infrared

MOT Multi Object Tracking

LED Light-Emitting Diode

HID Human Interface Device

DOF Degree of Freedom

DSP Digital Signal Processor

CAN Controller Area Network

Yarp Yet Another Robot Platform

API Application Programming Interface

FPS Frames per Second

ToF Time-of-Flight

GUI Graphical User Interface

SVN Apache Subversion

TCP Transmission Control Protocol

SDK Software Development Kit

RGB Red Green Blue

RPC Remote Procedure Call

I2C Inter-Integrated Circuit

D-Pad Directional Pad


Chapter 1

Introduction

This introductory chapter elaborates on the motivation and contributions on which this work is based.

1.1 Motivation

Human-Robot Interaction (HRI) is a part of the Human-Computer Interaction (HCI) field that is becoming more fundamental each day, with the progress in robotics and the discovery of the specific problems that arise from the robotics area [26]. The hope that robots might some day populate the world and help us with our daily tasks is a distant truth in most people's minds, often because the simple functions that robots perform nowadays are accompanied by complex interfaces that discourage most users from using them. Demands for better and original interaction forms in the industry led to the appearance of a set of interesting interfaces that are still to be explored and can introduce novel ways of interacting with robots. The mixture of these two worlds, games and robotics, has resulted in an interesting approach to interaction with robots by people with little or no knowledge about robots.

With a goal completely different from that of HRI, the gaming industry has released several interface devices that are sold worldwide at a low cost. Many of these gaming interfaces use state-of-the-art technology and allow an easy connection to a computer. Because of this cheap access to typically expensive hardware, many researchers and hobbyists have hacked these devices so that software can be developed to meet their personal needs; along with this hacking, communities were formed to support the work that was being done and that still needs to be done. These commercially available products have increasingly become the main focus of many projects that explore new and unexpected interaction techniques. Most of this exploration has been appearing in journals and conferences, as well as in on-line videos and wikis, usually with very technical and direct explanations. Researchers are trying to keep up with the pace of hobbyists, who are not worried about the coherence and consistency of their work, and to study how these devices can really be useful for their projects.

Because of their characteristics, many of these devices are well suited for use as human-robot interfaces: they are less expensive than the alternatives, and they are already known and used by a large group of people, mostly gamers, with very different characteristics. The amount of inspiration that can be drawn from the web, from works built by others, has been a plus not only for researchers but also for the manufacturers of these products, who have considered this kind of use an advantage1. With these robust gaming devices becoming more common, it is important to understand how to map the proper device to the proper interaction form. Humanoids have different needs from a mechanical arm; those needs should be taken into consideration when deciding which might be the best interaction device and form.

The iCub is a humanoid social robot involved in projects that range from psychology to the grasping of objects. Interaction with the arms of the robot is typically done either through programming or through specific software that allows each motor to be moved independently. Although functional, this system is neither simple nor intuitive for reaching a desired position, taking a long time to plan and execute each time a small change has to be made. This difficulty often becomes an obstacle: small changes that might help in the execution of a task are left undone because they are too expensive. Creating a simpler, usable interface can change this, resulting in faster development cycles. Interfaces that are able to do motion capture can also be very helpful in programming by demonstration, where human gestures need to be captured so that an agent, in this case a humanoid, can learn from them.

An interface ill suited to the function the robot is supposed to perform may prevent the success of that task. On the other hand, a good interface during development helps to speed up the work, allowing the research to be abstracted from minor details. If a researcher needs to position an arm to be able to test a grasp, it is time consuming and pointless to learn how to program a system that keeps positioning the arm properly just to make a grasp. A good interaction must be intuitive, natural, specific, and correspond to the user's expectations. To achieve this, what matters is not only the data that an interaction device may retrieve, but also how the robot we are interacting with responds. That mapping is what can make the difference in solving a user's interaction objective: a good mapping meets user expectations.

The novel interaction devices available at low prices from the gaming manufacturers might serve as a good platform for HRI work, although most of these devices are largely unexplored, and even the very basics still need to be studied and documented. The works created by hobbyists and the hacking community, which can be followed on-line, serve as good inspiration for the use of the novel devices, but it is important to better understand how mapping these devices to specific robots, with the tasks to be executed by the controlled robot in mind, affects the users' performance and success in executing the intended tasks.

1 http://www.techflash.com/seattle/2010/11/microsoft-kinect-not-hacked-left.html


1.2 Contribution

This work studies the advantages and disadvantages of two novel gaming interfaces, a typical Graphical User Interface (GUI), and a Directional Pad (D-Pad) as humanoid robot interfaces. It is hoped that the comparison between these systems, which have been increasingly used by researchers and developers, becomes useful as a starting point for future work on interfaces and interaction. The modules developed during this work may also be useful utilities for the iCub and Yarp community.

The gaming interfaces used were the Wii gaming console remote (the Wii Remote (Wiimote)) with the Wii Motion Plus (WM+) sensor extension, and the Kinect camera for the Xbox 360. These are novel interfaces for which few studies are available; much of the information about them is found on-line, in wikis and blogs. The Kinect was actually released during the development of this work, in November 2010, and was only reverse engineered for use with a computer by the end of this work. However, the interaction allowed by the Kinect camera is such a novelty that even simple tests with it can already produce interesting results, and it adapts to robotics in a very clear and intuitive way.

To understand which are the best and most interesting interaction devices, we developed several interaction modules; each module permitted two different interaction forms with the same device, a kinematic form and a motor form. Specific modules were therefore developed for the Wiimote and for the Kinect, taking advantage of different interaction forms.

The Wiimote allowed us to create three distinct interaction forms: (1) the movements made by the user are interpreted by the iCub, which moves in a predefined manner, always following the user's movement; (2) direct control of the iCub motors, where each type of rotation allows the user to control a different motor; (3) the cursor pad on the Wiimote is used to control the iCub arm directionally. The third form is considered a typical interface because using it with the Wiimote is no different from using it with any other device that has six buttons.

The Kinect had two interaction forms: (1) the hand movements of the user are interpreted by the iCub, which moves in a predefined manner, always following the user's hand; (2) the motors are controlled directly, mimicking the user's pose. These interfaces are switched on or off using the Wiimote.

To test these interaction methods, modules were created that can also be used for other purposes, as some already have been. These programs were developed in an open-source way, bearing in mind that another developer might reuse parts of the program or the data collected by it. An effort has also been made to implement some of these solutions as useful programs for the Vislab developers, to facilitate simple and repetitive tasks, such as positioning the parts of a robot properly for use. It is hoped that the data that can be collected with these new modules may be put to use as learning data sets for the robots' comprehension of human tasks and movements.

For the comparison, users with different backgrounds were asked to perform several tasks with each one of these interaction modules. The results of the evaluation can now be used as a starting point for other interaction works that might want to use this type of device for humanoid control. During the tests, annotations were made on the time it took users to perform tasks, the users' comments, and the difficulties the test supervisor sensed from the users.

Because of the modules' high adaptability, these interaction programs were extended to the Vizzy robot, a social and mobile robot, as a way of demonstrating that robot's capabilities. This robot is distinct from the iCub, but it also uses Yarp, which made it a good recipient for part of the modules developed for the iCub.

The modules developed explore the novel interaction devices that have been emerging from the gaming industry. The interaction devices used today for gaming are interesting, cheap replacements for typically expensive hardware. Due to their wide and easy availability, they have also led to a high output of interaction works. Although this is a very strong inspiration, it does not always result in proper interactions for the target system; we hope that the results of this work can contribute to a better understanding of how to adapt these novel devices to a humanoid.

1.3 Outline

This document is organized into six chapters, each with several sections.

Chapter 2 is the related work chapter; it presents a brief literature review of the fields of HRI, the Wiimote, the Kinect sensor, the iCub robot, and iCub-specific software such as Yet Another Robot Platform (Yarp).

Chapter 3 describes how the proposed interaction forms were developed and used for the interface comparison work, and some of the choices made.

Chapter 4 presents the software implementation techniques and options used for the interaction software developed.

Chapter 5 describes the tests done with the users to understand the characteristics of the interaction devices, and their results.

Chapter 6 is a concluding chapter, giving an overall view of the work done and the main conclusions drawn, as well as suggestions for future work.

With the exception of the first and last chapters, all chapters have a concluding remarks section, where an overview of the chapter is given.


Chapter 2

Related Work

In this chapter we address related work on HRI systems, the Wiimote, the Kinect, and the iCub.

2.1 Human-Robot Interaction (HRI)

A robot's autonomy level can be measured by the percentage of time that the robot does not need intervention to perform a task [25]. Nowadays robots are still not completely prepared to be autonomous and understand our language, but they have become useful as helpers in many tasks [3]. HRI is responsible for the way of "communicating" with a robot and controlling how it may help us in a task.
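This time-based notion of autonomy can be made concrete with a small worked example (the formula is a straightforward reading of the definition in [25], written out here for clarity): over a task of total duration T with T_i units of human intervention time, the autonomy level is

    autonomy = (T - T_i) / T x 100%

so a robot that needs 6 minutes of operator help during a one-hour task is 90% autonomous.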

The problem of natural movements in robots has been approached in several different ways; the solution is divided into two parts: the user's interaction and the robot's response. Relating both parts, the way we make a gesture is as important as the gesture itself [16] if we are to be understood, so it is important that a social robot such as the iCub is able not only to repeat a gesture, but also to reproduce it in a natural way, retrieving characteristics from the user's interaction.

In the case of the iCub robot, some works have recently been presented in which a user controls the robot by applying pressure on its artificial skin [19], or in which a robot movement can be programmed by a user who pushes the robot's parts. This type of interaction is particularly useful for learning by demonstration, because the information from some interaction devices can easily be cleaned up and sent to the robot. Several works on this subject already use one of these kinds of interaction [4]. This "communication" problem has also been faced from other perspectives; many approaches use teleoperation [9], where the user and the robot do not have any contact, not even needing to be in the same room.

In the last few years, several gaming interfaces introduced by gaming companies have contributed to the development of HCI, and HRI has taken advantage of them. Game controllers have been compared in other works [18] and considered suitable for motion capture; with these controllers it is possible to build cheap and robust interaction systems that are already known by a large group of users. Two very recent, but very sophisticated, examples of good gaming controllers are the Wiimote and the Xbox Kinect; these have made it possible for developers to access cheap accelerometers, gyroscopes, and depth cameras. From this accessibility, new possibilities for HRI systems have been appearing.

2.2 Wii remote

The Wiimote is the main controller for the Nintendo Wii gaming console. This remote control is differentiated from its competitors by its motion sensing capability, which allows the user to interact through gestures.

The Wiimote also allows expansion attachments to be used to augment its capabilities. These expansions range from the addition of gyroscopes to game-specific hardware such as a gamepad guitar or a steering wheel. Figure 2.1 shows a Wiimote with a Motion Plus extension attached, highlighted by a blue circle.

Figure 2.1: Wiimote with the Motion Plus extension attached.

The Wiimote was revealed at the Tokyo Game Show in September 2005. Due to its unique features at the time, it gained significant attention from hackers for non-Wii, non-gaming purposes [23].

There are now several other interesting interaction devices with similar features, some of which were revealed during the development of this work, such as the PlayStation 2 (PS2) EyeToy1, the PlayStation 3 (PS3) Eye2, the PS3 Move3, and the Microsoft Kinect4.

1 http://en.wikipedia.org/wiki/EyeToy
2 http://en.wikipedia.org/wiki/PlayStation_Eye
3 http://en.wikipedia.org/wiki/PlayStation_Move
4 http://en.wikipedia.org/wiki/Kinect


2.2.1 Specifications

Although the Wiimote's official specifications are unpublished, the global hacking community has collectively reverse-engineered a significant portion of the technical information regarding its internal workings. Much of this work has been collected in online wikis at http://wiili.org and http://wiibrew.org. Johnny Chung Lee has contributed a specification as a result of the wikis' information [12].

The Wiimote has an Infrared (IR) camera, an accelerometer, twelve push-buttons, a vibration motor, a speaker, a Bluetooth chip, internal flash memory, and an expansion port. A "Sensor Bar" with IR emitters is used together with the Wiimote, and the expansion port allows other devices to be connected to expand its functionality.

The IR camera sensor has a Multi Object Tracking (MOT) engine that allows high-speed, high-resolution tracking of IR sources. A maximum of four IR light sources can be tracked; they are mapped into a 1024x768 pixel space at a 100Hz refresh rate, according to the position and intensity of each IR light source.

The accelerometer embedded in the Wiimote, model ADXL330, was manufactured by Analog Devices. It is a 3-axis linear accelerometer with a sensitivity range of +/-3g (gravitational acceleration), and it returns 8 bits of information per axis at a 100Hz update rate.

There are 12 buttons, arranged symmetrically so that the remote can be held with either the left or the right hand. On the bottom of the Wiimote there is one trigger-like button; at the top there are four directional buttons (a typical D-Pad), plus and minus buttons, a home button, an A button, and two number buttons (one and two); inside the battery case there is a synchronization button. The trigger-like button is used with the index finger; all the others are best accessed with the thumb, except the synchronization button, which is hidden inside the battery case. Appendix A.1 presents the pages of the official Wii console manual that show and identify all the Wiimote buttons.

The vibration motor is similar to the ones found in mobile phones, and can only be set ON or OFF. The four Light-Emitting Diodes (LEDs) on the lower part of the Wiimote can be individually addressed and only have a binary state.

The speaker in the Wiimote's center can play 4-bit, 4kHz audio, similar to telephone quality. This is the part of the Wiimote about which there is the least information.

Communication is made through a Bluetooth connection, using a Broadcom 2042 chip designed for devices that conform to the Bluetooth Human Interface Device (HID) standard.

The Wiimote has an internal flash memory of approximately 5.5KB, which is used for the device settings, maintaining output, and storing data.

The expansion port, on the bottom part of the Wiimote, is a proprietary six-pin connector. This connector is used to communicate with a power extension, and with extensions such as the Nunchuk or the WM+. The port provides 3.3V power and a 400kHz Inter-Integrated Circuit (I2C) serial bus, to which a microcontroller can easily provide a Bluetooth-to-I2C bridge.


To operate, the Wiimote needs two AA batteries, which give an operating time of 20 to 40 hours. The battery level can be checked through an 8-bit value.

2.2.2 Wii Motion Plus (WM+)

The WM+ was released in June 2009 [22] as an extension for the Wiimote. The WM+ makes Wiimote interaction more precise by measuring the rate of rotation applied to the remote.

Inside the WM+ there is a dual-axis gyro by InvenSense, the IDG-600 (pitch and roll), and a single-axis gyro by EPSON TOYOCOM labeled X3500W (yaw). Together, these gyroscopes measure the rate of rotation along all three axes: X (pitch), Y (roll), and Z (yaw), as shown in figure 3.2. Specifications of the X3500W are currently unknown, but the IDG-600 specifications can be found on the InvenSense website5:

• Highest range, measuring fast controller motions up to ±2,000°/sec full scale.

• Two separate outputs per axis for standard and high sensitivity; this reduces system cost by permitting the use of lower-resolution analog-to-digital converters (ADCs). Depending on the speed detected, the sensitivity changes.

• The IDG-650 has an integrated single-chip in-plane design with superior X/Y cross-axis alignment. The ISZ-650 single-axis (Z) gyro complements the IDG-650 dual-axis (X/Y) gyro for a 3-axis gyro solution where all sensors mount in-plane with the other system electronics.

• Superior vibration rejection by design.

• Highest shock resistance, at 10,000g.

• Auto-zero function minimizes bias drift.

• World's smallest dual-axis gyro form factor, at 4x5x1.2mm.

• The Nasiri-Fabrication platform delivers inherent size, performance, and cost advantages unattainable with competing fabrication processes.

2.2.3 Wiimote interaction

With the Wiimote's introduction into the mass market, there was high interest from the scientific community in using this device for its own research. As cited by many works [15, 1], the Wiimote has the advantage of being inexpensive, and it can be used as a tangible, natural interaction device.

The Wiimote is considered a spatially convenient device [24], because it respects three main characteristics: it provides spatial data, it has sensors and emitters, and it is inexpensive and durable. Taking these characteristics into consideration, the Wiimote is defined as a useful 3D user interface.

5 http://invensense.com/mems/gaming.html


Works using this device range from motion control systems [5] to robotic user interfaces [2], and even art projects [11]. Its flexibility and ease of use have been presented as an efficient interface for robotic control in several works [15, 2, 8]. These works typically compare traditional interface methods with the Wiimote interface, where the accelerometer and IR camera are used to control a robot or a 3D environment.

The accelerometer in the Wiimote makes it possible to independently obtain the current orientation of the remote and the amount of force applied to it with respect to gravity [8]. Examples of this approach are described in works that use it to capture human motion models [21], interface with robots [8], or assist in rehabilitation therapy [10]. One of the main challenges with the use of the Wiimote's accelerometers is the noise they produce. To avoid this problem, a Kalman filter is often used; it filters out the noise by observing the noisy data retrieved over time and calculating estimates of the true values of that data. Also, the accelerometer cannot retrieve the orientation of the Wiimote in all degrees of freedom, because a rotation made around the axis of gravity is not detected. This limits the Wiimote to gesture detection; it does not really do motion tracking except in very controlled situations.
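To make the accelerometer's limits concrete, the following sketch estimates pitch and roll from a single gravity-referenced reading; the axis convention and the Accel type are assumptions made for this example, not part of any Wiimote library.

    #include <cmath>

    // Hypothetical container for one accelerometer sample, in units of g.
    struct Accel { double x, y, z; };

    // Estimate pitch and roll (radians) from gravity alone. Only valid while
    // the remote is held still, since linear motion adds to the measured vector.
    void tiltFromGravity(const Accel& a, double& pitch, double& roll) {
        pitch = std::atan2(-a.x, std::sqrt(a.y * a.y + a.z * a.z));
        roll  = std::atan2(a.y, a.z);
        // Yaw is left undefined on purpose: rotating around the gravity axis
        // leaves (x, y, z) unchanged, which is exactly the limitation noted above.
    }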

To overcome this, the IR camera can detect up to four different IR sources; with that data it is possible to determine the Wiimote's position by triangulation, with the "Sensor Bar" serving as reference point. Although the typical usage of this feature is with the Wiimote/camera in the hand, many of the studied systems keep the IR camera still and detect a moving IR light source. This makes the developed interface less intrusive, since the user only needs to carry an invisible (IR) light source; the main disadvantage of this solution is the spatial limitation of the camera angle. Following the fixed camera idea, experiments have been made with two Wiimotes [14, 20], which gives better definition; the opposite technique has also been tested, using two light sources in distinct places to make the triangulation [1].

The WM+ was sold as a Wiimote extension, although nowadays its gyroscope ships as part of the Wiimote device. The introduction of this gyroscope gives the Wiimote more precise data about angular changes. Merging the accelerometer data with the gyroscope data allows more accurate orientation tracking [24, 7]. Although the WM+ adds accuracy, it obliges the user to perform a calibration before use, which might not be so straightforward.
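A Kalman filter is one way to perform this merge; a complementary filter is a common lighter-weight alternative, sketched below to make the fusion idea concrete (the blend factor is an illustrative assumption, not a value taken from the cited works):

    // One-axis complementary filter: integrate the WM+ gyro rate for short-term
    // accuracy, and pull slowly toward the accelerometer tilt estimate to cancel
    // the gyro's drift. An alpha close to 1 trusts the gyro more.
    double fuseAngle(double prevAngle, double gyroRate, double accelAngle,
                     double dt, double alpha = 0.98) {
        double gyroAngle = prevAngle + gyroRate * dt;  // integrated gyro estimate
        return alpha * gyroAngle + (1.0 - alpha) * accelAngle;
    }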

2.3 Kinect

The Kinect for the Xbox 360 console from Microsoft6, shown in figure 2.2, is a webcam-style add-on device that allows controller-free gaming interaction. The Kinect was released in Europe on November 10, 20107, so it is still quite a novel device. Although the Kinect, at such a low price, is still the only one of its kind, Microsoft's competitors are beginning to emerge, as can be seen from the unreleased but rumored Asus Wavi Xtion8, which copies much of the Kinect's functionality.

6 http://www.xbox.com/en-US/Kinect/
7 http://en.wikipedia.org/wiki/Kinect

Figure 2.2: Microsoft Kinect sensor.

The technology used by the Kinect to produce its results is quite old, but the algorithms it uses for skeleton detection and user tracking are the ones being presented as innovations in the gaming world. The systems used with this sensor allow games to be controlled with the user's body, without any type of controller in the user's hands; as Microsoft puts it, "you are the controller"9. The skeleton detected by the Kinect is the kinematic structure of a user, the detection of head, arms, legs, and torso; once this is detected, the movements of the body can easily be mapped onto a virtual avatar in a game, and the pose of the user is copied onto the avatar.

Due to its low price, and since a few free Application Programming Interfaces (APIs) have been released, there has been an exponential rise in the number of projects using the Kinect; from NASA10 to researchers to hobbyists, all have contributed to a very steep and interesting evolution in the natural interfaces community using this device. These projects often show original and unexpected uses of the Kinect.

2.3.1 Specifications

The Kinect's specifications have not been released by Microsoft so far. Although there is little information about the full specifications, dismantling the device made it possible to identify some of its components11.

The sensors comprise two cameras and one accelerometer. There is one Red Green Blue (RGB) camera and one IR camera, working at 30 Frames per second (FPS), using 640 by 480 pixels with 32-bit color and 320 by 240 pixels with 16 bits per pixel, respectively. The IR camera uses a system designed by PrimeSense to get the depth image; that image is obtained through an IR dot pattern projected by the IR light projector12 to the left of the cameras. This pattern is best detected from 1.2 meters to 3.5 meters. The KXSD9-series accelerometer is located at the bottom of the Kinect and is used for image stabilization.

8 http://tinyurl.com/4l5lu5a
9 http://www.microsoft.com/presspass/features/2010/oct10/10-21kinectads.mspx
10 http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2011/01/10/BUO01H4ISI.DTL
11 http://www.ifixit.com/Teardown/Microsoft-Kinect-Teardown/4066/
12 http://www.youtube.com/watch?v=nvvQJxgykcU


There is also a microphone array that is able to capture 16-bit sound at 16kHz; this is mostly used by the Xbox 360 for audio conference calling and for position detection through sound.

2.3.2 Application Programming Interface (API)

Since the Kinect's launch, several open source and non open source drivers and APIs have been made available; the most popular ones are libfreenect13 and OpenNI/NITE14.

libfreenect is an open source effort to build a full API for the Kinect, represented by the OpenKinect15 community. The OpenKinect community is interested in making software for the Kinect using only open source technologies. libfreenect is presented as an unfinished work that still lacks many functionalities; however, it is supported by a very active community and has been growing every day.

OpenNI is an open source API that provides support for specific drivers for specific hardware; this API is used with many different devices, among them the PrimeSense camera used by the Kinect. PrimeSense has released freeware drivers and a middleware framework, called NITE16, to be used with the OpenNI API. With the OpenNI and NITE APIs it is possible to use algorithms for hand or body tracking, as well as skeleton tracking, retrieving the positions and rotations of each body joint.
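As a minimal sketch of what this looks like in code, the fragment below polls the position of one joint through the OpenNI/NITE 1.x C++ API; it omits the calibration-pose callbacks that NITE versions of the time required, and reduces error handling to return codes.

    #include <XnCppWrapper.h>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) return 1;

        xn::UserGenerator users;
        if (users.Create(context) != XN_STATUS_OK) return 1;
        users.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_ALL);

        context.StartGeneratingAll();
        for (;;) {
            context.WaitAndUpdateAll();
            XnUserID ids[4];
            XnUInt16 n = 4;
            users.GetUsers(ids, n);
            if (n > 0 && users.GetSkeletonCap().IsTracking(ids[0])) {
                XnSkeletonJointPosition hand;
                users.GetSkeletonCap().GetSkeletonJointPosition(
                    ids[0], XN_SKEL_RIGHT_HAND, hand);
                // hand.position holds X, Y, Z in millimeters;
                // hand.fConfidence indicates tracking quality.
            }
        }
    }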

Each of these APIs only allows control of the RGB camera, the IR camera, and the motor in the camera support; neither the audio nor the accelerometer is accessible. Microsoft is expected to release its own Software Development Kit (SDK) in the Spring of 201117. With it, Microsoft might release the algorithm for merging the data from the RGB and IR cameras, which makes the skeleton tracking algorithm much more robust and also avoids the calibration task that OpenNI and NITE need for the same purpose.

13 https://github.com/OpenKinect/libfreenect
14 http://www.openni.org/
15 http://openkinect.org
16 http://www.primesense.com/?p=515
17 http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/

2.3.3 Kinect interaction

A depth camera retrieves a high-precision depth image, where each pixel also indicates its depth. To achieve this, a depth camera can use several different techniques: stereo imaging, Time-of-Flight (ToF), or structured light. Stereo cameras are becoming very common for day-to-day use; some are already sold as part of cell phones. A stereo camera captures two images taken from cameras separated by some distance. With the two captured images it is possible to calculate a depth image by triangulation, using the two cameras and the corresponding pixels of the two images.
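Both ranging principles reduce to one-line relations (the symbols are introduced here for illustration and do not come from the original text): for a rectified stereo pair with focal length f (in pixels), baseline B, and disparity d between corresponding pixels, the depth of a point is

    Z = (f * B) / d

while a ToF sensor measuring a round-trip time Δt for light traveling at speed c recovers

    Z = (c * Δt) / 2.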

ToF cameras use light pulses: the light is switched on for a very short time period, which allows the ToF camera sensor to calculate how much time the light pulse took to be sent and reflected back, and thus the distance of a given pixel. Because ToF cameras are still quite expensive and stereo cameras are very noisy, Microsoft chose to use a structured light pattern. This light pattern is projected into the user space by the IR light source in the Kinect. The deformations in that light pattern are interpreted by the Kinect sensor to detect the distance of each pixel. Recently Microsoft acquired a company that produces ToF cameras, and it is rumored that the next version of the Kinect might use ToF instead of the structured light technique18. This preference is due to the quality of the calculated depth image, which is higher in a ToF camera and also more robust to different environments.

18 http://www.nytimes.com/2010/10/30/technology/30chip.html

Although this contrast in quality exists between a ToF camera and the Kinect, it is possible to use the Kinect as a ToF replacement in problems that used a ToF camera as the depth map retrieval device. This way, works such as [6], where a robot is controlled through pointing, can be reproduced on a lower budget.

There are not many published works with the Kinect yet, but it is not possible to ignore all the interaction projects that have appeared on the web. The KinectHacks19 website was created with the purpose of following interesting Kinect projects. Among many interesting projects, some have demonstrated this device's qualities and capabilities. The two Kinects project20 uses two Kinects, face to face, to get a full 3D object description. The Kinect Hand Detection project allows the detection of a user's hands and fingers21 to manipulate virtual objects. A modified mobile version of the Kinect has been mounted on top of a quadrotor, an aircraft that is lifted and propelled by four rotors, as seen in figure 2.3b, to sense obstacles it might find in the way22. Some projects have also merged the Kinect with different algorithms in order to get improved performance; one example is 6D SLAM with the RGB and depth data from the Kinect23, making it possible to quickly acquire colored 3D models of objects such as the one seen in figure 2.3a. The present work also got involved with the KinectHacks community, being featured in a small demonstration of the Kinect interface working with the iCub that had more than 2200 hits24.

Figure 2.3: Kinect project examples. (a) 6D SLAM with RGB-D data from the Kinect. (b) Quadrotor helicopter with the Kinect mounted on top.

19 http://kinecthacks.net/
20 http://www.youtube.com/watch?v=5-w7UXCAUJE
21 http://www.youtube.com/watch?v=tlLschoMhuE
22 http://www.youtube.com/watch?v=Sw4RvwhQ73E
23 http://www.youtube.com/watch?v=XejNctt2Fcs&feature=related
24 http://kinecthacks.net/irobot-icub-humanoid-robot-kinect/

2.4 iCub

The interfaces developed throughout this work were designed with the iCub robot in mind. The iCub is a humanoid child robot, spread across several laboratories around the world and associated with works in fields that range from Electronics and Computer Science to Neuropsychology.

The hardware and software developed within the iCub project is all open source, which makes all iCub development available to anyone interested. This software can all be downloaded from the iCub website25 [13].

The two following sections give a broad description of the iCub and its programming framework; this introduction was mainly based on two works, [13] and [17].

2.4.1 iCub Robot

The iCub is a 104cm tall child-like robot, able to crawl, sit, and do sophisticated manipulation with its hands; the physical aspect of the robot can be seen in figure 2.4. The iCub has 53 Degrees of Freedom (DOF): 39 DOF are distributed through the upper part of the robot's body, and 14 DOF are in the lower part. The hands alone, which are used for manipulation, have 9 DOF, with three independent fingers plus two others connected to a single motor. The legs, although not programmed to do so, have enough strength to support the robot in bipedal mode. There is also a set of sensors distributed along the body: cameras, gyroscopes, accelerometers, microphones, and force/torque sensors. A sensitive skin has been developed, but it is not yet available at the Instituto de Sistemas e Robótica at the moment this dissertation is being written. With the skin it is possible to detect pressure on the robot's outer surface, making it possible to grasp objects more precisely and to control the robot by pushing its parts.

All the low-level control is made through a set of Digital Signal Processor (DSP) based cards, connected via a Controller Area Network (CAN) bus, specifically designed for the iCub robot. The sensor and motor-state data is passed on to the PC104 card located in the head of the robot, where all that data is reformatted and synchronized into data streams on a Gbit Ethernet connection. Typically the heavier computation is done on independent machines external to the iCub, connected through the iCub's Ethernet connection.

Figure 2.4: iCub robot.

The iCub robot was developed with the goals of creating an open hardware/software humanoid robotic platform for research in embodied cognition, and of advancing our understanding of natural and artificial cognitive systems by exploiting this platform in the study of the development of cognitive capabilities [13].

25 http://www.robotcub.org/

The software and libraries available in the iCub open-source repository allow the development of software for iCub control. The control of the robot is made through the Yarp server, and an inverse kinematics controller is already implemented [17] that allows control of the full iCub arm and torso by specifying where an end-effector should reach; typically the end-effector is the robot's hand, but any end-effector can be defined.
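A minimal sketch of commanding the arm through this Cartesian interface is shown below; the port names follow the usual iCub conventions, and the target point is an arbitrary example expressed in the robot's root frame.

    #include <yarp/os/Network.h>
    #include <yarp/os/Property.h>
    #include <yarp/dev/PolyDriver.h>
    #include <yarp/dev/CartesianControl.h>
    #include <yarp/sig/Vector.h>

    using namespace yarp::os;
    using namespace yarp::dev;
    using namespace yarp::sig;

    int main() {
        Network yarp;                                    // join the Yarp network
        Property opt;
        opt.put("device", "cartesiancontrollerclient");  // client side of the controller
        opt.put("remote", "/icub/cartesianController/right_arm");
        opt.put("local", "/example/cartesian/right_arm");
        PolyDriver driver(opt);
        ICartesianControl* cart = nullptr;
        if (!driver.isValid() || !driver.view(cart)) return 1;

        Vector xd(3);              // desired hand position, meters, root frame
        xd[0] = -0.3;              // x points backward, so -0.3 is in front
        xd[1] = -0.1;
        xd[2] =  0.1;
        cart->goToPosition(xd);    // reach the point, orientation unconstrained
        cart->waitMotionDone();
        return 0;
    }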

2.4.2 Yet Another Robot Platform (Yarp)

The computation done on independent machines, and on the PC104 inside the iCub robot, uses a framework called Yarp. This framework abstracts two common difficulties in robotics: modularity in algorithms and interfacing with the hardware. Yarp is not iCub specific, which obliges the iCub to have some specific software written to take care of the abstraction between the iCub and the framework, and to prepare the robot to receive Yarp-like instructions.

The main abstractions of Yarp are the ports. Ports implement the observer pattern, which allows a message sent through one port to reach multiple ports distributed across different computers; this way the sender and the receivers can work independently. The observer pattern is a broadly used software design pattern where an object, the subject, maintains a list of its dependents, the observers, and notifies them whenever the subject considers it necessary. In ports, the subject is a writer port, and the observers are reader ports connected to the subject port.
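To make the subject/observer roles concrete, here is a minimal writer built on Yarp's BufferedPort (the port names are arbitrary examples):

    #include <yarp/os/Network.h>
    #include <yarp/os/BufferedPort.h>
    #include <yarp/os/Bottle.h>

    using namespace yarp::os;

    int main() {
        Network yarp;                 // joins the Yarp network (name server)
        BufferedPort<Bottle> out;
        out.open("/example/out");     // this port plays the "subject" role
        // Observers attach from anywhere on the network, e.g.:
        //   yarp connect /example/out /some/reader
        Bottle& msg = out.prepare();
        msg.clear();
        msg.addString("hello");
        msg.addDouble(42.0);
        out.write();                  // every connected reader port is notified
        return 0;
    }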

Another abstraction implemented by Yarp is the device, which allows the programmer/user to specify which device to interact with; the framework then provides an object that hides the low-level programming details. This port/device pair allows the development of a remote device driver that can be used across a network, making the heavier processing independent from the robot hardware.

The Yarp framework divides the default devices into many abstractions that can be used, such as IPositionControl, IVelocityControl, or IKin. These abstractions are used to control the robot's mechanics and sensors without having to worry about the low-level instructions. Yarp does this by providing an object with simple control functions such as positionMove(), velocityMove(), or setRefSpeeds(). Some of these device objects need external modules to be running to support all of the available actions. Typically, what these device objects do is no more than create ports to send and receive messages to the control boards of the robot or to external modules that do other work.
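To make the port/device pairing concrete, the sketch below opens a remote control board for one robot part and commands a single joint; the port names and joint values are illustrative, not prescriptions.

    #include <yarp/os/Network.h>
    #include <yarp/os/Property.h>
    #include <yarp/dev/PolyDriver.h>
    #include <yarp/dev/ControlBoardInterfaces.h>

    using namespace yarp::os;
    using namespace yarp::dev;

    int main() {
        Network yarp;
        Property opt;
        opt.put("device", "remote_controlboard");  // remote device driver
        opt.put("remote", "/icub/right_arm");      // ports of the robot part
        opt.put("local", "/example/right_arm");    // local ports of this client
        PolyDriver driver(opt);
        IPositionControl* pos = nullptr;
        if (!driver.isValid() || !driver.view(pos)) return 1;

        pos->setRefSpeed(0, 10.0);    // joint 0 (shoulder pitch), 10 deg/s
        pos->positionMove(0, -45.0);  // move joint 0 to -45 degrees
        return 0;
    }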

2.5 Concluding remarks

In this chapter the several elements of the work to be developed were presented: HRI, the Wiimote and Kinect sensor interaction devices, the iCub robot, and the Yarp framework.

Due to the lack of autonomy that still exists in robots today, HRI remains very focused on controlling robots in several ways, such as teleoperation, while trying to achieve a natural and intuitive interaction.

The gaming industry has been developing several user interface devices that are meant for game control but offer high-technology solutions that can be adapted to the challenges of robot interfaces, with robustness and low price as advantages. The Wiimote and the Kinect are two of the latest such devices to have been made available, and they are being adapted to a wide range of interaction projects, from art to robotics, with very interesting results.

In this work, the iCub robot will be used for the development of interfaces that save developers time and allow easy interaction with first-time users, which is a good application for cognitive studies and interactive learning. Some of the most sophisticated robots available are used as helpers that need constant control for very specific tasks; the control form should be simplified and use metaphors that users understand, and these new interfaces allow new metaphors to be created.

To interface with the systems and devices, Yarp will be used, which allows for an easy separation of the logic blocks of our application.

Not many projects combining robotics with the Kinect or the Wiimote/WM+ have been released yet, but this combination is certain to appear in many other works in the near future, adapted to many different robotic tasks.


Chapter 3

Proposed Interface Systems

In this chapter the concept for the interface design is described, as well as the design itself and the options taken in designing the interaction systems.

3.1 Concept

To achieve our goal of comparing several interaction devices we used several interface systems: a typical GUI interface on a computer, a typical D-Pad control interface, a motor control interface with the Wiimote, a kinematic control interface also with the Wiimote, a kinematic control interface with the Kinect sensor, and a motor control interface with the Kinect sensor. The interaction methods developed were created to control the iCub robot's movement, particularly for arm tasks such as touching objects or following paths. Achieving precision, natural movement characteristics, and intuitive interaction for the iCub movements was an important goal. Although the intended precision was difficult to achieve, due to the iCub's limitations, the interaction methods and experiments provided illustrate the advantages and disadvantages of each of the interfaces and their use.

The devices used for this study were a computer GUI, a Wii Remote (Wiimote), and a Kinect sensor. The computer GUI had already been developed and is broadly used and accepted by the iCub development community as the default interface for iCub motor control. The Wiimote and Kinect interfaces are novel contributions of this work.

3.2 iCub

The iCub robot is a humanoid robot with 53 DOF. To control a motor it is first necessary to specify the part of the robot (head, torso, legs, or arms) where the motor is located; this part belongs to a three-level tree whose levels correspond to robot, part, and motor. Within each part the motors are identified by sequential numbers, starting from 0 up to the number of DOF of the part accessed.


The software used to control the iCub can be downloaded from the iCub Apache Subversion (SVN) repository (https://robotcub.svn.sourceforge.net/svnroot/robotcub/trunk/iCub); although a binary version is available, it is preferable to compile the available source with the latest updates.

The iCub software was developed as open source software, so it is the result of a series of contributions to the iCub project made by different individuals from different places. This makes the iCub repository very broad in its goals: just as there are applications to control the robot motors, there are also image processing applications and robot learning applications. Many of these applications are divided into modules that can be reused with modules other than the original ones.

An example of a module is the iKin Solver, which uses the iKin Controller and is able to receive a 3D Cartesian point (where the root point is between the legs of the robot) and use the inverse kinematics of the iCub robot to make the robot's end-effector move to the specified point. The end-effector in the arm case is typically the hand of the robot, but another end-effector can be specified, such as the shoulder, the elbow, or any of the robot's joints. Any program developed can use these modules, which were implemented exactly to be used as support utilities for other programs.

The GUI used can be found in the iCub repository, but the Wiimote and Kinect interaction modules were designed following the approach used by modules already available, and they are ready to be used for purposes other than solely the control of the iCub arm. Both modules also take advantage of some software available in the iCub repository, such as the iKin driver and the motor control devices.

3.3 Graphical User Interface (GUI)

The GUI used to “control” the iCub motors is part of the iCub software repository, so no development was needed.

Figure 3.1: Computer motor interface - named robotMotorGui.

A screenshot of the GUI used is shown in figure 3.1. The GUI divides each of the iCub parts


into tabs, plus an “all” tab that joins all the controls presented in each tab into a single one. Each tab is subdivided into a set of control panels equal in number to the motors in that part; in the case of the kinematic control tabs there are three panels to control the Cartesian position of the end-effector and three panels to control its orientation. Both kinds of tab have one more panel with buttons that perform predefined actions, such as moving the motors to predefined angles or through a user-defined sequence of angles.

The control panels can either control the motors of the iCub independently or control a specific iCub end-effector, such as a hand. Each panel contains a set of buttons and sliders that move the motor to a predefined angle or run a sequence of user-defined angles; the same happens with the kinematic control, where each panel controls a Cartesian value or an Euler angle for the position or orientation of the end-effector. Each panel also has a couple of numeric labels that output the motor position and current velocity, or the end-effector Cartesian position and orientation angle. By moving the angle or velocity slider the user can move the motor and alter its status; the same happens for the kinematic control. When controlling the kinematics of the robot through a panel, each time a slider is altered, making the robot move to a new pose, the user must wait for the robot to reach its target, or for the specified target time to elapse, before the slider can be changed again. When a target position is not reached, a warning message is displayed.

The GUI is a complex interface made out of simple parts. It becomes complex mainly because of the number of controls available to the user, and also because, while controlling the motors, repositioning one motor will affect the position of other motors, forcing the task of repositioning the robot into a specific pose to be done in a previously planned sequence. To be agile in this reasoning about motor sequences it is important to have a clear idea of how each motor moves and how changing the position of one motor will affect the others; this perception is only reached with some practice. In the kinematic control this reasoning is simplified, because only a target point needs to be set for the whole robot to move towards it.

This system allows for a simple control of the motors and of the kinematic chain, subdividing the problem of motor control into simple independent problems for each motor, and abstracting the kinematic control to a 3D Cartesian position. Although this is a complex interface, it is composed of simple parts, which makes it a comprehensible system.

3.4 Wiimote

The Wiimote interaction software for the iCub was developed specifically for this study.

The Wiimote is a remote control used for interaction with the Wii gaming console that, equipped with the Wii Motion Plus (WM+) extension, is able to detect angular motion around three axes, pitch, yaw, and roll, as shown in figure 3.2. The WM+ extension adds a gyroscope to the already available sensor list. The sensors in the Wiimote have an axis frame defined as follows: the Y axis runs along the Wiimote; the Z axis is perpendicular to the buttons surface on the top of the Wiimote; the X axis is perpendicular to the Wiimote's side.

Figure 3.2: Wii remote axes.

The main difference from other work done on humanoids with the Wiimote was the use of the WM+ extension gyroscope instead of the Wiimote accelerometer; this allows detecting the angular movement made with the Wiimote instead of the Wiimote's position relative to gravity. Because it was preferred to track the data from the gyroscope instead of the accelerometer, only changes in angular movement made with the remote are detected.

The buttons on the Wiimote were also used. Buttons 1 and 2 were used, in the motor control, to switch the part being controlled between arm and forearm and, in the kinematic control, to switch between controlling the position and controlling the orientation. The trigger-like B button was used to start or stop the control of the robot: while it is pressed the robot is controlled; when it is released the robot stops and all previous commands waiting to be executed are cleared.

Using the IR camera was avoided during this work because it obliges users to always be aware that they should be pointing the device at the sensor bar (the IR source), which would not be useful, particularly for mobile robots.

Although the Wiimote control methods presented below can be used to control any part of the robot, they were developed focusing on the robot's right arm, which is the part focused on during the evaluation.

3.4.1 Motor control

The Wiimote motor control uses the Wiimote to interface with each motor independently. A rotation around each of the Wiimote axes controls a different motor. The robot is controlled while the user is pressing the B button, and the motors only move while the user is moving the Wiimote with some angular speed.


The arm control is divided into two parts: the arm (motors 0, 1, and 2) and the forearm/hand (motors 3, 4, and 6). The parts' angular movements are shown in figure 3.4 in different colors, and the motors used are shown in figure 3.3. The user can toggle between parts using buttons 1 and 2 of the remote. The arm part has three motors that control its three DOF; the Wiimote axes are mapped the following way: the Y axis rotation rotates the arm around itself (motor 2, axis 3); the X axis rotation rotates the arm around an axis perpendicular to the side of the robot's torso (motor 0, axis 1); and the Z rotation depends on the motor position, which in turn depends on the X axis rotation (motor 1, axis 2). The forearm part is mapped as follows: the Y axis rotation rotates the forearm around itself (motor 4, axis 5); the X axis rotation rotates the forearm around an axis perpendicular to the robot's arm (motor 3, axis 4); and the Z rotation rotates the hand left and right in a wave movement (motor 6).

Figure 3.3: Wiimote controlled motors in the iCub arm.

Controlling each motor independently obliges the user, as the GUI control does, to be aware of the sequence that must be followed to reach a specific position. The advantage is that this sequence is made of only two independent parts, which make up the arm. Each of those parts has a maximum of three motors, and the motors are controlled simultaneously by the Wiimote angular movements.

There is no need for the user to understand how the motors are actually positioned, because the movements made with the Wiimote are mapped to the robot's motors. The user does need to be aware, however, that this mapping does not mean that a movement made with the Wiimote will be the movement made by the robot, but rather that a movement around an axis means a relative movement of a specific motor of the robot.
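A minimal sketch of this relative mapping, assuming calibrated WM+ angular rates in degrees per second and an IVelocityControl interface already opened for the arm part; the gain, dead-zone, and helper name are illustrative, not the exact parameters used in this work.

    #include <cmath>
    #include <yarp/dev/ControlBoardInterfaces.h>

    // Hypothetical helper: map WM+ angular rates (deg/s) around the Wiimote's
    // X, Y, and Z axes to the arm-part motors, called while button B is held.
    void mapGyroToArm(yarp::dev::IVelocityControl* vel,
                      double rateX, double rateY, double rateZ) {
        const double gain = 0.5;      // illustrative scaling from remote to motor speed
        const double deadZone = 2.0;  // illustrative noise threshold, in deg/s

        const int    motor[3] = {0, 2, 1};  // X -> motor 0, Y -> motor 2, Z -> motor 1
        const double rate[3]  = {rateX, rateY, rateZ};

        for (int i = 0; i < 3; ++i) {
            double v = (std::fabs(rate[i]) > deadZone) ? gain * rate[i] : 0.0;
            vel->velocityMove(motor[i], v);  // motors move only while the remote moves
        }
    }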

Figure 3.4: Wiimote motor control of the iCub.

This system allows a dynamic control of the robot where the motor angle values are abstracted away from the user, although the motor rotations and the positions between them are not. This facilitates the control of a single joint of the robot composed of multiple motors, making it easier to reach a position through a trial-and-error method.

3.4.2 Kinematic control

The Wiimote kinematic control can be used with three different control methods: the Wiimote movements can be captured and mapped to the robot, the robot end-effector can be positioned using the arrow buttons on the Wiimote, or the orientation can be changed through the Wiimote movements. The control using the arrow buttons could easily be ported to a computer keyboard or any other device with six buttons, so it is considered a typical device.

Kinematic control means that the control is done in relation to an inverse kinematic chain: the user only needs to specify where in Cartesian space the end-effector should go, and the robot tries to reach that point. In the case of the Wiimote kinematic control method, this means that the angular movements rotate the end-effector around the shoulder of the iCub, having as radius the yellow line and as end-effector the brown ball in figure 3.5; in the cursor control method, pressing a button moves the end-effector in a straight line; and the orientation control method does not alter the position of the end-effector, only its orientation.

In the cursor control method, the end-effector is moved in space with the cursor and the + and - buttons. The front and back buttons move the end-effector to the front or the back of the robot, the left and right buttons move it to the left or the right, and the + and - buttons move it up or down.

Figure 3.5: Wiimote kinematic control of the iCub.

In the Wiimote kinematic control method, the end-effector is moved in space by the movements made with the remote and the + and - buttons; the user only controls the robot while the B button is pressed. The movements of the Wiimote are directly mapped to the end-effector position. However, because only angular movements can be detected, the end-effector always maintains the same distance from the robot's shoulder; this distance can be changed with the + button, which increases it, and the - button, which decreases it. In other words, the angular movements made with the Wiimote are mapped in a spherical way to the end-effector: the radius of this sphere can be changed with the + and - buttons, and its center remains at the robot's shoulder. In figure 3.6 these interaction movements are illustrated by arrows.

Figure 3.6: Wiimote cursor control of the iCub.
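A sketch of this spherical mapping, with hypothetical names: the accumulated Wiimote rotations give two angles around the shoulder, the + and - buttons change the radius, and the result is the Cartesian target handed to the kinematic chain.

    #include <cmath>

    // Hypothetical state kept by the controller while button B is held.
    struct SphericalTarget {
        double radius;    // distance from the shoulder, changed by + / -
        double azimuth;   // accumulated horizontal Wiimote rotation (rad)
        double elevation; // accumulated vertical Wiimote rotation (rad)
    };

    // Convert the spherical target into a Cartesian end-effector point,
    // relative to the shoulder position (sx, sy, sz).
    void toCartesian(const SphericalTarget& t, double sx, double sy, double sz,
                     double& x, double& y, double& z) {
        x = sx + t.radius * std::cos(t.elevation) * std::cos(t.azimuth);
        y = sy + t.radius * std::cos(t.elevation) * std::sin(t.azimuth);
        z = sz + t.radius * std::sin(t.elevation);
    }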


In the Wiimote kinematic orientation control method, the end-effector can be oriented by making angular movements with the remote; as with the Wiimote kinematic control method, control only happens while the B button is pressed. This method affects the end-effector orientation; to achieve some orientations the arm must change its configuration, but the Cartesian position of the end-effector is always maintained. The orientation change corresponds to the movement of the Wiimote; when an orientation cannot be reached, the robot ignores the orientation change request.

This interaction system abstracts the user from any understanding of the motors, their positions or angles. The mapping is done directly between the user's movements, or direction input, and the robot's end-effector, and this is what defines how the motors should be configured to achieve the desired end-effector position. Because the end-effector is a virtual point, there is no way for a user, after having made a movement, to have a precise idea of where the end-effector ended up; the control is therefore based on the direction in which the user wants the robot to move, and not on the position the user wants the robot to reach, obliging the user to rely on intuition.

3.5 Kinect

The Kinect interaction software for the iCub was developed specifically for this study.

The Kinect is a novel remote control device for the Xbox 360 gaming console. It is composed of an RGB camera, an IR camera, an IR light source, a microphone, an accelerometer, and a small motor in its base. In this work, only the IR camera and the IR light source were used.

The Kinect interaction device allows players to interact with games by moving their body in front of the camera. Typically the movements of the user's body are mapped onto an avatar character in a game; this work did not stray far from that, except that the mapping was done between the user and the iCub, which presents very different challenges from a computer avatar.

A user can be detected by the Kinect device by making a clear gesture: just by standing in front of the IR camera, the user's pixels in the camera image are detected. By waving the hand four times the user's hand can be detected, and if the user stands in the Ψ position (figure 3.7b), with the arms parallel to the floor and the forearms up, forming a 90 degree angle, the skeleton can be detected. These detections are not immediate: they take some seconds and require that the user, or the hand, be fully viewed by the camera, neither too close nor too far away.

When a user skeleton is detected, the Kinect outputs the user's joints as a set of rotation matrices and 3D Cartesian points. The skeleton is made of 24 points and 10 rotation matrices; each of these elements has a confidence value between 0 and 1 that indicates how accurate the element might be. If the camera is moved from its original position, the skeleton needs to be recalibrated, which means doing the detection process again.

When the hand is detected, the Kinect outputs the Cartesian point considered to be the center of mass of the hand. This point is continuously followed even if the camera is moved from its original position.

Figure 3.7: Kinect detection made by sample programs. (a) Kinect hand detection. (b) Kinect skeleton detection (Ψ calibration pose).

The rotation matrices and the Cartesian points output have their axes and origin defined by the Kinect device position. The Cartesian points can be output in centimeters, which is useful for real-world coordinate systems, or in pixel values, which is useful for graphical interfaces.

To start and stop the Kinect control, it was decided to use the Wiimote trigger-like B button. This allows starting and stopping the control with a subtle movement that goes undetected by the Kinect, so it has no effect on the way the robot moves other than enabling or disabling its control.

Although the Kinect control methods presented below can be used to control any part of the robot, they were developed focusing on the control of the iCub's right arm.

3.5.1 Hand kinematic control

The hand kinematic control method for the iCub uses the Kinect hand detection to control the robot's kinematic chain; the end-effector of this chain follows the user's hand movement in a mirrored way. Kinematic control means that the control is done in relation to an inverse kinematic chain: the user only needs to specify where in Cartesian space the end-effector should go, and the robot tries to reach that point.

To understand this control, we can consider the brown ball in figure 3.5 as the end-effector.

The user's hand is detected by waving the hand, in a smooth, clear movement, in front of the IR camera. After a few waves, a message on the monitor notifies the user that the hand has been detected.

The control starts when the user presses the Wiimote B button and stops when the button is released. When the control starts, the detected hand point is considered to be at its origin, and the same happens to the end-effector, so the robot end-effector does not move.


Moving the hand affects the end-effector, making the robot end-effector move. The difference between the origin and the current hand point is mapped onto the end-effector, making the robot hand follow the detected hand point. The movements of the robot mirror the movement of the hand, which means that pushing the hand forward will make the robot end-effector go back, and dragging the hand left will make the robot end-effector move right.

The control stops as soon as the user releases the Wiimote button; when the control stops, it is not considered whether the end-effector has reached its requested position, and the internal state of the origin point is cleared.

Because the hand is constantly shaking, the API has a smoothness value that can be set from 0 to 1. In this case the value was set to 0.8, a high value: although it loses precision, it does not consider very small changes in the hand position, which might bring unwanted jitter to the robot.
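Such a smoothing factor behaves like an exponential moving average; the sketch below shows the idea, not NITE's exact implementation.

    // With s = 0.8 the filtered point keeps 80% of its previous value,
    // so small jitters in the raw hand position barely move it.
    struct Point3 { double x, y, z; };

    Point3 smooth(const Point3& previous, const Point3& raw, double s /* 0..1 */) {
        Point3 p;
        p.x = s * previous.x + (1.0 - s) * raw.x;
        p.y = s * previous.y + (1.0 - s) * raw.y;
        p.z = s * previous.z + (1.0 - s) * raw.z;
        return p;
    }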

Through this system a user can control the robot end-effector by moving their own hand in front of the Kinect sensor. As with the kinematic system presented before, this system abstracts the user from any understanding of the motors, their positions or angles. All the motor control is done simply by moving and changing the hand position. This type of control is very simple and quite intuitive, although it does not give any feedback to the user other than the movement being executed by the robot. Also, what a user considers a small distance might not be so small to the robot.

3.5.2 Skeleton motor control

The Kinect skeleton motor control method uses the Kinect API to detect a user skeleton. The software developed maps the skeleton joints to the iCub motor joints, reproducing the user's pose.

When the application starts, the iCub arms go to a “safe” position, parallel to the floor; this way, when the user starts controlling the robot, if a dangerous position is assumed there is time to stop the robot without self-collision.

The user skeleton detection is done by standing in front of the IR camera in the Ψ position, which can be seen in figure 3.7b. Throughout the detection and calibration process there are three important messages, “User detected”, “Psi position detected”, and “User calibration: Success/Failure”, the last of which indicates the success or failure of the calibration. If the last message indicates a failure, it is recommended that the user verify that the position is correct and that the whole body is viewed by the IR camera.

The skeleton detected by the Kinect can be mapped in two ways: either straightforwardly, where the right arm of the user moves the right arm of the robot, or in a mirrored way, where the left arm of the user moves the right arm of the robot. Here our option was to make it straightforward, because the user would stand to the side of the robot and not in front of it. This way it is easier for the user to feel that the robot is mimicking them, since the user can see that any pose they take will be assumed by the robot.

The control starts when the user presses the Wiimote B button, and stops when the button is released.


While the user is controlling the iCub, the joint rotation matrices of the detected skeleton are mapped to the robot motors. Each rotation matrix can be mapped to a maximum of three motors as Euler angles. Because of the motor configuration, the mapping between the Euler angles and the motor angles is direct: the number of useful joints in the detected skeleton is equal to the number of joints in the iCub, and each joint in the skeleton is mapped to a joint in the iCub.

The speed at which each motor rotates is defined by the difference between the current angle and the target angle. If a very small movement is detected, a very small speed is assumed; this avoids the wobbling effect of the robot that occurred when the speed was constant, caused by the small differences in the joint angles.

Using this control method the user can define poses for the iCub to mimic, by standing in front of the Kinect IR camera in the desired pose. As with the kinematic system presented before, this system abstracts the user from any understanding of the motors, their positions or angles. However, this is not a kinematic control but a straightforward motor control, where the angles of several motors are defined by the user's pose. This type of control is very simple and quite intuitive; the main issue is that the user's body and the robot's body have different characteristics, which obliges the user to be aware that, although the user's body is controlling the robot, a collision that is not occurring with the user's body might be occurring with the robot's body, be it with objects external to the robot or by self-collision.

3.6 Concluding remarks

In this chapter the developed control methods were described: the GUI control method, the Wiimote motor control method, the Wiimote kinematic control method, the cursor control method, the Kinect kinematic control method, and the Kinect skeleton motor control method.

The GUI control system is the most typical interface used to control the iCub; it allows a simple independent motor control, or a kinematic control by specifying the end-effector Cartesian position.

The systems developed for the Wiimote and the Kinect were both divided into two software applications: one takes advantage of the kinematic chain available in the iCub framework, and the other accesses and controls the motors individually.

Controlling the motors individually, in most cases, obliges the user to be well aware of how changing one motor angle might affect the other motors, except in the Kinect motor control system, where a position is achieved through the pose of the user's body detected by the Kinect IR camera.

Controlling the kinematic chain simplifies some of the motor tasks, because the motor angle values are calculated from an end-effector position value. Moving the end-effector target will affect all the motors needed to reach the target.

Both systems lack what might be a fundamental characteristic for better control: feedback other than the reaction of the robot motors. This choice was made because the goal was to compare these two devices with the typical GUI, not to build a good graphical interface for every type of interaction developed.

Even with no interaction feedback, after a few of the control methods had been developed, there were already requests from fellow colleagues to use these interfaces as utilities during their own work, mostly to position the robot in a more suitable way, as they considered it easier than the traditional GUI.


Chapter 4

Applications

This chapter describes implementation details of the control methods described in the previous chapter, chapter 3.

4.1 Application architecture

For the interaction with the iCub through the Kinect and the Wiimote, several modules were developed. All the software developed throughout this work was written in C++, using the Yarp libraries and the iCub libraries. Some of the modules created were used just for data visualization and comprehension. Next we describe the main modules created for the Wiimote and Kinect interaction with the iCub; the data visualization modules are not described because they were only used for development support.

Yarp makes it possible to run modules on different machines and networks, connecting them through the port abstraction. This allows a separation between the modules that interface with the devices (the Wiimote and the Kinect), the modules that convert the device data into interaction commands, and the control modules for the iCub.

Figure 4.1 is a representation of the network of modules created and used, and of how they relate to the different machines.

The Yarp device drivers are the device interfaces created. More specifically, they are two distinct modules (one for the Wiimote, another for the Kinect) that continuously retrieve data from the Kinect and the Wiimote and redirect that data, in a usable format, to the output ports specified in an initialization file. These modules also have input ports that are used for remote alteration of the device settings, such as activating actuators or resetting the internal state.

The interaction modules respond to events that occur on ports connected to the Yarp device driver ports, and output to the iCub ports using the iCub libraries. The events consist of data captured by the interaction device and sent to a port; this data is converted by the interaction modules into motor angle values or Cartesian point values, which are then interpreted by the iCub library functions.

Figure 4.1: Kinect/Wiimote Yarp drivers and iCub network diagram.

Chico3 is a laptop computer where the Yarp server normally runs, and from which the iCub modules are typically launched to be executed on remote machines, such as the iCubInterface on the PC104 computer mounted in the iCub, or the iKin Solver/Controller modules on the iCubBrain server.

Every module developed respects the Model-View-Controller architecture:

Model: present in all modules. In the device driver it is responsible for streaming the interaction device data; in the interaction control modules it is responsible for converting the interaction data into motor angles or useful Cartesian points.

View: present in all modules. In the device driver it takes the form of the interaction device used to interface with the user; in the interaction control modules it is present as the output sent to the iCub robot, the result of the interaction.

Controller: present in all modules. It is responsible for the port and data management in both the interaction control modules and the device drivers.

There was a concern throughout the work to make every class implemented reusable and understandable to whomever might want to run or reproduce it.

4.2 Main libraries used

Yarp and the iCub come with C++ libraries for developers to use. The libraries permit coding abstracted from the technical details of the devices and encoders, focusing instead on control and communication issues. Both libraries were fundamental for the development of the Wiimote and Kinect modules; they are what allows the iCub to be controlled and a simple distributed solution to be set up.


4.2.1 Yarp

To remotely send instructions and retrieve the status of encoders and devices, a Yarp server runs continuously; this server maintains a list of the available ports. When a connection is made to one of the available ports, Yarp connects directly to the network node that created the port, without the data passing through the node running the Yarp server. Typically each program creates its own ports, connections are made port to port, and each program is only responsible for maintaining its own ports; this prevents the Yarp server from becoming a bottleneck managing several connections. The ports created are Transmission Control Protocol (TCP) ports that abstract the data passed from the specific characteristics of the network node. Using Yarp ports it is possible to build a distributed solution with the devices connected through a Yarp network.

To communicate with a specific device it is good practice to create a specific Yarp device driver. To a user, these device drivers are always instantiated in the same way and return an interface object with a set of functions specific to that device. Usually each device needs some program to be running that gets or sets the device values directly. The program that uses the device is responsible for managing the Yarp ports; this management involves keeping the ports alive and using them to respond to calls made by other programs that might be using the device. Through the ports it is possible to exchange media, text, or numeric data.

The Yarp repository already provides many different drivers that can be used, but a programmer can also create a new driver from scratch, as was done for the Kinect and the Wiimote. The drivers created are accompanied by a program that controls the device directly. The drivers mostly abstract users from the ports through the functions available in the driver interface class; with drivers, the communication is still made through the Yarp ports. The suggested architecture for a program interfacing with a device is divided into two main classes: a Port Thread class, responsible for setting up the ports and constantly updating, and a Device Driver class, which interfaces with the device while calling, or being called by, the Port Thread class. The Port Thread class has two base classes, RFModule and TypedReaderCallback. RFModule is a helper class that already implements a thread and requires the virtual methods that update and configure the thread to be implemented. TypedReaderCallback is a callback class that is passed to a port, as an object, and calls a user-defined virtual method for each port event. A simple representation of the main Yarp device driver classes is shown in figure 4.2.

Figure 4.2: Simple representation of Yarp Device Driver main classes.
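A skeletal sketch of such a Port Thread class, assuming the standard Yarp base-class methods (RFModule::configure()/updateModule() and TypedReaderCallback::onRead(), which correspond to the configure/update/port-event methods of figure 4.2); the port name is hypothetical.

    #include <yarp/os/RFModule.h>
    #include <yarp/os/ResourceFinder.h>
    #include <yarp/os/TypedReaderCallback.h>
    #include <yarp/os/BufferedPort.h>
    #include <yarp/os/Bottle.h>

    class PortThread : public yarp::os::RFModule,
                       public yarp::os::TypedReaderCallback<yarp::os::Bottle> {
        yarp::os::BufferedPort<yarp::os::Bottle> inPort;

    public:
        // Called once: open the ports and register this object as the callback.
        bool configure(yarp::os::ResourceFinder& rf) override {
            if (!inPort.open("/demo/in")) return false;  // hypothetical port name
            inPort.useCallback(*this);       // port events are routed to onRead()
            return true;
        }

        // Called for every message arriving at the port (the "port event").
        void onRead(yarp::os::Bottle& data) override {
            // hand the data over to the Device Driver class here
        }

        // Called periodically by the thread already implemented in RFModule.
        bool updateModule() override { return true; }
        double getPeriod() override { return 0.1; }  // seconds between updates

        bool close() override { inPort.close(); return true; }
    };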

Yarp is described as being the pipes in a water system (http://eris.liralab.it/yarpdoc/what_is_yarp.html). Yarp itself can be used for many purposes other than robotics: it sets up a way for different programs to communicate with each other over a TCP network, which becomes very useful for processing-intensive tasks that need to be distributed across several computers. The libraries available with Yarp can abstract programmers even further from the communication ports, allowing functions to be called remotely as if they were part of the program's library, using the Yarp ports underneath. The Yarp server, although needed so that ports can connect to each other, is only used as a port listing service; the connected ports communicate directly through TCP and, if the communication happens within the same computer, it is made by memory access, without TCP. By splitting the device driver from the robot controller, the modules developed take advantage of the Yarp framework. The robot controller drivers follow some of the ideas implemented in the Yarp device driver, such as the use of a Port Thread class, although they have many iCub-specific characteristics that are discussed in the next section.

4.2.2 iCub

In the iCub repository one can find not only code contributions from different projects done with the robot, but also a library that is used with Yarp, implementing drivers and useful utilities.

The repository contains a critical program called iCubInterface. The iCubInterface initializes all the motors in the robot and sets up all the ports used to control them. It is the program that sets up the ports for the motorboard driver, a generic Yarp driver for interfacing with the motors, and it directly uses the motors, setting and getting their encoder status. The iCubInterface is generic enough to be adapted to robots other than the iCub, as was done with the Vizzy robot. The adaptation is done by altering an initialization file that specifies which motors connect to which port; more than one motor is reached through a single port, although each motor is only connected to one port. For each port prefix three ports are set up: a status port, a command port, and a Remote Procedure Call (RPC) port. The status port indicates the status of the motors, the command port receives commands, and the RPC port receives RPC calls and acknowledges them. The port prefixes set up for the iCub are:

/icub/head

/icub/face

/icub/cam

/icub/torso

/icub/right_arm


/icub/left_arm

/icub/right_leg

/icub/left_leg

The names of the ports are self-explanatory, each one controlling a set of joints, with the exception of /icub/face and /icub/cam, which are connected to the facial expressions of the iCub and to the iCub cameras, respectively. Neither /icub/face nor /icub/cam follows the typical prefix structure with the three ports for status, command, and RPC: the face has eyelids, and the cameras are left or right, with fovea and logpolar ports.

The motorboard driver can control the motors in two distinct forms, either by position or by velocity. Position control is done by specifying the motor angle the user wants to reach; the driver maintains an internal state of the angle changes, read from the encoder. When a new angle is set, and a reference speed and acceleration have been chosen, the motor moves according to these values to the specified position. Velocity control is done by specifying the speed at which the user wants a motor to move: positive velocities move in the positive direction, negative ones in the negative direction, and here a reference acceleration also needs to be specified. These control forms are instantiated by two classes that interface through the iCubInterface ports: IPositionControl and IVelocityControl.

In the motor interaction controller modules developed, only velocity control was used; this choice was made because it allowed better control. The applications send timed updates of the control values to the motors. When position control was used, the response time was higher, sometimes resulting in unpredictable movements; this was particularly noticeable in the case of the Vizzy robot, which also used the iCubInterface, where the motor made an unpleasant clicking sound whenever a new position value interrupted an already moving motor. With velocity control, the speed defined depends on the current angle and on the desired angle, the latter maintained internally by the program calling the velocity control: if the difference between the desired and the current angle is negative, a negative movement should be made, and vice-versa. The speed chosen also depends on how far apart the current and desired angles are; the closer they are, the lower the velocity, which avoids the wobbling effect that happened when using position control.
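A sketch of this proportional scheme, with illustrative values: current comes from the encoders and desired is the angle maintained internally by the interaction module.

    #include <yarp/dev/ControlBoardInterfaces.h>

    // Command one joint towards its desired angle: the sign of the error gives
    // the direction and its magnitude the speed, so the motor slows down as it
    // approaches the target and the wobbling of position control is avoided.
    void velocityStep(yarp::dev::IVelocityControl* vel, int joint,
                      double current, double desired) {
        const double gain = 1.5;       // illustrative proportional gain (1/s)
        const double maxSpeed = 30.0;  // illustrative speed limit, in deg/s

        double speed = gain * (desired - current);  // negative error -> negative motion
        if (speed >  maxSpeed) speed =  maxSpeed;
        if (speed < -maxSpeed) speed = -maxSpeed;

        vel->velocityMove(joint, speed);
    }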

Another driver available is iKin, which has a set of functions that facilitate the use of the inverse kinematics of the iCub. Like the iCubInterface, this driver is generic enough to have been adapted to the Vizzy robot by rewriting an initialization file with the kinematic chain specified in the Denavit-Hartenberg convention. Two programs accompany the iKin driver, a solver and a controller. The programs are not merged into a single one because, besides separating robot control from kinematics control, they are very processing-intensive; this way they can be run on different computers, communicating through the Yarp ports. The iKin solver receives the current status of the motors it needs to move and calculates the positions the motors must move to so that the end-effector reaches, or tries to reach, a point given in Cartesian coordinates; this is where the inverse kinematics is calculated. The iKin controller defines the velocities and accelerations that each motor should use while moving to an angle; the goal is to make the iCub perform a more natural, human-like movement. When using the iKin driver, the iKin solver and controller need to be running simultaneously because they exchange messages, so that both know the desired angles for the iCub motors. The driver also permits setting characteristics of this type of control, such as the tolerance within which a desired end-effector target is considered achieved, or the amount of time to go from the current end-effector pose to the desired one.

The kinematic modules use the ICartesianControl class, whose goToPosition() function can be called with or without an orientation defined. In the implemented modules it was decided that the orientation would only be defined when controlling solely the orientation, so only the Wiimote kinematic control module uses the orientation, in one of its modes. This was decided because, when trying to reach a point with the Cartesian control while an orientation is defined, the Cartesian solver module would have to weigh reaching the point against having the correct orientation, and for our goal it was more interesting to reach the desired point in the best way possible. The speed of the movement is defined by the amount of time it will take to go from the current pose to the desired pose; this time is set in the initialization file, so a small movement by the user will result in a slow movement by the robot and a large movement by the user will result in a faster movement by the robot.

The suggested implementation of the interaction control modules uses both the position control and the kinematic control to interact with the iCub robot; although the interaction methods differ, all the modules have a similar class diagram. As in the Yarp device driver, there is a Port Thread class that sends and receives data from the Yarp device driver and calls functions from an abstract class that controls the iCub robot, the iCubGeneric class; the Port Thread again uses the RFModule and TypedReaderCallback base classes, as in the previously described module. The iCubGeneric abstract class has virtual functions that must be implemented by the classes that inherit from it, plus simple functions and data structures that maintain the data needed to control the robot. The virtual functions that must be implemented initialize the robot, close the robot connection, and update the robot. It was chosen that iCubGeneric maintain its own internal state of the interaction device data, because this separates the robot update rate from the device data update rate: an event on a data port does not mean an immediate reaction of the robot state, which leaves time for better control of the robot, since commands need to be sent to the robot much less frequently than data updates arrive from the interaction devices. The classes that implement iCubGeneric control the robot either through the kinematic control or through the motor control. This inheritance was useful because the Port Thread never cares about how the robot is controlled; it cares about the rate at which the robot state must be updated, when it starts or stops controlling the robot, and what data should be passed on to the robot. The conversion of the data into useful movements is done by two classes that implement iCubGeneric, one specific to the kinematic control and another to the motor control; these are the classes that have to be implemented depending on how the iCub is supposed to interpret the device data, and they both implement all the virtual methods of iCubGeneric. A simple representation of the interaction module described can be seen in figure 4.3.


Figure 4.3: Simple representation of Interaction Module main classes.

The iCub libraries available allow to connect to the iCub and its devices as a normal device driver,

abstracting the programmer from all the specific iCub control issues. The motor access is done through

the remote_controlboard driver by using the iCubInterface, which is responsible for maintaining the

ports that interface with the robot motors. The iCub library has a kinematic control driver, the iKin, that

allows the inverse kinematic control of the iCub. Because the iKin task is a heavy processing task, it is

divided into two programs a controller and a solver, these programs not only solve the kinematic task,

but try to achieve a natural movement by defining the motor speeds depending on the movement to be

done. The interaction modules developed separate the data retrieval from the robot motors update

without the port control block and the robot control block having a high dependency on each other, this

made possible a independent implementation and tweaking of the control from the data retrieval. Also

the implementation suggested keeps the update rates separated making a more convenient control

available, where a control instruction can be sent after analyzing several device data updates, which

come at a much higher rate than the needed for the robot control.

4.3 Wiimote modules

The Wiimote iCub controller application is divided into three modules, as can be seen in figure 4.4: the Wiimote Yarp driver (device driver), the Wiimote solver, and the robot controller (interaction controller). Each of these modules is an independent program that may run on a different computer, connected through the Yarp network. The Wiimote Yarp driver and the Wiimote robot controller follow the architecture defined before, although they solve problems specific to the Wiimote interaction and device interface.

The Wiimote Yarp driver module is the interface between the Wiimote device and the Yarp network.



Figure 4.4: Wiimote Yarp modules and iCub controller.

The library chosen to interface with the Wiimote and the WM+ extension was the wiiuse library (http://sourceforge.net/projects/wiiuse/), which comes with the fwiineur libraries (http://fwiineur.blogspot.com/search/label/release). fwiineur is a set of libraries that allow MATLAB and SCILAB to acquire data from the Wiimote device. The wiiuse library used by fwiineur was altered to get data from the WM+ extension; the original wiiuse does not implement this feature, although it can be programmed to do so by reading the data from the proper Wiimote memory address and interpreting it.

In the Wiimote module there is an initialization file where the port name prefix, the velocities, and the robot name are set. The module starts by opening five ports whose prefix is the one specified in the initialization file; by default the prefix is /wiimote. The ports opened are:

/wiimote[NUMBER]/sensor

/wiimote[NUMBER]/ext/mp

/wiimote[NUMBER]/status

/wiimote[NUMBER]/reader

/wiimote[NUMBER]/bots

Because up to four Wiimotes can be connected simultaneously, all the excess ones are ignored; the port prefix /wiimote[NUMBER] carries a number indicating the Wiimote's ID. All the ports are output ports that send data whenever the main thread is updated, with the exception of the /wiimote[NUMBER]/reader port, which is a reading port that allows turning the Wiimote sensors and actuators on and off.

The /wiimote[NUMBER]/sensor port outputs the sensor values, each set of values preceded by a vocab. Each sensor needs to be turned on, which is done when a command is received on the reader port indicating which sensor should be turned on or off. The IR camera value is preceded by the [IR] vocab and consists of up to four Cartesian points, representing each of the IR sources, sent to the sensor port. The accelerometer value is preceded by the [ACC] vocab and is output as a vector defined by the accelerometer's position relative to gravity.



The Wiimote solver module connects to the /wiimote[NUMBER]/ext/mp port, which sends data from the WM+ extension, and on each read event updates an internal rotation matrix. This module has a reader port that can receive a new rotation matrix to overwrite the current one. The matrix represents all the rotations made with the Wiimote, in the form RxRzRy. The module outputs its internal rotation matrix and the vector (0,1,0) rotated by that matrix. When the WM+ extension is turned on there is a calibration process in which the average noise output by the gyroscope is calculated and removed from the acquired values; only after this calibration, which requires the Wiimote to be in a stable position for a moment, does the /wiimote[NUMBER]/ext/mp port start outputting data.
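A sketch of what the solver maintains, using plain arrays instead of the Yarp matrix types: the composed rotation RxRzRy and the remote's pointing direction, obtained by rotating the vector (0,1,0).

    #include <cmath>

    // 3x3 matrix product: out = a * b.
    void matMul(const double a[3][3], const double b[3][3], double out[3][3]) {
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) {
                out[i][j] = 0.0;
                for (int k = 0; k < 3; ++k) out[i][j] += a[i][k] * b[k][j];
            }
    }

    // Compose R = Rx(ax) * Rz(az) * Ry(ay), the factorization kept by the solver.
    void composeRxRzRy(double ax, double az, double ay, double R[3][3]) {
        const double cx = std::cos(ax), sx = std::sin(ax);
        const double cz = std::cos(az), sz = std::sin(az);
        const double cy = std::cos(ay), sy = std::sin(ay);
        double Rx[3][3] = {{1, 0, 0}, {0, cx, -sx}, {0, sx, cx}};
        double Rz[3][3] = {{cz, -sz, 0}, {sz, cz, 0}, {0, 0, 1}};
        double Ry[3][3] = {{cy, 0, sy}, {0, 1, 0}, {-sy, 0, cy}};
        double tmp[3][3];
        matMul(Rx, Rz, tmp);
        matMul(tmp, Ry, R);
    }

    // Rotating (0,1,0) by R is simply taking the second column of R.
    void pointingDirection(const double R[3][3], double dir[3]) {
        for (int i = 0; i < 3; ++i) dir[i] = R[i][1];
    }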

The Wiimote robot controller module connects to the ports opened by the solver module and by the Yarp driver module. This module uses the iCub library to connect to the robot ports through the PolyDriver interface. This interface abstracts the iCub control behind simple functions that make all the needed port connections and send all the important data. It is possible to control the robot's upper body by directly addressing the motor joints, defining a velocity, acceleration, and angle, or by using the iKin library, which abstracts the kinematic control of the robot to a Cartesian point of the end-effector; the details of these modules were described in the previous section.

4.4 Kinect modules

The Kinect iCub control application is divided into two modules, as can be seen in figure 4.5: the Kinect Yarp driver (device driver) and the Kinect robot controller (interaction controller). Each of these modules is an independent program that may run on a different computer, connected through the Yarp network. The Wiimote Yarp driver is also used, because the B button on the remote serves as an on/off switch for the interaction. The Kinect Yarp driver and the Kinect robot controller follow the architecture defined before, although they solve problems specific to the Kinect interaction and device interface.


Figure 4.5: Kinect Yarp modules and iCub controller.

The Kinect Yarp driver module is the interface between the Kinect interaction device and the Yarp network. To interface with the Kinect sensor, the OpenNI/NITE libraries were used. OpenNI/NITE were created by PrimeSense: OpenNI is a generic library for connecting to a broad set of devices, and NITE is used by OpenNI to interface directly with the Kinect device. Although there was already a Kinect driver in the Yarp repository, it did not track the skeleton, because the libfreenect library it used did not yet implement the skeleton tracking algorithm, so using the OpenNI/NITE library was preferred. This module can perform two types of tracking, skeleton and hand tracking; the desired type is specified in the initialization file. Only one output port is opened; there was no need for input ports, because no settings needed to be changed on the Kinect for the interaction to work. During skeleton tracking, at each update of the Port Thread, a set of ordered Cartesian points and rotation matrices describing the kinematic skeleton of the person detected by the Kinect sensor is output to the port. The skeleton is described in two forms: a set of points in Cartesian space, and a set of rotation matrices indexed to each of the skeleton joints; there is also a confidence value for each element (matrix or Cartesian point) indicating how accurate it is. The hand tracking outputs only two values at each update, a Cartesian point and a confidence value.

The Kinect Robot controller module connects to the port opened by the Kinect Yarp driver module, and

updates its internal state data on each port event, sending commands to the robot at a different rate. In

the initialization file it can be specified what is the desired type of control, motor control for the skeleton

tracking or the kinematic control for the hand tracking, depending on the desired tracking it is expected

that the port messages come in a specific format. The data received from the Yarp driver port, while

tracking the skeleton, in the form of rotation matrices, is factored as RxRyRz and mapped into three

motor angles. The rate at which the skeleton values are updated and at which the robot control is done

The rate at which the skeleton values are updated and the rate at which the robot is commanded are different, because it proved better to maintain a smoother control than to follow the Kinect frame rate. The robot controller uses the remote_controlboard iCub driver to move the motors independently to the desired angles. While the robot controller module is tracking the hand, the Cartesian point read from the port is mapped to an end-effector target through the cartesiancontrollerclient device. This module only takes into account the first user detected, ignoring all the others.
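Both devices are standard Yarp/iCub clients; a minimal sketch of how a controller might open and use them follows (the local port names and the target values are illustrative, while the remote names follow the usual iCub conventions):

#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <yarp/dev/CartesianControl.h>
#include <yarp/sig/Vector.h>

int main() {
    yarp::os::Network yarp;

    // Joint-space control (skeleton tracking): remote_controlboard client.
    yarp::os::Property jointOpt;
    jointOpt.put("device", "remote_controlboard");
    jointOpt.put("remote", "/icub/right_arm");       // robot-side port
    jointOpt.put("local", "/kinectCtrl/right_arm");  // illustrative local name
    yarp::dev::PolyDriver jointDev(jointOpt);
    yarp::dev::IPositionControl *pos = nullptr;
    if (!jointDev.isValid() || !jointDev.view(pos)) return 1;
    pos->positionMove(0, 30.0);  // drive one arm joint to 30 degrees

    // Cartesian control (hand tracking): cartesiancontrollerclient.
    yarp::os::Property cartOpt;
    cartOpt.put("device", "cartesiancontrollerclient");
    cartOpt.put("remote", "/icub/cartesianController/right_arm");
    cartOpt.put("local", "/kinectCtrl/cartesian");
    yarp::dev::PolyDriver cartDev(cartOpt);
    yarp::dev::ICartesianControl *cart = nullptr;
    if (!cartDev.isValid() || !cartDev.view(cart)) return 1;

    yarp::sig::Vector xd(3);
    xd[0] = -0.3; xd[1] = 0.1; xd[2] = 0.1;  // example target, root frame
    cart->goToPosition(xd);  // position only: orientation is left free,
                             // matching the choice discussed in section 5.5
    return 0;
}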

4.5 Concluding remarks

In this chapter the applications developed to interface with the iCub robot were described.

Both applications follow a similar structure: the abstraction was made at several levels so that the code can easily be reused by other projects.

The Yarp framework was used to separate the interface device drivers from the interaction controller logic of the robot. In the case of the Wiimote module, a specific application was created to handle the angular movement status, which may be replaced by another application if needed, without changing either the control application or the driver application.

Both of the driver applications created were in fact used on another robot besides the iCub, the Vizzy robot, and the Kinect driver was used during several presentations. Porting took less than a week, and no changes were required in either of the drivers; only the motor mapping needed to be readjusted to address the correct motor joints.


Chapter 5

Evaluation

This chapter describes the tests made to understand the advantages and problems of the implemented interfaces, and presents the results of those tests. It also makes an effort to justify the choices made for the tests, such as the selection of test subjects.

The main goal of the tests was to compare the several interfaces developed, to identify important issues to be solved in future work with these devices, and to understand which type of task each type of interface is best suited for.

For the tests we chose a group of ten male and female subjects with a broad age range, from twenty to sixty-seven years old, although most of the group was in their twenties. Nearly all of the subjects use computers every day in their jobs, but they are not technical users; they had had little or no contact with the Wiimote and no contact with the Kinect device. For most of them it was the first time they had even heard of an interaction device such as the Kinect.

The subjects were chosen with these characteristics because we believe that a user with very little knowledge of the robot and of this type of interface encounters more unexpected difficulties than a user who already knows the iCub's limitations or has a very good understanding of how the interface works. This way, the main difficulties in controlling the iCub and the most intuitive control methods stand out much more clearly.

The users were invited to come to the Institute for Systems and Robotics - Lisboa, where the iCub robot is kept, to participate in the tests for this thesis. Most of the tests were done during weekends, so that the users could execute the tests comfortably, without distractions, and comment on their experience.

5.1 Tests description

The experience proposed to the users involved a brief presentation about the project and the tests, a set of tests with the different interfaces, and a questionnaire with questions about their experience.

The initial presentation is shown in appendix B.1. Its intent was to explain what the iCub robot is, what its objective is, why the tests were being done, and how the interfaces and interface devices work. During the tests, a presentation slide with detailed information, or with a picture illustrating how the interface worked, was shown on a computer screen as a support for the test goal. Before each test the user was reminded of how the interface worked, what the goal of the test was, and special details that should be taken into account, such as the fact that the robot did not have any kind of protection against self-injury, and what the interface limits were.

Control \ Device    Graphical   D-Pad            Wiimote     Kinect
Motor               -           -                Motor       Skeleton
Kinematic           GUI         Wiimote Cursor   Kinematic   Hand

Table 5.1: User tests per control and device.

Table 5.1 shows the tests made per device and control type. The control could be done either kinematically or through the motor joints independently, and the devices used were a graphical interface, a directional pad, the Wiimote, and the Kinect. For the graphical interface and for the D-Pad only the kinematic control was used, because after a simple try-out with an experienced user the motor-control task was considered too complex for a novice user, due to having to control each motor independently, one at a time.

All the tests used the same setup scenery, presented in figure 5.1, where it is possible to notice that all the objects are suspended from the ceiling and reachable by the iCub robot's right hand. The objects are numbered from one to five so the user can easily identify them. In the task where an obstacle should be present, object number two, shown in figure 5.1, was substituted by a blank A4 paper sheet. The user could stand wherever it was most comfortable for the interaction, except during the GUI test, which required the use of a specific computer placed next to the iCub robot. The test observer stayed next to the robot's emergency shut-down button, with an extra Wiimote in hand that was programmed to return the robot to its initial safe position.

Each test is divided into five tasks, which are common to all tests:

1. Play with the interface until comfortable.

2. Move the hand up and down.

3. Move the hand left and right.

4. Touch all the objects in the predefined order (1,2,3,4).

5. Touch some of the objects in the predefined order, with an obstacle (4,1,2).

Figure 5.1: Interfaces test scenery. (a) Front view. (b) Left view.

The first three tasks were designed so that the user could get accustomed to the interface, and to check that the user was able to perform basic operations with it; the tests would not continue until the user was able to do the second and third tasks sufficiently well, as these are considered basic tasks. During these tasks only the user's comments and notes about the interaction were taken. The fourth and fifth tasks are advanced tasks; for these, the duration of each task is also registered by the program. Before the fourth and fifth tasks were started, the arm was moved to its initial position, making a 90° angle with the side of the robot, with the forearm perpendicular to the arm.

The tasks were made with the intention of testing the user's ability to follow, through the interface, a predetermined path with key points, and also how constraints on that path might affect the user experience. The order was chosen so that the user would always have to make both a long path and a short path: from object 1 to 2, or from object 4 to 1, is a long path; from object 2 to 3, or from 2 to 1, is a short path. Through this we hoped to understand what kind of interaction serves which purpose better, and how it could be made to work better.

Besides these five tasks there were two extra tasks, specifically designed for the Wiimote kinematic interaction method and for the Kinect skeleton interaction. With the Wiimote kinematic interaction, users were asked to control the orientation of the hand and to do simple movements, such as turning the hand upside down and waving. With the Kinect skeleton interaction, users were asked to make the iCub robot imitate several poses made by the test observer. The reason for these tasks was that both of these systems were extremely well received by some of the expert users who tried out the interfaces, and it was interesting to see how the non-expert users reacted. The tasks were extra because they were too simple to be evaluated like the more complex tests.

Before an advanced task took place, the user was reminded of the rules of the task. In the fourth task the user should not touch any object besides the next one in the sequence; still, since the main focus was to touch all the objects, if an unintended object was touched a note would be taken but the task would continue. In the fifth task the user should not touch the obstacle or any non-intended object, a non-intended object being any object other than the next one in the sequence. Both tasks should be done calmly: control was preferred to speed, and it was considered more useful to avoid a shut-down error, such as a robot self-collision, than to finish a task very quickly.

A test set was composed of five tasks per interaction method, resulting in sixty tasks to be done. The first task would consume a lot of time because of all the doubts the user had; the second and third tasks were very quick, and none of the users failed them. To complete a test set a user needed from one and a half to two hours, taking into account all the explanations given; the useful testing time, the duration of the advanced tasks four and five for all the devices, took on average forty minutes.

5.2 Tests results

The tests were evaluated in terms of time taken and number of errors.

The time was calculated as the total time it took a user to do the requested task. During the tasks some users stopped to ask questions and to think about what their next action should be, in order to get better control; the times presented take that into account. The times were divided by task and by each object touched.

The errors are divided into wrong object touched, reboot, quit, default error, and obstacle touched. A wrong-object error indicates that the iCub robot touched a non-intended object. A reboot error indicates that a task needed to be restarted due to some problem related to the user interaction; the problem could vary from a robot self-collision to a robot part breaking. A quit error indicates that the user gave up on the task. A default error is an unspecified error arising from the user interaction at that moment. An obstacle-touched error indicates that the user touched the obstacle.

A comparison of the average time needed to conclude an advanced task (tasks 4 and 5) is presented in figure 5.2. The graph shows the average time a user takes to finish an advanced task with each of the control methods. From the graph it is clear that the GUI interface takes the longest and the Wiimote cursor (the D-Pad interface) is the fastest control method. With most of the interfaces, the avoid obstacle task takes a little less time, probably because it was the second time users tried to reach the same objects, and this time there were three objects instead of four. In the case of the Kinect kinematic (hand) test there was a big difference between the touch objects test and the avoid obstacle test; this happened because users made larger movements in the second test as a way to deviate from the obstacle, and this technique resulted in faster movements from the iCub robot.

Figure 5.2: Task time comparison. Average time (in minutes) per control type for the touch object and avoid obstacle tasks.

A comparison of the average number of errors occurring in the touch object task and the avoid obstacle task for all the interactions is shown in figure 5.3. The interaction with the fewest errors is the Wiimote cursor, where both tasks have the same average number of errors. The GUI and the Wiimote kinematic interaction are the ones where the most errors take place. The Wiimote motor and the Kinect motor (Kinect skeleton) interactions have a similar number of errors, although observing the users' reactions made it clear that the Kinect motor was much more successful in time, speed of learning, and user involvement. The Kinect kinematic (Kinect hand) interaction had the same number of errors for both tasks, and from what was observed it was clear that, with the exception of the Wiimote cursor interaction, which is a typical cursor already known to all the users, this was the interaction where controlling the robot was easiest for the user. With the Kinect kinematic interaction users seemed to have a clear sense of what the robot would do next and how to make it react as intended.

Figure 5.3: Task error comparison. Average number of errors per control type for the touch object and avoid obstacle tasks.

Figure 5.4a shows, for the touch object task, the average time needed to make the robot end-effector (hand) travel from the initial position to the first object, from the first object to the second, from the second to the third, and from the third to the fourth. As described before, a path had to be followed during the tasks; the path for this task was defined by the object numbers shown in figure 5.1. The first two objects were the furthest from each other and from the hand's initial position, so it is natural that this was the most time-consuming maneuver. It was a surprise to see that the Kinect motor control

was extremely fast in reaching the first object. The time to reach the second object peaks not only because the user needed to deviate from other objects in the way (while the path to the first object was free of objects), but also because there was an unresolved issue with the position users tend to take to reach object two; this issue is better explained in section 5.5. Figure 5.4a also makes it clear that the GUI interaction takes the most time to reach every object except the fourth, which was right next to the left of the third. During the Wiimote motor test many questions were asked because the control was not very clear, especially while trying to touch the first object; although the intent of the first basic tasks was to avoid this, having a goal more complex than making the robot arm go up, down, and to the sides made the users feel more insecure. Once these doubts were cleared, the time to reach each object decreased.

The avoid obstacle task consisted of touching three objects without colliding with an obstacle placed in the middle of the path, as described in section 5.1. Figure 5.4b shows the average time to reach each object in this test. The first object was very near the initial position of the hand and could easily be reached; the paths to the second and third objects were the hardest challenges to solve. Graph 5.4b is better understood alongside the errors graph 5.6, because the times achieved with some of the interfaces also came with a high number of errors. Focusing solely on the avoid obstacle time graph, it is easy to notice that the Kinect motor interaction was the most successful, along with the Wiimote cursor interaction. As before, these were the interfaces with which users felt most comfortable. The Kinect motor task actually shows a faster time to reach the second object than the first; this happens because, when control starts with this interface, the first movements users make tend to be more about confirming that the robot is actually mimicking their movements correctly than about reaching the first object. The GUI and the Wiimote motor are the slower interactions; and although the Wiimote kinematic time to reach the second object seems very good by comparison, graph 5.6 makes it clear that many errors accompany this result.

Figure 5.4: Tests steps time. Average time (in minutes) per control type to reach each object: (a) touch objects task; (b) avoid obstacle task.

A comparison of each type of error per interaction control type for the touch object task is shown in figure 5.5. In this graph it is clear that the Wiimote kinematic control is the least precise, having the highest number of wrong objects touched; although, as pointed out in the previous section, the main focus of this task was not to avoid the objects but to follow a path defined by key-point objects, avoiding collisions with wrong objects is of course a plus for good control. The lowest error rates are associated with the Wiimote cursor and the Wiimote motor interactions. The Wiimote motor interaction proved not to be as efficient as some of the other control methods, but finer control could be achieved with it. The Kinect motor interaction also had a higher number of wrong objects touched, but from the observer's perspective it was clear that users took riskier control, with higher speeds and more difficult movements, for example moving the hand between two objects close to each other. Crossing the observer's perspective with these results, it seems fair to assume that although control was better with the Wiimote cursor and the Wiimote motor interactions, users seemed more confident with the Kinect motor interaction. The reboot error happened with two different users: one because the robot self-collided, the other because a movement was considered too dangerous by the observer and the emergency shutdown button had to be pressed. The quit errors happened because the two oldest test users gave up after not being able to execute the task; the Wiimote kinematic control was not the most successful control type in general, but for these two users, one sixty-seven and the other fifty-six years old, it was very hard to feel comfortable doing the proposed tasks. The default error is better explained in section 5.5: sometimes it was difficult to detect the user with the Kinect interface, although for most users there was no problem.

Figure 5.5: Error types for the touch object task. Number of errors per control type, by error type (wrong object, reboot, quit, default error).

For the avoid obstacle task errors there is an extra error type indicating when a user touched the obstacle, as can be seen in figure 5.6; the main focus of the task was to touch all the objects without touching the obstacle. The interaction with the fewest errors was the Wiimote cursor control: the only errors occurring in this case were wrong objects touched, and the Wiimote cursor was also the only interaction where the obstacle was never touched. There were two interactions where users quit; the main reason was very low confidence that the task could be completed, and in one case, after successive self-collisions of the robot, it was considered best to skip to the next test. The reboots happened in most cases because the observer was not confident that the user would be able to make a certain movement without either a self-collision or moving the robot to a dangerous position where parts could be broken, as happened with one of the users, when one of the iCub robot's cables broke even though the limits set by the iCubInterface should have prevented it. The Wiimote kinematic control was the one where the obstacle was touched the most times, demonstrating how little precision users were able to achieve with this control method. As in the touch object task, during the Kinect motor control users opted for riskier movements as a way to shorten the time of a specific test, and also because they wanted to feel how well the robot could be controlled with that system, mainly due to the excitement of having a humanoid robot mimicking their movements. The Wiimote motor and Wiimote kinematic controls appealed to the users' intuition rather than logic, and although the Wiimote motor worked better once it was understood, as several younger users showed, in this test the lack of precision increased the difficulty of concluding the task, so some users, facing the fact that they could not conclude the task properly, opted to quit and proceed to the next task.

Figure 5.6: Error types for the avoid obstacle task. Number of errors per control type, by error type (wrong object, reboot, quit, default error, obstacle).

From the notes and execution times taken during the tests, the GUI can be considered the least efficient interaction, and the Wiimote cursor the most efficient and the one offering the best control. Nevertheless, it cannot be overlooked that users were more involved while executing the Kinect motor task, and, despite being an interface they had never used before, which is not the case with the Wiimote cursor, it achieved very good results in terms of efficiency and user confidence in the interaction.

5.3 Questionnaire results

After the users had finished the tests, a web-based questionnaire was given, and they were reminded that it should be filled in keeping in mind that the goal was to compare the interfaces and to understand the weak points and the most important issues to be resolved. The questionnaire can be seen in appendix B.2.


The questionnaire was composed of eighteen questions. The first six were about the user's characteristics, such as whether the user had ever used the interface devices, age, and profession. The following twelve dealt directly with the interaction experience; most were either multiple-choice or ranking questions, and only two asked for a written response. Because the tests were directed at non-technical users, most of the available answers in the questionnaire are written in a subjective form, although in all questions the first answer was the most negative and the last the most positive.

All of the users considered the initial description of the interaction method being evaluated clear, but, as already pointed out, questions about the specifics of each interaction arose as the tasks were being done. The professional occupations of the test users varied widely, with only one user, a student, having a technological background. All the users spend time with a computer every day, but only two had had any contact with a Wiimote-like device, and one owned a Wii gaming console; no user had ever interacted with a Kinect device. The users' ages varied between twenty and sixty-seven years old, but most of the users were around twenty.

The exact results of the questionnaire can be consulted in appendix B.3, in the form of percentage tables and percentage graphs.

The questionnaire can be divided into two main groups of questions: questions about the users' difficulties during the interaction experience, and questions ranking the interaction control types. One more question, about the extra tasks, was also asked.

From the questions directed at the users' interaction difficulties, the Wiimote kinematic was the control type where users felt that the robot responded in the least expected way, and the Wiimote cursor the one where users felt that the expected reaction was met during the interaction. There was also a high rate of correct expectations for the Kinect motor (Kinect skeleton) control.

The GUI control was where users were least worried about what the reaction of the robot might be, transmitting a good sense of control; the fact that only one type of movement could be made at a time, as pointed out in section 5.5, probably contributed to this result, by obliging users to focus on one movement at a time. The Kinect motor control was the one with the highest level of user worry, influenced by the fact that users did not trust the interface to replicate their pose accurately. With the Wiimote motor control users felt neither too worried nor too relaxed about the interaction.

When users were asked whether they felt disoriented, that is, unsure whether they were controlling the robot correctly while using the interface, only the Wiimote kinematic control stands out as the one with the worst results, because much of its control is based on intuition rather than logic. In contrast, the Wiimote cursor control and the Kinect motor control were considered very clear; users always had a good understanding of how they were controlling the robot.

As for precision, the Wiimote cursor control, the Kinect hand control, and the Kinect skeleton control were the ones that obtained top-grade votes, while the Wiimote kinematic control received the lowest grades, meaning users felt it had the least precision. The GUI was considered better than the Wiimote motor control, but neither got the highest rating.

                        GUI   Wiimote   Wiimote     Wiimote   Kinect   Kinect
                              motor     kinematic   cursor    hand     skeleton
Best for task           4     5         6           1         3        2
Preferred by the user   3     4         3           2         2        1

Table 5.2: Interaction control ranking.

There were two ranking questions, where users were asked to rank the several interaction systems used, by personal preference and by what they thought was the best system for the tasks requested; the results are shown in table 5.2. Users voted the Wiimote cursor control as the best interaction system for the tasks requested, and the Wiimote kinematic system as the worst. Regarding user involvement and satisfaction with each system, the top-ranked system was the Kinect skeleton, and the last-ranked system was the Wiimote motor. The Wiimote cursor, Kinect hand, and Kinect skeleton controls were the three top-ranked systems in both questions.

The questionnaire ended with two selection questions about the orientation and pose extra tasks; these questions were made to obtain some evaluation of the very simple tasks requested of the users. The interaction in the Wiimote hand-orientation task was considered “good”, and the interaction in the Kinect pose task was considered “very good”, the top two of the five possible classifications.

5.4 User comments

During the user tests, the observer took notes on user comments. Most of the comments were made during the interactions considered harder, such as the GUI, Wiimote motor, and Wiimote kinematic interactions, but comments were made throughout all the tests.

The scenery was considered clear by many users and did not create any doubts about the tasks requested, although objects one and two were singled out by users as the hardest ones to reach.

During the GUI test many users complained about being forced to move along one axis at a time. The controls were considered straightforward, but the fact that the x axis was negative towards the front of the robot and positive towards the back made the control confusing. To avoid this confusion, one user made a note of the control directions before starting the tests; the same user suggested that, instead of having only the axis information, name labels with the words up, down, right, left, back, and front might make the interface clearer. One of the most repeated complaints was that the user is not able to stop the motion once it has begun; a stop button would suffice to fix this problem.

During the Wiimote motor tests, the ability to stop a robot movement whenever the user wants, which is available in all interfaces except the GUI, was appreciated and commented on by several users. The concept of the interaction was difficult to convey to most of the users, with the exception of the engineering student, who had a technical background and, after understanding how the iCub's joint motors were mapped to the Wiimote, was very comfortable with the interaction. The forearm control was limited to two motors, being simpler than the arm, which has three motors, although one user commented that the forearm was harder to control conveniently.

During the Wiimote kinematic tests some users commented that their position relative to the robot influenced how they performed, because it obliged them to control in a mirrored way. It was also indicated that this control form was difficult to interpret, because no spatial notion of the end-effector was present. Users were worried about controlling the movements due to the robot's speed, which was higher than in the previous tests; at the same time there was a feeling that the robot had a delay that did not help the control. The orientation the end-effector (hand) took was also commented on, because the users had no control over it.

During the Wiimote cursor tests users felt that this was the most comfortable method, but that the delay experienced was an issue that influenced their performance.

During the Kinect kinematic hand test, comments often concerned the fact that the mapping of movements to the robot could be done in another way, because it was not very clear. The user detection problems were also an issue, because they obliged users to redo the calibration process. The fact that control could be switched on and off with the Wiimote button was considered very useful.

During the Kinect skeleton motor test users commented on the difficulty of controlling the robot around object two. The interface system was considered very simple, but it was uncomfortable not to have a complete notion of what speed the robot could reach with an unintentionally fast movement. Also, the position the user initially picked most of the time did not allow a completely clear view of the robot, especially in the obstacle task.

As overall comments, users pointed out that a graphical interface and some form of feedback would be useful for some of the interactions, although it was explained that this was intentionally left out. Also, a self-collision detection module that would stop the robot from damaging itself would have helped make the users less worried during the interaction.

After the tests, and informally, even with all the issues still to be resolved, users considered that the interaction systems developed could be useful for a number of different tasks, and were fun and practical.


5.5 Important issues

There are several issues that were noted and repeatedly commented on, and that deserve a better explanation.

The GUI control had already been developed by a developer of the iCub community, but it might be good to suggest a few alterations; the most noted issue was that movement could only be made along one axis at a time, by directly changing the sliders. Allowing combined movements brings a higher risk but also faster control for the user.

The Kinect is a very new device, so some issues remain to be worked out, although in most cases it worked without any difficulty. One issue was that one specific user, who was not wearing tight-fitting clothes as indicated by the manufacturer, was very hard to detect, and calibration had to be redone several times.

When using the Kinect skeleton motor control, reaching object two became difficult because it was at the limit of one of the motors. In our implementation, when a user tries to drive a motor to an angle beyond its limit, the motor stops immediately. This option was taken as a precaution against unexpected movements that the interfaces might request from the motor.
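As a minimal sketch of that precaution, assuming the limits are queried through Yarp's IControlLimits interface and that the position interface comes from the same PolyDriver (names are illustrative):

#include <yarp/dev/ControlBoardInterfaces.h>

// Command a joint angle only if it lies within the motor limits;
// otherwise stop the motor immediately, as described above.
void safeMove(yarp::dev::IPositionControl *pos,
              yarp::dev::IControlLimits *lim,
              int joint, double targetDeg) {
    double min = 0.0, max = 0.0;
    if (lim->getLimits(joint, &min, &max) && targetDeg >= min && targetDeg <= max)
        pos->positionMove(joint, targetDeg);  // within limits: command it
    else
        pos->stop(joint);                     // beyond limits: stop now
}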

Although it was preferred not to have any form of feedback that would help users in their tasks, it seemed clear that graphical, sound, or any other type of feedback would affect the way users do the tasks, depending not only on the interface they were using but also on the type of feedback available. Because the objective was to compare simple interfaces, and how people interact with a humanoid robot through these interfaces alone, the feedback interfaces created during the development phase were left out of the user testing, since they could influence user performance.

During the kinematic controls users were surprised to see the hand take arbitrary orientations; the kinematic chain used to control the robot, iKin, was responsible for this behavior. With iKin, when a new position is given for the robot to reach, a new pose for the robot is computed; that pose comprises two elements, the position to reach and an orientation to define. The orientation could have been defined strictly, but after testing the system it was understood that this would limit the reaching ability of the robot, and the orientation would often oblige the robot to make unexpected movements, so the orientation setting was ignored and only the position is defined.

With a more specific goal for the interface, any of these issues could be resolved in a way better suited to the problem, so they are considered important issues, but not critical errors in our systems.

5.6 Concluding remarks

In this chapter the evaluation tests and their results were presented.

A number of different options taken influenced the results of the tests. Two options that weighed heavily on the results were: the absence of feedback from the interaction systems other than the robot's response, which allowed evaluating the interface device itself without depending on a support interface; and the choice of test subjects with very little technical ability, hoping this would make the determinant characteristics of the interaction stand out.

The tests were divided into five tasks, for which the time spent doing the task and the errors that occurred were recorded. Each of the tests was done for each of the control methods: GUI, Wiimote motor, Wiimote kinematic, Wiimote cursor, Kinect kinematic hand, and Kinect motor skeleton.

The interaction system with which users obtained the best results was the Wiimote cursor. This was the system users could relate to best, because controlling avatars and interfaces through D-Pads and cursor keys is relatively common in day-to-day use of technology. The GUI also represents a very common, daily-used kind of interface, although due to the way it is organized it did not yield good results; most of the time users expect it to be easier to use, and simpler than three slider bars. The Wiimote interaction systems where the gyroscope was used, the kinematic and motor systems, were the ones whose concept users found most difficult to grasp, although in some particular cases they were very successful, mostly when used by users with some experience of the device. The Kinect interaction systems were the most appreciated, and also the ones considered the biggest novelty. Considering that for many of the users it was the first time they had even seen such a system, that the learning time needed was very small, and that the results were very satisfactory for a system with no support interface, the Kinect was the interface where users felt most involved.

The tests also allowed us to identify important issues in the interactions, some to be addressed by adding features, others to be fixed for a better interaction.


Chapter 6

Conclusion

Humanoid robots are becoming more and more complex. Many advances in artificial intelligence try to remove the need for constant interaction with robots, but control through interaction is still crucial to their utility, and it can take multiple forms. It is important to have simple and intuitive ways of interacting with the complex robots available today; this is a way not only of controlling the robot but also of communicating with it. These interaction techniques can be used for learning by demonstration or to help the robot complete certain tasks.

On the interaction front, gaming companies are trying more and more to win over their customers through appealing and original interaction devices. These are high-tech devices at a low cost; from their release there are high expectations, leading to a series of blogs and YouTube channels being created and followed that work as an unusual community gathering point for using these devices in unexpected ways.

This dissertation tries to join these two elements, the needs of HRI and the possibilities of the novel devices, to study the difficulties and involvement users feel when interacting with a humanoid robot such as the iCub.

6.1 Research summary

The research started with the practical objective of developing an interaction system for the iCub robot using a novel interaction device, the Wiimote bundled with the WM+ extension.

With this objective in mind, research into the typical forms of interaction between the Wiimote device and humans started by looking at the games available for the Wii gaming console. This led to the initial question: “how can a gyroscope-based device be used in the interaction between humans and humanoid robots?”. From this point further research was done in the HRI field to understand what systems and works were already available. It was understood that humanoid robots have many characteristics unique to their kind, so any interface developed for a humanoid must take that into account. The most similar and interesting works studied were in the field of rehabilitation, on how this kind of device can be used in the study of rehabilitation recovery. Wiimote-related papers specifically focused mostly on the accelerometer, which we adapted to our idea of control with the gyroscope.

Because the main platform for the work was the iCub robot, some research related to the iCub software and to previous works on interaction with this robot was done, although very little was found on this kind of interaction for the control of the iCub, or of any other complex humanoid robot.

This year has actually been a special year for gaming interaction devices, because the first body-controlled interface was released by Microsoft and quickly opened up to the developer community by hackers and researchers. To take advantage of such a novel interaction device the work was made broader in its goals, slightly altering the main research question to “how do novel gaming interfaces compare as a humanoid interface?”. The research on the Kinect was done in an unorthodox way: mainly because it was a completely new interaction device, almost no published work was available on similar systems, especially on humanoid robot interaction. This led to an interesting, partially web-supported research effort, where the main discussions occurred through forums and technology blogs; this option also created a need to constantly reconfirm the knowledge acquired.

6.2 Contributions

The contributions of the work developed can be grouped into two categories: a practical contribution, translated into the software developed, and a theoretical contribution, represented in this dissertation by the results of the interaction tests and their conclusions.

The interaction software was developed with the user tests as its main focus, but also with reusability for other purposes in mind. There are several possible improvements, better understood by the end of this work, but even with those issues still unresolved the software has already been used on several different occasions. One of the most recurrent uses of the interaction systems has been as a presentation of the iCub's capabilities, and the reception of these presentations has been quite positive. The presentation was also successfully ported to the Vizzy robot, with both the Kinect interaction system and the Wiimote interaction system. In addition to this demonstration use, students using the iCub for development have already asked a few times to use the interaction system as a tool for altering the robot's posture, due to its simplicity; this saves students time otherwise spent on tasks unrelated to their work, since repositioning the robot is typically done through the GUI interface, which, as the test results show, is the slowest of the systems at performing tasks. The Kinect interface has also been used successfully with other projects1, enabling the grasping of a distant object with a data glove, and these first partnerships might lead to other, more interesting works. The software developed will be made available to the iCub community through the iCub and Yarp repositories, to be used in other systems.

1 http://youtu.be/xsUrt9ccGFs

The tests made, their results, and the research done helped in understanding users' feelings about these types of interaction. This understanding can be used in other works as a starting point for developing new systems; both straightforward interaction systems and systems where user intervention might be needed can benefit from it. For the Wiimote and the Kinect this was a start towards a good interaction system using inexpensive devices; particularly in the case of the Kinect, this work explored a brand-new device and some of the users' reactions to it.

6.3 Thesis statement

Through the work developed it was possible to understand several important facts from the comparison of these systems.

Known interfaces are better. Although the GUI, the most common interface, was not more successful than the others, users felt better using it, mostly because they knew from the start what it would do, and did not have to rely on a good understanding of the concept, because the concept was already known. The D-Pad control was the most successful precisely because of that: users used it knowing what was going to happen, and what they expected did happen, so they were comfortable enough to trust their own judgment of what the result of the interaction might be. This trust was shown to fail during the Kinect motor skeleton test, where users, although they understood the concept, were distrustful of what the reaction of the robot might be, taking more time in the first step to check whether the interaction corresponded to their expectations. This will be solved when the technology enters our daily lives; users will then already have an idea of what the logical reaction of a robot to such an interface might be, and will probably trust it more.

Different interfaces serve different users. There were big differences among users in the test results; for some users it was possible to see that they adapted better to a certain interface than to others. For example, one of the users was able, during one of the tests, to get a clear idea of how a certain interface worked, and that interface became the fastest one in that user's tests; yet in another test, which might be considered easier, the same user had a slower result than a user who had not been able to understand the concept of the previous test. Users adapted better to whatever concept they understood better, and understanding a concept may depend on the type of knowledge the user has: for example, understanding the concept of the Wiimote motor control involved understanding that the robot is composed of several motors and that the movement of each motor may change the position of other motors.

Users expect clear feedback. Although the new interaction devices tend to be simpler, they do not give clear feedback; for that, some form of visual feedback is expected. In this case the feedback was the robot's reaction, which was not clear enough because those reactions were most of the time very complex; a simple graphical feedback was suggested by many as a good solution. What counts as “clear”, though, also depends on the user.

Directly comparing typical and novel devices, one can assume that, although certain conceptual lines must be followed, it is possible to execute a much more complex interaction using the novel devices available; still, the typical interaction system is preferred by users.

6.4 Future work

There are many details left open by this thesis that are interesting points for continuation. We point out three that we found most interesting.

The tests would benefit from being done not only to compare interaction devices but also people's reactions to them; this could be studied by running tests with people of different backgrounds and particularly of different ages. During this work, older users showed a very different response from the younger users; the question of how child users would interact with the humanoid through these devices remains open.

The exploration of possible forms of feedback while controlling a humanoid robot is also an interesting study. The humanoid robot should warn users when something is wrong, or indicate its current state; this can be communicated to the user through a graphical interface, although having the robot itself communicate a problem while being controlled through interaction might be a good and useful solution.

Blending interactive control through these novel devices with robot understanding is the third point. If an interaction device, instead of controlling the robot directly, could suggest actions to the robot, that would probably contribute to a better interaction result. An example would be grabbing a mug: if a user, by mimicking the grabbing of a certain object, could make the robot understand the intention to grab that object while continuously controlling the robot, control could be shared, enabling a better chance of success.


Appendix A

Related Work Appendix

A.1 Wiimote manual


[Reproduction of pages 6-7 (“Components”) of the Wii Remote instruction manual, showing the Wii Remote with its Power, A, B, 1, 2, Minus (–), Plus (+), and HOME Buttons, +Control Pad, Pointer Lens, Speaker, Player LEDs, wrist strap, battery cover, SYNCRO Button, and External Extension Connector, and the Wii MotionPlus accessory with its connector plug, lock mechanism, connector cover, and jacket. The manual notes that Wii MotionPlus features require game software that supports them, and that the Wii Remote still functions normally, with Wii MotionPlus attached, in games that do not.]


Appendix B

Evaluation Appendix

B.1 Tests presentation


Chico Interfaces Tests

Who is Chico?
• Chico is a humanoid robot.
• Part of a European project.
• Used in 20 laboratories around the world.
• Inspired by a child.

What is the point of this?
• Finishing my thesis work.
• Development of a simple interface for Chico.
• Helping people to use robots in an easy way.
• Tests to understand:
– How the different interfaces compare.
– How we can make them better.

How does it work?
• Two control systems:
– Motor.
– Kinematic.
• Computer GUI.
• Kinect sensor interface.
• Wii remote interface:
– Angular movement (pitch/yaw/roll).
– Buttons.
– Motion Plus.

Tasks to perform
• Before starting each task, try the interface out.
• Five tasks to perform (calmly):
1. Play until you feel comfortable.
2. Move up and down.
3. Move left and right.
4. Touch all the objects in the predefined order (1,2,3,4).
5. Touch some of the objects in the predefined order, with an obstacle (4,1,2).

Computer Control
• Using a graphical interface to control Chico.
• Drag the sliders to move.
• Motors to control: X, Y, Z.
• Be careful so the robot doesn't hit itself.
• Please do complain and ask for help.


Motor System
• Metaphor: holding a part of the arm.
• Each angular movement corresponds to a different joint motor.
• Buttons 1 and 2 change the motors you control.
• Button B starts moving the robot.
• The robot only moves while you are moving.
• Please do complain and ask for help.

Wiimote Kinematic System
• Metaphor: Chico follows a virtual point.
• Buttons:
– 1 selects position control.
– 2 selects orientation control.
– +/- vary the distance of the hand from the body.
– B starts moving the robot.
• The robot only moves while you are moving.
• Move slowly.
• Please do complain and ask for help.
• Hand orientation extra task:
– Try to put the hand in different orientations.
– Palm up, down, right, left.
– Wave.

Cursor Control
• Chico's hand moves according to the cursor.
• The cursor controls the hand horizontally.
• +/- controls the hand vertically.
• Please do complain and ask for help.


Kinect Sensor
• The Wiimote controls when the Kinect is on or off.
• This device has no buttons: you move your body, the robot follows you.
• Hand tracker and skeleton tracker.
• Extra task: try to imitate three poses.

Questionnaire
• Please answer the questionnaire, and ask for help.

THANK YOU


B.2 Questionnaire form


iCub Interface 1/2 Sair deste inquérito

1.

*

1. Personal

Age

2. Profession

3. How much do you use computers?

4. How much have you used the Wii Remote before?

5. How much have you used the Kinect before?

6. Were the initial descriptions of the interaction methods clear?

7. Did the interface do what you expect? (Never / Sometimes / Most of the times / Always)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

8. What didn't you like about the interaction?

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

9. Were you worried about operating the robot through this interface (about breaking the robot or making errors)? (Very worried / Worried / Concentrated / Relaxed / No problem!)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control


iCub Interface 2/2


1. How did you like the experience? (Hated it / Boring / Indifferent / Interesting / Loved it)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

2. Did you feel disoriented (feel little control over the robot)? (Was I controlling the robot? / Sometimes it moved as I wanted / It seemed to work as I wanted / It was very close to what I wanted / It can read my thoughts)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

3. How concentrated did you need to be (to control the robot properly)? (Very concentrated / Concentrated / I needed to think a lot before doing something / I needed to be looking at the robot / I didn't even need to look at the robot)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

4. Which input method was the most precise? (No precision at all / Precise if you're lucky / Depends on the task / Close to the exact points / This could be used in surgeries)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

5. Which input method was the most intuitive? (I did not understand how to use it / It was hard to understand how to use it / I needed only a brief explanation before starting / I could learn how to use it without any explanation / I understood how to use it before I used it)

Task: (Computer) Joint control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control


Task: (Kinect) Skeleton control

6. Which input method was the best for you (for the task you had to do)? (Rank: 1st / 2nd / 3rd / 4th / 5th / 6th)

Task: (Computer) Kinematic control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

7. Which of the input methods did you prefer? (not which one is the best, but which one you liked using more) (Rank: 1st / 2nd / 3rd / 4th / 5th / 6th)

Task: (Computer) Kinematic control

Task: (Wiimote) Joint control

Task: (Wiimote) Kinematic control

Task: (Wiimote) D-Pad control

Task: (Kinect) Hand control

Task: (Kinect) Skeleton control

8. How did you feel about the interface in the extra tasks? (Very weak / Weak / Normal / Good / Very good)

Task: (Wiimote) Hand orientation

Task: (Kinect) Photographs pose task

9. All comments and opinions are welcome :)


B.3 Questionnaire results


Questionnaire results summary. Each cell is the percentage of respondents
choosing that option; the "Average" columns are weighted means of the
answer scale (a worked check follows the tables).

Did the interface do what you expect?
(Scale 1-4: Never / Sometimes / Most of the times / Always)
                        1        2        3        4
GUI                 0.00%   42.90%   28.60%   28.60%
Wiimote motor       0.00%   28.60%   71.40%    0.00%
Wiimote kinematic  14.30%   28.60%   42.90%   14.30%
Wiimote cursor      0.00%   16.70%   16.70%   66.70%
Kinect hand         0.00%   33.30%   50.00%   16.70%
Kinect skeleton     0.00%   14.30%   42.90%   42.90%

Were you worried about operating the robot through this interface (about
breaking the robot or making errors)?
(Scale 1-5: Very worried / Worried / Concentrated / Relaxed / No problem!)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%   57.10%   28.60%   14.30%    0.00%            2.57
Wiimote motor       0.00%   28.60%   71.40%    0.00%    0.00%            2.71
Wiimote kinematic  14.30%   42.90%   14.30%   28.60%    0.00%            2.57
Wiimote cursor     16.70%   16.70%   33.30%   16.70%   16.70%            3.00
Kinect hand        16.70%   16.70%   50.00%    0.00%   16.70%            2.83
Kinect skeleton    28.60%   14.30%   28.60%   14.30%   14.30%            2.71

How did you like the experience?
(Scale 1-5: Hated it / Boring / Indifferent / Interesting / Loved it)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%   16.70%    0.00%   50.00%   33.30%            4.00
Wiimote motor       0.00%    0.00%    0.00%  100.00%    0.00%            4.00
Wiimote kinematic  20.00%    0.00%   20.00%   40.00%   20.00%            3.40
Wiimote cursor      0.00%    0.00%   20.00%   20.00%   60.00%            4.40
Kinect hand         0.00%   20.00%   20.00%   40.00%   20.00%            3.60
Kinect skeleton     0.00%    0.00%    0.00%   33.30%   66.70%            4.67

Did you feel disoriented (feel little control over the robot)?
(Scale 1-5: Was I controlling the robot? / Sometimes it moved as I wanted /
It seemed to work as I wanted / It was very close to what I wanted /
It can read my thoughts)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%    0.00%   33.30%   50.00%   16.70%            3.83
Wiimote motor       0.00%   33.30%   50.00%   16.70%    0.00%            2.83
Wiimote kinematic  16.70%   16.70%   16.70%   50.00%    0.00%            3.00
Wiimote cursor      0.00%    0.00%   20.00%   20.00%   60.00%            4.40
Kinect hand         0.00%    0.00%   60.00%   20.00%   20.00%            3.60
Kinect skeleton     0.00%    0.00%   50.00%    0.00%   50.00%            4.00

How concentrated did you need to be (to control the robot properly)?
(Scale 1-5: Very concentrated / Concentrated / I needed to think a lot
before doing something / I needed to be looking at the robot / I didn't
even need to look at the robot)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%   66.70%   16.70%   16.70%    0.00%            2.50
Wiimote motor      16.70%   50.00%   16.70%   16.70%    0.00%            2.33
Wiimote kinematic  33.30%   50.00%   16.70%    0.00%    0.00%            1.83
Wiimote cursor      0.00%   40.00%    0.00%   40.00%   20.00%            3.40
Kinect hand        20.00%   40.00%   20.00%    0.00%   20.00%            2.60
Kinect skeleton    16.70%   33.30%   16.70%   33.30%    0.00%            2.67

Which input method was the most precise?
(Scale 1-5: No precision at all / Precise if you're lucky / Depends on the
task / Close to the exact points / This could be used in surgeries)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%    0.00%   33.30%   66.70%    0.00%            3.67
Wiimote motor       0.00%    0.00%   50.00%   50.00%    0.00%            3.50
Wiimote kinematic  33.30%    0.00%    0.00%   66.70%    0.00%            3.00
Wiimote cursor      0.00%    0.00%    0.00%   60.00%   40.00%            4.40
Kinect hand         0.00%    0.00%   40.00%   40.00%   20.00%            3.80
Kinect skeleton     0.00%    0.00%   33.30%   33.30%   33.30%            4.00

Which input method was the most intuitive?
(Scale 1-5: I did not understand how to use it / It was hard to understand
how to use it / I needed only a brief explanation before starting / I could
learn how to use it without any explanation / I understood how to use it
before I used it)
                        1        2        3        4        5   Average (1-5)
GUI                 0.00%    0.00%   83.30%   16.70%    0.00%            3.17
Wiimote motor       0.00%   33.30%   50.00%   16.70%    0.00%            2.83
Wiimote kinematic  33.30%    0.00%   33.30%   33.30%    0.00%            2.67
Wiimote cursor      0.00%    0.00%   40.00%   20.00%   40.00%            4.00
Kinect hand         0.00%    0.00%  100.00%    0.00%    0.00%            3.00
Kinect skeleton     0.00%    0.00%   50.00%   33.30%   16.70%            3.67

Which input method was the best for you (for the task you had to do)?
                      1st      2nd      3rd      4th      5th      6th   Average (1-6)
GUI                 0.00%   20.00%   20.00%   20.00%   40.00%    0.00%            3.20
Wiimote motor       0.00%   16.70%   33.30%   16.70%   16.70%   16.70%            3.17
Wiimote kinematic   0.00%   16.70%    0.00%   33.30%   33.30%   16.70%            2.67
Wiimote cursor     60.00%   20.00%   20.00%    0.00%    0.00%    0.00%            5.40
Kinect hand         0.00%   20.00%   20.00%   40.00%   20.00%    0.00%            3.40
Kinect skeleton    50.00%   16.70%   16.70%    0.00%    0.00%   16.70%            4.67

Which of the input methods did you prefer? (not which one is the best, but
which one you liked using more)
                      1st      2nd      3rd      4th      5th      6th   Average (1-6)
GUI                 0.00%   20.00%   20.00%   20.00%   20.00%   20.00%            3.00
Wiimote motor       0.00%    0.00%   20.00%    0.00%   60.00%   20.00%            2.20
Wiimote kinematic   0.00%   20.00%   20.00%   20.00%   20.00%   20.00%            3.00
Wiimote cursor     25.00%   50.00%    0.00%   25.00%    0.00%    0.00%            4.75
Kinect hand        50.00%    0.00%   25.00%   25.00%    0.00%    0.00%            4.75
Kinect skeleton    40.00%   20.00%   20.00%   20.00%    0.00%    0.00%            4.80

How did you feel about the interface in the extra tasks?
(Scale 1-5: Very weak / Weak / Normal / Good / Very good)
                                         1        2        3        4        5   Average (1-5)
Task: (Wiimote) Hand orientation     0.00%    0.00%   25.00%   75.00%    0.00%            3.75
Task: (Kinect) Pose task             0.00%    0.00%   25.00%   25.00%   50.00%            4.25
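The "Average" columns above are consistent with a weighted mean over the answer scale: for the 1-5 scales the leftmost option scores 1 and the rightmost scores 5, and for the two ranking questions a 1st place scores 6 down to 1 for a 6th place. This scoring convention is inferred from the published numbers rather than stated in the results; a minimal check in Python:

    # Recompute two "Average" cells from the response shares above.
    def weighted_mean(shares, scores):
        """shares: fraction of answers per option; scores: option values."""
        return sum(p * s for p, s in zip(shares, scores))

    # "Were you worried...", GUI row, 1-5 scale (Very worried=1 ... No problem!=5):
    print(round(weighted_mean([0.0, 0.571, 0.286, 0.143, 0.0], [1, 2, 3, 4, 5]), 2))  # 2.57

    # "Which input method was the best...", GUI row, ranks scored 1st=6 ... 6th=1:
    print(round(weighted_mean([0.0, 0.20, 0.20, 0.20, 0.40, 0.0], [6, 5, 4, 3, 2, 1]), 2))  # 3.2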

[Figure: stacked-bar charts of the answer distributions per interface (GUI, Wiimote motor, Wiimote kinematic, Wiimote cursor, Kinect hand, Kinect skeleton) for the questions "Did the interface do what you expect?", "Were you worried about operating the robot through this interface (about breaking the robot or making errors)?", "How did you like the experience?", "Did you feel disoriented (feel little control over the robot)?" and "Which input method was the most precise?"]

[Figure: bar charts of the average rating (1-6) per control type for "Which input method was the best for you (for the task you had to do)?" and "Which of the input methods did you prefer? (not which one is the best, but which one you liked using more)"]
