
GESTURE-BASED HEAD-MOUNTED AUGMENTED REALITY GAME DEVELOPMENT USING LEAP MOTION AND USABILITY EVALUATION

Pu-Hsuan Chien and Yang-Cheng Lin
Department of Industrial Design, National Cheng Kung University
No. 1, University Road, Tainan, Taiwan

ABSTRACT

The augmented reality environment created by head-mounted displays can provide players with an immersive gaming experience, and appropriate gestures can integrate real-world objects with the virtual environment. This study explores the usability of gesture operations applied to games and seeks new applications for current, mature gesture sensors. We developed a head-mounted augmented reality game based on gesture recognition and evaluated its usability and its impact on the user's interaction with the virtual interface. The game, an adventure game containing first-person shooter elements, was built with the Unity engine and the open-source glasses designed by Leap Motion. No joystick or touch screen is needed to control the game: it relies only on the player's hands, operating through a controller-less interface. The results show a System Usability Scale score of 79.4, which indicates a good level of usability. This study may help clarify problems that can be encountered in the development of gesture-interactive head-mounted display augmented reality games. Finally, it offers suggestions on the challenges and future of head-mounted augmented reality games.

KEYWORDS

Natural User Interface, Gesture, Leap Motion, Head-Mounted Displays, Augmented Reality Game

1. INTRODUCTION

Games are a daily pastime for many people, and different ways of playing can affect the mental state of players (Shin, 2019). In the last decade, human-computer interaction methods ranging from traditional keyboards to motion sensors have provided new ways to interact, and as this technology advances, games can offer players an increasingly immersive experience. Head-mounted displays (HMDs) in particular can give players an immersive gaming experience. The technology behind head-mounted virtual reality (VR) games is mature, with several devices on the market. However, HMD technology is not yet used as much in augmented reality (AR) games; currently, most AR games run on mobile platforms and often require holding a phone to play. In recent years there has been increasing research interest in AR (Gugenheimer et al., 2019), and most research on the use of HMDs in AR games has focused either on solving technical problems (Hua, 2017; Xiao & Benko, 2016) or on improving input methods (Xu et al., 2019). In this study, a head-mounted AR game controlled entirely with the hands was developed to bring players a new gaming experience.

There are many gesture recognition sensors available; among them, Microsoft's Kinect and Leap Motion have attracted much related research (Cabreira & Hwang, 2015; Vokorokos et al., 2016). Leap Motion allows users to control applications with gestures and is currently widely used in sign language recognition, rehabilitation, and interactive game development (Khademi et al., 2014; Potter et al., 2013). To realize gesture recognition and AR at the same time, we use the open-source AR HMD glasses released by Leap Motion in 2018. Most of the components are 3D printed, so the price is lower than that of other head-mounted AR glasses. This approach breaks away from the original form of operation and broadens the applications of AR, opening up more possibilities for human-computer interaction. Unlike the usual Leap Motion setup, the sensor is placed on top of the head, so it can effectively recognize gestures while the user moves.


To study the potential of this device, we developed an AR game, considering the player's operation process and the game interface. The major purpose of this study was to assess the usability of this kind of prototype game, so participants were asked to rate its usability. Finally, we raise some problems that may be encountered in the development of this type of game.

2. RELATED WORK

The game development in this study involves many areas, such as the choice of gesture sensor, how gestures are used in games, and the game interface design. We consider that the application of gestures to game operation has not been fully studied, so there was limited literature to draw on in developing the game.

2.1 Gesture Sensors

To date, there have been many studies on the application of gesture sensors, and previous studies have compared the advantages and disadvantages of these sensors across various fields. For example, Cabreira and Hwang conducted experiments on three sensors and analyzed 250 applications, of which more than half were games. Their analysis of 15 gestures on each platform indicated that Kinect is more limited than the other devices for finger and hand tracking (Cabreira & Hwang, 2015). Leap Motion recognized a wider range of gestures. The most commonly used gestures for Myo were rotation and waving, which could be used consistently across the various programs. Vokorokos et al. (2016) also conducted experiments on three sensors, using pointing, waving, rotating, and fist gestures to determine the gesture accuracy of the devices, with many tests conducted on each platform for each movement. Kinect performed well in waving, while Leap Motion performed well in pointing, rotating, and punching; Leap Motion was fast and intuitive, but its sensing distance was limited. Myo also performed very well in rotation, with the advantage that users can lower their hands and remain comfortable; because it relies on muscle detection, motion can be detected even if the hands are covered or the device is in a pocket. Based on the above research, Kinect is very smooth for large-scale gestures such as waving, but less accurate for other gestures; its advantages are that it can detect the face and the full-body skeleton and that its detection range is wider. Research on AR applications of Leap Motion is far more advanced than that of other manufacturers (Katahira & Soga, 2015; Wozniak et al., 2016). Leap Motion provides open-source AR HMD glasses in which the sensor is placed on the head-mounted device itself, which overcomes the original sensing-distance limitation, but no research on these glasses has yet appeared. Based on the above research and viewpoints, and because of Leap Motion's good accuracy and availability for purchase, this study uses Leap Motion as the sensing device for the AR glasses. Although its sensing range is somewhat limited, mounting the Leap Motion device on top of the HMD effectively addresses this issue.

2.2 Gesture Interaction Applied to Games

Existing research on the application of gestures in games can, based on the recognition method, be broadly categorized into two types: visual recognition and motion sensor recognition. Visual recognition primarily relies on Kinect, Leap Motion, and mobile phone cameras for gesture analysis (Pirker et al., 2017; Yeo et al., 2015), while motion sensor recognition uses six-axis sensors or electromyographic signals and usually requires the user to wear additional devices (Esfahlani et al., 2018; Lee et al., 2017). Regarding the use of gesture interactions in games, Silpasuwanchai and Ren (2015) held that not all operational methods suit all players. They therefore used user feedback and user elicitation to identify the actions players felt were most intuitive, and found that people experience games differently and act differently, but that some gestures were agreed upon; for example, the preferred gesture for shooting was drawing a pistol with one hand.


Several studies have investigated the application of Leap Motion in the medical, entertainment, and educational fields, such as two-handed drone operation (Fernandez et al., 2016) and hand rehabilitation exercises (Alimanova et al., 2017). These studies point to the potential of Leap Motion for natural user interfaces; however, little research has been done on in-game applications. Overall, we believe Leap Motion has great potential in gaming and entertainment, and we use it with an AR HMD to design and develop a game, which should help researchers study human-computer interaction in related game fields in the future.

2.3 Diegetic Interface

When only gestures are used, the interface plays a primary role in games, and previous studies have pointed out that interface design affects the player's immersion (Brown & Cairns, 2004; Qin et al., 2009). The word "diegesis" was originally a film term referring to the narrative world of a story. Galloway applied this concept to games: operations belonging to the game's narrative world are regarded as diegetic, while non-diegetic refers to what lies outside that world. He also distinguished between the behavior of the player and that of the character, dividing player behavior into diegetic and non-diegetic elements, so interfaces can be classified according to the behavior they correspond to. The heads-up display (HUD) is mainly for the player to see, and is thus an interface superimposed on the game screen (Galloway, 2006).

Later writers have voiced the same view. Peacocke et al. studied the user interface of a first-person shooter game based on Galloway's theory, comparing players' preferences and immersion between diegetic and non-diegetic displays of the remaining ammunition, using a 5-point Likert scale to evaluate immersion. They concluded that a digital ammunition display works best as a diegetic interface (Peacocke et al., 2015). Fagerholt and Lorentzon offered a sounder theoretical basis for interface design: they analyzed and classified first-person shooter game interfaces, established appropriate terminology, and provided design guidelines for increasing game immersion. However, they held that completing these guidelines still requires a game prototype for evaluation (Fagerholt & Lorentzon, 2009). Because the game in this study is not a pure first-person shooter, we primarily divided the interface into diegetic and non-diegetic interfaces and applied this distinction to our design.

3. GAME DESIGN AND DEVELOPMENT

Because this study uses HMDs, we think the format is very suitable for first-person games. To let players become more integrated into the game, we created a story and relatable characters, so the result can be regarded as an adventure game with first-person shooting elements. The player acts as an adventurer who must help the good dinosaurs defeat the evil dinosaurs within the time allotted for the mission.

The task of the game is to defeat the subordinates of the Demon Ass Dragon King. We created three subordinates with different attributes, and the player must switch to an attack of the corresponding attribute to damage a target. At the same time, the player must remember to replenish their mana, or they will be unable to attack. Play continues until either the time expires or all monsters are defeated, at which point the game ends.

The gameplay differs from typical games: because there is no controller, all operations are done by hand. The game is operated as follows (a sketch of how such a gesture can be detected appears after this list):

- Confirm/Next step: thumbs up with the right hand (Figure 1a).
- Shoot: make a pistol gesture with the right hand (Figure 1b).
- Select: toggle interface buttons with a finger (Figure 1c).
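As an illustration of how the confirm gesture could be recognized, the following is a minimal sketch written against the Leap Motion Unity Core Assets (the `Leap` and `Leap.Unity` namespaces; newer Ultraleap plugins return Unity vectors directly). The class name, threshold, and `OnConfirm` event are our own illustrative assumptions; the paper does not publish its actual source code.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch (not the paper's code): fire OnConfirm once when the
// right hand makes a thumbs-up -- thumb extended and pointing up, with the
// other fingers curled.
public class ThumbsUpConfirm : MonoBehaviour
{
    public LeapProvider provider;        // a LeapServiceProvider in the scene
    public UnityEvent OnConfirm;         // e.g., advance to the next step
    public float upDotThreshold = 0.8f;  // assumed tolerance for "pointing up"

    private bool wasThumbsUp;            // edge detection: one gesture, one event

    void Update()
    {
        bool thumbsUp = false;
        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            if (!hand.IsRight) continue; // confirm uses the right hand

            bool othersCurled = !hand.GetIndex().IsExtended &&
                                !hand.GetMiddle().IsExtended &&
                                !hand.GetRing().IsExtended &&
                                !hand.GetPinky().IsExtended;
            float upDot = Vector3.Dot(hand.GetThumb().Direction.ToVector3(),
                                      Vector3.up);
            thumbsUp = hand.GetThumb().IsExtended && othersCurled
                       && upDot > upDotThreshold;
        }
        if (thumbsUp && !wasThumbsUp) OnConfirm.Invoke();
        wasThumbsUp = thumbsUp;
    }
}
```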


Figure 1. (a) Confirm, (b) Shoot, (c) Select the menu

According to Galloway's theory, interfaces can be divided into diegetic and non-diegetic interfaces for characters and players. However, there is no relevant research on interfaces for first-person head-mounted AR games. Therefore, this study conducted semi-structured expert interviews to gather professional knowledge about game interfaces appropriate for the player's operation.

We interviewed Associate Professor Hu Minjun, an expert in computer vision and AR/VR. The interview was intended to clarify (1) the current interface problems of HMD games, (2) the way players operate games and the priority of visual communication, and (3) the advantages and disadvantages of diegetic versus non-diegetic interfaces and suggestions for applying them.

After compiling the interview data, the following conclusions were drawn. First, detection sensitivity must be kept in mind: it is recommended not to use overly complex gestures in AR or VR. Second, visual communication is more important than the method of operation, because good design can communicate the interaction and make up for the player's lack of experience. Finally, the diegetic interface gives players higher interactivity: if the player is to experience the game as a first-person character, the diegetic interface is very important, with the non-diegetic interface assisting it, and the two should be combined as the situation requires. Therefore, we decided to use the diegetic interface as the main interface to minimize the HUD on the screen.

The interface plan is as follows:

- Time: diegetic interface (Figure 2).
- Health bar and mana: diegetic interface (Figure 2).
- Menu: diegetic interface (Figure 1c).

Figure 2. Health bar, mana, and time interface

Thus, the left hand mainly displays the interface, and the right hand is used for attacking and selecting; a sketch of attaching the interface to the left hand follows below.
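To illustrate one way a hand-attached diegetic panel can be implemented, the sketch below pins a world-space Unity canvas to the tracked left palm each frame. It is a minimal sketch under the same SDK assumptions as above; the hover offset and component wiring are illustrative, not the paper's actual implementation.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Illustrative sketch (not the paper's code): keep a world-space UI canvas
// (health bar, mana, time) hovering above the player's left palm.
public class PalmMenuAnchor : MonoBehaviour
{
    public LeapProvider provider;     // a LeapServiceProvider in the scene
    public Transform panel;           // the world-space Canvas to move
    public float hoverHeight = 0.08f; // meters above the palm (assumed value)

    void LateUpdate()
    {
        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            if (!hand.IsLeft) continue;  // the left hand carries the interface

            Vector3 palm = hand.PalmPosition.ToVector3();
            Vector3 normal = hand.PalmNormal.ToVector3();

            // Hover the panel along the palm normal and orient it with the
            // palm; flip the sign if your canvas faces the other way.
            panel.position = palm + normal * hoverHeight;
            panel.rotation = Quaternion.LookRotation(-normal,
                                                     hand.Direction.ToVector3());
            return;
        }
        // No left hand tracked this frame: the panel simply stays put;
        // hiding it here would also be reasonable.
    }
}
```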

After the game starts, a non-player character guides the player into the game, through its story, and to the operation instructions. The thumb is used to attack monsters, and a designated button switches the attack attribute. When mana is exhausted, it must be replenished as soon as possible, or the player will be unable to attack. Players need to complete the task within the allotted time, and completing it sooner yields a higher score.

ISBN: 978-989-8704-31-3 © 2021

152

Page 5: GESTURE-BASED HEAD-MOUNTED AUGMENTED REALITY GAME

Because the game runs with HMDs and Leap Motion, we first had to import several software development kit files into Unity. We followed Takahashi's article to build the game environment, including hand calibration and adjustment of the position of the glasses' window, so that the player's virtual hand in Unity matches the actual hand position during the game (a sketch of the head-mounted tracking setup follows).
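For context on what the head-mounted setup involves at the code level, the Leap SDK exposes a policy flag that tells the tracking service the sensor is worn on an HMD rather than lying on a desk. The explicit call below is our own illustration; recent Unity SDK versions apply this through the provider component, and the calibration steps in Takahashi's article go further than this.

```csharp
using Leap;
using UnityEngine;

// Illustrative sketch: request head-mounted tracking optimization, since the
// sensor faces outward from the forehead instead of upward from a table.
public class HmdTrackingSetup : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        controller = new Controller();
        controller.SetPolicy(Controller.PolicyFlag.POLICY_OPTIMIZE_HMD);
    }

    void OnDestroy()
    {
        controller.ClearPolicy(Controller.PolicyFlag.POLICY_OPTIMIZE_HMD);
    }
}
```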

The game also needs related materials such as characters and interfaces. The 3D character models, their skeletons, and the animations were all created in Blender, with materials and rendering done in Unity. The special effects were drawn in Illustrator, with post-production in Photoshop, and the particle effects were made in Unity. Finally, once the program was created, the game detects the player's hands, with the left hand for the panel and the right hand for attacking. We use Leap Motion to detect the hand skeleton; when the distance between the tips of the thumb and index finger drops below a set value, an attack is launched, as sketched below.
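The attack check described above reduces to a thumb-index fingertip distance test. Below is a minimal sketch under the same SDK assumptions as before; the threshold value, the mana bookkeeping, and the `FireProjectile` stub are illustrative stand-ins for the game's unpublished code.

```csharp
using Leap;
using Leap.Unity;
using UnityEngine;

// Illustrative sketch (not the paper's code): launch an attack when the
// right hand's thumb tip and index fingertip come closer than a threshold,
// consuming one mana point per shot.
public class PinchShooter : MonoBehaviour
{
    public LeapProvider provider;
    public float pinchThreshold = 0.03f;  // meters; assumed tuning value
    public int mana = 8;                  // players start with eight points

    private bool wasPinching;             // edge detection: one pinch, one shot

    void Update()
    {
        foreach (Hand hand in provider.CurrentFrame.Hands)
        {
            if (!hand.IsRight) continue;  // attacks use the right hand

            Vector3 thumbTip = hand.GetThumb().TipPosition.ToVector3();
            Vector3 indexTip = hand.GetIndex().TipPosition.ToVector3();
            bool isPinching =
                Vector3.Distance(thumbTip, indexTip) < pinchThreshold;

            // Fire only on the frame the pinch closes, and only with mana left.
            if (isPinching && !wasPinching && mana > 0)
            {
                mana--;                   // when empty, replenish via the menu
                FireProjectile(indexTip, hand.Direction.ToVector3());
            }
            wasPinching = isPinching;
        }
    }

    void FireProjectile(Vector3 origin, Vector3 direction)
    {
        // Spawn the projectile for the currently selected attribute here.
        Debug.Log($"Attack from {origin} toward {direction}; mana left: {mana}");
    }
}
```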

Figure 3. Game development screen: a thunder-attribute attack on monsters

4. METHOD

This research presents a relatively novel way of gaming, and Leap Motion's input method differs greatly from traditional operation; most people have never used this type of input device. To understand the future usability of the device and how players feel about the game, we evaluated ease of use with the System Usability Scale (SUS) and conducted semi-structured interviews to analyze subjective player feedback.

4.1 Participants

We recruited eight participants (four males and four females) ranging in age from 24 to 31 years (M = 24.8, SD = 1.6). All were right-handed. Regarding gaming experience, all participants had experience with VR games, five had experience with AR games, and all played games regularly. However, none had ever played a game using gesture controls.

4.2 Apparatus and Material

This research used the AR HMD Project North Star kit released by Leap Motion in 2018 (Figure 4). The headset is 3D printed and equipped with special polycarbonate lenses and two screens with a resolution of 1440 × 1600 pixels and a refresh rate of 120 Hz. The screen image is reflected off the lenses so the user sees the real and virtual images at the same time. In addition, we combined headphones with the HMD so players could hear the game's sounds. The game was developed from scratch in Unity rather than by modifying an existing game, because existing games usually require a controller and could not run on our equipment. The interface likewise had to be developed by ourselves; it was designed based on the interview results and with reference to videos released by Leap Motion.


Figure 4. Leap Motion AR HMD with Project North Star kit

4.3 Procedure

Participants were instructed to play the game to the best of their ability. Because the game contains instructions, we did not teach the participants separately; the aim was for players to play smoothly just by following the in-game instructions. After the game started, NPCs explained the plot, operation methods, and tasks to the players. Monsters had different attributes, and players needed to attack them with the corresponding attributes. All players had eight points of mana, and each attack consumed one point; once mana was exhausted, the player needed to click on the interface to replenish it. The game ended after all the monsters were killed. After completing the task, the players filled in the SUS to evaluate the game's usability subjectively, and we conducted semi-structured interviews to obtain more in-depth, specific suggestions.

Figure 5. The process of players operating the game

5. RESULTS AND DISCUSSION

The participants were asked to fill out the SUS. The SUS score is divided into five grades: A, 90-100 points; B, 80-89 points; C, 70-79 points; D, 60-69 points; and F, 0-59 points. A system is generally considered acceptable if it scores above 68 points. The SUS was originally intended to evaluate system usability, but it can also be used to evaluate websites and games; therefore, the word "system" in the questions was replaced by "game." The results show that our game's average SUS score was 79.4 points, which falls in grade C and means the usability is acceptable. The questions are listed in Table 1; items 2, 4, 6, 8, and 10 are reverse-scored.


Table 1. SUS questions and scores (STDev in parentheses)

1. I think that I would like to play this game frequently. 8.4 (0.52)
2. I found the game unnecessarily complex. 6.9 (0.72)
3. I thought the game was easy to use. 7.8 (0.64)
4. I think that I would need the support of a technical person to be able to play this game. 7.2 (0.83)
5. I found the various functions in this game were well integrated. 7.2 (0.64)
6. I thought there was too much inconsistency in this game. 9.1 (0.52)
7. I would imagine that most people would learn to play this game very quickly. 8.8 (0.53)
8. I found the game very cumbersome to play. 8.1 (0.89)
9. I felt very confident playing the game. 6.6 (0.74)
10. I needed to learn a lot of things before I could get going with this game. 9.4 (0.46)
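For reference, the standard SUS computation behind these numbers: on the 1-5 Likert scale, each odd-numbered item contributes (response - 1) and each even-numbered, reverse-scored item contributes (5 - response); the summed contributions are multiplied by 2.5 to give a 0-100 score. The per-item values in Table 1 appear to be these contributions rescaled to 0-10, which is why they sum to approximately the overall 79.4. A brief sketch follows; the sample responses are made up for illustration, not participant data.

```csharp
using System;
using System.Linq;

// Standard SUS scoring: odd items contribute (r - 1), even (reverse) items
// contribute (5 - r); the contribution sum is scaled by 2.5 onto 0-100.
static class SusScore
{
    static double Compute(int[] responses)  // ten answers, each in 1..5
    {
        if (responses.Length != 10)
            throw new ArgumentException("SUS has exactly 10 items.");
        int sum = responses
            .Select((r, i) => i % 2 == 0 ? r - 1 : 5 - r)  // i = 0 is item 1
            .Sum();
        return sum * 2.5;
    }

    static void Main()
    {
        // Made-up example response sheet, not actual participant data.
        int[] example = { 4, 2, 4, 2, 4, 1, 5, 2, 4, 1 };
        Console.WriteLine(Compute(example));  // prints 82.5
    }
}
```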

The weighted means of each question are presented in Figure 6. All players thought that the rules of the game were easy and that they did not need to learn many things. Question 9 had the lowest score, possibly because the operation was sometimes inaccurate, which lowered players' confidence.

In the post-experiment interviews, the responses were generally positive. Some players found the game very interesting because they could see their own hands operating it. Four participants said they felt immersed, felt that the game characters seemed to exist in the real world, and felt they could become even more integrated into the game than in VR games. The game instructions were simple and easy to understand, which helped players enter the game. It is worth mentioning that the sound effects played a helpful role: most players felt the sound effects gave them significant feedback and said that without them they could not have controlled the game, while the music brought about emotional changes and was likewise indispensable.

However, our game is not without problems. Due to the limitation of Leap Motion's detection range, gestures were sometimes not detected, and players had to try various hand angles before a gesture was recognized. Regarding the interface, some players suggested that the buttons should be larger and that gaps between them would make them easier to operate and reduce accidental presses of neighboring buttons. In addition, participants noted that the mana had to be refilled too frequently; we interpret this to mean that participants probably disliked frequent repeated operations.

Based on the earlier interviews and the literature, we chose the diegetic interface. Its advantage is that it lets players become more integrated into the game; its disadvantage is that some information cannot be displayed immediately. Some players liked this interface because it kept the screen concise, although it required frequent operation of the interface. Other players said they preferred the HUD, that is, the non-diegetic interface, which helped them see the required information in real time even though it occupied part of the screen. Clearly, players have different interface preferences, and their immersion and preferences need additional study. Overall, we consider this project a success: although some sensor-related restrictions remain to be resolved, all the players had a good gaming experience. Finally, we believe the interface should be able to successfully assist players; if players can fully master the game, their immersion will increase. Future goals should therefore focus on these issues.

6. CONCLUSION

Previous studies on the application of Leap Motion to games have mostly focused on object manipulation, whereas this research uses Leap Motion in an HMD game mode, on which there has thus far been relatively little research. This research adds many interface aids, such as buttons and an interface attached to the hand, which can help the future development of gesture-based interactive games. Our results show that even without a controller, a player can control the game and have an interesting gaming experience. Players indicated that this method of operation is very novel. However, imprecise operation, and hence incomplete control of the game, causes the player to lose the sense of immersion; this input method may therefore be better suited to games that do not require precise operation than to competitive games. Although this study has its limitations, the game can serve as a basis for related head-mounted gesture-based game development.


We see two directions for future development. First, additional location trackers should be added to ensure that every player gesture can be detected; if gestures can achieve more precise operation, such games will have broader development opportunities, no longer limited to simple minigames, and more gesture operations can be developed. Second, we plan further research on the game interface, testing diegetic and non-diegetic interfaces, including displays of the player's health bar, menu, time, and remaining ammo/mana, to understand players' interface preferences and create an interface suitable for HMD AR games.

REFERENCES

Alimanova, M. et al., 2017. Gamification of hand rehabilitation process using virtual reality tools: Using Leap Motion for hand rehabilitation. In 2017 First IEEE International Conference on Robotic Computing (IRC), Taichung, Taiwan, pp. 336-339.

Brown, E. and Cairns, P., 2004. A grounded investigation of game immersion. In CHI '04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, pp. 1297-1300.

Cabreira, A. T. and Hwang, F., 2015. An analysis of mid-air gestures used across three platforms. In Proceedings of the 2015 British HCI Conference, Lincoln, United Kingdom, pp. 257-258.

Esfahlani, S. S. et al., 2018. Validity of the Kinect and Myo armband in a serious game for assessing upper limb movement. Entertainment Computing, 27, pp. 150-156.

Fagerholt, E. and Lorentzon, M., 2009. Beyond the HUD: User interfaces for increased player immersion in FPS games. Master's thesis, Chalmers University of Technology, Göteborg, Sweden.

Fernandez, R. A. S. et al., 2016. Natural user interfaces for human-drone multi-modal interaction. In 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, USA, pp. 1013-1022.

Galloway, A. R., 2006. Gaming: Essays on Algorithmic Culture (Vol. 18). University of Minnesota Press, Minneapolis.

Gugenheimer, J. et al., 2019. Challenges using head-mounted displays in shared and social spaces. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems.

Hua, H., 2017. Enabling focus cues in head-mounted displays. Proceedings of the IEEE, 105(5), pp. 805-824.

Katahira, R. and Soga, M., 2015. Development and evaluation of a system for AR enabling realistic display of gripping motions using Leap Motion controller. Procedia Computer Science, 60, pp. 1595-1603.

Khademi, M. et al., 2014. Free-hand interaction with Leap Motion controller for stroke rehabilitation. In CHI '14 Extended Abstracts on Human Factors in Computing Systems, Toronto, Canada, pp. 1663-1668.

Lee, S. et al., 2017. User study of VR basic controller and data glove as hand gesture inputs in VR games. In 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR).

Peacocke, M. et al., 2015. Evaluating the effectiveness of HUDs and diegetic ammo displays in first-person shooter games. In 2015 IEEE Games Entertainment Media Conference (GEM), Toronto, Canada, pp. 1-8.

Pirker, J. et al., 2017. Gesture-based interactions in video games with the Leap Motion controller. In International Conference on Human-Computer Interaction.

Potter, L. E. et al., 2013. The Leap Motion controller: A view on sign language. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, Australia, pp. 175-178.

Qin, H. et al., 2009. Measuring player immersion in the computer game narrative. International Journal of Human-Computer Interaction, 25(2), pp. 107-133.

Shin, D., 2019. How does immersion work in augmented reality games? A user-centric view of immersion and engagement. Information, Communication & Society, 22(9), pp. 1212-1229.

Vokorokos, L. et al., 2016. Motion sensors: Gesticulation efficiency across multiple platforms. In 2016 IEEE 20th Jubilee International Conference on Intelligent Engineering Systems (INES), Budapest, Hungary, pp. 293-298.

Wozniak, P. et al., 2016. Possible applications of the Leap Motion controller for more interactive simulated experiments in augmented or virtual reality. In Optics Education and Outreach IV.

Xiao, R. and Benko, H., 2016. Augmenting the field-of-view of head-mounted displays with sparse peripheral displays. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems.

Xu, W. et al., 2019. Pointing and selection methods for text entry in augmented reality head mounted displays. In 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

Yeo, H.-S. et al., 2015. Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware. Multimedia Tools and Applications, 74(8), pp. 2687-2715.
