
Robust Classification of Hand Posture to Arm Posture Change Using Inertial Measurement Units

Hwiyong Choi, Daehyun Hwang and Sangyoon Lee
Department of Mechanical Design and Production Engineering
Konkuk University, Seoul, Korea

{genichy, hdh5143, slee}@konkuk.ac.kr

Abstract—There have been many reports about factors that generate misclassification during hand posture classification. Among them, arm posture change is expected to lower the classification success rate of a classifier that employs a sensor recording physical change of the forearm. This work reports classification of hand posture that is robust to arm posture change, achieved by adding an arm orientation feature to the classifier. Two inertial measurement units and a forearm perimeter sensor were employed to measure the arm orientation and the perimeter change of the forearm, respectively. Two classes of hand postures were paired with continuous arm postures and classified with a k-NN classifier. The results show that the suggested method improves the classification success rate by about 5% compared to a classifier without the arm orientation feature for two subjects.

Keywords—hand posture classification; arm posture; arm orientation; k-NN classifier; inertial measurement unit

I. INTRODUCTION

Electrical potential and physical change of the forearm have been the two main streams of features for hand posture classification [1-4].

To develop a classifier with high accuracy, removal of the misclassification-generating factors contained in the feature is necessary. Many such factors have been reported from both streams: muscle fatigue, sensor location, arm posture, and residual muscle volume change [5, 6].

Among them, arm posture in particular may have a large effect on a classifier that employs a sensor recording physical change of the forearm. The other factors are time dependent or can be corrected at the first stage; however, arm posture cannot. For this reason, many hand posture classification studies have been performed without arm movement, even though real applications of hand posture classification mostly include arm movement. This paper presents classification of two hand postures that is independent of arm posture.

II. MATERIALS AND METHODS

A. Hand Postures and Arm Postures

Paired hand postures and arm postures were used as the input of a classifier. Fig. 1 shows the hand postures used in this work. Two hand postures, a muscle-relaxed posture (P1) and a hand grasp with all fingers closed (P2), were selected to simplify the problem.


Fig. 1. Two hand postures [6].

Fig. 2. Six arm postures.

Six arm postures of the right arm (A1-A6) were prepared as illustrated in Fig. 2. They cover various arm postures, including pronation and supination of the forearm.

Pairs of hand postures and arm postures comprise two classes $C_1$ and $C_2$. These pairs are the final forms that are classified. In other words, the goal is classification of the two hand postures P1 and P2 independent of arm postures A1-A6:

$C_1 = \{(P1, A1), (P1, A2), \ldots, (P1, A6)\}$ (1)

$C_2 = \{(P2, A1), (P2, A2), \ldots, (P2, A6)\}$

978-1-4799-3669-4/14/$31.00 © 2014 IEEE

The 4th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent Systems

June 4-7, 2014, Hong Kong, China


B. Forearm Perimeter Sensor

It is a well-known fact that the muscles in the forearm contribute to hand motion and grasp force by generating muscle contraction, which leads to perimeter change of the forearm.

The forearm perimeter sensor shown in Fig. 3 is composed of a strain gauge and a resistor, and it measures this perimeter change. It can therefore be classified as a sensor that records physical change of the forearm [6].

Fig. 3. Forearm perimeter sensor [6].

C. Inertial Measurement Unit (IMU)

Several measurement methods for body posture have been reported [8-10]. An inertial measurement unit (IMU) can be attached directly to the arm, so it does not need any external sensors.

The orientation of the arm can be a feature for recognizing an arm posture. Fig. 4 illustrates the orientation measurement. Two IMUs (E2BOX EMIMU-9DOF) provide Euler angles $\phi$, $\theta$, $\psi$ about a global coordinate system $X_G$. When the initial orientations of the upper arm and the forearm are assigned to $X_{up}$ and $X_f$ respectively, the new orientations can be represented as follows:

$X_{up,new} = R\,X_{up}, \qquad X_{f,new} = R\,X_f$, (2)

where $R = A_z A_y A_x$, i.e. (with $c = \cos$, $s = \sin$)

$R = \begin{bmatrix} c\psi\,c\theta & c\psi\,s\theta\,s\phi - s\psi\,c\phi & c\psi\,s\theta\,c\phi + s\psi\,s\phi \\ s\psi\,c\theta & s\psi\,s\theta\,s\phi + c\psi\,c\phi & s\psi\,s\theta\,c\phi - c\psi\,s\phi \\ -s\theta & c\theta\,s\phi & c\theta\,c\phi \end{bmatrix}$,

and

$A_z = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}$, (3)

$A_y = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}$,

$A_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{bmatrix}$.
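The orientation update of (2)-(3) can be sketched in a few lines of Python (pure standard library; this is an illustrative sketch of the standard Z-Y-X Euler composition, not the authors' Matlab code):

```python
import math

def rot_z(psi):
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_x(phi):
    c, s = math.cos(phi), math.sin(phi)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    # 3x3 matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_to_R(phi, theta, psi):
    # R = Az * Ay * Ax, matching (2)-(3)
    return matmul(rot_z(psi), matmul(rot_y(theta), rot_x(phi)))

def rotate(R, v):
    # Apply R to a 3-D vector, e.g. an initial arm axis X_up or X_f
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]
```

For example, `rotate(euler_to_R(0, 0, math.pi / 2), [1, 0, 0])` turns the x-axis into (approximately) the y-axis, as expected for a 90° yaw.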

Fig. 4. Coordinate system of the right arm for orientation measurement.

When the offset of $X_{up}$ and $X_f$ about $X_G$ is ignored, the angles $\theta_i$ can be defined as illustrated in Fig. 5.


Fig. 5. Definition of θi.


The angles $\theta_i$ are derived as follows:

$\theta_1 = \operatorname{arctan2}(v_{up,y,new},\, v_{up,x,new})$, (4)

$\theta_2 = \operatorname{arctan2}(v_{up,y,new},\, v_{up,z,new})$,

$\theta_3 = \operatorname{arctan2}(w_{up,x,new},\, w_{up,z,new})$,

$\theta_4 = \operatorname{arctan2}(u_{f,z,new},\, u_{f,x,new})$,

$\theta_5 = \operatorname{arctan2}(w_{f,y,new},\, w_{f,y,new})$.
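The $\theta_i$ computation in (4) is a set of quadrant-aware arctangents of components of rotated basis vectors. A minimal Python sketch follows; the vectors `v_up`, `w_up`, `u_f`, `w_f` are assumed to be the rotated axes of the upper-arm and forearm frames from Fig. 4, and $\theta_5$ is written with the same repeated subscript as printed in (4):

```python
import math

def theta_angles(v_up, w_up, u_f, w_f):
    # Each argument is a rotated 3-D basis vector [x, y, z] of the
    # upper-arm ("up") or forearm ("f") coordinate frame.
    t1 = math.atan2(v_up[1], v_up[0])  # theta_1: (y, x) of v_up
    t2 = math.atan2(v_up[1], v_up[2])  # theta_2: (y, z) of v_up
    t3 = math.atan2(w_up[0], w_up[2])  # theta_3: (x, z) of w_up
    t4 = math.atan2(u_f[2], u_f[0])    # theta_4: (z, x) of u_f
    t5 = math.atan2(w_f[1], w_f[1])    # theta_5, as printed in (4)
    return [math.degrees(t) for t in (t1, t2, t3, t4, t5)]
```

The angles are returned in degrees here only to match the plotted ranges in Fig. 8; the classifier itself is unit-agnostic as long as training and test data agree.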

D. Classifier

A classifier assigns a given feature vector x to the most suitable class $C_i$ based on training data sets acquired during the training step. Here, x is a six-dimensional vector:

$x = [\theta_1\ \theta_2\ \theta_3\ \theta_4\ \theta_5\ k V_{sens,ref}]$. (5)

The angles $\theta_i$ are from (4) and $V_{sens,ref}$ is the output signal of the forearm perimeter sensor. Because $V_{sens,ref}$ has a very small magnitude compared to $\theta_i$, the scale factor $k = 180{,}000$ is used.

The k-nearest neighbor (k-NN, k = 1) classifier first calculates the Euclidean distance (6) between a given x and the training samples $x_n^i$, where i and n stand for the class number and the sample number respectively, and then classifies x to the class $C_i$ that contains the training sample with the smallest Euclidean distance:

$D = \lVert x - x_n^i \rVert$. (6)
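A minimal sketch of this 1-NN rule in Python, with hypothetical feature values standing in for the recorded sensor data:

```python
import math

K_SCALE = 180_000  # scale factor k from (5)

def feature_vector(thetas, v_sens_ref):
    """Build the six-dimensional feature vector of (5)."""
    return list(thetas) + [K_SCALE * v_sens_ref]

def euclidean(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def classify_1nn(x, training):
    """training: list of (class_label, sample_vector) pairs.
    Returns the label of the nearest training sample (k = 1)."""
    return min(training, key=lambda pair: euclidean(x, pair[1]))[0]

# Hypothetical two-sample training set:
# class 1 = relaxed hand (P1), class 2 = grasp (P2)
train = [
    (1, feature_vector([10, 0, 0, 90, 80], 0.0001)),
    (2, feature_vector([10, 0, 0, 90, 80], 0.0020)),
]
x = feature_vector([12, 1, -1, 88, 79], 0.0018)
print(classify_1nn(x, train))  # → 2
```

The large value of k makes the scaled perimeter signal comparable in magnitude to the angles, so it actually influences the Euclidean distance; without scaling, the $\theta_i$ terms would dominate (6) entirely.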

III. EXPERIMENTS AND RESULTS

The experiment is composed of training and classification steps. Two subjects volunteered for the experiment; neither had a record of any forearm injuries.

As shown in Fig. 6, a forearm perimeter sensor and an IMU were fastened on the forearm, and an additional IMU was attached to the upper arm. The orientations of the IMUs were tuned to satisfy the orientations defined in Fig. 5.

Fig. 6. Sensors attached on the right arm [7].

Fig. 7. Pose sets Ci [7].

The training step was conducted by posing the pose sets Ci defined in (1) continuously. Fig. 7 shows snapshots taken during the training. The initial orientation of the arm was the orientation shown in Fig. 5.

During the posing, Euler angles from the IMUs were collected through serial communication, and $V_{sens,ref}$ was recorded using an analog-to-digital conversion device (National Instruments NI USB-6009). A data acquisition server was developed using LabVIEW (National Instruments). The training data $x_n^i$ were collected using Matlab (MathWorks). The sampling time was 50 ms.

The classification step was conducted just after the training step by posing the pose sets (1) continuously. The k-NN classifier was programmed in Matlab and classifies a given x to a class $C_i$. Each x was sent to the classifier from the server using TCP/IP communication. The sampling time was 50 ms.

Fig. 8 shows the classification results and the input signals from the sensors during the classification for subject B. Fig. 8(a) shows $V_{sens,ref}$, (b)-(f) show $\theta_i$ from the IMUs, and (g) shows the classification results. The horizontal axis of each plot is the sample number.

The effectiveness of the arm orientation feature was verified by comparing the classification results with simulation results obtained without the orientation feature. The simulation was conducted with the recorded training data and the x values that were given during the preceding classification step; however, the feature vector was modified into (7). The simulation was conducted in the Matlab environment using the same k-NN classifier.

$x = [k V_{sens,ref}]$ (7)
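The ablation of (7) can be sketched with toy data: a P1 sample recorded in a different arm posture may produce a perimeter reading close to P2's, so the perimeter-only classifier (7) misclassifies it while the full vector (5) does not. All numbers below are illustrative, not measured values from the paper:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def classify_1nn(x, training):
    return min(training, key=lambda pair: euclidean(x, pair[1]))[0]

def success_rate(samples, training, keep):
    """Classify (label, vector) samples using only the feature
    indices listed in `keep`; return the success rate in %."""
    reduced_train = [(c, [v[i] for i in keep]) for c, v in training]
    hits = sum(classify_1nn([v[i] for i in keep], reduced_train) == c
               for c, v in samples)
    return 100.0 * hits / len(samples)

# Toy recorded data: [theta_1..theta_5, k*V_sens,ref]
train = [(1, [10, 0, 0, 90, 80, 20]),    # P1, arm at rest
         (2, [10, 0, 0, 90, 80, 360]),   # P2, arm at rest
         (1, [40, 30, -20, 120, 60, 352])]  # P1, arm raised: high perimeter
test = [(1, [41, 29, -21, 119, 61, 310]),
        (2, [11, 1, 0, 91, 81, 355])]

full = success_rate(test, train, keep=range(6))  # feature vector (5)
vonly = success_rate(test, train, keep=[5])      # feature vector (7)
```

With these toy numbers `full` is 100% while `vonly` drops to 50%, mirroring the direction (not the magnitude) of the gap reported in Table I.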

233

cyber14-041.pdf 3 2014/09/09 17:19:37

Page 4: [IEEE 2014 IEEE 4th Annual International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER) - Hong Kong, Hong Kong (2014.6.4-2014.6.7)] The 4th

[Fig. 8 plot panels omitted: (a) $k V_{sens,ref}$; (b)-(f) $\theta_1$-$\theta_5$; (g) classification result $C_i$ with the transition state marked; horizontal axes span samples 0-900.]

Fig. 8. Vsens,ref, θi, and classification results [7].

Fig. 9 shows the simulation results obtained without consideration of the arm orientation, using the data of subject B.

[Fig. 9 plot panel omitted: classification result $C_i$ over samples 0-900.]

Fig. 9. Classification results without the orientation feature.

The accuracy of the classifier was quantified by counting classification success rates. The first sample classified as C2 was assumed to be the transition state between poses (P1, A6) and (P2, A1); it is marked in Fig. 8(g). Table I shows the results.

TABLE I. CLASSIFICATION SUCCESS RATE

            Without the orientation feature (%)   With the orientation feature (%)
Subject A                 96.93                                99.73
Subject B                 88.90                                96.11

IV. CONCLUSIONS

This work suggests a hand posture classification that is robust to arm posture. Two types of features were employed for the classification: the orientation of the arm and the perimeter change of the forearm. Two IMUs measure the former feature and a forearm perimeter sensor measures the latter.

Two classes of paired hand posture-arm posture pose sets were prepared. These were posed by two subjects and used to train a classifier on a PC. After that, the pose sets were posed by the subjects again and classified with the k-NN classifier.

To verify the effectiveness of the suggestion, classification was also conducted without the orientation feature, using the data recorded from the sensors during the preceding classification step. For the two subjects, the orientation feature improves the classification success rate by 5.01% on average.

Although this is a classification problem with only two classes, it enables a hand posture classifier to overcome misclassifications caused by arm movement. Moreover, it may help to remove the effect of arm posture in hand posture classification using electrical potential features.

Future work includes the employment of other classifiers such as the naive Bayes classifier and the support vector machine. Additionally, classification of grasp force independent of arm posture is worth considering.

ACKNOWLEDGMENT

This research was supported by the Leading Foreign Research Institute Recruitment Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (MSIP) (2010-00525).

REFERENCES

[1] J. Chu, I. Moon, Y. Lee, S. Kim, and M. Mun, "A Supervised Feature-Projection-Based Real-Time EMG Pattern Recognition for Multifunction Myoelectric Hand Control," IEEE/ASME Transactions on Mechatronics, vol. 12, pp. 282-290, June 2007.

[2] S. Maier and P. van der Smagt, "Surface EMG suffices to classify the motion of each finger independently," Proceedings of MOVIC 2008, 9th International Conference on Motion and Vibration Control, 2008.

[3] N. Li, D. Yang, L. Jiang, H. Liu, H. Cai, “Combined Use of FSR Sensor Array and SVM Classifier for Finger Motion Recognition Based on Pressure Distribution Map”, Journal of Bionic Engineering, vol. 9, pp. 39-47, March 2012.


[4] H. Han, “Mechanical sensing of skeletal muscle contraction and fatigue for physical human-robot interaction”, PhD thesis, KAIST, Korea, 2012.

[5] C. Castellini and P. van der Smagt, "Surface EMG in advanced hand prosthetics," Biological Cybernetics, vol. 100, pp. 35-47, January 2009.

[6] H. Choi and S. Lee, "Improving the Performance of Hand Posture Classification by Perimeter Sensor with sEMG," Proceedings of the 2013 IEEE International Conference on Mechatronics and Automation, pp. 819-824, August 2013.

[7] H. Choi, “Development of Robust Hand Posture Classifier Using the Forearm Perimeter Sensor”, MS thesis, Konkuk University, Korea, February 2014.

[8] P. Cerveri, E. D. Momi, N. Lopomo, G. Baud-Bovy, R. M. L. Barros, and G. Ferrigno, "Finger Kinematic Modeling and Real-Time Hand Motion Estimation," Annals of Biomedical Engineering, vol. 35, pp. 1989-2002, November 2007.

[9] A. Gallagher, Y. Matsuoka, and W. Ang, "An Efficient Real-Time Human Posture Tracking Algorithm Using Low-Cost Inertial and Magnetic Sensors," Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2967-2972, October 2004.

[10] P. Jung, G. Lim, and K. Kong, "A Mobile Motion Capture System Based on Inertial Sensors and Smart Shoes," Proceedings of the 2013 IEEE International Conference on Robotics and Automation (ICRA), pp. 693-697, May 2013.
