
BodyRC: Exploring Interaction Modalities Using Human Body as Lossy Signal Transmission Medium

Yuntao Wang1, Chun Yu1, Lin Du3, Jin Huang1, Yuanchun Shi1 2

1Key Laboratory of Pervasive Computing, Ministry of Education, Tsinghua National Laboratory for Information Science and Technology

1Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China 2Department of Computer Technology, Qinghai University, Xining, 810016, China

3Zhigu, Beijing, 100085, China. E-mail: {w.yuntaosea, yc2pcg}@gmail.com, [email protected], [email protected], [email protected]

Abstract—With the increasing popularity of wearable computing devices, new sensing techniques that enable always-available interaction are in high demand. In this paper, we propose BodyRC, a novel body-based device that uses the human body as a lossy signal transmission medium. The device supports on-body interaction and body gesture recognition. In particular, BodyRC recognizes on-body interaction operations and body gestures by analyzing the electrical properties of the body while transmitting a single high-frequency analog signal through it. We evaluate the capabilities and performance of BodyRC through two controlled experiments, which show robust classification of both on-body interactions and body gestures. In addition, we present a real-time recognition system demonstrating the utility of our technique.

Keywords- Ubiquitous interfaces, on-body interaction, body gesture recognition, distributed circuit, lossy transmission medium, interaction modality.

I. INTRODUCTION

In recent years, the computing and display capabilities of wearable devices have become more powerful. However, the limited interaction space (e.g., a diminutive screen) mars the user experience and prevents wearable devices from realizing their full potential.

One promising solution is to appropriate the human body itself for new interaction modalities. Specifically, we can realize on-body interaction by leveraging the body surface as an input platform, or use body gestures as input commands. Numerous researchers have proposed interaction methods for on-body interaction or body gesture recognition based on computer vision [9, 10, 23], electric field sensing [3, 4, 15, 25], ultrasound propagation [12, 14] and the conductivity of the human body [11, 18].

In this paper, we present BodyRC, a body-based wearable remote control device that uses the human body as a lossy transmission medium. BodyRC supports interaction modalities including on-body interaction and body gestures. With two electrodes attached to the user's forearms as shown in Figure 1(a), BodyRC transmits a single high-frequency AC signal (12.5 MHz, 10 Vpp) through the circuit formed by the human body and a series resistor. When the user performs an on-body touch, a new path is added in parallel to the circuit, changing the original circuit structure; body gestures instead alter the intra-body components, changing the electrical properties, as shown in Figure 1(b) and Figure 1(c) respectively. These changes in circuit properties can be measured through the amplitude and phase delay of the voltage signal across the measurement resistor. By classifying the patterns of these two parameters, BodyRC can accurately recognize on-body interaction operations and body gestures.

The specific contributions of this paper are as follows.

1. A novel wearable technique using the human body as a lossy signal transmission medium for body interaction, with the sensing principle explained from an intra-body communication point of view.

2. A set of controlled experiments evaluating the usability of this technique, covering 1) on-body interaction on the forearm and the back of the hand and 2) body gesture recognition. The accuracy of 96.32% for 5 touch positions and 2 sliding operations using 2 finger postures on the forearm and back of the hand, and of 97.13% for body gesture recognition, shows that the technique is highly usable.

3. Real-time interactive applications detecting 14 on-body interaction operations on the forearm and back of the hand, and 7 body gestures, together with a detailed description of the signal processing, event detection and classification implementation.

Figure 1. (a) Setup of BodyRC; (b) on-body interaction using the forearm and the back of the hand; and (c) body gestures.


II. RELATED WORK

A. Human Body Interaction

Recently, researchers have explored body interaction modalities including on-body interaction and motion sensing. On-body interaction, which uses the body surface as an input platform, has opened up a new interaction modality for wearable computing devices. Motion sensing has been widely applied in body activity tracking, body gesture recognition, health monitoring and elder care [15]. We summarize the related work in these two domains as follows.

1) On-body Interaction
Commercial depth sensors have enabled many vision-based solutions for on-body interaction [9, 10]. Harrison et al. presented OmniTouch [9], a depth-sensing and projection system that turns every surface into an interactive multi-touch platform. Later, they explored interaction modalities such as gestures and cursor control on arms and hands to enhance on-body interaction [10]. However, vision-based approaches suffer from computational expense and lighting limitations in mobile scenarios, so several other sensing techniques have been explored. Skinput proposed an on-body finger input system on the forearm and hand using the human body for acoustic transmission [12]. Mujibiya et al. extended Skinput with a similar technique to enable pressure-aware continuous touch sensing as well as arm-grasping hand gestures on forearms [14]. Saponas et al. demonstrated muscle-computer interfaces that sense finger gestures based on forearm electromyography (EMG) [17].

2) Human Motion Recognition
Xbox Kinect, the first commercially successful RGB and depth-sensing system, supports body motion recognition for games and other motion-related applications [23]. However, disadvantages such as cost and installation burden limit the wide deployment of vision-based motion sensing, especially in mobile scenarios [4]. To address this, body-mounted accelerometers have been used to detect body gestures and human activity [13, 19], but these approaches can only monitor the parts of the body equipped with sensors. Recently, static electric field sensing [3, 4, 15] has also been explored for whole-body gestures, human activity and location detection, but these approaches only work where a static electric field exists, i.e., in a building or at home.

B. Human Body as Transmission Medium

Several researchers have explored interaction modalities based on the conductivity of the human body. GestureWrist and GesturePad [16] detect hand gestures using capacitive sensing while transmitting an electric wave signal through the wrist. Touché [18] proposed a novel Swept Frequency Capacitive Sensing technique that detects touch events on conductive objects and recognizes complex configurations of human hands and body. Capacitive fingerprinting [11] used the same technique for user identification on touchscreens by measuring differences in the impedance of the human body to the environment. The sensing principle behind this technique is that body impedance strongly depends on signal frequency. Touché can distinguish on-body gestures based on the power change at different frequencies, with per-user classification accuracy of 84.0% and 52.9% for a general classifier. Beyond the latency caused by sweeping the signal frequency, this accuracy is too low for robust use. Instead of a swept AC signal, we use a single high-frequency AC signal and measure both amplitude and phase delay, based on an explanation from distributed circuit theory.

According to the distributed circuit model, a lossy transmission wire is equivalent to an infinite series of connected electronic elements such as resistances and inductances. This model qualitatively and quantitatively explains the power loss and phase delay in a transmission link. The human body, which can also be considered a signal transmission medium, can be modeled as a distributed RC circuit [1, 2, 24]. We design our sensing principle based on this model. In this work, we focus on exploring interaction modalities, including on-body interaction and body gesture recognition, for mobile scenarios where the interaction space is very limited. This requires accurate recognition in real time, which can only be achieved with a better understanding of the human body as a signal transmission medium.

III. IMPLEMENTATION

A. Sensing Principle

Transmission line theory applies to currents whose frequency is high enough that their wave nature must be taken into account [21]. When BodyRC transmits high-frequency analog AC signals, part of the signal travels through the human body while the rest travels through the air between adjacent body parts [1, 2], as shown in Figure 2(a).

To understand how on-body touch and body gestures affect the received signal, we construct an equivalent distributed circuit, with resistors emulating human body tissue and capacitors emulating the skin capacitance and the parasitic capacitance between body parts connected by air, as shown in Figure 2(b).

We find that an on-body touch, such as using the right hand to touch different positions on the left arm, can be treated as switching a parallel signal transmission path on or off. This leads to significant changes in the total resistance, as the tissue itself is a relatively good conductor. Body gestures, in contrast, cause remarkable changes in the electrical properties that are mainly capacitive [20]. BodyRC detects these different electrical behaviors independently through the signal amplitude and phase delay, and thereby recognizes on-body touch and body gestures. We categorize the two major changes and explain them next.

1) Circuit Structure Change
An on-body touch adds a parallel path to the body's equivalent circuit, changing its structure. This shunt path decreases the total resistance of the body circuit and leads to a sudden increase in the analog signal amplitude. The start and stop events of an on-body touch trigger an instantaneous change in the signal, especially in amplitude, which can be detected accurately with almost no delay. These instantaneous amplitude changes can also be used to segment on-body touch events, as the numerical sketch below illustrates.
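To make this concrete, here is a minimal numerical sketch of the effect, using a toy lumped approximation of the distributed model; the component values and the parallel-branch impedance are illustrative assumptions, not measurements from BodyRC:

```python
# Toy lumped approximation of the parallel-path effect (all component
# values are hypothetical, chosen only to illustrate the direction of change).
import numpy as np

F = 12.5e6                  # excitation frequency (Hz), as in BodyRC
W = 2 * np.pi * F
R_SENSE = 8.2e3             # series measurement resistor (ohms)

def body_impedance(r_tissue=2e3, c_skin=100e-12):
    """Toy body model: tissue resistance in series with skin capacitance."""
    return r_tissue + 1.0 / (1j * W * c_skin)

def monitored_voltage(z_body, v_pp=10.0):
    """Voltage across the sense resistor in the divider V * R / (R + Z)."""
    return v_pp * R_SENSE / (R_SENSE + z_body)

z = body_impedance()
# An on-body touch adds a parallel branch, lowering the total body impedance.
z_touch = z * (0.5 * z) / (z + 0.5 * z)   # hypothetical parallel branch

for label, zb in [("no touch", z), ("touch", z_touch)]:
    v = monitored_voltage(zb)
    print(f"{label}: |V| = {abs(v):.2f} Vpp, phase = {np.degrees(np.angle(v)):.1f} deg")
# The touch case shows a higher amplitude, matching the sudden amplitude hop
# BodyRC uses to detect touch start and stop events.
```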


Figure 2. The sensing principle of BodyRC, (a) illustration of signal transmission paths around human body, (b) equivalent distributed circuit for human body.

2) Circuit Component Change
Body movements mainly alter the intra-body capacitive components, leading to continuous changes in amplitude and phase delay; the change in phase delay is much more significant than that in amplitude [20], since no new path is added to the signal transmission path.

B. Hardware and Software Setup

The overall hardware architecture of BodyRC is shown in Figure 3. An Altera Cyclone IV FPGA (EP4CE15F17C8, 64 MB), running at 50 MHz, controls an 8-bit digital-to-analog converter (DA9004) that generates a single high-frequency signal (12.5 MHz). The signal is filtered to remove noise and amplified to 10 Vpp. To obtain the phase delay caused by body characteristics, one copy of the signal is kept as a reference while the other is transmitted into the human body through two electrodes (conductive silicone strips) attached to the two forearms. By adding a resistor (8.2 kΩ) to the circuit path, BodyRC converts the alternating current into an alternating voltage signal (the monitoring signal). The reference and monitoring signals are captured and converted into 12-bit digital signals by two analog-to-digital converters (AD9227) running at 50 MHz.

The processor measures three values: 1) the peak amplitude of the monitoring signal; 2) the peak amplitude of the reference signal; and 3) a phase-delay-relevant value between the two signals. It then sends these three values to a personal computer over Bluetooth 4.0 at a baud rate of 9600.
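As an illustration of the host side of this link, the sketch below reads the three values from a serial port; the line-oriented packet format, the field order, and the port name are our assumptions for illustration, not the documented BodyRC protocol:

```python
# Hypothetical host-side reader for the three streamed values.
# Assumed (not documented) line format: "a_ref,a_mon,pdrv\n".
import serial  # pyserial

def read_samples(port="/dev/rfcomm0"):
    """Yield (reference amplitude, monitoring amplitude, PDRV) tuples."""
    with serial.Serial(port, baudrate=9600, timeout=1.0) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                a_ref, a_mon, pdrv = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed packets
            yield a_ref, a_mon, pdrv
```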

Figure 3. BodyRC hardware design.

C. Data Acquisition

Assume the reference signal is $S_r(t) = A_r\cos(\omega t)$ and the monitoring signal that has passed through the body is $S_m(t) = A_m\cos(\omega t + \varphi)$. In our prototype, the peak amplitudes $A_r$ and $A_m$ are obtained by recording the maximum sampling value over 2500 periods (5 kHz update rate). We acquire the phase delay using a simplified Costas loop, a phase-locked-loop-based circuit used for carrier phase recovery [5]. The phase-delay-relevant value (PDRV) is obtained by passing the product of the two signals through a low-pass filter, as Equation (1) shows. $A_r$, $A_m$ and PDRV are then sent to the classifier running on the personal device at an update rate of 60 Hz, from which we obtain the amplitude data $A_m$ and the phase delay $\varphi$ between the reference and monitoring signals.

$S_r \cdot S_m = \frac{A_r A_m}{2}\left[\cos\varphi + \cos(2\omega t + \varphi)\right] \;\xrightarrow{\text{LPF}}\; \mathrm{PDRV} = \frac{A_r A_m}{2}\cos\varphi \qquad (1)$
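Given Equation (1), the host can recover the phase delay by inverting the cosine; a minimal sketch, with variable names of our own choosing:

```python
# Recovering the phase delay from Equation (1):
# PDRV = (Ar * Am / 2) * cos(phi)  =>  phi = arccos(2 * PDRV / (Ar * Am)).
import numpy as np

def phase_delay_deg(a_ref, a_mon, pdrv):
    """Phase delay (degrees) between reference and monitoring signals."""
    c = np.clip(2.0 * pdrv / (a_ref * a_mon), -1.0, 1.0)  # guard rounding error
    return np.degrees(np.arccos(c))

# Round trip with the mean amplitude/phase reported later in the paper.
print(phase_delay_deg(5.0, 3.48, 5.0 * 3.48 / 2 * np.cos(np.radians(46.6))))
# -> 46.6
```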

IV. EXPERIMENT 1: ON-BODY INTERACTION

In this experiment, the forearm and the back of the hand are used as the touch area, as Figure 4 shows. We develop an on-body interaction system that senses one-dimensional multi-touch input comprising 5 touch positions and 2 sliding operations, each with 2 finger gestures (single finger, whole hand). The 5 touch positions are: the center of the forearm; the wrist area; the center of the back of the hand; the proximal phalanx area of the middle finger; and the fingernail of the middle finger. The 2 sliding operations are slide up and slide down.

This experiment is a proof of concept for our technique applied to on-body interaction. The data was processed offline using automatic segmentation, feature extraction and classification after the data acquisition phase; we reuse these methods in the real-time recognition system later in the paper.

To evaluate the system, we recruited 10 participants (4 females and 6 males) aged between 22 and 35 years (M = 26.4, SD = 3.55), weighing between 46 kg and 102 kg (M = 64.9, SD = 16.19) and between 156 cm and 182 cm tall (M = 171.2, SD = 9.02). Participants performed all touch and sliding operations in the same room with no other people nearby.

A. Procedure

Before the experiment, the 5 touch positions on the participant's forearm and back of the hand were marked, and participants were informed of the purpose and procedure of the experiment. In each session, participants performed the actions in a specified order (the 5 touch positions followed by the 2 sliding operations), first with one finger and then with the whole hand. Each action forms a block containing 10 repeats of that action, so 2 finger gestures × (5 touch locations + 2 sliding operations) × 10 repeats = 140 actions were performed per session. We repeated the session 4 times, obtaining 560 touch actions per participant in total. Each session took about 7 minutes, and participants were required to take a 3-minute break after each session.


Figure 4. 5 touch positions using 2 finger gestures.

During the data collection phase, the BodyRC hardware board was tied to the participant's waist, with two conductive silicone strips attached to their upper forearms. Participants stood in front of a screen showing the required action. The experimental procedure was controlled from a remote computer, from which we controlled the beginning and end of each block. In each block, participants had 5 seconds to prepare and then performed 10 repeats of the required action.

B. On-body Interaction Recognition System

We treat recognition as a classification problem consisting of three stages: 1) segmentation: identifying the start and end timestamps of each action; 2) feature extraction: deriving features from the raw data for classification; and 3) classification: detecting the touch positions and sliding operations with their finger gestures.

1) Segmentation
Figure 5 illustrates the peak amplitude and phase delay data over time for a one-finger touch event on the center of the forearm. The touch event causes a much more obvious change in amplitude than in phase delay. Two signal behaviors drive the segmentation method: 1) a sudden hop in the amplitude signal when an on-body touch occurs, averaging 2.15 samples (35.8 ms) during the rising edge and 3.54 samples (59 ms) during the falling edge, caused by the sudden parallel path added to the circuit as explained in the "sensing principle" section; and 2) the stability of the signal outside the rising and falling edges: the parallel-path change switches the signal between a low level and a high level, much like a clock signal. Based on these observations, we design an automatic segmentation algorithm.

To detect the rising edge of the amplitude signal, a window containing 20 peak amplitude samples moves along the time axis, as shown in Figure 5 (the light green window). First, the sample-to-sample difference between the 10th and 11th samples is calculated. If this difference is larger than a static threshold of 50 mV and also larger than 3 times the standard deviation of the previous 9 sample-to-sample differences (the edge threshold), the system treats it as a rising edge candidate. The system then calculates all 19 sample-to-sample differences among the 20 samples, D_i (i = 1, 2, …, 19). Around the rising edge candidate, it finds all consecutive sample-to-sample differences larger than the edge threshold, D_k (k = m, …, n), and sums them to obtain the closest rising edge candidate value, shown as the purple area in Figure 5.

Figure 5. Segmentation example of on-body interaction using peak amplitude signal

To confirm the rising edge, we sum the absolute values of D_k (k = 1, …, m) and D_k (k = n, …, 19). The rising edge is confirmed when the closest rising edge candidate value is larger than 8 times this sum. The falling edge is then detected with a similar method once the rising edge is found. A sketch of the detector follows.
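A minimal sketch of this windowed edge detector, restating the thresholds above (50 mV static threshold, 3× the standard deviation of the preceding differences, and the 8× confirmation ratio); the exact boundary handling at the window edges is our interpretation:

```python
# Windowed rising-edge detector for the peak-amplitude stream (volts).
import numpy as np

STATIC_THRESH = 0.050   # 50 mV static threshold from the text

def detect_rising_edge(window):
    """window: 20 consecutive peak-amplitude samples; returns (m, n, value) or None."""
    window = np.asarray(window, dtype=float)
    assert len(window) == 20
    d = np.diff(window)                   # the 19 sample-to-sample differences D_i
    candidate = d[9]                      # difference between the 10th and 11th samples
    edge_thresh = 3.0 * np.std(d[:9])     # adaptive threshold from the previous 9 diffs
    if candidate < STATIC_THRESH or candidate < edge_thresh:
        return None
    # Grow the candidate edge over neighboring diffs above the edge threshold.
    m = n = 9
    while m > 0 and d[m - 1] > edge_thresh:
        m -= 1
    while n < 18 and d[n + 1] > edge_thresh:
        n += 1
    edge_value = d[m:n + 1].sum()         # closest rising-edge candidate value
    rest = np.abs(d[:m]).sum() + np.abs(d[n + 1:]).sum()
    # Confirm: edge value must dominate the remaining differences by 8x.
    return (m, n, edge_value) if edge_value > 8.0 * rest else None
```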

2) Feature Extraction
The segmentation phase yields the rising and falling edges of the amplitude signal, from which we extract features for classification. The main features are the voltage amplitude and phase of the signal passing through the added path. Figure 2 shows the circuit structure change when a touch event occurs. Let the monitored signal be $U_1(t) = A_1\cos(\omega t + \varphi_1)$ before the touch event and $U_2(t) = A_2\cos(\omega t + \varphi_2)$ after it. The difference between the low-level and high-level states is the voltage change $U_\Delta(t) = A_\Delta\cos(\omega t + \varphi_\Delta)$ caused by the added parallel path, which we extract via the phasor relation $U_2 = U_1 + U_\Delta$ as the following equations show:

$A_\Delta = \sqrt{(A_2\cos\varphi_2 - A_1\cos\varphi_1)^2 + (A_2\sin\varphi_2 - A_1\sin\varphi_1)^2} \qquad (2)$

$\varphi_\Delta = \arctan\dfrac{A_2\sin\varphi_2 - A_1\sin\varphi_1}{A_2\cos\varphi_2 - A_1\cos\varphi_1} \qquad (3)$

The nine features we extract from the signal are:

Amplitude (1): the calculated amplitude $A_\Delta$ of the added path;

Phase delay (1): the delay ($\varphi_\Delta - \varphi_1$) between the phase of the added path ($\varphi_\Delta$) and the phase before the touch ($\varphi_1$);

Means (4): the average amplitude and phase delay values of the low-level and high-level states of the source signal;

Difference value (2): the difference between the amplitude and phase delay values at the end of the rising edge and at the beginning of the falling edge;

Duration (1): the duration of the high-level state.
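The phasor computation behind Equations (2) and (3) reduces to a complex subtraction; a minimal sketch, with names of our own choosing:

```python
# Phasor form of Equations (2) and (3): the added path's contribution is the
# phasor difference between the high-level (during touch) and low-level
# (before touch) signals.
import cmath

def added_path_phasor(a1, phi1, a2, phi2):
    """(a1, phi1): before touch; (a2, phi2): during touch; phases in radians."""
    u1 = cmath.rect(a1, phi1)             # U1 as a complex phasor
    u2 = cmath.rect(a2, phi2)             # U2 as a complex phasor
    u_delta = u2 - u1                     # phasor of the added parallel path
    return abs(u_delta), cmath.phase(u_delta)   # (A_delta, phi_delta)
```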

3) Classification
Both a per-user classifier and a "walk up" general classifier are designed for this system. We use a Support Vector Machine (SVM, RBF kernel, c = 1.0, g = 0.0025) implemented in LIBSVM [7] as the classification tool. For the "walk up" general classifier, the amplitude and phase features are normalized by the average amplitude and phase delay measured while the participant stands still.
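A minimal sketch of this classification stage using scikit-learn's SVC (a wrapper around LIBSVM) with the parameters above; the feature-matrix layout and the standing-state baseline normalization are our assumptions about the data representation:

```python
# SVM training with the reported parameters (RBF kernel, C=1.0, gamma=0.0025).
import numpy as np
from sklearn.svm import SVC

def train_classifier(X, y, baseline=None):
    """X: (n_samples, 9) feature matrix; y: action labels;
    baseline: per-feature standing-still values for 'walk up' normalization."""
    X = np.asarray(X, dtype=float)
    if baseline is not None:              # "walk up" general classifier
        X = X / np.asarray(baseline)      # normalize by standing-state signal
    clf = SVC(kernel="rbf", C=1.0, gamma=0.0025)
    clf.fit(X, y)
    return clf
```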

C. Results

For the per-user classifier, we conducted 4-fold round-independent cross-validation: we trained the classifier on 3 rounds of data from each participant and tested on the remaining round. All train/test combinations were tested and the accuracies over all touch events averaged. On-body interaction with 5 touch positions and 2 sliding operations using 2 finger gestures achieved an accuracy of 96.32% (SD = 2.12%).

To assess the general classifier, we conducted 10-fold participant-independent cross-validation, training the classifier on data from 9 participants and testing on the remaining participant. We tested all train/test combinations and averaged the accuracies. The average accuracy across 10 participants is 86.76% (SD = 5.44%). A closer analysis of the recognition errors shows that recognition of the 5 touch positions with the whole hand is responsible for 87% of the errors; when only one-finger touch events are considered, the average accuracy reaches 94.32% (SD = 3.33%).

V. EXPERIMENT 2: BODY GESTURE RECOGNITION

In this experiment, we study body gesture recognition based on our technique, choosing a gesture set that exercises a variety of body movements, as Figure 6 shows. To evaluate the feasibility of body gesture recognition, we recruited 8 participants (3 females), different from those in experiment 1, aged between 21 and 28 years (M = 24.4, SD = 2.32), weighing between 51 and 84 kg (M = 69.7, SD = 12.83) and between 161 cm and 187 cm tall (M = 173.1, SD = 9.78). All participants performed the gestures in the same room with no other people nearby.

A. Procedure

Before the experiment, participants were instructed on how to perform the body gestures and practiced them. Participants then performed each gesture 5 times in a specified order, which we call a session. For each gesture, participants had 3 seconds to prepare and 4 seconds to finish. We repeated the session 4 times, obtaining 140 gesture samples per participant overall.

B. Body Gesture Recognition System

As explained in the "sensing principle" section, body movement mainly changes the phase delay of the analog signal passing through the human body; the difference between Figure 5 and Figure 7 supports this assumption. Compared with on-body touch, the phase delay changes much more obviously and continuously than the peak amplitude, so the body gesture recognition system operates on the phase delay signal. As in experiment 1, we describe the segmentation, feature extraction and classification stages of the body gesture recognition system.

1) Segmentation
Figure 7 shows the waveforms of the amplitude and phase delay during a left-wave gesture. The stability of the signal represents the rest state of the human body; when people perform body gestures, the signal changes significantly, which triggers segmentation. In our design, a 3rd-order Butterworth IIR low-pass filter with a 3 dB corner at 3 Hz is applied to the phase delay signal (see the scipy sketch below), and the filtered signal then passes through two 0.33-second windows (1 and 2 in Figure 7), each containing 20 samples, to detect activity.
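For reference, this filter can be realized as a causal (real-time) filter in a few lines; a sketch assuming the 60 Hz update rate of the phase delay stream:

```python
# 3rd-order Butterworth low-pass, 3 dB corner at 3 Hz, 60 Hz sample rate.
from scipy.signal import butter, lfilter

b, a = butter(N=3, Wn=3.0, btype="low", fs=60.0)
# Causal filtering suitable for streaming use:
# filtered = lfilter(b, a, phase_delay_samples)
```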

Think of the signal as a particle moving along a one-dimensional line. We use the second window to recognize the segmentation trigger based on three statistics: 1) range: the range of the waveform; 2) speed: the sum of the absolute sample-to-sample differences; and 3) max speed: the maximum sample-to-sample difference. When max speed is smaller than a static threshold of 1.5 degrees while range is greater than 3 times the standard deviation of the samples in window 1, or speed is greater than 2 degrees, the first window is taken as the beginning window. Afterwards, when range is smaller than 0.5 degrees and speed is smaller than 0.5 degrees, the window is taken as the last window. If no such window is detected for a long period (3 seconds, 9 windows), the 9th window is taken as the last window.
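A minimal sketch of the three window statistics and the start trigger, with the thresholds copied from the text; the exact grouping of the conditions is our reading of the description above:

```python
# Window statistics and start trigger for gesture segmentation.
# Windows hold 20 filtered phase-delay samples (degrees).
import numpy as np

def window_stats(w):
    """Return (range, speed, max speed) of a window."""
    w = np.asarray(w, dtype=float)
    d = np.abs(np.diff(w))
    return w.max() - w.min(), d.sum(), d.max()

def is_gesture_start(window, prev_window):
    """True if `window` triggers segmentation relative to the preceding window."""
    rng, speed, max_speed = window_stats(window)
    return max_speed < 1.5 and (rng > 3.0 * np.std(prev_window) or speed > 2.0)
```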

Figure 6. Stick figures (upper) and waveforms (black line: phase delay; red line: peak amplitude) of the 7 body gestures performed in experiment 2.


Figure 7. Body gesture (left wave) segmentation example using phase delay signal.

The example in Figure 7 shows the segmentation procedure: the trigger is detected in window 2, and the system segments the phase delay and peak amplitude data from the beginning window (1) to the last window (7). This segmentation method has some limitations, which we discuss later in the paper.

2) Feature Extraction and Classification
We observed that the phase delay and amplitude signals of the same gesture have similar shapes across all participants; Figure 6 shows the waveform templates of each body gesture. We therefore apply template matching, specifically dynamic time warping (DTW), in our classification method. DTW is a dynamic programming algorithm that measures the similarity between two sequences which may vary in time or speed [6] (a sketch appears after Table I). In our system, we measure similarity as the Euclidean distance between vectors of phase delay and amplitude, and before classification we normalize each gesture sample by its maximum and minimum values. After DTW, we select the top three classes and their corresponding templates with the smallest distances, and then choose the most likely gesture among these three using the distances of the following features between the testing sample and the corresponding template:

Shape distance (1): the distance obtained from DTW;

Difference value (2): the phase delay and amplitude differences between the means of the samples in the beginning and last windows;

Peak and valley amount (2): the number of peaks and valleys;

Duration (1): the duration of the body gesture.

As in experiment 1, we design two kinds of classifiers:

1) a per-user classifier; and 2) a "walk up" general classifier. Among the 20 gesture samples per user, we select the 5 samples of session 1 as the templates for the per-user classifier and test on the remaining 15 samples. For the general classifier, we build each gesture's templates by combining the waveforms of that body gesture across 7 participants and testing on the data of the 8th participant, obtaining 20 templates per gesture in total. All train/test combinations were tested (i.e., 8-fold participant-independent cross-validation).

TABLE I. ACCURACIES AMONG TWO CLASSIFIERS AND TWO SEGMENTATION METHODS

                        Manual Segmentation     Real-Time Segmentation
Per-User Classifier     97.13% (SD = 1.69%)     91.67% (SD = 2.18%)
General Classifier      92.85% (SD = 1.50%)     89.57% (SD = 2.52%)
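For concreteness, a minimal sketch of the standard O(n²) DTW matcher over [phase delay, amplitude] vectors as described above; FastDTW, discussed below, could substitute for this routine:

```python
# Plain dynamic time warping over 2-D [phase_delay, amplitude] sequences,
# with Euclidean distance as the local cost.
import numpy as np

def dtw_distance(a, b):
    """a, b: arrays of shape (len, 2) holding [phase_delay, amplitude]."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])   # local Euclidean cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def top_classes(sample, templates, k=3):
    """templates: list of (label, sequence); return the k closest labels."""
    scored = sorted(((dtw_distance(sample, t), lbl) for lbl, t in templates),
                    key=lambda x: x[0])
    return [lbl for _, lbl in scored[:k]]
```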

C. Results

To assess the performance of the classification, we compare the accuracies in Table I.

Since DTW is time-consuming (time complexity O(n²) for the standard algorithm), we can reduce the computing time by using FastDTW [6], which has O(n) time and space complexity, or by decreasing the number of templates. We calculate the trade-off curves between average accuracy and number of templates (Figure 8). For the per-user classifier, we use the samples of sessions 1 and 2 (10 samples) as template candidates and test on the samples of sessions 3 and 4 (10 samples); the result shows that 5 templates suffice for high accuracy (manual segmentation: 96.95%, real-time segmentation: 91.22%). For the general classifier, we test the first 10 templates across 7 participants. With 8 templates, the accuracy reaches 91.8% for the per-user classifier and 88.42% for the general classifier.

Figure 8. Per-user and “Walk Up” classification accuracy for 7 body gesture recognition.


Figure 9. (a) Set-up of the applications; (b) real-time body gesture recognition application using BodyRC; (c) real-time on-body touch application using BodyRC.

VI. REAL-TIME INTERACTIVE APPLICATIONS

A. Set-up

To demonstrate the feasibility of BodyRC, we developed two real-time interactive applications, one for on-body touch and one for body gesture recognition. We used a remote computer (CPU: Intel® Core™ i7-4770K @ 3.50 GHz, RAM: 8 GB) as the processor and displayed the user interface on a 53-inch screen. The applications are built in C#, with classifiers pre-trained following the procedures in sections IV and V. The user stands in front of the screen and performs on-body touches and body gestures, as in Figure 9. Each application provides real-time visualization of the raw data and the recognition results, and uses a camera to record the user's actions and display them on the user interface.

B. Procedure

First, the applications acquire real-time data from the hardware and display it on the user interface. The on-body touch (body gesture) application then segments the data following the method described in section IV (V). Once an application detects the corresponding trigger, it extracts the features for classification, predicts which action the user performed using the pre-trained classifier, and displays the predicted result on the user interface.
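This pipeline can be summarized in a short skeleton; every callable here is a hypothetical stand-in for the pieces sketched in earlier sections, not the actual C# application code:

```python
# Skeleton of the real-time pipeline: acquire -> segment -> extract -> classify.
def run_realtime(stream, segmenter, extract_features, classifier, show):
    """stream yields (amplitude, phase_delay) samples at 60 Hz;
    segmenter returns a completed segment or None; show updates the UI."""
    buffer = []
    for sample in stream:                       # 1) acquire and display raw data
        buffer.append(sample)
        show(("raw", sample))
        segment = segmenter(buffer)             # 2) segment on trigger
        if segment is not None:
            feats = extract_features(segment)   # 3) feature extraction
            label = classifier.predict([feats])[0]  # 4) sklearn-style prediction
            show(("result", label))             # display the predicted action
```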

VII. DISCUSSION AND FUTURE WORK

The experiments show the feasibility of using the human body as a lossy transmission medium for interaction modalities including on-body interaction and body gestures. The sensing principle of BodyRC proves reasonable and capable of guiding real-time classification, and we demonstrated BodyRC running in real time. This section discusses some limitations of the current system and future work to improve it.

The biggest concern when passing electrical signals through the human body is safety. The safety threshold for current through the body surface is 10 mA at frequencies above 1 MHz [8]. In our design, the sine wave signal (12.5 MHz, 10 Vpp) is injected into the skin via an 8.2 kΩ resistor, so the maximum current through the human body is about 1 mA, which is safe. None of the 18 participants across our experiments reported discomfort during the 40-minute sessions.
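For reference, the current bound follows from Ohm's law across the series resistor; treating the body impedance as zero only overestimates the current:

$I_{\max} \le \frac{V_{pp}}{R} = \frac{10\ \mathrm{V}}{8.2\ \mathrm{k\Omega}} \approx 1.2\ \mathrm{mA}$

which is consistent with the "about 1 mA" figure and well below the 10 mA threshold.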

The lossy transmission property of the human body, which is the sensing principle of BodyRC, is highly sensitive to the peculiarities of the user: across participants, the voltage amplitude on the resistor varies from 3.15 V to 3.87 V (M = 3.48, SD = 0.27) and the phase delay varies from 41.2° to 51.8° (M = 46.61, SD = 3.33). In our current general classifier, we remove this peculiarity by simply normalizing features by the signals of the user's standing state. Even so, user peculiarity still causes recognition errors in the general classifier, especially in experiment 1 (accuracy drops by 9.56% compared with the per-user classifier). In future work, we plan to extend the design with a carrier analog signal to explore other features for general classification. Designers should also consider an incremental machine learning approach built on the baseline model of our current implementation; such an approach would adapt the model to a person's peculiarities as they use the system.

While we have not yet found a general rule for electrode placement on the human body, we offer two suggestions to avoid problems: 1) avoid short paths in the body-part circuit (Figure 1); the system works only if the impedance of the detection part is comparable with the impedance of the body-part circuit; and 2) place the electrodes on an area that does not need to be touched. We plan to explore general placement rules in future work.

Another limitation of our current implementation is the wired connection between the electrodes and the BodyRC hardware board. Though the shielded wire has little effect on the signals, it might cause some discomfort in mobile scenarios. In future work, we want to explore body coupling as used in intra-body communication [2]: body coupling forms a current loop composed of the transmitter electrode, the human body, the receiver electrode and a return path through the external ground. We can then explore interaction modalities based on both the lossy transmission property of the human body and near-field sensing.

While the segmentation method in the on-body interaction experiment works well, the current segmentation approach in the gesture recognition system does not: real-time segmentation causes many misrecognitions, with accuracy dropping by 5.46% for the per-user classifier and 3.28% for the general classifier. In future work, we will look for better classification methods based on discrete state changes, such as a Hidden Markov Model (HMM). In addition, the static segmentation method in the body gesture recognition system incurs high latency because of the detection of the last window; an HMM may also help reduce recognition latency.

VIII. CONCLUSION

Using the human body as a lossy transmission medium, we propose BodyRC, a novel approach for on-body interaction and body gesture recognition that enables truly always-available input for wearable devices in mobile scenarios. We have described the theory of how BodyRC works in terms of circuit structure change and circuit component change, and proposed on-body interaction and body gesture recognition based on this sensing principle. We have shown that BodyRC can sense on-body touch, comprising 5 touch positions and 2 slide actions under 2 finger gestures, with an average accuracy of 96.32% for the per-user classifier and 86.76% for the general classifier, and can recognize 7 body gestures with 97.13% accuracy (92.85% for the general classifier). These results suggest that BodyRC is feasible and accessible for interaction modalities that use the human body as a transmission medium. Finally, we presented a real-time recognition system demonstrating BodyRC as a new interactive interface, and concluded with future work.

IX. ACKNOWLEDGEMENT

This project is supported by the National Science and Technology Major Project of China under Grant No. 2013ZX01039001-002, the Natural Science Foundation of China under Grant No. 61272230, the National Key Technology Research and Development Program of China under Grant No. 2012BAH25B01 and Tsinghua University Research Funding No. 20111081113. We thank all the participants for their time and useful suggestions.

REFERENCES

[1] Bae, J., Cho, H., Song, K., Lee, H., and Yoo, H. J. The signal transmission mechanism on the surface of human body for body channel communication. IEEE Transactions on Microwave Theory and Techniques, 60(3), 582-593.

[2] Cho, N., Yoo, J., Song, S. J., Lee, J., Jeon, S., and Yoo, H. J. The human body characteristics as a signal transmission medium for intra-body communication. IEEE Transactions on Microwave Theory and Techniques, 55(5), 1080-1086.

[3] Cohn, G., Morris, D., Patel, S. N., and Tan, D. S. Your noise is my command: sensing gestures using the body as an antenna. In Proc. CHI’11, pp. 791-800.

[4] Cohn, G., Morris, D., Patel, S., and Tan, D., Humantenna: using the body as an antenna for real-time whole-body interaction. In CHI’12, pp. 1901-1910.

[5] Costas Loop. http://en.wikipedia.org/wiki/Costas_loop.

[6] Dynamic time warping in time series analysis. http://en.wikipedia.org/wiki/Dynamic_time_warping.

[7] Fan R. E., Chen P. H., and Lin C.J. Working set selection using second order information for training SVM. Journal of Machine Learning Research 6, 1889-1918, 2005.

[8] Fukumoto, M., and Tonomura, Y. "Body coupled FingerRing": wireless wearable keyboard. In Proc. CHI'97, pp. 147-154.

[9] Harrison, C., Benko, H., and Wilson, A. D. OmniTouch: wearable multitouch interaction everywhere. In Proc. UIST'11, pp. 441-450.

[10] Harrison, C., Ramamurthy, S., and Hudson, S. E. On-body interaction: armed and dangerous. In Proc. TEI’12, pp. 69-76.

[11] Harrison, C., Sato, M., and Poupyrev, I. Capacitive fingerprinting: exploring user differentiation by sensing electrical properties of the human body. In Proc. UIST’12, pp. 537-544.

[12] Harrison, C., Tan, D., and Morris, D. Skinput: appropriating the body as an input surface. In Proc. CHI’10. pp. 453-462.

[13] Mansfield, A., and Lyons, G. M. The use of accelerometry to detect heel contact events for use as a sensor in FES assisted walking. Medical Engineering & Physics, 25(10), 879-885.

[14] Mujibiya, A., Cao, X., Tan, D. S., Morris, D., Patel, S. N., and Rekimoto, J. The sound of touch: on-body touch and gesture sensing based on transdermal ultrasound propagation. In Proc. ITS’13, pp. 189-198.

[15] Mujibiya, A., and Rekimoto, J. Mirage: exploring interaction modalities using off-body static electric field sensing. In Proc. UIST’13, pp. 211-220.

[16] Rekimoto, J. Gesturewrist and gesturepad: Unobtrusive wearable interaction devices. In ISWC’01, pp. 21-27.

[17] Saponas, T. S., Tan, D. S., Morris, D., Balakrishnan, R., Turner, J., and Landay, J. A. Enabling always-available input with muscle-computer interfaces. In Proc. UIST’09, pp. 167-176.

[18] Sato, M., Poupyrev, I., and Harrison, C. Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects. In Proc. CHI’12, pp. 483-492.

[19] Scott, J., Dearman, D., Yatani, K., and Truong, K. N. Sensing foot gestures from the pocket. In Proc. UIST’10 pp. 199-208.

[20] Seyedi, M., Kibret, B., Lai, D. T., and Faulkner, M. A survey on intrabody communications for body area network applications.

[21] Transmission Line Theory and distributed circuit model. http://en.wikipedia.org/wiki/Transmission_line.

[22] Wimmer, R., and Baudisch, P. Modular and deformable touch-sensitive surfaces based on time domain reflectometry. In Proc. UIST’11, pp. 517-526.

[23] Xbox Kinect. http://www.xbox.com/en-US/kinect.

[24] Zimmerman, T. G. Personal area networks: near-field intrabody communication. IBM systems Journal, 35(3.4), 609-617.

[25] Zimmerman, T. G., Smith, J. R., Paradiso, J. A., Allport, D., and Gershenfeld, N. Applying electric field sensing to human-computer interfaces. In Proc. CHI’95, pp. 280-287.