Abstract—This paper proposes a robotic walking-aid system which aims to provide mobility assistance and remote monitoring for the elderly or the disabled. The robotic system is able to provide physical support and guidance while avoiding static and dynamic obstacles. To handle different obstacle situations effectively, a training stage that learns user-adaptive characteristics is adopted. Through wireless communication and a localization technique, the user operation status, including the position of the robot, can be monitored by the server. On this server, we also develop a call-and-come service for the robotic system. The experimental test-bed has a cart-based configuration with a nonholonomic differential drive intended mainly for forward motion. Experiments and evaluations are presented to demonstrate the validity of the developed robotic system.

I. INTRODUCTION

According to the definition of the UN World Health Organization, a country in which the proportion of the population aged over 65 exceeds 7% has entered an aging society. Home-care issues for the elderly will thus become increasingly important in the future, and many countries have invested considerable funds in research and services related to elderly care. Likewise, walking aids are indispensable in daily life for the physically impaired, the blind, and the elderly.

Many studies have shown that robot-assisted walking aids can reduce the burden on the legs and increase safety while walking, so a considerable number of experts and scholars have presented walking-aid designs [1-3]. G. Wasson and J. Gunderson developed the MARC Smart Walker [4], which provides active assistance only when the user approaches obstacles or stairs. C. Huang, M. Alwan, and G. Wasson also developed the Cool Aide intelligent walker [5], which compares infrared sensor readings against a built-in map of the surrounding environment, updated with several virtual rules and thresholds; by measuring the direction of the user's applied force, its shared control achieves route guidance that complies with the user's intended direction or a pre-planned navigation goal. Y. Hirata et al. developed a passive-type walking-support robot [6], in which the user must push the aid forward and braking is used to realize the robotic handling,

*This work was supported by the National Science Council of Taiwan under Grants NSC-102-2221-E-030-013 and NSC-102-2218-E-002-009- MY2.

S.-R. Lyu, W.-T. You, Y.-S. Chen and H.-H. Chiang are with the Department of Electrical Engineering, Fu Jen Catholic University, New Taipei City 24205, Taiwan (e-mail: [email protected]).

Y.-L. Chen is with the Department of Computer Science and Information Engineering, National Taipei University of Technology, Taipei 10608, Taiwan (e-mail: [email protected]).

but this requires additional force on the walking aid and does not achieve active guidance of the user. We hope that a walking-assist robot can be designed to guide the user, further improving walking efficiency, and that remote monitoring personnel can assist while the aid is in use, increasing user safety and avoiding sudden, unexpected accidents.

Most intelligent walking-aid robots do not offer a call-and-come service, and their guidance trajectories cannot be adjusted according to the operating characteristics of individual users. Such a service is designed in our system to let users operate the walking-aid robot more easily: the robot automatically approaches the user, who does not have to walk to the robot's parking place to begin operation. Moreover, adjusting the guidance trajectory according to the user's characteristics improves user friendliness and the stability of guided operation, and better matches the trajectory of natural human walking habits.

II. ROBOTIC SYSTEM

The developed robotic system consists of two parts: the home server with a remote monitoring scheme, and the embedded walking-aid robot platform [7]. The robot has a suitable size and weight for a home environment. The on-board embedded system integrates a 400 MHz real-time processor running a real-time operating system (RTOS), a reconfigurable FPGA, and 128 MB of DRAM. The laser scanner is mounted at the front, 45 cm above the floor and pointing horizontally, and measures environment information ahead at a scanning rate of 5 Hz. The ultrasonic sensor is installed at the back of the robot and measures the relative distance to the user. The robot wheels are driven by DC motors, whose encoder information is available for dead reckoning. Grip-force sensors are used to measure the user's intention. A wireless module with an Ethernet port is available for communicating with the home server. This prototype system is presently limited to indoor use.

The robot platform uses an embedded system as the main computing unit, which connects to the various sensors and carries out the motion control algorithms. The home server serves as the user interface, remotely monitors the status of the robot, and manages the information exchanged with the robot via wireless Internet. The robotic system provides five principal functions in order to deliver the walking-support service with safety consideration for the elderly. As illustrated in Fig. 1, these functions include behavior control designs such as adaptive speed control, obstacle avoidance, and user-adaptive characteristics for the walking-support process. To enhance user friendliness, a hand-gesture-recognition-based call-and-come service is developed. With the fundamental navigation assistance, the robot can be

Development of Robotic Walking-aid System with Mobility Assistance and Remote Monitoring

Syue-Ru Lyu, Wan-Ting You, Yung-Shin Chen, Hsin-Han Chiang, and Yen-Lin Chen

2014 IEEE International Conference onAutomation Science and Engineering (CASE)Taipei, Taiwan, August 18-22, 2014

978-1-4799-5283-0/14/$31.00 ©2014 IEEE 830


directed by the home server to automatically navigate to the user location. In addition, the user or caregiver can monitor the current operation status of the robot, including sensory information, robot location, control behaviors, and so on.

Fig. 1. Conceptual diagram for the developed robot system.

Fig. 2. Structure of FAM.

III. SYSTEM DESIGN

In the following, each of the designed functionalities is introduced.

A. User-adaptive Behavior Control

The objective of the behavior control is to provide appropriate speed and orientation commands so that the robot can assist the user in walking. For flexibility, the control algorithm is based on the fuzzy additive model (FAM) [8]. The basic structure of the FAM is presented in Fig. 2, in which x and y denote the input and the output, respectively, and Ai and B'i are the input and output fuzzy sets, respectively. Note that the weighting Wi is designed according to the idea of a synapse weight, which defines a weight variable for each rule. Thus, the value of Wi can be adjusted to change the contribution of each rule to the fuzzy system output.

In the behavior control design of the robot, this study employs two FAMs for the speed and orientation motion control. The major environmental information is detected by the laser scanner mounted at the front of the robot. As shown in Fig. 3, the scanned area from the laser scanner is divided into three sectors. The scanned left area (LA) and right area (RA) are the inputs of the fuzzy orientation controller, which determines the proper turning speed to avoid static and dynamic obstacles while the robot is moving. To obtain a smooth moving speed adapted to the user's gait, the fuzzy

Fig. 3. Illustration of forward detected area and the relative distance.

Fig. 4. Control block diagram for the designed behavior control of the walking-aid robot.

speed controller calculates the proper speed command based on the relative distance (RD) between the robot and the user. The other input is the scanned front area (FA), as depicted in Fig. 3. This design provides the required deceleration, or even a full stop, to avoid colliding with dynamic obstacles such as people and other objects. Figure 4 shows the control block diagram for determining the inferred orientation and speed commands from the distance information of the laser scanner and the ultrasonic sensor. In each rule base, seven fuzzy sets are adopted for each fuzzy input and for the fuzzy output. For fast computation, triangular functions and singleton values are used for the input and output membership functions, respectively. The associated fuzzy sets and membership functions are shown in Fig. 5. The defuzzification strategy employs the center of gravity (COG) method, extended with an adaptation scheme using fuzzy-rule weights.
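As a concrete illustration, the sketch below (not the paper's code; the rule triangles, input ranges, and sign convention are invented for the example) implements a weighted fuzzy additive output with triangular input sets, min-based firing strengths, singleton consequents, and per-rule synapse weights. With singleton outputs, the COG defuzzification reduces to a weighted average.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fam_output(inputs, rules, weights):
    """Weighted fuzzy additive output with singleton consequents.

    rules: list of (antecedents, singleton), where antecedents holds one
    (a, b, c) triangle per input; weights: one synapse weight per rule.
    """
    num = den = 0.0
    for (ants, singleton), w in zip(rules, weights):
        # firing strength: min over the antecedent memberships
        alpha = min(tri(x, *abc) for x, abc in zip(inputs, ants))
        num += w * alpha * singleton
        den += w * alpha
    return num / den if den > 0 else 0.0

# hypothetical two-rule orientation controller:
# inputs are scanned left/right areas; output is a turn rate (sign illustrative)
rules = [
    ([(0, 800, 1600), (800, 1600, 2600)], 1.0),
    ([(800, 1600, 2600), (0, 800, 1600)], -1.0),
]
weights = [1.0, 1.0]
omega = fam_output((600.0, 2000.0), rules, weights)
```

Raising or lowering one rule's weight directly rescales that rule's contribution in both the numerator and denominator, which is the adaptation handle used in the training phase.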

With the constructed FAM in the fuzzy orientation and speed controllers, we can adjust the fuzzy rules via different weight values to achieve a better control effect. In the training phase, the statistics of firing rules are analyzed to assign higher weights to the rules that best represent the user behavior in the system output. Based on the rule excitation records, the fuzzy rules that fire most frequently can be identified, so the scale of the fuzzy rule base can be reduced. In addition, the weight value of each fuzzy rule can be obtained from its firing frequency. The distributions of fuzzy rule


firing for the orientation controller and the speed controller are depicted in Fig. 6.
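The weight-adaptation idea can be sketched as follows. This is a hypothetical training-phase routine (the function name, log format, and pruning threshold are our own assumptions, not the paper's): it turns rule-firing counts into normalized rule weights and drops rarely fired rules to shrink the rule base.

```python
from collections import Counter

def adapt_rule_weights(firing_log, num_rules, prune_below=0.05):
    """Derive per-rule weights from firing-frequency statistics.

    firing_log: sequence of rule indices, one entry per rule firing recorded
    during training. Rules whose relative frequency falls below prune_below
    get weight 0 (pruned); remaining weights are scaled so the most-used
    rule has weight 1.
    """
    counts = Counter(firing_log)
    total = sum(counts.values())
    weights = []
    for i in range(num_rules):
        freq = counts.get(i, 0) / total if total else 0.0
        weights.append(freq if freq >= prune_below else 0.0)
    peak = max(weights) if max(weights) > 0 else 1.0
    return [w / peak for w in weights]

# toy log: rule 0 fired four times, rules 1 and 2 once each, rule 3 never
w = adapt_rule_weights([0, 0, 1, 0, 2, 0], num_rules=4)
```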

Fig. 5. Fuzzy sets and membership functions for the behavior controller of the robot. (a) Scanned left and right areas for the fuzzy inputs (upper plot) and the orientation speed command ω, in rad/s over [−1.8, 1.8], for the fuzzy output (lower plot). (b) Scanned front area and relative distance for the fuzzy inputs (upper two plots) and the forward speed command v, in cm/s over [0, 55], for the fuzzy output (lower plot).

B. Hand Gesture Recognition Based Call-and-come Service

The call-and-come service is illustrated in Fig. 7. This function is designed for users with limited mobility who need a walking aid in order to walk: rather than struggling to reach the robot, the user issues a remote call through the camera of the monitoring computer, and the robot navigates autonomously from its parking place to the user. For the remote call, the user makes a fist gesture; the camera image is transmitted to the remote monitoring PC via Ethernet. The system then determines that the robot is being called and performs autonomous navigation, dodging obstacles ahead and approaching the user until a default relative distance (RD) of 25 cm is reached. The forward speed v is calculated by the speed controller adapted with the front area, while the turning rate ω is the control amount determined by the orientation controller according

Fig. 6. Rule firing statistics. (a) Fuzzy orientation controller, over the left-area and right-area inputs (800–2600 cm²). (b) Fuzzy speed controller, over the relative-distance (250–750 mm) and front-area (500–1700 cm²) inputs.


Fig. 7. Conceptual diagram for the call-and-come service.

to the sizes of the scanned areas. In addition, when the user opens the fingers, the system determines that the robot should stop, and the robot waits in place for further instructions or for the walking-aid function.

In this study, the objective is to provide a computer-vision-based gesture recognition interface that serves the user in an indoor environment. For the hardware, the CCD camera of a laptop captures the input images. For the software, the image processing is developed with Visual Studio 2008. The algorithm first uses background subtraction to extract the target object, and then applies morphological operations such as dilation, erosion, opening, and closing to


Fig. 8. Vision-based hand gesture recognition flowchart.

Fig. 9. Hand gesture recognition performance and the output results.

remove noise and repair damaged parts of the target object's image. An object-connectivity method then fills the internal noise blocks of larger target regions so that the complete object appears, and finally a skin-color detection algorithm is applied to the filled image to identify the candidate gesture-command block, namely the region of interest (ROI). The flowchart for the hand gesture recognition is shown in Fig. 8.
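The preprocessing pipeline above can be sketched in simplified form. This is not the paper's Visual Studio implementation; it is a minimal NumPy illustration of background subtraction followed by morphological opening (to remove noise) and closing (to fill holes), using a square structuring element and wrap-around borders for brevity.

```python
import numpy as np

def binary_dilate(img, k=1):
    """Dilation with a (2k+1)x(2k+1) square structuring element."""
    out = np.zeros_like(img)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            # np.roll wraps at the borders; acceptable for this sketch
            out |= np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

def binary_erode(img, k=1):
    """Erosion via duality: erode(img) = complement of dilate(complement)."""
    return ~binary_dilate(~img, k)

def segment_foreground(frame, background, thresh=30, k=1):
    """Background subtraction, then opening (erode->dilate) to remove
    speckle noise and closing (dilate->erode) to repair small holes."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > thresh
    opened = binary_dilate(binary_erode(mask, k), k)
    closed = binary_erode(binary_dilate(opened, k), k)
    return closed

bg = np.zeros((10, 10), dtype=np.uint8)
frame = bg.copy()
frame[3:7, 3:7] = 255      # a 4x4 target object
frame[0, 0] = 255          # an isolated noise speck
mask = segment_foreground(frame, bg)  # speck removed, object kept
```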

For posture identification, geometric features of each candidate block, such as saturation level, aspect ratio, and relative position, are calculated, and fuzzy inference on these results selects the true gesture-command block. The gesture-command image is then sent to the gesture recognition unit, which first applies the Sobel operator to capture and encode the outer contour, and then uses a curvature-based geometric feature method to detect the number of extended fingers. The detected result is transmitted from the computer to the wheeled robot over a wireless TCP/IP network, and the corresponding command is issued to provide the user with the call-and-come service.
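The final transport step can be sketched as follows. This is a minimal loopback TCP example, not the paper's protocol: the ephemeral port, the command string "GO", and the one-command exchange are our own conventions, showing only how a recognition result could travel from the PC side to a robot-side listener.

```python
import socket
import threading

def robot_listener(sock, commands):
    """Robot-side receiver: accept one connection and collect commands."""
    conn, _ = sock.accept()
    with conn:
        while True:
            data = conn.recv(64)
            if not data:
                break
            commands.append(data.decode())

# server socket standing in for the robot's command port
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # ephemeral port for the sketch
srv.listen(1)
received = []
t = threading.Thread(target=robot_listener, args=(srv, received))
t.start()

# PC side: send the recognized gesture as a command
# ("GO" for a fist is our own naming, not the paper's)
cli = socket.create_connection(srv.getsockname())
cli.sendall(b"GO")
cli.close()
t.join()
srv.close()
```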

The vision-system output can be divided into two parts: the image recognition display and the wheeled-robot service. The display presents the processing results in multiple windows. The original-image window shows the raw video captured by the CCD camera; the gesture-detection window shows the command region detected by the fuzzy inference algorithm, as in Fig. 9; the gesture-contour window shows the hand-region contour produced by the recognition unit; the identification window shows the finger count obtained by the curvature analysis after noise removal; and the identification-result window shows the result as a simple picture. Figure 10 shows an example of the process from the input image to the final recognition result.

Fig. 10. Robot coordinates with respect to the global coordinate frame.

C. Navigation

The robot is driven by two motors, one fixed on each side, so that the two wheels are controlled independently. With this differential drive, no steering mechanism is needed: coordinating the speeds of the two wheels produces both forward motion and turning. Figure 10 presents the global coordinate frame O-xy, and P0 is the center

of the robot. Define $q = [x, y, \phi, \theta_r, \theta_l]^T$, where $(x, y)$ denotes the position of the robot, $\phi$ is the robot orientation, and $\theta_r$ and $\theta_l$ are the rotation angles of the right and left wheels, respectively. Without considering tire slip, the robot must comply with the following three motion constraints:

$$\dot{x}\sin\phi - \dot{y}\cos\phi = 0 \quad (1)$$

$$\dot{x}\cos\phi + \dot{y}\sin\phi + b\dot{\phi} = r\dot{\theta}_r \quad (2)$$

$$\dot{x}\cos\phi + \dot{y}\sin\phi - b\dot{\phi} = r\dot{\theta}_l \quad (3)$$

The above equations can be further arranged as

$$A(q)\dot{q} = 0 \quad (5)$$

$$A(q) = \begin{bmatrix} \sin\phi & -\cos\phi & 0 & 0 & 0 \\ \cos\phi & \sin\phi & b & -r & 0 \\ \cos\phi & \sin\phi & -b & 0 & -r \end{bmatrix} \quad (6)$$

Following the dynamic model proposed by R. Fierro and F. L. Lewis [9], the motion of the two-wheel drive can be written as

$$\dot{q} = S(q)v(t) \quad (7)$$

$$M(q)\dot{v} + V(q,\dot{q})v = B(q)\tau \quad (8)$$

where

833

Page 5: [IEEE 2014 IEEE International Conference on Automation Science and Engineering (CASE) - Taipei (2014.8.18-2014.8.22)] 2014 IEEE International Conference on Automation Science and Engineering

Fig. 11. Illustration of robot turning.

$$S(q) = \begin{bmatrix} \frac{r}{2}\cos\phi & \frac{r}{2}\cos\phi \\ \frac{r}{2}\sin\phi & \frac{r}{2}\sin\phi \\ \frac{r}{2d} & -\frac{r}{2d} \\ 1 & 0 \\ 0 & 1 \end{bmatrix}$$

$$M(q) = \begin{bmatrix} \frac{r^2}{4d^2}(md^2 + I) + I_\omega & \frac{r^2}{4d^2}(md^2 - I) \\ \frac{r^2}{4d^2}(md^2 - I) & \frac{r^2}{4d^2}(md^2 + I) + I_\omega \end{bmatrix}$$

$$V(q,\dot{q}) = \begin{bmatrix} 0 & \frac{r^2}{2d}m_c L\dot{\phi} \\ -\frac{r^2}{2d}m_c L\dot{\phi} & 0 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.$$

$r$ is the radius of the motor wheels; $m_c$ and $m_\omega$ are the masses of the robot body and of each wheel with its motor, respectively; $\tau = [\tau_r, \tau_l]^T$ denotes the torques of the right and left motors. Further, $m = m_c + 2m_\omega$ and $I = m_c L^2 + 2m_\omega d^2 + I_c + 2I_m$, where $I_c$ is the moment of inertia of the body about the vertical axis through $P_c$, $I_\omega$ is that of a wheel with its motor about the wheel axis, and $I_m$ is that of a wheel with its motor about the wheel diameter.

From the distances traveled by the left and right wheels, the travel distance of the robot center can be measured, as shown in Fig. 11. From this geometry, we obtain

$$S_L = (c - d)\,\varphi \quad (9)$$

$$S_R = (c + d)\,\varphi \quad (10)$$

which yields the central angle

$$\varphi = (S_R - S_L)/(2d) \quad (11)$$

Then, the tangential (forward) velocity of each wheel is obtained by multiplying its rotation speed (in revolutions per second) by the wheel circumference:

$$V_R = 2\pi r\,\dot{\theta}_R, \qquad V_L = 2\pi r\,\dot{\theta}_L \quad (12)$$

Finally, the forward kinematics of the differential motor control follows as

$$\begin{bmatrix} v \\ \omega \end{bmatrix} = 2\pi r \begin{bmatrix} \frac{1}{2} & \frac{1}{2} \\ \frac{1}{2d} & -\frac{1}{2d} \end{bmatrix} \begin{bmatrix} \dot{\theta}_R \\ \dot{\theta}_L \end{bmatrix} \quad (13)$$

With the inverse of (13), and setting the motor rotation speeds $\dot{\theta}_L = v_l$ and $\dot{\theta}_R = v_r$, the robot coordinates and orientation can be determined as follows:

$$\frac{d}{dt}\begin{bmatrix} x \\ y \\ \phi \\ \theta_r \\ \theta_l \end{bmatrix} = \begin{bmatrix} \frac{r}{2}\cos\phi & \frac{r}{2}\cos\phi \\ \frac{r}{2}\sin\phi & \frac{r}{2}\sin\phi \\ \frac{r}{2d} & -\frac{r}{2d} \\ 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} v_r \\ v_l \end{bmatrix} \quad (14)$$

Based on the developed robot dynamics model and an EKF-SLAM algorithm, the robot is able to perform fairly accurate path tracking, especially since the map is known and static; the only information that must be computed in real time is the updated robot pose. When the user activates the calling service, the home server computes the shortest path from the current position of the robot to the user location, employing the A* algorithm on the predefined map of the environment. The resulting waypoints are transmitted to the robot as a trajectory, which is smoothed using circular arcs to avoid sharp turns that the robot cannot perform.
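The path-planning step can be sketched as follows: a generic A* implementation on a 4-connected occupancy grid with a Manhattan heuristic. The grid layout and unit step costs are illustrative, not the paper's map.

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    tie = itertools.count()          # tie-breaker so the heap never compares cells
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:         # already expanded with an equal/better cost
            continue
        came_from[cur] = parent
        if cur == goal:              # reconstruct path by walking parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came_from):
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), next(tie), ng, nxt, cur))
    return None

# toy map: two walls force a zigzag detour
grid = [[0, 0, 0, 0, 0],
        [1, 1, 1, 1, 0],
        [0, 0, 0, 0, 0],
        [0, 1, 1, 1, 1],
        [0, 0, 0, 0, 0]]
p = astar(grid, (0, 0), (4, 4))
```

The returned waypoints would then be smoothed (e.g., with circular arcs, as in the text) before being sent to the robot as a trajectory.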

IV. EXPERIMENTS

Corridors in home environments are much narrower than public walkways, and household objects make the home environment more complex than many public spaces. We therefore use a narrow aisle inside our laboratory to simulate use of the robot in a home environment, and also test it in a more complex environment to examine whether the controller and the generated motor output cause the user any discomfort. Figure 12 demonstrates the walking-aid operation. When the robot senses the user's grip-force signal, it provides proper speed and orientation control to assist the user toward the selected goal. The right part of Fig. 14 is the remote monitoring interface, which provides the sensory information from the robot and shows the robot motion control in real time.

Next, we test the call-and-come service; the overall functional flowchart is depicted in Fig. 13. The remote computer monitors the user's gestures to call for robot service. First, the user makes a fist gesture to call the robot, which turns on the autonomous navigation function, as shown in the labeled sequence of Fig. 14. When the fist gesture of the user is captured at the remote PC, the computer selects the autonomous navigation mode and transmits the command to the robot over the WiFi wireless network. After the command is received, as in label (a), navigation mode starts and the robot passes through the narrow aisle of the room and comes close to the user. The user then opens the fingers, and the server recognizes


the stop command and the robot stops and waits for the operation of the user.

Fig. 14. Walking-aid trials (left) and the remote monitoring interface (right).

V. CONCLUSION

In this paper, we propose an intelligent walking-aid robot system. For the hardware, an embedded controller is developed, configured, and tested with a laser range finder, an ultrasonic sensor, grip-force sensors, and other environmental sensors. Overall, the walking-aid robot provides collision-preventing navigation, adaptive speed control, remote monitoring, a call-and-come service (autonomous navigation), and passive cornering control. In practical tests across several different environments, the safe walking-assist functions are accomplished, leading the user at a smooth and stable speed past obstacles and through narrow spaces and corridors.

REFERENCES

[1] O. Chuy, Y. Hirata, and K. Kosuge, "A new control approach for a robotic walking support system in adapting user characteristics," IEEE Trans. on Systems, Man, and Cybernetics, Part C, vol. 36, no. 6, pp. 725-733, 2006.

[2] H. M. Shim, C. Y. Chung, E. H. Lee, H. K. Min and S. H. Hong, “Silbo: Development walking assistant robot for the elderly based on shared control strategy,” Int. Journal of Computer Science and Network Security, vol. 6, no. 9A, Sep. 2006.

Fig. 13. System functional flowchart for the navigation and the walking-aid service.

Fig. 14. Snapshots of the call-and-come service demonstration.

[3] Y. Hirata, A. Hara, and K. Kosuge, "Passive-type intelligent walking support system RT Walker," in Proc. 2004 IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 3871-3876, 2004.

[4] G. Wasson, J. Gunderson, S. Graves and R. Felder “Effective shared control in cooperative mobility aids,” in Proc. Fourteenth International Florida Artificial Intelligence Research Society Conference, pp. 509-513, 2001.

[5] C. Huang, G. Wasson, M. Alwan, P. Sheth, and A. Ledoux, "Shared navigation control and user intent detection in an intelligent walker," in Proc. AAAI Symp. (EMBC), Arlington, 2005.

[6] Y. Hirata, A. Hara and K. Kosuge, “Motion control of passive intelligent walker using Servo Brakes,” IEEE Trans. on Robotics, pp. 981-990, 2007.

[7] H. H. Chiang, Y. L. Chen, and C. T. Lin, “Human-robot interactive assistance of robotic walking support in a home environment,” in Proc. IEEE 17th International Symposium on Consumer Electronics, pp. 263-264, 2013.

[8] B. Kosko, Neural networks and fuzzy systems, Prentice Hall, New Jersey, 1992.

[9] R. Fierro and F. L. Lewis, "Control of a nonholonomic mobile robot: backstepping kinematics into dynamics," in Proc. 34th IEEE Conf. Decision Control, pp. 3805-3810, 1995.
