
Research Article

Development of Robust Behaviour Recognition for an at-Home Biomonitoring Robot with Assistance of Subject Localization and Enhanced Visual Tracking

Nevrez Imamoglu,1 Enrique Dorronzoro,2,3 Zhixuan Wei,4 Huangjun Shi,5 Masashi Sekine,1 José González,6 Dongyun Gu,5 Weidong Chen,4 and Wenwei Yu2

1 Graduate School of Engineering, Chiba University, Chiba 263-8522, Japan
2 Research Center for Frontier Medical Engineering, Chiba University, Chiba 263-8522, Japan
3 Department of Electronic Technology, University of Seville, 41012 Seville, Spain
4 Institute of Robotics and Intelligent Information Processing, Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China
5 Department of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai 200030, China
6 Spanish National Research Council, Bioengineering Group, 28500 Madrid, Spain

Correspondence should be addressed to Nevrez Imamoglu; nevrez@gmail.com

Received 20 August 2014; Accepted 4 November 2014; Published 21 December 2014

Academic Editor: Ji-Xiang Du

Copyright © 2014 Nevrez Imamoglu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Hindawi Publishing Corporation, The Scientific World Journal, Volume 2014, Article ID 280207, 22 pages, http://dx.doi.org/10.1155/2014/280207

Our research is focused on the development of an at-home health care biomonitoring mobile robot for people in need. The main task of the robot is to detect and track a designated subject while recognizing his/her activity for analysis and to provide warning in an emergency. In order to push the system forward towards its real application, in this study we tested the robustness of the robot system against several major environment changes, control parameter changes, and subject variation. First, an improved color tracker was analyzed to find out the limitations and constraints of the robot visual tracking considering the suitable illumination values and tracking distance intervals. Then, regarding subject safety and continuous robot based subject tracking, various control parameters were tested on different layouts in a room. Finally, the main objective of the system is to detect walking activities of different patterns for further analysis. Therefore, we proposed a fast, simple, and person specific new activity recognition model by making full use of localization information, which is robust to partial occlusion. The proposed activity recognition algorithm was tested on different walking patterns with different subjects, and the results showed high recognition accuracy.

1. Introduction

At-home biomonitoring systems have become an important solution in the medical field as an assistive technology for people who have difficulties leaving their houses, such as elderly people or motor-function impaired persons (MIPs) [1–8]. Benefits of these systems include more convenient and comfortable ways of healthcare for patients and a reduction in the workload of therapists [5–8].

At-home healthcare applications cover a wide range of topics regarding physiological measurements or biomonitoring applications. Biomonitoring research in at-home health care emerged from the demand for improving the quality of life (QoL) of the people, such as measuring vital signs, checking their health through the measurements, identifying abnormalities from long-term at-home observation, and monitoring rehabilitation assistance.

In this research, our concern is mostly focused on the observation and analysis of motor function-related daily activities of people in need, such as the elderly and motor-function impaired people. For this reason, tracking the subject and detecting daily activities are crucial tasks to be performed. The system should record data of continuous walking patterns for further analysis by experts or computer-aided diagnosis systems.



Figure 1: Required modules to implement an at-home biomonitoring mobile robot for the support of the elderly or MIPs (sensor data (odometry, lidar, Kinect); mapping and localization; MIP search and detection; MIP tracking and localization; MIP behavior analysis and classification; behavior training; safety criteria; robot behavior; emergency action).

Therefore, in this study we put our primary concern on the tracking of subjects and the detection of daily activities at home with a mobile robot as an assistive technology to support people with motor-function impairment.

There are several approaches for daily observation that provide information about the activities that subjects are performing [9–16]: (i) wearable sensors, which include a wide range of different sensors such as accelerometers, and (ii) smart house systems, which implement solutions with multiple vision devices attached to certain fixed sites in a house. The main advantages of wearable systems are that they provide cheap and accurate solutions for activity recognition and analysis. The complexity of these systems regarding the recognition algorithms and maintenance is lower than that of other solutions, such as smart houses with multiple-camera systems. However, the main disadvantage of wearable sensors is the attention required from the users. Subjects can be bothered by the fact that they have to wear many sensors, and they have to be careful to avoid damaging the sensors while doing daily living activities (sitting, eating, sleeping, and so forth). Quite often, subjects have to put the sensors on and off and charge their batteries.

On the other hand, smart houses do not have these disadvantages: daily activities can be tracked by systems that include multiple observation devices, such as motion tracking systems, CMOS/CCD cameras, and color and infrared cameras such as Kinect [8, 10, 11, 17, 18]. For instance, Tamura et al. [7] proposed an at-home biomonitoring system in which automated electrocardiogram (ECG) signals are measured and observed in some arranged places at home, such as the bed, toilet, or bathroom, without using body surface electrodes. Even if smart houses provide reliable observation capabilities to detect and analyze the daily activities performed by the subjects, they also have some disadvantages. Due to the number of observation devices, these systems are expensive and hard to set up and maintain. Moreover, despite the large number of sensory devices, it is still possible to have blind spots in indoor environments that can prevent the monitoring.

As proposed in our previous works [5, 8, 19, 20], our main idea is to use assistive robotics to support the biomonitoring of elderly or motor-function impaired persons. Although a mobile system brings more complexity to handling daily observation tasks, a mobile robot solution has several advantages: (i) it is a cheaper smart system than smart houses; (ii) it can make full use of the capability of its sensors, since the sensors can be brought to a position and an angle optimal or suboptimal for observation; (iii) it avoids the multiple vision modules by using a mobile monitoring system; and (iv) it removes the need for the subjects to wear any sensors [5, 8, 19, 20], compared to wearable sensor applications. These advantages make the mobile robot a good candidate for assistive technologies. In addition, based on the Robocare project, Cesta et al. [21] made a preliminary experiment about the way people evaluate the use of a robot at home for different purposes, such as surveillance, service, or companion tasks [21, 22]. In most cases, the robot and subjects had some interactions considering the tasks for each scenario [21], such as accepting voice commands from subjects and serving. In these experiments with 40 elderly subjects, activities that usually occur during daily living were investigated [21]. After answering a questionnaire, the experiment subjects gave a positive response [21] and supported the idea of having mobile robotics applications for at-home assistive tasks.

1.1. Background. Therefore, in our previous research [5, 8, 19, 20], we presented an at-home biomonitoring mobile robot project to improve the quality of life of motor-function impaired persons (MIPs). The robot observes the subject and recognizes the activity he or she is performing. It is also able to provide analysis of the walking pattern if the lower limb joint trajectories are traceable [5, 8]. To achieve these tasks, there are several modules that have to be developed; Figure 1 presents the scheme of these modules.

Until this moment of our current research, we have focused on some of these modules. Due to the low resolution image, high illumination changes, and low texture details in our real time applications, color-based trackers supported by depth data are the most suitable options. So we developed a visual tracking model in [20] using an improved color tracking algorithm that takes advantage of a particle filter model, depth data, and a bottom-up saliency map [20]. Saliency maps are visual attention computational models inspired by the visual system [20]. The idea is to identify attentive regions on the scene by reducing redundancy, which improved visual tracking in [20] when combined with the particle filter color tracker. The particle filter


based color tracker is a robust tracker; however, there are many parameters to consider: the motion of the robot should be integrated into the particle motion model to increase the visual tracker performance, which is very difficult to implement because of the nondeterministic and uncertain behaviour of the robot, which depends on the subject position. For the robot behaviour, in most cases we have made experiments using a reactive control model [5, 8, 19] for the robot movement while tracking the subject of interest. However, without a global reference of robot and subject positions, it is very difficult to implement a robust mobile tracking system. By knowing the global position, it is possible to have different options for the movement of the robot using path planning algorithms.

In addition, activity recognition regarding the MIP behaviour analysis and classification (see Figure 1) was accomplished by analyzing features from the human body contour. This was done by extracting height and width ratios of several body parts from the binary images of extracted subject tracking regions, and then a hidden Markov model (HMM) [5, 8] was used to classify the current action by observing the changes in the body structure using the joint angles of the lower limb [5, 8]. Moreover, normal walking and impaired walking were differentiated by HMM-based classification with high accuracy. However, the problem with the features used in [5, 8] is that they are not robust when there is partial occlusion of the lower limb part of the body. The lower limb part of the body is crucial to discriminate between walking and standing during the activity recognition process. Also, this algorithm requires specific training models for each subject separately, so it is hard to obtain good activity detection results, especially for dynamic activities such as walking, for several subjects by using only one trained model.

Due to the problems stated in the previous versions of our work [5, 8, 19, 20], several improvements and experimental analyses have been done for the robot behavior and subject tracking.

(i) First of all, instead of using reactive control, the robot operating system (ROS) [23] was integrated into our system, as many works also take advantage of ROS for mobile robot applications [23–26]. The main advantages of using ROS are the global mapping and localization functionalities [23–26] that increase the performance of the robot motion, provide more flexibility to the robot during subject tracking, and enable the use of a more robust activity recognition approach under occlusion cases.

(ii) Then, we replaced the depth and saliency improved particle filter color tracker with a new color tracking approach presented in [27]. This approach mainly replaces the particle filter model with the scale and orientation adaptive mean shift tracker (SOAMST) [27], since it provides good tracking results even under scale and orientation changes [27] with fewer parameter adjustments. Depth integration [19, 20] is also done for the SOAMST algorithm, to improve the robustness of visual tracking.

1.2. Contribution. Despite the several stated improvements to our robot structure, the main contributions of this paper are summarized in the following points.

(i) To be able to achieve robust robot tracking and activity recognition, visual tracking should be satisfactory enough to find the subject on the scene under changing environmental conditions. For example, different light conditions, which are also affected by the pose and distance between subject and robot, can affect the color information, which can lead to possible false tracking. So we have researched how illumination changes affect the visual tracking, which is crucial in daily at-home biomonitoring. By visual tracking analysis with different illumination conditions, it is possible to determine which lux (quantitative measurement of the light conditions) values provide better or worse tracking results. Then, robot tracking can be improved with the consideration of these constraints, and, as future work, the robot would be able to switch between depth-based and color-based tracking to increase the visual tracking performance.

(ii) There are many works on the analysis of navigation and path planning for mobile robots and on how the environmental layout affects the behaviour during navigation [28–30], without the consideration of continuous subject tracking. But there are no studies focused on analyzing the robot motion related parameters for robot based subject tracking with regard to the visual tracking constraints, such as illumination and distance, environmental settings, smoothness of the tracking path of the robot, and subject safety. Therefore, in this work, several experiments were done to analyze and find the suitable parameters for our indoor biomonitoring robot task.

(iii) With the integration of localization and mapping algorithms, we propose a new localization assisted activity recognition algorithm, which suggests a fast model and is robust to occlusion cases where the lower limb body is not visible. A state machine-based activity recognition approach is used, where the global location and the height and ratio of the upper body part within a defined region of the subject are used. There is no need for training data, because the model utilizes person specific heuristics to classify the activity of the subject by considering activity state changes.

The paper continues as follows. Section 2 describes the robot platform, including both hardware and software components, considering ROS and its integration into the system, visual and robot based tracking, and activity recognition. Section 3 gives the visual tracking tests and the details of the illumination change based experiments on color based visual tracking. Then, Section 4 shows the experimental analysis of robot behaviour, with consideration of the illumination constraints obtained from the visual tracking analysis with different lighting conditions, during subject tracking with different parameters for robot path planning. Section 5 demonstrates the experimental results of the proposed activity recognition


Figure 2: System structure including hardware and software components (hardware: Pioneer P3-DX, Kinect on a rotating platform, LRF, sonar; ROS: localization, odometry, global path planner, local planner; MATLAB: visual tracking, activity recognition, robot destination; MATLAB and ROS are connected through a bridge exchanging the local and global subject pose, robot pose, goal, path, and speed commands).

Figure 3: Mobile robot hardware structure (Pioneer P3-DX base platform, PC base, laser range finder (LRF), rotating table, and Kinect; dimensions in the figure: 887 mm, 400 mm, 450 mm).

algorithm. Finally, discussion and conclusion remarks are given.

2. System Architecture

The system structure is given in Figure 2, where all the components, hardware and software, of the system are presented. The hardware is composed of the robot platform and its components. The software part includes ROS [23, 25] and algorithms in MATLAB code to perform the given tasks, such as visual tracking, robot tracking, and activity recognition.

2.1. System Hardware. The robot hardware (Figure 3) is assembled on a Pioneer P3-DX base equipped with a laser range finder (LRF) and a Kinect sensor on a rotating platform (Figure 4). The LRF is used for mapping, localization, and obstacle avoidance. The Kinect is used for subject detection, and it is mounted on a rotating platform in order to extend the detection range, so that visual tracking can be achieved even when it is difficult for the robot to change its pose.

Figure 4: Rotating table (dimensions in the figure: 160 mm, 119 mm, 60 mm).

2.2. System Software. The system uses ROS for robot behavior control and MATLAB for an improved visual subject tracker, activity recognition, and robot command control.

2.3. ROS Integration into the System. ROS is used for mapping, localization, and robot movement (to a point established by MATLAB), and it converts the local subject pose to the global subject pose. In detail, ROS communicates with the robot using the ROSARIA package [23]. The communication purpose is to have control of the robot and to receive odometry data. To control the robot, MATLAB sends three types of commands: rotating by an angle, going to a goal, and stopping (shown as the robot destination task in Figure 2); the communication between the ROS system and MATLAB is achieved by using the MATLAB engine. A sketch of this command interface is given below.
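As an illustration only, the following minimal Python sketch shows how the three commands map onto the standard ROS move_base action interface; the MATLAB-engine bridge used in our system is not reproduced here, and names such as RobotCommander are illustrative.

#!/usr/bin/env python
# Minimal sketch of the three commands sent to ROS (go to a goal,
# rotate by an angle, stop), assuming the standard move_base action
# interface; all names here are illustrative.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler

class RobotCommander(object):
    def __init__(self):
        self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        self.client.wait_for_server()

    def go_to_goal(self, x, y, yaw=0.0):
        # "Going to a goal": send a pose on the global map.
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = 'map'
        goal.target_pose.header.stamp = rospy.Time.now()
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        q = quaternion_from_euler(0.0, 0.0, yaw)
        (goal.target_pose.pose.orientation.x,
         goal.target_pose.pose.orientation.y,
         goal.target_pose.pose.orientation.z,
         goal.target_pose.pose.orientation.w) = q
        self.client.send_goal(goal)

    def rotate(self, x, y, new_yaw):
        # "Rotating with an angle": a goal at the current position
        # with a new heading.
        self.go_to_goal(x, y, new_yaw)

    def stop(self):
        # "Stopping": cancel the active navigation goal.
        self.client.cancel_all_goals()

if __name__ == '__main__':
    rospy.init_node('subject_tracking_commander')
    RobotCommander().go_to_goal(1.0, 2.0)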

As for the control of the robot movement that allows sending the robot to a specific point, a navigation system is implemented in ROS. First, we manually control the robot to explore the surrounding environment and use the Gmapping package [24, 25] to map the environment by combining the information of the LRF and odometry (see Figure 5). From this point, the ROS navigation stack, including localization, costmap, global path planner, and local planner, is used for autonomous robot motion planning. The adaptive Monte Carlo localization (AMCL) approach [26] is used to locate the current pose of the robot on the map built by Gmapping. The costmap [31] is an occupancy grid map which keeps all the obstacle information, including the static map and the dynamically updated information provided by the robot sensors. The cost of each grid cell is related to the distance between the robot and an obstacle: the closer the cell is to the obstacle, the bigger the cost, as sketched below.
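For illustration, this distance-to-cost relation can be sketched as follows; the exponential-decay form and the constants follow the usual ROS costmap_2d inflation-layer convention and are assumptions here, not values taken from our configuration.

import math

LETHAL_COST = 254      # cell occupied by an obstacle
INSCRIBED_COST = 253   # cell closer than the robot's inscribed radius

def inflation_cost(distance_to_obstacle, inscribed_radius=0.2,
                   cost_scaling_factor=10.0):
    # Cost decays exponentially with distance from the obstacle:
    # the closer the cell, the bigger the cost.
    if distance_to_obstacle <= 0.0:
        return LETHAL_COST
    if distance_to_obstacle <= inscribed_radius:
        return INSCRIBED_COST
    return int((INSCRIBED_COST - 1) *
               math.exp(-cost_scaling_factor *
                        (distance_to_obstacle - inscribed_radius)))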

According to the current position of the robot, the destination point is established by MATLAB, and, using the costmap, a Dijkstra's algorithm based global path planner is implemented to plan the shortest feasible path. Trajectory rollout [32] and the dynamic window approach (DWA) [33] are used to generate the velocity commands that are sent to the robot in order to follow the path generated by global path planning, until the robot arrives at the destination or is stopped by a MATLAB command.


Figure 5: ROS localization and path planning on the global map.

Figure 6: Manually selected region.

Since the costmap contains the obstacle information, the global path planner and local planner can avoid the obstacles automatically (see Figure 5).

2.4. Visual Tracking. The visual tracking algorithm is implemented in MATLAB using the scale and orientation adaptive mean shift tracking (SOAMST) algorithm [27]. We made an improvement by integrating Kinect depth image support to increase the importance of the weights based on the depth likelihood, as proposed in [19, 20]. In addition, saliency map integration is also possible by updating the weights as proposed in [20], but in this work we omitted saliency for real-time processing performance. The details of the SOAMST color region tracker can be found in [27], and the depth and saliency improvements can be implemented similarly to [19, 20].

The depth improved SOAMST is explained as follows [19, 20, 27].

(a) The initial target model is calculated by manually selecting the region of interest (Figure 6) on the subject, based on a normalized color histogram of the R, G, and B color channels with 16-bin quantization; the initial target position $y_0$ is decided by the target model.

Figure 7: Depth likelihood map.

(b) Based on the initial target position, initialize the target depth (the distance to the robot) from the depth data $D$ as follows:
\[ \text{tempDepth} = D(y_0 - 5 : y_0 + 5,\ x_0 - 5 : x_0 + 5), \qquad d_0 = \operatorname{round}(\operatorname{median}(\operatorname{double}(\text{tempDepth}(:)))). \tag{1} \]

(c) Set the iteration number $k$ to 0.

(d) For the current depth map, based on the previous target position $y_0$, find the depth likelihood map (Figure 7) as in (2), in which the points of the current frame whose depth is similar to that of the reference target region have the highest likelihood:
\[ P_y = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(\frac{-\left(D_y - d_0\right)^2}{2\sigma^2}\right), \tag{2} \]
where $P_y$ is the depth likelihood map calculated from the depth map $D_y$ based on the depth values at point $y$, and $d_0$ is the target depth value at the previous frame.

target depth value at previous frame(e) For the current data based on the previous target

position 1199100 find the target candidate model

(f) Calculate the weights of each pixel in the targetcandidate region as

119908119894= 119889119894

119898

sum119906=1

radic119902119906

119901119906(1199100) (3)

where 119889119894is the depth likelihood value obtained from 119875

119910

and 119902119906and 119901

119906(1199100) are the reference target model and target

candidate model of the search region(g) Then new target position can be calculated as

1199101=sum119894119909119894119908119894

sum119894119908119894

(4)

where 1199101is the new matched point for the subject position

obtained by comparing the initial target model and currentframes candidate model 119909

119894is the points in the search region

and 119908119894is the weights at 119909

119894


(h) This operation continues until the matching point converges:
\[ \Delta d = \left\| y_1 - y_0 \right\|, \qquad y_0 = y_1, \qquad \begin{cases} \text{go to step (i)} & \text{if } \Delta d < \varepsilon \text{ or } k \ge 15, \\ k = k + 1,\ \text{go to step (e)} & \text{otherwise}. \end{cases} \tag{5} \]

(i) After the new tracking point is found, the height, width, and orientation of the target candidate model can be calculated by singular value decomposition of the covariance matrix obtained from the second-order moment features:
\[ \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \times \begin{bmatrix} a^2 & 0 \\ 0 & b^2 \end{bmatrix} \times \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^{T} = \operatorname{SVD}(\operatorname{Cov}), \tag{6} \]
\[ \operatorname{Cov} = \begin{bmatrix} \mu_{20} & \mu_{11} \\ \mu_{11} & \mu_{02} \end{bmatrix}, \qquad a = k\lambda_1, \qquad b = k\lambda_2, \qquad U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}, \tag{7} \]
where $[u_{11}\ u_{21}]^{T}$ and $[u_{12}\ u_{22}]^{T}$ yield the orientation of the tracked region, and $a^2$ and $b^2$ are the scaled eigenvalues of $\operatorname{Cov}$ in (7), which represent the height and width of the tracked ellipsoid region.

(j) The next step is to find the search region for the next frame:
\[ \operatorname{Cov}_2 = U \times \begin{bmatrix} (a + \Delta d)^2 & 0 \\ 0 & (b + \Delta d)^2 \end{bmatrix} \times U^{T}, \tag{8} \]
\[ (x - y_1) \times \operatorname{Cov}_2^{-1} \times (x - y_1)^{T} \le 1, \tag{9} \]
where the points $x$ that satisfy the condition in (9) form the next search region, and $\Delta d$ is the enlargement of the search region in the next frame.

2.5. Activity Recognition. With the integration of mapping and localization, a new activity recognition approach has been proposed to handle the problems of our previous approach, such as occlusion cases [5, 8]. This update includes a new, fast, and person specific model that does not require any training process using a large amount of off-line training data. The new model takes advantage of person specific heuristics, state transition rules, and the global localization of the subject, utilizing the information obtained through robot localization. Using these features, the model classifies the possible indoor daily activities, such as walking, standing, sitting, bending, lying down, cycling, and falling. The state transition rules between activities are given in Figure 8.

Figure 8: State transition applied in the activity recognition model (states: standing, walking, sitting, bending, lying-down, cycling, falling).

Person specific features include the following variables:

(1) a height parameter based on the extracted contour region of the tracked subject;

(2) the change in the height parameter between frames;

(3) a ratio parameter calculated from the top region of the extracted body contour (see Figure 9);

(4) the subject location, based on the tracking point on the subject calculated by the visual tracking algorithm using the Kinect and the robot global position obtained by lidar and odometry; the transformation from local Kinect 3D data to the global 2D map is given in (10) [34] (a short sketch of this transformation follows the list):
\[ \begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = \mathbf{A} \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix}, \tag{10} \]
\[ \mathbf{A} = \begin{bmatrix} s_x \cos(\alpha) & -s_y \sin(\alpha) \\ s_x \sin(\alpha) & s_y \cos(\alpha) \end{bmatrix}, \tag{11} \]
\[ T_{Gx} = R_{Gx} - d\cos(\theta), \qquad T_{Gy} = R_{Gy} - d\sin(\theta), \tag{12} \]
where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the global 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect data of the subject on the relevant $x$-axis and $y$-axis of the Kinect with respect to the global 2D map; $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating table angle; the transformation matrix $\mathbf{A}$ is shown in (11), and $s_x$ and $s_y$ of $\mathbf{A}$ are the scaling coefficients on the $x$-axis and $y$-axis; $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$-axis and $y$-axis, respectively, due to the difference between the robot center and the Kinect position on the robot; the translation values are computed as in (12) with the given $R_{Gx}$ and $R_{Gy}$, which are the robot positions on the global 2D map, $d$ is the distance between the Kinect position and the robot center, and the robot pose on the global map is given as $\theta$;

(5) the velocity of the subject between a defined number of frames.
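A small NumPy sketch of equations (10)-(12) follows; the scaling coefficients and the Kinect offset d are illustrative placeholder values.

import numpy as np

def kinect_local_to_global(s_local, robot_xy, theta, alpha,
                           sx=1.0, sy=1.0, d=0.2):
    # Equation (11): rotation/scaling from Kinect axes to the map.
    A = np.array([[sx * np.cos(alpha), -sy * np.sin(alpha)],
                  [sx * np.sin(alpha),  sy * np.cos(alpha)]])
    # Equation (12): translation from the robot center (pose theta)
    # to the Kinect mounted at distance d from it.
    T = np.array([robot_xy[0] - d * np.cos(theta),
                  robot_xy[1] - d * np.sin(theta)])
    # Equation (10): subject position on the global 2D map.
    return A.dot(np.asarray(s_local, dtype=float)) + T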

As has been mentioned, the new model can be used to classify static activities such as standing, sitting, and lying down.


Figure 9: Extracted contour region: lines on the top part of the body used to calculate the ratio parameter, and the tracking point on the extracted region (sitting and bending examples).

However, features that represent static activities are not enough to detect dynamic or location based specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map, defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the global map and the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as the visual tracking and localization of the subject are successful, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions in different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different walking patterns. The biomonitoring system must recognize the walking activity in order to record the data of the subject's walking behaviour, especially for people in need such as the elderly or impaired people. In this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most of the experimental results are generally based on normal walking conditions, including other daily activities. Therefore, in this work, we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. Experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for the walking activity for different patterns can be seen in the experimental results section.

To demonstrate how the state transitions were handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and their dependency on localization information, are described in Algorithm 1.

switch State_{t-1}
    Case 0:    (standing)
        if Height_{t-1}^s - Height_t^s > S_Fall
            State_t = 5
        elseif Height_{t-1}^s - Height_t^s > S_NoFall
            if r_t > Ratio
                State_t = 3
            else
                State_t = 2
            end
        elseif Height_t^s > H_Standing
            if isOnCycling(P_t^s(G_x, G_y))
                State_t = 4
            elseif abs(P_t^s(G_x, G_y) - P_{t-N}^s(G_x, G_y)) > S_Walking
                State_t = 1
            else
                State_t = 0
            end
        end
    Case 1:    (walking)
        if Height_{t-1}^s - Height_t^s > S_Fall
            State_t = 5
        elseif isOnCycling(P_t^s(G_x, G_y))
            State_t = 4
        elseif abs(P_t^s(G_x, G_y) - P_{t-N}^s(G_x, G_y)) <= S_Walking
            State_t = 0
        end
end

Algorithm 1: State transition rules for the standing (Case 0) and walking (Case 1) states.

Table 1: Experimental data content.

Subjects    Walking case        Abbreviation    Number of frames
Subject 1   Impaired walking    IW              1000
Subject 1   Elderly walking     EW              1000
Subject 1   Normal walking      NW              1000
Subject 2   Impaired walking    IW              1000
Subject 2   Elderly walking     EW              1000
Subject 2   Normal walking     NW              1000
Total number of frames                          6000

In Algorithm 1, State_t is the activity at frame t. The activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. t is the current frame number. S_Fall and S_NoFall are the speed parameters used to decide fall or non-fall cases, based on the tracked subject heights at the previous and current frames, Height_{t-1}^s and Height_t^s. S_Walking is the subject speed threshold used to decide whether the subject is moving or not. Localization assistance uses the global subject position P_t^s(G_x, G_y) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, to discriminate activities such as sitting, bending, and lying down based on a person specific Ratio threshold.
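For illustration, the standing/walking cases of Algorithm 1 translate directly into a small Python state machine; the threshold values below are placeholders, since the person specific values are not reproduced here.

STANDING, WALKING, SITTING, BENDING, CYCLING, FALLING = 0, 1, 2, 3, 4, 5

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def next_state(state, h_prev, h_now, r_now, pos_now, pos_prev_n,
               on_cycling_zone,
               s_fall=0.5, s_nofall=0.2, h_standing=1.4,
               ratio=0.5, s_walking=0.2):
    # Restatement of Algorithm 1, Cases 0 and 1; thresholds are
    # illustrative placeholders, not the paper's values.
    drop = h_prev - h_now
    moved = distance(pos_now, pos_prev_n)
    if state == STANDING:
        if drop > s_fall:
            return FALLING
        if drop > s_nofall:
            return BENDING if r_now > ratio else SITTING
        if h_now > h_standing:
            if on_cycling_zone:
                return CYCLING
            return WALKING if moved > s_walking else STANDING
    elif state == WALKING:
        if drop > s_fall:
            return FALLING
        if on_cycling_zone:
            return CYCLING
        if moved <= s_walking:
            return STANDING
    return state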

3. Experiments on Visual Tracking Performance and the Effect of Lighting Conditions

Obviously, it is a crucial factor for the monitoring and activity recognition process to have good robot motion and robust visual tracking performance. Therefore, this work initially also focused on two other aspects: (i) visual tracking limitations that occur due to the color model changes on the tracked subject, in which the color model is affected by the illumination conditions from the light source or by the distance between robot and subject, and (ii) the experiment and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested the SOAMST [27] and the depth improved SOAMST tracking model proposed in Section 2.4. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during the tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for the elderly, impaired, and normal walking situations. Normal walking experiments were done with self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly and impaired walking cases were simulated with wearable simulation kits, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of two right-footed subjects to cause impairment of the right lower limb. For elderly walking, the kit causes walking difficulty through tied joints, such that the neck and knees were constrained, causing difficulty in standing straight and making big moves as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

Type                 SOAMST [27]          Depth improved SOAMST
                     True      False      True      False
Subject 1   IW        774       226       1000        0
Subject 1   EW        851       149       1000        0
Subject 1   NW        899       101       1000        0
Subject 2   IW       1000         0       1000        0
Subject 2   EW       1000         0       1000        0
Subject 2   NW        998         2        914       86
Overall results      5522       478       5914       86
Accuracy (%)        92.03      7.97      98.57     1.43

The experimental data in Table 1 were prepared under stable light conditions, with all light sources in the room turned on, to be able to see the improvement brought by the proposed depth likelihood integration into the SOAMST color region tracker while decreasing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For SOAMST, we have seen that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset.


Figure 11: Tracking results for SOAMST [27] and the depth likelihood map supported SOAMST.

For the proposed depth improved SOAMST, only one case yielded a worse result than the SOAMST [27] using only the color model for target matching. The reason for this failure may be sudden depth changes in the scene; also, the joint depth likelihood and color representation of a background region can have more similarity to the reference color representation. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

In Figure 11, images of the tracking results of the SOAMST and the depth likelihood map based improved SOAMST models are given. It is obvious that depth data integration into the weight updates improves the tracking results significantly, preventing the failure cases shown in the last two columns of Figure 11. But still, if the color distribution model is not convenient and the depth values of the subject and background are similar, there can be false detections or position, scale, and orientation errors on the selected region (see the 4th column of the depth improved tracking results in Figure 11).

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions to fully take advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions with respect to the light strength were carried out on SOAMST [27] for further analysis, by examining the color features and visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on the visual tracking of the changes in color information produced by the different light conditions. Depending on the changes of the color, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm. So it is possible to see which lux (quantitative measurement of the light conditions) values provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measurement spots: 1.40 m, 450–500 lux; 1.85 m, 350–400 lux; 2.30 m, 250–300 lux; 3.00 m, 150–200 lux; 3.60 m, about 100 lux; 4.95 m, about 50 lux; 6.25 m, about 40 lux).

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor, a CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), and the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving robot condition: height and width variance of the tracking region with one-color representation (red, green, blue). [The table cells are variance-versus-frame plots; the plots are omitted here.]

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters; graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks while the robot has been moved from the initial position closer to the subject, within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light conditions and the distance between the subject and the robot have a great impact on the reference target model based on the normalized color histogram for the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, in this work additional experiments were done to investigate these conditions with the SOAMST [27] color based region tracker.

Hence, in this scenario, two more tests have been done: (i) one test was to study the visual tracker while both the robot and the subject move; (ii) then, visual tracking performance was tested when the robot is at a fixed position and only the subject moves. Both experiments were done with two different scenarios, where two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval from 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for the region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, where the centroid of the target region defines the tracking point of the current frame. So tracking success or failure is decided by whether the center of the target region is on the subject or not.

Since, during the subject's motion, the distance between the robot and the subject has been kept the same in all the experiments, one would expect the search and tracked regions found by SOAMST [27] to be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates the covariance matrix based on the current search region and the reference target distribution, in which this matrix, based on the second-order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27].


Figure 14: Target feature analysis for the different colors (red, green, blue) with normalized histograms. [Histogram plots omitted.]


Figure 15: Moving robot condition: tracking results with one-color representation (red, green, blue).

Figure 16: Moving robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

Depending on the orientation of the tracked ellipsoid region, these high deviations in the region representations can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350–400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as was also shown in Figure 14 earlier.

Despite this loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it can be appreciated that the selected target and the tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color based tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) when compared to the one-color model situation. The impact of the light conditions proved to be similar to the previous scenarios, where only a one-color representation was used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is at a fixed position, the same position as demonstrated in Figure 12. The camera records data while the subject is moving between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the same distance is not preserved between the subject and the robot as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

Table 4: Moving robot condition: height and width variance of the tracking region with two-color representation (red and blue, green and blue, blue and dark blue). [The table cells are variance-versus-frame plots; the plots are omitted here.]

When the distance between the subject and the robot was kept in a constant interval, tracking could be done even at low light conditions, such as 100 lux, even though this was not the desired case. When using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green color based tracking (see Figure 17, 4th column) failed, because the target color model matched an irrelevant region inside the search area that also had a color similar to the reference target model. Finally, the red and blue cases could barely continue tracking the subject (Figure 17). So light conditions below 50 lux with a distance greater than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color and fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on illumination and distance factors, the results indicate that some constraints should be applied to the robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect the path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, due to the fact that it expresses the obstacle definitions, or the available free space for the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance test. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering the subject and robot safety criteria.


Figure 17: Fixed robot condition: tracking results with one-color representation (red, green, blue).

Figure 18: Fixed robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

The second test analyzes the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

41 Experiment Setup The experiment setup is done bydefining a set of configuration parameters and building thelaboratory map To set up the experiment the first step isto create the laboratory maps using mapping function of therobot as in Figure 19 for the three different layouts used inthese experiments Also in our experiment the same roomwas used in all tests but different layouts also were testedby making smaller or larger free space for checking robotbehaviour in different situations In these layouts setting theexperiment in the same room or in a different room was notvery crucial since the important point is testing the robotmotion in different situations with various free space optionsThese layouts (see Figure 20) may not present a problem forthe system for autonomous navigation but they do so forperforming navigation with subject tracking task based on

Blac

kboa

rd

Blac

kboa

rdD

esk

130

137160

148105

190

70

Figure 19 Sample for experimental layout (the numerical values incentimeters) and global map

the visual tracking feedback For the visual tracking systemthe robot must keep a direct visible line with the subject forvision feedback and also it has to update the path to follow


Figure 20: Experiment layouts and subject tracks (numerical values in centimeters): Layout 1, Layout 2, and Layout 3.


The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room, which makes it more difficult for the robot to move, as it is forced to follow the subject along the only direct path available. The main differences between the second and the third layouts are the size and shape of the central obstacle: in the third layout the obstacle is bigger, reducing the path available to the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is greater than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for distances greater than 3.6 meters the color-based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or poor texture detail. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them below 3.6 meters. Besides, the robot considers that the subject is moving when the subject's position changes by at least 20 cm on the global map, as described in the activity recognition experiments.
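As a minimal sketch of these two decision rules (the function and variable names are ours, not the paper's):

```python
import math

# Thresholds from the paper: the robot starts moving when the subject is
# more than 1.2 m away; the subject counts as moving after a displacement
# of at least 0.20 m on the global map.
FOLLOW_DISTANCE = 1.2  # meters
MOVE_THRESHOLD = 0.20  # meters

def dist(p, q):
    """Euclidean distance between two (x, y) points on the global map."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def robot_should_move(robot_xy, subject_xy):
    """True when the subject has drifted beyond the follow distance."""
    return dist(robot_xy, subject_xy) > FOLLOW_DISTANCE

def subject_is_moving(prev_subject_xy, subject_xy):
    """True when the subject's global position changed by at least 20 cm."""
    return dist(prev_subject_xy, subject_xy) >= MOVE_THRESHOLD
```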

As stated previously, there are two important parameters for robot motion studied in this part of the paper, in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the ROS-based system for robot motion (Figure 2), these parameters are modeled with the following variables:

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values this parameter can take represent the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml. A sample of how these fields might look is sketched below.
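For illustration only, the two variables might appear as follows (example values, not the paper's exact files; note that in standard ROS navigation setups max_vel_x is usually declared under the local planner's namespace rather than in the costmap file):

```yaml
# Local planner parameters (sketch):
TrajectoryPlannerROS:
  max_vel_x: 0.60        # maximum forward velocity (m/s)

# costmap_common_params.yaml (sketch): obstacle cost inflation
inflation_radius: 0.35   # radius (m) to which obstacle costs are inflated
```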

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of robot deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" on the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test obtained by modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot does not crash into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.


Table 5: Robot collision test results (Yes indicates a collision with the subject).

Speed (m/s) \ Inflation radius (m)    0.05    0.35
0.70                                  Yes     Yes
0.65                                  Yes     Yes
0.60                                  No      No


4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another by following the paths presented in Figure 20, in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost track of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast pace. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this, it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05         0.15     0.25     0.35
0.50                                  Yes (+30)    No(3)    No(3)    Yes (+40)
1.00                                  Yes          No(2)    No(2)    No(2)
1.50                                  Yes (+15)    No(2)    No(3)    No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.


Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot beyond that value, as the subject is walking faster, so there is no risk that the robot collides with him. The speed values tested in this scenario are 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject goes four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds, so each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, and the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed that prevents the robot from colliding with the subject, as proven in the previous section. Other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout it got stuck sometimes, but it was able to recover the tracking and finish the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05    0.15    0.25      0.35      0.45
0.60                                  Yes     Yes     Yes       Yes       Yes
0.50                                  Yes     Yes     Yes       Yes(4)    Yes(4)
0.40                                  Yes     Yes     Yes(4)    Yes(4)    Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m)    0.05    0.15    0.25    0.35    0.45
0.60                                  Yes     Yes     Yes     Yes     Yes
0.50                                  Yes     Yes     Yes     Yes     Yes
0.40                                  Yes     Yes     Yes     Yes     Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05     0.15      0.25      0.35     0.45
0.60                                  No(1)    Yes(4)    Yes       No(1)    Yes
0.50                                  No(1)    No(1)     Yes(4)    Yes(4)   Yes
0.40                                  No(1)    No(1)     No(1)     No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered; the index (1), for incomplete tracks, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot had no problem tracking the subject in any of the combinations of speed and inflation radius. The main reason for the good results is that in this kind of layout the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. Blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from holding a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, and it allows the robot to move around without serious difficulties. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s. For this speed value there was no success in any of the tests, independent of the inflation radius value. The reason the robot did not complete the track is that it got stuck during tracking, especially on entering the corner regions, which restricted the robot from finding a free path to move. In this layout, faster speeds increased the success ratio of the experiment.

Figure 21: Track for layout 1 and robot path during the experiment.

For the 12 tests made, 6 of them present good results, while the robot was not able to complete the 10 rounds of the track in 8 of the remaining results. As presented in Table 9, the results are not as good as for the previous experiments. The narrow paths and obstacles drastically reduce the success rate of the experiments.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, as given in Table 1, were used to test the activity recognition algorithm, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority than other activities, since the walking patterns can be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy, due to the high detection rate of static actions.

In the experiments, person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
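A minimal sketch of this velocity heuristic, assuming a tracker that outputs the subject's global (x, y) position each frame (the names are ours, not the paper's):

```python
import collections
import math

# Velocity heuristic from the paper, defined on impaired walking (IW):
# a displacement of 20 cm over a 10-frame window, roughly 0.2 m/s here.
WINDOW = 10           # frames
DISPLACEMENT = 0.20   # meters

positions = collections.deque(maxlen=WINDOW)  # global (x, y) history

def update_and_check_walking(xy):
    """Push the newest global position and test the walking heuristic."""
    positions.append(xy)
    if len(positions) < WINDOW:
        return False  # not enough history yet
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    return math.hypot(dx, dy) >= DISPLACEMENT
```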


Table 10: Activity recognition results for different types of walking patterns.

Subject      Type    True    False    Accuracy
Subject 1    IW      984     16       98.40%
             EW      974     26       97.40%
             NW      947     53       94.70%
Subject 2    IW      976     24       97.60%
             EW      963     37       96.30%
             NW      931     69       93.10%
Overall performance  5775    225      96.25%


Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its steady state, excluding transition periods such as changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of standing, sitting, bending, or lying down. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11, the recognition results for various activities are given, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.

Activity      True    False    Accuracy
Standing      1000    0        100.00%
Sitting       1000    0        100.00%
Bending       997     3        99.70%
Lying down    1000    0        100.00%
Cycling       1000    0        100.00%
Falling       991     9        99.10%
Overall       5988    12       99.80%

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the changes in pose and height during each bending activity, which might have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized with 100.00% accuracy. Also, the falling activity was recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, as no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even though there is no clear information about lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.
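As a rough sketch of this localization-assisted state logic (the thresholds and the cycling-spot test are illustrative placeholders, not the paper's tuned person-specific parameters):

```python
def classify_activity(height, walking_height, global_disp, at_cycling_spot):
    """Toy version of the localization-assisted state logic.

    height          -- current tracked height of the subject
    walking_height  -- the subject's reference height in the walking pose
    global_disp     -- displacement on the global map over the test window
    at_cycling_spot -- whether the subject is at the cycling machine's
                       known map location (localization assistance)
    All thresholds below are illustrative, not the paper's values.
    """
    if at_cycling_spot:
        return "cycling"
    if height < 0.35 * walking_height:
        return "lying down"
    if height < 0.75 * walking_height:
        return "sitting/bending"
    # Height comparable to the walking pose: standing vs. walking is
    # decided by global motion alone, so no lower-limb features are
    # needed and partial occlusion does not break the decision.
    return "walking" if global_disp >= 0.20 else "standing"
```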

6. Discussion

This section provides an analysis of the results presented in the previous experiments, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that multiple-sensor information fusion is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to handle various illumination conditions, different sensor types can support the system, such as infrared sensors in the case of low-illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks in which the subject is tracked by the mobile robot during daily activities. So it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to utilize a color-based visual tracker, considering distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should hold in the environment for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 or 200 lux for better results, and (2) for the given lux and distance specifications, as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough that the robot does not collide with obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. This deceleration distance may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

Experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less prone to losing the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius value of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject always walks at the same pace. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds.


Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.

The subject has to wait for the robot to free itself, and that increases the time required to complete the track.

The situation when the subject stops is different from the one previously exposed. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the subject stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Moving freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius for the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and inflation radii over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius the robot can get close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason not to complete the track was getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But under nonideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided robust visual tracking and robot-based subject tracking, it is possible to achieve activity recognition for different subjects and walking patterns with a mobile robot using the same simple heuristics, which is crucial for recording reliable data of the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to handle each time the subject changes. Also, those calculations, including the state-of-the-art models, generally require whole-body contour features for walking or dynamic activity detection. With the proposed model, in contrast, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy for all activities tested over 12,000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important in providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity cases for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the effect of illumination conditions on the visual tracking process, and the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been proven that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more. When this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor; better results are observed when the subject-to-robot distance is less than 3.6 meters.


Since the reliability of color-based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and psychological responses to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values provokes a collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization-assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) determination of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: HealthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.
[31] Costmap, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.




Figure 1: Required modules to implement an at-home biomonitoring mobile robot for the support of the elderly or MIPs (MIP search and detection; MIP behavior analysis and classification; mapping and localization; MIP tracking and localization; sensor data: odometry, lidar, Kinect; emergency action; robot behavior; safety criteria; behavior training).

computer-aided diagnosis systems. Therefore, in this study, we put our primary concern on tracking of subjects and detection of daily activities at home with a mobile robot, as an assistive technology to support people with motor-function impairment.

There are several approaches for daily observation that provide information about the activities the subjects are performing [9–16]: (i) wearable sensors, which include a wide range of different sensors such as accelerometers, and (ii) smart house systems, which implement solutions with multiple vision devices attached to certain fixed sites in a house. The main advantages of wearable systems are that they provide cheap and accurate solutions for activity recognition and analysis. The complexity of these systems, regarding the recognition algorithms and maintenance, is lower than that of other solutions such as smart houses with multiple-camera systems. However, the main disadvantage of wearable sensors is the attention required from the users. Subjects can be bothered by the fact that they have to wear many sensors, and they have to be careful to avoid damaging the sensors while doing daily living activities (sitting, eating, sleeping, and so forth). Quite often, subjects have to put the sensors on and off and charge their batteries.

On the other hand, smart houses do not have these disadvantages: daily activities can be tracked by systems that include multiple observation devices, such as motion tracking systems, CMOS/CCD cameras, and color and infrared cameras such as the Kinect [8, 10, 11, 17, 18]. For instance, Tamura et al. [7] proposed an at-home biomonitoring system in which automated electrocardiogram (ECG) signals are measured and observed in some arranged places at home, such as the bed, toilet, or bathroom, without using body-surface electrodes. Even if such systems provide reliable observation capabilities to detect and analyze the daily activities performed by the subjects, they also have some disadvantages. Due to the number of observation devices, these systems are expensive and hard to set up and maintain. Moreover, despite the large number of sensory devices, it is still possible to have blind spots in indoor environments that can prevent the monitoring.

As proposed in our previous works [5, 8, 19, 20], our main idea is to use assistive robotics to support the biomonitoring of elderly or motor-function impaired persons. Although a mobile system brings more complexity to daily observation tasks, a mobile robot solution has several advantages: (i) it is a cheaper smart system than a smart house; (ii) it can make full use of the capability of the sensors, since they can be brought to a position and an angle optimal or suboptimal for observation; (iii) it avoids multiple vision modules by using a mobile monitoring system; and (iv) it removes the need for the subjects to wear any sensors [5, 8, 19, 20], compared to wearable sensor applications. These advantages make the mobile robot a good candidate for assistive technologies. In addition, within the Robocare project, Cesta et al. [21] made a preliminary experiment on the way people evaluate the use of a robot at home for different purposes, such as surveillance, service, or companion tasks [21, 22]. In most cases the robot and subjects had some interactions depending on the tasks of each scenario [21], such as the robot accepting voice commands from subjects and serving them. In these experiments with 40 elderly subjects, activities that usually occur during daily living were investigated [21]. After filling in a questionnaire, the experiment subjects gave a positive response [21] and supported the idea of having mobile robotics applications for at-home assistive tasks.

1.1. Background. In our previous research [5, 8, 19, 20], we presented an at-home biomonitoring mobile robot project to improve the quality of life of motor-function impaired persons (MIPs). The robot observes the subject and recognizes the activity he is performing. It is also able to provide analysis of the walking pattern if the lower limb joint trajectories are traceable [5, 8]. To achieve these tasks, several modules have to be developed; Figure 1 presents the scheme of these modules.

Until this moment of our current research, we have focused on some of these modules. Due to the low resolution images, high illumination changes, and low texture details in our real-time applications, color-based trackers supported by depth data are the most suitable options. So we developed a visual tracking model in [20] using an improved color tracking algorithm that takes advantage of a particle filter model, depth data, and a bottom-up saliency map [20]. Saliency maps are visual attention computational models inspired by the visual system [20]; the idea is to identify attentive regions in the scene by reducing redundancy, which improved visual tracking in [20] when combined with the particle filter color tracker.


A particle filter based color tracker is a robust tracker; however, there are many parameters to consider, and the motion of the robot should be integrated into the particle motion model to increase the visual tracker performance, which is very difficult to implement because of the indeterministic and uncertain behaviour of the robot, which depends on the subject position. For the robot behaviour, in most cases we have made experiments using a reactive control model [5, 8, 19] for the robot movement while tracking the subject of interest. However, without a global reference for the robot and subject positions, it is very difficult to implement a robust mobile tracking system. By knowing the global position, it is possible to have different options for the movement of the robot using path planning algorithms.

In addition, activity recognition, regarding the MIP behaviour analysis and classification (see Figure 1), was accomplished by analyzing features from the human body contour. This was done by extracting height and width ratios of several body parts from the binary images of the extracted subject tracking regions; then a Hidden Markov model (HMM) [5, 8] was used to classify the current action by observing the changes in the body structure, using the joint angles of the lower limb [5, 8]. Moreover, normal walking and impaired walking were differentiated by HMM-based classification with high accuracy. However, the problem with the features used in [5, 8] is that they are not robust when there is partial occlusion of the lower limb part of the body, which is crucial to discriminate between walking and standing during the activity recognition process. Also, this algorithm requires specific training models for each subject separately, so it is hard to obtain good activity detection results, especially for dynamic activities such as walking, for several subjects using only one trained model.

Due to the problems stated in the previous versions of our work [5, 8, 19, 20], several improvements and experimental analyses have been carried out for robot behavior and subject tracking:

(i) First of all, instead of using reactive control, the robot operating system (ROS) [23] was integrated into our system, as many works also take advantage of ROS for mobile robot applications [23–26]. The main advantages of using ROS are the global mapping and localization [23–26] functionalities, which increase the performance of the robot motion, provide more flexibility to the robot during subject tracking, and enable the use of a more robust activity recognition approach under occlusion cases.

(ii) Then, we replaced the depth- and saliency-improved particle filter color tracker with a new color tracking approach presented in [27]. This approach mainly replaces the particle filter model with the scale and orientation adaptive mean shift tracker (SOAMST) [27], since it provides good tracking results even under scale and orientation changes [27] with fewer parameter adjustments. Depth integration [19, 20] was also done for the SOAMST algorithm, to improve the robustness of visual tracking.

1.2. Contribution. Despite the several stated improvements to our robot structure, the main contribution of this paper is summarized in the following points:

(i) To achieve robust robot tracking and activity recognition, visual tracking should be good enough to find the subject in the scene under changing environmental conditions. For example, different light conditions, which are also affected by the pose of and distance between subject and robot, can affect the color information, which can lead to false tracking. So we have researched how illumination changes affect visual tracking, which is crucial in daily at-home biomonitoring. By analyzing visual tracking under different illumination conditions, it is possible to determine which lux values (quantitative measurements of the light conditions) provide better or worse tracking results. Robot tracking can then be improved by considering these constraints and, as future work, the robot would be able to switch between depth-based and color-based tracking to increase visual tracking performance.

(ii) There are many works on the analysis of navigation and path planning for mobile robots, and on how the environmental layout affects behaviour during navigation [28–30], but without consideration of continuous subject tracking. There are no studies focused on analyzing the robot motion parameters for robot-based subject tracking with respect to the visual tracking constraints, such as illumination and distance, environmental settings, smoothness of the robot's tracking path, and subject safety. Therefore, in this work, several experiments were done to analyze and find suitable parameters for our indoor biomonitoring robot task.

(iii) With the integration of localization and mapping algorithms, we propose a new localization-assisted activity recognition algorithm, which is fast and robust to occlusion cases where the lower limbs are not visible. A state-machine-based activity recognition approach is used, in which the global location and the height and ratio of the upper body part within a defined region of the subject are used. There is no need for training data, because the algorithm utilizes person-specific heuristics to classify the activity of the subject by considering activity state changes.

The paper continues as follows. Section 2 describes the robot platform, including both hardware and software components, considering ROS and its integration into the system, visual and robot-based tracking, and activity recognition. Section 3 gives the visual tracking tests and the details of the illumination-change experiments on color-based visual tracking. Section 4 then shows the experimental analysis of robot behaviour, with consideration of the illumination constraints obtained from the visual tracking analysis under different lighting conditions, during subject tracking with different parameters for robot path planning. Section 5 demonstrates the experimental results of the proposed activity recognition algorithm. Finally, discussion and conclusion remarks are given.


Figure 2: System structure including hardware and software components (Pioneer P3-DX with LRF, sonar, and Kinect on a rotating platform; ROS: localization, odometry, costmap, global path planner, local planner; MATLAB: visual tracking, activity recognition, robot destination).

Figure 3: Mobile robot hardware structure (Pioneer P3-DX base platform with PC base, laser range finder (LRF), rotating table, and Kinect).


2. System Architecture

The system structure is given in Figure 2, where all components of the system, hardware and software, are presented. The hardware comprises the robot platform and its components. The software includes ROS [23, 25] and algorithms implemented in MATLAB to perform the given tasks, such as visual tracking, robot tracking, and activity recognition.

2.1. System Hardware. The robot hardware (Figure 3) is assembled on a Pioneer P3-DX base equipped with a laser range finder (LRF) and a Kinect sensor on a rotating platform (Figure 4). The LRF is used for mapping, localization, and obstacle avoidance. The Kinect is used for subject detection, and it is mounted on a rotating platform in order to extend the detection range, so that visual tracking can be achieved even when it is difficult for the robot to change its pose.

Figure 4: Rotating table.


2.2. System Software. The system uses ROS for robot behavior control and MATLAB for the improved visual subject tracker, activity recognition, and robot command control.

2.3. ROS Integration into the System. ROS is used for mapping, localization, and robot movement (to a point established by MATLAB), and it converts the local subject pose to the global subject pose. In detail, ROS communicates with the robot using the ROSARIA package [23]. The purpose of this communication is to control the robot and to receive odometry data. To control the robot, MATLAB sends three types of commands: rotating by an angle, going to a goal, and stopping (shown as the robot destination task in Figure 2); the communication between the ROS system and MATLAB is achieved using the MATLAB engine.
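The paper's MATLAB engine bridge is not listed here; as an equivalent sketch, the three command types can be issued through the standard move_base action interface of the ROS navigation stack (node and function names are illustrative):

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler

rospy.init_node('subject_follower')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

def go_to(x, y, yaw):
    """'Going to a goal' / 'rotating' command: send a pose on the map."""
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    q = quaternion_from_euler(0.0, 0.0, yaw)
    (goal.target_pose.pose.orientation.x,
     goal.target_pose.pose.orientation.y,
     goal.target_pose.pose.orientation.z,
     goal.target_pose.pose.orientation.w) = q
    client.send_goal(goal)

def stop():
    """'Stopping' command: cancel the active navigation goal."""
    client.cancel_all_goals()
```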

As for the control of the robot movement that allows sending the robot to a specific point, a navigation system is implemented in ROS. First, we manually control the robot to explore the surrounding environment and use the Gmapping package [24, 25] to map the environment by combining the information from the LRF and odometry (see Figure 5). From this point, the ROS navigation stack, including localization, costmap, global path planner, and local planner, is used for autonomous robot motion planning. The adaptive Monte Carlo localization (AMCL) approach [26] is used to locate the current pose of the robot on the map built by Gmapping. The costmap [31] is an occupancy grid map which keeps all the obstacle information, including the static map and dynamic updating information provided by the robot sensors. The cost of each grid cell is related to the distance between the robot and an obstacle: the closer the robot is to an obstacle, the larger the cost.
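For illustration, the exponential cost decay used by the ROS costmap_2d inflation layer can be sketched as follows (the radii and scaling factor below are example values, not the paper's):

```python
import math

def inflation_cost(distance, inscribed_radius=0.18,
                   inflation_radius=0.35, cost_scaling_factor=10.0):
    """Cell cost as a function of distance to the nearest obstacle,
    mirroring the exponential decay of the ROS costmap_2d inflation layer."""
    LETHAL = 254       # cell lies on the obstacle itself
    INSCRIBED = 253    # robot footprint certainly collides
    if distance == 0.0:
        return LETHAL
    if distance <= inscribed_radius:
        return INSCRIBED
    if distance > inflation_radius:
        return 0       # beyond the inflation radius: free space
    # exponential decay from INSCRIBED - 1 down toward 0
    return int((INSCRIBED - 1) *
               math.exp(-cost_scaling_factor * (distance - inscribed_radius)))
```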

According to the current position of the robot, the destination point is established by MATLAB, and a Dijkstra's-algorithm-based global path planner uses the costmap to plan the shortest feasible path. Trajectory rollout [32] and the dynamic window approach (DWA) [33] are used to generate the velocity commands sent to the robot in order to follow the path generated by the global path planner, until the robot arrives at the destination or is stopped by a MATLAB command.


Figure 5: ROS localization and path planning on the global map.

Figure 6: Manually selected region.

it is stopped by a MATLAB command. Since the costmap contains the obstacle information, the global path planner and local planner can avoid the obstacles automatically (see Figure 5).
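As an illustration of this distance-dependent cost, the sketch below uses an exponential decay of the kind employed by the ROS costmap inflation layer; the cell_cost helper and its numeric constants (inscribed radius, scaling factor) are illustrative assumptions, not values from the authors' configuration:

import math

def cell_cost(dist_to_obstacle, inscribed_radius=0.18, scaling_factor=10.0):
    """Illustrative costmap cell cost: lethal on the obstacle itself, then
    decaying exponentially with distance (ROS-inflation-layer style)."""
    LETHAL, MAX_NON_LETHAL = 254, 253
    if dist_to_obstacle <= 0.0:
        return LETHAL
    if dist_to_obstacle <= inscribed_radius:
        return MAX_NON_LETHAL  # inside the footprint: certain collision
    return int(MAX_NON_LETHAL *
               math.exp(-scaling_factor * (dist_to_obstacle - inscribed_radius)))

for d in (0.0, 0.1, 0.3, 0.6):
    print(d, cell_cost(d))  # cost shrinks as the distance grows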

2.4. Visual Tracking. The visual tracking algorithm is implemented in MATLAB using the scale and orientation adaptive mean shift tracking (SOAMST) algorithm [27]. We made an improvement by integrating Kinect depth image support to increase the importance of the weights based on the depth likelihood, as proposed in [19, 20]. In addition, saliency map integration is also possible by updating the weights as proposed in [20], but in this work we omitted saliency for real-time processing performance. The details of the SOAMST color region tracker can be found in [27], and the depth and saliency improvements can be implemented similarly to [19, 20].

The depth-improved SOAMST is explained as follows [19, 20, 27].

(a) The initial target model is calculated by manually selecting the region of interest (Figure 6) on the subject, based on the normalized color histogram from the R, G, and B color channels with 16-bin quantization, and the initial target position $y_0$ is decided by the target model.

Figure 7: Depth likelihood map.

(b) Based on the initial target position, initialize the target depth (the distance to the robot) from the depth data $D$ as follows:

$\text{tempDepth} = D(y_0 - 5 : y_0 + 5,\; x_0 - 5 : x_0 + 5)$,
$d_0 = \operatorname{round}(\operatorname{median}(\text{tempDepth}))$.  (1)

(c) Set the iteration number $k$ to 0.

(d) For the current depth map, based on the previous target position $y_0$, find the depth likelihood map (Figure 7) as in (2), in which points at the current frame with depth similar to the reference target region have the highest likelihood:

$P_y = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(\frac{-(D_y - d_0)^2}{2\sigma^2}\right)$,  (2)

where $P_y$ is the depth likelihood map calculated from the depth map $D_y$ based on the depth values at point $y$, and $d_0$ is the target depth value at the previous frame.

(e) For the current data, based on the previous target position $y_0$, find the target candidate model.

(f) Calculate the weights of each pixel in the target candidate region as

$w_i = d_i \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(y_0)}}$,  (3)

where $d_i$ is the depth likelihood value obtained from $P_y$, and $q_u$ and $p_u(y_0)$ are the reference target model and the target candidate model of the search region.

(g) Then the new target position can be calculated as

$y_1 = \frac{\sum_i x_i w_i}{\sum_i w_i}$,  (4)

where $y_1$ is the new matched point for the subject position, obtained by comparing the initial target model and the current frame's candidate model, $x_i$ are the points in the search region, and $w_i$ are the weights at $x_i$.

(h) This operation continues until the matching point converges, as in

$\Delta d = \| y_1 - y_0 \|$, $y_0 = y_1$;
if $\Delta d < \varepsilon$ or $k \ge 15$, go to step (i); else set $k = k + 1$ and go to step (e).  (5)

(i) After the new tracking point is found, the height, width, and orientation of the target candidate model can be calculated by singular value decomposition of the covariance matrix obtained from the second-order moment features as follows:

$\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \times \begin{bmatrix} a^2 & 0 \\ 0 & b^2 \end{bmatrix} \times \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^T = \operatorname{SVD}(\text{Cov})$,  (6)

$\text{Cov} = \begin{bmatrix} \mu_{20} & \mu_{11} \\ \mu_{11} & \mu_{02} \end{bmatrix}$, $\quad a = k\lambda_1$, $\quad b = k\lambda_2$, $\quad U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}$,  (7)

where $[u_{11}\ u_{21}]^T$ and $[u_{12}\ u_{22}]^T$ yield the orientation of the tracked region, and $a^2$ and $b^2$ are the scaled eigenvalues of Cov in (7), which represent the height and width of the tracked ellipsoid region.

(j) Then the next step is to find the next search region for the next frame as

$\text{Cov}_2 = U \times \begin{bmatrix} (a + \Delta d)^2 & 0 \\ 0 & (b + \Delta d)^2 \end{bmatrix} \times U^T$,  (8)

$(x - y_1) \times \text{Cov}_2^{-1} \times (x - y_1)^T \le 1$,  (9)

where the points $x$ that satisfy the condition in (9) form the next search region, and $\Delta d$ is the increment for the search region in the next frame.
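A compact Python sketch of the core loop, steps (b) through (h), may help clarify how the depth likelihood of (2) reweights the color-histogram mean shift of (3)-(4); it assumes a joint 16-bin-per-channel RGB histogram and omits image-border handling and the scale/orientation update of steps (i)-(j), so it is an illustration rather than the authors' MATLAB implementation:

import numpy as np

def depth_likelihood(depth, d0, sigma=200.0):
    # Eq. (2): Gaussian likelihood of each pixel's depth w.r.t. target
    # depth d0 (sigma in depth units, e.g., millimetres for the Kinect).
    return np.exp(-(depth - d0) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def color_bins(patch, nbins=16):
    # Map RGB pixels to joint histogram bin indices (16 bins per channel).
    q = (patch // (256 // nbins)).astype(int)
    return q[..., 0] * nbins * nbins + q[..., 1] * nbins + q[..., 2]

def histogram_model(patch, nbins=16):
    # Normalized color histogram: the target or candidate model.
    hist = np.bincount(color_bins(patch, nbins).ravel(),
                       minlength=nbins ** 3).astype(float)
    return hist / hist.sum()

def mean_shift_step(frame, depth, y, q_model, d0, half=20, nbins=16):
    # One iteration of Eqs. (2)-(4): depth-weighted mean shift update.
    r, c = int(y[0]), int(y[1])
    patch = frame[r - half:r + half, c - half:c + half]
    dpatch = depth[r - half:r + half, c - half:c + half]
    p_model = histogram_model(patch, nbins)
    ratio = np.sqrt(q_model / np.maximum(p_model, 1e-12))
    w = ratio[color_bins(patch, nbins)] * depth_likelihood(dpatch, d0)  # Eq. (3)
    rows, cols = np.mgrid[r - half:r + half, c - half:c + half]
    return np.array([np.sum(rows * w), np.sum(cols * w)]) / np.sum(w)   # Eq. (4)

def track_frame(frame, depth, y0, q_model, d0, eps=0.5, max_iter=15):
    # Iterate until the matched point converges, Eq. (5).
    y0 = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        y1 = mean_shift_step(frame, depth, y0, q_model, d0)
        if np.linalg.norm(y1 - y0) < eps:
            return y1
        y0 = y1
    return y0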

2.5. Activity Recognition. By the integration of mapping and localization, a new activity recognition model has been proposed to handle the problems of our previous approach, such as occlusion cases [5, 8]. This update includes a new, fast, and person-specific model that does not require any training process using a high amount of off-line training data. The new model takes advantage of person-specific heuristics, state transition rules, and global localization of the subject, utilizing the information provided by robot localization. Using these features, the model classifies the possible indoor daily activities, such as walking, standing, sitting, bending, lying down, cycling, and falling. The state transition rules between activities are given in Figure 8.

Person-specific features include variables such as:

(1) a height parameter based on the extracted contour region of the tracked subject;

Figure 8: State transitions among standing, walking, sitting, bending, lying down, cycling, and falling, as applied in the activity recognition model.

(2) the change in the height parameter between frames;

(3) a ratio parameter calculated from the top region of the extracted body contour (see Figure 9);

(4) the subject location, based on the tracking point on the subject calculated by the visual tracking algorithm using the Kinect and on the robot global position obtained by the LRF and odometry (a small numeric sketch of this transformation is given after this list). The transformation from local Kinect 3D data to the global 2D map is given in (10) [34]:

$\begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = A \times \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix}$,  (10)

$A = \begin{bmatrix} s_x \cos(\alpha) & -s_y \sin(\alpha) \\ s_x \sin(\alpha) & s_y \cos(\alpha) \end{bmatrix}$,  (11)

$T_{Gx} = R_{Gx} - d \cos(\theta)$, $\quad T_{Gy} = R_{Gy} - d \sin(\theta)$,  (12)

where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the global 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect data of the subject on the relevant $x$-axis and $y$-axis of the Kinect with respect to the global 2D map. $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating table angle. The transformation matrix $A$ is shown in (11), where $s_x$ and $s_y$ are the scaling coefficients on the $x$-axis and $y$-axis. $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$-axis and $y$-axis, respectively, due to the difference between the robot center and the Kinect position on the robot. The translation values are computed as in (12) from the given $R_{Gx}$ and $R_{Gy}$, which are the robot positions on the global 2D map, where $d$ is the distance between the Kinect position and the robot center, and the robot pose on the global map is given as $\theta$;

(5) the velocity of the subject over a defined number of frames.
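A minimal numeric sketch of the transformation in (10)-(12) is given below; the default offset and scaling values are illustrative placeholders, not the system's calibration:

import math

def kinect_to_global(s_lx, s_ly, r_gx, r_gy, theta, alpha,
                     d=0.20, s_x=1.0, s_y=1.0):
    """Eqs. (10)-(12): map a local Kinect point (s_lx, s_ly) to the global
    2D map. theta is the robot pose, alpha the Kinect pose (robot pose plus
    rotating-table angle), d the Kinect offset from the robot center."""
    # Eq. (12): translation between the robot center and the Kinect.
    t_gx = r_gx - d * math.cos(theta)
    t_gy = r_gy - d * math.sin(theta)
    # Eqs. (10)-(11): rotate and scale the local point, then translate.
    s_gx = s_x * math.cos(alpha) * s_lx - s_y * math.sin(alpha) * s_ly + t_gx
    s_gy = s_x * math.sin(alpha) * s_lx + s_y * math.cos(alpha) * s_ly + t_gy
    return s_gx, s_gy

# Example: subject 2 m ahead of the Kinect, robot at (1, 1) facing +x.
print(kinect_to_global(2.0, 0.0, 1.0, 1.0, theta=0.0, alpha=0.0))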

Figure 9: Extracted contour region, the lines on the top part of the body used to calculate the ratio parameter, and the tracking point on the extracted region (sitting and bending examples).

As mentioned, the new model can be used to classify static activities such as standing, sitting, and lying down. However, features representing static activities are not enough to detect dynamic or location-specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the global map and the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as visual tracking and localization of the subject are successfully done, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions for different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different patterns. The biomonitoring system must record the data of the subject's walking behaviour, especially for people in demand such as elderly or impaired people, so that experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most experimental results are based on normal walking conditions together with other daily activities. Therefore, in this work, we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. Experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for the walking activity for the different patterns can be seen in the experimental results section.

To demonstrate how the state transitions were handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and their dependency on localization information, are described in Algorithm 1.

switch State_{t-1}
    Case 0:                                         % standing
        if Height_{t-1} - Height_t > S_Fall
            State_t = 5
        elseif Height_{t-1} - Height_t > S_NoFall
            if r_t > Ratio
                State_t = 3
            else
                State_t = 2
            end
        elseif Height_t > H_Standing
            if isOnCycling(P_t(G_x, G_y))
                State_t = 4
            elseif abs(P_t(G_x, G_y) - P_{t-N}(G_x, G_y)) > S_Walking
                State_t = 1
            else
                State_t = 0
            end
        end
    Case 1:                                         % walking
        if Height_{t-1} - Height_t > S_Fall
            State_t = 5
        elseif isOnCycling(P_t(G_x, G_y))
            State_t = 4
        elseif abs(P_t(G_x, G_y) - P_{t-N}(G_x, G_y)) <= S_Walking
            State_t = 0
        end
end

Algorithm 1

In Algorithm 1, State_t is the activity at frame t. The activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. t is the current frame number. S_Fall and S_NoFall are the speed parameters used to decide fall or non-fall cases based on the tracked subject's height at the current and previous frames, Height_t and Height_{t-1}. S_Walking is the subject speed threshold used to decide whether the subject is moving or not.


Table 1: Experimental data content.

Subjects    Walking case        Abbreviation    Number of frames
Subject 1   Impaired walking    IW              1000
            Elderly walking     EW              1000
            Normal walking      NW              1000
Subject 2   Impaired walking    IW              1000
            Elderly walking     EW              1000
            Normal walking      NW              1000
Total number of frames                          6000

Localization assistance uses the global subject position P_t(G_x, G_y) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, used to discriminate activities such as sitting, bending, and lying down based on the person-specific Ratio threshold.
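For illustration, the standing/walking branch of Algorithm 1 can be rendered in Python as follows; all threshold values are hypothetical placeholders, since the actual thresholds are person-specific and tuned as described above:

import math

STATES = {"standing": 0, "walking": 1, "sitting": 2, "bending": 3,
          "cycling": 4, "falling": 5, "lying_down": 7}

def next_state(prev_state, h_prev, h_cur, r_t, pos_t, pos_t_minus_n,
               on_cycling_zone,
               s_fall=0.6, s_nofall=0.25, h_standing=1.5,
               ratio=0.5, s_walking=0.2):
    # Height drop between frames and displacement over the last N frames.
    drop = h_prev - h_cur
    moved = math.hypot(pos_t[0] - pos_t_minus_n[0],
                       pos_t[1] - pos_t_minus_n[1])
    if prev_state == STATES["standing"]:
        if drop > s_fall:
            return STATES["falling"]
        if drop > s_nofall:
            return STATES["bending"] if r_t > ratio else STATES["sitting"]
        if h_cur > h_standing:
            if on_cycling_zone:
                return STATES["cycling"]
            return STATES["walking"] if moved > s_walking else STATES["standing"]
        return prev_state
    if prev_state == STATES["walking"]:
        if drop > s_fall:
            return STATES["falling"]
        if on_cycling_zone:
            return STATES["cycling"]
        return STATES["standing"] if moved <= s_walking else STATES["walking"]
    return prev_state

# Example: a standing subject moves 0.3 m on the map over N frames -> walking.
print(next_state(0, 1.70, 1.68, 0.4, (2.0, 1.0), (1.7, 1.0), False))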

3. Experiments on Visual Tracking Performance and the Effect of Lighting Conditions

Obviously, good robot motion and robust visual tracking performance are crucial for the monitoring and activity recognition process. Therefore, this work initially also focused on two other aspects: (i) the visual tracking limitations that occur due to color model changes on the tracked subject, in which the color model is affected by the illumination conditions from the light source and by the distance between robot and subject; and (ii) the experiments and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth-Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth-improved SOAMST tracking model proposed in this work in Section 2.4. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for elderly, impaired, and normal walking situations. Normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly and impaired walking cases were simulated with a wearable simulation kit, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of two right-footed subjects, causing impairment of the right lower limb. And for the elderly walking case, the kit causes walking difficulty through tied joints, in a way that the neck and knees

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

                       SOAMST [27]          Depth-improved SOAMST
Type                   True     False       True     False
Subject 1   IW         774      226         1000     0
            EW         851      149         1000     0
            NW         899      101         1000     0
Subject 2   IW         1000     0           1000     0
            EW         1000     0           1000     0
            NW         998      2           914      86
Overall results        5522     478         5914     86
Accuracy (%)           92.03    7.97        98.57    1.43

were constrained to cause difficulty in standing straight and making big moves as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

The experimental data listed in Table 1 were recorded under stable light conditions, with all light sources in the room turned on, in order to see the improvement from the proposed depth likelihood integration into the SOAMST color region tracker while decreasing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and with false trackings for each recorded dataset. For SOAMST, we observed that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth-improved SOAMST, only


Figure 11: Tracking results for SOAMST [27] and the depth likelihood map supported SOAMST.

one case yielded a worse result than SOAMST [27] using only the color model for target matching. The reason for this failure can be sudden depth changes in the scene; also, the joint depth likelihood and color representation of the background region can have more similarity to the reference color representation. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

In Figure 11, images of the tracking results of SOAMST and the depth likelihood map based improved SOAMST models are given. It is obvious that integrating depth data into the weight updates improves the tracking results significantly while preventing failure cases, as in the last two columns of Figure 11. But if the color distribution model is not convenient and the depth values of the subject and background are similar, there can still be false detections or position, scale, and orientation errors on the selected region (see the 4th column of the depth-improved tracking results in Figure 11).

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions, to fully take advantage of depth data integration. Therefore, experiments with various lux values and distance conditions regarding light strength were run on SOAMST [27] for further analysis, by examining the color features and the visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions. Depending on the changes of the color, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, making it possible to see which lux values (the quantitative measurement of light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measured spots: 1.40 m, 450-500 lux; 1.85 m, 350-400 lux; 2.30 m, 250-300 lux; 3.00 m, 150-200 lux; 3.60 m, ~100 lux; 4.95 m, ~50 lux; 6.25 m, ~40 lux).

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor, a CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination


Table 3: Moving robot condition: height and width variance of the tracking region with one-color representation (red, green, and blue), shown as variance plots over the recorded frames.

value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), and the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 demonstrate the histograms at the same marks after the robot has been moved from its initial position closer to the subject, within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light conditions and the distance between the subject and the robot have a great impact on the reference target model based on the normalized color histogram for the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.
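One way to quantify how much such histogram changes hurt matching is the Bhattacharyya coefficient used by mean-shift trackers such as SOAMST; the toy histograms below are illustrative, not measured data:

import numpy as np

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

# Toy 4-bin histograms: the same target under bright and dim light.
bright = np.array([0.0, 0.1, 0.6, 0.3])
dim = np.array([0.3, 0.5, 0.2, 0.0])
print(bhattacharyya(bright, bright))  # 1.0
print(bhattacharyya(bright, dim))     # ~0.57: matching degrades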

Hence, in this scenario, two more tests were done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) the other tested visual tracking performance when the robot is at a fixed position and only the subject moves. Both experiments were done with two different scenarios, where two different region templates were tried in each case to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval from 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350-400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation at a time.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, where the centroid of the target region defines the tracking point of the current frame. So tracking success or failure is decided by whether the center of the target region is on the subject or not.

Since the distance between the robot and the subject was kept the same in all the experiments during the subject's motion, one would expect that the search and tracked regions given by SOAMST [27] would be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates the covariance based on the current search region and the reference target distribution, in which this matrix, based on the second-order moments, can represent the height,


Figure 14: Target feature analysis for different colors (red, green, and blue) with normalized histograms at the measured spots.


Figure 15: Moving robot condition: tracking results with one-color representation (red, green, and blue).

Figure 16: Moving robot condition: tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representations can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350-400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown in Figure 14.

But despite the loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it is observed that the selected target and the tracked region are not consistent anymore (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) when compared to the one-color model situation. The impact of the light conditions proved to be similar to the previous scenarios, where only a one-color representation is used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is in a fixed position, the same position as demonstrated in Figure 12. The camera records data while the subject is moving between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the distance between the subject and the robot is not preserved as in the previous experiment. The purpose of this experiment is to evaluate


Table 4: Moving robot condition: height and width variance of the tracking region with two-color representation (red and blue, green and blue, and blue and dark blue), shown as variance plots over the recorded frames.

the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept in a constant interval, tracking could be done even in low light conditions such as 100 lux, even though this was not the desired case. While using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green color based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that had a similar color to the reference target model. Finally, the red and blue cases could barely continue tracking the subject (Figure 17). So light conditions below 50 lux with a distance higher than 5 meters are not sufficient to rely on color based tracking for all tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on the illumination and distance factors, the results state that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, or the free space available to the robot.

This section describes the experiment setup and the two tests that were performed: (i) the collision speed test and (ii) the robot performance test. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering subject and robot safety


Figure 17: Fixed robot condition: tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed robot condition: tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

criteria. The second test analyses the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. The same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check robot behaviour in different situations. For these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on

Figure 19: Sample experimental layout (numerical values in centimeters) and global map.

the visual tracking feedback. For the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow


Figure 20: Experiment layouts and subject tracks (numerical values in centimeters).

the subject based on the subject's position on the global map.

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle. In the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room, except in the area delimited by the obstacle in the middle of the room.

Two parameters for the robot's decision on moving to follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance higher than 3.6 meters, the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or loss of texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
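These two decision rules can be summarized in a short sketch; the function names and the example coordinates are illustrative only:

import math

FOLLOW_DIST = 1.2  # meters: robot starts moving beyond this distance
MOVE_EPS = 0.20    # meters: subject counts as moving above this change

def robot_should_move(robot_xy, subject_xy):
    # The robot stays still while the subject is within 1.2 m.
    return math.dist(robot_xy, subject_xy) > FOLLOW_DIST

def subject_is_moving(subject_xy, subject_xy_prev):
    # A displacement of at least 20 cm on the global map counts as moving.
    return math.dist(subject_xy, subject_xy_prev) >= MOVE_EPS

print(robot_should_move((0.0, 0.0), (1.5, 0.0)))  # True
print(subject_is_moving((2.0, 1.0), (2.1, 1.0)))  # False (only 10 cm)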

As stated previously, there are two important parameters for robot motion included in this part of the paper, which have been studied in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the ROS system for robot motion (Figure 2), these parameters are modeled with the following variables:

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values this parameter takes represent the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
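For illustration, the two parameters might appear in the configuration as plain YAML entries like the following; the exact file layout and surrounding keys in the authors' setup may differ, so this is only a sketch:

# Illustrative navigation parameters in typical ROS-navigation-stack style.
max_vel_x: 0.60          # m/s, maximum forward velocity of the robot
inflation_radius: 0.35   # m, radius to which the map inflates obstacle costs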

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot does not crash into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of


Table 5: Robot collision test results.

Speed (m/s) \ inflation radius (m)    0.05    0.35
0.70                                  Yes     Yes
0.65                                  Yes     Yes
0.60                                  No      No

the collision tests: using a low inflation radius value, 0.05, or a higher one, 0.35, does not affect the collision test results.

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test, the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests were done on each of the three layouts shown in Figure 20. The subject moves from one mark to another by following the paths presented in Figure 20, in the order of the destination numbers, and at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments record whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests were done with different values for the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as the synchronization between the quick movement of the robot and the

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05        0.15   0.25   0.35
0.50                                 Yes (+30)   No³    No³    Yes (+40)
1.00                                 Yes         No²    No²    No²
1.50                                 Yes (+15)   No²    No³    No²

*Reasons for not completing the track: stuck¹, crash², and lost³.

moving platform is not an easy task. Due to this fact, it is possible to lose track of the subject.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot beyond that value: as the subject is walking faster, there is no risk that the robot could collide with him. So the speed values tested in this scenario are 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed for the robot that prevents a collision with the subject, as proven in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout, it got stuck sometimes, but it was able to recover the tracking and ended the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes⁴   Yes⁴
0.40                                 Yes    Yes    Yes⁴   Yes⁴   Yes⁴

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes    Yes
0.40                                 Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 No¹    Yes⁴   Yes    No¹    Yes
0.50                                 No¹    No¹    Yes⁴   Yes⁴   Yes
0.40                                 No¹    No¹    No¹    No¹    No¹

*The index (4) indicates that the robot got stuck but recovered. The index (1) for incomplete tracks indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for these good results is that in this kind of layout the robot can move in the center of the room. The robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large, allowing the robot to move around without serious difficulties. In some of the tests in this condition, the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s. For this speed value, there was no success in any of the tests, independent of the inflation radius value. The reason why the robot did not complete the track is that it got stuck during tracking, especially on entering the corner regions, which restricted the robot from finding a free path to move along. In this layout, faster speeds increased the

Figure 21: Track for layout 1 and robot path during the experiment.

success ratio of the experiment. Of the 12 tests made, 6 of them presented good results, while the robot was not able to complete the 10 rounds of the track in 8 of the remaining results. As presented in Table 9, the results are not as good as in previous experiments. The narrow paths and obstacles drastically reduce the success rate of the experiments.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally describes the experiments on activity recognition, especially walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated with wearable tools as demonstrated in Figure 10.

For each case, 1000 frames were used to test the activity recognition algorithm, as given in Table 1 for the visual tracking data, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority than the other activities, since the walking patterns can be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part, we tested the recognition rate of the algorithm for the different walking patterns separately from the other activities, even though including the static activities would increase the accuracy, due to the high detection rate of static actions.

In the experiments, the person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as


Table 10: Activity recognition results for different types of walking patterns.

            Type   True   False   Accuracy (%)
Subject 1   IW     984    16      98.40
            EW     974    26      97.40
            NW     947    53      94.70
Subject 2   IW     976    24      97.60
            EW     963    37      96.30
            NW     931    69      93.10
Overall performance  5775  225    96.25

20 cm per 10 frames, that is, around 0.2 m/s for the current system (assuming roughly 10 frames per second).

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used in all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still of acceptable accuracy, and the overall accuracy for all data is more than 96%, which is quite high considering that the same heuristics were used in each case.

Moreover, our analysis of false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and is almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on the other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its stable state, without transition periods such as changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of standing, sitting, bending, or lying down. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11, the recognition results for the various activities are given, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting,

Table 11: Activity recognition results for different activities.

Activity      True    False   Accuracy (%)
Standing      1000    0       100.00
Sitting       1000    0       100.00
Bending       997     3       99.70
Lying down    1000    0       100.00
Cycling       1000    0       100.00
Falling       991     9       99.10
Overall       5988    12      99.80

and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar or transition activities, such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized with 100.00% accuracy. Also, the falling activity was recognized with more than 99% accuracy, which is crucial for detecting this activity in emergency situations for people living alone. The overall recognition rate was 99.80% over the 6000 frames for the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs; both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, as no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even though there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), the walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that multiple-sensor information fusion is required for robust indoor tracking with an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to utilize a visual tracker relying on color, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should be provided in the environment for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 or 200 lux for better results; and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when performing subject tracking, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot when performing a vision tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough not to make the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot is decelerating may not be available and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or even lower to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned above, nonstopping tests do not represent the normal behaviour of activities performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented there, the success rate is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05 and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; on the map on the right, the curves where the robot turns are sharp bends. A smoother path makes the robot less likely to lose the track or its localization on the map. This is why Table 6 shows good results for an inflation radius of 0.05, with the results getting worse as this value is increased.

Another interesting result in Table 6 is the time needed to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed even though the subject always walks at the same pace: because the robot speed is not high enough, the robot sometimes got stuck for a few seconds. The subject has to wait for the robot to be released, which increases the time required to complete the track.

Figure 23: Tracks for 1.5 m/s speed with inflation radii of 0.05 and 0.35.

The situation in which the subject stops is different from the one just described. People at home usually walk and stop while moving in a room: they walk to a desk, stop for a moment to pick something up, start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and reach a better position in the room. This allows lower speeds to be used, since the reaction time of the robot is effectively increased.

In this part we analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Moving freely in the room center, neither the speed nor the inflation radius has any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success in all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s with an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. With this low value, the robot was not able to complete the track. With a 0.05-meter inflation radius the robot can get close to the obstacles, which probably makes it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed values with an inflation radius over 0.05 is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points along it. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. However, under nonideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The test results show that setting the robot speed to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.
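These recommendations translate directly into ROS navigation stack parameters: the max_vel_x parameter mentioned above belongs to the base local planner, and the inflation radius to the costmap configuration. The following is only a hedged YAML sketch reflecting the recommended ranges; the actual file layout depends on the particular launch setup of the robot:

    # Sketch of the ROS navigation parameters discussed above;
    # values taken from the ranges recommended in this section.
    TrajectoryPlannerROS:        # base local planner
      max_vel_x: 0.6             # m/s; <= 0.60 avoids collisions with the subject
    inflation_radius: 0.15       # m (costmap); 0.15-0.45 worked best in complex layouts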

6.3. Activity Recognition Results. Provided that robust visual tracking and robot-based subject tracking are available on the mobile robot, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition accuracies over 98%; however, those models required training data and tuning for each specific subject, which is difficult to repeat every time the subject changes. Moreover, those approaches, like most state-of-the-art models, generally require whole-body contour features for detecting walking or other dynamic activities. With the proposed model, by contrast, activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we obtained 98% activity recognition accuracy over all activities tested on 12,000 frames, without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7 Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under such light conditions, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color-based visual tracking decreases with distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color-based to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in indoor environments should be studied qualitatively, concerning human feelings and psychological responses to using mobile robot systems.

Analyzing the robot parameters in the scenarios used for the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it actively tracks the subject with the visual tracking model. The experiments presented in this paper help determine the best speed and obstacle inflation radius for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails to track the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed beyond these values provokes collisions of the robot with the subject; decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization-assisted activity recognition algorithm has been integrated into the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity across different subjects and walking patterns.

In summary, the contributions of this paper can be stated as: (i) integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) analysis of how illumination conditions and robot motion parameters affect the tracking, and (iii) determination of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux values), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro contributed equally to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


The particle-filter-based color tracker is robust; however, there are many parameters to consider, and the motion of the robot should be integrated into the particle motion model to increase visual tracking performance, which is very difficult to implement because of the nondeterministic and uncertain behaviour of the robot, which depends on the subject position. Regarding robot behaviour, in most cases we have run experiments using a reactive control model [5, 8, 19] for the robot movement while tracking the subject of interest. However, without a global reference for the robot and subject positions, it is very difficult to implement a robust mobile tracking system. Knowing the global position makes it possible to use different options for the movement of the robot through path planning algorithms.

In addition, activity recognition for MIP behaviour analysis and classification (see Figure 1) was accomplished by analyzing features of the human body contour. This was done by extracting height and width ratios of several body parts from the binary images of the extracted subject tracking regions; a hidden Markov model (HMM) [5, 8] was then used to classify the current action by observing changes in body structure through the joint angles of the lower limb [5, 8]. Moreover, normal walking and impaired walking were differentiated by HMM-based classification with high accuracy. However, the problem with the features used in [5, 8] is that they are not robust when the lower limb part of the body is partially occluded, and the lower limb is crucial for discriminating between walking and standing during the activity recognition process. Also, this algorithm requires a specific trained model for each subject separately, so it is hard to obtain good activity detection results, especially for dynamic activities such as walking, across several subjects using only one trained model.

Due to the problems stated in the previous versions of our work [5, 8, 19, 20], several improvements and experimental analyses have been carried out for robot behavior and subject tracking.

(i) First of all, instead of using reactive control, the robot operating system (ROS) [23] was integrated into our system, as many works also take advantage of ROS for mobile robot applications [23–26]. The main advantages of using ROS are the global mapping and localization functionalities [23–26], which increase the performance of the robot motion, provide more flexibility to the robot during subject tracking, and enable the use of a more robust activity recognition approach under occlusion cases.

(ii) Then, we replaced the depth- and saliency-improved particle filter color tracker with a new color tracking approach presented in [27]. This approach mainly replaces the particle filter model with the scale and orientation adaptive mean shift tracker (SOAMST) [27], since it provides good tracking results even under scale and orientation changes [27] with fewer parameter adjustments. Depth integration [19, 20] is also applied to the SOAMST algorithm to improve the robustness of visual tracking.

1.2. Contribution. Despite the several improvements to our robot structure stated above, the main contributions of this paper are summarized in the following points.

(i) To achieve robust robot tracking and activity recognition, visual tracking should be reliable enough to find the subject in the scene under changing environmental conditions. For example, different light conditions, which are also affected by the pose of and distance between subject and robot, can alter the color information and lead to false tracking. We therefore investigated how illumination changes affect visual tracking, which is crucial for daily at-home biomonitoring. Through visual tracking analysis under different illumination conditions, it is possible to determine which lux values (a quantitative measurement of light conditions) yield better or worse tracking results. Robot tracking can then be improved by taking these constraints into account, and, as future work, the robot could switch between depth-based and color-based tracking to increase visual tracking performance.

(ii) There are many works analyzing navigation and path planning for mobile robots and how the environmental layout affects behaviour during navigation [28–30], but without considering continuous subject tracking. There are no studies focused on analyzing robot motion parameters for robot-based subject tracking with respect to the visual tracking constraints, such as illumination and distance, environmental settings, smoothness of the robot's tracking path, and subject safety. Therefore, in this work, several experiments were carried out to analyze and find suitable parameters for our indoor biomonitoring robot task.

(iii) With the integration of localization and mapping algorithms, we propose a new localization-assisted activity recognition algorithm, which is fast and robust to occlusion cases where the lower limbs are not visible. A state-machine-based activity recognition approach is used, employing the global location and the height and ratio of the upper body part within a defined region of the subject. There is no need for training data, because the algorithm uses person-specific heuristics to classify the subject's activity by considering activity state changes.

The paper continues as follows. Section 2 describes the robot platform, including hardware and software components, ROS and its integration into the system, visual and robot-based tracking, and activity recognition. Section 3 presents the visual tracking tests and the details of the illumination-change experiments on color-based visual tracking. Section 4 presents the experimental analysis of robot behaviour during subject tracking, taking into account the illumination constraints obtained from the visual tracking analysis, with different parameters for robot path planning. Section 5 demonstrates the experimental results of the proposed activity recognition algorithm. Finally, discussion and concluding remarks are given.

Figure 2: System structure, including hardware and software components (Pioneer P3-DX base with LRF, sonar, and Kinect on a rotating platform; ROS odometry, localization, global path planner, and local planner; MATLAB visual tracking, activity recognition, and robot destination commands, connected through a ROS-MATLAB bridge).

Figure 3: Mobile robot hardware structure (Pioneer P3-DX base platform with laser range finder (LRF), PC base, rotating table, and Kinect; dimensions noted in the figure: 887 mm, 450 mm, 400 mm).

2. System Architecture

The system structure is given in Figure 2, where all the hardware and software components of the system are presented. The hardware comprises the robot platform and its components. The software part includes ROS [23, 25] and algorithms implemented in MATLAB to perform given tasks such as visual tracking, robot tracking, and activity recognition.

2.1. System Hardware. The robot hardware (Figure 3) is assembled on a Pioneer P3-DX base equipped with a laser range finder (LRF) and a Kinect sensor on a rotating platform (Figure 4). The LRF is used for mapping, localization, and obstacle avoidance. The Kinect is used for subject detection and is mounted on a rotating platform in order to extend the detection range, so that visual tracking can be maintained even when it is difficult for the robot to change its pose.

Figure 4: Rotating table (dimensions noted in the figure: 160 mm, 119 mm, 60 mm, 60 mm).

2.2. System Software. The system uses ROS for robot behavior control and MATLAB for the improved visual subject tracker, activity recognition, and robot command control.

2.3. ROS Integration into the System. ROS is used for mapping, localization, and robot movement (to a point established by MATLAB), and it converts the local subject pose to the global subject pose. In detail, ROS communicates with the robot using the ROSARIA package [23]. The purpose of this communication is to control the robot and to receive odometry data. To control the robot, MATLAB sends three types of commands: rotating by an angle, going to a goal, and stopping (shown as the robot destination task in Figure 2); the communication between the ROS system and MATLAB is achieved using the MATLAB engine.

For control of the robot movement that sends the robot to a specific point, a navigation system is implemented in ROS. First, we manually drive the robot to explore the surrounding environment and use the Gmapping package [24, 25] to map the environment by combining the LRF and odometry information (see Figure 5). From this point, the ROS navigation stack, including localization, costmap, global path planner, and local planner, is used for autonomous robot motion planning. The adaptive Monte Carlo localization (AMCL) approach [26] is used to locate the current pose of the robot on the map built by Gmapping. The costmap [31] is an occupancy grid map that keeps all the obstacle information, including the static map and dynamic updates provided by the robot sensors. The cost of each grid cell is related to the distance between the robot and an obstacle: the closer the robot is to the obstacle, the higher the cost.
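For intuition, this distance-to-cost relation can be sketched in MATLAB as below; this is an illustrative reconstruction of the exponential decay used by costmap inflation layers, and the exact constants and scaling in costmap_2d differ:

    function c = inflationCost(d, inscribedRadius, inflationRadius, scaling)
    % Sketch of an inflation-style cost: lethal near the obstacle,
    % decaying exponentially with distance, zero beyond the inflation
    % radius. Constants are illustrative, not the costmap_2d values.
    if d <= inscribedRadius
        c = 254;                                        % treated as obstacle
    elseif d <= inflationRadius
        c = round(253 * exp(-scaling * (d - inscribedRadius)));
    else
        c = 0;                                          % free space
    end
    end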

According to the current position of the robot, the destination point is established by MATLAB, and, using the costmap, a global path planner based on Dijkstra's algorithm computes the shortest feasible path. Trajectory rollout [32] and the dynamic window approach (DWA) [33] are used to generate the velocity commands sent to the robot so that it follows the path produced by the global path planner until it arrives at the destination or is stopped by a MATLAB command. Since the costmap contains the obstacle information, the global path planner and local planner avoid obstacles automatically (see Figure 5).

Figure 5: ROS localization and path planning on the global map.

Figure 6: Manually selected region.

2.4. Visual Tracking. The visual tracking algorithm is implemented in MATLAB using the scale and orientation adaptive mean shift tracking (SOAMST) algorithm [27]. We improved it by integrating Kinect depth image support to increase the importance of the weights based on a depth likelihood, as proposed in [19, 20]. In addition, saliency map integration is possible by updating the weights as proposed in [20], but in this work we omitted saliency for real-time processing performance. The details of the SOAMST color region tracker can be found in [27], and the depth and saliency improvements can be implemented as in [19, 20].

The depth-improved SOAMST is explained as follows [19, 20, 27].

(a) The initial target model is calculated by manually selecting a region of interest (Figure 6) on the subject, based on a normalized color histogram over the R, G, and B channels with 16-bin quantization; the initial target position $y_0$ is decided by the target model.

Figure 7: Depth likelihood map.

(b) Based on the initial target position, initialize the target depth (distance to the robot) from the depth data $D$ as follows:
$$\text{tempDepth} = D(y_0 - 5 : y_0 + 5,\; x_0 - 5 : x_0 + 5), \qquad d_0 = \operatorname{round}\bigl(\operatorname{median}(\text{tempDepth}(:))\bigr) \quad (1)$$

(c) Set the iteration number $k$ to 0.

(d) For the current depth map, based on the previous target position $y_0$, compute the depth likelihood map (Figure 7) as in (2), in which points of the current frame with depth similar to the reference target region receive the highest likelihood:
$$P_y = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(\frac{-(D_y - d_0)^2}{2\sigma^2}\right) \quad (2)$$
where $P_y$ is the depth likelihood at point $y$ computed from the depth map $D_y$, and $d_0$ is the target depth value from the previous frame.

(e) For the current frame, based on the previous target position $y_0$, compute the target candidate model.

(f) Calculate the weight of each pixel in the target candidate region as
$$w_i = d_i \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(y_0)}} \quad (3)$$
where $d_i$ is the depth likelihood value obtained from $P_y$, and $q_u$ and $p_u(y_0)$ are the reference target model and the target candidate model of the search region.

(g) The new target position can then be calculated as
$$y_1 = \frac{\sum_i x_i w_i}{\sum_i w_i} \quad (4)$$
where $y_1$ is the new matched point for the subject position, obtained by comparing the initial target model with the current frame's candidate model, $x_i$ are the points in the search region, and $w_i$ are the weights at $x_i$.


(h) This operation continues until the matching point converges:
$$\Delta d = \|y_1 - y_0\|, \quad y_0 = y_1; \quad \text{if } \Delta d < \varepsilon \text{ or } k \ge 15 \text{ go to step (i), else } k = k + 1 \text{ and go to step (e)} \quad (5)$$

(i) After the new tracking point is found, the height, width, and orientation of the target candidate model can be calculated by singular value decomposition of the covariance matrix of the second-order moment features:
$$\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \times \begin{bmatrix} a^2 & 0 \\ 0 & b^2 \end{bmatrix} \times \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^{T} = \operatorname{SVD}(\text{Cov}) \quad (6)$$
$$\text{Cov} = \begin{bmatrix} \mu_{20} & \mu_{11} \\ \mu_{11} & \mu_{02} \end{bmatrix}, \qquad a = k\lambda_1, \qquad b = k\lambda_2, \qquad U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \quad (7)$$
where $[u_{11}\ u_{21}]^T$ and $[u_{12}\ u_{22}]^T$ yield the orientation of the tracked region, and $a^2$ and $b^2$ are the scaled eigenvalues of Cov in (7), representing the height and width of the tracked ellipsoid region.

(j) The next step is to determine the search region for the next frame:
$$\text{Cov}_2 = U \times \begin{bmatrix} (a + \Delta d)^2 & 0 \\ 0 & (b + \Delta d)^2 \end{bmatrix} \times U^{T} \quad (8)$$
$$(x - y_1) \times \text{Cov}_2^{-1} \times (x - y_1)^{T} \le 1 \quad (9)$$
where the points $x$ satisfying the condition in (9) belong to the next search region, and $\Delta d$ is the enlargement of the search region for the next frame.
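As an illustration of steps (b), (d), and (f), the depth-likelihood weighting can be sketched in MATLAB as below. This is a minimal reconstruction from (1)–(3), not the authors' code; variable names are ours, and the histogram ratio is evaluated at each pixel's own bin, as in the standard mean-shift/SOAMST weighting:

    function w = depthWeights(D, d0, sigma, q, p, binIdx)
    % D      : depth values of the pixels in the candidate region
    % d0     : target depth from the previous frame, cf. (1)
    % sigma  : spread of the depth likelihood in (2)
    % q, p   : reference / candidate color histograms (m bins each)
    % binIdx : histogram bin index b(x_i) of each pixel
    % Depth likelihood, (2): pixels whose depth is close to d0 score high.
    P = (1 ./ (sigma * sqrt(2 * pi))) .* exp(-(D - d0).^2 ./ (2 * sigma^2));
    % Histogram ratio term of the mean-shift weighting, looked up at each
    % pixel's own bin; eps guards against empty candidate bins.
    ratio = sqrt(q ./ max(p, eps));
    w = P .* ratio(binIdx);      % depth-modulated pixel weights, cf. (3)
    end

The weighted centroid of these weights over the search region then gives the new position $y_1$ of (4).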

2.5. Activity Recognition. With the integration of mapping and localization, a new activity recognition approach is proposed to handle the problems of our previous approach, such as occlusion cases [5, 8]. This update is a fast, person-specific model that does not require any training process with large amounts of off-line training data. The new model takes advantage of person-specific heuristics, state transition rules, and the global localization of the subject obtained through robot localization. Using these features, the model classifies possible indoor daily activities such as walking, standing, sitting, bending, lying down, cycling, and falling. The state transition rules between activities are given in Figure 8.

Figure 8: State transitions applied in the activity recognition model (states: standing, walking, sitting, bending, lying down, cycling, falling).

Person-specific features include:

(1) a height parameter based on the extracted contour region of the tracked subject;

(2) the change in the height parameter between frames;

(3) a ratio parameter calculated from the top region of the extracted body contour (see Figure 9);

(4) the subject location, based on the tracking point computed on the subject by the visual tracking algorithm using the Kinect, and on the robot global position obtained by the LRF and odometry; the transformation from local Kinect 3D data to the global 2D map is given in (10) [34] (a code sketch of this transform follows the list):
$$\begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = \mathbf{A} \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix} \quad (10)$$
$$\mathbf{A} = \begin{bmatrix} s_x \cos\alpha & -s_y \sin\alpha \\ s_x \sin\alpha & s_y \cos\alpha \end{bmatrix} \quad (11)$$
$$T_{Gx} = R_{Gx} - d\cos\theta, \qquad T_{Gy} = R_{Gy} - d\sin\theta \quad (12)$$
where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect coordinates of the subject on the axes mapped to the global 2D map; $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating-table angle; the transformation matrix $\mathbf{A}$ is given in (11), with $s_x$ and $s_y$ the scaling coefficients on the $x$- and $y$-axes; $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$- and $y$-axes, due to the offset between the robot center and the Kinect position on the robot, computed as in (12) from the robot global position $(R_{Gx}, R_{Gy})$, the distance $d$ between the Kinect and the robot center, and the robot pose $\theta$ on the global map;

(5) the velocity of the subject over a defined number of frames.
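As promised above, (10)–(12) amount to only a few lines of MATLAB. The function below is our sketch (names are illustrative; pass $s_x = s_y = 1$ for unit scaling):

    function SG = localToGlobal(SL, alpha, RG, theta, d, sx, sy)
    % SL    : [SLx; SLy], subject position in local Kinect coordinates
    % alpha : Kinect pose on the global map (robot pose + table angle)
    % RG    : [RGx; RGy], robot position on the global 2D map
    % theta : robot heading on the global map
    % d     : distance between the Kinect and the robot center
    % sx,sy : scaling coefficients (1 for unit scaling)
    A  = [sx * cos(alpha), -sy * sin(alpha);   % rotation/scaling, (11)
          sx * sin(alpha),  sy * cos(alpha)];
    T  = RG - d * [cos(theta); sin(theta)];    % Kinect-to-center offset, (12)
    SG = A * SL + T;                           % global subject position, (10)
    end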

As mentioned, the new model can be used to classify static activities such as standing, sitting, and lying down. However, features representing static activities are not enough to detect dynamic or location-specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map, defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the map while the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the body is not fully extracted or is partially occluded.

Figure 9: Extracted contour region, the lines on the top part of the body used to calculate the ratio parameter, and the tracking point on the extracted region (sitting and bending examples).

As long as the visual tracking and localization of the subject are successful, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions in different situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different patterns. Detecting walking is essential for the biomonitoring system to record data on the subject's walking behaviour, especially for people in need such as the elderly or the impaired; in this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most experimental results are based on normal walking conditions alongside other daily activities. Therefore, in this work we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. The experiments showed that the proposed algorithm can robustly detect walking activity for these various patterns as long as the subject tracking is accurate. The results for the walking activity with different patterns can be seen in the experimental results section.

To demonstrate how the state transitions are handled, the state transition rules for the standing and walking activities, which are representative in terms of frequency of occurrence and dependency on localization information, are described in Algorithm 1.

    switch State(t-1)
        case 0  % standing
            if Height(t-1) - Height(t) > S_Fall
                State(t) = 5                                    % falling
            elseif Height(t-1) - Height(t) > S_NoFall
                if r(t) > Ratio
                    State(t) = 3                                % bending
                else
                    State(t) = 2                                % sitting
                end
            elseif Height(t) > H_Standing
                if isOnCycling(P_t(Gx, Gy))
                    State(t) = 4                                % cycling
                elseif abs(P_t(Gx, Gy) - P_(t-N)(Gx, Gy)) > S_Walking
                    State(t) = 1                                % walking
                else
                    State(t) = 0                                % standing
                end
            end
        case 1  % walking
            if Height(t-1) - Height(t) > S_Fall
                State(t) = 5                                    % falling
            elseif isOnCycling(P_t(Gx, Gy))
                State(t) = 4                                    % cycling
            elseif abs(P_t(Gx, Gy) - P_(t-N)(Gx, Gy)) <= S_Walking
                State(t) = 0                                    % standing
            end
    end

Algorithm 1: State transition rules for the standing and walking states.

In Algorithm 1, $State_t$ is the activity at frame $t$, where $t$ is the current frame number. Activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. $S_{Fall}$ and $S_{NoFall}$ are the speed parameters used to decide fall or non-fall cases from the tracked subject heights at the current and previous frames, $Height_t$ and $Height_{t-1}$. $S_{Walking}$ is the subject speed threshold that decides whether the subject is moving or not. Localization assistance uses the global subject position $P_t(G_x, G_y)$ where necessary. Finally, $r_t$ is the ratio parameter at frame $t$, defined around the maximum height of the tracked region as shown in Figure 9, used to discriminate activities such as sitting, bending, and lying down based on a person-specific Ratio threshold.

Table 1: Experimental data content.

    Subject 1 - Impaired walking (IW): 1000 frames; Elderly walking (EW): 1000 frames; Normal walking (NW): 1000 frames
    Subject 2 - Impaired walking (IW): 1000 frames; Elderly walking (EW): 1000 frames; Normal walking (NW): 1000 frames
    Total number of frames: 6000
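For concreteness, a minimal MATLAB rendering of the standing/walking rules of Algorithm 1 is sketched below; the threshold structure thr and the zone test isOnCyclingZone are our placeholder names, not the original implementation:

    function state = nextState(state, hPrev, hCur, rCur, pCur, pPrevN, thr)
    % state  : previous activity label (0 standing, 1 walking, ...)
    % hPrev, hCur : subject heights at frames t-1 and t
    % rCur   : ratio parameter r_t
    % pCur, pPrevN : global positions at frames t and t-N
    % thr    : struct with fields SFall, SNoFall, Ratio, HStanding, SWalking
    dH = hPrev - hCur;                            % height drop between frames
    moved = norm(pCur - pPrevN) > thr.SWalking;   % displacement over N frames
    switch state
        case 0                                    % standing
            if dH > thr.SFall
                state = 5;                        % falling
            elseif dH > thr.SNoFall
                if rCur > thr.Ratio, state = 3; else, state = 2; end
            elseif hCur > thr.HStanding
                if isOnCyclingZone(pCur), state = 4;  % cycling
                elseif moved,             state = 1;  % walking
                end                                   % else stay standing
            end
        case 1                                    % walking
            if dH > thr.SFall
                state = 5;                        % falling
            elseif isOnCyclingZone(pCur)          % placeholder zone test
                state = 4;                        % cycling
            elseif ~moved
                state = 0;                        % standing
            end
    end
    end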

3. Experiments on Visual Tracking Performance and the Effect of Lighting Conditions

Obviously, good robot motion and robust visual tracking performance are crucial for the monitoring and activity recognition process. Therefore, this work initially focused on two other aspects: (i) the visual tracking limitations arising from changes in the color model of the tracked subject, where the color model is affected by the illumination conditions of the light source and by the distance between robot and subject; and (ii) the experiments and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition was tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth-Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth-improved SOAMST tracking model described in Section 2.4. The experiments were done using the recorded data described in Table 1, which includes 6,000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were asked to perform elderly, impaired, and normal walking patterns, to see how walking patterns from the same subject differ during the tracking and activity recognition tasks. Therefore, each subject participated in experiments recording data for elderly, impaired, and normal walking. Normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). The elderly and impaired walking cases were simulated with wearable simulation kits, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was worn on the right foot of two right-footed subjects to cause impairment of the right lower limb. For elderly walking, the kit causes walking difficulty by tying the joints such that the neck and knees were constrained, making it difficult to stand straight and take large steps as in normal walking. The two simulation kits therefore produced quite realistic walking patterns for both the elderly and impaired walking cases.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results (number of correctly/falsely tracked frames).

                            SOAMST [27]            Depth-improved SOAMST
                            True       False       True       False
    Subject 1   IW          774        226         1000       0
                EW          851        149         1000       0
                NW          899        101         1000       0
    Subject 2   IW          1000       0           1000       0
                EW          1000       0           1000       0
                NW          998        2           914        86
    Overall results         5522       478         5914       86
    Accuracy                92.03%     7.97%       98.57%     1.43%

The experimental data in Table 1 were recorded under stable light conditions, with all light sources in the room switched on, in order to isolate the improvement brought by the proposed depth-likelihood integration into the SOAMST color region tracker while minimizing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For SOAMST, we observed that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar. Also, high illumination variations or shadows on the foreground region may result in false tracking with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth-improved SOAMST, only one case yielded a worse result than color-only SOAMST [27]. The reason for this failure may be sudden depth changes in the scene, and the joint depth-likelihood and color representation of a background region may have been more similar to the reference color representation. Otherwise, in general, depth integration helps improve the robustness of SOAMST under changing illumination conditions.

Figure 11: Tracking results for SOAMST and for the depth-likelihood-map-supported SOAMST.

Figure 11 shows images of the tracking results of the SOAMST and depth-likelihood-based improved SOAMST models. It is evident that integrating depth data into the weight updates improves the tracking results significantly, preventing failure cases such as those in the last two columns of Figure 11. Still, if the color distribution model is not suitable and the depth values of the subject and background are similar, there can be false detections or position, scale, and orientation errors in the selected region (see the fourth column of the depth-improved tracking results in Figure 11).

In sum, the limitations and optimal operating conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions in order to take full advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions related to light strength were run on SOAMST [27] for further analysis, examining the color features and visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions. Depending on the color changes, the robot may fail with false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, making it possible to see which lux values (a quantitative measurement of light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measured spots: 1.40 m at 450–500 lux, 1.85 m at 350–400 lux, 2.30 m at 250–300 lux, 3.00 m at 150–200 lux, 3.60 m at 100 lux, 4.95 m at 50 lux, and 6.25 m at 40 lux).

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots for evaluating the tracking system were established based on the luminosity ranges measured with the CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located 1.40 meters from the camera, while the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving robot condition — height and width variance of the tracking region with one-color representation (red, green, blue). [Variance plots; numeric traces lost in extraction.]

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by inspecting the histograms.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters; graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position to within about 1.5 to 2 meters of the subject.

As can be seen from this preliminary analysis, the light condition and the distance between subject and robot have a great impact on the reference target model based on the normalized color histogram used by the color tracking algorithm, because these conditions change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color-based region tracker.
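For reference, the 16-bin-per-channel normalized RGB histogram used as the target model (step (a) of Section 2.4) can be sketched as follows; this is our reconstruction, not the original code:

    function q = colorHistogram16(regionRGB)
    % regionRGB : h-by-w-by-3 uint8 image patch selected on the subject
    % Returns a 16-by-3 normalized histogram (one column per channel).
    q = zeros(16, 3);
    for c = 1:3
        channel = double(regionRGB(:, :, c));
        bins = min(floor(channel / 16) + 1, 16);   % 0..255 -> bins 1..16
        q(:, c) = accumarray(bins(:), 1, [16, 1]);
    end
    q = q / sum(q(:));   % normalize so the histogram sums to 1
    end

It is exactly this normalized distribution that shifts as the lux level and distance change, which is what the histograms in Figure 14 illustrate.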

Hence, in this scenario, two more tests were done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) the visual tracking performance was then tested with the robot in a fixed position and only the subject moving. Both experiments were done with two different region templates in each case, to see the performance difference between one- and two-color region models. The following sections present these experiments and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject move. The robot moves while trying to keep a constant distance within an interval of 1.5 to 2 meters. The subject moves between the marks at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zones for the region tracking models. The tracking experiments were done with three color representations tested separately, (a) red, (b) green, and (c) blue, using one color at a time for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area for matching the target model, and the red ellipsoid region is the detected target region, whose centroid defines the tracking point of the current frame. Tracking success or failure is thus decided by whether the center of the target region lies on the subject or not.

Since the distance between the robot and the subject was kept constant in all these experiments, one would expect the search and tracked regions produced by SOAMST [27] to remain similar as well; however, the matched region deviates strongly because the illumination changes. The SOAMST [27] algorithm computes a covariance matrix from the current search region and the reference target distribution, and this second-order-moment matrix represents the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. Depending on the orientation of the tracked ellipsoid, these deviations show up either in the width or in the height variance, as in Table 3. In particular, lux values around 350–400 (roughly frames 1 to 400) keep the tracking region consistent; when the subject moves to a darker region of around 100 lux (roughly frames 600 to 1200 in Table 3), the target model and the current match lose consistency because information is lost in the normalized color histogram, so the tracking quality degrades, as was already visible in Figure 14.

Figure 14: Target feature analysis for different colors with normalized histograms (red, green, blue). [Histogram plots; numeric traces lost in extraction.]

Figure 15: Moving robot condition — tracking results with one-color representation (red, green, blue).

Figure 16: Moving robot condition — tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

Despite the loss of information, tracking can still be performed. However, if the light on the subject falls below 50 lux, the selected target and tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux; for the red and green colors, tracking remains possible, but the consistency of the tracked regions decreases significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. Using two colors increases the initial target region on the subject (see the first columns of Table 4) compared to the one-color model. The impact of the light conditions proved to be similar to the previous scenarios, where only a one-color representation is used for the target model.

324 Robot at a Fixed Position and Subject Moving For thisexperiment the robot with the camera is in a fixed positionthe same position as demonstrated in Figure 12 The camerarecords data while the subject is moving at the marks locatedin a distance of 185 360 and 625 meters It should be notedthat while the subject ismoving the same distancewill not bepreserved between the subject and the robot as in the previousexperiment The purpose of this experiment is to evaluate

The Scientific World Journal 13

Table 4: Moving robot condition, height and width variance of the tracking region with two-color representation (columns: red and blue, green and blue, blue and dark blue; rows: region height variance, region width variance). [Variance plots over the frame sequence omitted.]

When the distance between the subject and the robot was kept within a constant interval, tracking could be performed even at low light conditions such as 100 lux, even though this was not the desired case. When using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at a distance of 3.6 meters, where the illumination is around 100 lux, green-color-based tracking failed (see Figure 17, 4th column) because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). Thus, light conditions below 50 lux combined with a distance greater than 5 meters are not sufficient to rely on color-based tracking for any of the tested color samples. In addition, for the two-color and fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

The visual tracking experimental analysis based on illumination and distance factors indicates that some constraints should be applied to robot tracking, namely stable light conditions and a reliable distance between subject and robot. With these constraints in place, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot while it tracks a subject who is moving and changing trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since elderly or impaired walking patterns are easier for the robot to track than normal walking, and the speed of the targeted subjects is slower than 3 km/h (the lower limit of self-paced normal walking), the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is defined as the maximum speed of the robot. It affects both keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, since it defines the obstacles and hence the free space available to the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum robot speed that prevents collision with the user, considering subject and robot safety criteria. The second analyzes the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.


Figure 17: Fixed robot condition tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed robot condition tracking results with two-color representation (green and blue; blue and dark blue; red and blue).


4.1. Experiment Setup

The experiment setup is done by defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. The same room was used in all tests, but different layouts were tested by making the free space smaller or larger in order to check the robot behaviour in different situations. Whether the experiments were set in the same room or in different rooms was not crucial, since the important point is testing the robot motion in different situations with various free-space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on the visual tracking feedback: for the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow the subject based on the subject's position on the global map.

Figure 19: Sample experimental layout (numerical values in centimeters) and the global map built from it. [Layout diagram omitted.]



Figure 20: Experiment layouts and subject tracks for layouts 1, 2, and 3 (numerical values in centimeters). [Layout diagrams with desks, blackboards, table, bed, and central obstacles omitted.]


The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle; in the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters: the robot starts moving when the distance to the subject exceeds this value; otherwise, the robot is able to track the subject without moving.

This maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for distances greater than 3.6 meters the color-based visual tracking system does not provide reliable results, because illumination and distance variations cause low similarity or poor texture detail. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them below 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments. This decision rule is sketched below.

As stated previously, two important parameters for robot motion have been studied in this part of the paper in order to determine their impact on the tracking and movement of the robot: the speed of the robot and the inflation radius of the obstacles. In the ROS-based robot motion system (Figure 2), these parameters are modeled with the following variables:

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values that this parameter can take represent the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
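For illustration, the excerpt below sketches how these two parameters could appear in such a configuration file, using the standard parameter names given above; the values are examples from the ranges tested in this section, and the grouping into a single excerpt is an assumption rather than the authors' actual file.

# Sketch of a navigation configuration excerpt (assumed, not the authors' file).
max_vel_x: 0.60        # maximum forward velocity in m/s (collision-free limit, Section 4.2)
inflation_radius: 0.15 # obstacle inflation radius in meters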

4.2. Collision Test Results

Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum robot speed that results in no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" on the map (Figure 20, layout 3). The subject has to walk at least faster than the robot. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test obtained by modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot no longer crashes into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.


Table 5: Robot collision test results ("Yes" indicates a collision with the subject).

Speed (m/s) \ Inflation radius (m)    0.05    0.35
0.70                                   Yes     Yes
0.65                                   Yes     Yes
0.60                                   No      No


4.3. Robot Performance Test Setups

Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests conducted to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test, the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another following the paths presented in Figure 20, in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all tests.

4.4. Nonstopping Tests

The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at fast speed. This scenario can hardly happen in real life; however, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the moving Kinect camera, as synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this, it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05         0.15    0.25    0.35
0.50                                   Yes (+30)    No(3)   No(3)   Yes (+40)
1.00                                   Yes          No(2)   No(2)   No(2)
1.50                                   Yes (+15)    No(2)   No(3)   No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.


Faster speeds also mean more vibration for the robot, which can contribute to it losing its localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot beyond that value, as the subject is walking faster, so there is no risk of the robot colliding with him. The speed values tested in this scenario are 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters; the resulting test grid is sketched below.
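The combinations evaluated in this scenario form a small parameter grid, which can be enumerated as in the following sketch (values as quoted above; the code itself is illustrative, not the authors'):

% Test grid for the nonstopping tests: every combination of the robot speeds
% and obstacle inflation radii quoted above.
speeds = [0.50, 1.00, 1.50];        % maximum robot speeds (m/s)
radii  = 0.05:0.10:0.35;            % obstacle inflation radii (m)
[S, R] = meshgrid(speeds, radii);
testGrid = [S(:), R(:)];            % 12 (speed, radius) combinations
disp(testGrid);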

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests

These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum robot speed that prevents collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it got stuck sometimes, but it was able to recover the tracking and end the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05   0.15   0.25     0.35     0.45
0.60                                   Yes    Yes    Yes      Yes      Yes
0.50                                   Yes    Yes    Yes      Yes(4)   Yes(4)
0.40                                   Yes    Yes    Yes(4)   Yes(4)   Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m)    0.05   0.15   0.25   0.35   0.45
0.60                                   Yes    Yes    Yes    Yes    Yes
0.50                                   Yes    Yes    Yes    Yes    Yes
0.40                                   Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05    0.15     0.25     0.35     0.45
0.60                                   No(1)   Yes(4)   Yes      No(1)    Yes
0.50                                   No(1)   No(1)    Yes(4)   Yes(4)   Yes
0.40                                   No(1)   No(1)    No(1)    No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for incomplete tracks, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout, we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large and allows the robot to move around without serious difficulty. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value, none of the tests succeeded, independent of the inflation radius value. The reason the robot does not complete the track is that it got stuck during tracking, especially after entering the corner regions, which prevented the robot from defining a free path to move along. In this layout, faster speeds increased the success ratio of the experiment: of the 12 tests made, 6 presented good results, while the robot was not able to complete the 10 rounds of the track in 8 of the remaining cases. As presented in Table 9, the results are not as good as in the previous experiments; the narrow paths and obstacles drastically reduce the success rate of the experiments.

Figure 21: Track for layout 1 and robot path during the experiment.


5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, as given in Table 1, were used to test the activity recognition algorithm, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority than other activities, since the walking patterns can then be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important that data can be recorded robustly during the subjects' walking activity for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy due to the high detection rate of static actions.

In the experiments, person-specific parameters for activity recognition were initially decided for subject 1 and the impaired walking case, and these selected parameters were applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.


Table 10: Activity recognition results for different types of walking patterns (IW: impaired walking; EW: elderly walking; NW: normal walking).

            Type   True   False   Accuracy (%)
Subject 1   IW     984    16      98.40
            EW     974    26      97.40
            NW     947    53      94.70
Subject 2   IW     976    24      97.60
            EW     963    37      96.30
            NW     931    69      93.10
Overall            5775   225     96.25


Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy over all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to recording walking activity for further analysis as an assistive technology for at-home biomonitoring systems.

In addition, we also carried out experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except for the falling activity), we started recognition when the activity was in its steady state, without the transition periods involved in changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

Table 11 gives the recognition results for various activities, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the changes in pose and height during each bending activity, which might have caused misclassification to similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was also recognized with more than 99% accuracy, which is crucial for detecting this activity in emergencies for people living alone. The overall recognition rate obtained was 99.80% over the 6000 frames for the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when the lower limbs are occluded, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section for activity recognition (Section 2.5). With this information, even when there is no clear information about the lower limb joints (Figure 22(a)) or there is partial occlusion of both the upper and lower body (Figure 22(b)), the walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.
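To make this heuristic concrete, the sketch below shows one possible form of the localization-assisted walking check; apart from the 20 cm displacement threshold taken from the text, the thresholds and variable names are illustrative assumptions, not the implemented code.

% Localization-assisted walking check (illustrative sketch): walking is
% inferred when the tracked contour height stays close to the subject's
% standing height and the global position keeps changing, even under
% partial occlusion.
standingHeight = 1.70;   % person-specific standing height (m), assumed
HEIGHT_TOL     = 0.15;   % tolerated deviation for a walking pose (m), assumed
MOVE_THRESH    = 0.20;   % displacement threshold over 10 frames (m), from the text

% Example inputs: global (x, y) positions over the last 10 frames and the
% current extracted contour height.
posHistory    = [linspace(0, 0.25, 10)', zeros(10, 1)];
contourHeight = 1.66;

displacement = norm(posHistory(end, :) - posHistory(1, :));
isWalking = abs(contourHeight - standingHeight) < HEIGHT_TOL && ...
            displacement >= MOVE_THRESH;
fprintf('Walking detected: %d\n', isWalking);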

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results

First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusing multiple sensory information sources is required for the robust indoor tracking needed by an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks in which the mobile robot tracks a subject during daily activities. It is therefore beneficial to find the limitations of the color tracker used in our system and the safe regions, in terms of distance and illumination conditions, within which a color-based visual tracker can be relied on.

In sum, this experimental analysis concludes that certain environmental conditions should be ensured for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results, and (2) for the given lux and distance specifications as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
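These two conditions can be summarized as a simple guard that a system could evaluate before trusting the color tracker; the following function is our restatement of the above findings (using the 100 lux lower bound), not part of the published system:

% Guard for color-based tracking reliability, restating conditions (1) and (2).
function reliable = isColorTrackingReliable(luxOnSubject, distanceToSubject)
    % Reliable when the subject is lit with at least ~100 lux and is
    % within 3.6 m of the robot (thresholds from the experiments above).
    reliable = (luxOnSubject >= 100) && (distanceToSubject <= 3.6);
end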

6.2. Speed and Obstacle Inflation Radius Results

The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject but low enough to keep the robot from colliding with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping; this deceleration distance can be insufficient and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or even lower to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; on the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less prone to losing the track or its localization in the map. This is why Table 6 presents good results for an inflation radius value of 0.05, whereas the results get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject is always walking at the same pace. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds; the subject has to wait for the robot to be released, and that increases the time required to complete the track.


Figure 23: Robot track at 1.5 m/s speed with inflation radii of 0.05 (left) and 0.35 (right).


The situation in which the subject stops is different from the one previously described. People at home usually walk and stop when moving in a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part, we have analyzed three different scenarios representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), where there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since it moves freely in the center of the room, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2): the robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception in this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get too close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed values, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But for nonideal conditions, if the room contains a narrow path through complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results

The experimental results show that, provided robust visual tracking and robot-based subject tracking, a mobile robot can achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data of the subjects.

Our previous works [5, 8] achieved activity recognition performances over 98%; however, these models required training data and tuning for each specific subject, which is difficult to manage every time the subject changes. Also, these calculations, like the state-of-the-art models, generally require whole-body contour features for walking or dynamic activity detection. With the proposed model, in contrast, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model with simple person-specific heuristics and localization assistance, we were able to obtain about 98% activity recognition accuracy over all activities tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important because it provides robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to recording walking activity for further analysis, serving as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the study of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100-200 lux or more; when this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters.


Since the reliability of color-based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task should be modeled to cope with changing illumination conditions, for example, by switching from color-based to infrared-sensor-based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in indoor environments should be studied qualitatively with respect to human feelings and psychological responses to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper made it possible to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios, the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, test speeds of 0.5 and 0.6 m/s offered better results: increasing the speed over these values provokes a collision of the robot with the subject, while decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization-assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) the identification of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro contributed equally to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," ROBOCARE Technical Report RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


Page 4: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

4 The Scientific World Journal

Robot Rotating

Rotatingplatform

LRF

Sonar

Point cloud

Laser scan

Speed

MATLAB

MATLABVisual tracking

Activity recognitionRobot destination

ROS

Subject pose Subject pose(local) (global)

Robotpose

Path

Local planner

bridge

Command

LocalizationOdometry

Kinect

Pioneer P3-DX

Goal

Global path planner

Figure 2 System structure including hardware and software com-ponents

887

mm

400mm

Kinect

Rotating table

PC base

Laser range finder(LRF)Pioneer P3-DX

base platform

450mm

Figure 3 Mobile robot hardware structure

algorithm Finally discussion and conclusion remarks aregiven

2 System Architecture

The system structure is given in Figure 2 where all the com-ponents hardware and software of the system are presentedThe hardware is composed of the robot platform and itscomponentsThe software part includes the ROS [23 25] andalgorithms in MATLAB codes to perform a given task suchas visual tracking robot tracking and activity recognition

21 System Hardware The robot hardware (Figure 3) isassembled base on Pioneer P3-DX equipped with a laserrange finder (LRF) and a Kinect sensor on a rotating platform(Figure 4) The LRF is used for mapping localization andobstacle avoidance The Kinect is used for subject detectionand it is mounted on a rotating platform in order to extend

160mm

119

mm

60mm

60mm

Figure 4 Rotating table

the detection range so that the visual tracking can be achievedeven if there is the difficulty for the robot to change thepose

22 System Software Thesystemuses ROS for robot behaviorcontrol and MATLAB for an improved visual subject trackeractivity recognition and robot command control

23 ROS Integration into the System ROS is used for map-ping localization and robot movement (to a point estab-lished byMATLAB) and converts the local subject pose to theglobal subject pose In detail the ROS communicates with therobot using the ROSARIA package [23] The communicationpurpose is to have control of the robot and receive odometrydata To control the robot MATLAB sends three types ofcommands rotating with an angle going to a goal andstopping (shown as robot destination task in Figure 2) andthe communication between ROS system and MATLAB isachieved by using MATLAB engine

As for the control of the robot movement that allowssending the robot to a specific point a navigation system isimplemented in ROS First we manually control the robotto explore the surrounding environment and use Gmappingpackage [24 25] to map the environment by combining theinformation of LRF and odometry (see Figure 5) From thispoint ROS navigation stack including localization costmapglobal path planner and local planer is used for autonomousrobot motion planning Adaptive Monte Carlo localizationapproach (AMCL) [26] is used to locate the current pose ofrobot based on the map built by Gmapping The costmap[31] is an occupancy grid map which keeps all the obstacleinformation including the static map and dynamic updatinginformation provided by robot sensors The cost of eachgrid is related to the distance between the robot and anobstacle For example when the robot is closer the cost isbigger

According to the current position of robot the desti-nation point is established by MATLAB and the costmapand a Dijkstrarsquos algorithm based global path planner isimplemented to plan a shortest feasible path Trajectoryrollout [32] and dynamic window approach (DWA) [33] areused to generate the velocity commands that are sent tothe robot in order to follow the path generated by globalpath planning until the robot arrives to the destination or

The Scientific World Journal 5

Wall elapsed 68714 ROS time 139667385937 ROS elapsed 68714

Figure 5 ROS localization and path planing on the global map

Figure 6 Manually selected region

it is stopped by MATLAB command Since the costmapcontains the obstacle information the global path plannerand local planer can avoid the obstacles automatically (seeFigure 5)

24 Visual Tracking The visual tracking algorithm is imple-mented on MATLAB by using the scale and orientationadaptive mean shift tracking (SOAMST) algorithm [27]We made improvement by integrating Kinect depth imagesupport to increase the importance of weights based on thedepth likelihood as proposed in [19 20] In addition saliencymap integration is also possible by updating the weightsas proposed in [20] but in this work we omitted usingsaliency for real-time processing performance The details ofthe SOAMST color region tracker can be find in [27] anddepth and saliency improvement details can be implementedsimilar to the [19 20]

Depth improved SOAMST is explained as follows [19 2027]

(a) Initial targetmodel is calculated bymanually selectingthe region of interest (Figure 6) on the subject based onnormalized color histogram from R G and B color channelswith 16 bin quantization and initial target position (119910

0) is

decided by the target model

Figure 7 Depth likelihood map

(b) Based on the initial target position initialize the targetdepth or distance to the robot from depth data 119863 as follows

tempDepth = 119863 (119910 0 minus 5 119910 0 + 5 119909 0 minus 5 119909 0 + 5)

1198890= round (median (double (tempDepth ())))

(1)

(c) Set iteration number 119896 to 0(d) For the current depth map based on the previous

target position 1199100 find the depth likelihood map (Figure 7)

as in (2) in which reference target region point will have thehighest likelihood to the points at current frame with similardepth

119875119910=

1

120590radic2120587exp(

minus(119863119910minus 1198890)2

21205902) (2)

where 119875119910is the depth likelihood map calculated by the depth

map 119863119910based on the depth values at point 119910 and 119889

0is the

target depth value at previous frame(e) For the current data based on the previous target

position 1199100 find the target candidate model

(f) Calculate the weights of each pixel in the targetcandidate region as

119908119894= 119889119894

119898

sum119906=1

radic119902119906

119901119906(1199100) (3)

where 119889119894is the depth likelihood value obtained from 119875

119910

and 119902119906and 119901

119906(1199100) are the reference target model and target

candidate model of the search region(g) Then new target position can be calculated as

1199101=sum119894119909119894119908119894

sum119894119908119894

(4)

where 1199101is the new matched point for the subject position

obtained by comparing the initial target model and currentframes candidate model 119909

119894is the points in the search region

and 119908119894is the weights at 119909

119894

6 The Scientific World Journal

(h) This operation continues until matching point con-verges to a value as in

Δ119889 =10038171003817100381710038171199101 minus 119910

0

1003817100381710038171003817

1199100= 1199101

if Δ119889 lt 120576 119896 ge 15

go to step (i)

else

119896 = 119896 + 1

go to step (e)

(5)

(i) After the new tracking point is found the heightwidth and orientation of the target candidate model can becalculated by singular value decomposition of the covariancematrix obtained by the second order moment features asfollows

[11990611

11990612

11990621

11990622

] times [1198862 0

0 1198872] times [

11990611

11990612

11990621

11990622

]

119879

= SVD (Cov) (6)

Cov = [12058320

12058311

12058311

12058302

]119886 = 119896120582

1

119887 = 1198961205822

119880 = [11990611

11990612

11990621

11990622

] (7)

where [11990611 11990621]119879 and [11990612 119906

22]119879 can yield the orientation of

the tracked region and 1198862 and 1198872 are the scaled eigen valuesof Cov in (7) which can represent the height and width of thetracked ellipsoid region

(j)Then next step is to find next search region for the nextframe as

Cov2= 119880 times [

(119886 + Δ119889)2

0

0 (119887 + Δ119889)2] times 119880

119879 (8)

(119909 minus 1199101) times Covminus1

2times (119909 minus 119910

1)119879le 1 (9)

where the points 119909 that provide the conditions in (9) can be inthe next search region Δ119889 is the change for the search regionin the next frame

25 Activity Recognition By the integration of mapping andlocalization new activity recognition has been proposedto handle the problems of our previous approach such asocclusion cases [5 8] This update includes a new fast andperson specific model that does not require any trainingprocess using high amount of off-line training data The newmodel takes advantage of person specific heuristics statetransition rules and global localization of the subject byutilizing the information through robot localization Usingthese features the model classifies the possible indoor dailyactivities such as walking standing sitting bending lyingdown cycling and falling The state transition rules betweenactivities are given in Figure 8

Person specific features include variables such as

(1) height parameter based on the extracted contourregion of the tracked subject

Standing Walking

Sitting

Bending Lying-down

Cycling

Falling

Figure 8 State transition applied in activity recognition model

(2) the change in the height parameter between frames;

(3) a ratio parameter that can be calculated from the top region of the extracted body contour (see Figure 9);

(4) the subject location, based on the tracking point on the subject calculated by the visual tracking algorithm using the Kinect, and the robot global position obtained by lidar and odometry. (The transformation from local Kinect 3D data to the global 2D map is given in (10) [34]; a minimal sketch of this transform is given after this list:

$$\begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = \mathbf{A} \times \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix} \quad (10)$$

$$\mathbf{A} = \begin{bmatrix} s_x \cos\alpha & -s_y \sin\alpha \\ s_x \sin\alpha & s_y \cos\alpha \end{bmatrix} \quad (11)$$

$$T_{Gx} = R_{Gx} - d\cos\theta, \qquad T_{Gy} = R_{Gy} - d\sin\theta \quad (12)$$

where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the global 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect data of the subject on the relevant $x$-axis and $y$-axis of the Kinect relative to the global 2D map; $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating table angle. The transformation matrix $\mathbf{A}$ is shown in (11), and $s_x$ and $s_y$ of $\mathbf{A}$ are the scaling coefficients on the $x$-axis and $y$-axis. $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$-axis and $y$-axis, respectively, due to the difference between the robot center and the Kinect position on the robot. The translation values were computed as in (12) from $R_{Gx}$ and $R_{Gy}$, the robot position on the global 2D map, and $d$, the distance between the Kinect position and the robot center; the robot pose on the global map is given as $\theta$.)

(5) the velocity of the subject over a defined number of frames.
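To make this transform concrete, the following is a minimal sketch of (10)-(12), assuming angles in radians and unit scale coefficients by default; the function and variable names are illustrative, not taken from the robot's actual code.

import numpy as np

def kinect_to_global(S_L, R_G, theta, alpha, d, sx=1.0, sy=1.0):
    """Map a subject point from local Kinect coordinates to the global
    2D map, following (10)-(12).

    S_L   : (S_Lx, S_Ly) subject position in local Kinect coordinates
    R_G   : (R_Gx, R_Gy) robot position on the global 2D map
    theta : robot pose (heading) on the global map, in radians
    alpha : Kinect pose on the global map (robot pose plus rotating
            table angle), in radians
    d     : distance between the Kinect and the robot center
    sx,sy : scaling coefficients on the x- and y-axes
    """
    A = np.array([[sx * np.cos(alpha), -sy * np.sin(alpha)],
                  [sx * np.sin(alpha),  sy * np.cos(alpha)]])   # eq. (11)
    T = np.array([R_G[0] - d * np.cos(theta),
                  R_G[1] - d * np.sin(theta)])                  # eq. (12)
    return A @ np.asarray(S_L, dtype=float) + T                 # eq. (10)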

Figure 9: Extracted contour region lines on the top part of the body, used to calculate the ratio parameter, and the tracking point on the extracted region (shown for sitting and bending).

As mentioned above, the new model can be used to classify static activities such as standing, sitting, and lying down. However, features representing static activities are not enough to detect dynamic or location-specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map, defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the global map while the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as visual tracking and localization of the subject are performed successfully, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions for different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different patterns. Detecting walking reliably allows the biomonitoring system to record the data of the subject's walking behaviour, especially for people in demand such as the elderly or the impaired. In this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most of the experimental results are generally based on normal walking conditions, mixed with other daily activities. Therefore, in this work we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. The experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for walking activity with different patterns can be seen in the experimental results section.

To demonstrate how the state transitions were handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and their dependency on localization information, are described in Algorithm 1.

switch State^s_{t-1}
    Case 0:
        if Height^s_{t-1} - Height^s_t > S_Fall
            State^s_t = 5
        elseif Height^s_{t-1} - Height^s_t > S_NoFall
            if r_t > Ratio
                State^s_t = 3
            else
                State^s_t = 2
            end
        elseif Height^s_t > H_Standing
            if isOnCycling(P^s_t(G_x, G_y))
                State^s_t = 4
            elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) > S_Walking
                State^s_t = 1
            else
                State^s_t = 0
            end
        end
    Case 1:
        if Height^s_{t-1} - Height^s_t > S_Fall
            State^s_t = 5
        elseif isOnCycling(P^s_t(G_x, G_y))
            State^s_t = 4
        elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) <= S_Walking
            State^s_t = 0
        end
end

Algorithm 1: State transition rules for the standing (Case 0) and walking (Case 1) activities.

In Algorithm 1, State^s_t is the activity on frame t. The activity labels are defined as follows: (i) standing = 0; (ii) walking = 1; (iii) sitting = 2; (iv) bending = 3; (v) cycling = 4; (vi) falling = 5; and (vii) lying down = 6. t is the current frame number. S_Fall and S_NoFall are the speed parameters used to decide between fall and non-fall cases, based on the tracked subject height at the current and previous frames, Height^s_t and Height^s_{t-1}. S_Walking is the subject speed threshold used to decide whether the subject is moving or not. Localization assistance uses the global subject position P^s_t(G_x, G_y) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, to discriminate activities such as sitting, bending, and lying down based on the person-specific Ratio threshold.

Table 1: Experimental data content.

Subjects    Walking case       Abbreviation   Number of frames
Subject 1   Impaired walking   IW             1000
            Elderly walking    EW             1000
            Normal walking     NW             1000
Subject 2   Impaired walking   IW             1000
            Elderly walking    EW             1000
            Normal walking     NW             1000
Total number of frames                        6000
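For illustration, the following is a minimal Python sketch of the standing and walking cases of Algorithm 1, assuming that the thresholds (S_Fall, S_NoFall, H_Standing, S_Walking, Ratio) and the cycling-zone test are supplied by the caller, and that the position difference is taken as the Euclidean distance on the global map; all names are illustrative.

import math

STANDING, WALKING, SITTING, BENDING, CYCLING, FALLING = 0, 1, 2, 3, 4, 5

def next_state(state, h_prev, h_cur, r_t, pos_t, pos_t_minus_N,
               is_on_cycling,  # zone test on the global position (G_x, G_y)
               S_FALL, S_NOFALL, H_STANDING, S_WALKING, RATIO):
    """State transition for the standing/walking cases of Algorithm 1."""
    moved = math.dist(pos_t, pos_t_minus_N)   # global position change over N frames
    if state == STANDING:
        if h_prev - h_cur > S_FALL:
            return FALLING
        if h_prev - h_cur > S_NOFALL:         # non-fall height drop
            return BENDING if r_t > RATIO else SITTING
        if h_cur > H_STANDING:
            if is_on_cycling(pos_t):
                return CYCLING
            return WALKING if moved > S_WALKING else STANDING
    elif state == WALKING:
        if h_prev - h_cur > S_FALL:
            return FALLING
        if is_on_cycling(pos_t):
            return CYCLING
        if moved <= S_WALKING:
            return STANDING
    return state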

3. Experiments of Visual Tracking Performances and Effect of Lighting Conditions

Obviously, good robot motion and robust visual tracking performance are crucial for the monitoring and activity recognition process. Therefore, this work initially also focused on two other aspects: (i) the visual tracking limitations that occur due to color model changes on the tracked subject, where the color model is affected by the illumination conditions from the light source or by the distance between robot and subject; and (ii) the experiment and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth-improved SOAMST tracking model proposed in Section 2.3 of this work. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how walking patterns from the same subject differ during the tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for the elderly, impaired, and normal walking situations. The normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly and impaired walking cases were simulated with wearable simulation kits, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of the two right-footed subjects to cause an impairment of the right lower limb. For elderly walking, the kit causes walking difficulty by tying the joints, in such a way that the neck and knees are constrained, making it difficult to stand straight and to make big movements as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

                   SOAMST [27]        Depth improved SOAMST
Type               True     False     True     False
Subject 1   IW     774      226       1000     0
            EW     851      149       1000     0
            NW     899      101       1000     0
Subject 2   IW     1000     0         1000     0
            EW     1000     0         1000     0
            NW     998      2         914      86
Overall results    5522     478       5914     86
Accuracy (%)       92.03    7.97      98.57    1.43

The experimental data listed in Table 1 were recorded under stable light conditions, with all light sources in the room turned on, so that the improvement provided by integrating the proposed depth likelihood into the SOAMST color region tracker could be observed while reducing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and with false tracking for each recording. For SOAMST, we observed that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. In addition, high illumination variations or shadows on the foreground region may result in false tracking with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth-improved SOAMST, only one case yielded a worse result than SOAMST [27] using only the color model for target matching. The reason for this failure may be sudden depth changes in the scene; moreover, the joint depth likelihood and color representation of a background region can have more similarity to the reference color representation. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

Figure 11: Tracking results for SOAMST and depth likelihood map supported SOAMST.

Figure 11 shows images of the tracking results of the SOAMST and the depth likelihood map based improved SOAMST models. It is clear that integrating depth data into the weight updates improves the tracking results significantly, preventing the failure cases seen in the last two columns of Figure 11. But if the color distribution model is not suitable and the depth values of the subject and background are similar, there can still be false detections, or position, scale, and orientation errors on the selected region (see the 4th column of the depth-improved tracking result in Figure 11).

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions, in order to take full advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions were run on SOAMST [27] for further analysis, examining the color features and the visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we performed several experiments modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions. Depending on the changes of the color, the robot may fail, with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, so it is possible to see which lux values (lux is a quantitative measurement of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measurement spots at 1.40 m: 450-500 lux; 1.85 m: 350-400 lux; 2.30 m: 250-300 lux; 3.00 m: 150-200 lux; 3.60 m: 100 lux; 4.95 m: 50 lux; 6.25 m: 40 lux).

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots for evaluating the tracking system were established based on the luminosity ranges measured by the CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), while the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving robot condition; height and width variance of the tracking region with one-color representation (variance plots for the red, green, and blue representations over the recorded frames).

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position closer to the subject, to within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light conditions and the distance between the subject and the robot have a great impact on the reference target model, based on the normalized color histogram, of the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.

Hence, in this scenario, two more tests were done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) then, visual tracking performance was tested with the robot in a fixed position and only the subject moving. Both experiments were done with two different scenarios, in which two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval of 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350-400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, whose centroid defines the tracking point of the current frame. Tracking success or failure is thus decided by whether the center of the target region is on the subject or not.

Since the distance between the robot and the subject was kept the same in all the experiments during the subject's motion, one would expect the search and tracked regions produced by SOAMST [27] to be similar as well; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates the covariance based on the current search region and the reference target distribution, where this matrix, based on the second-order moments, represents the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representation can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350-400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to the darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown in Figure 14.

Figure 14: Target feature analysis for the different colors (red, green, and blue) with normalized histograms at the different marks.

Figure 15: Moving robot condition; tracking results with one-color representation (red, green, and blue).

Figure 16: Moving robot condition; tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

But despite the loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it can be observed that the selected target and the tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate the scale and orientation changes at less than 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared to the one-color model. The impact of the light conditions proved to be similar to the previous scenario, where only a one-color representation is used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is in a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the distance between the subject and the robot is not preserved as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

Table 4: Moving robot condition; height and width variance of the tracking region with two-color representation (variance plots for red and blue, green and blue, and blue and dark blue over the recorded frames).

When the distance between the subject and the robot was kept within a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. Using just a one-color representation, the tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at a distance of 3.6 meters, which has around a 100 lux value, green-color-based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux with a distance greater than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on the illumination and distance factors, the results indicate that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot while it tracks a subject who is moving and changing trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is lower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects both keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, or the available free space, for the robot.

This section describes the experiment setup and the two tests that were performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum speed of the robot that prevents it from colliding with the user, considering subject and robot safety criteria. The second analyzes the speed and the obstacle inflation radius of the robot, to evaluate how they change the performance of the tracking process.

Figure 17: Fixed robot condition; tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed robot condition; tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments, the same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check the robot behaviour in different situations. For these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free-space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on visual tracking feedback. For the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update its path to follow the subject, based on the subject's position on the global map.

Figure 19: Sample experimental layout (numerical values in centimeters) and global map.

Figure 20: Experiment layouts and subject tracks (numerical values in centimeters).

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where there is only one direct path available. The main differences between the second and third layouts are the size and shape of the central obstacle. In the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout has been deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room, except in the area delimited by the obstacle in the middle of the room.

Two parameters of the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is greater than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance greater than 3.6 meters, the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or reduced texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
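A minimal sketch of this follow/stay decision rule is given below, assuming positions in meters on the global 2D map and the thresholds stated above (1.2 m and 20 cm); the function names are illustrative, not taken from the robot's actual code.

import math

FOLLOW_DIST = 1.2   # robot starts moving beyond this distance (m)
MOVE_EPS    = 0.20  # subject counts as moving after a 20 cm change (m)

def robot_should_move(robot_pos, subject_pos):
    """Start following when the subject is farther than 1.2 m away."""
    return math.dist(robot_pos, subject_pos) > FOLLOW_DIST

def subject_is_moving(subject_pos, subject_pos_prev):
    """Subject counts as moving after a 20 cm change on the global map."""
    return math.dist(subject_pos, subject_pos_prev) >= MOVE_EPS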

As stated previously, there are two important parameters for the robot motion covered in this part of the paper, which have been studied in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the system (ROS) used for the robot motion (Figure 2), these parameters are modeled with the following variables (a sanity-check sketch is given after this list):

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values this parameter takes represent the maximum forward velocity of the robot, in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
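For reference, the following is a small, hypothetical sketch of how these two values might be loaded and sanity-checked against the ranges that performed best in the experiments below (speed at most 0.60 m/s, inflation radius 0.15-0.45 m); the flat key layout assumed here is illustrative, and the check itself is ours, not part of ROS.

import yaml  # PyYAML

def check_motion_params(path="costmap_common_params.yaml"):
    """Load max_vel_x and inflation_radius and warn if they fall
    outside the ranges that worked best in our experiments."""
    with open(path) as f:
        params = yaml.safe_load(f)
    max_vel_x = params.get("max_vel_x")
    inflation_radius = params.get("inflation_radius")
    if max_vel_x is not None and max_vel_x > 0.60:
        print("warning: max_vel_x > 0.60 m/s risks collision with the subject")
    if inflation_radius is not None and not (0.15 <= inflation_radius <= 0.45):
        print("warning: inflation_radius outside the 0.15-0.45 m range")
    return max_vel_x, inflation_radius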

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests that analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 m/s until the robot no longer crashed into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value, 0.05, or a higher one, 0.35, does not affect the collision test results.

Table 5: Robot collision test results (collision: yes/no).

Speed (m/s) \ inflation radius (m)   0.05   0.35
0.70                                 Yes    Yes
0.65                                 Yes    Yes
0.60                                 No     No

4.3. Robot Performance Test Setups. The speed and the obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests that study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test, the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests were done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments record whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests were done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remained the same for all tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the moving Kinect camera, as the synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this, it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05        0.15   0.25   0.35
0.50                                 Yes (+30)   No³    No³    Yes (+40)
1.00                                 Yes         No²    No²    No²
1.50                                 Yes (+15)   No²    No³    No²

*Reasons for not completing the track: stuck¹, crash², and lost³.

Faster speeds also mean more vibration for the robot, which can contribute to losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot beyond that value, as the subject is walking faster, so there is no risk of the robot colliding with him. The speed values tested in this scenario are thus 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, where applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, and the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, with the subject stopping at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed that prevents a collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius was also modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it got stuck sometimes, but it was able to recover the tracking and finished the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes⁴   Yes⁴
0.40                                 Yes    Yes    Yes⁴   Yes⁴   Yes⁴

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes    Yes
0.40                                 Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 No¹    Yes⁴   Yes    No¹    Yes
0.50                                 No¹    No¹    Yes⁴   Yes⁴   Yes
0.40                                 No¹    No¹    No¹    No¹    No¹

*The index (4) indicates that the robot got stuck but recovered. The index (1) for incomplete tracks indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout, we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large and allows the robot to move around without serious difficulty. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value, there was no success in any of the tests, independent of the inflation radius value. The reason the robot did not complete the track is that it got stuck during tracking, especially when entering the corner regions, which restricted the robot from defining a free path to move. In this layout, faster speeds increased the success ratio of the experiment. Of the 12 tests made, 6 presented good results, while in the remaining tests the robot was not able to complete the rounds of the track. As presented in Table 9, the results are not as good as in the previous experiments; the narrow paths and obstacles drastically reduce the success rate of the experiments.

Figure 21: Track for layout 1 and the robot path during the experiment.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally describes the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance for the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated with the wearable kits demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data were used to test the activity recognition algorithm, as given in Table 1, in which the subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority than the other activities, since walking patterns can then be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from the other activities, even though including the static activities would increase the accuracy, due to the high detection rate of static actions.

In the experiments, the person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.

Table 10: Activity recognition results for different types of walking patterns.

            Type   True   False   Accuracy (%)
Subject 1   IW     984    16      98.40
            EW     974    26      97.40
            NW     947    53      94.70
Subject 2   IW     976    24      97.60
            EW     963    37      96.30
            NW     931    69      93.10
Overall performance   5775   225   96.25

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used in all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between the standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy on all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also performed experiments on the other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting are static cases without any motion (except for the falling activity), we started recognition once the activity was in its state, excluding the transition periods involved in changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions are not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

Table 11 gives the recognition results for the various activities, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which may have caused misclassification into similar or transition activities, such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was also recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. Overall, a recognition rate of 99.80% was obtained over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs; both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even when there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), the walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases, for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments on visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that multiple-sensor information fusion is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So it is beneficial to find the limitations of the color tracker used in our system, and the safe regions in which to use a visual tracker relying on color, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should be provided in the environment for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results; and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept under 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to consider when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject, but low enough that the robot does not collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot is decelerating may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. Collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s, or even lower, to prevent this from happening. The obstacle inflation radius had no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of activities performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less likely to lose the track, or its own localization on the map. This is why the results in Table 6 are good for an inflation radius value of 0.05, but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject is always walking at the same speed: because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject then has to wait for the robot to be released, which increases the time required to complete the track.

Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.

The situation when the subject stops is different from the one previously described. People at home usually walk and stop while moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds, by increasing the reaction time of the robot.

In this part, we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since it moves freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s with an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track was getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, the speed and inflation radius do not affect the performance of the tracking activity. But under less ideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. With a mobile robot, provided that robust visual tracking and robot-based subject tracking are available, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to redo whenever the subject changes. Those approaches, like the state-of-the-art models in general, also require whole-body contour features for walking or dynamic activity detection. With the proposed model, the activities can instead be recognized under limited conditions, such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the study of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under these light conditions, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper helped to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 m to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) identification of the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.
[31] Costmap, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.





Figure 5: ROS localization and path planning on the global map.

Figure 6: Manually selected region.

it is stopped by a MATLAB command. Since the costmap contains the obstacle information, the global path planner and the local planner can avoid the obstacles automatically (see Figure 5).

2.4. Visual Tracking. The visual tracking algorithm is implemented in MATLAB using the scale and orientation adaptive mean shift tracking (SOAMST) algorithm [27]. We made an improvement by integrating Kinect depth image support to increase the importance of the weights based on the depth likelihood, as proposed in [19, 20]. In addition, saliency map integration is also possible by updating the weights as proposed in [20], but in this work we omitted saliency for real-time processing performance. The details of the SOAMST color region tracker can be found in [27], and the depth and saliency improvements can be implemented similarly to [19, 20].

Depth improved SOAMST is explained as follows [19, 20, 27]:

(a) The initial target model is calculated by manually selecting the region of interest (Figure 6) on the subject, based on a normalized color histogram from the R, G, and B color channels with 16-bin quantization, and the initial target position $y_0$ is decided by the target model.

Figure 7: Depth likelihood map.

(b) Based on the initial target position, initialize the target depth (distance to the robot) from the depth data $D$ as follows:
$$\text{tempDepth} = D(y_0 - 5 : y_0 + 5,\ x_0 - 5 : x_0 + 5), \qquad d_0 = \operatorname{round}\big(\operatorname{median}(\text{tempDepth}(:))\big) \tag{1}$$

(c) Set the iteration number $k$ to 0.
(d) For the current depth map, based on the previous target position $y_0$, find the depth likelihood map (Figure 7) as in (2), in which points at the current frame with depth similar to the reference target region have the highest likelihood:
$$P_y = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(\frac{-(D_y - d_0)^2}{2\sigma^2}\right) \tag{2}$$
where $P_y$ is the depth likelihood map calculated from the depth map $D_y$ based on the depth values at point $y$, and $d_0$ is the target depth value at the previous frame.
(e) For the current data, based on the previous target position $y_0$, find the target candidate model.

(f) Calculate the weights of each pixel in the target candidate region as
$$w_i = d_i \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(y_0)}} \tag{3}$$

where $d_i$ is the depth likelihood value obtained from $P_y$, and $q_u$ and $p_u(y_0)$ are the reference target model and the target candidate model of the search region.
(g) Then the new target position can be calculated as
$$y_1 = \frac{\sum_i x_i w_i}{\sum_i w_i} \tag{4}$$
where $y_1$ is the new matched point for the subject position, obtained by comparing the initial target model and the current frame's candidate model, $x_i$ denotes the points in the search region, and $w_i$ is the weight at $x_i$.


(h) This operation continues until the matching point converges, as in
$$\Delta d = \left\|y_1 - y_0\right\|, \quad y_0 = y_1; \quad \text{if } \Delta d < \varepsilon \text{ or } k \ge 15, \text{ go to step (i); else } k = k + 1 \text{ and go to step (e)} \tag{5}$$

(i) After the new tracking point is found, the height, width, and orientation of the target candidate model can be calculated by singular value decomposition of the covariance matrix obtained from the second-order moment features as follows:
$$\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \times \begin{bmatrix} a^2 & 0 \\ 0 & b^2 \end{bmatrix} \times \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^{T} = \operatorname{SVD}(\text{Cov}) \tag{6}$$
$$\text{Cov} = \begin{bmatrix} \mu_{20} & \mu_{11} \\ \mu_{11} & \mu_{02} \end{bmatrix}, \qquad a = k\lambda_1, \qquad b = k\lambda_2, \qquad U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \tag{7}$$
where $[u_{11}\ u_{21}]^T$ and $[u_{12}\ u_{22}]^T$ yield the orientation of the tracked region, and $a^2$ and $b^2$ are the scaled eigenvalues of Cov in (7), which represent the height and width of the tracked ellipsoid region.

(j) The next step is then to find the search region for the next frame as
$$\text{Cov}_2 = U \times \begin{bmatrix} (a + \Delta d)^2 & 0 \\ 0 & (b + \Delta d)^2 \end{bmatrix} \times U^{T} \tag{8}$$
$$(x - y_1) \times \text{Cov}_2^{-1} \times (x - y_1)^T \le 1 \tag{9}$$
where the points $x$ that satisfy the condition in (9) form the next search region, and $\Delta d$ is the change applied to the search region in the next frame.
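To make steps (d), (f), (g), and (i)-(j) concrete, below is a minimal Python/NumPy sketch of one depth-weighted iteration. It is an illustration under stated assumptions rather than the authors' MATLAB implementation: the per-pixel weight uses the standard mean shift form, evaluating sqrt(q_u/p_u) at each pixel's histogram bin, and the bandwidth sigma, function names, and argument layout are all hypothetical.

```python
import numpy as np

def init_target_depth(D, y0, x0):
    """Eq. (1): initial target depth as the median of an 11x11 depth window
    around the selected target position (y0, x0). Python slices are
    end-exclusive, hence the +6."""
    window = D[y0 - 5:y0 + 6, x0 - 5:x0 + 6].astype(float)
    return np.round(np.median(window))

def depth_likelihood(depth_map, d0, sigma=200.0):
    """Eq. (2): likelihood that each pixel belongs to the target, given
    the target depth d0 estimated at the previous frame."""
    return np.exp(-(depth_map - d0) ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

def mean_shift_step(xs, ys, d_lik, q, p, bin_idx):
    """Eqs. (3)-(4): depth-weighted mean shift update of the target position.
    xs, ys  : pixel coordinates inside the candidate region
    d_lik   : depth likelihood values at those pixels (d_i in eq. (3))
    q, p    : reference and candidate color histograms (q_u and p_u(y0))
    bin_idx : color histogram bin index of each pixel
    """
    ratio = np.sqrt(q[bin_idx] / np.maximum(p[bin_idx], 1e-12))
    w = d_lik * ratio                          # eq. (3)
    y1 = (np.sum(xs * w) / np.sum(w),          # eq. (4)
          np.sum(ys * w) / np.sum(w))
    return y1, w

def region_scale_orientation(mu20, mu02, mu11, k=1.0, delta_d=5.0):
    """Eqs. (6)-(8): orientation and scale of the tracked ellipsoid from
    second-order moments, plus the enlarged search-region covariance."""
    cov = np.array([[mu20, mu11], [mu11, mu02]])
    U, lam, _ = np.linalg.svd(cov)                      # eq. (6)
    a, b = k * lam[0], k * lam[1]                       # eq. (7)
    cov2 = U @ np.diag([(a + delta_d) ** 2,
                        (b + delta_d) ** 2]) @ U.T      # eq. (8)
    return U, a, b, cov2
```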

2.5. Activity Recognition. Through the integration of mapping and localization, a new activity recognition model has been proposed to handle the problems of our previous approach, such as occlusion cases [5, 8]. This update is a fast, person-specific model that does not require any training process with a large amount of off-line training data. The new model takes advantage of person-specific heuristics, state transition rules, and global localization of the subject, utilizing the information provided through robot localization. Using these features, the model classifies the possible indoor daily activities, such as walking, standing, sitting, bending, lying down, cycling, and falling. The state transition rules between activities are given in Figure 8.

Person-specific features include variables such as:
(1) a height parameter based on the extracted contour region of the tracked subject;

Figure 8: State transitions applied in the activity recognition model (states: standing, walking, sitting, bending, lying-down, cycling, falling).

(2) the change in the height parameter between frames;
(3) a ratio parameter that can be calculated over the top region of the extracted body contour (see Figure 9);
(4) the subject location, based on the tracking point on the subject calculated by the visual tracking algorithm using the Kinect, and the robot global position obtained by lidar and odometry. The transformation from local Kinect 3D data to the global 2D map is given in (10) [34] (a code sketch of this conversion follows this list):
$$\begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = \mathbf{A} \times \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix} \tag{10}$$
$$\mathbf{A} = \begin{bmatrix} s_x \cos(\alpha) & -s_y \sin(\alpha) \\ s_x \sin(\alpha) & s_y \cos(\alpha) \end{bmatrix} \tag{11}$$
$$T_{Gx} = R_{Gx} - d\cos(\theta), \qquad T_{Gy} = R_{Gy} - d\sin(\theta) \tag{12}$$
where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the global 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect data of the subject on the relevant $x$-axis and $y$-axis of the Kinect with respect to the global 2D map. $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating table angle. The transformation matrix $\mathbf{A}$ is shown in (11), and $s_x$ and $s_y$ of $\mathbf{A}$ are the scaling coefficients on the $x$-axis and $y$-axis. $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$-axis and $y$-axis, respectively, due to the difference between the robot center and the Kinect position on the robot; they were computed as in (12) from $R_{Gx}$ and $R_{Gy}$, the robot positions on the global 2D map, $d$, the distance between the Kinect position and the robot center, and $\theta$, the robot pose on the global map;

(5) the velocity of the subject between a defined number of frames.
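As a concrete reading of (10)–(12), the following is a minimal sketch of the local-to-global conversion; the function name and the flat-argument interface are illustrative assumptions, not the actual implementation.

```python
import math

def kinect_local_to_global(SLx, SLy, alpha, sx, sy, RGx, RGy, theta, d):
    """Eqs. (10)-(12): map a subject position from local Kinect coordinates
    (SLx, SLy) to the global 2D map.
    alpha    : Kinect pose on the global map (robot pose + rotating table angle)
    sx, sy   : axis scaling coefficients of the matrix A
    RGx, RGy : robot position on the global 2D map; theta: robot pose
    d        : distance between the Kinect position and the robot center
    """
    # Eq. (12): translation caused by the Kinect offset from the robot center
    TGx = RGx - d * math.cos(theta)
    TGy = RGy - d * math.sin(theta)
    # Eqs. (10)-(11): scaled rotation of the local point, then translation
    SGx = sx * math.cos(alpha) * SLx - sy * math.sin(alpha) * SLy + TGx
    SGy = sx * math.sin(alpha) * SLx + sy * math.cos(alpha) * SLy + TGy
    return SGx, SGy
```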

As mentioned, the new model can be used to classify static activities such as standing, sitting, and lying down.


Figure 9: Extracted contour region, the lines on the top part of the body used to calculate the ratio parameter, and the tracking point on the extracted region (sitting and bending examples).

However, features representing static activities are not enough to detect dynamic or location-based specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map, defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the map while the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as the visual tracking and the localization of the subject are done successfully, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions for different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different walking patterns. Detecting the walking activity allows the biomonitoring system to record the data of the subject's walking behaviour, especially for people in need such as the elderly or impaired people. In this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most of the experimental results are generally based on normal walking conditions together with other daily activities. Therefore, in this work we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. Experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for the walking activity for different patterns can be seen in the experimental results section.

To demonstrate how the state transitions were handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and dependency on localization information, are described in Algorithm 1.

switch State_{t-1}
  Case 0 (standing):
    if Height^s_{t-1} - Height^s_t > S_Fall
      State_t = 5
    elseif Height^s_{t-1} - Height^s_t > S_NoFall
      if r_t > Ratio
        State_t = 3
      else
        State_t = 2
      end
    elseif Height^s_t > H_Standing
      if isOnCycling(P^s_t(G_x, G_y))
        State_t = 4
      elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) > S_Walking
        State_t = 1
      else
        State_t = 0
      end
    end
  Case 1 (walking):
    if Height^s_{t-1} - Height^s_t > S_Fall
      State_t = 5
    elseif isOnCycling(P^s_t(G_x, G_y))
      State_t = 4
    elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) <= S_Walking
      State_t = 0
    end
end

Algorithm 1

In Algorithm 1, $State_t$ is the activity at frame $t$. The activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. $t$ is the current frame number. $S_{Fall}$ and $S_{NoFall}$ are the speed parameters used to decide fall or nonfall cases, based on the tracked subject height at the previous and current frames, $Height^s_{t-1}$ and $Height^s_t$. $S_{Walking}$ is the subject speed threshold used to decide whether the subject is moving or not.


Table 1: Experimental data content.

Subjects | Walking case | Abbreviation | Number of frames
Subject 1 | Impaired walking | IW | 1000
Subject 1 | Elderly walking | EW | 1000
Subject 1 | Normal walking | NW | 1000
Subject 2 | Impaired walking | IW | 1000
Subject 2 | Elderly walking | EW | 1000
Subject 2 | Normal walking | NW | 1000
Total number of frames | | | 6000

Localization assistance uses the global subject position $P^s_t(G_x, G_y)$ for the necessary cases. And $r_t$ is the ratio parameter at frame $t$, defined around the maximum height of the tracked region as shown in Figure 9, used to discriminate activities such as sitting, bending, and lying down based on a person-specific Ratio threshold.
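For clarity, the standing/walking cases of Algorithm 1 can be rendered as the following Python sketch. The threshold names mirror the paper's symbols; the Euclidean displacement metric and the is_on_cycling zone test are illustrative assumptions about details the paper leaves open.

```python
import math

STANDING, WALKING, SITTING, BENDING, CYCLING, FALLING = 0, 1, 2, 3, 4, 5

def next_state(state, h_prev, h_cur, r_t, pos_t, pos_t_minus_N,
               S_fall, S_nofall, H_standing, Ratio, S_walking, is_on_cycling):
    """State transition of Algorithm 1 for the standing (0) and walking (1) cases.
    h_prev, h_cur        : tracked subject height at frames t-1 and t
    r_t                  : ratio parameter of the top body-contour region (Figure 9)
    pos_t, pos_t_minus_N : global positions P_t and P_{t-N} on the 2D map
    """
    drop = h_prev - h_cur
    # Displacement over N frames; the distance metric is an assumption here.
    moved = math.hypot(pos_t[0] - pos_t_minus_N[0], pos_t[1] - pos_t_minus_N[1])
    if state == STANDING:
        if drop > S_fall:
            return FALLING
        if drop > S_nofall:                       # height dropped, but no fall
            return BENDING if r_t > Ratio else SITTING
        if h_cur > H_standing:                    # subject is still upright
            if is_on_cycling(pos_t):              # stationary bicycle zone
                return CYCLING
            return WALKING if moved > S_walking else STANDING
    elif state == WALKING:
        if drop > S_fall:
            return FALLING
        if is_on_cycling(pos_t):
            return CYCLING
        if moved <= S_walking:
            return STANDING
    return state
```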

3. Experiments of Visual Tracking Performances and Effect of Lighting Conditions

Obviously, having good robot motion and robust visual tracking performance is a crucial factor for the monitoring and activity recognition process. Therefore, this work also focused initially on two other aspects: (i) the visual tracking limitations that occur due to changes in the color model of the tracked subject, in which the color model is affected by the illumination conditions of the light source or by the distance between robot and subject, and (ii) the experiments and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth improved SOAMST tracking model proposed in this work in Section 2.4. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for elderly, impaired, and normal walking situations. Normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly walking and impaired walking cases were simulated with a wearable simulation kit, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of two right-footed subjects to cause impairment of the right lower limb. For elderly walking, the kit causes walking difficulty through tied joints, in a way that the neck and knees were constrained, causing difficulty in standing straight and making big moves as in normal walking.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

Type | SOAMST [27] True | SOAMST [27] False | Depth improved SOAMST True | Depth improved SOAMST False
Subject 1, IW | 774 | 226 | 1000 | 0
Subject 1, EW | 851 | 149 | 1000 | 0
Subject 1, NW | 899 | 101 | 1000 | 0
Subject 2, IW | 1000 | 0 | 1000 | 0
Subject 2, EW | 1000 | 0 | 1000 | 0
Subject 2, NW | 998 | 2 | 914 | 86
Overall results | 5522 | 478 | 5914 | 86
Accuracy (%) | 92.03 | 7.97 | 98.57 | 1.43

Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

The experimental data recorded in Table 1 was prepared under stable light conditions, turning on all light sources in the room, in order to observe the improvement produced by the proposed depth likelihood integration into the SOAMST color region tracker while decreasing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For SOAMST, we observed that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset.


Figure 11: Tracking results for SOAMST [27] and depth likelihood map supported SOAMST.

For the proposed depth improved SOAMST, only one case yielded a worse result than SOAMST [27] with its color-only target matching. The failure can be attributed to sudden depth changes in the scene; in addition, the joint depth likelihood and color representation of a background region can be more similar to the reference color representation. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

In Figure 11, images of the tracking results of the SOAMST and the depth likelihood map based improved SOAMST models are given. It is apparent that integrating depth data into the weight updates improves the tracking results significantly, preventing failure cases such as those in the last two columns of Figure 11. Still, if the color distribution model is not suitable and the depth values of the subject and background are similar, there can be false detections or position, scale, and orientation errors on the selected region (see the 4th column of the depth improved tracking result in Figure 11).

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions to take full advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions were run on SOAMST [27] for further analysis, examining the color features and visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions. Depending on the changes of the color, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, so it is possible to see which lux values (quantitative measurements of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measured illumination by distance from the camera: 1.40 m, 450–500 lux; 1.85 m, 350–400 lux; 2.30 m, 250–300 lux; 3.00 m, 150–200 lux; 3.60 m, 100 lux; 4.95 m, 50 lux; 6.25 m, 40 lux).

Figure 13: Illumination sensor (CUSTOM LX-2000SD light meter).


Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor, a CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), and the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).


Table 3: Moving robot condition: height and width variance of the tracking region with one-color representation (variance plots over frames for the red, green, and blue representations).


3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position closer to the subject, to a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light condition and the distance between the subject and the robot have a great impact on the reference target model based on the normalized color histogram used by the visual color tracking algorithm, because these condition deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.

Hence, in this scenario, two more different tests were done: (i) one test was to study the visual tracker while both the robot and the subject move; (ii) then, the visual tracking performance was tested when the robot is at a fixed position and only the subject moves. Both experiments were done with two different scenarios, in which two different region templates were tried in each case to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval of 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area for matching the target model, and the red ellipsoid region is the detected target region, where the centroid of the target region defines the tracking point of the current frame. Tracking success or failure is thus decided by whether the center of the target region is on the subject or not.

Since during the subject's motion the distance between the robot and the subject was kept the same in all the experiments, one would expect the search and tracked regions of the SOAMST [27] to be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates the covariance based on the current search region and the reference target distribution, in which this matrix, based on the second-order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27].


Figure 14: Target feature analysis for the red, green, and blue colors with normalized histograms (panels 1–3: subject at 1.85, 3.60, and 6.25 m with the robot fixed; panels 4–6: the same marks with the robot moved closer).


Figure 15: Moving robot condition: tracking results with one-color representation (red, green, blue).

Figure 16: Moving robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

Depending on the orientation of the tracked ellipsoid region, these high deviations in the region representations can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350–400 (roughly frames 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown earlier in Figure 14.

Despite the loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it can be seen that the selected target and the tracked region are not consistent anymore (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color based tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared to the one-color model situation. The impact of the light conditions proved to be similar to the previous scenarios, in which only a one-color representation is used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is at a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the distance between the subject and the robot is not preserved as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.


Table 4: Moving robot condition: height and width variance of the tracking region with two-color representation (variance plots over frames for red and blue, green and blue, and blue and dark blue).


When the distance between the subject and the robot was kept within a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. Using just a one-color representation, the tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green color based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that also had a color similar to the reference target model. Finally, the red and blue cases could barely continue tracking the subject (Figure 17). So light conditions below 50 lux with a distance higher than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color and fixed robot experiments, subject tracking fails in all the cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on illumination and distance factors, the results state that some constraints should be applied to the robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, or the available free space, for the robot.

This section describes the experiment setup and the two tests that were performed: (i) the collision speed test and (ii) the robot performance test. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering the subject and robot safety criteria.


Figure 17: Fixed robot condition: tracking results with one-color representation (red, green, blue).

Figure 18: Fixed robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

The second test analyzes the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. To set up the experiment, the first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments the same room was used in all tests, but different layouts were also tested by making the free space smaller or larger, to check the robot behaviour in different situations. With these layouts, setting the experiment in the same room or in a different room was not very important, since the relevant point is testing the robot motion in different situations with various free space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on the visual tracking feedback.

Figure 19: Sample experimental layout (numerical values in centimeters) and global map.

For the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path used to follow the subject, based on the subject's position on the global map.


Figure 20: Experiment layouts and subject tracks for layouts 1, 2, and 3 (numerical values in centimeters).


The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room, which makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle: in the third layout the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters for the robot's decision on moving to follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance higher than 3.6 meters the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
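These two thresholds reduce to a simple trigger rule; a minimal sketch follows, with hypothetical function and constant names and a Euclidean distance check assumed.

```python
import math

FOLLOW_DISTANCE = 1.2   # meters: robot starts moving beyond this separation
MOVE_THRESHOLD = 0.20   # meters: subject displacement counted as movement

def should_robot_move(robot_xy, subject_xy):
    """True when the subject is farther than 1.2 m, so the robot repositions."""
    return math.dist(robot_xy, subject_xy) > FOLLOW_DISTANCE

def subject_is_moving(subject_xy, subject_xy_prev):
    """True when the subject covered at least 20 cm on the global map."""
    return math.dist(subject_xy, subject_xy_prev) >= MOVE_THRESHOLD
```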

As stated previously, there are two important robot motion parameters covered in this part of the paper, which have been studied in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the system (ROS) used for the robot motion (Figure 2), these parameters are modeled by the following variables:

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values this parameter takes represent the maximum forward velocity, in meters/second, of the robot. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
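As an illustration only, the two variables could appear in the YAML configuration roughly as below. The exact file layout and defaults vary between ROS navigation stack versions, so this is a hedged sketch rather than the authors' actual configuration (in many setups max_vel_x lives in the local planner's parameter file rather than the costmap file).

```yaml
# costmap_common_params.yaml (sketch): obstacle inflation on the costmap
inflation_radius: 0.15   # meters; values of 0.05-0.45 were tested in this paper

# local planner parameters (sketch): forward speed limit of the robot
max_vel_x: 0.5           # m/s; 0.5-0.6 gave the best tracking results here
```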

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of robot deceleration: at higher speeds the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 m/s until the robot did not crash into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.


Table 5: Robot collision test results.

Speed (m/s) \ Inflation radius (m) | 0.05 | 0.35
0.70 | Yes | Yes
0.65 | Yes | Yes
0.60 | No | No


4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests used to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test, the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests were done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time needed to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:
(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests were done with different values for the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can rarely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this, it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ Inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35
0.50 | Yes (+30) | No(3) | No(3) | Yes (+40)
1.00 | Yes | No(2) | No(2) | No(2)
1.50 | Yes (+15) | No(2) | No(3) | No(2)

*Reasons for not completing the track: stuck (1), crash (2), and lost (3).


Faster speeds also mean more vibration for the robot, which can contribute to losing its localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot over that value, as the subject is walking faster, so there is no risk that the robot could collide with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed that prevents a collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it sometimes got stuck, but it was able to recover the tracking and finish the tests with success.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes(4) | Yes(4)
0.40 | Yes | Yes | Yes(4) | Yes(4) | Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes | Yes
0.40 | Yes | Yes | Yes | Yes | Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) / inflation radius (m)   0.05    0.15     0.25     0.35     0.45
0.60                                 No(1)   Yes(4)   Yes      No(1)    Yes
0.50                                 No(1)   No(1)    Yes(4)   Yes(4)   Yes
0.40                                 No(1)   No(1)    No(1)    No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for the incomplete-track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject in any of the combinations of speed and inflation radius. The main reason for the good results is that in this kind of layout the robot can move in the center of the room. The robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. Blue lines in Figure 21 express the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, and it allows the robot to move around without serious difficulties. For some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

Figure 21: Track for layout 1 and robot path during the experiment.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s. For this speed value there was no success in any of the tests, independently of the inflation radius value. The reason why the robot did not complete the track is that it got stuck during tracking, especially after entering the corner regions, which restricted the robot from finding a free path to move. In this layout faster speeds increased the success ratio of the experiment. Of the 15 tests made, 6 presented good results, while the robot was not able to complete the rounds of the track in the remaining 9. As presented in Table 9, the results are not as good as for the previous experiments. The narrow paths and obstacles drastically reduce the success rate of the experiments.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, namely impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walk or stand inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority than other activities, since the walking patterns can then be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this should be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy due to the high detection rate of static actions.

In the experiments, the person specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.


Table 10: Activity recognition results for different types of walking patterns.

            Type   True   False   Accuracy (%)
Subject 1   IW     984    16      98.40
            EW     974    26      97.40
            NW     947    53      94.70
Subject 2   IW     976    24      97.60
            EW     963    37      96.30
            NW     931    69      93.10
Overall performance  5775  225    96.25


Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the rate of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling is, like walking, another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its stable state, without the transition periods that occur when changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11 the recognition results for the various activities are given, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar activities, such as sitting or standing, and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was also recognized with more than 99% accuracy, which is crucial for detecting this activity and triggering emergency actions for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and these two activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, as no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section for activity recognition (Section 2.5). With this information, even though there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), the walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.



Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that the integration of depth information to handle illumination variations clearly improved the tracking results. This means that multiple sensory information fusion is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to be able to handle various illumination conditions, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So it is beneficial to find the limitations of the color tracker used in our system and the safe regions in which to utilize the color-based visual tracker, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be ensured for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100-200 lux for good results, and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject is kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
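These two guard conditions can be folded into a single check before trusting the color tracker; the following is a minimal sketch (the function name, the default thresholds, and the idea of gating the tracker this way are our illustration, not part of the implemented system):

```python
def color_tracking_reliable(lux, distance_m, min_lux=100.0, max_dist_m=3.6):
    """Return True when color-based tracking can be trusted.

    Thresholds follow the findings above: at least ~100 lux on the
    subject and a robot-to-subject distance below ~3.6 m.
    """
    return lux >= min_lux and distance_m <= max_dist_m

# 250 lux at 2.3 m lies in the safe region; 40 lux at 6.25 m does not.
assert color_tracking_reliable(250, 2.3)
assert not color_tracking_reliable(40, 6.25)
```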

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject but low enough not to make the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot is decelerating may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set to 0.60 m/s or even lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system depending on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As has been mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented there, the rate of success is quite low, especially when the speed is set to 1.00 or 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less likely to lose the track or its localization on the map. This is the reason why Table 6 presents good results for an inflation radius of 0.05, while the results get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject is always walking at the same speed. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject has to wait for the robot to be released, and that increases the time required to complete the track.

Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.

The situation when the subject stops is different from the one previously exposed. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since the robot moves freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests states that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception in this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05 meter inflation radius the robot can get close to the obstacles, and this may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that the robot got stuck during navigation.

The success rate for these two speed values, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason not to complete the track was that it got stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

As a summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But under nonideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.
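As an illustration only, this recommended operating point could be applied to a ROS navigation stack like the one used in this system [28, 31]; the exact parameter namespaces depend on the local planner and costmap configuration, so the paths below are assumptions:

```python
#!/usr/bin/env python
# Hedged sketch: push the recommended motion parameters to a
# move_base-style navigation stack. The parameter namespaces vary with
# the planner/costmap configuration and are assumed here.
import rospy

rospy.init_node("set_tracking_params")

# Maximum forward speed: 0.5-0.6 m/s avoids collisions with the subject.
rospy.set_param("/move_base/TrajectoryPlannerROS/max_vel_x", 0.6)

# Obstacle inflation radius: 0.15-0.45 m keeps narrow passages passable
# while still keeping the planned path away from obstacles.
rospy.set_param("/move_base/global_costmap/inflation_radius", 0.15)
rospy.set_param("/move_base/local_costmap/inflation_radius", 0.15)
```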

6.3. Activity Recognition Results. The experimental results show that, provided robust visual tracking and robot based subject tracking, the mobile robot can achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data of the subjects.

Our previous works [5, 8] had activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to repeat each time the subject changes. Moreover, those approaches, like most state-of-the-art models, generally require whole body contour features for walking or dynamic activity detection. With the proposed model, in contrast, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illumination of 100-200 lux or more. When this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color based visual tracking decreases with distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and psychological responses to using mobile robot systems.

Analyzing the robot parameters in the scenarios used for the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper helped to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, test speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values can provoke a collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting the walking activity of different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) the determination of the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22-27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789-1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837-1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965-1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634-5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430-2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1-7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455-461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786-788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274-283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90-126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790-808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553-560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1-12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541-1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882-1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363-370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976-990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318-329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1-12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579-590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34-46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52-61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300-307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1-5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55-60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23-33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.



(h) This operation continues until the matching point converges, as in

$$\Delta d = \left\| y_1 - y_0 \right\|, \quad y_0 = y_1; \quad \text{if } \Delta d < \varepsilon \text{ or } k \ge 15 \text{, go to step (i); else } k = k + 1 \text{ and go to step (e).} \tag{5}$$

(i) After the new tracking point is found, the height, width, and orientation of the target candidate model can be calculated by singular value decomposition of the covariance matrix obtained from the second order moment features as follows:

$$\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \times \begin{bmatrix} a^2 & 0 \\ 0 & b^2 \end{bmatrix} \times \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^{T} = \mathrm{SVD}\left(\mathrm{Cov}\right) \tag{6}$$

$$\mathrm{Cov} = \begin{bmatrix} \mu_{20} & \mu_{11} \\ \mu_{11} & \mu_{02} \end{bmatrix}, \quad a = k\lambda_1, \quad b = k\lambda_2, \quad U = \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix} \tag{7}$$

where $[u_{11}\; u_{21}]^T$ and $[u_{12}\; u_{22}]^T$ can yield the orientation of the tracked region, and $a^2$ and $b^2$ are the scaled eigenvalues of $\mathrm{Cov}$ in (7), which can represent the height and width of the tracked ellipsoid region.

(j) Then the next step is to find the search region for the next frame as

$$\mathrm{Cov}_2 = U \times \begin{bmatrix} (a + \Delta d)^2 & 0 \\ 0 & (b + \Delta d)^2 \end{bmatrix} \times U^{T} \tag{8}$$

$$(x - y_1) \times \mathrm{Cov}_2^{-1} \times (x - y_1)^{T} \le 1 \tag{9}$$

where the points $x$ that satisfy the condition in (9) lie in the next search region, and $\Delta d$ is the change of the search region in the next frame.
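A compact numeric sketch of steps (i)-(j) follows, assuming the weighted second order central moments of the matched region have already been computed (numpy only; the function and variable names are ours, and the exact scaling of the eigenvalues by the factor k follows [27]):

```python
import numpy as np

def ellipse_from_moments(mu20, mu11, mu02, k=1.0, delta_d=0.5):
    """Estimate the tracked ellipse and next search region, eqs (6)-(8).

    mu20, mu11, mu02: weighted second order central moments of the
    matched region; k: eigenvalue scale factor; delta_d: margin used to
    enlarge the search region for the next frame.
    """
    cov = np.array([[mu20, mu11],
                    [mu11, mu02]], dtype=float)
    # Cov = U * diag(a^2, b^2) * U^T via SVD, eq (6); for this symmetric
    # matrix the singular values are the eigenvalues of Cov.
    U, s, _ = np.linalg.svd(cov)
    a, b = k * np.sqrt(s[0]), k * np.sqrt(s[1])   # ellipse half-axes, eq (7)
    theta = np.arctan2(U[1, 0], U[0, 0])          # orientation from [u11 u21]^T
    # Enlarged covariance defining the next-frame search region, eq (8).
    cov2 = U @ np.diag([(a + delta_d) ** 2, (b + delta_d) ** 2]) @ U.T
    return a, b, theta, cov2

def in_search_region(x, y1, cov2):
    """Membership test of eq (9): (x - y1) Cov2^-1 (x - y1)^T <= 1."""
    d = np.asarray(x, float) - np.asarray(y1, float)
    return float(d @ np.linalg.inv(cov2) @ d) <= 1.0
```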

2.5. Activity Recognition. By the integration of mapping and localization, a new activity recognition model has been proposed to handle the problems of our previous approach, such as occlusion cases [5, 8]. This update includes a new, fast, and person specific model that does not require any training process using a large amount of off-line training data. The new model takes advantage of person specific heuristics, state transition rules, and global localization of the subject by utilizing the information from robot localization. Using these features, the model classifies the possible indoor daily activities, such as walking, standing, sitting, bending, lying down, cycling, and falling. The state transition rules between the activities are given in Figure 8.

Figure 8: State transition applied in the activity recognition model.

Person specific features include variables such as:

(1) a height parameter based on the extracted contour region of the tracked subject;

(2) the change in the height parameter between frames;

(3) a ratio parameter that can be calculated from the top region of the extracted body contour (see Figure 9);

(4) the subject location, based on the tracking point on the subject calculated by the visual tracking algorithm using Kinect and on the robot global position obtained by lidar and odometry (the transformation from local Kinect 3D data to the global 2D map is given in (10) [34]; a code sketch of this transformation is given after this list):

$$\begin{bmatrix} S_{Gx} \\ S_{Gy} \end{bmatrix} = \mathbf{A} \times \begin{bmatrix} S_{Lx} \\ S_{Ly} \end{bmatrix} + \begin{bmatrix} T_{Gx} \\ T_{Gy} \end{bmatrix} \tag{10}$$

$$\mathbf{A} = \begin{bmatrix} s_x \cos(\alpha) & -s_y \sin(\alpha) \\ s_x \sin(\alpha) & s_y \cos(\alpha) \end{bmatrix} \tag{11}$$

$$T_{Gx} = R_{Gx} - d\cos(\theta), \qquad T_{Gy} = R_{Gy} - d\sin(\theta) \tag{12}$$

where $S_{Gx}$ and $S_{Gy}$ are the subject's global $x$ and $y$ positions on the global 2D map, and $S_{Lx}$ and $S_{Ly}$ are the local Kinect data of the subject on the relevant $x$-axis and $y$-axis of the Kinect with respect to the global 2D map. $\alpha$ is the Kinect pose on the global map, calculated from the robot pose and the rotating table angle. The transformation matrix $\mathbf{A}$ is shown in (11), and $s_x$ and $s_y$ of $\mathbf{A}$ are the scaling coefficients on the $x$-axis and $y$-axis. $T_{Gx}$ and $T_{Gy}$ are the translation values on the $x$-axis and $y$-axis, respectively, due to the difference between the robot center and the Kinect position on the robot. The translation values were computed as in (12) with the given $R_{Gx}$ and $R_{Gy}$, which are the robot positions on the global 2D map, and $d$, which is the distance between the Kinect position and the robot center. The robot pose on the global map is given as $\theta$;

(5) the velocity of the subject over a defined number of frames.
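As referenced in item (4), a minimal numpy sketch of the transformation in (10)-(12) follows; the calibration values (scaling coefficients, Kinect offset d) are placeholders, not the calibrated values of the actual platform:

```python
import numpy as np

def kinect_local_to_global(S_L, R_G, theta, alpha, d=0.2, s_x=1.0, s_y=1.0):
    """Map a subject position from the Kinect-local frame to the global 2D map.

    S_L: (S_Lx, S_Ly) local Kinect position of the subject;
    R_G: (R_Gx, R_Gy) robot position on the global map;
    theta: robot pose angle; alpha: Kinect pose angle (robot pose plus
    rotating table angle); d: Kinect-to-robot-center distance;
    s_x, s_y: axis scaling coefficients. Angles in radians; all values
    here are placeholder calibration constants.
    """
    A = np.array([[s_x * np.cos(alpha), -s_y * np.sin(alpha)],
                  [s_x * np.sin(alpha),  s_y * np.cos(alpha)]])  # eq (11)
    T = np.array([R_G[0] - d * np.cos(theta),
                  R_G[1] - d * np.sin(theta)])                   # eq (12)
    return A @ np.asarray(S_L, dtype=float) + T                  # eq (10)
```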

Figure 9: Extracted contour region lines on the top part of the body, used to calculate the ratio parameter, and the tracking point on the extracted region.

As mentioned, the new model can be used to classify static activities such as standing, sitting, and lying down. However, features representing static activities are not enough to detect dynamic or location based specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the global map while the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as the visual tracking and localization of the subject are successful, the activity recognition model can achieve the classification task with high accuracy, independently of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions for different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for the stated walking patterns. Knowing when the subject is walking allows the biomonitoring system to record the data of the subject's walking behaviour, especially for people in demand, such as elderly or impaired people. In this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most of the experimental results are generally based on normal walking conditions among other daily activities. Therefore, in this work we tested the proposed activity recognition algorithm for the walking condition with normal, impaired, and elderly walking patterns. The experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for the walking activity with different patterns can be seen in the experimental results section.

To demonstrate how the state transitions are handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and their dependency on localization information, are described in Algorithm 1.

switch State_{t-1}

  Case 0 (standing):
    if Height_{t-1} - Height_t > S_Fall
      State_t = 5
    elseif Height_{t-1} - Height_t > S_NoFall
      if r_t > Ratio
        State_t = 3
      else
        State_t = 2
      end
    elseif Height_t > H_Standing
      if isOnCycling(P_t(Gx, Gy))
        State_t = 4
      elseif abs(P_t(Gx, Gy) - P_{t-N}(Gx, Gy)) > S_Walking
        State_t = 1
      else
        State_t = 0
      end
    end

  Case 1 (walking):
    if Height_{t-1} - Height_t > S_Fall
      State_t = 5
    elseif isOnCycling(P_t(Gx, Gy))
      State_t = 4
    elseif abs(P_t(Gx, Gy) - P_{t-N}(Gx, Gy)) <= S_Walking
      State_t = 0
    end

end

Algorithm 1

Table 1: Experimental data content.

Subjects    Walking case       Abbreviation   Number of frames
Subject 1   Impaired walking   IW             1000
            Elderly walking    EW             1000
            Normal walking     NW             1000
Subject 2   Impaired walking   IW             1000
            Elderly walking    EW             1000
            Normal walking     NW             1000
Total number of frames                        6000

In Algorithm 1, State_t is the activity at frame t. The activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. t is the current frame number. S_Fall and S_NoFall are the speed parameters used to decide fall or nonfall cases based on the tracked subject height at the previous and current frames, Height_{t-1} and Height_t. S_Walking is the subject speed threshold used to decide whether the subject is moving or not. Localization assistance uses the global subject position P_t(Gx, Gy) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, to discriminate activities such as sitting, bending, and lying down based on the person specific Ratio threshold.
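For concreteness, the standing/walking part of Algorithm 1 can be rendered in Python as below; all threshold values are person specific placeholders, and the cycling-zone bounds are invented for illustration:

```python
import numpy as np

STANDING, WALKING, SITTING, BENDING, CYCLING, FALLING = 0, 1, 2, 3, 4, 5

def is_on_cycling(pos, zone=((1.0, 2.0), (1.5, 2.5))):
    """True if the global position lies in the stationary-bicycle zone."""
    (x0, x1), (y0, y1) = zone  # placeholder zone bounds on the 2D map
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def next_state(state, h_prev, h_cur, r_t, pos_t, pos_t_minus_N,
               S_fall=0.5, S_nofall=0.2, ratio_thr=0.6,
               H_standing=1.4, S_walking=0.2):
    """One step of the Algorithm 1 transitions for states 0 and 1."""
    drop = h_prev - h_cur                                   # height change
    moved = np.linalg.norm(np.subtract(pos_t, pos_t_minus_N))
    if state == STANDING:
        if drop > S_fall:
            return FALLING
        if drop > S_nofall:                     # lowered pose, not a fall
            return BENDING if r_t > ratio_thr else SITTING
        if h_cur > H_standing:                  # subject is still upright
            if is_on_cycling(pos_t):
                return CYCLING
            return WALKING if moved > S_walking else STANDING
    elif state == WALKING:
        if drop > S_fall:
            return FALLING
        if is_on_cycling(pos_t):
            return CYCLING
        if moved <= S_walking:
            return STANDING
    return state
```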

3. Experiments of Visual Tracking Performances and Effect of Lighting Conditions

Obviously, it is a crucial factor for the monitoring and activity recognition process to have good robot motion and robust visual tracking performance. Therefore, this work initially also focused on two other aspects: (i) the visual tracking limitations that occur due to color model changes on the tracked subject, in which the color model is affected by the illumination conditions of the light source or the distance between robot and subject; and (ii) the experiments and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition was tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth improved SOAMST tracking model proposed in Section 2.3 of this work. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during the tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data in elderly, impaired, and normal walking situations. Normal walking experiments were done at self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly and impaired walking cases were simulated with a wearable simulation kit, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of two right-footed subjects to cause impairment of the right lower limb. For the elderly walking, the kit causes walking difficulty through tied joints, in such a way that the neck and knees were constrained to cause difficulty in standing straight and making big moves as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

                    SOAMST [27]      Depth improved SOAMST
Type                True    False    True    False
Subject 1   IW      774     226      1000    0
            EW      851     149      1000    0
            NW      899     101      1000    0
Subject 2   IW      1000    0        1000    0
            EW      1000    0        1000    0
            NW      998     2        914     86
Overall results     5522    478      5914    86
Accuracy (%)        92.03   7.97     98.57   1.43

The experimental data listed in Table 1 were prepared under stable light conditions, by turning on all light sources in the room, to be able to see the improvement brought by the proposed integration of the depth likelihood into the SOAMST color region tracker, which decreases the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For SOAMST, we have seen that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably, in that tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth improved SOAMST, only one case yielded a worse result than SOAMST [27] using only the color model for target matching. The reason for the failure may be sudden depth changes in the scene; also, the joint depth likelihood and color representation of a background region can have more similarity to the reference color representation. Otherwise, in general, depth integration can help to improve the robustness of SOAMST under changing illumination conditions.

Figure 11: Tracking results for SOAMST and depth likelihood map supported SOAMST.

In Figure 11, images of the tracking results of SOAMST and of the depth likelihood map based improved SOAMST model are given. It is obvious that the integration of depth data into the weight updates improves the tracking results significantly, preventing the failure cases seen in the last two columns of Figure 11. But if the color distribution model is not convenient and the depth values of the subject and background are similar, there can still be false detections or position, scale, and orientation errors on the selected region (see the 4th column of the depth improved tracking results in Figure 11).
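As an illustrative sketch only (our construction, not the exact formulation of Section 2.3), the depth likelihood can be fused with the color weights used in the mean shift update, so that background pixels at a different depth from the subject are suppressed:

```python
import numpy as np

def depth_weighted(color_weights, depth_map, subject_depth, sigma=0.3):
    """Fuse color-model weights with a Gaussian depth likelihood.

    color_weights: per-pixel weights from the color target model;
    depth_map: per-pixel Kinect depth (m); subject_depth: current
    estimate of the subject's depth (e.g., median depth over the
    tracked region); sigma: depth tolerance (m), an assumed value.
    """
    likelihood = np.exp(-0.5 * ((depth_map - subject_depth) / sigma) ** 2)
    return color_weights * likelihood
```

In each mean shift iteration, the raw color weights would then be replaced by `depth_weighted(w, depth, np.median(depth[region_mask]))` before the new tracking point is computed.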

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions to take full advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions with respect to the light strength were performed on SOAMST [27] for further analysis, by examining the color features and visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on the visual tracking of the changes in color information produced by different light conditions. Depending on the changes of the color, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm. In this way it is possible to see which lux values (lux being the quantitative measurement of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (illumination at the marked distances from the light source: 1.40 m, 450-500 lux; 1.85 m, 350-400 lux; 2.30 m, 250-300 lux; 3.00 m, 150-200 lux; 3.60 m, 100 lux; 4.95 m, 50 lux; 6.25 m, 40 lux).

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor, a CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), while the values at the 4.95 and 6.25 meter marks are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).


Table 3: Moving robot condition: height and width variance of the tracking region with one-color representation (red, green, blue). [Plots of region height and width variance over the frames; graphical content not reproducible here.]


3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position closer to the subject, to within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light condition and the distance between the subject and the robot have a great impact on the reference target model based on the normalized color histogram used by the visual color tracking algorithm, due to the fact that these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.
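The degradation discussed above can be quantified by comparing the normalized histogram of the selected target region under two conditions, for example with the Bhattacharyya coefficient; the sketch below is our illustration (the bin count and names are our choices):

```python
import numpy as np

def normalized_histogram(channel_pixels, bins=45, value_range=(0, 255)):
    """Normalized color histogram (sums to 1) of the target region.

    channel_pixels: 1-D array of the chosen color channel values
    inside the selected target region.
    """
    hist, _ = np.histogram(channel_pixels, bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)

def bhattacharyya(p, q):
    """Similarity of two normalized histograms; 1.0 means identical."""
    return float(np.sum(np.sqrt(p * q)))

# Example: compare the target model built at 350-400 lux against the
# same region observed at ~100 lux; a low coefficient explains the
# inconsistent matching reported in the experiments.
# rho = bhattacharyya(normalized_histogram(bright_px),
#                     normalized_histogram(dark_px))
```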

Hence, in this scenario two more different tests were done: (i) one test was to study the visual tracker while both the robot and the subject move; (ii) then, the visual tracking performance was tested with the robot at a fixed position and only the subject moving. Both experiments were done with two different scenarios, where two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval from 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350-400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, where the centroid of the target region defines the tracking point of the current frame. Tracking success or failure is thus decided by whether the center of the target region is on the subject or not.

Since the distance between the robot and the subject was kept the same in all the experiments during the subject's motion, one would expect the search and tracked regions found by SOAMST [27] to be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates the covariance based on the current search region and the reference target distribution, and this matrix, based on the second order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representations can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350-400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also presented in Figure 14 previously.


Figure 14: Target feature analysis for the different colors (red, green, blue) with normalized histograms. [Histogram plots at the different marks; graphical content not reproducible here.]


Figure 15: Moving robot condition: tracking results with one-color representation (red, green, blue).

Figure 16: Moving robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).


Despite the loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it can be appreciated that the selected target and the tracked region are not consistent anymore (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate the scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared with the one-color model. The impact of the light conditions proved to be similar to the previous scenarios, where only a one-color representation is used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is at a fixed position, the same position as shown in Figure 12. The camera records data while the subject is moving between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the distance between the subject and the robot will not be preserved as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.


Table 4: Moving robot condition; height and width variance of the tracking region with two-color representation. [Plots of region height variance and region width variance for the red and blue, green and blue, and blue and dark blue cases.]

the impact on the tracking system when modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept in a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. While using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at a 3.6-meter distance, which has around a 100 lux value, green color based tracking (see Figure 17, 4th column) failed because the target color model matched an irrelevant region inside the search area that also had a color similar to the reference target model. Finally, the red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux with a distance higher than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color and fixed robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on illumination and distance factors, the results state that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. Then, with these constraints, robot tracking can be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot while it tracks a moving subject who changes his trajectory, and to establish the optimal values for them. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which can affect the path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, due to the fact that it expresses the obstacle definitions, that is, the available free space for the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering the subject and robot safety


Figure 17: Fixed robot condition tracking results with one-color representation (red, green, blue).

Figure 18: Fixed robot condition tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

criteria. The second one analyses the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. To set up the experiment, the first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments, the same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check the robot behaviour in different situations. For these layouts, setting the experiment in the same room or in a different room was not very crucial, since the important point is testing the robot motion in different situations with various free space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on

Figure 19: Sample experimental layout (numerical values in centimeters) and global map.

the visual tracking feedback. For the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow


Figure 20: Experiment layouts and subject tracks (numerical values in centimeters): layout 1, layout 2, and layout 3.

the subject based on the position of the subject on the global map.

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and the third layouts are the size and shape of the central obstacle: in the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent the obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters for the robot's decision on moving to follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance higher than 3.6 meters, the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or poor texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
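As a compact illustration of these two rules, the 1.2-meter start-moving distance and the 20 cm movement threshold, the following Python sketch shows one way such a decision step could look; the function names and structure are our own illustration, not the actual controller code:

import math

START_MOVING_DISTANCE = 1.2   # meters; robot starts moving beyond this gap
MOVING_THRESHOLD = 0.20       # meters; subject counts as moving past this

def robot_should_move(robot_xy, subject_xy):
    # the robot stays put while the subject is within 1.2 meters
    return math.dist(robot_xy, subject_xy) > START_MOVING_DISTANCE

def subject_is_moving(subject_xy_now, subject_xy_before):
    # a change of at least 20 cm on the global map flags the subject as moving
    return math.dist(subject_xy_now, subject_xy_before) >= MOVING_THRESHOLD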

As stated previously, there are two important parameters for robot motion included in this part of the paper, which have been studied in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the system (ROS) for the robot motion (Figure 2), these parameters are modeled with the following variables.

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values that this parameter can take represent the maximum forward velocity, in meters/second, of the robot. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
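For illustration, the relevant entries could look roughly like the following sketch; the values are examples from the ranges tested below, not our exact configuration, and note that in a standard ROS navigation stack the maximum velocity is usually set in the local planner file (e.g., base_local_planner_params.yaml) rather than in the costmap file:

# costmap_common_params.yaml (illustrative excerpt)
inflation_radius: 0.15        # meters to which obstacle costs are inflated

# base_local_planner_params.yaml (illustrative excerpt)
max_vel_x: 0.5                # maximum forward velocity in meters/second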

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. While increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot does not crash into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of


Table 5: Robot collision test results (collision: yes/no).

Speed \ Inflation radius | 0.05 | 0.35
0.70 | Yes | Yes
0.65 | Yes | Yes
0.60 | No | No

the collision tests, such that using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined.

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test, the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another by following the paths presented in Figure 20, in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles of the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as the synchronization between the quick movement of the robot and the

Table 6: Robot tracking performance (nonstopping tests)*.

Speed \ Inflation radius | 0.05 | 0.15 | 0.25 | 0.35
0.50 | Yes (+30) | No (3) | No (3) | Yes (+40)
1.00 | Yes | No (2) | No (2) | No (2)
1.50 | Yes (+15) | No (2) | No (3) | No (2)

*Reasons for not completing the track: stuck (1), crash (2), and lost (3).

moving platform is not an easy task. Due to this fact, it is possible to lose the track of the subject.

Faster speeds also mean more vibration for the robot, which can contribute to losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously proven in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot over that value, as the subject is walking faster, so there is no risk that the robot could collide with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed for the robot that prevents collision with the subject, as proven in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout, it got stuck sometimes, but it was able to recover the tracking and end the tests with success.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed \ Inflation radius | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes (4) | Yes (4)
0.40 | Yes | Yes | Yes (4) | Yes (4) | Yes (4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed \ Inflation radius | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes | Yes
0.40 | Yes | Yes | Yes | Yes | Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed \ Inflation radius | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | No (1) | Yes (4) | Yes | No (1) | Yes
0.50 | No (1) | No (1) | Yes (4) | Yes (4) | Yes
0.40 | No (1) | No (1) | No (1) | No (1) | No (1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for incomplete track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject in any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from having a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, and it allows the robot to move around without serious difficulties. For some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s. For this speed value, there was no success in any of the tests, independent of the inflation radius value. The reason why the robot does not complete the track is that the robot got stuck during tracking, especially when entering the corner regions, which restricted the robot from defining a free path to move. In this layout, faster speeds increased the

Figure 21: Track for layout 1 and robot path during the experiment.

success ratio of the experiment. Of the 12 tests made, 6 of them present good results, while the robot was not able to complete the rounds of the track in the remaining ones. As presented in Table 9, the results are not as good as for the previous experiments. The narrow paths and obstacles drastically reduce the success rate of the experiments.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data were used to test the activity recognition algorithm, as given in Table 1, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority over other activities, since the walking patterns can then be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part, we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy due to the high detection rate of static actions.

In the experiments, person specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as


Table 10: Activity recognition results for different types of walking patterns.

Type | True | False | Accuracy (%)
Subject 1, IW | 984 | 16 | 98.40
Subject 1, EW | 974 | 26 | 97.40
Subject 1, NW | 947 | 53 | 94.70
Subject 2, IW | 976 | 24 | 97.60
Subject 2, EW | 963 | 37 | 96.30
Subject 2, NW | 931 | 69 | 93.10
Overall performance | 5775 | 225 | 96.25

20 cm per 10 frames, that is, around 0.2 m/s for the current system.
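As a quick check of this conversion (assuming, as the numbers imply, that the system processes roughly 10 frames per second):

# hypothetical check of the velocity threshold conversion
displacement_m = 0.20       # 20 cm displacement threshold on the global map
frames = 10                 # measured over 10 frames
fps = 10                    # implied processing rate of about 10 frames/second
speed = displacement_m / (frames / fps)
print(speed)                # 0.2 m/s, matching the threshold above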

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking of the subjects had the lowest accuracies, around 93% and 95%; however, these results are still of acceptable accuracy, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) the rotation of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its steady state, without the transition periods of changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11, the recognition results for various activities are given, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting,

Table 11: Activity recognition results for different activities.

Activity | True | False | Accuracy (%)
Standing | 1000 | 0 | 100.00
Sitting | 1000 | 0 | 100.00
Bending | 997 | 3 | 99.70
Lying down | 1000 | 0 | 100.00
Cycling | 1000 | 0 | 100.00
Falling | 991 | 9 | 99.10
Overall | 5988 | 12 | 99.80

and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized with 100.00% accuracy. Also, the falling activity was recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and these two activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section for activity recognition (Section 2.5). With this information, even though there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), the walking activity can be detected accurately as long as we know that the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that the integration of depth information to handle illumination variations clearly improved the tracking results. This means that multiple sensory information fusion is required for robust indoor tracking with an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So, it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to rely on the color-based visual tracker, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should be provided in the environment for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 or 200 lux for better results, and (2) for the given lux and distance specifications as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
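These two conditions can be summarized in a small helper, a minimal sketch that simply encodes the thresholds found above (the function and its name are illustrative, not part of the implemented system):

def color_tracking_reliable(lux, distance_m):
    # conditions distilled from the experiments above:
    # (1) at least about 100 lux on the subject, and
    # (2) a robot-to-subject distance below 3.6 meters
    return lux >= 100 and distance_m < 3.6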

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough not to make

the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot is decelerating may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As has been mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower. In the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less prone to losing the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius value of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this setting, even though the subject is always walking at the same speed. Because the speed of the robot is not high enough, the robot sometimes got stuck


Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.

for a few seconds. The subject has to wait for the robot to be released, and that increases the required time to complete the track.

The situation when the subject stops is different from the one previously exposed. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part, we have analyzed three different scenarios representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since the robot moves freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests states that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius for the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get close to the obstacles, and this may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But under non-ideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. With a mobile robot, if robust visual tracking and robot based subject tracking are provided, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data of the subjects.

Our previous works [5, 8] had activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to handle each time the subject changes. Also, those calculations, like the state-of-the-art models, generally require whole body contour features for walking or dynamic activity detection. With the proposed model, however, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more. When this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6


meters. Since the reliability of color based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper determined the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios, the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as: (i) the integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject; (ii) an analysis of how the illumination conditions and robot motion parameters affect the tracking; and (iii) the identification of the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux values), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.




Figure 9: Extracted contour region lines on the top part of the body (sitting and bending examples) to calculate the ratio parameter, and the tracking point on the extracted region.

lying down. However, features that represent static activities are not enough to detect dynamic or location-specific behaviors, such as cycling on a stationary bicycle. Hence, we take advantage of the subject's global position on the 2D map defined in (10). If the subject is sitting or bending at a specific global map position (marked as the stationary bicycle zone), it can be inferred that the subject is using the cycling machine. And if the extracted body contour is changing its global position on the global map and the height of the extracted region corresponds to the standing condition, it can be inferred that the subject is walking, even if the subject's body is not fully extracted or there is partial occlusion.

As long as the visual tracking and localization of the subject are done successfully, the activity recognition model can achieve the classification task with high accuracy, independent of posture and occlusion cases.

Since one purpose of the project is the biomonitoring of human walking conditions for different types of situations, such as normal, impaired, and elderly walking patterns, it is important for the system to detect the walking activity for these different walking patterns. It is important for the biomonitoring system to know the walking activity so that it can record the data of the subject's walking behaviour, especially for people in demand such as elderly or impaired people. In this way, experts or an algorithm can analyze the recorded data whenever necessary.

There are many algorithms that can detect walking activity, but most of the experimental results are generally based on normal walking conditions, alongside other daily activities. Therefore, in this work, we tested the proposed activity recognition algorithm for the walking condition on normal, impaired, and elderly walking patterns. The experiments showed that the proposed algorithm can robustly detect walking activity for the various patterns as long as the subject tracking is accurate. The results for the walking activity for different patterns can be seen in the experimental results section.

To demonstrate how the state transitions were handled, the state transition rules for the standing and walking activities, which are representative activities in terms of their frequency of occurrence and dependency on localization information, are described in Algorithm 1.

switch States_{t-1}
  Case 0:
    if Height^s_{t-1} - Height^s_t > S_Fall
      States_t = 5
    elseif Height^s_{t-1} - Height^s_t > S_NoFall
      if r_t > Ratio
        States_t = 3
      else
        States_t = 2
      end
    elseif Height^s_t > H_Standing
      if isOnCycling(P^s_t(G_x, G_y))
        States_t = 4
      elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) > S_Walking
        States_t = 1
      else
        States_t = 0
      end
    end
  Case 1:
    if Height^s_{t-1} - Height^s_t > S_Fall
      States_t = 5
    elseif isOnCycling(P^s_t(G_x, G_y))
      States_t = 4
    elseif abs(P^s_t(G_x, G_y) - P^s_{t-N}(G_x, G_y)) <= S_Walking
      States_t = 0
    end
end

Algorithm 1

In Algorithm 1, States_t is the activity at frame t. The activity labels are defined as follows: (i) standing = 0, (ii) walking = 1, (iii) sitting = 2, (iv) bending = 3, (v) cycling = 4, (vi) falling = 5, and (vii) lying down = 7. t is the current frame number. S_Fall and S_NoFall are the speed parameters to decide fall or non-fall cases based on the tracked subject height at the current and previous frames, Height^s_t and Height^s_{t-1}. S_Walking is the subject speed threshold to decide whether


Table 1: Experimental data content.

Subject | Walking case | Abbreviation | Number of frames
Subject 1 | Impaired walking | IW | 1000
Subject 1 | Elderly walking | EW | 1000
Subject 1 | Normal walking | NW | 1000
Subject 2 | Impaired walking | IW | 1000
Subject 2 | Elderly walking | EW | 1000
Subject 2 | Normal walking | NW | 1000
Total number of frames: 6000

the subject is moving or not. Localization assistance uses the global subject position P^s_t(G_x, G_y) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, used to discriminate activities such as sitting, bending, and lying down based on the person specific Ratio threshold.
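Read as code, the transition rules of Algorithm 1 for the standing (0) and walking (1) states could be sketched in Python as follows; the threshold values below are hypothetical placeholders (the paper selects them per subject), and the helper names mirror the symbols above rather than any actual implementation:

import math

S_FALL = 0.8        # hypothetical height-drop threshold for falling (meters)
S_NOFALL = 0.3      # hypothetical height-drop threshold for sitting/bending
H_STANDING = 1.4    # hypothetical minimum height of the standing pose (meters)
S_WALKING = 0.20    # displacement threshold on the global map (meters)
RATIO = 0.5         # hypothetical person specific ratio threshold (Figure 9)

def next_state(state, h_prev, h_now, pos_now, pos_prev_n, r, on_cycling_zone):
    drop = h_prev - h_now                    # height drop between frames
    moved = math.dist(pos_now, pos_prev_n)   # displacement over N frames
    if state == 0:                           # standing
        if drop > S_FALL:
            return 5                         # falling
        if drop > S_NOFALL:
            return 3 if r > RATIO else 2     # bending or sitting
        if h_now > H_STANDING:
            if on_cycling_zone:
                return 4                     # cycling
            return 1 if moved > S_WALKING else 0
    elif state == 1:                         # walking
        if drop > S_FALL:
            return 5
        if on_cycling_zone:
            return 4
        if moved <= S_WALKING:
            return 0
    return state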

3. Experiments of Visual Tracking Performances and Effect of Lighting Conditions

Obviously, it is a crucial factor for the monitoring and activity recognition process to have good robot motion and robust visual tracking performance. Therefore, initially, this work also focused on two other aspects: (i) the visual tracking limitations that occur due to changes in the color model of the tracked subject, where the color model is affected by the illumination conditions from the light source or the distance between robot and subject; and (ii) the experiment and analysis of the robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested SOAMST [27] and the depth improved SOAMST tracking model proposed in Section 2.3 of this work. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during the tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for the elderly, impaired, and normal walking situations. Normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly walking and impaired walking cases were simulated with a wearable simulation kit, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of the two right-footed subjects to cause impairment of the right lower limb. And for the elderly walking, the kit causes walking difficulty through tied joints, in such a way that the neck and knees

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

Type | SOAMST [27] True | SOAMST [27] False | Depth improved SOAMST True | Depth improved SOAMST False
Subject 1, IW | 774 | 226 | 1000 | 0
Subject 1, EW | 851 | 149 | 1000 | 0
Subject 1, NW | 899 | 101 | 1000 | 0
Subject 2, IW | 1000 | 0 | 1000 | 0
Subject 2, EW | 1000 | 0 | 1000 | 0
Subject 2, NW | 998 | 2 | 914 | 86
Overall results | 5522 | 478 | 5914 | 86
Accuracy (%) | 92.03 | 7.97 | 98.57 | 1.43

were constrained, causing difficulty in standing straight and making big moves as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

The experimental data in Table 1 were recorded under stable light conditions, with all light sources in the room turned on, in order to see the improvement given by the proposed depth likelihood integration into the SOAMST color region tracker while decreasing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For SOAMST, we have seen that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

Figure 11: Tracking results for SOAMST [27] and the depth-likelihood-map-supported SOAMST.

However, the proposed depth integration improved the tracking accuracy considerably: tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth-improved SOAMST, only one case yielded a worse result than SOAMST [27] using only the color model for target matching. The reason for this failure may be sudden depth changes in the scene; in addition, the joint color and depth-likelihood representation of a background region can have more similarity to the reference color representation. Otherwise, in general, the depth integration helps to improve the robustness of SOAMST under changing illumination conditions.
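As a rough illustration of this idea (a sketch under assumed interfaces, not the paper's actual code), the depth likelihood can be folded into the per-pixel weights that drive the mean-shift position and moment updates:

    import numpy as np

    def depth_weighted_update(color_weights, depths, target_depth, sigma=0.3):
        """Combine color-histogram back-projection weights with a Gaussian
        depth likelihood; sigma (meters) is an assumed subject depth spread."""
        depth_likelihood = np.exp(-0.5 * ((depths - target_depth) / sigma) ** 2)
        w = color_weights * depth_likelihood        # joint weight image
        ys, xs = np.mgrid[0:w.shape[0], 0:w.shape[1]]
        total = w.sum() + 1e-12
        cx, cy = (xs * w).sum() / total, (ys * w).sum() / total
        return (cx, cy), w    # new centroid; w also feeds the moment computation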

Figure 11 shows images of the tracking results of the SOAMST and the depth-likelihood-map-improved SOAMST models. It is obvious that integrating depth data into the weight updates improves the tracking results significantly while preventing failure cases, as in the last two columns of Figure 11. However, if the color distribution model is not convenient and the depth values of the subject and the background are similar, there can still be false detections, or position, scale, and orientation errors on the selected region (see the 4th column of the depth-improved tracking result in Figure 11).

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions to fully take advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions relating to the light strength were performed on SOAMST [27] for further analysis, by examining the color features and the visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we performed several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions. Depending on the changes in color, the robot may fail, with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, making it possible to see which lux values (a quantitative measurement of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup. The measurement spots and their illumination ranges are 1.40 m: 450–500 lux; 1.85 m: 350–400 lux; 2.30 m: 250–300 lux; 3.00 m: 150–200 lux; 3.60 m: 100 lux; 4.95 m: 50 lux; and 6.25 m: 40 lux.

Figure 13: Illumination sensor (CUSTOM LX-2000SD light meter).

Different spots for evaluating the tracking system were established based on the luminosity ranges measured by the illumination sensor, the CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located 1.40 meters from the camera (Figure 12), while the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving-robot condition: height and width variance of the tracking region with one-color representation (red, green, and blue). [In the original article, each cell of this table is a plot of the region height variance (top row) and region width variance (bottom row) over the recorded frames; the numeric axis ticks are omitted here.]

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters; graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position closer to the subject, to within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light conditions and the distance between the subject and the robot have a great impact on the reference target model, which is based on the normalized color histogram used by the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color-based region tracker.
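For reference, a normalized color histogram target model of the kind discussed here can be computed along the following lines (a sketch using OpenCV; the bin count, the joint three-channel histogram, and the Bhattacharyya similarity are illustrative choices, not necessarily the exact ones used in the paper):

    import cv2
    import numpy as np

    def normalized_color_histogram(image_bgr, region, bins=16):
        """Normalized color histogram of a selected region (x, y, w, h)."""
        x, y, w, h = region
        patch = image_bgr[y:y + h, x:x + w]
        # Joint histogram over the three color channels, normalized to sum to 1.
        hist = cv2.calcHist([patch], [0, 1, 2], None,
                            [bins, bins, bins],
                            [0, 256, 0, 256, 0, 256])
        return hist / (hist.sum() + 1e-12)

    def bhattacharyya(p, q):
        """Similarity between candidate p and target model q."""
        return float(np.sum(np.sqrt(p * q)))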

Hence, in this scenario two more different tests were done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) then, visual tracking performance was tested when the robot is at a fixed position and only the subject moves. Both experiments were done in two different scenarios, in which two different region templates were tried in each case to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval from 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, whose centroid defines the tracking point of the current frame. Tracking success or failure is therefore decided by whether the center of the target region is on the subject or not.
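Stated as code, the success test used in this evaluation reduces to a point-in-region check (a trivial sketch; the subject region here is assumed to be an annotated bounding box):

    def tracking_success(target_centroid, subject_box):
        """True if the tracked centroid falls on the (annotated) subject region."""
        cx, cy = target_centroid
        x, y, w, h = subject_box      # assumed ground-truth bounding box
        return x <= cx <= x + w and y <= cy <= y + h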

Figure 14: Target feature analysis for different colors (red, green, and blue rows) with normalized histograms. [Panels 1–3 show the target histograms with the subject at 1.85, 3.60, and 6.25 meters from the fixed robot; panels 4–6 show the histograms at the same marks after the robot moved to within about 1.5–2 meters of the subject. The histogram plots themselves are omitted here.]

Figure 15: Moving-robot condition: tracking results with one-color representation (red, green, and blue).

Figure 16: Moving-robot condition: tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

Since the distance between the robot and the subject was kept the same in all these experiments during the subject's motion, one would expect the regions searched and tracked by SOAMST [27] to be similar as well; however, there are large deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates a covariance matrix based on the current search region and the reference target distribution; this matrix, built from the second-order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representation can be visible either in the width or in the height scale, as in Table 3. In particular, lux values around 350–400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown in Figure 14.
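To make the moment computation concrete, the sketch below (an illustration of the standard moments-to-ellipse math, not the SOAMST source code) derives the ellipse axes and orientation from the weighted second-order moments of a region:

    import numpy as np

    def region_ellipse(weights):
        """Ellipse parameters (major, minor, angle) from the weighted
        second-order moments of a region of per-pixel weights."""
        ys, xs = np.mgrid[0:weights.shape[0], 0:weights.shape[1]]
        m = weights.sum() + 1e-12
        cx, cy = (xs * weights).sum() / m, (ys * weights).sum() / m
        # Second-order central moments form a 2x2 covariance matrix.
        mxx = ((xs - cx) ** 2 * weights).sum() / m
        myy = ((ys - cy) ** 2 * weights).sum() / m
        mxy = ((xs - cx) * (ys - cy) * weights).sum() / m
        cov = np.array([[mxx, mxy], [mxy, myy]])
        evals, evecs = np.linalg.eigh(cov)            # ascending eigenvalues
        angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # orientation of major axis
        minor, major = 2.0 * np.sqrt(np.maximum(evals, 0.0))  # lengths ~ 2*sigma
        return major, minor, angle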

But despite this loss of information, tracking can still be performed. However, if the light on the subject goes below 50 lux, the selected target and the tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes at less than 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared to the one-color model situation. The impact of the light conditions proved to be similar to the previous scenario, where only a one-color representation is used for the target model.

Table 4: Moving-robot condition: height and width variance of the tracking region with two-color representation (red and blue, green and blue, and blue and dark blue). [In the original article, each cell of this table is a plot of the region height variance (top row) and region width variance (bottom row) over the recorded frames; the numeric axis ticks are omitted here.]

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is at a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the distance between the subject and the robot is not kept constant as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept within a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. With just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green-color-based tracking failed (see Figure 17, 4th column) because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux combined with a distance greater than 5 meters are not sufficient to rely on color-based tracking, for all tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is below 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

Figure 17: Fixed-robot condition: tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed-robot condition: tracking results with two-color representation (red and blue, green and blue, and blue and dark blue).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

The visual tracking experimental analysis based on illumination and distance factors shows that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints in place, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/hr, which is the lower limit of self-paced normal walking, the experiments on robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene, as well as the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, that is, the available free space for the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum robot speed that prevents it from colliding with the user, considering subject and robot safety criteria. The second analyses the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup consists of defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps, using the mapping function of the robot as in Figure 19, for the three different layouts used in these experiments. In our experiments the same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check robot behaviour in different situations. For these layouts, running the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free-space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on visual tracking feedback: the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow the subject based on the subject's position on the global map.

Figure 19: Sample experimental layout (numerical values in centimeters) and the corresponding global map.

Figure 20: Experiment layouts 1–3 and subject tracks (numerical values in centimeters).

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle; in the third layout the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters: the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance higher than 3.6 meters the color-based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or poor texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
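These two thresholds can be expressed as a simple decision rule (a sketch with assumed names; the distance values are the ones reported above):

    import math

    FOLLOW_DISTANCE_M = 1.2    # robot starts moving beyond this subject distance
    MOVING_DISTANCE_M = 0.20   # subject counts as moving beyond this displacement

    def robot_should_move(robot_xy, subject_xy):
        dx, dy = subject_xy[0] - robot_xy[0], subject_xy[1] - robot_xy[1]
        return math.hypot(dx, dy) > FOLLOW_DISTANCE_M

    def subject_is_moving(prev_xy, curr_xy):
        dx, dy = curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1]
        return math.hypot(dx, dy) >= MOVING_DISTANCE_M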

As stated previously, two important parameters for robot motion are studied in this part of the paper in order to determine their impact on the tracking and movement of the robot: the speed of the robot and the inflation radius of the obstacles. In the system (ROS) used for robot motion (Figure 2), these parameters are modeled with the following variables (a configuration-reading sketch follows the list):

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter, whose value represents the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
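As an aside, a node can read back these two parameters at runtime. The sketch below is illustrative only: the parameter namespaces shown are typical move_base defaults and may differ from the launch files actually used in this system.

    import rospy

    rospy.init_node("tracking_param_probe")
    # Typical move_base namespaces; the exact paths depend on the launch setup.
    max_vel_x = rospy.get_param("/move_base/TrajectoryPlannerROS/max_vel_x", 0.5)
    inflation = rospy.get_param("/move_base/global_costmap/inflation_radius", 0.15)
    rospy.loginfo("max_vel_x=%.2f m/s, inflation_radius=%.2f m", max_vel_x, inflation)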

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" on the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test, obtained by modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 m/s until the robot no longer crashed into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.

Table 5: Robot collision test results ("Yes" indicates a collision with the subject).

Speed (m/s) \ inflation radius (m)   0.05   0.35
0.70                                 Yes    Yes
0.65                                 Yes    Yes
0.60                                 No     No

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests that study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments record whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done with different values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this fact, it is possible to lose the track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05        0.15    0.25    0.35
0.50                                 Yes (+30)   No(3)   No(3)   Yes (+40)
1.00                                 Yes         No(2)   No(2)   No(2)
1.50                                 Yes (+15)   No(2)   No(3)   No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot above that value, as the subject is walking faster, so there is no risk that the robot could collide with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, with the subject stopping at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed that prevents a collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it got stuck sometimes, but it was able to recover the tracking and finish the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25     0.35     0.45
0.60                                 Yes    Yes    Yes      Yes      Yes
0.50                                 Yes    Yes    Yes      Yes(4)   Yes(4)
0.40                                 Yes    Yes    Yes(4)   Yes(4)   Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes    Yes
0.40                                 Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ inflation radius (m)   0.05    0.15     0.25     0.35     0.45
0.60                                 No(1)   Yes(4)   Yes      No(1)    Yes
0.50                                 No(1)   No(1)    Yes(4)   Yes(4)   Yes
0.40                                 No(1)   No(1)    No(1)    No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for the incomplete-track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario with an obstacle in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout, we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large and allows the robot to move around without serious difficulties. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value there was no success in any of the tests, independent of the inflation radius value. The reason the robot did not complete the track is that it got stuck during tracking, especially when entering the corner regions, which prevented the robot from defining a free path to move along. In this layout, faster speeds increased the success ratio of the experiment. Of the 12 tests made, 6 presented good results, while the robot was not able to complete the rounds of the track in the 8 remaining results. As presented in Table 9, the results are not as good as in the previous experiments: the narrow paths and obstacles drastically reduce the success rate of the experiments.

Figure 21: Track for layout 1 and the robot path during the experiment.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated with the wearable kits demonstrated in Figure 10.

For each case, 1000 frames were used to test the activity recognition algorithm, as given in Table 1 of the visual tracking data, in which the subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority over other activities, since the walking patterns can then be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy, owing to the high detection rate of static actions.

In the experiments, the person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.

Table 10: Activity recognition results for different types of walking patterns.

Subject       Type   True   False   Accuracy (%)
Subject 1     IW     984    16      98.40
              EW     974    26      97.40
              NW     947    53      94.70
Subject 2     IW     976    24      97.60
              EW     963    37      96.30
              NW     931    69      93.10
Overall performance  5775   225     96.25
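The stated conversion is consistent with an effective processing rate of about 10 frames per second, an inference from the reported numbers rather than an explicitly stated frame rate:

    v ≈ 0.20 m / (10 frames × 0.1 s/frame) = 0.2 m/s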

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used in all cases. For both subjects, the impaired walking tests have the best accuracy in discriminating between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy over all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also performed experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its steady state, without the transition periods of changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80

Table 11 gives the recognition results for the various activities, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognitions among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was also recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. Overall, a recognition rate of 99.80% was obtained over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, both of which were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even when there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper and lower body (Figure 22(b)), walking activity can be detected accurately, provided only that the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, that is, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusion of multiple sensory information is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So, it is beneficial to find the limitations of the color tracker used in our system, and the safe regions in which a visual tracker relying on color can be used, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be provided for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results; and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may affect the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments on those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject, but low enough not to make the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping; this deceleration distance may not be sufficient and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the success rate is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot turns are sharp bends. A smoother path makes the robot less liable to lose the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius of 0.05 but get worse when this value is increased.

Figure 23: Robot tracks for 1.5 m/s speed with inflation radii of 0.05 (left) and 0.35 (right).

Another interesting result presented in Table 6 is the time needed to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject always walks at the same speed: because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject then has to wait for the robot to free itself, which increases the time required to complete the track.

The situation when the subject stops is different from the one previously described. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the subject stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part, we have analyzed three different scenarios, represented by three layouts with different characteristics. In an open layout (Figure 20, layout 1), with no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. With the robot moving freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2): the robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s with an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception in this speed range was when the inflation radius was set to 0.05 meters. With this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But for nonideal conditions, if the room contains a narrow path between complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The test results show that setting the speed of the robot to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided the mobile robot achieves robust visual tracking and robot-based subject tracking, it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to repeat every time the subject changes. Moreover, such calculations, as in the state-of-the-art models, generally require whole-body contour features for detecting walking or other dynamic activities. With the proposed model, in contrast, activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested on 12,000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity cases for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been proven that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under such light conditions, the robot can perform the tracking process with a high chance of success. In addition to the light conditions, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color-based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color-based to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper served to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios, the impact of these two parameters is not relevant, but in more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values provokes a collision of the robot with the subject, while decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization-assisted activity recognition algorithm has been integrated into the robot platform. This algorithm is faster and more robust, and it is effective in cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) an analysis of how the illumination conditions and the robot motion parameters affect the tracking, and (iii) the determination of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux values), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro contributed equally to this paper.

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012

[4] N K Suryadevara and S C Mukhopadhyay ldquoWireless sensornetwork based home monitoring system for wellness determi-nation of elderlyrdquo IEEE Sensors Journal vol 12 no 6 pp 1965ndash1972 2012

[5] N Myagmarbayar Y Yuki N Imamoglu J Gonzalez MOtake and W Yu ldquoHuman body contour data based activityrecognitionrdquo in Proceedings of the 35th Annual InternationalConference of the IEEE Engineering in Medicine and BiologySociety (EMBC rsquo13) pp 5634ndash5637 July 2013

[6] H-M Gross C Schroeter S Mueller et al ldquoProgress indeveloping a socially assistive mobile home robot companionfor the elderly with mild cognitive impairmentrdquo in Proceedingsof the IEEERSJ International Conference on Intelligent Robotsand Systems pp 2430ndash2437 September 2011

[7] T Tamura A Kawarada M Nambu A Tsukada K Sasakiand K-I Yamakoshi ldquoE-healthcare at an experimental welfaretechno house in Japanrdquo The Open Medical Informatics Journalvol 1 no 1 pp 1ndash7 2008

[8] M Nergui Y Yoshida N Imamoglu J Gonzalez M Otakeand W Yu ldquoHuman activity recognition using body contourparameters extracted from depth imagesrdquo Journal of MedicalImaging and Health Informatics vol 3 no 3 pp 455ndash461 2013

[9] MOgawa S Ochiai K ShojiMNishihara and T Togawa ldquoAnattempt ofmonitoring daily activities at homerdquo in Proceedings ofthe 22nd Annual International Conference of the IEEE Engineer-ing inMedicine and Biology Society vol 1 pp 786ndash788 ChicagoIll USA July 2000

[10] A Fleury M Vacher and N Noury ldquoSVM-based multimodalclassification of activities of daily living in health smart homessensors algorithms and first experimental resultsrdquo IEEE Trans-actions on Information Technology in Biomedicine vol 14 no 2pp 274ndash283 2010

[11] T B Moeslund A Hilton and V Kruger ldquoA survey of advancesin vision-based humanmotion capture and analysisrdquo ComputerVision and Image Understanding vol 104 no 2-3 pp 90ndash1262006

[12] L Chen J Hoey C D Nugent D J Cook and Z Yu ldquoSensor-based activity recognitionrdquo IEEE Transactions on Systems Manand Cybernetics Part C Applications and Reviews vol 42 no 6pp 790ndash808 2012

[13] M Zhang and A A Sawchuk ldquoHuman daily activity recogni-tion with sparse representation using wearable sensorsrdquo IEEEJournal of Biomedical and Health Informatics vol 17 no 3 pp553ndash560 2013

[14] A Pantelopoulos and N G Bourbakis ldquoA survey on wearablesensor-based systems for health monitoring and prognosisrdquoIEEE Transactions on Systems Man and Cybernetics Part CApplications and Reviews vol 40 no 1 pp 1ndash12 2010

[15] N Ravi N Dandekar P Mysore and M L Littman ldquoActivityrecognition from accelerometer datardquo in Proceedings of the

22 The Scientific World Journal

17th Innovative Applications of Artificial Intelligence Conference(IAAI rsquo05) pp 1541ndash1546 July 2005

[16] A Scalmato A Sgorbissa and R Zaccaria ldquoDescribing andrecognizing patterns of events in smart environments withdescription logicrdquo IEEE Transactions on Cybernetics vol 43 no6 pp 1882ndash1897 2013

[17] O Brdiczka P Reignier and J Crowley ldquoDetecting individualactivities from video in a smart homerdquo in Proceedings ofthe International Conference on Knowledge-Based IntelligentInformation and Engineering Systems pp 363ndash370 2007

[18] R Poppe ldquoA survey on vision-based human action recognitionrdquoImage and Vision Computing vol 28 no 6 pp 976ndash990 2010

[19] N Imamoglu M Nergui Y Yoshida J Gonzalez and W YuldquoControl strategies and particle filter for RGB-D based humansubject tracking and behavior recognition by a bio-monitoringmobile robotrdquo in Intelligent Robotics and Applications vol 8102of Lecture Notes in Computer Science pp 318ndash329 2013

[20] N Imamoglu Z Wei H Shi et al ldquoAn improved saliencyfor RGB-D visual tracking and control strategies for a bio-monitoring mobile robotrdquo in Evaluating AAL Systems throughCompetitive Benchmarking vol 386 of Communications inComputer and Information Science pp 1ndash12 2013

[21] A Cesta G Cortellessa M V Giuliani et al ldquoThe ROBOCAREassistive home robot environment features and evaluationrdquoThe ROBOCARE Technical Reports RC-TR-0906-6 2004

[22] P Rashidi and A Mihailidis ldquoA survey on ambient-assistedliving tools for older adultsrdquo IEEE Journal of Biomedical andHealth Informatics vol 17 no 3 pp 579ndash590 2013

[23] ROSARIA httpwikirosorgROSARIA[24] G Grisetti C Stachniss andW Burgard ldquoImproved techniques

for grid mapping with rao-blackwellized particle filtersrdquo IEEETransactions on Robotics vol 23 no 1 pp 34ndash46 2007

[25] Gmapping httpwikirosorggmapping[26] S Thrun W Burgard and D Fox Probabilistic Robotics The

MIT Press Cambridge Mass USA 2005[27] J Ning L Zhang D Zhang and C Wu ldquoScale and orientation

adaptive mean shift trackingrdquo IET Computer Vision vol 6 no1 pp 52ndash61 2012

[28] EMarder-Eppstein E Berger T Foote BGerkey andKKono-lige ldquoThe officemarathon robust navigation in an indoor officeenvironmentrdquo in Proceedings of the International Conference onRobotics and Automation (ICRA rsquo10) pp 300ndash307 IEEE May2010

[29] S Zaman W Slany and G Steinbauer ldquoROS-based mappinglocalization and autonomous navigation using a Pioneer 3-DX robot and their relevant issuesrdquo in Proceedings of theSaudi International Electronics Communications and PhotonicsConference (SIECPC rsquo11) pp 1ndash5 Riyadh Saudi Arabia April2011

[30] E Ruiz R Acuna N Certad A Terrones and M E CabreraldquoDevelopment of a control platform for the mobile robotRoomba using ROS and a Kinect sensorrdquo in Proceedings of the10th IEEE Latin American Robotics Symposium (LARS rsquo13) pp55ndash60 Arequipa Peru October 2013

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


Table 1: Experimental data content.

Subjects    Walking case        Abbreviation    Number of frames
Subject 1   Impaired walking    IW              1000
            Elderly walking     EW              1000
            Normal walking      NW              1000
Subject 2   Impaired walking    IW              1000
            Elderly walking     EW              1000
            Normal walking      NW              1000
Total number of frames                          6000

the subject is moving or not. Localization assistance uses the global subject position P_t^s(G_x, G_y) for the necessary cases. And r_t is the ratio parameter at frame t, defined around the maximum height of the tracked region as shown in Figure 9, to discriminate activities such as sitting, bending, and lying down based on a person-specific ratio threshold.
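As a minimal sketch of how these quantities could drive the classification (the function and threshold names below are ours, not the paper's; the 0.20 m per 10-frame walking displacement is the value quoted in Section 5, while the ratio threshold is an illustrative assumption):

```python
import numpy as np

# Illustrative person-specific thresholds: the 0.20 m / 10-frame displacement
# matches the walking velocity parameter quoted in Section 5; the ratio
# threshold is an assumed placeholder for the person-specific value.
WALK_DISPLACEMENT_M = 0.20
RATIO_THRESHOLD = 0.75

def classify_activity(global_positions, region_height, standing_height):
    """global_positions: (N, 2) array of P_t^s(G_x, G_y) over the last N frames;
    region_height: current maximum height of the tracked region;
    standing_height: calibrated standing height of this subject."""
    # Global displacement on the map over the observation window
    displacement = np.linalg.norm(global_positions[-1] - global_positions[0])
    # Ratio parameter r_t relative to the subject's standing height
    r_t = region_height / standing_height
    if r_t >= RATIO_THRESHOLD:
        return "walking" if displacement >= WALK_DISPLACEMENT_M else "standing"
    # Finer, person-specific ratio bands would separate these further
    return "sitting/bending/lying down"
```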

3. Experiments of Visual Tracking Performances and Effect of Lighting Conditions

Obviously, it is a crucial factor for the monitoring and activity recognition process to have good robot motion and robust visual tracking performance. Therefore, this work initially also focused on two other aspects: (i) the visual tracking limitations that occur due to color model changes on the tracked subject, in which the color model is affected by the illumination conditions of the light source or the distance between robot and subject, and (ii) the experiment and analysis of robot behaviour under the given environmental conditions, including the visual tracking constraints. Finally, after the system tests and improvements on robot tracking, the proposed activity recognition is tested on various walking patterns to check its robustness to subject and pattern variations.

3.1. Depth Improved Visual Tracking Experiments. First of all, we tested the SOAMST [27] and the depth improved SOAMST tracking model proposed in Section 2.3 of this work. The experiments were done using the recorded data described in Table 1, which includes 6000 frames with various walking patterns and color representations from different subjects.

In this study, the same subjects were required to test elderly, impaired, and normal walking patterns, to see how the walking patterns from the same subject differ during tracking and activity recognition tasks. Therefore, each subject participated in experiments to record data for elderly, impaired, and normal walking situations. Normal walking experiments were done at a self-paced walking speed without any limitations (Figure 10(a)). On the other hand, the elderly and impaired walking cases were simulated with a wearable simulation kit, as shown in Figures 10(b)-10(c). The impaired walking kit (Figure 10(b)) was used on the right foot of two right-footed subjects, to cause impairment of the right lower limb. For elderly walking, the kit causes walking difficulty through tied joints, such that the neck and knees were constrained, causing difficulty in standing straight and making big moves as in normal walking. Therefore, the two simulation kits resulted in quite realistic walking patterns for both the elderly and impaired walking cases.

Figure 10: (a) Normal walking, (b) simulated impaired walking, and (c) simulated elderly walking.

Table 2: Visual tracking results.

                     SOAMST [27]       Depth improved SOAMST
Type                 True     False    True     False
Subject 1  IW        774      226      1000     0
           EW        851      149      1000     0
           NW        899      101      1000     0
Subject 2  IW        1000     0        1000     0
           EW        1000     0        1000     0
           NW        998      2        914      86
Overall results      5522     478      5914     86
Accuracy (%)         92.03    7.97     98.57    1.43

The experimental data listed in Table 1 were recorded under stable light conditions, with all light sources in the room turned on, in order to see the improvement brought by the proposed depth likelihood integration to the SOAMST color region tracker while decreasing the effect of illumination changes.

Table 2 gives the number of frames with correctly tracked subjects and false trackings for each recorded dataset. For the SOAMST, we have seen that good tracking performance can be obtained when the color of the subject representation has high contrast with the background. On the other hand, the color region tracker is not robust enough when the foreground and background representations are similar to each other. Also, high illumination variations or shadows on the foreground region may result in false tracking cases with the SOAMST [27] algorithm.

However, the proposed depth integration improved the tracking accuracy considerably, in that tracking was accomplished without any false tracking in five of the six cases of the dataset. For the proposed depth improved SOAMST, only one case yielded a worse result than the SOAMST [27] using only the color model for target matching. The reason for this failure can be sudden depth changes in the scene; in addition, the joint depth likelihood and color representation of a background region can have more similarity to the reference color representation. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

Figure 11: Tracking results for SOAMST [27] and depth likelihood map supported SOAMST.

In Figure 11, images of the tracking results of the SOAMST and the depth likelihood map based improved SOAMST models are given. It is obvious that integrating depth data into the weight updates improves the tracking results significantly while preventing the failure cases shown in the last two columns of Figure 11. Still, if the color distribution model is not convenient and the depth values of the subject and background are similar, there can be false detections or position, scale, and orientation errors on the selected region (see the 4th column of the depth improved tracking result in Figure 11).
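As a rough illustration of the idea (not the paper's exact likelihood, which is defined in Section 2.3; the Gaussian form and the sigma value here are assumptions), the per-pixel color weights of the mean-shift update can be modulated by how close each pixel's depth is to the subject's estimated depth:

```python
import numpy as np

def depth_weighted_update(color_weights, depths, subject_depth, sigma=0.3):
    """color_weights: per-pixel weights from the color histogram back-projection;
    depths: per-pixel depth values (m) in the search region;
    subject_depth: estimated depth of the tracked subject (m);
    sigma: assumed depth tolerance (m)."""
    # Pixels whose depth deviates from the subject's depth are down-weighted,
    # so similarly colored background regions at other depths lose influence.
    depth_likelihood = np.exp(-0.5 * ((depths - subject_depth) / sigma) ** 2)
    return color_weights * depth_likelihood
```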

In sum, the limitations and optimal conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions to fully take advantage of the depth data integration. Therefore, experiments with various lux values and distance conditions related to the light strength were performed on the SOAMST [27] for further analysis, examining the color features and the visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we made several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions; depending on the changes of the color, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, so it is possible to see which lux values (the quantitative measurement of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup. The measured spots and their illumination ranges are 1.40 m (450–500 lux), 1.85 m (350–400 lux), 2.30 m (250–300 lux), 3.00 m (150–200 lux), 3.60 m (about 100 lux), 4.95 m (about 50 lux), and 6.25 m (about 40 lux) from the camera.

Figure 13: Illumination sensor: CUSTOM LX-2000SD light meter.

Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor (CUSTOM LX-2000SD light meter, Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12), and the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving robot condition — height and width variance of the tracking region with one-color representation (plots of region height and width variance versus frame for the red, green, and blue cases).

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters; graphs 4, 5, and 6 in Figure 14 demonstrate the histograms at the same marks while the robot has been moved from the initial position closer to the subject, within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light condition and the distance between the subject and the robot have a great impact on the reference target model based on the normalized color histogram used by the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.
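For reference, a normalized color histogram target model of the kind discussed here can be computed as in the sketch below (the bin count and the joint RGB binning are illustrative assumptions; the exact quantization used by SOAMST is described in [27]):

```python
import numpy as np

def normalized_color_histogram(region_rgb, bins=16):
    """region_rgb: (H, W, 3) uint8 patch selected on the subject.
    Returns a flattened, L1-normalized joint RGB histogram."""
    hist, _ = np.histogramdd(
        region_rgb.reshape(-1, 3).astype(float),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    hist = hist.ravel()
    # Normalization makes target and candidate models comparable across frames
    return hist / hist.sum()
```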

Hence, in this scenario two more tests were done: (i) one test was to study the visual tracker while both the robot and the subject move; (ii) then, visual tracking performance was tested when the robot is in a fixed position and only the subject moves. Both experiments were done with two different scenarios, where two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval from 1.5 to 2 meters. The subject moves between the marks set up at the distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15, the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, where the centroid of the target region defines the tracking point of the current frame. So, tracking success or failure is decided by whether the center of the target region is on the subject or not.

Figure 14: Target feature analysis for different colors (red, green, blue) with normalized histograms (plots 1–6 for each color; see the text for the recording conditions).

Figure 15: Moving robot condition: tracking results with one-color representation (red, green, blue).

Figure 16: Moving robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

Since, during the subject's motion, the distance between the robot and the subject was kept the same in all the experiments, one would expect the regions searched and tracked by the SOAMST [27] to be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates a covariance matrix based on the current search region and the reference target distribution, in which this matrix, based on the second-order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representations can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350–400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to the darker region around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown in Figure 14.
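The moment computation behind this scale and orientation estimate can be sketched as follows (an illustrative reimplementation, not SOAMST's exact code; see [27] for the full derivation):

```python
import numpy as np

def region_ellipse(xs, ys, weights):
    """Estimate the tracked ellipse from weighted second-order moments.
    xs, ys: pixel coordinates in the candidate region; weights: per-pixel
    similarity weights (e.g., from the histogram back-projection)."""
    w = weights / weights.sum()
    cx, cy = np.sum(w * xs), np.sum(w * ys)        # weighted centroid
    mu20 = np.sum(w * (xs - cx) ** 2)              # second-order central moments
    mu02 = np.sum(w * (ys - cy) ** 2)
    mu11 = np.sum(w * (xs - cx) * (ys - cy))
    cov = np.array([[mu20, mu11], [mu11, mu02]])
    evals, evecs = np.linalg.eigh(cov)             # ellipse axes from eigenvectors
    orientation = np.arctan2(evecs[1, 1], evecs[0, 1])   # major-axis angle
    width, height = 2.0 * np.sqrt(np.maximum(evals, 0))  # axis lengths (up to scale)
    return (cx, cy), (width, height), orientation
```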

Despite this loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, it can be appreciated that the selected target and the tracked region are not consistent anymore (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color based tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) when compared to the one-color model situation. The impact of the light conditions proved to be similar to the previous scenarios, where only a one-color representation is used for the target model.

Table 4: Moving robot condition — height and width variance of the tracking region with two-color representation (plots of region height and width variance versus frame for the red and blue, green and blue, and blue and dark blue cases).

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is in a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the same distance is not preserved between the subject and the robot as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept in a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. Using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green color based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that also had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux with a distance higher than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the experimental analysis of visual tracking based on illumination and distance factors, the results state that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot while it tracks a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect the path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, or the free space available to the robot.

Figure 17: Fixed robot condition: tracking results with one-color representation (red, green, blue).

Figure 18: Fixed robot condition: tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering subject and robot safety criteria. The second analyses the speed and the obstacle inflation radius of the robot, to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. To set up the experiment, the first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments the same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check the robot behaviour in different situations. In these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in various free-space configurations. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on the visual tracking feedback. For the visual tracking system, the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow the subject based on the position of the subject on the global map.

Figure 19: Sample experimental layout (numerical values in centimeters) and the corresponding global map.

Figure 20: Experiment layouts and subject tracks (numerical values in centimeters): layout 1, layout 2, and layout 3.

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room, which makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle: in the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters for the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance higher than 3.6 meters the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
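Put together, the follow decision can be summarized by the sketch below (the function names are ours; the 1.2 m and 20 cm values are the ones stated above):

```python
import numpy as np

FOLLOW_DISTANCE_M = 1.2        # robot starts moving above this separation
MOVING_DISPLACEMENT_M = 0.20   # subject counted as moving above this change

def subject_is_moving(subject_pos, prev_subject_pos):
    """Subject is considered moving after >= 20 cm of global position change."""
    return np.linalg.norm(np.asarray(subject_pos) - np.asarray(prev_subject_pos)) >= MOVING_DISPLACEMENT_M

def robot_should_move(robot_pos, subject_pos):
    """The robot stays put and tracks visually until the subject is more
    than 1.2 m away on the global map."""
    return np.linalg.norm(np.asarray(subject_pos) - np.asarray(robot_pos)) > FOLLOW_DISTANCE_M
```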

As stated previously, two important parameters for robot motion are included in this part of the paper, which have been studied in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the robot motion system (ROS, Figure 2), these parameters are modeled with the following variables.

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values that this parameter can take represent the maximum forward velocity, in meters/second, of the robot. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" on the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test, obtained by modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot does not crash with the subject. In our tests the obstacle inflation radius seemed to have no influence on the results of the collision tests, such that using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.

Table 5: Robot collision test results ("Yes" indicates a collision).

Speed (m/s) \ Inflation radius (m)   0.05   0.35
0.70                                 Yes    Yes
0.65                                 Yes    Yes
0.60                                 No     No

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined.

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed with any of the obstacles of the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the moving Kinect camera, as the synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this fact, it is possible to lose the track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ Inflation radius (m)   0.05          0.15    0.25    0.35
0.50                                 Yes (+30 s)   No(3)   No(3)   Yes (+40 s)
1.00                                 Yes           No(2)   No(2)   No(2)
1.50                                 Yes (+15 s)   No(2)   No(3)   No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.

Faster speeds also mean more vibration for the robot, which can contribute to losing its localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the robot speed over that value, as the subject is walking faster, so there is no risk that the robot could collide with him. The speed values tested in this scenario are 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed for the robot that prevents a collision with the subject, as proven in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout it got stuck sometimes, but it was able to recover the tracking and ended the tests with success.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)   0.05    0.15    0.25     0.35     0.45
0.60                                 Yes     Yes     Yes      Yes      Yes
0.50                                 Yes     Yes     Yes      Yes(4)   Yes(4)
0.40                                 Yes     Yes     Yes(4)   Yes(4)   Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m)   0.05    0.15    0.25    0.35    0.45
0.60                                 Yes     Yes     Yes     Yes     Yes
0.50                                 Yes     Yes     Yes     Yes     Yes
0.40                                 Yes     Yes     Yes     Yes     Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)   0.05    0.15    0.25     0.35     0.45
0.60                                 No(1)   Yes(4)  Yes      No(1)    Yes
0.50                                 No(1)   No(1)   Yes(4)   Yes(4)   Yes
0.40                                 No(1)   No(1)   No(1)    No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for incomplete track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject in any of the combinations of speeds and inflation radii. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from having a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large, allowing the robot to move around without serious difficulties. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

Figure 21: Track for layout 1 and robot path during the experiment.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value there was no success in any of the tests, independent of the inflation radius value. The reason why the robot did not complete the track is that it got stuck during tracking, especially when entering the corner regions that prevented the robot from defining a free path to move. In this layout, faster speeds increased the success ratio of the experiment. For the 12 tests made, 6 of them present good results, while the robot was not able to complete the 10 rounds of the track in 8 of the remaining results. As presented in Table 9, the results are not as good as for the previous experiments: the narrow paths and obstacles drastically reduce the success rate of the experiments.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by the wearable kits demonstrated in Figure 10.

For each case, 1000 frames were used to test the activity recognition algorithm, as given in Table 1 of the visual tracking data, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority than other activities, since the walking patterns can then be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this should be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy, due to the high detection rate of static actions.

In the experiments, the person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.

Table 10: Activity recognition results for different types of walking patterns.

           Type   True    False   Accuracy (%)
Subject 1  IW     984     16      98.40
           EW     974     26      97.40
           NW     947     53      94.70
Subject 2  IW     976     24      97.60
           EW     963     37      96.30
           NW     931     69      93.10
Overall performance  5775  225    96.25

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy for all data is more than 96%, which is quite high considering that the same heuristics were used in each case.

Moreover, our analysis of false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the rate of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording the walking activity cases for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is an important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting are static cases without any motion (except the falling activity), we started recognition when the activity was in its steady state, without the transition periods involved in changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions are not the actual activity of standing, sitting, bending, or lying down. For the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11 the recognition results for various activities are given, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification to similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized at 100.00% accuracy. Falling activity was also recognized with more than 99% accuracy, which is crucial for detecting emergency situations for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

Table 11: Activity recognition results for different activities.

Activity     True    False   Accuracy (%)
Standing     1000    0       100.00
Sitting      1000    0       100.00
Bending      997     3       99.70
Lying down   1000    0       100.00
Cycling      1000    0       100.00
Falling      991     9       99.10
Overall      5988    12      99.80

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs; these two activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, as no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even without clear information about the lower limb joints (Figure 22(a)), or with partial occlusion of both the upper and lower body (Figure 22(b)), walking activity can be detected accurately, as long as we know that the height of the subject is comparable to the height of the walking pose and that the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations as long as the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that the fusion of multiple sensory information is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So, it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to rely on color based visual tracking, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be provided for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results, and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough for the robot not to collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. This deceleration distance may not be sufficient and may cause a collision with the subject, or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s, or even lower, to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; on the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less liable to lose the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius value of 0.05 but get worse when this value is increased.

Figure 23: Robot track for 1.5 m/s speed and 0.05/0.35 inflation radius.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject is always walking at the same pace. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds; the subject has to wait for the robot to be released, and that increases the time required to complete the track.

The situation when the subject stops is different from the one previously exposed. People at home are used to walking and stopping while moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem; moving freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success in all the tests states that any value in the tested range provides good results.

When there is not enough space to move, and/or when the shape of the obstacles is complex, the speed and inflation radius for the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. With a 0.05-meter inflation radius, the robot can probably get too close to the obstacles, making it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, the speed and inflation radius do not affect the performance of the tracking activity. But for non-ideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s, and the inflation radius between 0.15 and 0.45, generates better results.
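These recommendations can be distilled into a small guard, shown below as a hedged sketch (the function name and structure are ours, not part of the described system):

```python
def params_ok_for_complex_layout(max_vel_x, inflation_radius):
    """Check a candidate (speed, inflation radius) pair against the ranges
    that the stopping/nonstopping experiments found reliable for narrow or
    complex layouts: 0.5-0.6 m/s speed, 0.15-0.45 m inflation radius."""
    return 0.5 <= max_vel_x <= 0.6 and 0.15 <= inflation_radius <= 0.45

# Example: a combination that failed in layout 3 is rejected.
assert not params_ok_for_complex_layout(0.40, 0.05)
assert params_ok_for_complex_layout(0.60, 0.25)
```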

6.3. Activity Recognition Results. With the mobile robot, provided that robust visual tracking and robot based subject tracking are available, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to handle every time the subject changes. Moreover, those approaches, like most state-of-the-art models, generally require whole-body contour features for walking or dynamic activity detection. With the proposed model, the activities can be recognized even under limited conditions such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities tested in 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot's navigation behaviour when tracking a subject.

Based on the illumination tests, it has been proven that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under these light conditions, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper aimed to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, speeds of 0.5 and 0.6 m/s offered better results in the tests. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) an analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) finding the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux values), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper



Figure 11: Tracking results for the original SOAMST [27] and for the depth likelihood map supported (improved) SOAMST; rows show the depth data, the SOAMST results, and the improved SOAMST results.

Only one case yielded a worse result than SOAMST [27] using only the color model for target matching. The failure can be attributed to sudden depth changes in the scene; in addition, the joint color and depth likelihood representation of a background region can be more similar to the reference color representation than the subject itself. Otherwise, in general, depth integration helps to improve the robustness of SOAMST under changing illumination conditions.

Figure 11 shows images of the tracking results of SOAMST and of the depth-likelihood-map-based improved SOAMST. Integrating depth data into the weight updates clearly improves the tracking results, preventing the failure cases seen in the last two columns of Figure 11. Still, if the color distribution model is not discriminative and the depth values of the subject and the background are similar, false detections or position, scale, and orientation errors can occur on the selected region (see the 4th column of the depth-improved tracking results in Figure 11).
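The idea of weighting the color model by a depth likelihood can be sketched as follows (a minimal illustration only, not the authors' exact formulation; the Gaussian form, the sigma value, and the function names are assumptions):

    import numpy as np

    def depth_weighted_scores(color_scores, depth_patch, expected_depth, sigma=0.15):
        # color_scores: HxW color-model (histogram backprojection) scores.
        # depth_patch:  HxW depth values in meters for the same window.
        # expected_depth: subject depth estimated from the previous frame.
        # sigma: depth tolerance in meters (illustrative value).
        #
        # A Gaussian depth likelihood keeps the color score of pixels near
        # the expected subject depth and suppresses background pixels whose
        # color happens to match the reference model.
        depth_likelihood = np.exp(-0.5 * ((depth_patch - expected_depth) / sigma) ** 2)
        return color_scores * depth_likelihood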

In sum, the limitations and the optimal operating conditions of the SOAMST [27] color tracker should be examined under different illumination and distance conditions in order to fully take advantage of depth data integration. Therefore, experiments with various lux values and distance conditions related to the light strength were carried out on SOAMST [27] for further analysis, by examining the color features and the visual tracking.

3.2. Illumination Experiments for Visual Tracking. In this part of the work, we carried out several experiments by modifying the light conditions and the distance between the robot and the subject. The aim of these experiments is to observe the impact on visual tracking of the changes in color information produced by different light conditions; depending on these changes, the robot may fail with possible false tracking. Using a light meter, illumination values at different distances were tested with the SOAMST [27] visual tracking algorithm, making it possible to see which lux values (a quantitative measurement of the light conditions) provide good or bad tracking results with the given system.

3.2.1. Experiment Setup. The experiments for the illumination tests were done in the scenario presented in Figure 12. The robot with the Kinect camera is in charge of recording the video, and it is located in one of the corners of the laboratory, close to the light source. A halogen lamp is the only light source used in the room.

Figure 12: Illumination experiment setup (measured spots: 1.40 m at 450–500 lux, 1.85 m at 350–400 lux, 2.30 m at 250–300 lux, 3.00 m at 150–200 lux, 3.60 m at about 100 lux, 4.95 m at about 50 lux, and 6.25 m at about 40 lux).

Figure 13: Illumination sensor (CUSTOM LX-2000SD light meter).

Different spots to evaluate the tracking system were established based on the luminosity ranges measured by the illumination sensor, a CUSTOM LX-2000SD light meter (Figure 13). For example, the first spot has an illumination value between 450 and 500 lux and is located at 1.40 meters from the camera, and the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

Table 3: Moving-robot condition; height and width variance of the tracking region with one-color representation (red, green, and blue columns; region height variance and region width variance rows, plotted over frame number).

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment, the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 demonstrate the histograms at the same marks while the robot has been moved from its initial position closer to the subject, within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light conditions and the distance between the subject and the robot have a great impact on the reference target model, based on the normalized color histogram, used by the color visual tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color-based region tracker.
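For illustration, the normalized color histogram target model and a Bhattacharyya-style similarity, as commonly used by mean-shift region trackers such as SOAMST [27], can be sketched as follows (a minimal sketch; the bin count and function names are illustrative, not the authors' exact implementation):

    import numpy as np

    def normalized_histogram(channel, bins=45):
        # Normalized histogram of one 8-bit color channel; this is the kind
        # of reference target model whose shape Figure 14 tracks across
        # light conditions. The bin count is an illustrative choice.
        hist, _ = np.histogram(channel, bins=bins, range=(0, 256))
        return hist / max(hist.sum(), 1)

    def bhattacharyya_similarity(p, q):
        # Similarity between target model p and candidate model q
        # (1.0 means identical distributions).
        return float(np.sum(np.sqrt(p * q)))

A drop in this similarity under low lux or large distance is exactly the loss of histogram information discussed above.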

Hence, in this scenario, two more tests were done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) visual tracking performance was then tested with the robot in a fixed position and only the subject moving. Both experiments were done with two different scenarios, where two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment, both the robot and the subject are moving. The robot moves while trying to keep a constant distance, within an interval from 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350–400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color space. In Figure 15, the green ellipsoid region is the search area in which the target model is matched, and the red ellipsoid region is the detected target region, whose centroid defines the tracking point of the current frame. Tracking success or failure is therefore decided by whether the center of the target region is on the subject or not.

Since the distance between the robot and the subject was kept the same in all the experiments during the subject's motion, one would expect the regions searched and tracked by SOAMST [27] to be similar too; however, there are high deviations in the matched region because the illumination value changes. The SOAMST [27] algorithm calculates a covariance matrix based on the current search region and the reference target distribution, and this matrix, based on the second-order moments, can represent the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27].

Figure 14: Target feature analysis with normalized histograms for the red, green, and blue colors (graphs 1–6 correspond to the marks and robot positions described in the text).

Figure 15: Moving-robot condition tracking results with one-color representation (red, green, and blue).

Figure 16: Moving-robot condition tracking results with two-color representation (red and blue; green and blue; blue and dark blue).

Depending on the orientation of the tracked ellipsoid region, these high deviations in the region representation can be visible either in the width or in the height scale, as in Table 3. In particular, lux values around 350–400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current match are no longer consistent enough, due to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also shown in Figure 14.
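As an illustration of this moment-based scale and orientation estimation (a sketch in the spirit of SOAMST [27], with assumed variable names, not the authors' exact code):

    import numpy as np

    def region_ellipse(weights):
        # weights: HxW map of per-pixel target-model scores inside the
        # search window (e.g., histogram backprojection values).
        h, w = weights.shape
        ys, xs = np.mgrid[0:h, 0:w]
        m00 = weights.sum()
        cx = (weights * xs).sum() / m00
        cy = (weights * ys).sum() / m00
        # Central second-order moments form the covariance of the region.
        mu20 = (weights * (xs - cx) ** 2).sum() / m00
        mu02 = (weights * (ys - cy) ** 2).sum() / m00
        mu11 = (weights * (xs - cx) * (ys - cy)).sum() / m00
        cov = np.array([[mu20, mu11], [mu11, mu02]])
        # Eigenvalues give the squared semi-axis scales of the ellipse;
        # the eigenvector of the largest one gives its orientation.
        evals, evecs = np.linalg.eigh(cov)
        semi_axes = 2.0 * np.sqrt(np.maximum(evals, 0.0))
        angle = float(np.arctan2(evecs[1, 1], evecs[0, 1]))
        return (cx, cy), semi_axes, angle

When the weight map degrades under low illumination, these moments become unstable, which is what the height and width variance plots in Tables 3 and 4 capture.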

Despite the loss of information, tracking can still be performed. However, if the light on the subject goes below 50 lux, the selected target and the tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreased significantly.

In addition to the one-color tests, another test was done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared to the one-color model. The impact of the light conditions proved to be similar to the previous scenarios, in which only a one-color representation was used for the target model.

Table 4: Moving-robot condition; height and width variance of the tracking region with two-color representation (red and blue, green and blue, and blue and dark blue columns; region height variance and region width variance rows, plotted over frame number).

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is in a fixed position, the same position as shown in Figure 12. The camera records data while the subject is moving between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the same distance is not preserved between the subject and the robot as in the previous experiment. The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept within a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. With just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at a distance of 3.6 meters, where the illumination is around 100 lux, green-color-based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). Thus, light conditions below 50 lux combined with a distance greater than 5 meters are not sufficient for reliable color-based tracking for any of the tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the experimental analysis of visual tracking based on illumination and distance factors, the results indicate that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze the two main parameters that can significantly affect the motion of the robot while it tracks a subject who is moving and changing his trajectory, and to establish optimal values for them. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behaviour of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, which is the lower limit of self-paced normal walking, the experiments on robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set up as the maximum speed of the robot. It affects the ability to keep the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, and hence the available free space, for the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum robot speed that prevents it from colliding with the user, considering subject and robot safety criteria.

Figure 17: Fixed-robot condition tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed-robot condition tracking results with two-color representation (red and blue; green and blue; blue and dark blue).

The second test analyzes the speed and the obstacle inflation radius of the robot, to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. The same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check the robot behaviour in different situations. For these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free-space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do so when performing navigation with the subject tracking task based on the visual tracking feedback.

Figure 19: Sample experimental layout (numerical values in centimeters) and the corresponding global map.

For the visual tracking system, the robot must keep a direct line of sight to the subject for vision feedback, and it also has to update the path for following the subject based on the subject's position on the global map.

Figure 20: Experiment layouts 1, 2, and 3 with subject tracks (numerical values in centimeters).

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios, there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and the third layouts are the size and shape of the central obstacle; in the third layout, the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters; the robot starts moving when the distance to the subject exceeds this value, and otherwise it is able to track the subject without moving.

This maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for distances greater than 3.6 meters the color-based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or poor texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them below 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
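These two decision rules can be summarized in a small sketch (illustrative only; the function and constant names are assumptions, not the authors' code):

    import math

    START_FOLLOW_DIST = 1.2   # meters: robot starts moving beyond this distance
    MOVE_THRESHOLD = 0.20     # meters: subject position change counted as motion

    def robot_should_move(robot_xy, subject_xy):
        # Follow only when the subject is farther than the trigger distance.
        return math.dist(robot_xy, subject_xy) > START_FOLLOW_DIST

    def subject_is_moving(prev_xy, curr_xy):
        # The subject is counted as moving after a change of at least
        # 20 cm of its position on the global map.
        return math.dist(prev_xy, curr_xy) >= MOVE_THRESHOLD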

As stated previously, two important parameters for robot motion are studied in this part of the paper, in order to determine their impact on the tracking and movement of the robot: the speed of the robot and the inflation radius of the obstacles. In the robot motion system (ROS; Figure 2), these parameters are modeled with the following variables (a minimal example of setting them programmatically is sketched after the list):

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter, whose value represents the maximum forward velocity of the robot in meters per second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. It is included in the configuration file costmap_common_params.yaml.
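As a minimal illustration of how such parameters might be set from code (the exact move_base parameter namespaces depend on the launch configuration and are assumptions here):

    import rospy

    # Namespaces below follow a common move_base layout and are
    # assumptions; the actual names depend on the launch files used.
    rospy.init_node('tracking_param_setup')
    rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.60)    # m/s
    rospy.set_param('/move_base/global_costmap/inflation_radius', 0.15)  # m
    rospy.set_param('/move_base/local_costmap/inflation_radius', 0.15)   # m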

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds, the robot needs more time to stop, and if the speed is too high, the robot may not keep enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed is decreased from 0.70 m/s until the robot does not crash into the subject. In our tests, the obstacle inflation radius appeared to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the outcome.

Table 5: Robot collision test results (Yes = collision with the subject).
Speed (m/s) / inflation radius (m):   0.05    0.35
0.70:                                 Yes     Yes
0.65:                                 Yes     Yes
0.60:                                 No      No

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests that study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are those presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only uses the third layout in Figure 20. The subject walks faster than normal speed, around 1 step per second, moving from mark "1" to mark "2" and then back to mark "1", starting over.

(ii) Stopping tests: these tests were done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point, the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time needed to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles of the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters; the rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at a fast speed. This scenario can hardly happen in real life. However, its objective is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the moving Kinect camera, as the synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this fact, it is possible to lose the track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.
Speed (m/s) / inflation radius (m):   0.05         0.15    0.25    0.35
0.50:                                 Yes (+30)    No(3)   No(3)   Yes (+40)
1.00:                                 Yes          No(2)   No(2)   No(2)
1.50:                                 Yes (+15)    No(2)   No(3)   No(2)
*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot beyond that value, as the subject is walking faster, so there is no risk of the robot colliding with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment, the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum robot speed that prevents a collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it got stuck sometimes, but it was able to recover the tracking and end the tests successfully.

Table 7: Robot tracking, layout 1 (stopping tests)*.
Speed (m/s) / inflation radius (m):   0.05   0.15   0.25     0.35     0.45
0.60:                                 Yes    Yes    Yes      Yes      Yes
0.50:                                 Yes    Yes    Yes      Yes(4)   Yes(4)
0.40:                                 Yes    Yes    Yes(4)   Yes(4)   Yes(4)
*(4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).
Speed (m/s) / inflation radius (m):   0.05   0.15   0.25   0.35   0.45
0.60:                                 Yes    Yes    Yes    Yes    Yes
0.50:                                 Yes    Yes    Yes    Yes    Yes
0.40:                                 Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.
Speed (m/s) / inflation radius (m):   0.05    0.15     0.25     0.35     0.45
0.60:                                 No(1)   Yes(4)   Yes      No(1)    Yes
0.50:                                 No(1)   No(1)    Yes(4)   Yes(4)   Yes
0.40:                                 No(1)   No(1)    No(1)    No(1)    No(1)
*(4) indicates that the robot got stuck but recovered; (1) indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario in which an obstacle is in the middle of the room, representing furniture such as a table or a desk. With this layout, we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout, the empty space around the obstacle is large and allows the robot to move around without serious difficulties. In some of the tests in this condition, the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value, none of the tests succeeded, independent of the inflation radius value. The reason the robot did not complete the track is that it got stuck during tracking, especially when entering the corner regions, which prevented the robot from finding a free path to move along. In this layout, faster speeds increased the success ratio of the experiment.

Figure 21: Track for layout 1 (marks 1–4) and the robot path during the experiment.

As presented in Table 9, of the 15 parameter combinations tested, 6 presented good results, while the robot was not able to complete the rounds of the track in the remaining 9. These results are not as good as in the previous experiments: the narrow paths and obstacles drastically reduce the success rate.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally describes the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walk or stand inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority than other activities, since the walking patterns can then be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part, we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy, due to the high detection rate of static actions.

In the experiments, person-specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
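A minimal sketch of this localization-assisted walking heuristic follows (thresholds and names are illustrative; the height check is an assumption based on the occlusion discussion and Section 2.5, not the authors' exact rule):

    import math

    WALK_SPEED_MIN = 0.2       # m/s, derived from the impaired walking (IW) case
    WALK_HEIGHT_RATIO = 0.9    # tracked height vs. standing height (assumed value)

    def detect_walking(positions, heights, standing_height, fps=10):
        # positions: (x, y) global-map coordinates over ~10 frames, oldest first.
        # heights: tracked subject heights for the same frames.
        if len(positions) < 2:
            return False
        dist = math.dist(positions[0], positions[-1])
        speed = dist * fps / (len(positions) - 1)   # meters per second
        # The height must stay comparable to the upright walking pose, so the
        # rule still works under partial lower-limb occlusion.
        upright = min(heights) > WALK_HEIGHT_RATIO * standing_height
        return upright and speed >= WALK_SPEED_MIN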

Table 10: Activity recognition results for different types of walking patterns.
              Type   True   False   Accuracy
Subject 1     IW     984    16      98.40%
              EW     974    26      97.40%
              NW     947    53      94.70%
Subject 2     IW     976    24      97.60%
              EW     963    37      96.30%
              NW     931    69      93.10%
Overall              5775   225     96.25%

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. The subjects' normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and is almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case for at-home monitoring applications. In all these experiments, since activities such as standing and sitting are static cases without any motion (except the falling activity), recognition was started once the activity was in its steady state, without the transition periods involved in changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions are not the actual activity of the standing, sitting, bending, or lying down cases. For the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

Table 11 gives the recognition results for various activities, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.
Activity      True   False   Accuracy
Standing      1000   0       100.00%
Sitting       1000   0       100.00%
Bending       997    3       99.70%
Lying down    1000   0       100.00%
Cycling       1000   0       100.00%
Falling       991    9       99.10%
Overall       5988   12      99.80%

As can be seen, static behaviours such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the changes in pose and height during each bending activity, which might have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was also recognized with more than 99% accuracy, which is crucial for triggering emergency actions for people living alone. The overall recognition rate obtained was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, both correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even without clear information about the lower limb joints (Figure 22(a)), or with partial occlusion of both the upper and the lower body (Figure 22(b)), walking activity can be detected accurately, provided that the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.

Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the visual tracking experiments demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusing multiple sensory modalities is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to handle various illumination levels, different sensor types can support the system, such as infrared sensors in low-illumination or dark-room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. It is therefore beneficial to identify the limitations of the color tracker used in our system and the safe regions in which the color-based visual tracker can be relied upon, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should be provided in the environment for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100–200 lux for better results, and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
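These operating conditions can be condensed into a simple gating check (an illustrative sketch, not part of the original system):

    def color_tracking_reliable(lux, distance_m):
        # Operating envelope suggested by the illumination experiments:
        # ambient light of at least ~100 lux and a subject-to-robot
        # distance below ~3.6 m; outside this envelope, the system should
        # fall back to other sensing (e.g., infrared).
        return lux >= 100 and distance_m <= 3.6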

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject, but low enough not to make the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping; this deceleration distance may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject: the collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.
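As a rough kinematic check (an illustrative approximation with an assumed deceleration, not a measurement from the tests), a robot decelerating uniformly at a_dec from speed v needs a stopping distance of about d = v^2 / (2 a_dec). With a_dec = 0.5 m/s^2, for example, stopping from 0.65 m/s takes about 0.42 m, whereas stopping from 0.60 m/s takes about 0.36 m, so even a small speed reduction shortens the braking distance enough to help avoid contact.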

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the paths followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot; its influence is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; on the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less prone to losing the track or its localization on the map. This is why Table 6 presents good results for an inflation radius of 0.05, with the results getting worse when this value is increased.

Another interesting result presented in Table 6 is the time needed to complete the track when the speed is set to 0.50 m/s. The time values are higher for this setting, even though the subject is always walking at the same speed: because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds.

Figure 23: Tracks for a speed of 1.5 m/s and inflation radii of 0.05 (left) and 0.35 (right).

The subject has to wait for the robot to free itself, which increases the time required to complete the track.

The situation in which the subject stops is different from the one previously described. People at home usually walk and stop while moving around a room: they walk to a desk, stop for a moment to pick something up, start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time available to the robot.

In this part, we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem: moving freely in the center of the room, neither the speed nor the inflation radius has any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success; values around 0.50 m/s and an inflation radius over 0.15 meters increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters; with this low value, the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed values, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points along it. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 meters offers the best results in complex scenarios.

In summary, for open and wide areas, the speed and inflation radius do not affect the performance of the tracking activity. For less ideal conditions, however, if the room contains a narrow path through complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 meters generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided robust visual tracking and robot-based subject tracking, a mobile robot can achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data on the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98%; however, those models required training data and tuning for each specific subject, which is difficult to repeat every time the subject changes. Moreover, those calculations, like the state-of-the-art models, generally require whole-body contour features for detecting walking or other dynamic activities. With the proposed model, by contrast, activities can be recognized even under limited conditions such as partial occlusion.

Using the proposed model with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested on 12,000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important because it provides robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot's navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illumination of 100–200 lux or more; under such light conditions, the robot can perform the tracking process with a high chance of success. In addition to the light conditions, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color-based visual tracking decreases due to distance and illumination factors, a better approach to the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared-sensor-based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in indoor environments should be studied qualitatively, concerning human feelings and psychological responses to using mobile robot systems.

Analyzing robot parameters for the scenarios used forthe experiments presented in this paper it has been shownthat the two analyzed robot parameters speed and inflationradius can affect the behaviour of the robot while the robotis actively tracking the subject with visual tracking modelThe experiments presented in this paper have to determinethe best speed and obstacle inflation radius parameters forthe tracking process For simple scenarios the impact ofthese two parameters is not relevant but for more complexscenarios the robot fails while tracking the subject for someparameters combinations For these types of scenarios thetests speeds of 05 and 06ms offered better results Increas-ing the speed over these values provokes the collision of therobot with the subject Decreasing the speed can make iteasier for the robot to get stuck At the recommended speedsthe obstacle inflation radius must be greater than or equal to015 to offer better results

The new localization assisted activity recognition algo-rithm has been integrated with the robot platform Thisalgorithm is faster andmore robust and it is effective in thosecases where the lower limb body is not visible It also does notrequired training data to operate The results are promisingfor being able to detect walking activity for different subjectsand walking patterns

In summary the contribution of this paper can be statedas (i) integration of a new localization assisted activityrecognition algorithm in a mobile robot that tracks a subject(ii) analyzing how the illumination conditions and robotmotion parameters affect the tracking and (iii) finding theoptimal conditions for robot based subject trackingwhere theconditions can be listed as (a) quantitative light conditions(lux value) (b) robot to subject distances (c) robot speedand (d) map inflation radius during path planning and robotmotion

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012

[4] N K Suryadevara and S C Mukhopadhyay ldquoWireless sensornetwork based home monitoring system for wellness determi-nation of elderlyrdquo IEEE Sensors Journal vol 12 no 6 pp 1965ndash1972 2012

[5] N Myagmarbayar Y Yuki N Imamoglu J Gonzalez MOtake and W Yu ldquoHuman body contour data based activityrecognitionrdquo in Proceedings of the 35th Annual InternationalConference of the IEEE Engineering in Medicine and BiologySociety (EMBC rsquo13) pp 5634ndash5637 July 2013

[6] H-M Gross C Schroeter S Mueller et al ldquoProgress indeveloping a socially assistive mobile home robot companionfor the elderly with mild cognitive impairmentrdquo in Proceedingsof the IEEERSJ International Conference on Intelligent Robotsand Systems pp 2430ndash2437 September 2011

[7] T Tamura A Kawarada M Nambu A Tsukada K Sasakiand K-I Yamakoshi ldquoE-healthcare at an experimental welfaretechno house in Japanrdquo The Open Medical Informatics Journalvol 1 no 1 pp 1ndash7 2008

[8] M Nergui Y Yoshida N Imamoglu J Gonzalez M Otakeand W Yu ldquoHuman activity recognition using body contourparameters extracted from depth imagesrdquo Journal of MedicalImaging and Health Informatics vol 3 no 3 pp 455ndash461 2013

[9] MOgawa S Ochiai K ShojiMNishihara and T Togawa ldquoAnattempt ofmonitoring daily activities at homerdquo in Proceedings ofthe 22nd Annual International Conference of the IEEE Engineer-ing inMedicine and Biology Society vol 1 pp 786ndash788 ChicagoIll USA July 2000

[10] A Fleury M Vacher and N Noury ldquoSVM-based multimodalclassification of activities of daily living in health smart homessensors algorithms and first experimental resultsrdquo IEEE Trans-actions on Information Technology in Biomedicine vol 14 no 2pp 274ndash283 2010

[11] T B Moeslund A Hilton and V Kruger ldquoA survey of advancesin vision-based humanmotion capture and analysisrdquo ComputerVision and Image Understanding vol 104 no 2-3 pp 90ndash1262006

[12] L Chen J Hoey C D Nugent D J Cook and Z Yu ldquoSensor-based activity recognitionrdquo IEEE Transactions on Systems Manand Cybernetics Part C Applications and Reviews vol 42 no 6pp 790ndash808 2012

[13] M Zhang and A A Sawchuk ldquoHuman daily activity recogni-tion with sparse representation using wearable sensorsrdquo IEEEJournal of Biomedical and Health Informatics vol 17 no 3 pp553ndash560 2013

[14] A Pantelopoulos and N G Bourbakis ldquoA survey on wearablesensor-based systems for health monitoring and prognosisrdquoIEEE Transactions on Systems Man and Cybernetics Part CApplications and Reviews vol 40 no 1 pp 1ndash12 2010

[15] N Ravi N Dandekar P Mysore and M L Littman ldquoActivityrecognition from accelerometer datardquo in Proceedings of the

22 The Scientific World Journal

17th Innovative Applications of Artificial Intelligence Conference(IAAI rsquo05) pp 1541ndash1546 July 2005

[16] A Scalmato A Sgorbissa and R Zaccaria ldquoDescribing andrecognizing patterns of events in smart environments withdescription logicrdquo IEEE Transactions on Cybernetics vol 43 no6 pp 1882ndash1897 2013

[17] O Brdiczka P Reignier and J Crowley ldquoDetecting individualactivities from video in a smart homerdquo in Proceedings ofthe International Conference on Knowledge-Based IntelligentInformation and Engineering Systems pp 363ndash370 2007

[18] R Poppe ldquoA survey on vision-based human action recognitionrdquoImage and Vision Computing vol 28 no 6 pp 976ndash990 2010

[19] N Imamoglu M Nergui Y Yoshida J Gonzalez and W YuldquoControl strategies and particle filter for RGB-D based humansubject tracking and behavior recognition by a bio-monitoringmobile robotrdquo in Intelligent Robotics and Applications vol 8102of Lecture Notes in Computer Science pp 318ndash329 2013

[20] N Imamoglu Z Wei H Shi et al ldquoAn improved saliencyfor RGB-D visual tracking and control strategies for a bio-monitoring mobile robotrdquo in Evaluating AAL Systems throughCompetitive Benchmarking vol 386 of Communications inComputer and Information Science pp 1ndash12 2013

[21] A Cesta G Cortellessa M V Giuliani et al ldquoThe ROBOCAREassistive home robot environment features and evaluationrdquoThe ROBOCARE Technical Reports RC-TR-0906-6 2004

[22] P Rashidi and A Mihailidis ldquoA survey on ambient-assistedliving tools for older adultsrdquo IEEE Journal of Biomedical andHealth Informatics vol 17 no 3 pp 579ndash590 2013

[23] ROSARIA httpwikirosorgROSARIA[24] G Grisetti C Stachniss andW Burgard ldquoImproved techniques

for grid mapping with rao-blackwellized particle filtersrdquo IEEETransactions on Robotics vol 23 no 1 pp 34ndash46 2007

[25] Gmapping httpwikirosorggmapping[26] S Thrun W Burgard and D Fox Probabilistic Robotics The

MIT Press Cambridge Mass USA 2005[27] J Ning L Zhang D Zhang and C Wu ldquoScale and orientation

adaptive mean shift trackingrdquo IET Computer Vision vol 6 no1 pp 52ndash61 2012

[28] EMarder-Eppstein E Berger T Foote BGerkey andKKono-lige ldquoThe officemarathon robust navigation in an indoor officeenvironmentrdquo in Proceedings of the International Conference onRobotics and Automation (ICRA rsquo10) pp 300ndash307 IEEE May2010

[29] S Zaman W Slany and G Steinbauer ldquoROS-based mappinglocalization and autonomous navigation using a Pioneer 3-DX robot and their relevant issuesrdquo in Proceedings of theSaudi International Electronics Communications and PhotonicsConference (SIECPC rsquo11) pp 1ndash5 Riyadh Saudi Arabia April2011

[30] E Ruiz R Acuna N Certad A Terrones and M E CabreraldquoDevelopment of a control platform for the mobile robotRoomba using ROS and a Kinect sensorrdquo in Proceedings of the10th IEEE Latin American Robotics Symposium (LARS rsquo13) pp55ndash60 Arequipa Peru October 2013

[31] Costmap httpwikirosorgcostmap 2d[32] B P Gerkey and K Konolige ldquoPlanning and control in unstruc-

tured terrainrdquo in Proceedings of the ICRA Workshop on PathPlanning on Costmaps 2008

[33] D Fox W Burgard and S Thrun ldquoThe dynamic windowapproach to collision avoidancerdquo IEEERobotics andAutomationMagazine vol 4 no 1 pp 23ndash33 1997

[34] ldquoTransformation matrixrdquo httpmathforumorgmathimagesindexphpTransformation Matrix


Table 3: Moving robot condition, height and width variance of the tracking region with one-color representation. [Each cell of this table is a variance-versus-frame plot (region height variance and region width variance, for the red, green, and blue color models); the plotted values are not recoverable from the extracted text.]

The highest illumination value is between 450 and 500 lux and is located at 1.40 meters from the camera (Figure 12); the values at 4.95 and 6.25 meters are 50 ± 15 lux and 40 ± 5 lux, respectively (Figure 12).

3.2.2. Color Histogram Analysis with Changing Conditions. For this experiment the robot with the camera is fixed at the position presented in Figure 12. It records a video of the immobile subject at each of the marks, to see the variations in color representation by checking the histogram representation.

The color histogram results are presented in Figure 14. Histograms 1, 2, and 3 show the information when the subject is at the marks at distances of 1.85, 3.60, and 6.25 meters, and graphs 4, 5, and 6 in Figure 14 show the histograms at the same marks after the robot has been moved from its initial position closer to the subject, to within a distance of around 1.5 to 2 meters.

As can be seen from this preliminary analysis, the light condition and the distance between the subject and the robot have a great impact on the reference target model, based on a normalized color histogram, used by the visual color tracking algorithm, because these conditional deviations may change the texture details of the region selected on the subject. However, for a better understanding of the limitations, additional experiments were done in this work to investigate these conditions with the SOAMST [27] color based region tracker.
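For illustration, a minimal sketch of one such normalized color histogram target model and its comparison is given below (Python with OpenCV; the rectangular region, bin count, and per-channel concatenation are illustrative assumptions, not the exact representation used in this work):

    import cv2
    import numpy as np

    def normalized_color_histogram(image_bgr, region, bins=16):
        # region = (x, y, w, h): rectangle selected on the subject
        x, y, w, h = region
        patch = image_bgr[y:y + h, x:x + w]
        # per-channel histograms over B, G, R, concatenated and normalized to sum to 1
        channels = [cv2.calcHist([patch], [c], None, [bins], [0, 256]) for c in range(3)]
        hist = np.concatenate(channels).astype(np.float32)
        return hist / hist.sum()

    def histogram_distance(ref_hist, cand_hist):
        # a low Bhattacharyya distance means the candidate region matches the target model
        return cv2.compareHist(ref_hist, cand_hist, cv2.HISTCMP_BHATTACHARYYA)

Under low light or at long distance, the candidate histograms collapse into a few bins, so the distance to the reference model grows even when the region is correct, which is exactly the degradation observed in Figure 14.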

Hence, for this scenario, two more tests have been done: (i) one test studied the visual tracker while both the robot and the subject move; (ii) visual tracking performance was then tested with the robot in a fixed position and only the subject moving. Both experiments were done with two different scenarios, where two different region templates were tried in each case, to see the performance difference between one- and two-color region models. The following sections present these tests and their results.

3.2.3. Robot and Subject Moving. In this experiment both the robot and the subject are moving. The robot moves while trying to keep a constant distance within an interval of 1.5 to 2 meters. The subject moves between the marks set up at distances of 1.85, 3.60, and 6.25 meters, where there are three different light conditions: around 350-400 lux, around 100 lux, and below 50 lux, respectively.

These experiments make it possible to determine which lux values are in the safe or critical zone for region tracking models. The tracking experiments were done using three different color representations separately, (a) red, (b) green, and (c) blue, testing just one color for the subject representation.

Figure 15 and Table 3 present the tracking results for each color. In Figure 15 the green ellipsoid region is the search area used to match the target model, and the red ellipsoid region is the detected target region, whose centroid defines the tracking point of the current frame; tracking success or failure is therefore decided by whether the center of the target region is on the subject or not.

Since during the subject's motion the distance between the robot and the subject was kept the same in all the experiments, one would expect the search and tracked regions of the SOAMST [27] to be similar as well; however, there are high deviations in the matched region because the illumination value changes.


Figure 14: Target feature analysis for different colors with normalized histogram. [Rows: red, green, and blue color models; panels 1-3 show the histograms at the marks at 1.85, 3.60, and 6.25 meters with the robot fixed, and panels 4-6 the same marks with the robot moved closer; the histogram values are not recoverable from the extracted text.]


Figure 15: Moving robot condition, tracking results with one-color representation (red, green, and blue).

Figure 16: Moving robot condition, tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

The SOAMST [27] algorithm calculates a covariance matrix from the current search region and the reference target distribution; this matrix, built from the second-order moments, represents the height, width, and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representation can be visible either on the width or on the height scale, as in Table 3. In particular, lux values around 350-400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, due to the loss of information in the normalized color histogram, and tracking quality decreases, as this problem was also shown in Figure 14.
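As a rough sketch of this mechanism (simplified from SOAMST [27]; the weight image is assumed to be precomputed from the color similarity, and all names here are ours):

    import numpy as np

    def region_ellipse_from_moments(weights, xs, ys):
        # weights: color-similarity weight of each candidate pixel; xs, ys: pixel coordinates
        m = weights.sum()
        cx, cy = (weights * xs).sum() / m, (weights * ys).sum() / m
        # second-order central moments form the 2x2 covariance matrix of the region
        mu20 = (weights * (xs - cx) ** 2).sum() / m
        mu02 = (weights * (ys - cy) ** 2).sum() / m
        mu11 = (weights * (xs - cx) * (ys - cy)).sum() / m
        cov = np.array([[mu20, mu11], [mu11, mu02]])
        # eigenvalues give the squared semi-axis lengths, eigenvectors the orientation
        evals, evecs = np.linalg.eigh(cov)
        minor, major = 2.0 * np.sqrt(evals)
        angle = np.arctan2(evecs[1, 1], evecs[0, 1])  # orientation of the major axis
        return (cx, cy), (minor, major), angle

When illumination changes corrupt the weights, the moments, and hence the estimated width, height, and orientation, fluctuate, which is what the variance plots summarized in Table 3 reflect.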

Despite the loss of information, tracking can still be performed. However, if the light on the subject goes under 50 lux, the selected target and the tracked region are not consistent anymore (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux; for the red and green colors, tracking is possible, but the consistency of the tracked regions decreases significantly.

In addition to the one-color based tests, another test has been done using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first columns of Table 4) compared with the one-color model. The impact of the light conditions proved to be similar to the previous scenario, where only a one-color representation is used for the target model.

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment the robot with the camera is in a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves between the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the same distance is not preserved between the subject and the robot, as it was in the previous experiment.


Table 4: Moving robot condition, height and width variance of the tracking region with two-color representation. [Each cell of this table is a variance-versus-frame plot (region height variance and region width variance, for the red and blue, green and blue, and blue and dark blue models); the plotted values are not recoverable from the extracted text.]

The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept within a constant interval, tracking could be done even in low light conditions such as 100 lux, even though this was not the desired case. Using just a one-color representation, tracking performance was acceptable if the light condition was above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination value is around 100 lux, green color based tracking failed (see Figure 17, 4th column), because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model; the red and blue cases could barely continue tracking the subject (Figure 17). So light conditions below 50 lux combined with a distance higher than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color, fixed robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

The visual tracking experimental analysis based on the illumination and distance factors indicates that some constraints should be applied to robot based tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints in place, robot tracking can be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze the two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behaviour of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. It affects both keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, and hence the free space available, to the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance test. The first test aims to establish the maximum speed that prevents the robot from colliding with the user, considering the subject and robot safety criteria.


Figure 17: Fixed robot condition, tracking results with one-color representation (red, green, and blue).

Figure 18: Fixed robot condition, tracking results with two-color representation (red and blue, green and blue, blue and dark blue).

The second test analyses the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup consists of defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. The same room was used in all tests, but different layouts were tested, making the free space smaller or larger, to check the robot behaviour in different situations. For these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for navigation combined with the subject tracking task based on visual tracking feedback: the robot must keep a direct visible line to the subject for vision feedback, and it also has to update the path to follow the subject based on the subject's position on the global map.

Figure 19: Sample experimental layout (numerical values in centimeters) and global map.

Figure 20: Experiment layouts and subject tracks (numerical values in centimeters).

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room, which makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle: in the third layout the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout: the areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters: the robot starts moving when the distance to the subject is higher than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for distances higher than 3.6 meters the color based visual tracking system does not provide reliable results, because illumination and distance variations cause low similarity or poor texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them below 3.6 meters. Besides, the robot considers that the subject is moving when the subject covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
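These rules can be condensed into a short sketch (the threshold values come from this section; the function and constant names are ours, not from the implementation):

    START_FOLLOW_DIST = 1.2     # m: robot starts moving beyond this robot-to-subject distance
    MAX_COLOR_TRACK_DIST = 3.6  # m: color based visual tracking is unreliable beyond this
    SUBJECT_MOVE_EPS = 0.20     # m: subject counted as moving after 20 cm on the global map

    def robot_should_move(robot_to_subject_dist):
        # follow only when the subject gets too far, but before tracking becomes unreliable
        return robot_to_subject_dist > START_FOLLOW_DIST

    def tracking_reliable(robot_to_subject_dist):
        return robot_to_subject_dist < MAX_COLOR_TRACK_DIST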

As stated previously, two important parameters for robot motion are studied in this part of the paper in order to determine their impact on the tracking and movement of the robot: the speed of the robot and the inflation radius of the obstacles. In the system (ROS) for robot motion (Figure 2), these parameters are modeled with the following variables (a configuration sketch follows the list):

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter, whose value represents the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, by which the map inflates the cost value around obstacles. This parameter is included in the configuration file costmap_common_params.yaml.
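Purely as an illustration (the parameter namespaces below follow common move_base conventions and are an assumption, not taken from the paper's launch files), the two variables could also be set at runtime:

    import rospy

    rospy.init_node('tracking_param_setup')
    # maximum forward velocity of the base (m/s); 0.60 m/s is the collision-safe bound found in Section 4.2
    rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.60)
    # obstacle inflation radius (m); values of 0.15 m or more worked best in the complex layouts
    rospy.set_param('/move_base/global_costmap/inflation_radius', 0.15)
    rospy.set_param('/move_base/local_costmap/inflation_radius', 0.15)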

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds the robot needs more time to stop, and if the speed is too high the robot may not keep enough distance from the subject, so that they may collide.
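For intuition (an illustrative calculation, not a measured property of the robot): with a constant deceleration a, the stopping distance grows quadratically with speed, d = v^2 / (2a). Assuming, purely for illustration, a deceleration of 0.5 m/s^2, stopping from 0.60 m/s takes about 0.36 m, while 0.70 m/s already needs about 0.49 m, which is why a small speed increase can turn a safe stop into a collision.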

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3); the speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 meters/second until the robot no longer crashed into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests.


Table 5: Robot collision test results (Yes indicates a collision with the subject).

Speed (m/s) \ inflation radius (m)    0.05    0.35
0.70                                  Yes     Yes
0.65                                  Yes     Yes
0.60                                  No      No

Using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another, following the paths presented in Figure 20 in the order of the destination numbers; at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The logged results contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track (a sketch of the logged record follows the list):

(i) the robot lost track of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.
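For bookkeeping, the logged outcome of one test could be represented as follows (a sketch with hypothetical names; the paper does not describe its actual logging format):

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class TrackingTestResult:
        completed: bool              # robot finished all rounds of the track
        total_time_s: float          # time needed to complete the path
        robot_path: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) poses on the global map
        failure_reason: str = ""     # "lost", "crash", or "stuck" when not completed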

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters; the rest of the parameters in the localization process remain the same for all tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at fast speed; this scenario can rarely happen in real life. Its objective, however, is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect camera, since synchronizing the quick movement of the robot with the moving platform is not an easy task, and because of this it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ inflation radius (m)    0.05         0.15     0.25     0.35
0.50                                  Yes (+30)    No(3)    No(3)    Yes (+40)
1.00                                  Yes          No(2)    No(2)    No(2)
1.50                                  Yes (+15)    No(2)    No(3)    No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost. Values in parentheses after "Yes" indicate the extra time, in seconds, over the nominal completion time.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot over that value, because the subject walks faster, so there is no risk of the robot colliding with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds, so each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time needed to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All tests were made for 5 turns around the room, with the subject stopping at the different points marked on the tracks shown in Figure 20.

For each layout the maximum tested speed is 0.60 m/s, the maximum speed that prevents collision with the subject, as shown in the previous section; the other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout it got stuck sometimes, but it was able to recover the tracking and end the tests successfully.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ inflation radius (m)    0.05    0.15    0.25      0.35      0.45
0.60                                  Yes     Yes     Yes       Yes       Yes
0.50                                  Yes     Yes     Yes       Yes(4)    Yes(4)
0.40                                  Yes     Yes     Yes(4)    Yes(4)    Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ inflation radius (m)    0.05    0.15    0.25    0.35    0.45
0.60                                  Yes     Yes     Yes     Yes     Yes
0.50                                  Yes     Yes     Yes     Yes     Yes
0.40                                  Yes     Yes     Yes     Yes     Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ inflation radius (m)    0.05     0.15      0.25      0.35      0.45
0.60                                  No(1)    Yes(4)    Yes       No(1)     Yes
0.50                                  No(1)    No(1)     Yes(4)    Yes(4)    Yes
0.40                                  No(1)    No(1)     No(1)     No(1)     No(1)

*The index (4) indicates that the robot got stuck but recovered; the index (1), for incomplete tracks, indicates that the robot got stuck and failed to continue tracking.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject for any combination of speed and inflation radius. The main reason for these good results is that in this kind of layout the robot can move in the center of the room: the robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario with an obstacle in the middle of the room, representing furniture such as a table or a desk. With this layout we prevent the robot from keeping a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large and allows the robot to move around without serious difficulties. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject; the shapes of the corners of the obstacle present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s, and at this speed none of the tests succeeded, independent of the inflation radius value. The robot does not complete the track because it gets stuck during tracking, especially after entering the corner regions, which restrict the robot from finding a free path to move. In this layout, faster speeds increased the success ratio of the experiment.

Figure 21: Track for layout 1 and the robot path during the experiment.

Of the 12 tests made, 6 presented good results, while in the remaining ones the robot was not able to complete the rounds of the track. As presented in Table 9, the results are not as good as in the previous experiments: the narrow paths and obstacles drastically reduce the success rate.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection for data recording, to support possible health care assistance to the subjects. Data were first recorded from two subjects performing their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walk or stand inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority over other activities, since walking patterns can be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record data during the subjects' walking activity, and this should be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for the different walking patterns separately from other activities, even though including the static activities would increase the accuracy, given the high detection rate of static actions.

In the experiments, the person specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all the cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
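A minimal sketch of this velocity heuristic is shown below (the names are ours; the quoted figure of 20 cm per 10 frames being about 0.2 m/s implies a processing rate of roughly 10 frames per second):

    import math

    WINDOW = 10         # frames over which the displacement is measured
    DISP_THRESH = 0.20  # meters per window (about 0.2 m/s at the implied frame rate)

    def is_walking(track):
        # track: list of (x, y) global-map positions of the subject, most recent last
        if len(track) < WINDOW + 1:
            return False
        (x0, y0), (x1, y1) = track[-(WINDOW + 1)], track[-1]
        return math.hypot(x1 - x0, y1 - y0) >= DISP_THRESH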


Table 10: Activity recognition results for different types of walking patterns*.

Subject      Type    True    False    Accuracy (%)
Subject 1    IW      984     16       98.40
             EW      974     26       97.40
             NW      947     53       94.70
Subject 2    IW      976     24       97.60
             EW      963     37       96.30
             NW      931     69       93.10
Overall              5775    225      96.25

*IW: impaired walking; EW: elderly walking; NW: normal walking.


Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used in all cases. For both subjects, the impaired walking tests have the best accuracy in distinguishing standing from walking, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy over all data is more than 96%, which is quite high considering that the same heuristics were used in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotations of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also performed experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is an important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting are static cases without any motion (except the falling activity), recognition was started when the activity was in its steady state, excluding transition periods between activities. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions are not the actual activity of standing, sitting, bending, or lying down. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11 the recognition results for the various activities are given, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.

Activity      True    False    Accuracy (%)
Standing      1000    0        100.00
Sitting       1000    0        100.00
Bending       997     3        99.70
Lying down    1000    0        100.00
Cycling       1000    0        100.00
Falling       991     9        99.10
Overall       5988    12       99.80

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognitions among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar or transition activities, such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized at 100.00% accuracy. Falling was also recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. Overall, a recognition rate of 99.80% was obtained over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, both of which were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when the lower limbs are occluded, activities such as walking cannot be recognized accurately, because no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even without clear information about the lower limb joints (Figure 22(a)), or with partial occlusion of both the upper and lower body (Figure 22(b)), walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person specific features and localization assistance. In sum, the proposed system can handle occlusion situations as long as the subject is accurately tracked, as in the examples shown in Figure 22.
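A compressed sketch of this decision logic follows (the tolerance and all names are illustrative assumptions, not the implementation's):

    def classify_under_occlusion(subject_height, walking_pose_height,
                                 position_changing, height_tol=0.15):
        # walking: the visible height is comparable to this person's walking-pose
        # height and the subject's global-map position is changing; otherwise fall
        # back to the static or in-place activity rules, which other cues (e.g., a
        # known cycling machine location on the map) help resolve
        height_ok = abs(subject_height - walking_pose_height) <= height_tol * walking_pose_height
        return "walking" if (height_ok and position_changing) else "static or in-place activity"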

6. Discussion

This section provides an analysis of the results presented in the previous experiments, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the visual tracking experiments demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusing multiple sensory information sources is required for the robust indoor tracking needed by an at-home monitoring mobile robot. Hence, to handle various illuminations, different sensor types can support the system, such as infrared sensors in low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. It is therefore beneficial to find the limitations of the color tracker used in our system, and the safe regions for a visual tracker relying on color, with respect to distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be met for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results, and (2) for the given lux and distance specifications, as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to consider when tracking the subject, and the results may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision based tracking activity. Therefore, the objective of the experiments related to these parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough for the robot not to collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping, and this deceleration distance may not be enough, causing a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject: the collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius has no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is also possible to observe how the inflation radius affects the path followed by the robot; its influence is presented in Figure 23. For both maps the speed of the robot is set to 1.5 m/s; the map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower, whereas in the map on the right the curves where the robot turns are sharp bends. A smoother path makes the robot less liable to lose the track or its localization in the map. This is why Table 6 presents good results for an inflation radius value of 0.05, and why the results get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this setting, even though the subject always walks at the same speed: because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds, and the subject had to wait for the robot to be released, which increases the time required to complete the track.


Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.


The situation when the subject stops is different from the one previously exposed. People at home usually walk and stop while moving around a room: they walk to a desk, stop for a moment to pick something up, start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the subject stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room, which allows the use of lower speeds by increasing the reaction time of the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), with no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem; moving freely in the room center, the speed or inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2): the robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success; values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters: using this low value, the robot was not able to complete the track. Probably, with a 0.05 meter inflation radius the robot can get very close to the obstacles, which makes it easier for the robot to get stuck during the subject tracking task; indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But in non-ideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper: the results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided robust visual tracking and robot based subject tracking, it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data of the subjects.

Our previous works [5, 8] reached activity recognition performances over 98%; however, those models required training data and tuning for each specific subject, which is difficult to repeat each time the subject changes. Also, those approaches, like most state-of-the-art models, generally require whole body contour features for walking or dynamic activity detection. With the proposed model, the activities can be recognized even under limiting conditions such as partial occlusion.

Using the proposed model, with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7 Conclusion

This paper has presented three main contributions theproposal of a new activity recognition algorithm the effectof illumination conditions in the visual tracking process andthe impact of parameters related to robot motion and pathpanning in the robot navigation behaviour when tracking asubject

Based on the illumination tests it has been proven thatthe ambient illumination also affects the behaviour of therobot The better results are obtained with illuminations of100ndash200 lux or more When this light condition is presentedthe robot can perform the tracking process with high chancesof success In addition to the light condition the distancewhich also effects the texture details obtained from the sensoris an important factor and better results are observed in theconditions that the subject to robot distance is less than 36

The Scientific World Journal 21

meters Since the reliability of color based visual tracking isdecreasing due to distance and illumination factors a betterapproach for visual tracking task to work with changingillumination conditions should be modeled for example byswitching from color to infrared sensor based tracking Inaddition to visual tracking robot existence as a assistivetechnology and its behaviour indoor environment shouldbe studied qualitatively concerning the human feelings andpsychological response to use the mobile robot systems

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments served to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails to track the subject for some parameter combinations. For these types of scenarios, tested speeds of 0.5 and 0.6 m/s offered better results: increasing the speed over these values provokes collisions of the robot with the subject, while decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 m to offer better results.

The new localization-assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity across different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) the identification of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: HealthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.



[31] Costmap httpwikirosorgcostmap 2d[32] B P Gerkey and K Konolige ldquoPlanning and control in unstruc-

tured terrainrdquo in Proceedings of the ICRA Workshop on PathPlanning on Costmaps 2008

[33] D Fox W Burgard and S Thrun ldquoThe dynamic windowapproach to collision avoidancerdquo IEEERobotics andAutomationMagazine vol 4 no 1 pp 23ndash33 1997

[34] ldquoTransformation matrixrdquo httpmathforumorgmathimagesindexphpTransformation Matrix

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 12: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

12 The Scientific World Journal

[Figure 15: Moving robot condition tracking results with one-color representation (rows: red, green, blue).]

[Figure 16: Moving robot condition tracking results with two-color representation (rows: green and blue; blue and dark blue; red and blue).]

width and orientation variance of the ellipsoid that defines the tracked region on the subject [27]. So, depending on the orientation of the tracked ellipsoid region, these high deviations in the region representation can be visible on either the width or the height scale, as in Table 3. In particular, lux values around 350–400 (frames around 1 to 400) keep the subject tracking region similar; however, when the subject moves to a darker region of around 100 lux (around frames 600 to 1200 in Table 3), the target model and the current matching are not consistent enough, owing to the loss of information in the normalized color histogram, so the tracking quality decreases, as this problem was also presented in Figure 14.

Despite the loss of information, tracking can still be performed. However, if the light on the subject falls below 50 lux, the selected target and the tracked region are no longer consistent (last two columns of Figure 15). Moreover, for the blue color, the tracking algorithm fails to calculate scale and orientation changes below 50 lux. For the red and green colors, tracking is possible, but the consistency of the tracked regions decreases significantly.

In addition to the one-color based tests, another test was performed using two colors: red and blue, green and blue, and blue and dark blue. The results of this experiment are given in Figure 16 and Table 4. The use of two colors increases the initial target region on the subject (see the first column of Table 4) compared to the one-color model. The impact of the light conditions proved to be similar to the previous scenarios, in which only a one-color representation is used for the target model.
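To make the effect of illumination on histogram matching concrete, the following toy Python sketch (our own illustration, not the tracker of [27]; the joint histogram and the Bhattacharyya measure are assumptions about a typical color tracker) compares normalized color histograms of a reference patch and a darkened version of it; darkening compresses pixel values into fewer bins, so the similarity score drops:

    import numpy as np

    def color_histogram(patch, bins=16):
        # patch: H x W x 3 uint8 RGB region; returns a normalized joint histogram
        hist, _ = np.histogramdd(patch.reshape(-1, 3), bins=(bins, bins, bins),
                                 range=((0, 256), (0, 256), (0, 256)))
        return hist / hist.sum()

    def bhattacharyya(h1, h2):
        # similarity in [0, 1]; 1 means identical normalized histograms
        return np.sum(np.sqrt(h1 * h2))

    rng = np.random.default_rng(0)
    reference = rng.integers(0, 256, (40, 20, 3), dtype=np.uint8)  # well-lit target patch
    dark = (reference * 0.25).astype(np.uint8)                     # same patch, strongly darkened (toy model of low lux)
    print(bhattacharyya(color_histogram(reference), color_histogram(reference)))  # ~1.0
    print(bhattacharyya(color_histogram(reference), color_histogram(dark)))       # clearly lower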

3.2.4. Robot at a Fixed Position and Subject Moving. For this experiment, the robot with the camera is at a fixed position, the same position as shown in Figure 12. The camera records data while the subject moves at the marks located at distances of 1.85, 3.60, and 6.25 meters. It should be noted that, while the subject is moving, the same distance is not preserved between the subject and the robot as in the previous experiment.

[Table 4: Moving robot condition: height and width variance of the tracking region with the two-color representations (red and blue; green and blue; blue and dark blue). The table consists of variance-over-frame plots that cannot be reproduced in text.]

The purpose of this experiment is to evaluate the impact on the tracking system of modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept within a constant interval, tracking could be performed even in low light conditions such as 100 lux, even though this was not the desired case. When using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at a distance of 3.6 meters, where the illumination is around 100 lux, green-color based tracking (see Figure 17, 4th column) failed, because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux combined with a distance greater than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color, fixed-robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the experimental analysis of visual tracking based on illumination and distance factors, the results indicate that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints, robot tracking can then be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot when tracking a subject who is moving and changing trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, both of which can affect path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/h, the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects both keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, because it expresses the obstacle definitions, or the free space available to the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum speed that prevents the robot from colliding with the user, considering subject and robot safety criteria.

[Figure 17: Fixed robot condition tracking results with one-color representation (rows: red, green, blue).]

[Figure 18: Fixed robot condition tracking results with two-color representation (rows: green and blue; blue and dark blue; red and blue).]

The second analyzes the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup consists of defining a set of configuration parameters and building the laboratory map. The first step is to create the laboratory maps, using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments the same room was used in all tests, but different layouts were tested by making the free space smaller or larger in order to check the robot behaviour in different situations. Whether the experiments were set in the same room or in different rooms was not crucial, since the important point is testing the robot motion in various free-space configurations. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation combined with the subject tracking task based on the visual tracking feedback.

[Figure 19: Sample experimental layout (numerical values in centimeters) and the corresponding global map.]

For the visual tracking system, the robot must keep a direct line of sight with the subject for vision feedback, and it also has to update the path for following the subject based on the subject's position on the global map.

[Figure 20: Experiment layouts 1, 2, and 3 with subject tracks (numerical values in centimeters); the layouts contain blackboards, desks, a table, a bed, and additional obstacles.]

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and third layouts are the size and shape of the central obstacle: in the third layout the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23–25]. In Figure 19, the image on the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.
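As a rough illustration of how such a black-and-white map image maps onto an occupancy grid, the following Python sketch converts a grayscale image into obstacle/free/unknown cells; the threshold values and the cost convention here are our assumptions for illustration, not the exact values used by the ROS map tools:

    import numpy as np

    def image_to_occupancy(gray):
        # gray: 2D uint8 array from the map image (0 = black, 255 = white).
        # Returns an int8 grid: 100 = obstacle (dark pixels), 0 = free (light pixels),
        # -1 = unknown (intermediate values).
        grid = np.full(gray.shape, -1, dtype=np.int8)
        grid[gray < 50] = 100    # black areas: obstacles
        grid[gray > 200] = 0     # white areas: safe for the robot to move
        return grid

    # toy 3x3 map: one obstacle cell surrounded by free space
    toy = np.array([[255, 255, 255],
                    [255,   0, 255],
                    [255, 255, 255]], dtype=np.uint8)
    print(image_to_occupancy(toy))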

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is greater than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance greater than 3.6 meters, the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or poor texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them lower than 3.6 meters. Besides, the robot considers that the subject is moving when it covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.

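These two rules can be summarized in a few lines. The following Python sketch is illustrative only (the function and variable names are ours, not from the robot's actual code), encoding the 1.2 m start-to-move threshold and the 20 cm movement test described above:

    import math

    FOLLOW_DISTANCE = 1.2   # meters: robot starts moving beyond this subject distance
    MOVE_THRESHOLD = 0.20   # meters: subject position change that counts as "moving"

    def robot_should_move(robot_xy, subject_xy):
        # start following only when the subject is farther than 1.2 m
        return math.dist(robot_xy, subject_xy) > FOLLOW_DISTANCE

    def subject_is_moving(prev_subject_xy, subject_xy):
        # subject counts as moving after a change of at least 20 cm on the global map
        return math.dist(prev_subject_xy, subject_xy) >= MOVE_THRESHOLD

    print(robot_should_move((0.0, 0.0), (1.5, 0.0)))   # True: 1.5 m > 1.2 m
    print(subject_is_moving((2.0, 1.0), (2.1, 1.0)))   # False: only 10 cm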

As stated previously, two important robot motion parameters are included in this part of the paper and have been studied to determine their impact on the tracking and movement of the robot: the speed of the robot and the inflation radius of the obstacles. In the robot motion system (ROS; Figure 2), these parameters are modeled with the following variables:

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter, whose value is the maximum forward velocity of the robot in meters/second. This parameter belongs to the local planner configuration (conventionally base_local_planner_params.yaml in the ROS navigation stack).

(ii) Inflation radius: this parameter represents the radius, in meters, to which the map inflates obstacle cost values. It is included in the configuration file costmap_common_params.yaml. A minimal sketch of setting both parameters programmatically is given below.
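As an illustration, the two parameters could also be set at runtime with rospy; the parameter paths below assume the standard move_base namespaces and the default TrajectoryPlannerROS local planner, which is an assumption about this particular setup rather than a description of the deployed system:

    #!/usr/bin/env python
    import rospy

    rospy.init_node('tracking_param_setup')

    # maximum forward velocity of the robot, in m/s (0.60 m/s is the
    # collision-safe upper bound established in Section 4.2)
    rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.60)

    # radius, in meters, to which the costmaps inflate obstacle cost values
    rospy.set_param('/move_base/global_costmap/inflation_radius', 0.15)
    rospy.set_param('/move_base/local_costmap/inflation_radius', 0.15)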

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds the robot needs more time to stop. If the speed is too high, the robot may not have enough distance from the subject, and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark “2” of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than that of the robot. When the subject reaches mark “2”, the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 m/s until the robot no longer crashed into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of the collision tests: using a low inflation radius value (0.05) or a higher one (0.35) does not affect the outcome.

Table 5: Robot collision test results (collision with the subject: yes/no).
Speed (m/s) / inflation radius (m) | 0.05 | 0.35
0.70 | Yes | Yes
0.65 | Yes | Yes
0.60 | No | No
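The collision risk at higher speeds follows from simple braking kinematics: with speed v and constant deceleration a, the stopping distance is d = v^2 / (2a). The short sketch below is purely illustrative (the deceleration value is an assumed figure, not a measured property of the robot platform):

    def stopping_distance(speed_mps, decel_mps2=0.5):
        # distance covered while braking from speed_mps at constant deceleration
        return speed_mps ** 2 / (2.0 * decel_mps2)

    for v in (0.60, 0.65, 0.70):
        print("v = %.2f m/s -> stopping distance = %.2f m" % (v, stopping_distance(v)))
    # With the assumed deceleration, going from 0.60 to 0.70 m/s adds roughly
    # 0.13 m of braking distance, consistent with the collisions observed
    # above 0.65 m/s in Table 5.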

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark “1” to mark “2” and then back to mark “1” to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another by following the paths presented in Figure 20 in the order of the destination numbers, and at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done with different values for the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject continuously moves around one obstacle at a fast speed. This scenario can rarely happen in real life. However, its objective is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the moving Kinect camera, as the synchronization between the quick movement of the robot and the moving platform is not an easy task. Due to this fact, it is possible to lose track of the subject.

Table 6: Robot tracking performance (nonstopping tests)*.
Speed (m/s) / inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35
0.50 | Yes (+30) | No (3) | No (3) | Yes (+40)
1.00 | Yes | No (2) | No (2) | No (2)
1.50 | Yes (+15) | No (2) | No (3) | No (2)
*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot above that value, as the subject is walking faster, so there is no risk of the robot colliding with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment the subject turns four times around the obstacle, starting from the “subject start” point and finishing at mark “1”. Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, where applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, so the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed that prevents a collision with the subject, as proven in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. In the second layout it sometimes got stuck, but it was able to recover the tracking and finish the tests successfully.

Table 7: Robot tracking, layout 1 (stopping tests)*.
Speed (m/s) / inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes (4) | Yes (4)
0.40 | Yes | Yes | Yes (4) | Yes (4) | Yes (4)
*(4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).
Speed (m/s) / inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | Yes | Yes | Yes | Yes | Yes
0.50 | Yes | Yes | Yes | Yes | Yes
0.40 | Yes | Yes | Yes | Yes | Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.
Speed (m/s) / inflation radius (m) | 0.05 | 0.15 | 0.25 | 0.35 | 0.45
0.60 | No (1) | Yes (4) | Yes | No (1) | Yes
0.50 | No (1) | No (1) | Yes (4) | Yes (4) | Yes
0.40 | No (1) | No (1) | No (1) | No (1) | No (1)
*(4) indicates that the robot got stuck but recovered; (1) indicates that the robot got stuck and failed to complete the track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject in any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario with an obstacle in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from taking a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, allowing the robot to move around without serious difficulties. In some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; for this speed value none of the tests succeeded, independent of the inflation radius value. The reason the robot does not complete the track is that it got stuck during tracking, especially on entering the corner regions, which prevented the robot from finding a free path to move. In this layout, faster speeds increased the success ratio of the experiment.

[Figure 21: Track for layout 1 and the robot path during the experiment.]

Of the 15 tests made, 6 presented good results, while the robot was not able to complete the rounds of the track in the remaining 9. As presented in Table 9, the results are not as good as in the previous experiments: the narrow paths and obstacles drastically reduce the success rate.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walk or stand inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority than other activities, since walking patterns can then be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy, given the high detection rate of static actions.

In the experiments, the person specific parameters for activity recognition were initially decided for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
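Before presenting the results, the heuristic just described can be summarized in code. The sketch below is our illustrative reconstruction of the localization assisted walking check (all names and the height ratio are assumptions; only the 0.2 m/s velocity bound comes from the text), combining the subject's global displacement with a height test that remains valid under lower-limb occlusion:

    import math

    WALK_SPEED = 0.20        # m/s, velocity bound derived from impaired walking (IW)
    HEIGHT_RATIO_MIN = 0.85  # assumed: tracked height comparable to the walking pose

    def is_walking(track, frame_rate, standing_height):
        # track: list of (x, y, height) tuples for the last 10 frames on the global map
        (x0, y0, _), (x1, y1, h1) = track[0], track[-1]
        elapsed = (len(track) - 1) / float(frame_rate)
        speed = math.dist((x0, y0), (x1, y1)) / elapsed      # global displacement rate
        upright = h1 >= HEIGHT_RATIO_MIN * standing_height   # rules out sitting/lying poses
        return speed >= WALK_SPEED and upright

    # toy check: subject advances at 0.25 m/s over 10 frames at 10 fps while upright
    track = [(0.025 * i, 0.0, 1.70) for i in range(10)]
    print(is_walking(track, frame_rate=10, standing_height=1.75))  # True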

Table 10: Activity recognition results for different types of walking patterns.
Subject | Type | True | False | Accuracy
Subject 1 | IW | 984 | 16 | 98.40%
Subject 1 | EW | 974 | 26 | 97.40%
Subject 1 | NW | 947 | 53 | 94.70%
Subject 2 | IW | 976 | 24 | 97.60%
Subject 2 | EW | 963 | 37 | 96.30%
Subject 2 | NW | 931 | 69 | 93.10%
Overall performance | | 5775 | 225 | 96.25%

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used in all cases. For both subjects, the impaired walking tests have the best accuracy in distinguishing standing from walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy over all data is more than 96%, which is quite high considering that the same heuristics were used in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also performed experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is an important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting are static cases without any motion (except for the falling activity), we started recognition once the activity was in its steady state, without the transition periods involved in changing from one activity to another. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions are not the actual activity of standing, sitting, bending, or lying down. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11, the recognition results for the various activities are given, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.
Activity | True | False | Accuracy
Standing | 1000 | 0 | 100.00%
Sitting | 1000 | 0 | 100.00%
Bending | 997 | 3 | 99.70%
Lying down | 1000 | 0 | 100.00%
Cycling | 1000 | 0 | 100.00%
Falling | 991 | 9 | 99.10%
Overall | 5988 | 12 | 99.80%

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognitions among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which may have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. The falling activity was recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when the lower limbs are occluded, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even though there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and the lower body (Figure 22(b)), walking activity can be detected accurately, provided the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person specific features and localization assistance. In sum, the proposed system can handle occlusion situations as long as the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.

[Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.]

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that multiple-sensor information fusion is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So, it is beneficial to find the limitations of the color tracker used in our system and the safe regions in which a visual tracker relying on color can be used, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be met for a reliable color-region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results, and (2) for the given lux and distance specifications, as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
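These two conditions can be expressed as a simple gate that a controller could consult before trusting the color tracker. The sketch below merely encodes the thresholds reported in this section (the function itself is illustrative, not part of the deployed system):

    def color_tracking_reliable(lux_on_subject, subject_distance_m):
        # thresholds from the illumination and distance experiments:
        # at least 100 lux on the subject and a subject-robot distance under 3.6 m
        return lux_on_subject >= 100.0 and subject_distance_m < 3.6

    print(color_tracking_reliable(350.0, 2.0))   # True: well lit, close range
    print(color_tracking_reliable(50.0, 2.0))    # False: too dark
    print(color_tracking_reliable(200.0, 5.0))   # False: too far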

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to consider when performing subject tracking, and the results may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough that the robot does not lose the subject, but low enough that the robot does not collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot decelerates may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents a collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the paths followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of human activity at home. Table 6 shows the results of these nonstopping tests. As presented, the success rate is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less likely to lose the track or its own localization on the map. This is the reason why the results in Table 6 are good for an inflation radius of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this setting, even though the subject always walks at the same speed. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject has to wait for the robot to be released, which increases the required time to complete the track.

[Figure 23: Robot track for 1.5 m/s speed with inflation radii of 0.05 and 0.35.]

The situation when the subject stops is different from the one previously described. People at home usually walk and stop when moving in a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the subject stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds while increasing the reaction time available to the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), where there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since the robot can move freely in the center of the room, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2): the robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. With this low value, the robot was not able to complete the track. With a 0.05 meter inflation radius the robot can probably get too close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed settings, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50 to 0.60 m/s and an inflation radius of 0.15 to 0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, the speed and inflation radius do not affect the performance of the tracking activity. In less ideal conditions, however, if the room contains a narrow path through complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. With a mobile robot, if robust visual tracking and robot based subject tracking are provided, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to repeat every time the subject changes. Moreover, those calculations, like the state-of-the-art models, generally require whole-body contour features for walking or dynamic activity detection. With the proposed model, the activities can instead be recognized under limited conditions such as partial occlusion.

Using the proposed model, with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100 to 200 lux or more; under these light conditions, the robot can perform the tracking process with a high chance of success. In addition to the light conditions, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters.


Since the reliability of color based visual tracking decreases with distance and illumination factors, a better approach to the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, considering human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper help determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails to track the subject for some parameter combinations. For these types of scenarios, tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed above these values provokes a collision of the robot with the subject; decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated into the robot platform. This algorithm is faster and more robust, it is effective in cases where the lower limbs are not visible, and it does not require training data to operate. The results are promising for detecting walking activity across different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) an analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) the determination of the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.
[31] Costmap, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


Table 4: Moving robot condition: height and width variance of the tracking region with two-color representation.

[Variance plots for the covariance cases red & blue, green & blue, and blue & dark blue: region height variance (top row) and region width variance (bottom row) per frame; the numeric axes are omitted here.]

the impact on the tracking system when modifying distance and illumination together, by observing the relation between these two parameters.

When the distance between the subject and the robot was kept within a constant interval, tracking could be done even at low light conditions such as 100 lux, even though this was not the desired case. While using just a one-color representation, tracking performance was acceptable if the light conditions were above 100 lux and the distance was between 1 and 3 meters.

On the other hand, when the robot is not moving and the subject is at 3.6 meters distance, where the illumination is around 100 lux, green color based tracking (see Figure 17, 4th column) failed because the target color model matched an irrelevant region inside the search area that had a color similar to the reference target model. The red and blue cases could barely continue tracking the subject (Figure 17). So, light conditions below 50 lux combined with a distance higher than 5 meters are not sufficient to rely on color based tracking for any of the tested color samples. In addition, for the two-color and the fixed robot experiments, subject tracking fails in all cases when the light condition is less than 100 lux and the distance between subject and robot is greater than 3.6 meters (Figure 18).
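These empirical limits can be folded into the tracking controller as a simple gate that decides when the color tracker's output should be trusted. The following Python sketch is illustrative only: the helper name and structure are ours, while the lux and distance thresholds are the values reported above.

    def color_tracking_reliable(lux, distance_m):
        """Heuristic gate for the color-based visual tracker (hypothetical
        helper; thresholds taken from the experiments reported above)."""
        # Below 50 lux or beyond 5 m, tracking failed for all color samples.
        if lux < 50 or distance_m > 5.0:
            return False
        # Acceptable operating region observed in the tests: illumination
        # of roughly 100 lux or more with the subject within 3.6 m.
        return lux >= 100 and distance_m <= 3.6

A controller built on such a gate could, for example, pause path updates or fall back to depth-only tracking whenever the gate returns False.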

4. Robot Behaviour Experiments with Different Robot Motion Parameters While Tracking a Subject

After the visual tracking experimental analysis based on illumination and distance factors, the results state that some constraints should be applied to robot tracking, such as stable light conditions and a reliable distance between subject and robot. With these constraints in place, robot tracking can be tested for further improvements.

With the goal of making the system more robust, the objective of these tests is to analyze two main parameters that can significantly affect the motion of the robot while it tracks a subject who is moving and changing his trajectory, and to establish their optimal values. The two parameters are the speed of the robot and the inflation radius of the obstacle definition, which can affect the path planning during the tracking task.

In addition, regarding the at-home biomonitoring application, the aim is to follow and recognize the behavior of people with walking difficulties, such as the elderly and motor-function impaired people. Since the robot can track elderly or impaired walking patterns more easily than normal walking, and the speed of the targeted subjects is slower than 3 km/hr, which is the lower limit of self-paced normal walking, the experiments regarding robot velocity and inflation radius were done with the normal walking pattern only.

The speed parameter is set as the maximum speed of the robot. This parameter affects keeping the target in the visual scene and the time required to slow down or stop. The inflation radius affects the path of the robot while tracking the subject, since it defines the obstacle boundaries and the free space available to the robot.

This section describes the experiment setup and the two tests that have been performed: (i) the collision speed test and (ii) the robot performance tests. The first test aims to establish the maximum speed for the robot that prevents it from colliding with the user, considering the subject and robot safety

[Figure 17: Fixed robot condition: tracking results with one-color representation (red, green, and blue targets).]

[Figure 18: Fixed robot condition: tracking results with two-color representation (red and blue, green and blue, and blue and dark blue targets).]

criteria. The second one analyses the speed and the obstacle inflation radius of the robot to evaluate how they change the performance of the tracking process.

4.1. Experiment Setup. The experiment setup is done by defining a set of configuration parameters and building the laboratory map. To set up the experiment, the first step is to create the laboratory maps using the mapping function of the robot, as in Figure 19, for the three different layouts used in these experiments. In our experiments the same room was used in all tests, but different layouts were tested by making the free space smaller or larger, to check the robot behaviour in different situations. With these layouts, setting the experiment in the same room or in a different room was not crucial, since the important point is testing the robot motion in different situations with various free-space options. These layouts (see Figure 20) may not present a problem for autonomous navigation, but they do for performing navigation with the subject tracking task based on

[Figure 19: Sample experimental layout (numerical values in centimeters) and the corresponding global map; the room contains blackboards and a desk.]

the visual tracking feedback. For the visual tracking system, the robot must keep a direct line of sight to the subject for vision feedback, and it also has to update the path to follow

[Figure 20: Experiment layouts 1-3 and subject tracks (numerical values in centimeters); obstacles include blackboards, desks, a table, and a bed.]

the subject, based on the position of the subject on the global map.
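As a concrete illustration of how a subject position measured in the robot's local frame can be placed on the global map, the sketch below applies a 2D homogeneous transform (cf. the transformation matrix reference [34]) built from the robot's estimated pose; the function and variable names are our assumptions, not the paper's implementation.

    import numpy as np

    def subject_position_on_map(robot_x, robot_y, robot_theta, subject_local_xy):
        """Project a subject offset measured in the robot frame (e.g., from
        the Kinect depth data) onto the global map frame."""
        # 2D homogeneous transform from the robot frame to the map frame.
        T = np.array([
            [np.cos(robot_theta), -np.sin(robot_theta), robot_x],
            [np.sin(robot_theta),  np.cos(robot_theta), robot_y],
            [0.0,                  0.0,                  1.0],
        ])
        p = np.array([subject_local_xy[0], subject_local_xy[1], 1.0])
        x_map, y_map, _ = T @ p
        return x_map, y_map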

The layouts have been designed to represent different situations that may occur in real environments. The first layout represents a room with no obstacles in the center; all the obstacles are at the borders of the room. In the second and third scenarios there is an obstacle in the center of the room that makes it more difficult for the robot to move, as it is forced to follow the subject where only one direct path is available. The main differences between the second and the third layouts are the size and shape of the central obstacle. In the third layout the obstacle is bigger, reducing the available path for the robot.

Once the physical layout is deployed in the laboratory, the next step is to build the map that the robot uses in the tracking process. The map is built as described in [23-25]. In Figure 19, the image at the right is an example of the robot map built from the laboratory layout. The areas in black represent obstacles for the robot, and the white areas are safe for the robot to move in. As can be seen, the robot is free to move in the room except in the area delimited by the obstacle in the middle of the room.

Two parameters governing the robot's decision to move and follow the subject are common to all the experiments presented in this paper. The maximum distance between the robot and the subject is set to 1.2 meters. This means that the robot starts moving when the distance to the subject is greater than this value; otherwise, the robot is able to track the subject without moving.

The maximum distance has been established based on the results of the illumination tests. As stated in the previous experiments, for a distance greater than 3.6 meters the color based visual tracking system does not provide reliable results, due to illumination and distance variations that cause low similarity or loss of texture details. A value of 1.2 meters gives the robot some reaction time to start moving when the subject starts moving, while keeping the distance between them below 3.6 meters. Besides, the robot considers that the subject is moving when it covers a distance change of at least 20 cm on the global map, as described in the activity recognition experiments.
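A minimal sketch of these two decision rules, assuming the robot's and the subject's positions are already expressed in the global map frame (the names and structure are ours; the 1.2 m trigger and the 20 cm displacement threshold are the values stated above):

    from math import hypot

    FOLLOW_TRIGGER_M = 1.2     # robot starts moving beyond this distance
    MOVING_THRESHOLD_M = 0.20  # >= 20 cm on the map => subject is moving

    def robot_should_move(robot_xy, subject_xy):
        # Follow only when the subject is farther than 1.2 m, which keeps
        # the pair well inside the 3.6 m visual-tracking limit.
        return hypot(subject_xy[0] - robot_xy[0],
                     subject_xy[1] - robot_xy[1]) > FOLLOW_TRIGGER_M

    def subject_is_moving(prev_xy, curr_xy):
        # The subject counts as moving after a displacement of at least
        # 20 cm on the global map.
        return hypot(curr_xy[0] - prev_xy[0],
                     curr_xy[1] - prev_xy[1]) >= MOVING_THRESHOLD_M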

As stated previously, two important parameters for robot motion are studied in this part of the paper in order to determine their impact on the tracking and movement of the robot. These parameters are the speed of the robot and the inflation radius of the obstacles. In the robot motion system (ROS; Figure 2), these parameters are modeled with the following variables (a hedged configuration sketch follows the list):

(i) Maximum speed of the robot: the speed of the robot is controlled by the max_vel_x parameter. The values that this parameter can take represent the maximum forward velocity of the robot in meters/second. This parameter is included in the configuration file costmap_common_params.yaml.

(ii) Inflation radius: this parameter represents the radius, in meters, by which the map inflates obstacle cost values. This parameter is included in the configuration file costmap_common_params.yaml.
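For illustration, these two parameters could also be set programmatically before bringing up the navigation stack. The sketch below uses rospy's parameter API; the move_base namespaces are our assumption, since the paper only names the YAML configuration file, and the example values are drawn from the ranges tested in this paper.

    import rospy

    # Hypothetical setup node: the paper keeps these values in the YAML
    # configuration files, so the parameter paths below are typical
    # move_base namespaces rather than the paper's exact ones.
    rospy.init_node('tracking_parameter_setup')

    # Maximum forward velocity in m/s (values of 0.40-0.70 are tested below).
    rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.60)

    # Obstacle inflation radius in meters (0.05-0.45 are tested below).
    rospy.set_param('/move_base/global_costmap/inflation_radius', 0.15)
    rospy.set_param('/move_base/local_costmap/inflation_radius', 0.15)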

4.2. Collision Test Results. Once the experiment has been set up, it is possible to start the tests to analyze the impact of the speed and the obstacle inflation radius. When increasing the speed of the robot, a new problem appears because of the robot's deceleration: at higher speeds the robot needs more time to stop. If the speed is too high, the robot may not keep enough distance from the subject and they may collide.

The objective of the collision speed test is to determine the maximum speed of the robot that implies no collision with the subject. To perform these tests, the subject walks from the starting point to mark "2" of the map (Figure 20, layout 3). The speed of the subject has to be at least faster than the robot's. When the subject reaches mark "2", the subject waits until the robot stops.

Table 5 presents the results of the test when modifying the speed of the robot and the obstacle inflation radius. The speed was decreased from 0.70 m/s until the robot did not crash into the subject. In our tests, the obstacle inflation radius seemed to have no influence on the results of


Table 5: Robot collision test results (Yes = collision with the subject).

Speed (m/s)   Inflation radius 0.05   Inflation radius 0.35
0.70          Yes                     Yes
0.65          Yes                     Yes
0.60          No                      No

the collision tests, such that using a low inflation radius value (0.05) or a higher one (0.35) does not affect the collision test results.

4.3. Robot Performance Test Setups. Speed and obstacle inflation radius are two parameters that affect the trajectory of the robot during the subject tracking task. This section describes the tests to study the impact of both parameters on the performance of the tracking process.

The maps used for these experiments are the ones presented in Figure 20. With the layouts of these maps, two different tests are defined:

(i) Nonstopping test: this test only includes the third layout in Figure 20. In this test the subject walks faster than normal speed, around 1 step per second. He moves from mark "1" to mark "2" and then back to mark "1" to start over.

(ii) Stopping tests: these tests have been done on each of the three layouts shown in Figure 20. The subject moves from one mark to another by following the paths presented in Figure 20 in the order of the destination numbers, and at each destination point the subject stops and waits for 3 seconds.

Once the subject has completed the number of turns established for each experiment, the test is over and the results are logged. The results of the experiments contain whether the robot was able to complete the track, the time to complete the path, and the path followed by the robot.

There are three possible reasons why the robot was not able to complete the track:

(i) the robot lost the tracking of the subject;
(ii) the robot crashed into one of the obstacles in the room;
(iii) the robot got stuck for more than 30 seconds.

All tests have been done by modifying the values of the maximum robot speed and inflation radius parameters. The rest of the parameters in the localization process remain the same for all the tests.

4.4. Nonstopping Tests. The nonstopping tests represent an unnatural situation, as the subject is continuously moving around one obstacle at fast speed. This scenario can barely happen in real life. However, the objective of this scenario is to analyze the behaviour of the tracking system when the robot is pushed to move and turn fast. It also represents a challenge for the Kinect moving camera, as the synchronization between the quick movement of the robot and the

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) / inflation radius (m)   0.05          0.15    0.25    0.35
0.50                                 Yes (+30 s)   No(3)   No(3)   Yes (+40 s)
1.00                                 Yes           No(2)   No(2)   No(2)
1.50                                 Yes (+15 s)   No(2)   No(3)   No(2)

*Reasons for not completing the track: (1) stuck, (2) crash, (3) lost.

moving platform is not an easy task. Due to this fact, it is possible to lose track of the subject.

Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its own localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously proven in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. But for these tests it is possible to increase the speed of the robot over that value: since the subject is walking faster, there is no risk that the robot could collide with him. So the speed values tested in this scenario are 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment the subject turns four times around the obstacle, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test for a given speed and inflation radius should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, and the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points that are marked on the tracks as shown in Figure 20.

For each layout, the maximum tested speed is 0.60 m/s, which is the maximum speed for the robot that prevents the collision with the subject, as proven in the previous section. Other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, in which it takes values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout it got stuck sometimes, but it was able to recover the tracking and end the tests with success.

Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) / inflation radius (m)   0.05   0.15   0.25     0.35     0.45
0.60                                 Yes    Yes    Yes      Yes      Yes
0.50                                 Yes    Yes    Yes      Yes(4)   Yes(4)
0.40                                 Yes    Yes    Yes(4)   Yes(4)   Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) / inflation radius (m)   0.05   0.15   0.25   0.35   0.45
0.60                                 Yes    Yes    Yes    Yes    Yes
0.50                                 Yes    Yes    Yes    Yes    Yes
0.40                                 Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) / inflation radius (m)   0.05    0.15     0.25     0.35     0.45
0.60                                 No(1)   Yes(4)   Yes      No(1)    Yes
0.50                                 No(1)   No(1)    Yes(4)   Yes(4)   Yes
0.40                                 No(1)   No(1)    No(1)    No(1)    No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1) for incomplete tracks indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject in any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject; it can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from having a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, and it allows the robot to move around without serious difficulties. For some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s. For this speed value there was no success in any of the tests, independent of the inflation radius value. The reason why the robot did not complete the track is that it got stuck during tracking, especially on entering the corner regions, which restricted the robot from defining a free path to move. In this layout, faster speeds increased the

[Figure 21: Track for layout 1 and robot path during the experiment.]

success ratio of the experiment. Of the tests made at these speeds, 6 present good results, while the robot was not able to complete the rounds of the track in the remaining ones. As presented in Table 9, the results are not as good as for the previous experiments: the narrow paths and obstacles drastically reduce the success rate.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data were used to test the activity recognition algorithm, as given in Table 1, in which subjects walk or stand inside the room continuously. In our work, walking activity detection has higher priority over other activities, since the walking patterns can be analyzed whenever necessary, especially for elderly and impaired people in at-home health care support. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy due to the high detection rate of static actions.

In the experiments, the person specific parameters for activity recognition were decided initially for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as


Table 10: Activity recognition results for different types of walking patterns.

Subject     Type   True   False   Accuracy (%)
Subject 1   IW     984    16      98.40
            EW     974    26      97.40
            NW     947    53      94.70
Subject 2   IW     976    24      97.60
            EW     963    37      96.30
            NW     931    69      93.10
Overall            5775   225     96.25

20 cm per 10 frames, that is, around 0.2 m/s for the current system.

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking of the subjects had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis of the false recognitions shows that they generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition when the activity was in its steady state, excluding transition periods such as changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth, were not counted, because these transitions were not the actual activity of standing, sitting, bending, or lying down. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

Table 11 gives the recognition results for the various activities, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting,

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80

and lying down yielded 100.00% accuracy without any false recognition, since these activities were very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the changes in pose and height during each bending activity, which might have caused misclassification to similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized with 100.00% accuracy. Also, the falling activity was recognized with more than 99% accuracy, which is crucial for detecting emergencies for people living alone. The overall recognition rate obtained was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, and these two activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section on activity recognition (Section 2.5). With this information, even when there is no clear information about the lower limb joints (Figure 22(a)) or there is partial occlusion of both the upper and lower body (Figure 22(b)), walking activity can be detected accurately, as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.
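A minimal sketch of this walking/standing decision, assuming one tracked (x, y, height) sample per frame at roughly 10 frames per second; the window length, height tolerance, and names are our assumptions, while the 20 cm per 10 frames velocity cue and the height comparison come from the description above.

    from math import hypot

    WALK_DISPLACEMENT_M = 0.20  # 20 cm over a 10-frame window (~0.2 m/s)
    HEIGHT_TOLERANCE = 0.15     # person-specific tolerance, assumed value

    def classify_walking(samples, walking_pose_height):
        # samples: (x, y, height) tuples for the last 10 frames, with
        # (x, y) on the global map.
        (x0, y0, _), (x1, y1, h1) = samples[0], samples[-1]
        displaced = hypot(x1 - x0, y1 - y0) >= WALK_DISPLACEMENT_M
        upright = abs(h1 - walking_pose_height) <= \
            HEIGHT_TOLERANCE * walking_pose_height
        return 'walking' if upright and displaced else 'standing'

Because the rule needs only the tracked height and the global position, it remains applicable when the lower limbs are partially occluded, as in Figure 22.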

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.

[Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.]

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusion of multiple sensory information is required for robust indoor tracking with an at-home monitoring mobile robot. Hence, to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to utilize the color-based visual tracker, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be ensured for a reliable color region based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results, and (2) for the given lux and distance specifications as in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when performing subject tracking, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot when performing a visual tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough not to lose the subject, but low enough not to make the robot collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. This deceleration distance may not be sufficient and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents the collision with the subject. The collision is inevitable when the max_vel_x parameter takes values of 0.65 or over. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned, the nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; on the map at the right, the curves where the robot is turning are sharp bends. A smoother path makes the robot less likely to lose the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius value of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher in this case, even though the subject is always walking at the same speed. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject has to wait for the robot to be released, and that increases the time required to complete the track.

[Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.]

The situation when the subject stops is different from the one previously exposed. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time of the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. With the robot moving freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested ranges provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius for the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. Probably, with a 0.05 meters inflation radius the robot can get close to the obstacles, and this may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed values, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But under less ideal conditions, if the room contains a narrow path through complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided the mobile robot achieves robust visual tracking and robot based subject tracking, it is possible to perform activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial to record reliable data from the subjects.

Our previous works [5, 8] achieved activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to handle each time the subject changes. Moreover, these methods, including state-of-the-art models, generally require whole body contour features for walking or dynamic activity detection. With the proposed model, by contrast, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested with 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been proven that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100-200 lux or more. When this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject to robot distance is less than 3.6


meters. Since the reliability of color based visual tracking decreases due to distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the robot's presence as an assistive technology and its behaviour in indoor environments should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper were used to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, speeds of 0.5 and 0.6 m/s offered better results in the tests. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) finding the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux values), (b) robot to subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 35, no. 1, pp. 22-27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789-1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837-1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965-1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634-5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430-2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1-7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455-461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786-788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274-283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90-126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 42, no. 6, pp. 790-808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553-560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 40, no. 1, pp. 1-12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541-1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882-1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363-370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976-990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318-329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1-12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579-590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34-46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52-61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300-307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1-5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55-60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23-33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


Page 14: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

14 The Scientific World Journal

Red

Green

Blue

Figure 17 Fixed robot condition tracking results with one-color representation

Green and blue

Blue andbark blue

blueRed and

Figure 18 Fixed robot condition tracking results with two-color representation

criteria The second one analyses the speed and the obstacleinflation radius of the robot to evaluate how they change theperformance of the tracking process

41 Experiment Setup The experiment setup is done bydefining a set of configuration parameters and building thelaboratory map To set up the experiment the first step isto create the laboratory maps using mapping function of therobot as in Figure 19 for the three different layouts used inthese experiments Also in our experiment the same roomwas used in all tests but different layouts also were testedby making smaller or larger free space for checking robotbehaviour in different situations In these layouts setting theexperiment in the same room or in a different room was notvery crucial since the important point is testing the robotmotion in different situations with various free space optionsThese layouts (see Figure 20) may not present a problem forthe system for autonomous navigation but they do so forperforming navigation with subject tracking task based on

Blac

kboa

rd

Blac

kboa

rdD

esk

130

137160

148105

190

70

Figure 19 Sample for experimental layout (the numerical values incentimeters) and global map

the visual tracking feedback For the visual tracking systemthe robot must keep a direct visible line with the subject forvision feedback and also it has to update the path to follow

The Scientific World Journal 15

105

130

1

2

3

4

1

2

1

2

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Obs

tacle

s

Obs

tacle

s

DeskD

esk

Des

k

Des

k

TableBed

70

70

55

95

200105

178

200

73

55

182

137

160

190

70

105

148

Layout 1 Layout 2 Layout 3

Figure 20 Experiment layouts and subject tracks (the numerical values in centimeters)

the subject based on the position of the subject on globalmap

The layouts have been designed to represent differentsituations that may occur in the real environments The firstlayout represents a room with no obstacles in the center Allthe obstacles are in the borders of the room In the secondand third scenarios there is an obstacle in the center of theroom that makes it more difficult for the robot to move as itis forced to follow the subject where there is only one directpath available The main differences between second and thethird layouts are the size and shape of the central obstacle Inthe third layout the obstacle is bigger reducing the availablepath for the robot

Once the physical layout is already deployed in thelaboratory the next step is to build the map that the robotuses in the tracking process The map is built as describedin [23ndash25] In Figure 19 the image at the right is an exampleto present the robot map built from the laboratory layoutThe areas in black represent the obstacles for the robot andthe white areas are safe for the robot to move As it canbe seen the robot is free to move in the room except inthe area delimitated by the obstacle in the middle of theroom

Two parameters for robot decision on moving to followthe subject are common to all the experiments presented inthis paperThemaximum distance between the robot and thesubject is set to 12 meters This means that the robot startsmoving when the distance to the subject is higher than thisvalue in another case the robot is able to track the subjectwithout moving

The maximum distance has been established based onthe results of the illumination tests As stated in the previousexperiments for a distance higher than 36 meters colorbased visual tracking system does not provide reliable resultsdue to illumination and distance variations that cause lowsimilarity or texture details A value of 12 meters providessome reaction time to the robot to start moving when thesubject starts moving while keeping the distance betweenthem lower than 36 meters Besides the robot considers thatthe subject is moving when it covers a distance change of

at least 20 cm on the global map as described in the activityrecognition experiments

As stated previously there are two important parametersfor robot motion included in this part of the paper whichhave been studied in order to determine their impact for thetracking and movement of the robot These parameters arethe speed of the robot and the inflation radius of the obstaclesAt the system (ROS) for the robot motion (Figure 2) theseparameters are modeled with the following variables

(i) Maximum speed of the robot the speed of the robotis controlled by the max vel x parameter The valuesthat this parameter can take represent the maximumforward velocity in metersseconds of the robotThis parameter is included in the configuration filecostmap common params Yaml

(ii) Inflation radius this parameter represents the radiusin meters to which the map inflates obstacles costvalueThis parameter is included in the configurationfile costmap common paramsyaml

42 Collision Test Results Once the experiment has been setup it is possible to start the tests to analyze the impact ofthe speed and the obstacle inflation radius While increasingthe speed of the robot a new problem appears because ofthe robot deceleration At higher speeds robot needs moretime to stop If the speed is too high the robot may not haveenough distance with the subject and they may collide

Theobjective of the collision speed test is to determine themaximum speed of the robot that implies no collision withthe subject To perform these tests the subject walks from thestarting point to the mark ldquo2rdquo of the map (Figure 20 layout3) The speed of the subject has to be at least faster than therobot When the subject reaches the mark ldquo2rdquo subject waitsuntil the robot stops

Table 5 presents the results of the test by modifyingthe speed of the robot and the obstacle inflation radiusThe speed is decreased from 070meterssec until the robotdoes not crash with the subject In our tests the obstacleinflation radius seemed to have no influence on the results of

16 The Scientific World Journal

Table 5 Robot collision test results

Distancespeed 005 035070 Yes Yes065 Yes Yes060 No No

the collision tests such that using a low inflation radius value005 or a higher one 035 does not affect the collision testresults

43 Robot Performance Test Setups Speed and obstacle infla-tion radii are two parameters that affect the trajectory ofthe robot during subject tracking task This section describesthe tests to study the impact of both parameters on theperformance of the tracking process

The maps used for these experiments are the onespresented in Figure 20 With the layouts of these maps twodifferent tests are defined

(i) Nonstopping test this test only includes the thirdlayout in Figure 20 At this test the subject walksfaster than the normal speed around 1 step persecond He moves from the mark ldquo1rdquo to mark ldquo2rdquo andthen to the mark ldquo1rdquo to start over

(ii) Stopping tests these tests have been done on eachof the three layouts shown in Figure 20 The subjectmoves from one mark to another by following thepaths presented in Figure 20 in the order of thedestination numbers and at each destination pointthe subject stops and waits for 3 seconds

Once the subject has completed the number of turnsestablished for each experiment the test is over and the resultsare loggedThe results of the experiments containwhether therobot was able to complete the track the time to complete thepath and the path followed by the robot

There are three possible reasons about why the robot wasnot able to complete the track

(i) the robot lost the tracking of the subject(ii) the robot crashed with any of the obstacles of the

room(iii) the robot got stuck for more than 30 seconds

All tests have been done by modifying the differentvalues for the maximum robot speed and inflation radiusparametersThe rest of parameters in the localization processremain the same for all the tests

44 Nonstopping Tests The nonstopping tests represent anunnatural situation as the subject is continuously movingaround one obstacle at fast speed This scenario can barelyhappen in the real life However the objective of this scenariois to analyze the behaviour of the tracking system when therobot is pushed to move and turn fast It also representsa challenge for the Kinect moving camera as the synchro-nization between the quick movement of the robot and the

Table 6: Robot tracking performance (nonstopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05          0.15     0.25     0.35
0.50                                  Yes (+30 s)   No (3)   No (3)   Yes (+40 s)
1.00                                  Yes           No (2)   No (2)   No (2)
1.50                                  Yes (+15 s)   No (2)   No (3)   No (2)

*Reasons for not completing the track: stuck (1), crash (2), and lost (3).


Faster speeds also mean more vibration for the robot, which can contribute to the robot losing its localization on the map. When the localization is lost, the robot can collide with the obstacles in the room.

As previously shown in Section 4.2, the maximum speed that prevents the robot from colliding with the subject is 0.60 m/s. For these tests, however, it is possible to increase the speed of the robot above that value, as the subject is walking faster, so there is no risk that the robot could collide with him. The speed values tested in this scenario are therefore 0.50, 1.00, and 1.50 m/s, and the obstacle inflation radius goes from 0.05 to 0.35 meters in increments of 0.1 meters.

For this experiment the subject goes around the obstacle four times, starting from the "subject start" point and finishing at mark "1". Completing one round of the circuit at the established subject speed takes around 15 seconds. This means that each test, for a given speed and inflation radius, should take around 1 minute for the whole 4 rounds.

The results of the experiment are shown in Table 6. The values in the fields of the table indicate whether the robot was able to complete four rounds of the circuit and, if applicable, the extra time over the normal time to complete the experiment. When the total time is much higher than 60 seconds, it means that the robot got stuck but was able to get back on track, and the experiment could be continued.

4.5. Stopping Tests. These tests were applied to the three different layouts and tracks presented in Figure 20. All the tests were made for 5 turns around the room, where the subject stops at the different points marked on the tracks, as shown in Figure 20.

For each layout the maximum tested speed is 0.60 m/s, which is the maximum speed for the robot that prevents collision with the subject, as shown in the previous section. The other tested values are 0.50 and 0.40 m/s. In addition, the inflation radius has also been modified for these experiments, taking values from 0.05 to 0.45 meters in increments of 0.1 meters.

The results of the experiments are shown in Tables 7, 8, and 9. The values in the fields of the tables indicate whether the robot was able to complete 5 rounds of the track, and also whether the robot got stuck but was able to recover within the 30 seconds previously specified.

For the experiments performed in the first two layouts (see the results in Tables 7 and 8), the robot did not have any problems completing the tracks. For the second layout it got stuck sometimes, but it was able to recover the tracking and end the tests with success.


Table 7: Robot tracking, layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05   0.15   0.25      0.35      0.45
0.60                                  Yes    Yes    Yes       Yes       Yes
0.50                                  Yes    Yes    Yes       Yes (4)   Yes (4)
0.40                                  Yes    Yes    Yes (4)   Yes (4)   Yes (4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking, layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m)    0.05   0.15   0.25   0.35   0.45
0.60                                  Yes    Yes    Yes    Yes    Yes
0.50                                  Yes    Yes    Yes    Yes    Yes
0.40                                  Yes    Yes    Yes    Yes    Yes

Table 9: Robot tracking, layout 3 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05     0.15      0.25      0.35      0.45
0.60                                  No (1)   Yes (4)   Yes       No (1)    Yes
0.50                                  No (1)   No (1)    Yes (4)   Yes (4)   Yes
0.40                                  No (1)   No (1)    No (1)    No (1)    No (1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for incomplete-track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject with any of the combinations of speed and inflation radius. The main reason for the good results is that, in this kind of layout, the robot can move in the center of the room. The robot is not forced to follow the same path as the subject but can track him/her just by moving in the center of the room, avoiding most of the obstacles that could cause a collision. Blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

The second layout (Figure 20) presents a scenario where an obstacle is in the middle of the room. This obstacle represents furniture such as a table or a desk. With this layout we prevent the robot from having a comfortable position in the middle of the room, as in the first layout, and in this way we force it to move around the room following the subject. In this layout the empty space around the obstacle is large, which allows the robot to move around without serious difficulties. For some of the tests in this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.40 m/s; for this speed value there was no success in any of the tests, independent of the inflation radius value. The reason why the robot does not complete the track is that it got stuck during tracking, especially when entering the corner regions, which restricted the robot from finding a free path to move. In this layout, faster speeds increased the success ratio of the experiment. Of the 12 tests made, 6 presented good results, while the robot was not able to complete the rounds of the track in 8 of the remaining results. As presented in Table 9, the results are not as good as in the previous experiments: the narrow paths and obstacles drastically reduce the success rate.

Figure 21: Track for layout 1 and robot path during the experiment.


5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally presents the experiments on activity recognition, especially walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects with their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walk or stand inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority than other activities, since the walking patterns can then be analyzed whenever necessary, especially for at-home health care support of elderly and impaired people. It is important to record the data during the subjects' walking activity, and this can be achieved robustly for different subjects and walking patterns. Hence, in this part we tested the recognition rate of the algorithm for different walking patterns separately from other activities, even though including the static activities would increase the accuracy due to the high detection rate of static actions.

In the experiments, the person-specific parameters for activity recognition were initially decided for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
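A minimal sketch of this person-specific walking heuristic, as we read it from the description above and Section 2.5 (the names, the height tolerance, and the exact windowing are our assumptions; the thresholds come from the text):

# A subject counts as "walking" when the tracked height stays near the
# standing/walking pose and the global-map position changes by >= 20 cm
# over a 10-frame window (~0.2 m/s for this system).
import math
from collections import deque

WINDOW = 10          # frames
MIN_DISP = 0.20      # m per window
HEIGHT_TOL = 0.15    # m, assumed tolerance around the walking-pose height

positions = deque(maxlen=WINDOW)

def classify(xy, height, walking_height):
    positions.append(xy)
    if abs(height - walking_height) > HEIGHT_TOL:
        return "other"                 # e.g., bending, sitting, lying, fallen
    if len(positions) == WINDOW and math.dist(positions[0], positions[-1]) >= MIN_DISP:
        return "walking"
    return "standing"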


Table 10: Activity recognition results for different types of walking patterns.

Subject      Type   True   False   Accuracy (%)
Subject 1    IW     984    16      98.40
Subject 1    EW     974    26      97.40
Subject 1    NW     947    53      94.70
Subject 2    IW     976    24      97.60
Subject 2    EW     963    37      96.30
Subject 2    NW     931    69      93.10
Overall             5775   225     96.25


Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in detecting the difference between standing and walking activities, around 98%. Normal walking of the subjects had the lowest accuracies, around 93% and 95%; however, these results are still of acceptable accuracy, and the overall accuracy for all data is more than 96%, which is quite high considering the use of the same heuristics in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to the standing condition, and (iii) rotation of the subjects around themselves with high position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without a transition from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record the walking activity cases for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also made experiments on other activities, including standing, sitting, bending, lying down, cycling on a cycling machine, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion, except for the falling activity, we started recognition when the activity was in its steady state, without the transition periods involved in changing from one activity to another. For example, the frames during the transition from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. Also, for the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.
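The persistence rule for falling described above can be sketched as a small state holder (our reading of the text, not the authors' code):

# Once "falling" is detected, it is held until the subject moves again.
class ActivityState:
    def __init__(self):
        self.current = "standing"

    def update(self, detected, subject_moved):
        if self.current == "falling" and not subject_moved:
            return self.current        # falling persists until motion resumes
        self.current = detected
        return self.current

state = ActivityState()
state.current = "falling"
print(state.update("lying down", subject_moved=False))  # -> "falling"
print(state.update("standing", subject_moved=True))     # -> "standing"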

Table 11 gives the recognition results for these activities, where 1000 frames were tested for each activity. As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities were very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the change in pose and height during each bending activity, which might have caused misclassification into similar or transition activities, such as sitting or standing and their transition states. Again, with the assistance of localization information, the activity on the cycling machine could be recognized at 100.00% accuracy. Also, the falling activity was recognized with more than 99% accuracy, which is crucial for triggering emergency actions for people living alone. The overall recognition rate was 99.80% over the 6000 frames for the tested activities shown in Table 11.

Table 11: Activity recognition results for different activities.

Activity     True   False   Accuracy (%)
Standing     1000   0       100.00
Sitting      1000   0       100.00
Bending      997    3       99.70
Lying down   1000   0       100.00
Cycling      1000   0       100.00
Falling      991    9       99.10
Overall      5988   12      99.80


In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs; both activities were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when there is an occlusion of the lower limbs, activities such as walking cannot be recognized accurately, as no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section for activity recognition (Section 2.5). With this information, even though there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and the lower body (Figure 22(b)), the walking activity can be detected accurately, as long as we know that the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations when the subject is accurately tracked, as in the examples shown in Figure 22.

6. Discussion

This section provides an analysis of the results presented in the previous experiments, namely, the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.


Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the experiments for visual tracking demonstrate that integrating depth information to handle the illumination variations clearly improved the tracking results. This means that the fusion of multiple sensory information is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable all the time during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. So it is beneficial to find out the limitations of the color tracker used in our system, and the safe regions in which to use a visual tracker relying on color, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain conditions should be provided in the environment for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100 to 200 lux for better results; and (2) for the given lux and distance specifications as in Figure 12, if the distance between robot and subject is kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
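This operating envelope can be summarized as a small gate (the thresholds come from the text; the fallback label is only an illustration of the infrared option mentioned above):

# Trust the color tracker only above ~100 lux and within 3.6 m; otherwise
# fall back to depth/infrared cues (illustrative policy, not the authors' code).
def choose_tracker(lux, distance_m):
    if lux >= 100 and distance_m < 3.6:
        return "color+depth"
    return "depth/infrared fallback"

print(choose_tracker(lux=250, distance_m=2.0))  # color+depth
print(choose_tracker(lux=40, distance_m=2.0))   # depth/infrared fallback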

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when performing subject tracking, and the results of the activity may differ depending on the layout of the room. There are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough for the robot not to collide with the obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance available while the robot is decelerating may not be enough and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. The collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or even lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.


The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system based on the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As has been mentioned, nonstopping tests do not represent the normal behaviour of the activity performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius value of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot turns are sharp bends. A smoother path makes the robot less prone to losing the track or its localization on the map. This is the reason why the results in Table 6 are good for an inflation radius value of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this speed, even though the subject is always walking at the same speed. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject has to wait for the robot to be released, and that results in an increase in the time required to complete the track.


Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.


The situation when the subject stops is different from the one previously described. People at home usually walk and stop while moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time available to the robot.

In this part we have analyzed three different scenarios, representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem: moving freely in the center of the room, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius for the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. With this low value the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius the robot can get close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is that it got stuck at some points of the track. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.


In summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. But under nonideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the range proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.

6.3. Activity Recognition Results. The experimental results show that, provided the mobile robot achieves robust visual tracking and robot-based subject tracking, it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performances over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to redo each time the subject changes. Also, those calculations, like the state-of-the-art models, generally require whole-body contour features for walking or dynamic activity detection. With the proposed model, in contrast, the activities can be recognized under limited conditions such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities, tested on 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach to recording walking activity for the purpose of further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the study of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. The better results are obtained with illuminations of 100-200 lux or more. When this light condition is present, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters.


Since the reliability of color-based visual tracking decreases with distance and illumination factors, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed robot parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails to track the subject for some parameter combinations. For these types of scenarios, the tested speeds of 0.5 and 0.6 m/s offered better results. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed can make it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization-assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in those cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how the illumination conditions and robot motion parameters affect the tracking, and (iii) the identification of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro contributed equally to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22-27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789-1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837-1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965-1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634-5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430-2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1-7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455-461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786-788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274-283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90-126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790-808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553-560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1-12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541-1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882-1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363-370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976-990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318-329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1-12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579-590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34-46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52-61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300-307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1-5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55-60, Arequipa, Peru, October 2013.
[31] Costmap, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23-33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.


Page 15: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

The Scientific World Journal 15

105

130

1

2

3

4

1

2

1

2

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Blac

kboa

rd

Obs

tacle

s

Obs

tacle

s

DeskD

esk

Des

k

Des

k

TableBed

70

70

55

95

200105

178

200

73

55

182

137

160

190

70

105

148

Layout 1 Layout 2 Layout 3

Figure 20 Experiment layouts and subject tracks (the numerical values in centimeters)

42 Collision Test Results Once the experiment has been setup it is possible to start the tests to analyze the impact ofthe speed and the obstacle inflation radius While increasingthe speed of the robot a new problem appears because ofthe robot deceleration At higher speeds robot needs moretime to stop If the speed is too high the robot may not haveenough distance with the subject and they may collide

Theobjective of the collision speed test is to determine themaximum speed of the robot that implies no collision withthe subject To perform these tests the subject walks from thestarting point to the mark ldquo2rdquo of the map (Figure 20 layout3) The speed of the subject has to be at least faster than therobot When the subject reaches the mark ldquo2rdquo subject waitsuntil the robot stops

Table 5 presents the results of the test by modifyingthe speed of the robot and the obstacle inflation radiusThe speed is decreased from 070meterssec until the robotdoes not crash with the subject In our tests the obstacleinflation radius seemed to have no influence on the results of

16 The Scientific World Journal

Table 5 Robot collision test results

Distancespeed 005 035070 Yes Yes065 Yes Yes060 No No

the collision tests such that using a low inflation radius value005 or a higher one 035 does not affect the collision testresults

43 Robot Performance Test Setups Speed and obstacle infla-tion radii are two parameters that affect the trajectory ofthe robot during subject tracking task This section describesthe tests to study the impact of both parameters on theperformance of the tracking process

The maps used for these experiments are the onespresented in Figure 20 With the layouts of these maps twodifferent tests are defined

(i) Nonstopping test this test only includes the thirdlayout in Figure 20 At this test the subject walksfaster than the normal speed around 1 step persecond He moves from the mark ldquo1rdquo to mark ldquo2rdquo andthen to the mark ldquo1rdquo to start over

(ii) Stopping tests these tests have been done on eachof the three layouts shown in Figure 20 The subjectmoves from one mark to another by following thepaths presented in Figure 20 in the order of thedestination numbers and at each destination pointthe subject stops and waits for 3 seconds

Once the subject has completed the number of turnsestablished for each experiment the test is over and the resultsare loggedThe results of the experiments containwhether therobot was able to complete the track the time to complete thepath and the path followed by the robot

There are three possible reasons about why the robot wasnot able to complete the track

(i) the robot lost the tracking of the subject(ii) the robot crashed with any of the obstacles of the

room(iii) the robot got stuck for more than 30 seconds

All tests have been done by modifying the differentvalues for the maximum robot speed and inflation radiusparametersThe rest of parameters in the localization processremain the same for all the tests

44 Nonstopping Tests The nonstopping tests represent anunnatural situation as the subject is continuously movingaround one obstacle at fast speed This scenario can barelyhappen in the real life However the objective of this scenariois to analyze the behaviour of the tracking system when therobot is pushed to move and turn fast It also representsa challenge for the Kinect moving camera as the synchro-nization between the quick movement of the robot and the

Table 6 Robot tracking performance (nonstopping tests)lowast

Speedinflation radius 005 015 025 035050 Yes (+30) No3 No3 Yes (+40)100 Yes No2 No2 No2

150 Yes (+15) No2 No3 No2

lowastReasons not to complete the track Stuck1 Crash2 and Lost3

moving platform is not an easy task Due to this fact it ispossible to loose the track of the subject

Faster speeds also mean more vibration for the robotwhich can contribute to loosing the localization of itself onthe map When the localization is lost the robot can collidewith the obstacles in the room

As it was previously proven in Section 42 the maximumspeed to prevent the robot to collide with the subject is060ms But for these tests it is possible to increase the speedof the robot over that speed as the subject is walking fasterso there is no risk that the robot could collide with him Sothe speed values tested in this scenario are 050 100 and150ms respectively and the obstacle inflation radius goesfrom 005 to 035 meters in increments of 01 meters

For this experiment the subject turns four times aroundthe obstacle starting from ldquosubject startrdquo point and finishingat mark ldquo1rdquo To complete one round to the circuit at theestablished subject speed it takes around 15 seconds Thismeans that the each test for a given speed and inflation radiusshould take around 1 minute for the whole 4 rounds

The results of the experiment are shown in Table 6 Thevalues in the fields of the table indicate if the robot was ableto complete four rounds in the circuit and if applicable theextra time over the normal time to complete the experimentWhen the total time ismuch higher than 60 seconds it meansthat the robot got stuck but it was able to get back to the trackand the experiment can be continued

45 Stopping Tests These tests were applied to the threedifferent layouts and tracks as presented in Figure 20 All thetestsweremade for 5 turns around the roomwhere the subjectstops in the different points that are marked on the tracks asshown in Figure 20

For each layout the maximum tested speed is 060mswhich is the maximum speed for the robot that prevents thecollision with the subject as proven in the previous sectionOther tested values are 050 and 040ms In addition theinflation radius has also beenmodified for these experimentsinwhich it takes values from005 to 045meters in incrementsof 01 meters

The results of the experiments are shown in Tables 7 8and 9 The values in the fields of the table indicate whetherthe robot was able to complete 5 rounds to the track and alsowhether the robot got stuck but it was able to recover withinthe 30 seconds previously specified

For the experiments performed in the two first layouts(see the results in Tables 7 and 8) the robot did not have anyproblems to complete the tracks For the second layout it gotstuck sometimes but it was able to recover the tracking andends the tests with success

The Scientific World Journal 17

Table 7 Robot tracking layout 1 (stopping tests)lowast

Speedinflation radius 005 015 025 035 045060 Yes Yes Yes Yes Yes050 Yes Yes Yes Yes4 Yes4

040 Yes Yes Yes4 Yes4 Yes4lowastThe index (4) indicates that the robot got stuck but recovered

Table 8 Robot tracking layout 2 (stopping tests)

SpeedInflation radius 005 015 025 035 045060 Yes Yes Yes Yes Yes050 Yes Yes Yes Yes Yes040 Yes Yes Yes Yes Yes

Table 9 Robot tracking layout 3 (stopping tests)lowast

Speedinflation radius 005 015 025 035 045060 No1 Yes4 Yes No1 Yes050 No1 No1 Yes4 Yes4 Yes040 No1 No1 No1 No1 No1

lowastThe index (4) indicates that the robot got stuck but recoveredThe index (1)for not complete track cases indicates that robot got stuck and failed to track

For the first layout (Figure 20) as presented in Table 9the results show that the robot has no problem to track thesubject in any of the combinations of speeds and inflationradiusThemain reason of the good results is that in this kindof layouts the robot can move in the center of the room Therobot is not forced to follow the same path as the subject butit can track himher just by moving in the center of the roomavoiding most of the obstacles that can cause a collision Bluelines in Figure 21 express the path followed by the robot whiletracking the moving subject with the pathmarked with greenlines

The second layout (Figure 20) presents a scenario wherean obstacle is in the middle of the room This obstaclerepresents furniture such as a table and a desk With thislayout we are preventing the robot from have a comfortableposition in the middle of the room as in the first layout andby this way we force it tomove around the room following thesubject In this layout the empty space around the obstacle ishigh and it allows the robot to move around without seriousdifficulties For some of tests in this condition the robot wasstuck but it could recover within the given 30 seconds

The third layout (Figure 20) represents the most restric-tive scenario where the robot is forced to follow a narrowpath while tracking the subject The shapes of the corners ofthe obstacle present also an extra difficulty for the robot Thelowest speed that has been tested was 04ms For this speedvalue there was no success in any of the tests independentof the inflation radius value The reason of why the robotdoes not complete the track is due to the fact that robotwas stuck during tracking especially because of enteringthe corner regions that restricted the robot to define a freepath to move In this layout faster speeds increased the

2

1

3

4

Figure 21 Track for layout 1 and robot path during the experiment

success ratio of the experiment For the 12 tests made 6 ofthem present good results while the robot was not able tocomplete the 10 rounds of the track in 8 of the remainingresults As presented in Table 9 the results are not as goodas for previous experiments The narrow paths and obstaclesdrastically reduce the success rate of the experiments

5 Experiments on the ActivityRecognition Algorithm

After completing visual and robot tracking experimentsfinally this section expresses the experiments on activityrecognition especially walking activity detection for datarecording to support possible health care assistance to thesubjects For the experiments first data were recorded fromtwo subjects with their normal walking and two types ofsimulated walking patterns such as impaired and elderlywalking where elderly and impairedwalking conditionsweresimulated by wearing tools as demonstrated in Figure 10

For each case 1000 frames were used to test the activityrecognition algorithm as given in Table 1 of visual trackingdata in which subjects walk or stand inside the room con-tinuously In our work walking activity detection has higherpriority on other activities since the walking patterns can beanalyzed whenever necessary especially for the elderly andimpaired people at-home health care support It is importantto record the data during the subjectsrsquo walking activity andit can be achieved for different subjects and walking patternsrobustly Hence in this part we tested the recognition rateof the algorithm for different walking patterns separate fromother activities even though including the static activitieswould increase the accuracy due to high detection rate ofstatic actions

In the experiments person specific parameters for activ-ity recognitions are decided for the subject 1 and impairedwalking case initially and these selected parameters areapplied on all cases without any change Impaired walking(IW) is the slowest motion among all cases so velocityparameter is defined according to IW where it is given as

18 The Scientific World Journal

Table 10 Activity recognition results for different types of walkingpatterns

Type True False AccuracyIW 984 16 9840

Subject 1 EW 974 26 9740NW 947 53 9470IW 976 24 9760

Subject 2 EW 963 37 9630NW 931 69 9310

Overall performance 5775 225 9625

20 cm per 10 frames that is around 02ms for the currentsystem

Table 10 gives the activity recognition results for therecorded data described in Table 1 The results show thatwalking condition can be recognized with high accuracy forvarious walking patterns even though the same heuristicswere used for all cases In both subjects impaired walkingtests have the best accuracy on detecting the differencebetween standing andwalking activities around 98Normalwalking of subjects had the lowest accuracies around 93 and95 however these results are still in acceptable accuracyand overall accuracy for all data is more than 96 which isquite high considering the usage of same heuristics in eachcase

Moreover our analysis on false recognition generallyincludes the cases during transition states such as (i) startof walking from standing (ii) transition from walking tostanding condition and (iii) rotation of the subjects aroundthemselves with high position changes On the other handthe accuracy of correctly detected walking activity is morethan 99and is almost 100when the subjects are in the stateof continuous walking motion without transition conditionfrom one state to another Therefore it can be concludedthat the proposed activity recognition algorithm is a reliableapproach to record the walking activity cases for the purposeof further analysis as an assistive technology for at-homebiomonitoring systems

In addition we also made experiments on other activitiesincluding standing sitting bending lying down on cyclingand falling where falling is another important case suchas walking in at-home monitoring applications In all theseexperiments since the activities such as standing and sittingwere static cases without any motion except falling activitywe started recognition when the activity was in its statewithout transition periods as in changing from one activityto another For example the frames during transition fromstanding to sitting standing to bending and so forthwere notcount because these transitions were not the actual activity ofstanding sitting bending or lying down cases Also for thefalling case the activity was classified as falling continuouslyuntil the subjectmoved and changed the activity state to otheractivities

In Table 11 the recognition results for various activitieswere givenwhere 1000 frameswere tested for each activity Asit can be seen that static behaviors such as standing sitting

Table 11 Activity recognition results for different activities

Activity True False AccuracyStanding 1000 0 10000Sitting 1000 0 10000Bending 997 3 9970Lying down 1000 0 10000Cycling 1000 0 10000Falling 991 9 9910Overall 5988 12 9980

and lying down yielded 10000 accuracy without any falserecognition since these activities were very stable due to theirstatic nature Bending had 3 false recognition cases among the1000 frames and this is probably because of the large amountof motion and the change in pose and height during eachbending activity which might have caused misclassificationto similar or transition activities such as sitting or standingand their transition states Again with the assistance oflocalization information activity on cycling machine couldbe recognized at 10000 accuracy Also falling activity wasrecognized with more than 99 accuracy which is crucial todetect this activity on emergency actions for the people livingalone And over all recognition rate was obtained as 9980from 6000 frames for the tested activities shown in Table 11

In addition to these results in Figure 22 two cases withocclusion of the lower limbs were given and these twoactivities were correctly classified by the proposed activityrecognition model Many algorithms including our previousimplementations [5 8] handle the activity recognition bytaking advantage of features extracted from lower limbs andupper body However when there is an occlusion on lowerlimbs activities such as walking can not be recognized accu-rately not being able to extract any observable features fromlower limbs [5 8] Therefore our proposed model makes fulluse of the tracking position height of the tracked subject andglobal location of the tracked subject as explained inmethod-ology section for activity recognition (Section 25) With theinformation even though there is not clear information aboutlower limb joints (Figure 22(a)) or the partial occlusion onboth upper body and lower body (Figure 22(b)) walkingactivity could be detected accurately only if we know that theheight of a subject is comparable to the height of walking poseand the subjectrsquos global position is changing on global mapAnd our walking tests in Table 10 demonstrate that walkingcan be recognized with high accuracy using person specificfeatures and localization assistance In sum it can be seen thatthe proposed system can handle occlusion situations whenthe subject is accurately tracked as in the examples shown inFigure 22

6 Discussion

This section provides an analysis of the results presented inthe previous experiments the illumination and the speed andobstacle inflation radius results in order to determine the bestsetup for the system

The Scientific World Journal 19

Sitting

(a)

Walking

(b)

Figure 22 Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities

61 Visual Tracking and Illumination Results First of allthe experiments for the visual tracking demonstrate thatintegration of the depth information to handle the illumi-nation variations improved the tracking results obviously Itmeans that multiple sensory information fusion is requiredfor a robust indoor tracking to achieve at-home monitoringmobile robot Hence to be able to handle various illumina-tions different sensor types can support the system such asthe infrared sensors in the case of low illumination or darkroom conditions

Based on the results of the performed tests it is possibleto state that color informationmay not be reliable all the timeduring real-time tasks considering the subject that is trackedby the mobile robot during daily activities So it is beneficialto find out limitations of the color tracker used in our systemand safe regions to utilize the visual tracker relying on colorconsidering the distance and illumination conditions

In sum this experimental analysis concludes that somecertain conditions should be provided at the environment fora reliable color region based visual tracking algorithm (1) theillumination condition on the subject to be tracked should begreater than 100 or 200 lux for better results and (2) for thegiven lux and distance specifications as in Figure 12 if thedistance between robot and subject is able to keep the intervalless than 36 meters there is high chance of successful andcontinuous robot tracking supported by the visual trackingalgorithm

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to consider when tracking the subject, and the results may differ depending on the layout of the room. To our knowledge, there are no studies that analyze the parameters that may impact the behaviour of a robot performing a vision-based tracking activity. Therefore, the objective of the experiments related to these parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable under the given constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough that the robot does not collide with obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping. The distance covered while the robot is decelerating may not leave enough clearance and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. A collision is inevitable when the max_vel_x parameter takes values over 0.65. Based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.
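Since the navigation stack is ROS based, these limits map onto standard planner and costmap parameters. The sketch below applies them through dynamic_reconfigure; max_vel_x and inflation_radius are the usual base_local_planner and costmap_2d parameter names, but the node namespaces are assumptions that depend on how move_base is launched:

```python
#!/usr/bin/env python
import rospy
from dynamic_reconfigure.client import Client

rospy.init_node("set_tracking_limits")

# Cap forward speed below the 0.65 collision threshold found in Table 5.
Client("/move_base/TrajectoryPlannerROS", timeout=5).update_configuration(
    {"max_vel_x": 0.60})

# Inflate obstacles by at least 0.15 m (assumed costmap namespaces).
for ns in ("/move_base/global_costmap", "/move_base/local_costmap"):
    Client(ns, timeout=5).update_configuration({"inflation_radius": 0.15})
```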

Experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. The results of these tests make it possible to compare the paths followed by the robot and to check whether it is able to complete the specific track of each experiment.

As mentioned, the nonstopping tests do not represent the normal behaviour of activities performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot turns are sharp bends. A smoother path makes the robot less likely to lose the track or its localization in the map. This is the reason why Table 6 presents good results for an inflation radius of 0.05, with the results getting worse when this value is increased.

Figure 23: Tracks for a speed of 1.5 m/s and inflation radii of 0.05 (left) and 0.35 (right).

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher at this speed even though the subject always walks at the same pace. Because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds; the subject has to wait for the robot to be released, which increases the time required to complete the track.

The situation when the subject stops is different from the one previously described. People at home usually walk and stop while moving in a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and get a better position in the room. This allows the use of lower speeds by increasing the reaction time available to the robot.

In this part, we analyzed three scenarios representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem. Since it moves freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success in all the tests indicates that any value in the tested ranges provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success. Values around 0.50 m/s and an inflation radius over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception to this speed range was when the inflation radius was set to 0.05 meters. Using this low value, the robot was not able to complete the track. With a 0.05-meter inflation radius, the robot can probably get too close to the obstacles, which makes it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed parameters when the inflation radius is over 0.05 is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points of the track. In general, the parameter combination of a speed of 0.50 to 0.60 m/s and an inflation radius of 0.15 to 0.45 offers the best results in complex scenarios.

As a summary, for open and wide areas, speed and inflation radius do not affect the performance of the tracking activity. However, under nonideal conditions, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot at 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.
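For quick reference, the recommended ranges can be expressed as a lookup keyed on layout complexity; deciding whether a room counts as open or complex is left to the deployer, and the helper below is our summary rather than part of the system:

```python
def recommended_params(layout_is_complex):
    """Speed (m/s) and inflation radius (m) ranges suggested by the tests."""
    if layout_is_complex:  # narrow paths or complex obstacle shapes
        return {"max_vel_x": (0.50, 0.60), "inflation_radius": (0.15, 0.45)}
    # Open or wide rooms: any tested value works, up to the 0.60 m/s cap.
    return {"max_vel_x": (0.40, 0.60), "inflation_radius": (0.05, 0.45)}
```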

6.3. Activity Recognition Results. The experimental results show that, if robust visual tracking and robot based subject tracking are provided, it is possible for the mobile robot to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data on the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98%; however, those models required training data and tuning for each specific subject, which is difficult to handle every time the subject changes. Moreover, those methods, like state-of-the-art models in general, require whole body contour features for walking or dynamic activity detection. With the proposed model, the activities can be recognized even under limited conditions such as partial occlusion.

Using the proposed model with simple person specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all tested activities, comprising 12000 frames, even without using a large amount of training data. The proposed algorithm does not require color image based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.
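The role of the global position in separating dynamic-in-place activities from walking can be sketched as follows; the threshold values and the body-motion measure are illustrative assumptions, not the paper's exact rules:

```python
def classify_dynamic(body_motion, displacement_1s,
                     motion_thresh=0.05, disp_thresh=0.2):
    """body_motion: mean per-frame motion of the tracked body (m);
    displacement_1s: displacement on the global map over ~1 s (m)."""
    if body_motion < motion_thresh:
        return "static"       # standing, sitting, lying down
    if displacement_1s >= disp_thresh:
        return "walking"      # dynamic and translating on the map
    return "cycling"          # dynamic but fixed at one global location
```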

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under these light conditions, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject to robot distance is less than 3.6 meters. Since the reliability of color based visual tracking decreases with distance and illumination, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared sensor based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments presented in this paper make it possible to determine the best speed and obstacle inflation radius parameters for the tracking process. For simple scenarios, the impact of these two parameters is not relevant, but for more complex scenarios the robot fails to track the subject for some parameter combinations. For these types of scenarios, speeds of 0.5 and 0.6 m/s offered better results in the tests. Increasing the speed over these values provokes the collision of the robot with the subject; decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to offer better results.

The new localization assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in cases where the lower body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity for different subjects and walking patterns.

In summary, the contributions of this paper can be stated as: (i) integration of a new localization assisted activity recognition algorithm in a mobile robot that tracks a subject; (ii) analysis of how the illumination conditions and robot motion parameters affect the tracking; and (iii) identification of the optimal conditions for robot based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot to subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro contributed equally to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: HealthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.
[31] Costmap_2d, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.



and lying down yielded 10000 accuracy without any falserecognition since these activities were very stable due to theirstatic nature Bending had 3 false recognition cases among the1000 frames and this is probably because of the large amountof motion and the change in pose and height during eachbending activity which might have caused misclassificationto similar or transition activities such as sitting or standingand their transition states Again with the assistance oflocalization information activity on cycling machine couldbe recognized at 10000 accuracy Also falling activity wasrecognized with more than 99 accuracy which is crucial todetect this activity on emergency actions for the people livingalone And over all recognition rate was obtained as 9980from 6000 frames for the tested activities shown in Table 11

In addition to these results in Figure 22 two cases withocclusion of the lower limbs were given and these twoactivities were correctly classified by the proposed activityrecognition model Many algorithms including our previousimplementations [5 8] handle the activity recognition bytaking advantage of features extracted from lower limbs andupper body However when there is an occlusion on lowerlimbs activities such as walking can not be recognized accu-rately not being able to extract any observable features fromlower limbs [5 8] Therefore our proposed model makes fulluse of the tracking position height of the tracked subject andglobal location of the tracked subject as explained inmethod-ology section for activity recognition (Section 25) With theinformation even though there is not clear information aboutlower limb joints (Figure 22(a)) or the partial occlusion onboth upper body and lower body (Figure 22(b)) walkingactivity could be detected accurately only if we know that theheight of a subject is comparable to the height of walking poseand the subjectrsquos global position is changing on global mapAnd our walking tests in Table 10 demonstrate that walkingcan be recognized with high accuracy using person specificfeatures and localization assistance In sum it can be seen thatthe proposed system can handle occlusion situations whenthe subject is accurately tracked as in the examples shown inFigure 22

6 Discussion

This section provides an analysis of the results presented inthe previous experiments the illumination and the speed andobstacle inflation radius results in order to determine the bestsetup for the system

The Scientific World Journal 19

Sitting

(a)

Walking

(b)

Figure 22 Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities

61 Visual Tracking and Illumination Results First of allthe experiments for the visual tracking demonstrate thatintegration of the depth information to handle the illumi-nation variations improved the tracking results obviously Itmeans that multiple sensory information fusion is requiredfor a robust indoor tracking to achieve at-home monitoringmobile robot Hence to be able to handle various illumina-tions different sensor types can support the system such asthe infrared sensors in the case of low illumination or darkroom conditions

Based on the results of the performed tests it is possibleto state that color informationmay not be reliable all the timeduring real-time tasks considering the subject that is trackedby the mobile robot during daily activities So it is beneficialto find out limitations of the color tracker used in our systemand safe regions to utilize the visual tracker relying on colorconsidering the distance and illumination conditions

In sum this experimental analysis concludes that somecertain conditions should be provided at the environment fora reliable color region based visual tracking algorithm (1) theillumination condition on the subject to be tracked should begreater than 100 or 200 lux for better results and (2) for thegiven lux and distance specifications as in Figure 12 if thedistance between robot and subject is able to keep the intervalless than 36 meters there is high chance of successful andcontinuous robot tracking supported by the visual trackingalgorithm

62 Speed and Obstacle Inflation Radius Results The speedof the robot and the inflation radius are two importantparameters to be consideredwhen performing the tracking ofthe subject and the results of the activitymay differ dependingon the layout of the room There are no studies that analyzeparameters that may impact the behaviour of a robot whenperforming a vision tracking activityTherefore the objectiveof the experiments related to those parameters was to identifythe speed range and the obstacle inflation radius where thetracking process is more stable with the given constraints ofvisual tracking algorithm

The speed parameter is needed to be high enough to beable not to lose the subject but slow enough not to make

the robot collide with the obstacles andor the subject Whenthe speed of the robot is high the robot covers more distancewhile stopping This distance while the robot is deceleratingcan be not enough and may cause the collision with thesubject or even with an obstacle Using the results presentedin Table 5 it is possible to determine the maximum robotspeed to prevent the collision with the subject The collisionis inevitable when the max vel x parameter takes values over065 Based on this results the robot speed must be setat 060ms or even lower than that to prevent this fromhappening The obstacle inflation radius has no influence onthe experiments

Experiments described in Sections 44 and 45 study theperformance of the tracking system based on the speed andthe obstacle inflation radius With the results of these testsit is possible to compare the path followed by the robot andto check if it is able to complete the specific track of theexperiments

As it has been mentioned nonstopping tests do notrepresent the normal behaviour of the activity performedby humans at home Table 6 shows the results of thesenonstopping tests As it is presented the rate of success isquite low especially when the speed is set to 100 and 150ms

It is possible to observe how the inflation radius affectsthe path followed by the robot The influence of the inflationradius is presented in Figure 23 For both maps the speed ofthe robot is set to 15ms The map on the left presents thetrack of the robot for an obstacle inflation radius value of005 and the one on the right for a value of 035 As it canbe appreciated the track is softer when the inflation radius islower At the map of the right the curves when the robot isturning are sharp bends A softer path makes the robot lesssuitable for loosing the track or the localization in the mapThis is the reason why the results at Table 6 present goodresults for an inflation radius value of 005 but the results getworse when this value is increased

Another interesting result presented in Table 6 is the timeto complete the track when the speed is set to 050ms Thetime values are higher for this value even when the subjectis always walking at the same speed Because the speed ofthe robot is not high enough the robot sometimes got stuck

20 The Scientific World Journal

Figure 23 Track for 15ms speed and 005035 of inflation radius

for a few seconds The subject has to wait for the robot to bereleased and that results in increasing the required time tocomplete the track

The situation when the subject stops is different from theone previously exposed People at home are used to walk andstop when walking in a roomThe walk to a desk and stop foramoment to pick something and then start walking again andso on This is the situation represented by these experimentsand this situation is more favorable for the tracking processAs the user stops for a short time the robot can use that breakto recover the lost distance and get a better position in theroom This allows for the use of lower speeds by increasingthe reaction time of the robot

In this part we have analyzed three different scenariosrepresenting three layouts with different characteristics In anopen layout Figure 20mdashlayout 1 when there are no obstaclesin the middle of the room there is no restriction for the speedof the robot the only limit is the maximum speed of 060msthat prevents the robot from crashing with the subject Therobot is able to perform its activity without any problemMoving in the roomcenter freely the speed or inflation radiusdo not have any effect on the performance The same resultsare obtained when the robot has a wide space tomove aroundan obstacle (Figure 20mdashlayout 2) The robot can follow thesubject and the success in all the tests states that any value inthe range of the one tested provides good results

When there is not enough space to move andor whenthe shape of the obstacles is complex the speed and infla-tion radius for the robot motion have to be considered toguarantee the success of the tracking process As a result ofthe experiments only speeds over 040ms lead to successValues around 050ms and inflation radius over 015 increasethe success rate of the tracking activity even if it does notprovide 100 success

The exception for this speed range was when the inflationradius was set to 005 meters Using this low value the robotwas not able to complete the track Probably with 005metersinflation ration the robot can get close to the obstacles andso this may make it easier for the robot to get stuck duringsubject tracking task Indeed the reason why the robot couldnot complete the track was always that the robot got stuckduring navigation

The success rate for these two speed parameters whenthe inflation radius is over 005 is 5 tests out of 8 At thesespeeds the robot looks quite stable and the only reasons notto complete the track are that if it was stuck in some pointsof the track But generally the parameter combination with

a speed of 050ndash60ms and an inflation radius 015ndash045 offerthe best results in complex scenarios

As a summary for open and wide areas speed andinflation radius do not affect the performance of the trackingactivity But for the not ideal conditions if the room containsa narrow path of complex obstacles the value of theseparameters should be restricted to the range proposed in thispaperThe results of the test show that setting the speed of therobot at 05 or 06ms and the inflation radius between 015and 045 generates better results

63 Activity Recognition Results With mobile robot if arobust visual tracking and robot based subject tracking isprovided the experimental results depict that it is possible toachieve activity recognition for different subjects andwalkingpatterns using the same simple heuristics which is crucial torecord reliable data of the subjects

Our previous works [5 8] had activity recognition perfor-mances over 98 with high accuracy however these modelsrequired training data and tuning for each specific subjectwhich is difficult to handle each time when the subject ischanged Also these calculations including the state-of-the-art models and that generally requires whole body contourfeatures for walking or dynamic activity detection But withthe proposed model the activities can be recognized underlimited conditions such as partial occlusion

Using the proposed model with simple person specificheuristics and localization assistance we were able to obtain98 activity recognition accuracy for all activities tested by12000 frames even without using large amount of trainingdata The proposed algorithm does not require color imagebased feature extractionwhich is also important by providingrobust activity recognition under changing environmentalconditions Moreover dynamic activities on static positionssuch as cycling on cycling machine can be detected alsowith the localization assistance provided by the subject globalposition

It can be concluded that the proposed activity recognitionalgorithm is a reliable approach to record the walking activitycases for the purpose of further analysis as an assistivetechnology to at-home biomonitoring systems

7 Conclusion

This paper has presented three main contributions theproposal of a new activity recognition algorithm the effectof illumination conditions in the visual tracking process andthe impact of parameters related to robot motion and pathpanning in the robot navigation behaviour when tracking asubject

Based on the illumination tests it has been proven thatthe ambient illumination also affects the behaviour of therobot The better results are obtained with illuminations of100ndash200 lux or more When this light condition is presentedthe robot can perform the tracking process with high chancesof success In addition to the light condition the distancewhich also effects the texture details obtained from the sensoris an important factor and better results are observed in theconditions that the subject to robot distance is less than 36

The Scientific World Journal 21

meters Since the reliability of color based visual tracking isdecreasing due to distance and illumination factors a betterapproach for visual tracking task to work with changingillumination conditions should be modeled for example byswitching from color to infrared sensor based tracking Inaddition to visual tracking robot existence as a assistivetechnology and its behaviour indoor environment shouldbe studied qualitatively concerning the human feelings andpsychological response to use the mobile robot systems

Analyzing robot parameters for the scenarios used forthe experiments presented in this paper it has been shownthat the two analyzed robot parameters speed and inflationradius can affect the behaviour of the robot while the robotis actively tracking the subject with visual tracking modelThe experiments presented in this paper have to determinethe best speed and obstacle inflation radius parameters forthe tracking process For simple scenarios the impact ofthese two parameters is not relevant but for more complexscenarios the robot fails while tracking the subject for someparameters combinations For these types of scenarios thetests speeds of 05 and 06ms offered better results Increas-ing the speed over these values provokes the collision of therobot with the subject Decreasing the speed can make iteasier for the robot to get stuck At the recommended speedsthe obstacle inflation radius must be greater than or equal to015 to offer better results

The new localization assisted activity recognition algo-rithm has been integrated with the robot platform Thisalgorithm is faster andmore robust and it is effective in thosecases where the lower limb body is not visible It also does notrequired training data to operate The results are promisingfor being able to detect walking activity for different subjectsand walking patterns

In summary the contribution of this paper can be statedas (i) integration of a new localization assisted activityrecognition algorithm in a mobile robot that tracks a subject(ii) analyzing how the illumination conditions and robotmotion parameters affect the tracking and (iii) finding theoptimal conditions for robot based subject trackingwhere theconditions can be listed as (a) quantitative light conditions(lux value) (b) robot to subject distances (c) robot speedand (d) map inflation radius during path planning and robotmotion

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012

[4] N K Suryadevara and S C Mukhopadhyay ldquoWireless sensornetwork based home monitoring system for wellness determi-nation of elderlyrdquo IEEE Sensors Journal vol 12 no 6 pp 1965ndash1972 2012

[5] N Myagmarbayar Y Yuki N Imamoglu J Gonzalez MOtake and W Yu ldquoHuman body contour data based activityrecognitionrdquo in Proceedings of the 35th Annual InternationalConference of the IEEE Engineering in Medicine and BiologySociety (EMBC rsquo13) pp 5634ndash5637 July 2013

[6] H-M Gross C Schroeter S Mueller et al ldquoProgress indeveloping a socially assistive mobile home robot companionfor the elderly with mild cognitive impairmentrdquo in Proceedingsof the IEEERSJ International Conference on Intelligent Robotsand Systems pp 2430ndash2437 September 2011

[7] T Tamura A Kawarada M Nambu A Tsukada K Sasakiand K-I Yamakoshi ldquoE-healthcare at an experimental welfaretechno house in Japanrdquo The Open Medical Informatics Journalvol 1 no 1 pp 1ndash7 2008

[8] M Nergui Y Yoshida N Imamoglu J Gonzalez M Otakeand W Yu ldquoHuman activity recognition using body contourparameters extracted from depth imagesrdquo Journal of MedicalImaging and Health Informatics vol 3 no 3 pp 455ndash461 2013

[9] MOgawa S Ochiai K ShojiMNishihara and T Togawa ldquoAnattempt ofmonitoring daily activities at homerdquo in Proceedings ofthe 22nd Annual International Conference of the IEEE Engineer-ing inMedicine and Biology Society vol 1 pp 786ndash788 ChicagoIll USA July 2000

[10] A Fleury M Vacher and N Noury ldquoSVM-based multimodalclassification of activities of daily living in health smart homessensors algorithms and first experimental resultsrdquo IEEE Trans-actions on Information Technology in Biomedicine vol 14 no 2pp 274ndash283 2010

[11] T B Moeslund A Hilton and V Kruger ldquoA survey of advancesin vision-based humanmotion capture and analysisrdquo ComputerVision and Image Understanding vol 104 no 2-3 pp 90ndash1262006

[12] L Chen J Hoey C D Nugent D J Cook and Z Yu ldquoSensor-based activity recognitionrdquo IEEE Transactions on Systems Manand Cybernetics Part C Applications and Reviews vol 42 no 6pp 790ndash808 2012

[13] M Zhang and A A Sawchuk ldquoHuman daily activity recogni-tion with sparse representation using wearable sensorsrdquo IEEEJournal of Biomedical and Health Informatics vol 17 no 3 pp553ndash560 2013

[14] A Pantelopoulos and N G Bourbakis ldquoA survey on wearablesensor-based systems for health monitoring and prognosisrdquoIEEE Transactions on Systems Man and Cybernetics Part CApplications and Reviews vol 40 no 1 pp 1ndash12 2010

[15] N Ravi N Dandekar P Mysore and M L Littman ldquoActivityrecognition from accelerometer datardquo in Proceedings of the

22 The Scientific World Journal

17th Innovative Applications of Artificial Intelligence Conference(IAAI rsquo05) pp 1541ndash1546 July 2005

[16] A Scalmato A Sgorbissa and R Zaccaria ldquoDescribing andrecognizing patterns of events in smart environments withdescription logicrdquo IEEE Transactions on Cybernetics vol 43 no6 pp 1882ndash1897 2013

[17] O Brdiczka P Reignier and J Crowley ldquoDetecting individualactivities from video in a smart homerdquo in Proceedings ofthe International Conference on Knowledge-Based IntelligentInformation and Engineering Systems pp 363ndash370 2007


Table 7: Robot tracking layout 1 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05    0.15    0.25    0.35    0.45
0.60                                  Yes     Yes     Yes     Yes     Yes
0.50                                  Yes     Yes     Yes     Yes(4)  Yes(4)
0.40                                  Yes     Yes     Yes(4)  Yes(4)  Yes(4)

*The index (4) indicates that the robot got stuck but recovered.

Table 8: Robot tracking layout 2 (stopping tests).

Speed (m/s) \ Inflation radius (m)    0.05    0.15    0.25    0.35    0.45
0.60                                  Yes     Yes     Yes     Yes     Yes
0.50                                  Yes     Yes     Yes     Yes     Yes
0.40                                  Yes     Yes     Yes     Yes     Yes

Table 9: Robot tracking layout 3 (stopping tests)*.

Speed (m/s) \ Inflation radius (m)    0.05    0.15    0.25    0.35    0.45
0.60                                  No(1)   Yes(4)  Yes     No(1)   Yes
0.50                                  No(1)   No(1)   Yes(4)  Yes(4)  Yes
0.40                                  No(1)   No(1)   No(1)   No(1)   No(1)

*The index (4) indicates that the robot got stuck but recovered. The index (1), for incomplete-track cases, indicates that the robot got stuck and failed to track.

For the first layout (Figure 20), as presented in Table 7, the results show that the robot has no problem tracking the subject under any combination of speed and inflation radius. The main reason for these good results is that, in this kind of layout, the robot can move in the center of the room: it is not forced to follow the same path as the subject but can track him/her simply by moving in the center of the room, avoiding most of the obstacles that could cause a collision. The blue lines in Figure 21 show the path followed by the robot while tracking the moving subject, whose path is marked with green lines.

Figure 21: Track for layout 1 and robot path during the experiment.

The second layout (Figure 20) presents a scenario with an obstacle in the middle of the room, representing furniture such as a table or a desk. With this layout we prevent the robot from keeping a comfortable position in the middle of the room, as in the first layout, and thereby force it to move around the room following the subject. In this layout the empty space around the obstacle is large, which allows the robot to move around without serious difficulties. In some of the tests under this condition the robot got stuck, but it could recover within the given 30 seconds.

The third layout (Figure 20) represents the most restrictive scenario, where the robot is forced to follow a narrow path while tracking the subject. The shapes of the corners of the obstacle also present an extra difficulty for the robot. The lowest speed tested was 0.4 m/s; at this speed, none of the tests succeeded, independent of the inflation radius value. The robot failed to complete the track because it got stuck during tracking, especially when entering the corner regions, which prevented it from finding a free path to move. In this layout, faster speeds increased the success ratio of the experiment: of the 12 tests made, 6 presented good results, while in the 8 remaining ones the robot was not able to complete the 10 rounds of the track. As presented in Table 9, the results are not as good as in the previous experiments; the narrow paths and obstacles drastically reduce the success rate.

5. Experiments on the Activity Recognition Algorithm

After completing the visual and robot tracking experiments, this section finally describes the experiments on activity recognition, in particular walking activity detection, for data recording to support possible health care assistance to the subjects. For the experiments, data were first recorded from two subjects performing their normal walking and two types of simulated walking patterns, impaired and elderly walking, where the elderly and impaired walking conditions were simulated by wearing tools, as demonstrated in Figure 10.

For each case, 1000 frames of visual tracking data, in which the subjects walked or stood inside the room continuously, were used to test the activity recognition algorithm, as given in Table 1. In our work, walking activity detection has higher priority than the other activities, since walking patterns can then be analyzed whenever necessary, which is especially relevant for at-home health care support of elderly and impaired people. It is important that the data be recorded during the subjects' walking activity, and that this be achieved robustly for different subjects and walking patterns. Hence, in this part, we tested the recognition rate of the algorithm for the different walking patterns separately from the other activities, even though including the static activities would increase the accuracy, given the high detection rate of static actions.

Table 10: Activity recognition results for different types of walking patterns.

Subject      Type    True    False    Accuracy (%)
Subject 1    IW      984     16       98.40
             EW      974     26       97.40
             NW      947     53       94.70
Subject 2    IW      976     24       97.60
             EW      963     37       96.30
             NW      931     69       93.10
Overall              5775    225      96.25

IW: impaired walking; EW: elderly walking; NW: normal walking.

In the experiments, the person-specific parameters for activity recognition were initially decided for subject 1 and the impaired walking case, and these selected parameters were then applied to all cases without any change. Impaired walking (IW) is the slowest motion among all cases, so the velocity parameter is defined according to IW, where it is given as 20 cm per 10 frames, that is, around 0.2 m/s for the current system.
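To make this concrete, the following minimal Python sketch applies the person-specific velocity threshold to the subject's global position produced by the localization module. It is illustrative only: the class and function names are our own, not the authors' implementation; only the 20 cm per 10 frames threshold comes from the text above.

```python
import math
from collections import deque

WINDOW_FRAMES = 10   # displacement is measured over this many frames
MIN_DISP_M = 0.20    # person-specific threshold: 20 cm per 10 frames (~0.2 m/s)

class WalkingDetector:
    """Classifies walking versus standing from the subject's global (x, y) track."""

    def __init__(self, window=WINDOW_FRAMES, min_disp=MIN_DISP_M):
        self.positions = deque(maxlen=window)  # recent global positions in meters
        self.min_disp = min_disp

    def update(self, x, y):
        """Feed one localization sample and return 'walking' or 'standing'."""
        self.positions.append((x, y))
        if len(self.positions) < self.positions.maxlen:
            return 'standing'  # not enough history yet
        x0, y0 = self.positions[0]
        displacement = math.hypot(x - x0, y - y0)
        return 'walking' if displacement >= self.min_disp else 'standing'
```

Because the decision depends only on the mapped global position, the same threshold transfers across room layouts; as the false-recognition analysis below indicates, errors then concentrate in the transition frames where the displacement hovers around the threshold.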

Table 10 gives the activity recognition results for the recorded data described in Table 1. The results show that the walking condition can be recognized with high accuracy for various walking patterns, even though the same heuristics were used for all cases. For both subjects, the impaired walking tests have the best accuracy in distinguishing standing from walking activities, around 98%. Normal walking had the lowest accuracies, around 93% and 95%; however, these results are still acceptable, and the overall accuracy over all data is more than 96%, which is quite high considering that the same heuristics were used in each case.

Moreover, our analysis shows that false recognitions generally occur during transition states, such as (i) the start of walking from standing, (ii) the transition from walking to standing, and (iii) rotation of the subjects around themselves with large position changes. On the other hand, the accuracy of correctly detected walking activity is more than 99%, and almost 100% when the subjects are in a state of continuous walking motion without transitions from one state to another. Therefore, it can be concluded that the proposed activity recognition algorithm is a reliable approach to record walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

In addition, we also conducted experiments on other activities, including standing, sitting, bending, lying down, cycling, and falling, where falling, like walking, is another important case in at-home monitoring applications. In all these experiments, since activities such as standing and sitting were static cases without any motion (except the falling activity), we started recognition once the activity was in its steady state, excluding the transition periods between activities. For example, the frames during the transitions from standing to sitting, standing to bending, and so forth were not counted, because these transitions were not the actual activity of the standing, sitting, bending, or lying down cases. For the falling case, the activity was classified as falling continuously until the subject moved and changed the activity state to another activity.

In Table 11, the recognition results for various activities are given, where 1000 frames were tested for each activity.

Table 11: Activity recognition results for different activities.

Activity      True    False    Accuracy (%)
Standing      1000    0        100.00
Sitting       1000    0        100.00
Bending       997     3        99.70
Lying down    1000    0        100.00
Cycling       1000    0        100.00
Falling       991     9        99.10
Overall       5988    12       99.80

As can be seen, static behaviors such as standing, sitting, and lying down yielded 100.00% accuracy without any false recognition, since these activities are very stable due to their static nature. Bending had 3 false recognition cases among the 1000 frames, probably because of the large amount of motion and the changes in pose and height during each bending activity, which might have caused misclassification into similar or transition activities such as sitting or standing and their transition states. Again, with the assistance of localization information, activity on the cycling machine could be recognized at 100.00% accuracy. Falling was also recognized with more than 99% accuracy, which is crucial for triggering emergency actions for people living alone. The overall recognition rate was 99.80% over the 6000 frames of the tested activities shown in Table 11.

In addition to these results, Figure 22 shows two cases with occlusion of the lower limbs, both of which were correctly classified by the proposed activity recognition model. Many algorithms, including our previous implementations [5, 8], handle activity recognition by taking advantage of features extracted from the lower limbs and upper body. However, when the lower limbs are occluded, activities such as walking cannot be recognized accurately, since no observable features can be extracted from the lower limbs [5, 8]. Therefore, our proposed model makes full use of the tracking position, the height of the tracked subject, and the global location of the tracked subject, as explained in the methodology section for activity recognition (Section 2.5). With this information, even when there is no clear information about the lower limb joints (Figure 22(a)), or there is partial occlusion of both the upper body and lower body (Figure 22(b)), walking activity can be detected accurately as long as the height of the subject is comparable to the height of the walking pose and the subject's global position is changing on the global map. Our walking tests in Table 10 demonstrate that walking can be recognized with high accuracy using person-specific features and localization assistance. In sum, the proposed system can handle occlusion situations as long as the subject is accurately tracked, as in the examples shown in Figure 22.
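The occlusion tolerance follows from the fact that only two quantities are ever consulted: the subject's overall tracked height and the global displacement. The sketch below restates the rule in Python; the height tolerance is an assumed illustrative parameter, not a value reported in this paper.

```python
def classify_frame(tracked_height_m, global_disp_m, walking_height_m,
                   height_tol_m=0.15, min_disp_m=0.20):
    """Occlusion-tolerant walking test using height plus global displacement.

    tracked_height_m : current subject height from the RGB-D tracker (meters)
    global_disp_m    : displacement on the global map over the frame window
    walking_height_m : person-specific height of the walking/standing pose
    height_tol_m     : assumed tolerance; would be tuned per subject
    """
    upright = abs(tracked_height_m - walking_height_m) <= height_tol_m
    moving = global_disp_m >= min_disp_m
    if upright and moving:
        return 'walking'     # holds even when the lower limbs are occluded
    if not moving:
        return 'static'      # defer to the static-activity heuristics
    return 'other'
```

Since neither test touches limb contours, the partial occlusions of Figure 22 leave the decision unchanged as long as the tracker still reports the subject's position and height.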

6. Discussion

This section provides an analysis of the results of the previous experiments, namely the illumination results and the speed and obstacle inflation radius results, in order to determine the best setup for the system.

Figure 22: Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the visual tracking experiments demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that the fusion of multiple sensory information is required for robust indoor tracking by an at-home monitoring mobile robot. Hence, to be able to handle various illuminations, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. It is therefore beneficial to identify the limitations of the color tracker used in our system and the safe regions in which a color-based visual tracker can be used, considering the distance and illumination conditions.

In sum, this experimental analysis concludes that certain environmental conditions should be ensured for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100–200 lux for better results, and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject can be kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
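These two constraints amount to a simple reliability gate in front of the color tracker. The following sketch encodes them; the fallback tracker name is a placeholder for the kind of depth/infrared-based alternative suggested above, not part of the authors' system.

```python
MIN_LUX = 100.0      # lower bound of the 100-200 lux range found above
MAX_RANGE_M = 3.6    # maximum reliable robot-to-subject distance in meters

def color_tracking_reliable(lux, distance_m):
    """True when the color-region tracker operates inside its safe envelope."""
    return lux >= MIN_LUX and distance_m <= MAX_RANGE_M

def select_tracker(lux, distance_m):
    """Prefer color tracking in its safe region, else a depth/IR fallback."""
    return 'color' if color_tracking_reliable(lux, distance_m) else 'depth_ir'
```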

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the inflation radius are two important parameters to be considered when tracking the subject, and the results may differ depending on the layout of the room. To the best of our knowledge, there are no studies that analyze the parameters that may affect the behaviour of a robot performing a visual tracking activity. Therefore, the objective of the experiments related to those parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough for the robot not to collide with obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping; this deceleration distance may not be sufficient and may cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. Collision is inevitable when the max_vel_x parameter takes values over 0.65; based on these results, the robot speed must be set at 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.
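The reasoning here is the usual braking-distance relation d = v^2 / (2a): doubling the speed quadruples the distance covered while decelerating. The deceleration used in the sketch below is an assumed illustrative value, since the paper does not report the platform's braking limit:

```python
def stopping_distance_m(speed_mps, decel_mps2=0.3):
    """Distance covered while braking from speed_mps at constant deceleration.

    decel_mps2 is an assumed value; substitute the platform's real limit
    (e.g., the local planner's acc_lim_x) when it is known.
    """
    return speed_mps ** 2 / (2.0 * decel_mps2)

# Why capping the speed matters: braking distance grows quadratically.
for v in (0.40, 0.50, 0.60, 0.65, 1.00, 1.50):
    print(f"{v:.2f} m/s -> {stopping_distance_m(v):.2f} m to stop")
```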

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the paths followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned above, nonstopping tests do not represent the normal behaviour of humans at home. Table 6 shows the results of these nonstopping tests. As presented there, the rate of success is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is also possible to observe how the inflation radius affects the path followed by the robot; its influence is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s. The map on the left presents the track of the robot for an obstacle inflation radius of 0.05, and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower; in the map on the right, the curves where the robot turns are sharp bends. A smoother path makes the robot less likely to lose the track or its localization in the map. This is why the results in Table 6 are good for an inflation radius of 0.05 but get worse when this value is increased.

Another interesting result presented in Table 6 is the time needed to complete the track when the speed is set to 0.50 m/s. The time values are higher at this speed, even though the subject always walks at the same pace: because the speed of the robot is not high enough, the robot sometimes gets stuck for a few seconds, and the subject has to wait for the robot to be released, which increases the time required to complete the track.

Figure 23: Track for 1.5 m/s speed and 0.05/0.35 inflation radius.

The situation when the subject stops is different from the one previously described. People at home usually walk and stop when moving around a room: they walk to a desk, stop for a moment to pick something up, then start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and reach a better position in the room. This allows the use of lower speeds by increasing the reaction time available to the robot.

In this part we have analyzed three different scenarios representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from crashing into the subject. The robot is able to perform its activity without any problem, and since it moves freely in the room center, the speed and inflation radius do not have any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2): the robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. As a result of the experiments, only speeds over 0.40 m/s lead to success; values around 0.50 m/s and inflation radii over 0.15 increase the success rate of the tracking activity, even if they do not provide 100% success.

The exception within this speed range was when the inflation radius was set to 0.05 meters. With this low value the robot was not able to complete the track. Probably, with a 0.05-meter inflation radius, the robot can get too close to the obstacles, which may make it easier for the robot to get stuck during the subject tracking task. Indeed, the reason why the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed values, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot looks quite stable, and the only reason for not completing the track is getting stuck at some points along it. In general, the parameter combination of a speed of 0.50–0.60 m/s and an inflation radius of 0.15–0.45 offers the best results in complex scenarios.

In summary, for open and wide areas, the speed and inflation radius do not affect the performance of the tracking activity. But under nonideal conditions, if the room contains narrow paths or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper. The results of the tests show that setting the speed of the robot to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 generates better results.
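On a ROS navigation stack like the one used in this work, these recommendations map onto the local planner's speed cap and the costmap inflation radius. The snippet below is a sketch assuming the standard move_base/TrajectoryPlannerROS and costmap parameter names, not the authors' actual launch configuration:

```python
import rospy

# Recommended by the experiments above: cap speed at 0.5-0.6 m/s and keep the
# inflation radius in the 0.15-0.45 m band for rooms with narrow paths or
# complex obstacles. Set these before move_base starts so they are picked up.
rospy.set_param('/move_base/TrajectoryPlannerROS/max_vel_x', 0.6)
rospy.set_param('/move_base/local_costmap/inflation_radius', 0.25)
rospy.set_param('/move_base/global_costmap/inflation_radius', 0.25)
# On newer, layered costmaps the parameter lives under
# .../inflation_layer/inflation_radius instead.
```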

6.3. Activity Recognition Results. Provided that robust visual tracking and robot-based subject tracking are available on the mobile robot, the experimental results show that it is possible to achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data on the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98% with high accuracy; however, those models required training data and tuning for each specific subject, which is difficult to redo every time the subject changes. Moreover, those approaches, like most state-of-the-art models, generally require whole-body contour features to detect walking or other dynamic activities. With the proposed model, in contrast, the activities can be recognized even under limited conditions such as partial occlusion.

Using the proposed model, with simple person-specific heuristics and localization assistance, we were able to obtain 98% activity recognition accuracy over all activities tested on 12,000 frames, even without a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities at static positions, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of parameters related to robot motion and path planning on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illuminations of 100–200 lux or more; under such light conditions, the robot can perform the tracking process with high chances of success. In addition to the light condition, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color-based visual tracking decreases with distance and illumination, a better approach for the visual tracking task under changing illumination conditions should be modeled, for example, by switching from color to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, concerning the human feelings and psychological responses to using mobile robot systems.

Analyzing the robot parameters for the scenarios used in the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it actively tracks the subject with the visual tracking model. The experiments presented in this paper serve to determine the best speed and obstacle inflation radius for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but for more complex scenarios the robot fails while tracking the subject under some parameter combinations. For these types of scenarios, tested speeds of 0.5 and 0.6 m/s offered better results: increasing the speed above these values provokes collisions of the robot with the subject, while decreasing it makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to obtain better results.

The new localization-assisted activity recognition algorithm has been integrated into the robot platform. This algorithm is faster and more robust, and it is effective in cases where the lower limbs are not visible. It also does not require training data to operate. The results are promising for detecting walking activity across different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how illumination conditions and robot motion parameters affect the tracking, and (iii) the identification of the optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22–27, 2005.

[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789–1793, 2011.

[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: HealthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837–1842, Rome, Italy, June 2012.

[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965–1972, 2012.

[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634–5637, July 2013.

[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430–2437, September 2011.

[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1–7, 2008.

[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455–461, 2013.

[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786–788, Chicago, Ill, USA, July 2000.

[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274–283, 2010.

[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90–126, 2006.

[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, 2012.

[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553–560, 2013.

[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1–12, 2010.

[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541–1546, July 2005.

[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882–1897, 2013.

[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363–370, 2007.

[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976–990, 2010.

[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318–329, 2013.

[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1–12, 2013.

[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.

[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579–590, 2013.

[23] ROSARIA, http://wiki.ros.org/ROSARIA.

[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34–46, 2007.

[25] Gmapping, http://wiki.ros.org/gmapping.

[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.

[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52–61, 2012.

[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300–307, IEEE, May 2010.

[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1–5, Riyadh, Saudi Arabia, April 2011.

[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55–60, Arequipa, Peru, October 2013.

[31] Costmap, http://wiki.ros.org/costmap_2d.

[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.

[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23–33, 1997.

[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 18: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

18 The Scientific World Journal

Table 10 Activity recognition results for different types of walkingpatterns

Type True False AccuracyIW 984 16 9840

Subject 1 EW 974 26 9740NW 947 53 9470IW 976 24 9760

Subject 2 EW 963 37 9630NW 931 69 9310

Overall performance 5775 225 9625

20 cm per 10 frames that is around 02ms for the currentsystem

Table 10 gives the activity recognition results for therecorded data described in Table 1 The results show thatwalking condition can be recognized with high accuracy forvarious walking patterns even though the same heuristicswere used for all cases In both subjects impaired walkingtests have the best accuracy on detecting the differencebetween standing andwalking activities around 98Normalwalking of subjects had the lowest accuracies around 93 and95 however these results are still in acceptable accuracyand overall accuracy for all data is more than 96 which isquite high considering the usage of same heuristics in eachcase

Moreover our analysis on false recognition generallyincludes the cases during transition states such as (i) startof walking from standing (ii) transition from walking tostanding condition and (iii) rotation of the subjects aroundthemselves with high position changes On the other handthe accuracy of correctly detected walking activity is morethan 99and is almost 100when the subjects are in the stateof continuous walking motion without transition conditionfrom one state to another Therefore it can be concludedthat the proposed activity recognition algorithm is a reliableapproach to record the walking activity cases for the purposeof further analysis as an assistive technology for at-homebiomonitoring systems

In addition we also made experiments on other activitiesincluding standing sitting bending lying down on cyclingand falling where falling is another important case suchas walking in at-home monitoring applications In all theseexperiments since the activities such as standing and sittingwere static cases without any motion except falling activitywe started recognition when the activity was in its statewithout transition periods as in changing from one activityto another For example the frames during transition fromstanding to sitting standing to bending and so forthwere notcount because these transitions were not the actual activity ofstanding sitting bending or lying down cases Also for thefalling case the activity was classified as falling continuouslyuntil the subjectmoved and changed the activity state to otheractivities

In Table 11 the recognition results for various activitieswere givenwhere 1000 frameswere tested for each activity Asit can be seen that static behaviors such as standing sitting

Table 11 Activity recognition results for different activities

Activity True False AccuracyStanding 1000 0 10000Sitting 1000 0 10000Bending 997 3 9970Lying down 1000 0 10000Cycling 1000 0 10000Falling 991 9 9910Overall 5988 12 9980

and lying down yielded 10000 accuracy without any falserecognition since these activities were very stable due to theirstatic nature Bending had 3 false recognition cases among the1000 frames and this is probably because of the large amountof motion and the change in pose and height during eachbending activity which might have caused misclassificationto similar or transition activities such as sitting or standingand their transition states Again with the assistance oflocalization information activity on cycling machine couldbe recognized at 10000 accuracy Also falling activity wasrecognized with more than 99 accuracy which is crucial todetect this activity on emergency actions for the people livingalone And over all recognition rate was obtained as 9980from 6000 frames for the tested activities shown in Table 11

In addition to these results in Figure 22 two cases withocclusion of the lower limbs were given and these twoactivities were correctly classified by the proposed activityrecognition model Many algorithms including our previousimplementations [5 8] handle the activity recognition bytaking advantage of features extracted from lower limbs andupper body However when there is an occlusion on lowerlimbs activities such as walking can not be recognized accu-rately not being able to extract any observable features fromlower limbs [5 8] Therefore our proposed model makes fulluse of the tracking position height of the tracked subject andglobal location of the tracked subject as explained inmethod-ology section for activity recognition (Section 25) With theinformation even though there is not clear information aboutlower limb joints (Figure 22(a)) or the partial occlusion onboth upper body and lower body (Figure 22(b)) walkingactivity could be detected accurately only if we know that theheight of a subject is comparable to the height of walking poseand the subjectrsquos global position is changing on global mapAnd our walking tests in Table 10 demonstrate that walkingcan be recognized with high accuracy using person specificfeatures and localization assistance In sum it can be seen thatthe proposed system can handle occlusion situations whenthe subject is accurately tracked as in the examples shown inFigure 22

6 Discussion

This section provides an analysis of the results presented inthe previous experiments the illumination and the speed andobstacle inflation radius results in order to determine the bestsetup for the system

The Scientific World Journal 19

Sitting

(a)

Walking

(b)

Figure 22 Sample robot system tracking and activity recognition results with partial occlusion cases for (a) sitting and (b) walking activities

61 Visual Tracking and Illumination Results First of allthe experiments for the visual tracking demonstrate thatintegration of the depth information to handle the illumi-nation variations improved the tracking results obviously Itmeans that multiple sensory information fusion is requiredfor a robust indoor tracking to achieve at-home monitoringmobile robot Hence to be able to handle various illumina-tions different sensor types can support the system such asthe infrared sensors in the case of low illumination or darkroom conditions

Based on the results of the performed tests it is possibleto state that color informationmay not be reliable all the timeduring real-time tasks considering the subject that is trackedby the mobile robot during daily activities So it is beneficialto find out limitations of the color tracker used in our systemand safe regions to utilize the visual tracker relying on colorconsidering the distance and illumination conditions

In sum this experimental analysis concludes that somecertain conditions should be provided at the environment fora reliable color region based visual tracking algorithm (1) theillumination condition on the subject to be tracked should begreater than 100 or 200 lux for better results and (2) for thegiven lux and distance specifications as in Figure 12 if thedistance between robot and subject is able to keep the intervalless than 36 meters there is high chance of successful andcontinuous robot tracking supported by the visual trackingalgorithm

62 Speed and Obstacle Inflation Radius Results The speedof the robot and the inflation radius are two importantparameters to be consideredwhen performing the tracking ofthe subject and the results of the activitymay differ dependingon the layout of the room There are no studies that analyzeparameters that may impact the behaviour of a robot whenperforming a vision tracking activityTherefore the objectiveof the experiments related to those parameters was to identifythe speed range and the obstacle inflation radius where thetracking process is more stable with the given constraints ofvisual tracking algorithm

The speed parameter is needed to be high enough to beable not to lose the subject but slow enough not to make

the robot collide with the obstacles andor the subject Whenthe speed of the robot is high the robot covers more distancewhile stopping This distance while the robot is deceleratingcan be not enough and may cause the collision with thesubject or even with an obstacle Using the results presentedin Table 5 it is possible to determine the maximum robotspeed to prevent the collision with the subject The collisionis inevitable when the max vel x parameter takes values over065 Based on this results the robot speed must be setat 060ms or even lower than that to prevent this fromhappening The obstacle inflation radius has no influence onthe experiments

Experiments described in Sections 44 and 45 study theperformance of the tracking system based on the speed andthe obstacle inflation radius With the results of these testsit is possible to compare the path followed by the robot andto check if it is able to complete the specific track of theexperiments

As it has been mentioned nonstopping tests do notrepresent the normal behaviour of the activity performedby humans at home Table 6 shows the results of thesenonstopping tests As it is presented the rate of success isquite low especially when the speed is set to 100 and 150ms

It is possible to observe how the inflation radius affectsthe path followed by the robot The influence of the inflationradius is presented in Figure 23 For both maps the speed ofthe robot is set to 15ms The map on the left presents thetrack of the robot for an obstacle inflation radius value of005 and the one on the right for a value of 035 As it canbe appreciated the track is softer when the inflation radius islower At the map of the right the curves when the robot isturning are sharp bends A softer path makes the robot lesssuitable for loosing the track or the localization in the mapThis is the reason why the results at Table 6 present goodresults for an inflation radius value of 005 but the results getworse when this value is increased

Another interesting result presented in Table 6 is the timeto complete the track when the speed is set to 050ms Thetime values are higher for this value even when the subjectis always walking at the same speed Because the speed ofthe robot is not high enough the robot sometimes got stuck

20 The Scientific World Journal

Figure 23 Track for 15ms speed and 005035 of inflation radius

for a few seconds The subject has to wait for the robot to bereleased and that results in increasing the required time tocomplete the track

The situation when the subject stops is different from theone previously exposed People at home are used to walk andstop when walking in a roomThe walk to a desk and stop foramoment to pick something and then start walking again andso on This is the situation represented by these experimentsand this situation is more favorable for the tracking processAs the user stops for a short time the robot can use that breakto recover the lost distance and get a better position in theroom This allows for the use of lower speeds by increasingthe reaction time of the robot

In this part we have analyzed three different scenariosrepresenting three layouts with different characteristics In anopen layout Figure 20mdashlayout 1 when there are no obstaclesin the middle of the room there is no restriction for the speedof the robot the only limit is the maximum speed of 060msthat prevents the robot from crashing with the subject Therobot is able to perform its activity without any problemMoving in the roomcenter freely the speed or inflation radiusdo not have any effect on the performance The same resultsare obtained when the robot has a wide space tomove aroundan obstacle (Figure 20mdashlayout 2) The robot can follow thesubject and the success in all the tests states that any value inthe range of the one tested provides good results

When there is not enough space to move andor whenthe shape of the obstacles is complex the speed and infla-tion radius for the robot motion have to be considered toguarantee the success of the tracking process As a result ofthe experiments only speeds over 040ms lead to successValues around 050ms and inflation radius over 015 increasethe success rate of the tracking activity even if it does notprovide 100 success

The exception for this speed range was when the inflationradius was set to 005 meters Using this low value the robotwas not able to complete the track Probably with 005metersinflation ration the robot can get close to the obstacles andso this may make it easier for the robot to get stuck duringsubject tracking task Indeed the reason why the robot couldnot complete the track was always that the robot got stuckduring navigation

The success rate for these two speed parameters whenthe inflation radius is over 005 is 5 tests out of 8 At thesespeeds the robot looks quite stable and the only reasons notto complete the track are that if it was stuck in some pointsof the track But generally the parameter combination with

a speed of 050ndash60ms and an inflation radius 015ndash045 offerthe best results in complex scenarios

As a summary for open and wide areas speed andinflation radius do not affect the performance of the trackingactivity But for the not ideal conditions if the room containsa narrow path of complex obstacles the value of theseparameters should be restricted to the range proposed in thispaperThe results of the test show that setting the speed of therobot at 05 or 06ms and the inflation radius between 015and 045 generates better results

63 Activity Recognition Results With mobile robot if arobust visual tracking and robot based subject tracking isprovided the experimental results depict that it is possible toachieve activity recognition for different subjects andwalkingpatterns using the same simple heuristics which is crucial torecord reliable data of the subjects

Our previous works [5 8] had activity recognition perfor-mances over 98 with high accuracy however these modelsrequired training data and tuning for each specific subjectwhich is difficult to handle each time when the subject ischanged Also these calculations including the state-of-the-art models and that generally requires whole body contourfeatures for walking or dynamic activity detection But withthe proposed model the activities can be recognized underlimited conditions such as partial occlusion

Using the proposed model with simple person specificheuristics and localization assistance we were able to obtain98 activity recognition accuracy for all activities tested by12000 frames even without using large amount of trainingdata The proposed algorithm does not require color imagebased feature extractionwhich is also important by providingrobust activity recognition under changing environmentalconditions Moreover dynamic activities on static positionssuch as cycling on cycling machine can be detected alsowith the localization assistance provided by the subject globalposition

It can be concluded that the proposed activity recognitionalgorithm is a reliable approach to record the walking activitycases for the purpose of further analysis as an assistivetechnology to at-home biomonitoring systems

7 Conclusion

This paper has presented three main contributions theproposal of a new activity recognition algorithm the effectof illumination conditions in the visual tracking process andthe impact of parameters related to robot motion and pathpanning in the robot navigation behaviour when tracking asubject

Based on the illumination tests it has been proven thatthe ambient illumination also affects the behaviour of therobot The better results are obtained with illuminations of100ndash200 lux or more When this light condition is presentedthe robot can perform the tracking process with high chancesof success In addition to the light condition the distancewhich also effects the texture details obtained from the sensoris an important factor and better results are observed in theconditions that the subject to robot distance is less than 36

The Scientific World Journal 21

meters Since the reliability of color based visual tracking isdecreasing due to distance and illumination factors a betterapproach for visual tracking task to work with changingillumination conditions should be modeled for example byswitching from color to infrared sensor based tracking Inaddition to visual tracking robot existence as a assistivetechnology and its behaviour indoor environment shouldbe studied qualitatively concerning the human feelings andpsychological response to use the mobile robot systems

Analyzing robot parameters for the scenarios used forthe experiments presented in this paper it has been shownthat the two analyzed robot parameters speed and inflationradius can affect the behaviour of the robot while the robotis actively tracking the subject with visual tracking modelThe experiments presented in this paper have to determinethe best speed and obstacle inflation radius parameters forthe tracking process For simple scenarios the impact ofthese two parameters is not relevant but for more complexscenarios the robot fails while tracking the subject for someparameters combinations For these types of scenarios thetests speeds of 05 and 06ms offered better results Increas-ing the speed over these values provokes the collision of therobot with the subject Decreasing the speed can make iteasier for the robot to get stuck At the recommended speedsthe obstacle inflation radius must be greater than or equal to015 to offer better results

The new localization assisted activity recognition algo-rithm has been integrated with the robot platform Thisalgorithm is faster andmore robust and it is effective in thosecases where the lower limb body is not visible It also does notrequired training data to operate The results are promisingfor being able to detect walking activity for different subjectsand walking patterns

In summary the contribution of this paper can be statedas (i) integration of a new localization assisted activityrecognition algorithm in a mobile robot that tracks a subject(ii) analyzing how the illumination conditions and robotmotion parameters affect the tracking and (iii) finding theoptimal conditions for robot based subject trackingwhere theconditions can be listed as (a) quantitative light conditions(lux value) (b) robot to subject distances (c) robot speedand (d) map inflation radius during path planning and robotmotion

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012



Figure 22: Sample robot system tracking and activity recognition results with partial occlusion, for (a) sitting and (b) walking activities.

6.1. Visual Tracking and Illumination Results. First of all, the visual tracking experiments demonstrate that integrating depth information to handle illumination variations clearly improved the tracking results. This means that fusion of multiple sensory sources is required for the robust indoor tracking needed by an at-home monitoring mobile robot. Hence, to cope with various illumination levels, different sensor types can support the system, such as infrared sensors in the case of low illumination or dark room conditions.

Based on the results of the performed tests, it is possible to state that color information may not be reliable at all times during real-time tasks, considering that the subject is tracked by the mobile robot during daily activities. It is therefore useful to establish the limitations of the color tracker used in our system and the safe regions, in terms of distance and illumination, in which a color-based visual tracker can be relied upon.

In sum, this experimental analysis concludes that certain environmental conditions should be met for a reliable color-region-based visual tracking algorithm: (1) the illumination on the subject to be tracked should be greater than 100-200 lux for better results, and (2) for the lux and distance specifications given in Figure 12, if the distance between robot and subject is kept below 3.6 meters, there is a high chance of successful and continuous robot tracking supported by the visual tracking algorithm.
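To make these constraints concrete, the safe operating region can be expressed as a simple gate checked before trusting the color tracker output. The following Python sketch is illustrative only: the function names are hypothetical, the thresholds are the ones measured above, and the depth fallback stands in for whichever light-independent modality the system provides.

# Minimal sketch of the reliable operating region for the color
# tracker, using the thresholds observed in the experiments.
# Names and fallback strategy are illustrative assumptions.

MIN_LUX = 100.0        # lower bound of usable illumination (100-200 lux)
MAX_DISTANCE_M = 3.6   # robot-to-subject distance limit in meters

def color_tracking_reliable(lux: float, distance_m: float) -> bool:
    """Return True if the color-based tracker is in its safe region."""
    return lux >= MIN_LUX and distance_m <= MAX_DISTANCE_M

def select_tracking_mode(lux: float, distance_m: float) -> str:
    # Outside the safe region, fall back to a modality that does not
    # depend on ambient light (e.g., depth/infrared based tracking).
    return "color" if color_tracking_reliable(lux, distance_m) else "depth"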

6.2. Speed and Obstacle Inflation Radius Results. The speed of the robot and the obstacle inflation radius are two important parameters to be considered when tracking the subject, and the results of the activity may differ depending on the layout of the room. There are no previous studies analyzing the parameters that may affect the behaviour of a robot performing a visual tracking activity. Therefore, the objective of the experiments related to these parameters was to identify the speed range and the obstacle inflation radius for which the tracking process is most stable, given the constraints of the visual tracking algorithm.

The speed parameter needs to be high enough for the robot not to lose the subject, but low enough for the robot not to collide with obstacles and/or the subject. When the speed of the robot is high, the robot covers more distance while stopping; this braking distance may be too long and cause a collision with the subject or even with an obstacle. Using the results presented in Table 5, it is possible to determine the maximum robot speed that prevents collision with the subject. A collision becomes inevitable when the max_vel_x parameter takes values over 0.65; based on these results, the robot speed must be set to 0.60 m/s or lower to prevent this from happening. The obstacle inflation radius had no influence on these experiments.
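This braking argument can be sanity-checked with the standard constant-deceleration stopping distance d = v^2 / (2a). The sketch below is illustrative; the deceleration value is an assumed figure, not a measured property of the robot platform:

# Stopping distance under constant deceleration: d = v^2 / (2*a).
# The deceleration limit is assumed here purely for illustration.

ASSUMED_DECEL = 1.0  # m/s^2 (hypothetical braking limit)

def stopping_distance(speed_mps: float, decel: float = ASSUMED_DECEL) -> float:
    return speed_mps ** 2 / (2.0 * decel)

for v in (0.50, 0.60, 0.65, 1.00):
    print(f"v = {v:.2f} m/s -> stops in {stopping_distance(v):.2f} m")

Because the stopping distance grows quadratically with speed, a modest increase beyond the recommended 0.60 m/s disproportionately lengthens the uncontrolled approach toward the subject.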

The experiments described in Sections 4.4 and 4.5 study the performance of the tracking system as a function of the speed and the obstacle inflation radius. With the results of these tests, it is possible to compare the path followed by the robot and to check whether it is able to complete the specific track of the experiments.

As mentioned above, nonstopping tests do not represent the normal behaviour of activities performed by humans at home. Table 6 shows the results of these nonstopping tests. As presented there, the success rate is quite low, especially when the speed is set to 1.00 and 1.50 m/s.

It is also possible to observe how the inflation radius affects the path followed by the robot. The influence of the inflation radius is presented in Figure 23. For both maps, the speed of the robot is set to 1.5 m/s; the map on the left shows the track of the robot for an obstacle inflation radius of 0.05 and the one on the right for a value of 0.35. As can be appreciated, the track is smoother when the inflation radius is lower, whereas in the map on the right the curves where the robot is turning become sharp bends. A smoother path makes the robot less likely to lose the track or its localization in the map. This is why Table 6 reports good results for an inflation radius of 0.05 but worse results as this value is increased.

Figure 23: Track for a speed of 1.5 m/s with inflation radii of 0.05 (left) and 0.35 (right).
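This behaviour follows from how the navigation stack inflates obstacles in the costmap [31]: cells within the inflation radius receive a cost that decays with distance from the obstacle, so a larger radius pushes planned paths further from walls and forces wider, sharper detours in tight layouts. A minimal sketch of that cost model, with all parameter values assumed for illustration, is:

import math

# Sketch of the obstacle inflation model used by the ROS costmap
# (costmap_2d convention): cells closer than the robot's inscribed
# radius get a near-lethal cost, and beyond that the cost decays
# exponentially out to the inflation radius. Parameter values are
# assumptions for illustration.

INSCRIBED_COST = 253  # cost assigned inside the inscribed radius

def inflation_cost(dist_m: float,
                   inscribed_radius: float = 0.20,
                   inflation_radius: float = 0.35,
                   cost_scaling_factor: float = 10.0) -> int:
    """Cost of a cell at dist_m from the nearest obstacle."""
    if dist_m <= inscribed_radius:
        return INSCRIBED_COST                  # certain collision
    if dist_m > inflation_radius:
        return 0                               # outside the inflated band
    return int((INSCRIBED_COST - 1) *
               math.exp(-cost_scaling_factor * (dist_m - inscribed_radius)))

# A larger inflation_radius widens the penalized band, keeping the
# robot further from walls and corners, which in tight layouts forces
# the sharper detours observed in Figure 23.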

Another interesting result presented in Table 6 is the time to complete the track when the speed is set to 0.50 m/s. The time values are higher for this setting, even though the subject always walks at the same speed: because the speed of the robot is not high enough, the robot sometimes got stuck for a few seconds. The subject then has to wait for the robot to be released, which increases the time required to complete the track.

The situation in which the subject stops is different from the one previously described. People at home typically walk and stop as they move around a room: they walk to a desk, stop for a moment to pick something up, start walking again, and so on. This is the situation represented by these experiments, and it is more favorable for the tracking process. As the user stops for a short time, the robot can use that break to recover the lost distance and reach a better position in the room. This allows the use of lower speeds by increasing the reaction time available to the robot.

In this part, we analyzed three scenarios representing three layouts with different characteristics. In an open layout (Figure 20, layout 1), when there are no obstacles in the middle of the room, there is no restriction on the speed of the robot; the only limit is the maximum speed of 0.60 m/s that prevents the robot from colliding with the subject. The robot is able to perform its activity without any problem: when it moves freely in the center of the room, neither the speed nor the inflation radius has any effect on the performance. The same results are obtained when the robot has a wide space to move around an obstacle (Figure 20, layout 2). The robot can follow the subject, and the success of all the tests indicates that any value in the tested range provides good results.

When there is not enough space to move and/or when the shape of the obstacles is complex, the speed and inflation radius of the robot motion have to be considered to guarantee the success of the tracking process. According to the experiments, only speeds over 0.40 m/s lead to success; values around 0.50 m/s and inflation radii over 0.15 increase the success rate of the tracking activity, even if they do not guarantee 100% success.

The exception within this speed range was when the inflation radius was set to 0.05 meters: with this low value, the robot was not able to complete the track. With a 0.05-meter inflation radius the robot can get very close to the obstacles, which probably makes it easier for the robot to get stuck during the subject tracking task. Indeed, the reason the robot could not complete the track was always that it got stuck during navigation.

The success rate for these two speed settings, when the inflation radius is over 0.05, is 5 tests out of 8. At these speeds the robot appears quite stable, and the only reason for not completing the track is getting stuck at some point along it. In general, the parameter combination of a speed of 0.50-0.60 m/s and an inflation radius of 0.15-0.45 offers the best results in complex scenarios.

In summary, in open and wide areas the speed and inflation radius do not affect the performance of the tracking activity. Under less ideal conditions, however, if the room contains a narrow path or complex obstacles, the values of these parameters should be restricted to the ranges proposed in this paper: the results of the tests show that setting the robot speed to 0.5 or 0.6 m/s and the inflation radius between 0.15 and 0.45 yields better results.
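As an illustration of how these recommendations map onto the ROS navigation stack, the two parameters can be set at runtime through dynamic_reconfigure. The node and parameter paths below are assumptions that depend on the concrete planner and costmap configuration, so this is a sketch rather than a drop-in script:

#!/usr/bin/env python
# Sketch: apply the recommended tracking parameters to move_base at
# runtime via dynamic_reconfigure. The node and parameter names are
# assumptions that vary with the planner and costmap setup in use.
import rospy
import dynamic_reconfigure.client

def apply_recommended_params():
    rospy.init_node('tracking_param_tuner')

    planner = dynamic_reconfigure.client.Client(
        '/move_base/TrajectoryPlannerROS', timeout=5.0)
    planner.update_configuration({'max_vel_x': 0.60})  # <= 0.60 m/s

    costmap = dynamic_reconfigure.client.Client(
        '/move_base/local_costmap', timeout=5.0)
    costmap.update_configuration({'inflation_radius': 0.15})  # 0.15-0.45 m

if __name__ == '__main__':
    apply_recommended_params()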

6.3. Activity Recognition Results. The experimental results show that, provided robust visual tracking and robot-based subject tracking, the mobile robot can achieve activity recognition for different subjects and walking patterns using the same simple heuristics, which is crucial for recording reliable data about the subjects.

Our previous works [5, 8] achieved activity recognition performance over 98%; however, those models required training data and tuning for each specific subject, which is difficult to repeat every time the subject changes. Moreover, those approaches, like most state-of-the-art models, generally require whole-body contour features for detecting walking or other dynamic activities. With the proposed model, in contrast, activities can be recognized even under limited conditions such as partial occlusion.

Using the proposed model with simple person-specific heuristics and localization assistance, we obtained 98% activity recognition accuracy over all activities, tested on 12,000 frames, without using a large amount of training data. The proposed algorithm does not require color-image-based feature extraction, which is also important for providing robust activity recognition under changing environmental conditions. Moreover, dynamic activities performed at a static position, such as cycling on a cycling machine, can also be detected with the localization assistance provided by the subject's global position.
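To convey the flavor of such localization-assisted heuristics (this is a simplified sketch, not the exact rule used by our system), walking can be separated from in-place activities by thresholding the displacement of the subject's global position over a sliding window, with the threshold tuned per person:

from collections import deque
import math

class WalkDetector:
    """Illustrative localization-assisted heuristic: label the subject
    'walking' when the global (map-frame) position has moved more than
    a person-specific displacement threshold within a sliding window.
    Window length and threshold are assumed values, not the paper's."""

    def __init__(self, window_size: int = 30, min_displacement_m: float = 0.5):
        self.positions = deque(maxlen=window_size)  # (x, y) in map frame
        self.min_displacement_m = min_displacement_m

    def update(self, x: float, y: float) -> str:
        self.positions.append((x, y))
        if len(self.positions) < self.positions.maxlen:
            return "unknown"
        (x0, y0), (x1, y1) = self.positions[0], self.positions[-1]
        moved = math.hypot(x1 - x0, y1 - y0)
        # A static global position covers both rest and dynamic
        # in-place activities (e.g., pedaling on a cycling machine).
        return "walking" if moved > self.min_displacement_m else "in place"

Because the test uses the subject's position in the map frame rather than image features, it remains usable when the lower body is occluded.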

It can be concluded that the proposed activity recognition algorithm is a reliable approach for recording walking activity for further analysis, as an assistive technology for at-home biomonitoring systems.

7. Conclusion

This paper has presented three main contributions: the proposal of a new activity recognition algorithm, the analysis of the effect of illumination conditions on the visual tracking process, and the analysis of the impact of robot motion and path planning parameters on the robot navigation behaviour when tracking a subject.

Based on the illumination tests, it has been shown that the ambient illumination affects the behaviour of the robot. Better results are obtained with illumination of 100-200 lux or more; under these light conditions, the robot can perform the tracking process with a high chance of success. In addition to the light conditions, the distance, which also affects the texture details obtained from the sensor, is an important factor, and better results are observed when the subject-to-robot distance is less than 3.6 meters. Since the reliability of color-based visual tracking decreases with distance and illumination, a better approach to the visual tracking task under changing illumination conditions should be modeled, for example by switching from color- to infrared-sensor-based tracking. In addition to visual tracking, the presence of the robot as an assistive technology and its behaviour in the indoor environment should be studied qualitatively, with regard to human feelings and the psychological response to using mobile robot systems.

Analyzing the robot parameters in the scenarios used for the experiments presented in this paper, it has been shown that the two analyzed parameters, speed and inflation radius, can affect the behaviour of the robot while it is actively tracking the subject with the visual tracking model. The experiments aimed to determine the best speed and obstacle inflation radius for the tracking process. For simple scenarios the impact of these two parameters is not relevant, but in more complex scenarios the robot fails while tracking the subject for some parameter combinations. For these types of scenarios, test speeds of 0.5 and 0.6 m/s offered better results: increasing the speed above these values provokes collisions of the robot with the subject, while decreasing the speed makes it easier for the robot to get stuck. At the recommended speeds, the obstacle inflation radius must be greater than or equal to 0.15 to obtain better results.

The new localization-assisted activity recognition algorithm has been integrated with the robot platform. This algorithm is faster and more robust, and it is effective in cases where the lower part of the body is not visible. It also does not require training data to operate. The results are promising for detecting walking activity across different subjects and walking patterns.

In summary, the contributions of this paper can be stated as (i) the integration of a new localization-assisted activity recognition algorithm in a mobile robot that tracks a subject, (ii) the analysis of how illumination conditions and robot motion parameters affect the tracking, and (iii) the identification of optimal conditions for robot-based subject tracking, where the conditions can be listed as (a) quantitative light conditions (lux value), (b) robot-to-subject distances, (c) robot speed, and (d) map inflation radius during path planning and robot motion.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally contributed to this paper.

References

[1] T. S. Barger, D. E. Brown, and M. Alwan, "Health-status monitoring through analysis of behavioral patterns," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, vol. 35, no. 1, pp. 22-27, 2005.
[2] E. D. Zubiete, L. F. Luque, A. V. M. Rodriguez, and I. G. Gonzalez, "Review of wireless sensors networks in health applications," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '11), pp. 1789-1793, 2011.
[3] C. Jayawardena, I. Kuo, C. Datta, R. Q. Stafford, E. Broadbent, and B. A. MacDonald, "Design, implementation and field tests of a socially assistive robot for the elderly: healthBot version 2," in Proceedings of the 4th IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob '12), pp. 1837-1842, Rome, Italy, June 2012.
[4] N. K. Suryadevara and S. C. Mukhopadhyay, "Wireless sensor network based home monitoring system for wellness determination of elderly," IEEE Sensors Journal, vol. 12, no. 6, pp. 1965-1972, 2012.
[5] N. Myagmarbayar, Y. Yuki, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human body contour data based activity recognition," in Proceedings of the 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '13), pp. 5634-5637, July 2013.
[6] H.-M. Gross, C. Schroeter, S. Mueller, et al., "Progress in developing a socially assistive mobile home robot companion for the elderly with mild cognitive impairment," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2430-2437, September 2011.
[7] T. Tamura, A. Kawarada, M. Nambu, A. Tsukada, K. Sasaki, and K.-I. Yamakoshi, "E-healthcare at an experimental welfare techno house in Japan," The Open Medical Informatics Journal, vol. 1, no. 1, pp. 1-7, 2008.
[8] M. Nergui, Y. Yoshida, N. Imamoglu, J. Gonzalez, M. Otake, and W. Yu, "Human activity recognition using body contour parameters extracted from depth images," Journal of Medical Imaging and Health Informatics, vol. 3, no. 3, pp. 455-461, 2013.
[9] M. Ogawa, S. Ochiai, K. Shoji, M. Nishihara, and T. Togawa, "An attempt of monitoring daily activities at home," in Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 786-788, Chicago, Ill, USA, July 2000.
[10] A. Fleury, M. Vacher, and N. Noury, "SVM-based multimodal classification of activities of daily living in health smart homes: sensors, algorithms, and first experimental results," IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 274-283, 2010.
[11] T. B. Moeslund, A. Hilton, and V. Kruger, "A survey of advances in vision-based human motion capture and analysis," Computer Vision and Image Understanding, vol. 104, no. 2-3, pp. 90-126, 2006.
[12] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, "Sensor-based activity recognition," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790-808, 2012.
[13] M. Zhang and A. A. Sawchuk, "Human daily activity recognition with sparse representation using wearable sensors," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 553-560, 2013.
[14] A. Pantelopoulos and N. G. Bourbakis, "A survey on wearable sensor-based systems for health monitoring and prognosis," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 40, no. 1, pp. 1-12, 2010.
[15] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, "Activity recognition from accelerometer data," in Proceedings of the 17th Innovative Applications of Artificial Intelligence Conference (IAAI '05), pp. 1541-1546, July 2005.
[16] A. Scalmato, A. Sgorbissa, and R. Zaccaria, "Describing and recognizing patterns of events in smart environments with description logic," IEEE Transactions on Cybernetics, vol. 43, no. 6, pp. 1882-1897, 2013.
[17] O. Brdiczka, P. Reignier, and J. Crowley, "Detecting individual activities from video in a smart home," in Proceedings of the International Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp. 363-370, 2007.
[18] R. Poppe, "A survey on vision-based human action recognition," Image and Vision Computing, vol. 28, no. 6, pp. 976-990, 2010.
[19] N. Imamoglu, M. Nergui, Y. Yoshida, J. Gonzalez, and W. Yu, "Control strategies and particle filter for RGB-D based human subject tracking and behavior recognition by a bio-monitoring mobile robot," in Intelligent Robotics and Applications, vol. 8102 of Lecture Notes in Computer Science, pp. 318-329, 2013.
[20] N. Imamoglu, Z. Wei, H. Shi, et al., "An improved saliency for RGB-D visual tracking and control strategies for a bio-monitoring mobile robot," in Evaluating AAL Systems through Competitive Benchmarking, vol. 386 of Communications in Computer and Information Science, pp. 1-12, 2013.
[21] A. Cesta, G. Cortellessa, M. V. Giuliani, et al., "The ROBOCARE assistive home robot: environment, features and evaluation," The ROBOCARE Technical Reports, RC-TR-0906-6, 2004.
[22] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, no. 3, pp. 579-590, 2013.
[23] ROSARIA, http://wiki.ros.org/ROSARIA.
[24] G. Grisetti, C. Stachniss, and W. Burgard, "Improved techniques for grid mapping with Rao-Blackwellized particle filters," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 34-46, 2007.
[25] Gmapping, http://wiki.ros.org/gmapping.
[26] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, Cambridge, Mass, USA, 2005.
[27] J. Ning, L. Zhang, D. Zhang, and C. Wu, "Scale and orientation adaptive mean shift tracking," IET Computer Vision, vol. 6, no. 1, pp. 52-61, 2012.
[28] E. Marder-Eppstein, E. Berger, T. Foote, B. Gerkey, and K. Konolige, "The office marathon: robust navigation in an indoor office environment," in Proceedings of the International Conference on Robotics and Automation (ICRA '10), pp. 300-307, IEEE, May 2010.
[29] S. Zaman, W. Slany, and G. Steinbauer, "ROS-based mapping, localization and autonomous navigation using a Pioneer 3-DX robot and their relevant issues," in Proceedings of the Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), pp. 1-5, Riyadh, Saudi Arabia, April 2011.
[30] E. Ruiz, R. Acuna, N. Certad, A. Terrones, and M. E. Cabrera, "Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor," in Proceedings of the 10th IEEE Latin American Robotics Symposium (LARS '13), pp. 55-60, Arequipa, Peru, October 2013.
[31] Costmap, http://wiki.ros.org/costmap_2d.
[32] B. P. Gerkey and K. Konolige, "Planning and control in unstructured terrain," in Proceedings of the ICRA Workshop on Path Planning on Costmaps, 2008.
[33] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, vol. 4, no. 1, pp. 23-33, 1997.
[34] "Transformation matrix," http://mathforum.org/mathimages/index.php/Transformation_Matrix.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 20: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

20 The Scientific World Journal

Figure 23 Track for 15ms speed and 005035 of inflation radius

for a few seconds The subject has to wait for the robot to bereleased and that results in increasing the required time tocomplete the track

The situation when the subject stops is different from theone previously exposed People at home are used to walk andstop when walking in a roomThe walk to a desk and stop foramoment to pick something and then start walking again andso on This is the situation represented by these experimentsand this situation is more favorable for the tracking processAs the user stops for a short time the robot can use that breakto recover the lost distance and get a better position in theroom This allows for the use of lower speeds by increasingthe reaction time of the robot

In this part we have analyzed three different scenariosrepresenting three layouts with different characteristics In anopen layout Figure 20mdashlayout 1 when there are no obstaclesin the middle of the room there is no restriction for the speedof the robot the only limit is the maximum speed of 060msthat prevents the robot from crashing with the subject Therobot is able to perform its activity without any problemMoving in the roomcenter freely the speed or inflation radiusdo not have any effect on the performance The same resultsare obtained when the robot has a wide space tomove aroundan obstacle (Figure 20mdashlayout 2) The robot can follow thesubject and the success in all the tests states that any value inthe range of the one tested provides good results

When there is not enough space to move andor whenthe shape of the obstacles is complex the speed and infla-tion radius for the robot motion have to be considered toguarantee the success of the tracking process As a result ofthe experiments only speeds over 040ms lead to successValues around 050ms and inflation radius over 015 increasethe success rate of the tracking activity even if it does notprovide 100 success

The exception for this speed range was when the inflationradius was set to 005 meters Using this low value the robotwas not able to complete the track Probably with 005metersinflation ration the robot can get close to the obstacles andso this may make it easier for the robot to get stuck duringsubject tracking task Indeed the reason why the robot couldnot complete the track was always that the robot got stuckduring navigation

The success rate for these two speed parameters whenthe inflation radius is over 005 is 5 tests out of 8 At thesespeeds the robot looks quite stable and the only reasons notto complete the track are that if it was stuck in some pointsof the track But generally the parameter combination with

a speed of 050ndash60ms and an inflation radius 015ndash045 offerthe best results in complex scenarios

As a summary for open and wide areas speed andinflation radius do not affect the performance of the trackingactivity But for the not ideal conditions if the room containsa narrow path of complex obstacles the value of theseparameters should be restricted to the range proposed in thispaperThe results of the test show that setting the speed of therobot at 05 or 06ms and the inflation radius between 015and 045 generates better results

63 Activity Recognition Results With mobile robot if arobust visual tracking and robot based subject tracking isprovided the experimental results depict that it is possible toachieve activity recognition for different subjects andwalkingpatterns using the same simple heuristics which is crucial torecord reliable data of the subjects

Our previous works [5 8] had activity recognition perfor-mances over 98 with high accuracy however these modelsrequired training data and tuning for each specific subjectwhich is difficult to handle each time when the subject ischanged Also these calculations including the state-of-the-art models and that generally requires whole body contourfeatures for walking or dynamic activity detection But withthe proposed model the activities can be recognized underlimited conditions such as partial occlusion

Using the proposed model with simple person specificheuristics and localization assistance we were able to obtain98 activity recognition accuracy for all activities tested by12000 frames even without using large amount of trainingdata The proposed algorithm does not require color imagebased feature extractionwhich is also important by providingrobust activity recognition under changing environmentalconditions Moreover dynamic activities on static positionssuch as cycling on cycling machine can be detected alsowith the localization assistance provided by the subject globalposition

It can be concluded that the proposed activity recognitionalgorithm is a reliable approach to record the walking activitycases for the purpose of further analysis as an assistivetechnology to at-home biomonitoring systems

7 Conclusion

This paper has presented three main contributions theproposal of a new activity recognition algorithm the effectof illumination conditions in the visual tracking process andthe impact of parameters related to robot motion and pathpanning in the robot navigation behaviour when tracking asubject

Based on the illumination tests it has been proven thatthe ambient illumination also affects the behaviour of therobot The better results are obtained with illuminations of100ndash200 lux or more When this light condition is presentedthe robot can perform the tracking process with high chancesof success In addition to the light condition the distancewhich also effects the texture details obtained from the sensoris an important factor and better results are observed in theconditions that the subject to robot distance is less than 36

The Scientific World Journal 21

meters Since the reliability of color based visual tracking isdecreasing due to distance and illumination factors a betterapproach for visual tracking task to work with changingillumination conditions should be modeled for example byswitching from color to infrared sensor based tracking Inaddition to visual tracking robot existence as a assistivetechnology and its behaviour indoor environment shouldbe studied qualitatively concerning the human feelings andpsychological response to use the mobile robot systems

Analyzing robot parameters for the scenarios used forthe experiments presented in this paper it has been shownthat the two analyzed robot parameters speed and inflationradius can affect the behaviour of the robot while the robotis actively tracking the subject with visual tracking modelThe experiments presented in this paper have to determinethe best speed and obstacle inflation radius parameters forthe tracking process For simple scenarios the impact ofthese two parameters is not relevant but for more complexscenarios the robot fails while tracking the subject for someparameters combinations For these types of scenarios thetests speeds of 05 and 06ms offered better results Increas-ing the speed over these values provokes the collision of therobot with the subject Decreasing the speed can make iteasier for the robot to get stuck At the recommended speedsthe obstacle inflation radius must be greater than or equal to015 to offer better results

The new localization assisted activity recognition algo-rithm has been integrated with the robot platform Thisalgorithm is faster andmore robust and it is effective in thosecases where the lower limb body is not visible It also does notrequired training data to operate The results are promisingfor being able to detect walking activity for different subjectsand walking patterns

In summary the contribution of this paper can be statedas (i) integration of a new localization assisted activityrecognition algorithm in a mobile robot that tracks a subject(ii) analyzing how the illumination conditions and robotmotion parameters affect the tracking and (iii) finding theoptimal conditions for robot based subject trackingwhere theconditions can be listed as (a) quantitative light conditions(lux value) (b) robot to subject distances (c) robot speedand (d) map inflation radius during path planning and robotmotion

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012

[4] N K Suryadevara and S C Mukhopadhyay ldquoWireless sensornetwork based home monitoring system for wellness determi-nation of elderlyrdquo IEEE Sensors Journal vol 12 no 6 pp 1965ndash1972 2012

[5] N Myagmarbayar Y Yuki N Imamoglu J Gonzalez MOtake and W Yu ldquoHuman body contour data based activityrecognitionrdquo in Proceedings of the 35th Annual InternationalConference of the IEEE Engineering in Medicine and BiologySociety (EMBC rsquo13) pp 5634ndash5637 July 2013

[6] H-M Gross C Schroeter S Mueller et al ldquoProgress indeveloping a socially assistive mobile home robot companionfor the elderly with mild cognitive impairmentrdquo in Proceedingsof the IEEERSJ International Conference on Intelligent Robotsand Systems pp 2430ndash2437 September 2011

[7] T Tamura A Kawarada M Nambu A Tsukada K Sasakiand K-I Yamakoshi ldquoE-healthcare at an experimental welfaretechno house in Japanrdquo The Open Medical Informatics Journalvol 1 no 1 pp 1ndash7 2008

[8] M Nergui Y Yoshida N Imamoglu J Gonzalez M Otakeand W Yu ldquoHuman activity recognition using body contourparameters extracted from depth imagesrdquo Journal of MedicalImaging and Health Informatics vol 3 no 3 pp 455ndash461 2013

[9] MOgawa S Ochiai K ShojiMNishihara and T Togawa ldquoAnattempt ofmonitoring daily activities at homerdquo in Proceedings ofthe 22nd Annual International Conference of the IEEE Engineer-ing inMedicine and Biology Society vol 1 pp 786ndash788 ChicagoIll USA July 2000

[10] A Fleury M Vacher and N Noury ldquoSVM-based multimodalclassification of activities of daily living in health smart homessensors algorithms and first experimental resultsrdquo IEEE Trans-actions on Information Technology in Biomedicine vol 14 no 2pp 274ndash283 2010

[11] T B Moeslund A Hilton and V Kruger ldquoA survey of advancesin vision-based humanmotion capture and analysisrdquo ComputerVision and Image Understanding vol 104 no 2-3 pp 90ndash1262006

[12] L Chen J Hoey C D Nugent D J Cook and Z Yu ldquoSensor-based activity recognitionrdquo IEEE Transactions on Systems Manand Cybernetics Part C Applications and Reviews vol 42 no 6pp 790ndash808 2012

[13] M Zhang and A A Sawchuk ldquoHuman daily activity recogni-tion with sparse representation using wearable sensorsrdquo IEEEJournal of Biomedical and Health Informatics vol 17 no 3 pp553ndash560 2013

[14] A Pantelopoulos and N G Bourbakis ldquoA survey on wearablesensor-based systems for health monitoring and prognosisrdquoIEEE Transactions on Systems Man and Cybernetics Part CApplications and Reviews vol 40 no 1 pp 1ndash12 2010

[15] N Ravi N Dandekar P Mysore and M L Littman ldquoActivityrecognition from accelerometer datardquo in Proceedings of the

22 The Scientific World Journal

17th Innovative Applications of Artificial Intelligence Conference(IAAI rsquo05) pp 1541ndash1546 July 2005

[16] A Scalmato A Sgorbissa and R Zaccaria ldquoDescribing andrecognizing patterns of events in smart environments withdescription logicrdquo IEEE Transactions on Cybernetics vol 43 no6 pp 1882ndash1897 2013

[17] O Brdiczka P Reignier and J Crowley ldquoDetecting individualactivities from video in a smart homerdquo in Proceedings ofthe International Conference on Knowledge-Based IntelligentInformation and Engineering Systems pp 363ndash370 2007

[18] R Poppe ldquoA survey on vision-based human action recognitionrdquoImage and Vision Computing vol 28 no 6 pp 976ndash990 2010

[19] N Imamoglu M Nergui Y Yoshida J Gonzalez and W YuldquoControl strategies and particle filter for RGB-D based humansubject tracking and behavior recognition by a bio-monitoringmobile robotrdquo in Intelligent Robotics and Applications vol 8102of Lecture Notes in Computer Science pp 318ndash329 2013

[20] N Imamoglu Z Wei H Shi et al ldquoAn improved saliencyfor RGB-D visual tracking and control strategies for a bio-monitoring mobile robotrdquo in Evaluating AAL Systems throughCompetitive Benchmarking vol 386 of Communications inComputer and Information Science pp 1ndash12 2013

[21] A Cesta G Cortellessa M V Giuliani et al ldquoThe ROBOCAREassistive home robot environment features and evaluationrdquoThe ROBOCARE Technical Reports RC-TR-0906-6 2004

[22] P Rashidi and A Mihailidis ldquoA survey on ambient-assistedliving tools for older adultsrdquo IEEE Journal of Biomedical andHealth Informatics vol 17 no 3 pp 579ndash590 2013

[23] ROSARIA httpwikirosorgROSARIA[24] G Grisetti C Stachniss andW Burgard ldquoImproved techniques

for grid mapping with rao-blackwellized particle filtersrdquo IEEETransactions on Robotics vol 23 no 1 pp 34ndash46 2007

[25] Gmapping httpwikirosorggmapping[26] S Thrun W Burgard and D Fox Probabilistic Robotics The

MIT Press Cambridge Mass USA 2005[27] J Ning L Zhang D Zhang and C Wu ldquoScale and orientation

adaptive mean shift trackingrdquo IET Computer Vision vol 6 no1 pp 52ndash61 2012

[28] EMarder-Eppstein E Berger T Foote BGerkey andKKono-lige ldquoThe officemarathon robust navigation in an indoor officeenvironmentrdquo in Proceedings of the International Conference onRobotics and Automation (ICRA rsquo10) pp 300ndash307 IEEE May2010

[29] S Zaman W Slany and G Steinbauer ldquoROS-based mappinglocalization and autonomous navigation using a Pioneer 3-DX robot and their relevant issuesrdquo in Proceedings of theSaudi International Electronics Communications and PhotonicsConference (SIECPC rsquo11) pp 1ndash5 Riyadh Saudi Arabia April2011

[30] E Ruiz R Acuna N Certad A Terrones and M E CabreraldquoDevelopment of a control platform for the mobile robotRoomba using ROS and a Kinect sensorrdquo in Proceedings of the10th IEEE Latin American Robotics Symposium (LARS rsquo13) pp55ndash60 Arequipa Peru October 2013

[31] Costmap httpwikirosorgcostmap 2d[32] B P Gerkey and K Konolige ldquoPlanning and control in unstruc-

tured terrainrdquo in Proceedings of the ICRA Workshop on PathPlanning on Costmaps 2008

[33] D Fox W Burgard and S Thrun ldquoThe dynamic windowapproach to collision avoidancerdquo IEEERobotics andAutomationMagazine vol 4 no 1 pp 23ndash33 1997

[34] ldquoTransformation matrixrdquo httpmathforumorgmathimagesindexphpTransformation Matrix

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 21: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

The Scientific World Journal 21

meters Since the reliability of color based visual tracking isdecreasing due to distance and illumination factors a betterapproach for visual tracking task to work with changingillumination conditions should be modeled for example byswitching from color to infrared sensor based tracking Inaddition to visual tracking robot existence as a assistivetechnology and its behaviour indoor environment shouldbe studied qualitatively concerning the human feelings andpsychological response to use the mobile robot systems

Analyzing robot parameters for the scenarios used forthe experiments presented in this paper it has been shownthat the two analyzed robot parameters speed and inflationradius can affect the behaviour of the robot while the robotis actively tracking the subject with visual tracking modelThe experiments presented in this paper have to determinethe best speed and obstacle inflation radius parameters forthe tracking process For simple scenarios the impact ofthese two parameters is not relevant but for more complexscenarios the robot fails while tracking the subject for someparameters combinations For these types of scenarios thetests speeds of 05 and 06ms offered better results Increas-ing the speed over these values provokes the collision of therobot with the subject Decreasing the speed can make iteasier for the robot to get stuck At the recommended speedsthe obstacle inflation radius must be greater than or equal to015 to offer better results

The new localization assisted activity recognition algo-rithm has been integrated with the robot platform Thisalgorithm is faster andmore robust and it is effective in thosecases where the lower limb body is not visible It also does notrequired training data to operate The results are promisingfor being able to detect walking activity for different subjectsand walking patterns

In summary the contribution of this paper can be statedas (i) integration of a new localization assisted activityrecognition algorithm in a mobile robot that tracks a subject(ii) analyzing how the illumination conditions and robotmotion parameters affect the tracking and (iii) finding theoptimal conditions for robot based subject trackingwhere theconditions can be listed as (a) quantitative light conditions(lux value) (b) robot to subject distances (c) robot speedand (d) map inflation radius during path planning and robotmotion

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

Nevrez Imamoglu and Enrique Dorronzoro equally con-tributed to this paper

References

[1] T S Barger D E Brown and M Alwan ldquoHealth-status mon-itoring through analysis of behavioral patternsrdquo IEEE Trans-actions on Systems Man and Cybernetics Part A Systems andHumans vol 35 no 1 pp 22ndash27 2005

[2] E D Zubiete L F Luque A V M Rodriguez and I G Gonza-lez ldquoReviewofwireless sensors networks in health applicationsrdquoin Proceedings of the Annual International Conference of theIEEE Engineering in Medicine and Biology Society (EMBC rsquo11)pp 1789ndash1793 2011

[3] C Jayawardena I Kuo C Datta R Q Stafford E Broadbentand B A MacDonald ldquoDesign implementation and field testsof a socially assistive robot for the elderly healthBot version2rdquo in Proceedings of the 4th IEEE RAS and EMBS InternationalConference on Biomedical Robotics andBiomechatronics (BioRobrsquo12) pp 1837ndash1842 Rome Italy June 2012

[4] N K Suryadevara and S C Mukhopadhyay ldquoWireless sensornetwork based home monitoring system for wellness determi-nation of elderlyrdquo IEEE Sensors Journal vol 12 no 6 pp 1965ndash1972 2012

[5] N Myagmarbayar Y Yuki N Imamoglu J Gonzalez MOtake and W Yu ldquoHuman body contour data based activityrecognitionrdquo in Proceedings of the 35th Annual InternationalConference of the IEEE Engineering in Medicine and BiologySociety (EMBC rsquo13) pp 5634ndash5637 July 2013

[6] H-M Gross C Schroeter S Mueller et al ldquoProgress indeveloping a socially assistive mobile home robot companionfor the elderly with mild cognitive impairmentrdquo in Proceedingsof the IEEERSJ International Conference on Intelligent Robotsand Systems pp 2430ndash2437 September 2011

[7] T Tamura A Kawarada M Nambu A Tsukada K Sasakiand K-I Yamakoshi ldquoE-healthcare at an experimental welfaretechno house in Japanrdquo The Open Medical Informatics Journalvol 1 no 1 pp 1ndash7 2008

[8] M Nergui Y Yoshida N Imamoglu J Gonzalez M Otakeand W Yu ldquoHuman activity recognition using body contourparameters extracted from depth imagesrdquo Journal of MedicalImaging and Health Informatics vol 3 no 3 pp 455ndash461 2013

[9] MOgawa S Ochiai K ShojiMNishihara and T Togawa ldquoAnattempt ofmonitoring daily activities at homerdquo in Proceedings ofthe 22nd Annual International Conference of the IEEE Engineer-ing inMedicine and Biology Society vol 1 pp 786ndash788 ChicagoIll USA July 2000

[10] A Fleury M Vacher and N Noury ldquoSVM-based multimodalclassification of activities of daily living in health smart homessensors algorithms and first experimental resultsrdquo IEEE Trans-actions on Information Technology in Biomedicine vol 14 no 2pp 274ndash283 2010

[11] T B Moeslund A Hilton and V Kruger ldquoA survey of advancesin vision-based humanmotion capture and analysisrdquo ComputerVision and Image Understanding vol 104 no 2-3 pp 90ndash1262006

[12] L Chen J Hoey C D Nugent D J Cook and Z Yu ldquoSensor-based activity recognitionrdquo IEEE Transactions on Systems Manand Cybernetics Part C Applications and Reviews vol 42 no 6pp 790ndash808 2012

[13] M Zhang and A A Sawchuk ldquoHuman daily activity recogni-tion with sparse representation using wearable sensorsrdquo IEEEJournal of Biomedical and Health Informatics vol 17 no 3 pp553ndash560 2013

[14] A Pantelopoulos and N G Bourbakis ldquoA survey on wearablesensor-based systems for health monitoring and prognosisrdquoIEEE Transactions on Systems Man and Cybernetics Part CApplications and Reviews vol 40 no 1 pp 1ndash12 2010

[15] N Ravi N Dandekar P Mysore and M L Littman ldquoActivityrecognition from accelerometer datardquo in Proceedings of the

22 The Scientific World Journal

17th Innovative Applications of Artificial Intelligence Conference(IAAI rsquo05) pp 1541ndash1546 July 2005

[16] A Scalmato A Sgorbissa and R Zaccaria ldquoDescribing andrecognizing patterns of events in smart environments withdescription logicrdquo IEEE Transactions on Cybernetics vol 43 no6 pp 1882ndash1897 2013

[17] O Brdiczka P Reignier and J Crowley ldquoDetecting individualactivities from video in a smart homerdquo in Proceedings ofthe International Conference on Knowledge-Based IntelligentInformation and Engineering Systems pp 363ndash370 2007

[18] R Poppe ldquoA survey on vision-based human action recognitionrdquoImage and Vision Computing vol 28 no 6 pp 976ndash990 2010

[19] N Imamoglu M Nergui Y Yoshida J Gonzalez and W YuldquoControl strategies and particle filter for RGB-D based humansubject tracking and behavior recognition by a bio-monitoringmobile robotrdquo in Intelligent Robotics and Applications vol 8102of Lecture Notes in Computer Science pp 318ndash329 2013

[20] N Imamoglu Z Wei H Shi et al ldquoAn improved saliencyfor RGB-D visual tracking and control strategies for a bio-monitoring mobile robotrdquo in Evaluating AAL Systems throughCompetitive Benchmarking vol 386 of Communications inComputer and Information Science pp 1ndash12 2013

[21] A Cesta G Cortellessa M V Giuliani et al ldquoThe ROBOCAREassistive home robot environment features and evaluationrdquoThe ROBOCARE Technical Reports RC-TR-0906-6 2004

[22] P Rashidi and A Mihailidis ldquoA survey on ambient-assistedliving tools for older adultsrdquo IEEE Journal of Biomedical andHealth Informatics vol 17 no 3 pp 579ndash590 2013

[23] ROSARIA httpwikirosorgROSARIA[24] G Grisetti C Stachniss andW Burgard ldquoImproved techniques

for grid mapping with rao-blackwellized particle filtersrdquo IEEETransactions on Robotics vol 23 no 1 pp 34ndash46 2007

[25] Gmapping httpwikirosorggmapping[26] S Thrun W Burgard and D Fox Probabilistic Robotics The

MIT Press Cambridge Mass USA 2005[27] J Ning L Zhang D Zhang and C Wu ldquoScale and orientation

adaptive mean shift trackingrdquo IET Computer Vision vol 6 no1 pp 52ndash61 2012

[28] EMarder-Eppstein E Berger T Foote BGerkey andKKono-lige ldquoThe officemarathon robust navigation in an indoor officeenvironmentrdquo in Proceedings of the International Conference onRobotics and Automation (ICRA rsquo10) pp 300ndash307 IEEE May2010

[29] S Zaman W Slany and G Steinbauer ldquoROS-based mappinglocalization and autonomous navigation using a Pioneer 3-DX robot and their relevant issuesrdquo in Proceedings of theSaudi International Electronics Communications and PhotonicsConference (SIECPC rsquo11) pp 1ndash5 Riyadh Saudi Arabia April2011

[30] E Ruiz R Acuna N Certad A Terrones and M E CabreraldquoDevelopment of a control platform for the mobile robotRoomba using ROS and a Kinect sensorrdquo in Proceedings of the10th IEEE Latin American Robotics Symposium (LARS rsquo13) pp55ndash60 Arequipa Peru October 2013

[31] Costmap httpwikirosorgcostmap 2d[32] B P Gerkey and K Konolige ldquoPlanning and control in unstruc-

tured terrainrdquo in Proceedings of the ICRA Workshop on PathPlanning on Costmaps 2008

[33] D Fox W Burgard and S Thrun ldquoThe dynamic windowapproach to collision avoidancerdquo IEEERobotics andAutomationMagazine vol 4 no 1 pp 23ndash33 1997

[34] ldquoTransformation matrixrdquo httpmathforumorgmathimagesindexphpTransformation Matrix

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 22: Research Article Development of Robust Behaviour ...downloads.hindawi.com/journals/tswj/2014/280207.pdf · image, high illumination changes and low texture details, for our real time

22 The Scientific World Journal

