
IEEE ROBOTICS AND AUTOMATION LETTERS, VOL. 4, NO. 4, OCTOBER 2019

Force Sensitive Robotic End-Effector Using Embedded Fiber Optics and Deep Learning Characterization for Dexterous Remote Manipulation

Jae In Kim, DongWook Kim, Matthew Krebs, Young Soo Park, and Yong-Lae Park

Abstract—Many of the tasks that require a high level of autonomy in complex and dangerous situations are still done by human operators with a high risk of accidents. Although various remotely controlled robot systems have been proposed, remote operation has limitations in performance and efficiency compared with on-site operation. This letter proposes the design of a new force and tactile sensing mechanism for a robotic end-effector suitable for deployment in harsh environments, with integrated force sensing based on fiber optic sensors embedded in a simple and rugged structure. The proposed end-effector was able to detect the magnitude and location of the applied force accurately for high-performance tele-manipulation using a hierarchical deep neural network (root-mean-square errors of 0.43 mm and 1.11 mm for estimating the contact location in the x-axis and the y-axis, respectively, and 1.16 N for estimating the magnitude of the contact force). Gaussian smoothing further improved the performance, reducing the error levels by 25%. In addition, the feasibility of the learning was analyzed based on an auto-encoder. Through preliminary bilateral remote control experiments, we demonstrated the feasibility of dexterous telemanipulation.

Index Terms—Force and tactile sensing, dexterous manipulation, telerobotics, teleoperation.

I. INTRODUCTION

ADVANCEMENT of robotics over the last decades has replaced much human labor with robots for repetitive

Manuscript received February 24, 2019; accepted June 27, 2019. Date of publication July 4, 2019; date of current version July 24, 2019. This letter was recommended for publication by Associate Editor Q. Xu and Editor D. Popa upon evaluation of the reviewers' comments. This work was supported by the International Nuclear Energy Research Initiative through the National Research Foundation funded by the Ministry of Science and ICT of the Republic of Korea under Grant NRF-2017M2A8A1092482. (J. I. Kim and D. Kim contributed equally to this work.) (Corresponding author: Y.-L. Park.)

J. I. Kim, D. Kim, and Y.-L. Park are with the Department of Mechanical and Aerospace Engineering, Soft Robotics Research Center, and Institute of Advanced Machines and Design, Seoul National University, Seoul 08826, South Korea (e-mail: [email protected]; [email protected]; [email protected]).

M. Krebs and Y. S. Park are with the Argonne National Laboratory, Lemont, IL 60439 USA (e-mail: [email protected]; [email protected]).

This letter has supplementary downloadable material available at http://ieeexplore.ieee.org, provided by the authors. The material contains an 8.23 MB video, which shows the design concept and application of the new end-effector, the data preprocessing and learning methods for the FBG data, and the validation of the use of deep learning. For further information, please contact [email protected].

Digital Object Identifier 10.1109/LRA.2019.2926959

and heavy-duty tasks in industrial sites, where the types of the tasks are well defined and their environments fully controlled. However, many of the tasks that require a higher level of autonomy in complex and dangerous situations, such as decontamination and decommissioning of nuclear facilities [1], [2] and search and rescue missions in disaster sites [3], [4], are still done by human operators. Although there have been attempts to deploy robots for tasks in hazardous environments [5], they have not demonstrated great achievements yet for multiple reasons, such as lack of dexterity, poor sensing capability, and unexpected interruptions to the system. Therefore, it would be extremely beneficial if robots could perform these tasks with the autonomy supplied by human operators, which is made possible by robotics and remote systems (R&RS).

To remotely manipulate robots with dexterity for autonomous and spontaneous task operation, force and tactile information from the slave side is one of the key requirements for the operator on the master side. Two different types of force measurement have traditionally been used in motor-driven robotic manipulators for force feedback: an indirect method via electric current measurement and a direct method using strain gauges attached to the structure [6], [7]. The indirect method is mechanically simple since it only needs to measure the torque-dependent current change of the motor, requiring no extra electronics. However, the torque estimation based on the current change is not as accurate as that of the direct method. The direct method measures the force applied to the structure accurately based on structural deformation. However, it requires extra sensors and wiring, as well as accurate placement and attachment of the strain gauges. Also, since strain gauges are fragile in general, they require mechanical and overload protection mechanisms. Compared to force feedback, relatively little work has been reported in the area of tactile feedback, since data acquisition from tactile sensors is often complicated by the large number of individual sensing elements. Although wireless communication is possible, power must be provided to each sensing element. Several types of tactile measurement have been investigated using force-sensitive resistors [8], piezoelectric sensors [9], and capacitive sensors [10]. However, they tend to increase system complexity due to the need for additional circuitry.

2377-3766 © 2019 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.


The above attempts were not always feasible for deployment of remotely operated robots for increased safety, performance, and efficiency, mainly because they require additional and complex systems. Adding such systems implies substantial costs not only of installation but also of maintenance. Therefore, we need a simple and robust robotic system capable of tactile and force sensing for various tasks in a safe and efficient manner.

Fiber Bragg grating (FBG) sensors can detect small strain changes of the structure to which the optical fiber is attached, based on the principle that the wavelength of the light reflected by the gratings shifts when the fiber experiences strain changes. FBGs are capable of precise strain measurement and long-range sensing [11]–[14]. In addition, the physical size of an optical fiber is so small that it can easily be integrated into almost any type of structure. There have been several approaches to developing robots that take advantage of FBGs [15]–[17]. However, these robots are not accurate enough in estimating the magnitude and location of force to be deployed in real remote operations. Therefore, we propose a simple robotic end-effector capable of providing both force and tactile information with high accuracy using FBGs. This end-effector does not require any additional sensors or circuitry besides small optical fibers, nor complex systems, keeping the robotic system simple. Therefore, the proposed end-effector is expected to be robust and deployable in real remote control missions. It provides the operator with useful information on the force applied to the robot, such as its location and magnitude with high accuracy, which enables dexterous manipulation in remotely controlled systems.

A model-based method and a model-free method [18]–[20] using a neural network were used to estimate the contact location and magnitude of the force. The model-free method showed better performance, and the reason was discussed based on principal component analysis [21], [22].

This letter is organized as follows. Section II describes the design and fabrication of the end-effector. Sections III and IV discuss the network structures of the deep learning and the pre- and post-processing of the data. Section V compares the model-based and learning-based results. Section VI presents the application of the proposed system through bilateral remote control experiments. Section VII finally discusses the results with respect to the principles of learning in our work.

II. DESIGN AND FABRICATION

The proposed end-effector consists of three main components: a gripper pad, a sensing beam, and a base, assembled by bolt-nut connections, as shown in Fig. 1. They were all made of stainless steel machined by a tabletop computer numerical control (CNC) milling machine. The FBGs were attached on the corners of the beam structure to minimize exposure to any possible external contamination or impacts, as shown in Fig. 2.

When a force is applied to the gripper pad, it is distributed to the four corner areas of the pad, compressing the four areas of the beam structure connected to the pad by the rectangular structures. This causes bending of the beams, and the strain changes of the beams are measured by the FBGs attached to them.

Fig. 1. Design of the end-effector consisting of a sensing pad, a beam structure, and a mounting structure. The sensing pad and the beam structure are in rectangular contact at each corner of the beam structure.

Fig. 2. Locations of the strain-sensing FBGs (red rectangles) and the temperature-compensation FBG (green rectangle), and the stress distribution of the beam structure. Compression occurs where the strain-sensing FBGs are attached when an external force is applied.

Using the obtained FBG data and the force-torque equilibrium equations, we can then determine the magnitude and location of the force applied to the end-effector.

We first modeled the deformation of the beam structure analytically, and then experimentally characterized the actual prototype.

A. Beam Modeling

The relationship between force and strain of a beam (i.e., a cantilever) can be found using the Airy function. The Airy function φ [23] is given as

\phi(x, z) = a_1 x z + \frac{a_2}{6} x z^3 \qquad (1)

where x is the coordinate along the neutral axis, z is the coordinate along the beam height, and a_1 and a_2 are arbitrary constants.

The stress components \sigma_{xx}, \sigma_{zz}, and \sigma_{xz} are then obtained from the partial derivatives of the Airy function as

\sigma_{xx} = \frac{\partial^2 \phi}{\partial z^2} = a_2 x z, \qquad
\sigma_{zz} = \frac{\partial^2 \phi}{\partial x^2} = 0, \qquad
\sigma_{xz} = -\frac{\partial^2 \phi}{\partial x \partial z} = -a_1 - \frac{a_2 z^2}{2} \qquad (2)


Boundary conditions of the beam with height 2h and length l under the external force F are given by

\sigma_{xz} = 0, \quad \sigma_{zz} = 0 \quad \text{at } z = \pm h

-\int_{-h}^{h} \sigma_{xz}\, dz = F, \qquad \int_{-h}^{h} \sigma_{xx}\, dz = 0 \quad \text{at } x = 0 \qquad (3)

The constants a_1 and a_2 can be found by solving the equations from the boundary conditions. The final stress components are then given by

\sigma_{xx} = -\frac{F}{I} x z, \qquad
\sigma_{zz} = 0, \qquad
\sigma_{xz} = -\frac{F}{2I}\left(h^2 - z^2\right) \qquad (4)

where I is the moment of inertia of the cross section of the beam, given as I = c(2h)^3/12 with width c, and F is the external force. Assuming a plane stress condition, the relationship between the force and the strain \varepsilon_{xx} is

\varepsilon_{xx} = \frac{1}{E}\left(\sigma_{xx} - \nu\,\sigma_{zz}\right) = -\frac{F x z}{E I} \qquad (5)

Eq. (5) shows that the largest strain \varepsilon_{xx} occurs at the base of the beam (where x = l and z = −h). Based on this result, we decided to attach the four FBGs to the areas close to the bases of the beams, marked with the four red squares in Fig. 2.
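As a quick sanity check of Eq. (5), the sketch below evaluates the peak strain at the beam base for a hypothetical geometry; the dimensions, modulus, and load are illustrative assumptions, not the prototype's actual values.

```python
# Minimal sketch: peak strain at the beam base from Eq. (5).
# All dimensions and the load are assumed values for illustration only.
l = 10e-3    # beam length, x = l [m] (assumed)
h = 1e-3     # half-height of the cross section [m] (assumed)
c = 4e-3     # beam width [m] (assumed)
E = 193e9    # Young's modulus of stainless steel [Pa]
F = 5.0      # force carried by one cantilever [N] (assumed)

I = c * (2 * h) ** 3 / 12            # second moment of area of the rectangular section
eps_max = F * l * h / (E * I)        # |eps_xx| at x = l, z = -h, per Eq. (5)
print(f"peak strain = {eps_max:.2e}")  # ~9.7e-5 (about 100 microstrain) for these numbers
```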

B. FBG Sensing Principle

When a compressive or tensile strain is applied to an FBG, the peak wavelength of the reflection of the input light shifts, and an optical interrogator detects the shift. The reflection wavelength also shifts with temperature change in addition to mechanical strain change, which needs to be compensated for to detect strain changes accurately. The relationship between the wavelength shift and the strain is given as [24]

\frac{\Delta\lambda}{\lambda_0} = (1 - p_e)\,\varepsilon + (\alpha_A + \alpha_n)\,\Delta T \qquad (6)

where \Delta\lambda/\lambda_0 is the relative shift in the Bragg wavelength, p_e is the strain-optic coefficient, \alpha_A is the thermal expansion coefficient, and \alpha_n is the thermo-optic coefficient. Strain (\varepsilon) and temperature (\Delta T) are variables that are independent of each other. Therefore, adding a reference FBG in an area where no strain is generated, so that it is affected only by temperature change, can compensate for the temperature effect on the other FBGs. This temperature-compensation FBG was attached at the center of the sensing beam structure, marked with a green square in Fig. 2.
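As an illustration of how Eq. (6) supports this compensation, the sketch below subtracts the reference FBG's relative shift from that of a strain FBG before converting to strain; the strain-optic coefficient and the example wavelength shifts are assumed values.

```python
# Sketch of the temperature compensation implied by Eq. (6): subtracting the
# reference FBG's relative shift cancels the (alpha_A + alpha_n) * dT term,
# assuming both gratings experience the same temperature change.
def compensated_strain(dlam_sense, lam0_sense, dlam_ref, lam0_ref, p_e=0.22):
    """Mechanical strain from a strain-sensing FBG and a strain-free reference FBG."""
    rel_sense = dlam_sense / lam0_sense   # relative shift of the strain-sensing grating
    rel_ref = dlam_ref / lam0_ref         # relative shift of the temperature-only grating
    return (rel_sense - rel_ref) / (1.0 - p_e)

# Example: 20 pm shift on the sensing FBG, 8 pm on the reference, ~1550 nm gratings.
eps = compensated_strain(20e-12, 1550e-9, 8e-12, 1550e-9)
print(f"strain = {eps:.2e}")  # ~9.9 microstrain for these illustrative numbers
```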

C. Force Estimation

The end-effector may make point or surface contact with an object. The intrinsic contact sensing method [25] was used to obtain the information on the resultant external force for both cases. Since the gripper pad is flat in our design, we assumed that the force applied by an object is normal to the surface with negligible contact friction.

Fig. 3. (a) Forward model of ANN, (b) Backward model of ANN.

As mentioned earlier, when a force is applied to the gripper pad, the force is distributed to the beam structure. From Eq. (5) and Eq. (6), we can estimate the forces exerted on each cantilever beam using the data obtained from the corresponding FBG. The sum of the magnitudes of the distributed forces acting on the four cantilevers equals the magnitude of the applied force. Also, the torque equilibrium equation of all the forces about the origin O in Fig. 2 can easily be established, and the location of the force can then be estimated by solving the equations.
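A minimal sketch of this model-based estimate is given below: the applied normal force is the sum of the four beam forces, and the contact point follows from the torque balance about the origin O. The corner coordinates are placeholders, not the actual pad geometry.

```python
import numpy as np

# Model-based estimate sketched from the force and torque equilibrium described
# above. Corner coordinates of the four beams are assumed placeholders [mm].
CORNERS = np.array([[-10.0, -10.0], [10.0, -10.0], [10.0, 10.0], [-10.0, 10.0]])

def estimate_contact(beam_forces):
    """beam_forces: per-corner force magnitudes [N], estimated via Eqs. (5)-(6)."""
    f = np.asarray(beam_forces, dtype=float)
    f_total = f.sum()                                   # force equilibrium
    if f_total <= 0.0:
        return 0.0, (float("nan"), float("nan"))        # no contact
    x_c, y_c = (f[:, None] * CORNERS).sum(axis=0) / f_total  # torque equilibrium
    return float(f_total), (float(x_c), float(y_c))

print(estimate_contact([1.0, 2.0, 3.0, 4.0]))  # total force 10 N at (0.0, 4.0) mm
```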

III. DEEP LEARNING CHARACTERIZATION

Deep learning is a black-box modeling technique widely used for finding complex relationships between inputs and outputs that cannot be obtained by model-based analyses. It does not require several common assumptions made in analytical models, such as linearization and time invariance [26], [27]. Although we analytically modeled the behavior of the beam structure in the previous section, the model is not accurate enough to apply to an actual prototype. Therefore, we decided to use a deep learning technique to estimate both the location and the magnitude of applied forces.

A. Mathematical Preliminaries

The structures of the artificial neural network in general are shown in Fig. 3, and the relationships between the variables can be expressed as [27]

Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}, \qquad A^{[l]} = g^{[l]}\!\left(Z^{[l]}\right) \qquad (7)

dZ^{[l]} = dA^{[l]} \odot g^{[l]\prime}\!\left(Z^{[l]}\right), \qquad
dW^{[l]} = \frac{1}{m}\, dZ^{[l]} A^{[l-1]T}, \qquad
db^{[l]} = \frac{1}{m}\, dZ^{[l]} \mathbf{1}, \qquad
dA^{[l-1]} = W^{[l]T} dZ^{[l]} \qquad (8)

where A is an n × m matrix containing n independent units and a minibatch of size m, W is the weight matrix of size k × n, and b is the bias. The superscript [l] indicates the lth hidden layer.


Fig. 4. A hierarchical deep network structured with three deep neural networks. Each neural network estimates whether the end-effector is in contact, what the contact force magnitude is, and where the contact location is. Numbers in each neural network represent the number of hidden units.

The function g is an activation function. The parameter update with learning rate \varepsilon is given by W^{[l]} \leftarrow W^{[l]} - \varepsilon\, dW^{[l]} and b^{[l]} \leftarrow b^{[l]} - \varepsilon\, db^{[l]}.
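The sketch below implements one layer's forward and backward pass exactly as written in Eqs. (7) and (8), with a ReLU activation; it is only an illustration of the equations (the networks in Section III-D were trained with TensorFlow).

```python
import numpy as np

# One layer's forward/backward pass following Eqs. (7)-(8), with ReLU as g.
def relu(z):
    return np.maximum(z, 0.0)

def relu_grad(z):
    return (z > 0.0).astype(z.dtype)

def forward(W, b, A_prev):
    Z = W @ A_prev + b                      # Z[l] = W[l] A[l-1] + b[l]
    return relu(Z), Z                       # A[l] = g[l](Z[l])

def backward(W, Z, A_prev, dA):
    m = A_prev.shape[1]                     # minibatch size
    dZ = dA * relu_grad(Z)                  # dZ[l] = dA[l] (elementwise) g'(Z[l])
    dW = dZ @ A_prev.T / m                  # dW[l] = (1/m) dZ[l] A[l-1]^T
    db = dZ.sum(axis=1, keepdims=True) / m  # db[l] = (1/m) dZ[l] 1
    dA_prev = W.T @ dZ                      # dA[l-1] = W[l]^T dZ[l]
    return dW, db, dA_prev

# Gradient-descent update with learning rate eps: W -= eps * dW; b -= eps * db
```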

B. Hierarchical Neural Network

We used deep learning to determine whether the end-effector was actually pressed by an external force, how large the applied force was, and where the force was located. We particularly used a hierarchical neural network in which each network had its own purpose and all the networks were hierarchically connected to each other. The structure of the hierarchical network is shown in Fig. 4. When a force is applied to the gripper pad, the peak wavelengths of the corresponding FBGs shift, and the shifts are measured by the optical interrogator. The wavelength data are then preliminarily processed for temperature compensation and Gaussian smoothing. In this pre-processing, the wavelength shift of the temperature-compensation FBG is subtracted from that of each strain-sensing FBG rather than being treated as another input to the network, which alleviates the burden on the optimizer by reducing bias and thus saves training time. The pre-processed input data from the four FBGs first enter the contact detection network, which consists of three hidden layers with 70, 50, and 20 hidden units, respectively. The number of hidden layers and hidden units in each network was heuristically determined by a brute-force search. All the hidden layers use rectified linear unit (ReLU) activation functions. The contact detection network returns a binary output indicating whether there was a contact force. The second network has the same structure as the first one and estimates the force from the four FBG data if contact was detected by the first network. The last network also has the same structure as the first and second ones, except for an increased number of hidden units (40). In this final network, the four FBG data, the binary contact output, and the estimated force value become the input, and the network returns a two-dimensional (X, Y) tuple of the contact location on the gripper pad. The reason for stacking the networks hierarchically is that the same location result should be returned even if contacts with different forces are made on the same spot. In this way, the localization network only detects the location of the contact and is not affected by the force magnitude.
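For concreteness, here is a Keras sketch of the data flow described above. The letter reports a TensorFlow implementation but not this code; the hidden sizes follow the text, while the sigmoid output standing in for the softmax contact classifier and the joint wiring of the three separately trained networks are our assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Sketch of the hierarchical structure: contact detection -> force estimation ->
# contact localization. The paper trains the three networks separately; they are
# wired into one graph here only to show the data flow.
def mlp(x, hidden_sizes, out_units, out_activation):
    for size in hidden_sizes:
        x = layers.Dense(size, activation="relu")(x)
    return layers.Dense(out_units, activation=out_activation)(x)

fbg = tf.keras.Input(shape=(4,), name="fbg_shifts")        # preprocessed FBG data
contact = mlp(fbg, [70, 50, 20], 1, "sigmoid")             # contact / no contact
force = mlp(fbg, [70, 50, 20], 1, None)                    # force magnitude [N]
loc_input = layers.Concatenate()([fbg, contact, force])    # FBG + contact + force
location = mlp(loc_input, [70, 50, 40], 2, None)           # (X, Y) location [mm]

model = Model(inputs=fbg, outputs=[contact, force, location])
model.summary()
```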

C. Gaussian Smoothing

The optical interrogator gives the peak wavelength data of an FBG with noise, and we assume the noise has a uniform standard deviation (i.e., a Gaussian distribution). Therefore, the noise of the FBGs can be removed by a Gaussian-weighted moving average filter [28]. All training, validation, and testing data sets in this work underwent Gaussian smoothing to remove the noise from the interrogator before being used as inputs to the hierarchical neural network for learning or testing.
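A minimal sketch of such a Gaussian-weighted moving average is shown below; the window length and standard deviation are illustrative choices, not the values used in the letter.

```python
import numpy as np

# Gaussian-weighted moving average for smoothing a peak-wavelength stream.
# Window length and sigma are illustrative assumptions.
def gaussian_smooth(signal, window=9, sigma=2.0):
    t = np.arange(window) - (window - 1) / 2.0
    weights = np.exp(-0.5 * (t / sigma) ** 2)
    weights /= weights.sum()                      # normalize to unit gain
    return np.convolve(signal, weights, mode="same")

# Example on a synthetic noisy wavelength trace around 1550 nm.
raw = (1550.0 + 0.01 * np.sin(np.linspace(0.0, 6.0, 200))
       + 0.002 * np.random.randn(200))
smoothed = gaussian_smooth(raw)
```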

D. Training

We used TensorFlow in the Python 3 environment to train the networks. All the back-propagation steps used the Adam optimizer with a learning-rate decay method, with the goal of minimizing the cost functions of the three different networks. For the contact detection network, a softmax cross-entropy loss was used as the cost function because the task is a binary classification. Since the outputs of the force estimation and contact localization networks are continuous values, a mean-squared error was used as their cost function. Furthermore, mini-batch gradient descent with a batch size of 128 was used to prevent over-fitting. The learning data were divided into training, validation, and testing sets with a ratio of 8:1:1.
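A sketch of this training configuration is shown below; the decay schedule parameters and the small stand-in network are placeholders rather than the values used in the letter.

```python
import tensorflow as tf

# Training-configuration sketch: Adam with learning-rate decay, MSE loss, and
# minibatches of 128. The schedule parameters and the stand-in network are assumed.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(70, activation="relu"),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(1),          # stand-in for the force-estimation network
])
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.95)
model.compile(optimizer=tf.keras.optimizers.Adam(schedule), loss="mse")
# 8:1:1 split: hold out 10% for validation and another 10% for final testing, e.g.
# model.fit(x_train, y_train, batch_size=128, epochs=100,
#           validation_data=(x_val, y_val))
```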

IV. DATA ACQUISITION

We generated the training data by pressing 300 different points on the gripper pad using a motorized x-y-z stage with random forces of up to 70 N, gathering a total of 234,300 data tuples.


Fig. 5. (a) Horizontal (X-axis) location with the model. (b) Vertical (Y-axis) location with the model. (c) Force magnitude estimation with the model. (d) Horizontal (X-axis) location with the ANN. (e) Vertical (Y-axis) location with the ANN. (f) Force magnitude estimation with the ANN.

The magnitude of the applied force was measured by a commercial load cell. The forces were applied automatically by the motorized stage, and the load cell and FBG peak wavelength values were synchronized through a universal asynchronous receiver/transmitter (UART) connection. Each data tuple consists of nine values: the five FBG wavelengths, the contact status (on/off), the horizontal and vertical locations, and the magnitude of the applied force.
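For reference, one possible layout of such a record is sketched below; the field names and units are ours, not taken from the letter.

```python
import numpy as np

# Sketch of one training record: five FBG peak wavelengths, contact flag,
# contact location, and load-cell force. Field names and units are assumptions.
sample_dtype = np.dtype([
    ("wavelengths_nm", np.float64, (5,)),  # four strain FBGs + one temperature FBG
    ("contact", np.bool_),                 # contact on/off
    ("x_mm", np.float64),                  # horizontal contact location
    ("y_mm", np.float64),                  # vertical contact location
    ("force_n", np.float64),               # load-cell reading
])
dataset = np.zeros(234300, dtype=sample_dtype)   # 234,300 tuples in total
```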

V. RESULTS

A. Contact Localization and Force Magnitude Estimation Based on Model

Fig. 5(a)–(c) shows the results of estimating the location and the magnitude of the applied force based on the FBG data collected from the interrogator and the beam model. The root-mean-square errors (RMSE) in the X and Y directions are 1.67 mm and 9.72 mm, respectively, and the RMSE of the force estimation was 10.18 N. It is noted that the error in the location estimation was large in the low force range, and the estimation error for contact localization in the Y-axis was larger than that in the X-axis. This is mainly due to manufacturing tolerances and design imperfections between the X and Y axes, and it can be improved through machine learning.

B. Contact Localization and Force Magnitude Estimation Based on Neural Network

We implemented the neural network, and the results for estimating both contact locations and force magnitudes are shown in Fig. 5(d)–(f). The RMSEs for estimating the contact location were 0.43 mm and 1.11 mm in the X and Y axes, respectively, and the RMSE for force estimation was 1.16 N. Considering that the maximum applied force was 70 N, the learning was confirmed to be successful since the error in estimating the force was less than 2%. Also, the success rate of localizing the force contact was 98.4%.

C. Comparison With Previous Studies

To check the performance level of our method, we compared our results with studies that use sensing methods similar to ours. The intrinsic contact sensing method [29] showed a localization error of 2.64%. The force estimation error in a multi-fingered gripper [30] was 3.5%. In our approach, the localization error was 2.04% in the X-axis and 4.11% in the Y-axis, and the force estimation error was 1.66%. As a result, our approach showed higher accuracy both in force estimation and in contact localization.

VI. APPLICATION

A. Bilateral Remote Control Experiment

We carried out a simple bilateral remote control experiment using our developed end-effector and a commercial haptic device (Touch, 3D Systems) capable of force feedback. The experimental setup is shown in Fig. 6. The end-effector was combined with a two-fingered gripper (RH-P12-RN, Robotis) and a collaborative robot arm (Sawyer, Rethink Robotics).

The gripper was remotely controlled by mapping the displacement of the fingers for opening and closing motions to the vertical (Y-axis) displacement of the haptic device. The stiffness of the haptic device was varied according to the magnitude of the force feedback so that the operator was able to feel the force variation. The mission of this experiment was to control the gripping force applied to an egg using the haptic device while the robot autonomously moved the arm around. The result in Fig. 7 shows that the operator was able to successfully maintain the proper level of holding force without breaking the egg during manipulation by feeling, in real time through the haptic device, the force applied to the gripper.
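The mapping described above can be sketched as follows; the gains, ranges, and function names are illustrative placeholders, since the actual experiment went through the Touch and RH-P12-RN vendor interfaces.

```python
# Sketch of the bilateral mapping: haptic stylus Y displacement -> gripper opening,
# and sensed force -> rendered haptic stiffness. All gains and ranges are assumed.
GRIP_MIN_MM, GRIP_MAX_MM = 0.0, 100.0       # gripper finger opening range (assumed)
HAPTIC_MIN_MM, HAPTIC_MAX_MM = -60.0, 60.0  # usable stylus Y travel (assumed)
K_BASE, K_FORCE = 0.10, 0.05                # base stiffness and force gain (assumed)

def gripper_setpoint(haptic_y_mm):
    """Map the stylus Y displacement to a finger-opening command."""
    alpha = (haptic_y_mm - HAPTIC_MIN_MM) / (HAPTIC_MAX_MM - HAPTIC_MIN_MM)
    alpha = min(max(alpha, 0.0), 1.0)
    return GRIP_MIN_MM + alpha * (GRIP_MAX_MM - GRIP_MIN_MM)

def haptic_stiffness(measured_force_n):
    """Scale the rendered stiffness with the force sensed by the end-effector."""
    return K_BASE + K_FORCE * measured_force_n
```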


Fig. 6. Bilateral remote control setup. The mission is to pick up an egg without breaking it and move the robot arm using the haptic device.

Fig. 7. Experimental result of the bilateral remote control for the egg pick-up task (solid blue line: force measured by the FBG end-effector of the slave robot; solid orange line: distance between the two end-effectors of the slave robot's gripper; dotted orange line: motion of the master haptic device).

VII. DISCUSSION

A. Effect of Gaussian Filtering

We used a Gaussian smoothing method to remove the noise of the FBG data. This noise removal not only reduces the uncertainty but also improves the learning performance of the networks. In the case of the contact localization network, the success rate was 95.3% when training was performed without filtering, but it increased to 98.4% with filtering.

Fig. 8. Raw and filtered FBG data plotted with respect to the reference force applied at the yellow point. Raw data are filtered using a Gaussian-weighted moving average filter. The red, blue, and green curves correspond to FBG #2, FBG #1, and FBG #4, respectively. The graph shows the result from a single loading and unloading attempt.

While the errors occurred mainly in the low force range (less than 5 N), the success rate was 100% for applied forces above 5 N. In the case of the force estimation network, the RMSE for the test set was 1.53 N before filtering and decreased to 1.16 N after filtering. The effect of the Gaussian filter is visualized in Fig. 8, which compares the raw and filtered data when a force was applied at the location 9.4 mm and 0.9 mm from the top-left corner. It can be seen that the noise was significantly reduced after filtering.

B. Accuracy Analysis and Stability Test

The developed end-effector showed relatively low accuracy for contact localization around the boundary of the gripping pad and, over the entire sensing area, at low force levels.

The reason for the low accuracy at the boundary is that the gripping pad and the beam structure are in planar contact with each other. Due to this planar contact, the strains of the beam structures caused by the forces exerted on the boundary area have similar values. Therefore, the accuracy of the contact location estimation was considerably degraded. The problem of inaccuracy at the boundary can be solved by changing the contact method between the gripper pad and the beam structure to a point contact.

The low localization accuracy at low force levels is due to the low signal-to-noise (S/N) ratio of the interrogator. Although the interrogator (10 pm resolution) is precise enough to read microstrains, it has a high noise level relative to the sensor value in the low force region (less than 1.5 N ≈ 10 microstrain ≈ 10 pm). As a result, the force resolution of the end-effector is about 1.4 N to 1.7 N, and the location resolution varies with the force level.

One way to address this issue is to use a neural network, since the additional noise generated by the interrogator can be filtered out by the network. The interrogator used in this study has an error level of approximately 3–10 pm in peak wavelength detection.


Fig. 9. Impulse response in the time domain.

In other words, if a wavelength shift caused by a strain on a beam generated by an external force is less than 10 pm, the S/N ratio will not be high enough to detect the strain accurately. We used a neural network to remove this error from the interrogator, which generates a noise level proportional to the force level. Our hierarchical neural network first estimates the force value and then uses this value as an input for contact localization. In this way, the class of noise that is proportional to the force level can be rejected by the neural network. From our experiments, we confirmed that the S/N ratio was improved by 30% when the estimated force was used as one of the inputs of the neural network along with the FBG data.

In addition to the accuracy analysis, we also tested the stability of the proposed system by applying an impact to the end-effector. The impulse response is presented in Fig. 9 and shows a decay time of 0.04 s for the response to converge to the initial noise level of the interrogator. The resonant frequency was above 500 Hz and could not be detected, since the highest sampling frequency of the interrogator we used was 1 kHz (i.e., a Nyquist frequency of 500 Hz) [31]. This implies that our end-effector remains stable under external vibrations at least up to 500 Hz. Furthermore, since the frequency range of most physical vibrations found in nature does not exceed 500 Hz [32], we believe that our system can be used in a variety of harsh environments.

C. Further Analysis on Deep Learning

If we reproduce a nominally identical gripper that is not perfectly the same as previous ones due to manufacturing tolerances, the dataset for learning needs to be collected again. It takes about 20–25 minutes to collect the dataset using the x-y-z stage and about 20 minutes to train the neural network on the collected dataset. Therefore, re-initializing the variables and training again can be a burden when reproducing the results. However, the time to train the neural network can be significantly reduced by using the weights obtained from the previous grippers. If we initialize the weights of the new gripper's neural network with the weights already determined from the previous data, the training time will be reduced.
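This warm-start idea can be sketched as below; the file name and fine-tuning settings are placeholders.

```python
import tensorflow as tf

# Warm start for a newly fabricated gripper: initialize from the weights trained on
# a previous unit, then fine-tune on the new dataset. Names and settings are assumed.
old_model = tf.keras.models.load_model("gripper_A_force_net.h5")  # previous gripper
new_model = tf.keras.models.clone_model(old_model)                # same architecture
new_model.set_weights(old_model.get_weights())                    # copy weights
new_model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss="mse")
# Short fine-tune on data from the new gripper:
# new_model.fit(x_new, y_new, batch_size=128, epochs=20)
```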

In Fig. 8, hysteresis appears in the force-wavelength curve, which is the result of the viscoelastic behavior of the rubber pad attached to the sensing pad. Previous studies have shown that a recurrent neural network (RNN) or a Preisach-ANN model can be used to directly characterize hysteresis [33], [34].

Fig. 10. Data plotted with the PCA result: (a) overall data, and data distinguished with varying color by (b) force level, (c) X-axis location, and (d) Y-axis location.

This was not implemented in our system, since the heavy computation of an RNN may slow the system down. Instead, we focused on a simple ANN that is still accurate enough for our application.

D. Principal Component Analysis for Data Learning

In this letter, deep learning is ultimately used to predict three parameters, the contact locations in the X and Y axes and the magnitude of the applied force, from the four FBG data. In other words, the process of obtaining three outputs from the four inputs can be viewed as data encoding. Here, we show that deep learning is suitable for this prediction using principal component analysis (PCA).

We use an auto-encoder to perform PCA [22] in order to encode the four FBG data into a number of features smaller than four [21]. This auto-encoder has four inputs and a single hidden layer with up to three hidden units. When the number of hidden units is one or two, the auto-encoder fails to recover the inputs at its outputs (RMSEs of 0.16 and 0.003, respectively). In general, it becomes harder to recover the data as the number of hidden units decreases, because a lower-dimensional code implies higher data compression. In the case of three hidden units, the inputs that pass through the auto-encoder can be completely recovered (RMSE of 1 × 10⁻⁵). This means that at least three independent features are required to restore the FBG data. We plotted these three extracted features using all the training data, as shown in Fig. 10. In Fig. 10(a), all the extracted data form a cone shape described by a radial distance, an azimuthal angle, and a polar angle. Color gradation is given according to the force level, X-location, and Y-location in Fig. 10(b)–(d), where lighter colors indicate lower values. Finally, we were able to extract the X and Y locations and the magnitude of the force using the PCA with the three-feature data without any prior knowledge, since the auto-encoder is an unsupervised learning technique that encodes a certain number of features. Overall, these results support that the X and Y locations and the magnitude of the applied force can be found from the four FBG data through deep learning.
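A sketch of such an auto-encoder sweep is given below; the linear activations and training settings are our assumptions, not the exact setup used in the letter.

```python
import tensorflow as tf

# Auto-encoder sketch for the PCA-style analysis: four FBG inputs, a single hidden
# layer (the code) whose width is swept from 1 to 3, and a four-unit reconstruction.
def build_autoencoder(code_size):
    inputs = tf.keras.Input(shape=(4,))
    code = tf.keras.layers.Dense(code_size, activation="linear", name="code")(inputs)
    outputs = tf.keras.layers.Dense(4, activation="linear")(code)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# Reconstruction error vs. code size: with fewer than three hidden units the FBG
# data cannot be recovered, consistent with three underlying quantities (X, Y, F).
# for k in (1, 2, 3):
#     ae = build_autoencoder(k)
#     ae.fit(x_fbg, x_fbg, epochs=50, batch_size=128)
```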

VIII. CONCLUSION AND FUTURE WORK

We propose a new force and tactile sensing mechanism for a robotic end-effector capable of detecting both the magnitude and the location of an external contact force with high accuracy based on fiber optic sensors. The proposed end-effector does not require any additional sensors or circuits besides the FBGs, giving it the advantage of structural simplicity and robustness. We believe that our end-effector can play a key role in overcoming the limitations in the performance and efficiency of robotics and remote systems. The model-based method using the Airy function showed relatively high inaccuracy in location estimation. The learning-based method introduced to solve this problem showed higher accuracy not only for contact localization but also for force estimation.

In the future, we will redesign the gripper pad and the sensing beam structure to make point contacts between them in order to improve the accuracy at the boundary, as learned from the PCA result. We will also combine a noise-canceling auto-encoder, called a denoising auto-encoder [35], with the neural network to eliminate the additional Gaussian smoothing step.

REFERENCES

[1] K.-S. Jeong, K.-W. Lee, and H.-K. Lim, “Risk assessment on hazards for decommissioning safety of a nuclear facility,” Ann. Nucl. Energy, vol. 37, no. 12, pp. 1751–1762, 2010.
[2] K. Jeong, D. Lee, K. Lee, and H. Lim, “A qualitative identification and analysis of hazards, risks and operating procedures for a decommissioning safety assessment of a nuclear research reactor,” Ann. Nucl. Energy, vol. 35, no. 10, pp. 1954–1962, 2008.
[3] R. R. Murphy et al., “Search and rescue robotics,” in Springer Handbook of Robotics. New York, NY, USA: Springer-Verlag, 2008, pp. 1151–1173.
[4] G.-J. M. Kruijff et al., “Rescue robots at earthquake-hit Mirandola, Italy: A field report,” in Proc. IEEE Int. Symp. Saf., Secur., Rescue Robot., 2012, pp. 1–8.
[5] M. Jamshidi and P. J. Eicker, Robotics and Remote Systems for Hazardous Environments. Upper Saddle River, NJ, USA: Prentice-Hall, 1993.
[6] H. Asada, T. Kanade, and I. Takeyama, “Control of a direct-drive arm,” J. Dyn. Syst., Meas., Control, vol. 105, no. 3, pp. 136–142, 1983.
[7] H. Asada, K. Youcef-Toumi, and S. Lim, “Joint torque measurement of a direct-drive arm,” in Proc. 23rd IEEE Conf. Decis. Control, 1984, pp. 1332–1337.
[8] S. Schostek, C.-N. Ho, D. Kalanovic, and M. O. Schurr, “Artificial tactile sensing in minimally invasive surgery – A new technical approach,” Minimally Invasive Therapy Allied Technol., vol. 15, no. 5, pp. 296–304, 2006.
[9] J. Dargahi, M. Parameswaran, and S. Payandeh, “A micromachined piezoelectric tactile sensor for an endoscopic grasper – Theory, fabrication and experiments,” J. Microelectromech. Syst., vol. 9, no. 3, pp. 329–335, 2000.
[10] D. T. Pawluk, J. S. Son, P. S. Wellman, W. J. Peine, and R. D. Howe, “A distributed pressure sensor for biomechanical measurements,” J. Biomech. Eng., vol. 120, no. 2, pp. 302–305, 1998.
[11] K. O. Hill and G. Meltz, “Fiber Bragg grating technology fundamentals and overview,” J. Lightw. Technol., vol. 15, no. 8, pp. 1263–1276, Aug. 1997.
[12] Y.-J. Rao, “In-fibre Bragg grating sensors,” Meas. Sci. Technol., vol. 8, no. 4, pp. 355–375, 1997.
[13] J. Wang, Y. Zeng, C. Lin, Z. Hu, G. Peng, and Y. Hu, “A miniaturized FBG accelerometer based on a thin polyurethane shell,” IEEE Sensors J., vol. 16, no. 5, pp. 1210–1216, Mar. 2016.
[14] M. Liu, W. Wang, H. Song, S. Zhou, and W. Zhou, “A high sensitivity FBG strain sensor based on flexible hinge,” Sensors, vol. 19, no. 8, pp. 1931-1–1931-11, 2019.
[15] Y.-L. Park, S. C. Ryu, R. J. Black, K. K. Chau, B. Moslehi, and M. R. Cutkosky, “Exoskeletal force-sensing end-effectors with embedded optical fiber-Bragg-grating sensors,” IEEE Trans. Robot., vol. 25, no. 6, pp. 1319–1331, Dec. 2009.
[16] L. Jiang, K. Low, J. Costa, R. J. Black, and Y.-L. Park, “Fiber optically sensorized multi-fingered robotic hand,” in Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., 2015, pp. 1763–1768.
[17] C. Hong, Y. Yuan, Y. Yang, Y. Zhang, and Z. A. Abro, “A simple FBG pressure sensor fabricated using fused deposition modelling process,” Sens. Actuators A, Phys., vol. 285, pp. 269–274, 2019.
[18] M. A. Jucá and A. B. dos Santos, “Fiber Bragg grating interrogation using FBG filters and artificial neural network,” in Proc. SBMO/IEEE MTT-S Int. Microw. Optoelectron. Conf., 2017, pp. 1–4.
[19] G. C. Kahandawa, J. A. Epaarachchi, H. Wang, and K. T. Lau, “Prediction of obsolete FBG sensor using ANN for efficient and robust operation of SHM systems,” Key Eng. Mater., vol. 558, pp. 546–553, 2013.
[20] B.-W. Jang, Y.-G. Lee, J.-H. Kim, Y.-Y. Kim, and C.-G. Kim, “Real-time impact identification algorithm for composite structures using fiber Bragg grating sensors,” Struct. Control Health Monit., vol. 19, no. 7, pp. 580–591, 2012.
[21] Y. Wang, H. Yao, and S. Zhao, “Auto-encoder based dimensionality reduction,” Neurocomputing, vol. 184, pp. 232–242, 2016.
[22] I. Jolliffe, Principal Component Analysis. New York, NY, USA: Springer-Verlag, 2011.
[23] M. E. Gurtin, “The linear theory of elasticity,” in Linear Theories of Elasticity and Thermoelasticity. New York, NY, USA: Springer-Verlag, 1973, pp. 1–295.
[24] Y. S. Hsu, L. Wang, W.-F. Liu, and Y. J. Chiang, “Temperature compensation of optical fiber Bragg grating pressure sensor,” IEEE Photon. Technol. Lett., vol. 18, no. 7, pp. 874–876, Apr. 2006.
[25] A. Bicchi, J. K. Salisbury, and D. L. Brock, “Contact sensing from force measurements,” Int. J. Robot. Res., vol. 12, no. 3, pp. 249–262, 1993.
[26] Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, 2015.
[27] J. Schmidhuber, “Deep learning in neural networks: An overview,” Neural Netw., vol. 61, pp. 85–117, 2015.
[28] C. E. Rasmussen, “Gaussian processes in machine learning,” in Summer School on Machine Learning. New York, NY, USA: Springer-Verlag, 2003, pp. 63–71.
[29] T. Tsuji, Y. Kaneko, and S. Abe, “Whole-body force sensation by force sensor with shell-shaped end-effector,” IEEE Trans. Ind. Electron., vol. 56, no. 5, pp. 1375–1382, May 2009.
[30] T. Zhang, L. Jiang, X. Wu, W. Feng, D. Zhou, and H. Liu, “Fingertip three-axis tactile sensor for multifingered grasping,” IEEE/ASME Trans. Mechatronics, vol. 20, no. 4, pp. 1875–1885, Aug. 2015.
[31] A. V. Oppenheim, Discrete-Time Signal Processing. Chennai, India: Pearson Education India, 1999.
[32] R. Rantz and S. Roundy, “Characterization of real-world vibration sources and application to nonlinear vibration energy harvesters,” vol. 4, no. 2, pp. 67–76, 2017.
[33] D. Kim and Y.-L. Park, “Contact localization and force estimation of soft tactile sensors using artificial intelligence,” in Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., 2018, pp. 7480–7485.
[34] S. Han, T. Kim, D. Kim, Y.-L. Park, and S. Jo, “Use of deep learning for characterization of microfluidic soft sensors,” IEEE Robot. Autom. Lett., vol. 3, no. 2, pp. 873–880, Apr. 2018.
[35] C. Doersch, “Tutorial on variational autoencoders,” 2016, arXiv:1606.05908.