

This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.

IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING

Marker-Based Surgical Instrument Tracking Using Dual Kinect Sensors

Hongliang Ren, Member, IEEE, Wei Liu, and Andy Lim

Abstract—This paper presents a passive-marker-based optical tracking system utilizing dual Kinect sensors and additional custom optical tracking components. To obtain sub-millimeter tracking accuracy, we introduce robust calibration of dual infrared sensors and point correspondence establishment in a stereo configuration. The 3D localization is subsequently accomplished using multiple back projection lines. The proposed system extends existing inexpensive consumer electronic devices, implements tracking algorithms, and shows the feasibility of applying the proposed low-cost system to surgical training for computer-assisted surgeries.

Note to Practitioners—This paper is motivated by the scarcity of tracking devices for the training of novices in computer-assisted and navigated surgeries. Reliable and low-cost tracking systems are important to ensure easily available surgical training outside operating rooms. The current off-the-shelf tracking systems have been employed in operating rooms for tracking surgical instruments, typically with submillimeter accuracy, but they are highly expensive and require extensive preparation prior to surgery. Hence, the proposed cost-effective tracking system aims to bridge the gap and enable general trainees to learn computer-assisted surgery, which is otherwise only available in a limited number of labs or operating rooms.

Index Terms—Kinect sensor, navigated surgery, optical tracking, tracking.

I. INTRODUCTION

INCORPORATING accurate surgical tracking devices and integrated navigation systems, Computer-Assisted Surgery is emerging as a viable paradigm shift in surgeries and interventions. Surgical assistance systems aim to extend the capability of surgeons for planning and carrying out surgical interventions more accurately and less invasively with imaging, tracking, and positioning modules, especially when the surgeons cannot see where the surgical instruments are inside the human body. Computer-assisted systems typically register the intra-operative pose (position and orientation) information of surgical instruments with preoperative 3D models of the patients, which are typically derived from Computed Tomography scans or Magnetic Resonance Imaging. Such a computer assistance and training system will help increase surgical precision and identify unnecessary or imperfect surgical manipulations, and thus effectively increase the success rate of surgeries.

Among the widely used tracking technologies, Electromagnetic Tracking and Optical Tracking are able to achieve submillimeter accuracy nowadays.

Manuscript received July 05, 2013; revised September 06, 2013; accepted September 21, 2013. This paper was recommended for publication by Associate Editor T. Kawahara and Editor D. Tilbury upon evaluation of the reviewers' comments. This work was supported by the Singapore Academic Research Fund under Grant R397000139133 and Grant R397000157112, and by the NUS Teaching Enhancement Grant C397000039001 awarded to Dr. Hongliang Ren.

The authors are with the Department of Biomedical Engineering, National University of Singapore, Singapore 117575, Singapore (e-mail: [email protected]).

This paper has supplementary downloadable multimedia material available at http://ieeexplore.ieee.org, provided by the authors. The Supplementary Material includes an MP4 audio/video file (.mp4) of 1.61 MB, showing the system setup and tracking experiment for the article "Marker-Based Surgical Instrument Tracking Using Dual Kinect Sensors" by Hongliang Ren, Wei Liu, and Andy Lim.

Digital Object Identifier 10.1109/TASE.2013.2283775

Fig. 1. The setup of the instrument tracking system includes a pair of Kinect sensors, IR light sources, IR filters, and a supporting tripod.

Electromagnetic tracking systems use theoretical model fitting to determine the pose of the sensing coils within an excited magnetic field. Electromagnetic tracking systems are susceptible to distortion errors if the working magnetic field is interfered with by ferromagnetic materials [1], [2]. The optical tracking systems triangulate the position of surgical instruments using active or passive markers and photosensors [3]–[5]. Typical example systems, such as the Polaris or Vicra systems from Northern Digital Inc., were claimed to have a root-mean-square tracking accuracy of 0.25 mm [6].

It is critical for novices to have hands-on training in computer-assisted surgical procedures, which are typically navigated surgeries. However, this is constrained by the costly commercial surgical tracking devices involved, which are typically used in clinical procedures but not widely available for training purposes.

Therefore, the main objective and primary contribution of this paper is to develop a cost-effective surgical instrument tracking system with sub-millimeter accuracy for boosting extensive surgical training, and we resort to the off-the-shelf, inexpensive Microsoft Kinect sensors [7]. This is enabled by both the hardware development based on multiple sensing modules and the software implementations for calibration, correspondence establishment, and 3D point reconstruction, as illustrated in the following sections.

II. METHODS

A. Materials

As shown in Fig. 1, the proposed marker-based surgical tracking system exploits dual Kinect sensors, which are placed side by side with overlapping fields of view. Each Kinect sensor consists of an infrared (IR) projector, an IR camera, an RGB camera, a four-microphone array, and a motor to adjust the tilt [8]. The IR projector can project a pseudo-random, nonrepeating pattern of infrared points onto surfaces [7]. By default, the Kinect produces depth video at 640×480 pixels and 30 Hz. It is also capable of capturing pictures at 1280×960 pixels, twice the default resolution in each dimension. Images of this higher resolution were used in the calibration procedures to obtain better accuracy.

There were earlier studies mainly on the marker-less tracking of objects using Kinect sensors. Noonan et al. [7] performed head tracking using the RGB-D data obtained from the Kinect and iteratively matched them with a template extracted from a head-phantom CT to determine the pose and orientation of the head. To segment transparent objects from the background, Lysenkov et al. [9] made use of the fact that the Kinect is unable to estimate depth where transparent objects are positioned. As a result, the silhouette of the transparent objects can be observed with ease from the depth image and then iteratively matched with a silhouette generated from a prior. Oikonomidis et al. [10] developed a Kinect-based tracking system capable of tracking complex hand articulations; their system matches the observed hand configuration with that of a prior template by using an optimization algorithm known as particle swarm optimization (PSO).



Fig. 2. Image of a checkerboard captured (a) under no IR illumination; (b) under illumination from the built-in IR projector (obtained from the ROS wiki), which is not desirable for tracking purposes; and (c) under illumination by the LED IR light source.

Fig. 3. Rigid bodies to be attached with passive markers for tracking purposes. The dimensions of the markers and the true distances $d_{ij}$ between each marker pair $m_i$ and $m_j$ were given by the manufacturer.

These tracking methods utilize the depth-sensing capability of the Kinect sensor to observe 3D objects and a calibration module in ROS (www.ros.org).

Towards accurate tracking, the approach proposed in this paper differs from earlier studies in the following respects. First, dedicated markers are employed to provide robust and accurate tracking when compared with the marker-less approaches; the marker-based approach is also the general practice in commercial surgical tracking systems. Second, custom active illuminators and IR filters are employed to further improve tracking performance. Third, the expected applications of the system are in surgical environments rather than conventional scene understanding, where accuracy is not the top priority. Moreover, the proposed approach is new in that it employs dual Kinect sensors in order to obtain a larger field of view within the surgical environment.

The overlapping field of view of the cameras forms the working volume of the tracking system. The viewing angle of the Kinect is approximately 43° along the vertical axis and 57° along the horizontal axis. Stereo triangulation between the two RGB cameras and the two infrared (IR) cameras of the two Kinect sensors will produce two sets of 3D coordinates for the localization of a marker. Unlike previous studies such as [7], [9], and [10], only IR images are used to determine the positions of the 3D passive markers. The depth-sensing information of the Kinect is not utilized in this paper, but it can be fused in the future for instrument tracking.

It was noted that under ambient conditions, there was insufficient IR illumination to produce images with discernible contrast [Fig. 2(a)]. The built-in IR projector on the Kinect sensor could be a potential IR light source, but it produces speckled patterns that might result in a grainy and noisy IR image [Fig. 2(b)]. Hence, the IR projector on each Kinect was physically obscured, and two 850 nm IR LED light sources were mounted on top of each Kinect sensor to provide adequate illumination of the reflective marker spheres. Meanwhile, 850 nm IR pass filters are used to enhance the image perception; Fig. 2(c) is an example of an image acquired through the filtered lens under this illumination. These components, including IR illuminators (e.g., thorlabs.de) and IR filters (e.g., naturalpoint.com), are commercially available and inexpensive. Furthermore, more hardware options are available for building such systems, with competitors to Microsoft's Kinect sensors such as the ASUS Xtion Pro Live (ASUS Inc.). This makes it feasible to build a low-cost surgical tracking system.

Fiducial markers can be placed on a Polaris Passive 4-Marker Probe as well as on rigid bodies (Fig. 3). The next step is to implement a tracking method based on the stereo configuration and fiducial markers.

B. Calibration of the Kinect RGB and IR Cameras

The Kinect comes with factory calibration, and the distortion is not visibly evident to the naked eye. However, for the purpose of surgical application, further calibration is essential in order to enhance the tracking accuracy. This calibration is performed using the Camera Calibration Toolbox designed in [11]. The pinhole camera model is assumed in this calibration. The calibration yields the intrinsic parameters of the cameras, which relate the position of a 3D point to its pixel position. An initial calibration is done with the IR and the RGB cameras of the Kinect sensors. Subsequently, the extrinsic camera parameters are obtained, i.e., the relative transformation of the cameras with respect to each other. The extrinsic camera parameters are important for the determination of the epipolar geometry. A stereo calibration between the two cameras is then performed using the same calibration toolbox. The left camera is taken as the world coordinate frame, and the camera coordinates of the right camera are determined with respect to the left one.
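As a concrete illustration, a minimal sketch of an equivalent intrinsic and stereo calibration is given below using OpenCV instead of the MATLAB toolbox of [11]. The checkerboard geometry, square size, and image file names are assumptions made for this example, not values taken from the paper.

```python
# Sketch: intrinsic calibration of each IR camera followed by stereo
# calibration, with the left camera taken as the world coordinate frame.
# Board size, square size, and file names are illustrative assumptions.
import glob
import cv2
import numpy as np

BOARD = (8, 6)     # inner-corner grid of the checkerboard (assumed)
SQUARE = 0.030     # checkerboard square size in meters (assumed)

# 3D corner positions in the board frame (Z = 0 plane).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left_*.png")),
                  sorted(glob.glob("right_*.png"))):
    img_l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    img_r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, c_l = cv2.findChessboardCorners(img_l, BOARD)
    ok_r, c_r = cv2.findChessboardCorners(img_r, BOARD)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(c_l)
        right_pts.append(c_r)

size = img_l.shape[::-1]
# Intrinsics (K, distortion) of each camera under the pinhole model.
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
# Extrinsics (R, T) of the right camera with respect to the left one.
_, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
```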

C. Algorithm for Marker Detection

The reflective spherical markers appear as bright circles in the IR images. In each image frame, the circles are detected using the seeded region growing (SRG) algorithm, which segments intensity images rapidly and robustly without the need to tune parameters [12]. In SRG, the regions of the image with pixel intensities below a threshold are excluded, and only the regions with intensities above the threshold are used as seeds. Subsequently, the seeded regions grow. An analogy for the growing of regions is that the seeds are points where water is poured, and pixels of similar intensity are flat ground. As water pours from the seeds, regions of similar intensity are flooded and become homogenized [12]. A region stops growing when the water encounters a boundary, where the pixel intensities are significantly different. Eventually, only the homogeneous regions remain to represent the reflective marker spheres. In our experiments, the markers are determined to be regions that have radii ranging from 3 to 12 pixels. Finally, the centers of the markers are found from the centers of mass of the regions.
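A minimal sketch of this detection step follows. For brevity it approximates the seeded region growing of [12] with a global threshold followed by connected-component analysis (for well-isolated bright spheres the seeds and the grown regions coincide); the threshold value is an assumption, while the 3 to 12 pixel radius gate and the center-of-mass computation follow the text.

```python
# Sketch: segment bright regions in an IR frame, gate them by equivalent
# radius (3-12 px), and return the centers of mass as marker centers.
import cv2
import numpy as np

def detect_marker_centers(ir_gray, thresh=200, r_min=3.0, r_max=12.0):
    # Pixels above the threshold act as the seeded (bright) regions.
    _, mask = cv2.threshold(ir_gray, thresh, 255, cv2.THRESH_BINARY)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    centers = []
    for i in range(1, n):                                # label 0 = background
        r = np.sqrt(stats[i, cv2.CC_STAT_AREA] / np.pi)  # equivalent radius
        if r_min <= r <= r_max:
            centers.append(centroids[i])                 # region center of mass
    return np.asarray(centers)
```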

D. Determination of Correspondence

Following the determination of the centers of the four markers in the images from both the left and the right cameras, the correspondence between these points has to be determined (Fig. 4). This is done using the fundamental matrix, which represents the intrinsic projective geometry between the two views. It can be calculated from the camera matrices of both cameras and the known position of one camera with respect to the other. The fundamental matrix $F$ between the two cameras, a $3\times 3$ matrix of rank 2 [13], is defined by

$$\mathbf{l}' = F\mathbf{x} \qquad (1)$$

where $\mathbf{x}$ is the projection of a 3D point $\mathbf{X}$ on the image plane of one of the cameras and $\mathbf{l}'$ is the epipolar line of $\mathbf{x}$ in the other camera's image plane. $\mathbf{x}$ and $\mathbf{l}'$ are in homogeneous coordinates. The line $\mathbf{l}'$ is constructed such that the point $\mathbf{x}'$, which is the projection of $\mathbf{X}$ on the image plane of the second camera, is incident on it. The opposite of this relation holds as well, and $\mathbf{x}$ has to lie on the corresponding epipolar line $\mathbf{l} = F^\top\mathbf{x}'$. In addition, it is known that when a point is incident on a line, the dot product between the homogeneous coordinates of the point and the line is equal to zero. Therefore

$$\mathbf{x}'^\top \mathbf{l}' = 0 \qquad (2)$$

and it follows that

$$\mathbf{x}'^\top F \mathbf{x} = 0. \qquad (3)$$
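As a sketch of how $F$ can be assembled from the stereo calibration of Section II-B, the snippet below uses the standard relation $F = K_R^{-\top}[\mathbf{t}]_\times R\,K_L^{-1}$, assuming the convention that $(R, \mathbf{t})$ maps left-camera coordinates to right-camera coordinates; this is an illustration, not the authors' implementation.

```python
# Sketch: fundamental matrix from intrinsics (K1, K2) and stereo
# extrinsics (R, t), via the essential matrix E = [t]x R.
import numpy as np

def fundamental_from_calibration(K1, K2, R, t):
    t = t.reshape(3)
    tx = np.array([[0.0, -t[2], t[1]],
                   [t[2], 0.0, -t[0]],
                   [-t[1], t[0], 0.0]])   # skew-symmetric matrix [t]x
    E = tx @ R                            # essential matrix
    F = np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)
    return F / np.linalg.norm(F)          # fix overall scale; rank stays 2
```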


Fig. 4. Correspondence between marker points in images captured by the left and right IR cameras of the Kinect sensors.

Suppose that $\mathbf{x}_L$ and $\mathbf{x}_R$ are the images of a 3D point in the left and right cameras, respectively. It can then be derived that

$$\mathbf{x}_R^\top F \mathbf{x}_L = 0. \qquad (4)$$

This relation holds true in the absence of noise, but in actual images the term $\mathbf{x}_R^\top F \mathbf{x}_L$ has a nonzero value. However, it is postulated that it should still yield a reasonably small value. Hence, the correspondence can be determined by computing $|\mathbf{x}_R^\top F \mathbf{x}_L|$ between a point on one image and the four points on the other, and choosing the pairing with the lowest value.
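A minimal sketch of this pairing test is given below: it evaluates $|\mathbf{x}_R^\top F \mathbf{x}_L|$ over all pairings of the detected centers and keeps, for each left point, the right point with the lowest epipolar residual. It illustrates the criterion described above rather than the authors' exact code.

```python
# Sketch: epipolar-residual matching of marker centers between views.
import numpy as np

def match_markers(pts_left, pts_right, F):
    # pts_left, pts_right: (N, 2) pixel coordinates of detected centers.
    hl = np.hstack([pts_left, np.ones((len(pts_left), 1))])   # homogeneous
    hr = np.hstack([pts_right, np.ones((len(pts_right), 1))])
    cost = np.abs(hr @ F @ hl.T)    # cost[j, i] = |x_Rj^T F x_Li|
    return cost.argmin(axis=0)      # best right index for each left point
```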

E. 3D-Point Reconstruction (3DPR)

3D point reconstruction plays a key role in the system, and a fast and robust reconstruction method is desirable for tracking purposes. Triangulation-based reconstruction methods have been used for binocular systems.

In ideal cases, the 3D coordinates of the target can be triangulated from its 2D corresponding points in each image plane according to the pinhole model. However, due to model error, image noise, lens distortion, etc., traditional optimization-based reconstruction methods are time-consuming and sensitive to the initial guess. We adopt the geometry-based method of finding the midpoint of the perpendicular segment between two back projection lines (BPLs) [14]. A BPL is a line in 3D space that projects from the camera center towards the 3D point. This method acknowledges the fact that, in practice, the BPLs from the two cameras are unlikely to meet in 3D space. Hence, the objective is to minimize the distance between the two BPLs, i.e., the distance between the perpendicular feet (PFs) of the two back-projected lines [14]. The midpoint between the two PFs is a good estimate of the reconstructed 3D point. Without involving iterative optimization, the algorithm is light on computational requirements, making it a good algorithm for real-time surgical application. In the midpoint method, the BPL has to be determined first. The pinhole model can be expressed as

$$s\,\tilde{\mathbf{m}} = K\,[R \mid \mathbf{t}]\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad (5)$$

where $s$ is a nonzero constant, $\tilde{\mathbf{m}}$ is the homogeneous 2D pixel coordinate in the IR camera image, and $X$, $Y$, and $Z$ are the coordinates of a 3D point. The equation shows the relationship between the 3D point and the pixel position of its image. $R$ and $\mathbf{t}$ are the rotation matrix and the translation vector, respectively, that relate the camera center to the world coordinates, and $K$ is the intrinsic matrix of the camera. From (5), the equation of the BPL can be expressed as

$$\mathbf{P}(\lambda) = \mathbf{C} + \lambda\,\mathbf{d} \qquad (6)$$

where $\mathbf{P}(\lambda)$ is the back projection line, i.e., any 3D point along the BPL with parameter $\lambda$; $\mathbf{d}$ is the vector representing the direction of the back projection line, which can be determined from the 2D image coordinates $\tilde{\mathbf{m}}$ and the projection matrix $K[R \mid \mathbf{t}]$; the parameter $\lambda$ is a real number larger than zero; and $\mathbf{C}$ is the 3D world coordinate of the camera center. The BPL passes through each camera center and its 2D image point. The parameter $\lambda$ describes the 3D line in a parameterized way; different scaling parameters $\lambda$ can be calculated [14], each corresponding to a different 3D point on the BPL $\mathbf{P}(\lambda)$.
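The sketch below illustrates the midpoint method: each BPL is built from the camera center $\mathbf{C}$ and a direction $\mathbf{d}$ derived from (5), and the two perpendicular feet are found by solving a 2×2 linear system that makes the connecting segment perpendicular to both lines. The world-to-camera convention $\mathbf{x}_{cam} = R\mathbf{X} + \mathbf{t}$ is an assumption of this example; see [14] for the exact formulation.

```python
# Sketch: back projection lines and midpoint 3D reconstruction.
import numpy as np

def back_projection_line(K, R, t, pixel):
    m = np.array([pixel[0], pixel[1], 1.0])   # homogeneous pixel coordinate
    C = -R.T @ t.reshape(3)                   # camera center in world frame
    d = R.T @ np.linalg.inv(K) @ m            # BPL direction in world frame
    return C, d / np.linalg.norm(d)

def midpoint_reconstruction(C1, d1, C2, d2):
    # Perpendicular feet P1 = C1 + a*d1, P2 = C2 + b*d2: require the
    # segment P1-P2 to be orthogonal to both d1 and d2.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    rhs = np.array([(C2 - C1) @ d1, (C2 - C1) @ d2])
    a, b = np.linalg.solve(A, rhs)
    return 0.5 * ((C1 + a * d1) + (C2 + b * d2))   # midpoint of the feet
```

Because the solution is closed-form, the per-point cost is a handful of vector operations with no iteration, which is consistent with the real-time requirement stated above.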

Fig. 5. RMS error against the average distance in the $z$-direction.

Fig. 6. Marker-distance measurement error, that is, the normalized differences between measured and actual distances. $d_{12}$ stands for the distance between markers 1 and 2, $d_{23}$ for the distance between markers 2 and 3, etc.


III. EXPERIMENTAL RESULTS

The experimental data for the RMS error are obtained at varying distances from the Kinect tracking system. 150 data points are obtained at each distance so that the mean and the standard deviation can be determined. The average time taken for acquiring each data point is approximately 6 ms. The RMS error is plotted against the $z$-distance from the left camera (Fig. 5). The RMS error is calculated as follows. A normal vector to the best-fit plane is computed from the 3D positions of the four markers recovered by the tracking system. Three points define a plane, so four points result in a redundancy; due to the presence of noise in our data, it is unlikely that any single plane contains all four detected marker positions. After the determination of the best-fit plane, the root-mean-square (RMS) of the perpendicular distances between the markers and the best-fit plane is computed and taken to be the error.
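A short sketch of this metric follows, assuming the four reconstructed marker positions are stacked in a 4×3 array: the normal of the least-squares plane is the right-singular vector associated with the smallest singular value of the centered points, a standard plane-fitting result.

```python
# Sketch: RMS of perpendicular distances from the markers to the
# best-fit plane, used here as the co-planarity error.
import numpy as np

def coplanarity_rms(points):
    centered = points - points.mean(axis=0)   # plane passes the centroid
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]                           # best-fit plane normal
    dists = centered @ normal                 # signed point-plane distances
    return np.sqrt(np.mean(dists ** 2))
```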

As the average distance from the markers to the left camera is increased from 0.4 to 1.2 m, the RMS error shows a generally increasing trend. The average error decreases from 0.17 to 0.052 mm as the probe is moved from 0.49 to 0.64 m, but subsequently increases to 0.13 mm at 0.74 m. At a distance of 1.0 m, the average error is approximately 0.30 mm. This suggests that about 0.7 m is an optimal distance for this tracking system. In addition, compliance with the co-planarity constraint alone is not sufficient as an estimate of the accuracy of the tracking system. The true distances $d_{ij}$ between markers $i$ and $j$ are also known from the manufacturer's specifications.


The percentage difference of the measured distances with respect to the actual distances is shown in Fig. 6. It is observed that the measured dimensions are all within 2% of the actual dimensions.

IV. CONCLUSION AND FUTURE WORK

The proposed surgical tracking system uses the images captured by the infrared cameras of a pair of Kinect sensors with additional illumination sources. Proof-of-concept experiments were performed to validate the proposed dual-Kinect tracking system and showed submillimeter tracking accuracy, which is deemed feasible for applications in surgical training. This paper focused on using the IR cameras of two Kinect sensors for the tracking of retroreflective markers. The next step is to combine the data from the IR cameras with data from other sensing modalities, such as the RGB camera of the Kinect sensor, to enhance the robustness and accuracy of the tracking system.

REFERENCES

[1] H. Ren and P. Kazanzides, "Investigation of attitude tracking using an integrated inertial and magnetic navigation system for hand-held surgical instruments," IEEE/ASME Trans. Mechatronics, vol. 17, no. 2, pp. 210–217, Apr. 2012.

[2] H. Ren, D. Rank, M. Merdes, J. Stallkamp, and P. Kazanzides, "Multi-sensor data fusion in an integrated tracking system for endoscopic surgery," IEEE Trans. Inform. Technol. Biomed., vol. 16, no. 1, pp. 106–111, Jan. 2012.

[3] J. Stoll, H. Ren, and P. E. Dupont, "Passive markers for tracking surgical instruments in real-time 3D ultrasound imaging," IEEE Trans. Med. Imaging, vol. 31, no. 3, pp. 563–575, Mar. 2012.

[4] H. Ren, W. Liu, and S. Song, "Towards occlusion-resistant surgical instrument tracking," Bone Joint J. Orthopaedic Proc. Supplement, vol. 95-B, no. SUPP 28, p. 52, 2013.

[5] H. Ren and P. Kazanzides, "A paired-orientation alignment problem in a hybrid tracking system for computer assisted surgery," J. Intell. Robot. Syst., vol. 63, pp. 151–161, 2011.

[6] K. Cleary and T. M. Peters, "Image-guided interventions: Technology review and clinical applications," Annu. Rev. Biomed. Eng., vol. 12, pp. 119–142, 2010.

[7] P. Noonan, T. Cootes, W. Hallett, and R. Hinz, "The design and initial calibration of an optical tracking system using the Microsoft Kinect," in Proc. IEEE Nuclear Sci. Symp. Med. Imaging Conf. (NSS/MIC), 2011, pp. 3614–3617.

[8] Z. Zhang, "Microsoft Kinect sensor and its effect," IEEE Multimedia, vol. 19, no. 2, pp. 4–10, 2012.

[9] I. Lysenkov, V. Eruhimov, and G. Bradski, "Recognition and pose estimation of rigid transparent objects with a Kinect sensor," in Proc. Robotics: Sci. Syst. Conf., 2012.

[10] I. Oikonomidis, N. Kyriazis, and A. Argyros, "Efficient model-based 3D tracking of hand articulations using Kinect," in Proc. British Machine Vision Conf., 2011, pp. 1–11.

[11] J. Heikkila, "Geometric camera calibration using circular control points," IEEE Trans. Pattern Anal. Machine Intell., vol. 22, no. 10, pp. 1066–1077, Oct. 2000.

[12] R. Adams and L. Bischof, "Seeded region growing," IEEE Trans. Pattern Anal. Machine Intell., vol. 16, no. 6, pp. 641–647, 1994.

[13] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision. Cambridge, U.K.: Cambridge Univ. Press, 2000, vol. 2.

[14] Q. He, C. Hu, W. Liu, N. Wei, M.-H. Meng, L. Liu, and C. Wang, "Simple 3-D point reconstruction methods with accuracy prediction for multiocular system," IEEE/ASME Trans. Mechatronics, vol. 18, no. 1, pp. 366–375, 2013.