
Robot Tool Center Point Calibration using Computer Vision

Master's Thesis in Computer Vision
Department of Electrical Engineering, Linköping University

by

Johan Hallenberg

LiTH-ISY-EX--07/3943--SE

Department of Electrical Engineering
Linköping University, SE-581 83 Linköping, Sweden

Linköping, February 2007


Printed by: UniTryck, Linköping, Sweden

Distributed by: Linköping University, Department of Electrical Engineering, SE-581 83 Linköping, Sweden


Robot Tool Center Point Calibration using Computer Vision

Master's Thesis at the Computer Vision Laboratory, Linköping University

by

Johan Hallenberg

Reg nr: LiTH-ISY-EX--07/3943--SE

Supervisors: Klas Nordberg, ISY, Linköping University
             Ivan Lundberg, ABB Corporate Research Center, Sweden

Examiner: Klas Nordberg, ISY, Linköping University

Linköping, 2007-02-05


Abstract

Robot Tool Center Point Calibration using Computer Vision
By: Johan Hallenberg
MSc, Linköping University, February 2007

Examiner: Klas Nordberg
Supervisors: Klas Nordberg and Ivan Lundberg

Today, tool center point calibration is mostly done by a manual procedure. The method is very time consuming and the result may vary depending on how skilled the operator is.

This thesis proposes a new automated iterative method for tool center point calibration of industrial robots, making use of computer vision and image processing techniques. The new method has several advantages over the manual calibration method. Experimental verification has shown that the proposed method is much faster while delivering comparable or even better accuracy. The setup of the proposed method is very easy: only one USB camera connected to a laptop computer is needed, and no contact with the robot tool is necessary during the calibration procedure.

The method can be split into three parts. Initially, the transformation between the robot wrist and the tool is determined by solving a closed loop of homogeneous transformations. Second, an image segmentation procedure is described for finding point correspondences on a rotation symmetric robot tool. The image segmentation part is necessary for measuring the camera to tool transformation with six degrees of freedom. The last part of the proposed method is an iterative procedure which automates an ordinary four point tool center point calibration algorithm. The iterative procedure ensures that the accuracy of the tool center point calibration depends only on the accuracy of the camera when registering a movement between two positions.


Acknowledgements

I have had the opportunity to write my Master's Thesis at ABB Corporate Research Center, Mechatronics department, in Västerås. This period has been a great experience for me and I have really enjoyed working on the project.

I want to thank my supervisor, Development Engineer Ivan Lundberg at ABB Corporate Research Center, for his great enthusiasm, all the conversations, and the excellent supervision throughout the project.

I also want to thank my supervisor and examiner, Associate Professor Klas Nordberg at Linköping University, for all the great phone conversations, the good tips and all the support and guidance throughout the project.

Special thanks also to PhD Mats Andersson at the Center for Medical Image Science and Visualization for his great interest in the project and helpful ideas.

Linköping, February 2007
Johan Hallenberg


Notation

Symbols

• $f(x, y)$ denotes a 2D function or an image.

• $x$, $X$ denote scalar values.

• $\mathbf{x}$, $\mathbf{X}$ denote vectors or coordinates.

• $\mathbf{X}$ denotes a matrix.

Operators

• $f(x, y) * g(x, y)$ denotes the convolution between the image $f(x, y)$ and the image $g(x, y)$.

• $f(x, y) \star g(x, y)$ denotes the cross correlation between the image $f(x, y)$ and the image $g(x, y)$.

• $X^T$ denotes the transpose of $X$.

• $X^\dagger$ denotes the pseudo inverse of $X$.

Glossary

TCP         Tool Center Point.
Jog         Manually move the robot with the joystick.
Base frame  The robot's base coordinate system.
Robtarget   Cartesian target.
DOF         Degrees of freedom.
Q.E.D.      "Quod Erat Demonstrandum", a Latin phrase meaning "which was to be demonstrated".


Contents

1 Introduction
  1.1 Background
  1.2 Problem Specification/Objectives
  1.3 Delimitation
  1.4 Thesis Outline

2 Robotics
  2.1 Overview
  2.2 Coordinate Systems
    2.2.1 Tool Center Point, TCP
    2.2.2 Base frame
    2.2.3 World frame
    2.2.4 Wrist frame
    2.2.5 Tool frame
  2.3 TCP Calibration algorithm

3 Theory
  3.1 Image Processing Theory
    3.1.1 Threshold
    3.1.2 Morphological operations
    3.1.3 Structure Element
      3.1.3.1 Erosion
      3.1.3.2 Dilation
      3.1.3.3 Opening
      3.1.3.4 Closing
  3.2 Computer Vision Theory
    3.2.1 Camera System
    3.2.2 Pinhole Camera Model
    3.2.3 Camera Calibration
  3.3 Transformation Theory
    3.3.1 Homogeneous Transformation
      3.3.1.1 3D Translation
      3.3.1.2 3D Rotation
      3.3.1.3 Homogeneous Transformation Matrix
    3.3.2 Screw Axis Rotation representation
    3.3.3 Rodrigues's formula

4 Determination of Tool Center Point
  4.1 Overview of the system
    4.1.1 Equipment
  4.2 Determine T2
  4.3 Finding X1 and X2 transformation matrices
  4.4 Importance of measuring A and B with 6 DOF
  4.5 Iterative Method for increasing the accuracy
  4.6 Alternative method for rotation symmetric tools

5 Image Segmentation of Robot Tool
  5.1 Overview
    5.1.1 Open Computer Vision Library (OpenCV)
  5.2 Creation of binary mask image
    5.2.1 Image registration
    5.2.2 Background subtraction
      5.2.2.1 The Hough filling method (HFM)
  5.3 Edge Detection
    5.3.1 Canny's Edge Detection
  5.4 Contour Retrieving
    5.4.1 Freeman Chain Code Contour representation
    5.4.2 Polygon Representation
    5.4.3 Freeman Methods
    5.4.4 Active Contour (The Snake algorithm)
  5.5 Finding the contour matching the tool shape
    5.5.1 Convex Hull and its defects
    5.5.2 Polygonal Approximation of Contours
      5.5.2.1 Douglas-Peucker Approximation Method
  5.6 Geometric constraints
  5.7 Corner Detection
    5.7.1 Harris corner detector
      5.7.1.1 Sub pixel Accuracy
    5.7.2 Fast radial
    5.7.3 Orientation Tensor

6 Results
  6.1 Repetition Accuracy of the Camera
    6.1.1 Averaging to achieve higher accuracy
    6.1.2 How does the number of points affect the repetition accuracy
  6.2 Camera versus Robot measurements
    6.2.1 Rotation measurements
    6.2.2 Translation measurements
      6.2.2.1 Averaging to achieve higher accuracy
  6.3 Accuracy of the TCP Calibration method
    6.3.1 Averaging to obtain a higher accuracy
  6.4 Repetition Accuracy of the TCP Calibration method
    6.4.0.1 How does the number of points affect the repetition accuracy of the TCP calibration method

7 Discussion
  7.1 Conclusion
    7.1.1 Accuracy
    7.1.2 Major Problems/known difficulties
      7.1.2.1 Finding distinct points
      7.1.2.2 Light reflections in the tool
    7.1.3 Fulfillment of objectives
  7.2 Future Development
    7.2.1 Rotation symmetric tools
    7.2.2 Light reflections in the tool
    7.2.3 Image Segmentation controlled by CAD drawings
    7.2.4 Neural network based methods
    7.2.5 Online correction of the TCP calibration
    7.2.6 Other image processing libraries

A Appendix
  A.1 Finding intrinsic parameters

Bibliography


Chapter 1

Introduction

This chapter begins with a short background, then the problem is specified and the objectives and delimitations are determined. Finally the disposition of the thesis is presented.

1.1 Background

All robots delivered by ABB are calibrated in production. When the calibration procedure is done, the robot is calibrated up to its last axis and is thereby able to calculate the positions of all axes during a movement. When a tool is mounted on the last axis (the tool flange), the robot needs to know the actual position of the active point of the tool, the tool center point, which for instance can be the muzzle of a spot welding tool. For this reason a tool center point calibration has to be performed every time the tool is changed.

Today the TCP calibration is done manually by moving the robot, letting the TCP brush against a fixed point in the environment. The fixed point is typically the tip of a nail. The tool center point needs to brush against the tip of the nail with great precision and from at least four different angles. Then the coordinate of the tool center point can be calculated by the robot in relation to the robot's tool flange coordinate system. The method is time consuming and the result may vary depending on how skilled the robot operator is. The resulting accuracy of today's method is approximately ±1 mm.

Products for automating the TCP calibration exist, for instance BullsEye for calibration of arc welding tools and TCP-Beam for calibration of spot welding tools. The disadvantage of these methods is that each is specialized for a single type of robot tool.


1.2 Problem Specification/Objectives

The purpose of this thesis is to investigate whether or not a computer vision method can be used to automate the tool center point calibration process. If such a method can be found, its accuracy is also of great interest. The objective is to find a method with an accuracy of at least ±1 mm that automates the tool center point calibration procedure. For development of the method C#, C/C++ and Matlab are allowed to be used, but the final demonstration software shall be programmed in C# and C/C++.

1.3 Delimitation

Delivering a method with the ability to calibrate all kinds of robot tools during a 5 month Master's thesis is of course impossible. Instead, the method itself is what matters, and any kind of tool may be used to demonstrate it.

1.4 Thesis Outline

Chapter 1   Describes the background to the project. The problem specification, delimitations and the thesis outline are also presented.

Chapter 2   Gives a short introduction to robots and concludes with a description of a four point tool center point calibration algorithm.

Chapter 3   Presents theory needed to understand the rest of the thesis. The chapter is divided into image processing theory, computer vision theory and transformation theory.

Chapter 4   Describes the proposed method in detail and presents the configuration of the devices.

Chapter 5   Describes in detail how the image segmentation of the robot tool was done.

Chapter 6   Presents the results of several tests performed to investigate the accuracy of the proposed method.

Chapter 7   Presents the conclusions and difficulties of the project. The chapter concludes with a discussion of future development.

Appendix A  Presents some results from the camera calibration, performed with the Camera Calibration Toolbox for Matlab.


Chapter 2

Robotics

This chapter will give an overview of robots and their coordinate systems. The chapter concludes with a description of a four point tool center point calibration algorithm used in the robots today.

2.1 Overview

Traditionally a robot consists of a mechanical arm and a computer controlling its position and movement. In this project only serial kinematics manipulators are considered. A serial kinematics manipulator consists of several axes connected by joints, see figure 2.1.

Figure 2.1: The ABB Robot IRB 6600 (Courtesy of ABB).


Usually the mechanical arm (the manipulator) is divided into an arm, a wrist and an end-manipulator (the tool). All the manipulators used during this project have six degrees of freedom, making it possible to position the tool anywhere in the robot workspace with a given orientation. Internally the robot end effector orientation is represented by quaternions.

The ABB robots can be controlled manually using the FlexPendant, a hand held controller connected to the computer controlling the robot. A robot program telling the robot how to move can be written in the programming language RAPID. RAPID programs can be written directly on the FlexPendant, but it is also possible to develop the RAPID program on a PC and then transfer the program (over a TCP/IP connection or a USB connection) to the computer controlling the robot.

2.2 Coordinate Systems

2.2.1 Tool Center Point, TCP

When a robot movement is programmed by specifying a path of robtargets for the robot to follow, all robot movements and robot positions are relative to the Tool Center Point (TCP). Normally the tool center point is defined to be the active point of the tool, e.g. the muzzle of a spot welding tool or the center of a gripper.

Several tool center points can be defined, e.g. one for each tool, but only one tool center point can be active at a given time. When the robot is programmed to move along a given path, it is the tool center point that follows the actual path. It is possible to define other coordinate systems and program the robot to move according to these coordinate systems. The tool center point is then expressed in relation to the coordinate system used in the program [1].

2.2.2 Base frame

The robot base coordinate system is located on the robot base. The origin of the base frame is located at the intersection of axis 1 and the robot's mounting surface. The x axis points forward, the y axis points to the robot's left side and the z axis coincides with the rotational axis of axis 1. Figure 2.2 illustrates the base frame [1].


Figure 2.2: The base coordinate system (Courtesy of ABB).

2.2.3 World frame

If several robots are working in the same area, a world frame can be set up and the robots' base coordinate systems can be expressed in relation to the world coordinate system. It is then possible to make RAPID programs telling the robot to move to a certain position in the world frame. If one of the robots is mounted upside down, the definition of a world frame will of course simplify the programming of the robot, see figure 2.3 [1].

Figure 2.3: The world coordinate system (Courtesy of ABB).

2.2.4 Wrist frame

The wrist frame is fixed to the tool flange/mounting flange (the surface where the tool is to be mounted). The origin of the wrist frame is positioned in the center of the tool flange and the z axis coincides with axis six of the robot. There is a calibration mark on the tool flange, see figure 2.5. The x axis of the wrist frame always points in the opposite direction of the calibration mark and the y axis is obtained by constructing an axis orthogonal to the x and z axes [1].

Figure 2.4: The wrist coordinate system (Courtesy of ABB).

2.2.5 Tool frame

To be able to define a tool center point, a tool frame is needed. The tool frame is a coordinate system on the tool, and the origin of the tool frame coincides with the tool center point. The tool frame can also be used to obtain information about the direction of the robot movement [1].

Figure 2.5: The tool coordinate system (Courtesy of ABB).


2.3 TCP Calibration algorithm

Today, the TCP calibration is done by jogging the robot, making the TCP brush against a fixed point in the robot's close surroundings. The fixed point is for instance the tip of a nail. By jogging the robot, making the TCP brush against the tip of the nail from at least four different orientations, the coordinate of the TCP in relation to the robot base frame is calculated.

Let $T_1$ be the position of the tool flange in relation to the robot base frame, i.e. the robot's forward kinematics. Assume $N$ different positions of the robot are known, each making the TCP brush against the fixed point in the environment. Then $N$ different transformations $T_{1i}$ are also known, where $i \in [1 \ldots N]$. Let $[TCP_x \; TCP_y \; TCP_z]^T$ be the translation from the origin of the tool flange coordinate system to the tool center point, and let $Q$ be the homogeneous transformation from the tool flange to the tool center point:

\[
Q = \begin{pmatrix}
1 & 0 & 0 & TCP_x \\
0 & 1 & 0 & TCP_y \\
0 & 0 & 1 & TCP_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\]

It is obvious that equation 2.1 is satisfied, due to the fact that the tool center point is at the same coordinate independent of which of the $N$ robot positions is examined:

\[
T_{1i} Q = T_{1j} Q \tag{2.1}
\]

where $i, j \in [1 \ldots N]$ and $i \neq j$.

Denote:

\[
T_{1i} = \begin{pmatrix}
a_{11} & a_{12} & a_{13} & a_{14} \\
a_{21} & a_{22} & a_{23} & a_{24} \\
a_{31} & a_{32} & a_{33} & a_{34} \\
a_{41} & a_{42} & a_{43} & a_{44}
\end{pmatrix}
= \begin{pmatrix} R_a & t_a \\ 0\;0\;0 & 1 \end{pmatrix} \tag{2.2}
\]

\[
T_{1j} = \begin{pmatrix}
b_{11} & b_{12} & b_{13} & b_{14} \\
b_{21} & b_{22} & b_{23} & b_{24} \\
b_{31} & b_{32} & b_{33} & b_{34} \\
b_{41} & b_{42} & b_{43} & b_{44}
\end{pmatrix}
= \begin{pmatrix} R_b & t_b \\ 0\;0\;0 & 1 \end{pmatrix} \tag{2.3}
\]

Examining one row of equation 2.1 gives

\[
a_{11} TCP_x + a_{12} TCP_y + a_{13} TCP_z + a_{14} = b_{11} TCP_x + b_{12} TCP_y + b_{13} TCP_z + b_{14}
\]

\[
(a_{11} - b_{11}) TCP_x + (a_{12} - b_{12}) TCP_y + (a_{13} - b_{13}) TCP_z = -(a_{14} - b_{14}) \tag{2.4}
\]

For every instance of equation 2.1, a system of three equations of the form 2.4 is obtained. The system of three equations can be rewritten as

\[
\left( R_a - R_b \right)
\begin{pmatrix} TCP_x \\ TCP_y \\ TCP_z \end{pmatrix}
= -\left( t_a - t_b \right) \tag{2.5}
\]

By using all distinct combinations $i, j$ of the $N$ robot positions, a stacked system of equations 2.5 is obtained. $[TCP_x \; TCP_y \; TCP_z]^T$ is then easily calculated as the linear least squares solution of this system. Since the rotation part of the transformation from the tool flange coordinate system to the tool coordinate system is of no importance for determining the tool center point, all information needed to determine the coordinate of the tool center point is hereby obtained.
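A minimal NumPy sketch of this least squares step is shown below; the function name and the list-of-poses interface are illustrative, not taken from the thesis software.

```python
import numpy as np

def tcp_from_flange_poses(T1_list):
    """Four (or more) point TCP calibration: stack equation 2.5,
    (R_a - R_b) [TCP_x TCP_y TCP_z]^T = -(t_a - t_b), over all pose pairs
    and solve it in the least squares sense.

    T1_list holds 4x4 homogeneous flange poses (the robot's forward
    kinematics), all recorded while the TCP touches the same fixed point."""
    rows, rhs = [], []
    for i in range(len(T1_list)):
        for j in range(i + 1, len(T1_list)):
            Ra, ta = T1_list[i][:3, :3], T1_list[i][:3, 3]
            Rb, tb = T1_list[j][:3, :3], T1_list[j][:3, 3]
            rows.append(Ra - Rb)
            rhs.append(-(ta - tb))
    A = np.vstack(rows)                  # (3 * number of pairs) x 3
    b = np.concatenate(rhs)
    tcp, *_ = np.linalg.lstsq(A, b, rcond=None)
    return tcp                           # [TCP_x, TCP_y, TCP_z] in the flange frame
```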


Chapter 3

Theory

This chapter will describe theory necessary for understanding the rest of the thesis. The chapter is divided into image processing theory, computer vision theory and transformation theory.

3.1 Image Processing Theory

3.1.1 Threshold

The threshold operation is a grey level/one color channel image processing method resulting in a binary image. Let the image be $f(x, y)$, the resulting binary image $b(x, y)$ and the threshold value $T$. Then

\[
b(x, y) = \begin{cases} 1 & \text{if } f(x, y) \geq T \\ 0 & \text{if } f(x, y) < T \end{cases}
\]

3.1.2 Morphological operations

Morphological operations are methods working on binary images. In this section the following terminology will be used: $f(x, y)$ is the binary image, $r(x, y)$ is the resulting binary image and $a(x, y)$ is a structure element. See [2] for more information.

3.1.3 Structure Element

A structure element is a binary image $a(x, y)$ that defines the erosion and dilation operations. The structure element is used as a convolution kernel in the dilation operation and as a cross correlation kernel in the erosion operation.

3.1.3.1 Erosion

The erosion operation is accomplished by setting all object pixels within a certain distance from a background pixel to 0. In practice a structure element is used. The origin of the structure element is translated to every object pixel. If the whole structure element is accommodated in the object at a certain position, the pixel is set to 1 in the resulting image, otherwise to 0:

\[
r(x, y) = a(x, y) \ominus f(x, y)
\]

where $\ominus$ is the erosion operator. The erosion operation can also be seen as a cross correlation between the structure element $a(x, y)$ and the image $f(x, y)$:

\[
r(x, y) = \begin{cases} 1 & \text{if } a(x, y) \star f(x, y) = A \\ 0 & \text{if } a(x, y) \star f(x, y) \neq A \end{cases}
\]

where $A$ is the number of pixels in the structure element.

Notice that $a(x, y) \ominus f(x, y) \neq f(x, y) \ominus a(x, y)$, due to the fact that cross correlation does not fulfill the commutative law.

Figure 3.1: Original image before morphological operation.

Figure 3.2: Erosion operation applied to figure 3.1. An 8 × 8 structure element was used.

3.1.3.2 Dilation

The dilation operation is the opposite of the erosion operation. It is accomplished by setting all background pixels within a certain distance from an object pixel to 1. In practice a structure element is used. The origin of the structure element is translated to every object pixel, and in every position a pixel-wise OR operation is carried out:

\[
r(x, y) = a(x, y) \oplus f(x, y)
\]

where $\oplus$ is the dilation operator. The dilation operation can also be seen as a convolution between the structure element $a(x, y)$ and the image $f(x, y)$:

\[
r(x, y) = \begin{cases} 1 & \text{if } a(x, y) * f(x, y) \geq 1 \\ 0 & \text{otherwise} \end{cases}
\]

Figure 3.3: Dilation operation applied to figure 3.1. A 4 × 4 structure element was used.

3.1.3.3 Opening

The opening operation consists of one erosion operation followed by one dilation operation, where the distance is the same in both operations. When implemented with a structure element, the same structure element is used in the two operations.

The opening operation will split two objects which border on each other.

3.1.3.4 Closing

The closing operation consists of one dilation operation followed by one erosion operation, where the distance is the same in both operations. When implemented with a structure element, the same structure element is used in the two operations.

The closing operation will merge two objects which border on each other.
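The thesis implementation used the OpenCV 1.0 C API (see section 5.1.1). Purely as an illustration, the sketch below performs the same threshold and morphological operations with OpenCV's current Python bindings; the file name and kernel sizes are illustrative.

```python
import cv2
import numpy as np

# Grey level image -> binary image by thresholding (section 3.1.1).
img = cv2.imread("tool.png", cv2.IMREAD_GRAYSCALE)   # illustrative file name
_, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)

# 8x8 structure element, as in the erosion example of figure 3.2.
kernel = np.ones((8, 8), np.uint8)

eroded  = cv2.erode(binary, kernel)                           # erosion
dilated = cv2.dilate(binary, kernel)                          # dilation
opened  = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)    # erosion then dilation
closed  = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)   # dilation then erosion
```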


3.2 Computer Vision Theory

3.2.1 Camera System

The lens inside the camera refracts all rays of light from a certain object point to one single point in the image plane. If the lens is thin, implying that the distortion can be neglected, the lens law is valid:

\[
\frac{1}{\alpha} + \frac{1}{\beta} = \frac{1}{f} \tag{3.1}
\]

where $\alpha$ is the distance between the lens and the object, $\beta$ is the distance between the lens and the image plane and $f$ is the focal length. Figure 3.4 illustrates the lens law.

Figure 3.4: Illustration of the lens law.

By the lens law it is obvious that an object at the distance $\alpha$ from the lens will be reproduced with complete sharpness on the image plane. If the distance between the object and the lens differs from $\alpha$, the reproduction on the image plane will be more or less blurred. How blurred the image will be is determined by the depth of field $s$:

\[
s = 2\lambda \left( \frac{f}{D} \right) \tag{3.2}
\]

where $D$ is the diameter of the aperture and $\lambda$ is the wavelength of the incoming ray of light. The depth of field can also be defined as the interval where the resolution is greater than $\frac{1}{2d}$, where the resolution $\frac{1}{d}$ is defined as

\[
\frac{1}{d} = \frac{1}{\lambda} \frac{D}{f} \tag{3.3}
\]

[3]

3.2.2 Pinhole Camera Model

One common way of modeling a camera is to use the pinhole camera model. The model performs well as long as the lens is thin and no wide-angle lens is used. In practice the image plane is located behind the lens, but to simplify calculations and relations between the coordinate systems, the image plane can be put in front of the lens. Figure 3.5 illustrates the pinhole camera model with the image plane located in front of the lens.

Figure 3.5: Illustration of the pinhole camera model, with the image plane in front of the lens to simplify calculations (Courtesy of Maria Magnusson Seger).

Equation 3.4 shows the relation between the coordinate systems:

\[
W \begin{pmatrix} u_n \\ v_n \\ 1 \end{pmatrix}
= \begin{pmatrix} U \\ V \\ W \end{pmatrix}
= \begin{pmatrix} R & t \end{pmatrix}
\begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \tag{3.4}
\]

where $[U \; V \; W]^T$ are the camera coordinates, $[u_n \; v_n]^T$ are the ideal normalized image coordinates and $[X \; Y \; Z]^T$ are the world coordinates.

The matrix $[R \; t]$ contains the extrinsic parameters and describes the translation and the rotation between the coordinate systems, i.e. how the camera is rotated and translated in relation to the origin of the world coordinate system.

The real image plane usually differs from the ideal normalized image plane. Equation 3.5 describes the relation between the two planes:

\[
\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = A \begin{pmatrix} u_n \\ v_n \\ 1 \end{pmatrix} \tag{3.5}
\]

where $A$ defines the intrinsic parameters,

\[
A = \begin{pmatrix}
\alpha & \gamma & u_0 \\
0 & \beta & v_0 \\
0 & 0 & 1
\end{pmatrix}
\]

Here $\alpha$ and $\beta$ are scaling factors for the $u$ and $v$ axes, $[u_0 \; v_0]$ are the image coordinates of the intersection between the image plane and the optical axis, and $\gamma$ describes the skewing between the $u$ and $v$ axes.

Observe that the relation in equation 3.5 implies that equation 3.4 can be rewritten as

\[
W \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = k A \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \tag{3.6}
\]

where $k$ is an arbitrary constant. Letting $s = \frac{W}{k}$, equation 3.6 can be rewritten as

\[
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = A \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix} \tag{3.7}
\]

[3]
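As an illustration of equation 3.7, the sketch below projects a world point onto the image plane; the intrinsic values and the test point are made-up numbers, not parameters from the thesis.

```python
import numpy as np

def project_point(A, R, t, X_world):
    """Pinhole projection, equation 3.7: s [u v 1]^T = A [R | t] [X Y Z 1]^T."""
    X_h = np.append(X_world, 1.0)              # homogeneous world coordinates
    Rt = np.hstack([R, t.reshape(3, 1)])       # extrinsic parameters [R | t]
    uvw = A @ Rt @ X_h                         # s * [u, v, 1]
    return uvw[:2] / uvw[2]                    # divide out the scale factor s

# Illustrative intrinsics for a 640 x 480 image and a point 1 m in front of the camera.
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
u, v = project_point(A, np.eye(3), np.zeros(3), np.array([0.1, 0.05, 1.0]))
```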

3.2.3 Camera Calibration

Equation 3.7 describes how a point in the environment maps to the image plane up to a scale factor $s$. Let:

\[
C = A \begin{pmatrix} R & t \end{pmatrix} \tag{3.8}
\]

If $N$ corresponding points in the image, $[u_i \; v_i]^T$, and in the world, $[X_i \; Y_i \; Z_i]^T$, are found, with $i \in [1 \ldots N]$, then $C$ can be determined up to a scale factor.

\[
C = \begin{pmatrix}
C_{11} & C_{12} & C_{13} & C_{14} \\
C_{21} & C_{22} & C_{23} & C_{24} \\
C_{31} & C_{32} & C_{33} & C_{34}
\end{pmatrix} \tag{3.9}
\]

By setting $C_{34} = 1$ in equation 3.9 the scale factor is fixed. Let:

\[
c = \begin{pmatrix} C_{11} & C_{12} & C_{13} & C_{14} & C_{21} & C_{22} & C_{23} & C_{24} & C_{31} & C_{32} & C_{33} \end{pmatrix}^T
\]

To determine $c$ (i.e. the intrinsic and extrinsic parameters), a system of equations given by the corresponding points can be used:

\[
D c = f \tag{3.10}
\]

\[
D = \begin{pmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1 X_1 & -u_1 Y_1 & -u_1 Z_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1 X_1 & -v_1 Y_1 & -v_1 Z_1 \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
0 & 0 & 0 & 0 & X_N & Y_N & Z_N & 1 & -v_N X_N & -v_N Y_N & -v_N Z_N
\end{pmatrix}
\]

\[
f = \begin{pmatrix} u_1 \\ v_1 \\ \vdots \\ v_N \end{pmatrix}
\]

$c$ is then given by

\[
c = \left( D^T D \right)^{-1} D^T f = D^\dagger f \tag{3.11}
\]

Note that at least six corresponding points are needed to determine the intrinsic and extrinsic parameters.

[3]
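A minimal NumPy sketch of this calibration step; the function name and the point-list interface are illustrative. It builds $D$ and $f$ from the correspondences and computes $c = D^\dagger f$ as in equation 3.11.

```python
import numpy as np

def dlt_camera_matrix(world_pts, image_pts):
    """Estimate the 3x4 camera matrix C (equation 3.8) from N >= 6 point
    correspondences by solving D c = f (equations 3.10 and 3.11) with C_34 = 1."""
    rows, f = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        f.extend([u, v])
    D = np.asarray(rows, dtype=float)
    f = np.asarray(f, dtype=float)
    c = np.linalg.pinv(D) @ f                 # c = D^dagger f (least squares)
    return np.append(c, 1.0).reshape(3, 4)    # reinsert the fixed element C_34 = 1
```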


3.3 Transformation Theory

3.3.1 Homogeneous Transformation

3.3.1.1 3D Translation

Let point $P = [x \; y \; z]^T$ be translated to a point $P' = [x' \; y' \; z']^T$ by the translation vector $t = [t_x \; t_y \; t_z]^T$. Then $P'$ can be expressed as

\[
P' = P + t \tag{3.12}
\]

3.3.1.2 3D Rotation

Let point $P = [x \; y \; z]^T$ be rotated by an angle $\theta$ around the $Z$ axis to a point $P' = [x' \; y' \; z']^T$. Point $P'$ can be expressed according to [4] as

\[
\begin{aligned}
x' &= x \cos\theta - y \sin\theta \qquad &(3.13) \\
y' &= x \sin\theta + y \cos\theta \qquad &(3.14) \\
z' &= z \qquad &(3.15)
\end{aligned}
\]

Let:

\[
R_z = \begin{pmatrix}
\cos\theta & -\sin\theta & 0 \\
\sin\theta & \cos\theta & 0 \\
0 & 0 & 1
\end{pmatrix}
\]

Then $P'$ can be written as

\[
P' = R_z P \tag{3.16}
\]

3.3.1.3 Homogeneous Transformation Matrix

Translation and multiplicative terms of a three dimensional transformation can be combined into a single matrix. Expand the three dimensional coordinates $P$ and $P'$ in sections 3.3.1.1 and 3.3.1.2 to four element column vectors:

\[
P = \begin{pmatrix} x_h & y_h & z_h & h \end{pmatrix}^T
\qquad
P' = \begin{pmatrix} x'_h & y'_h & z'_h & h \end{pmatrix}^T
\]

where $h$ is the nonzero homogeneous parameter and

\[
x = \frac{x_h}{h}, \quad y = \frac{y_h}{h}, \quad z = \frac{z_h}{h},
\qquad
x' = \frac{x'_h}{h}, \quad y' = \frac{y'_h}{h}, \quad z' = \frac{z'_h}{h}
\]

For a geometric transformation the homogeneous parameter $h$ can be set to any nonzero value. Suitably, $h$ is set to 1, implying $e = e_h$ where $e \in \{x, y, z\}$.

The translation in equation 3.12 can be represented by a homogeneous matrix $M$ as

\[
P' = M P \tag{3.17}
\]

\[
M = \begin{pmatrix}
1 & 0 & 0 & t_x \\
0 & 1 & 0 & t_y \\
0 & 0 & 1 & t_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\]

The rotation in equation 3.16 can be represented by the homogeneous matrix $M$ if

\[
M = \begin{pmatrix}
\cos\theta & -\sin\theta & 0 & 0 \\
\sin\theta & \cos\theta & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
\]

A complete transformation of a rigid body, e.g. a transformation between two different coordinate systems, can be expressed as a homogeneous transformation matrix $T$:

\[
T = \begin{pmatrix}
r_{11} & r_{12} & r_{13} & t_x \\
r_{21} & r_{22} & r_{23} & t_y \\
r_{31} & r_{32} & r_{33} & t_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\]

3.3.2 Screw Axis Rotation representation

A rotation matrix

\[
R = \begin{pmatrix}
a_{11} & a_{12} & a_{13} \\
a_{21} & a_{22} & a_{23} \\
a_{31} & a_{32} & a_{33}
\end{pmatrix}
\]

is completely defined by the axis of rotation $[n_x \; n_y \; n_z]^T$ and the rotation angle $\theta$ as

\[
\begin{aligned}
a_{11} &= (n_x^2 - 1)(1 - \cos\theta) + 1 & (3.18) \\
a_{12} &= n_x n_y (1 - \cos\theta) - n_z \sin\theta & (3.19) \\
a_{13} &= n_x n_z (1 - \cos\theta) + n_y \sin\theta & (3.20) \\
a_{21} &= n_y n_x (1 - \cos\theta) + n_z \sin\theta & (3.21) \\
a_{22} &= (n_y^2 - 1)(1 - \cos\theta) + 1 & (3.22) \\
a_{23} &= n_y n_z (1 - \cos\theta) - n_x \sin\theta & (3.23) \\
a_{31} &= n_z n_x (1 - \cos\theta) - n_y \sin\theta & (3.24) \\
a_{32} &= n_z n_y (1 - \cos\theta) + n_x \sin\theta & (3.25) \\
a_{33} &= (n_z^2 - 1)(1 - \cos\theta) + 1 & (3.26)
\end{aligned}
\]

Equations 3.18 - 3.26 are called the screw axis representation of the orientation [5]. Given a rotation matrix $R$, the angle of rotation $\theta$ and the rotation axis $[n_x \; n_y \; n_z]^T$ can be obtained by the following equations:

\[
\theta = \cos^{-1} \frac{a_{11} + a_{22} + a_{33} - 1}{2} \tag{3.27}
\]

\[
n_x = \frac{a_{32} - a_{23}}{2 \sin\theta} \tag{3.28}
\]

\[
n_y = \frac{a_{13} - a_{31}}{2 \sin\theta} \tag{3.29}
\]

\[
n_z = \frac{a_{21} - a_{12}}{2 \sin\theta} \tag{3.30}
\]

This representation makes it possible to compare two different measuring units that use different coordinate systems to measure a rotation. In this project the method was used to ensure that the camera measured the same rotation angle as the robot when the robot was told to perform a rotation.
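A small NumPy sketch of equations 3.27 - 3.30; it assumes $\sin\theta \neq 0$ (i.e. $\theta$ is neither 0 nor $\pi$), a degenerate case the equations above do not cover.

```python
import numpy as np

def axis_angle_from_rotation(R):
    """Recover the rotation angle and screw axis from a rotation matrix R,
    following equations 3.27-3.30."""
    theta = np.arccos((np.trace(R) - 1.0) / 2.0)          # equation 3.27
    n = np.array([R[2, 1] - R[1, 2],                      # equations 3.28-3.30
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return n, theta
```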

3.3.3 Rodrigues's formula

One way of finding the rotation matrix $R$ in section 3.3.2 is to make use of Rodrigues's formula for a spherical displacement of a rigid body.

Let point $P_1$ rotate around the rotation axis $n$, resulting in a new position $P_2$ of the point, see figure 3.6. Rodrigues's formula is then defined according to equation 3.31:

Figure 3.6: Definition for Rodrigues's formula.

\[
r_2 = r_1 \cos\theta + (n \times r_1) \sin\theta + n (r_1^T n)(1 - \cos\theta) \tag{3.31}
\]

According to [5], equation 3.31 can be rewritten as equation 3.32:

\[
r_2 = {}^{1}R_{2} \, r_1 \tag{3.32}
\]

where ${}^{1}R_{2}$ is the rotation matrix fulfilling equations 3.18 - 3.26.
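The sketch below applies Rodrigues's formula (equation 3.31) directly to a vector and also builds the equivalent rotation matrix; the matrix form uses the standard expression $R = I + \sin\theta\, K + (1 - \cos\theta) K^2$, which is algebraically the same as equations 3.18 - 3.26.

```python
import numpy as np

def rodrigues_rotate(r1, axis, theta):
    """Rotate the vector r1 by the angle theta around a unit axis n
    using Rodrigues's formula (equation 3.31)."""
    n = axis / np.linalg.norm(axis)
    return (r1 * np.cos(theta)
            + np.cross(n, r1) * np.sin(theta)
            + n * (r1 @ n) * (1.0 - np.cos(theta)))

def rotation_matrix(axis, theta):
    """Equivalent rotation matrix, so that rotation_matrix(n, theta) @ r1
    gives the same result as rodrigues_rotate(r1, n, theta)."""
    n = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])       # skew-symmetric matrix of n
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
```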


Chapter 4

Determination of Tool Center Point

This chapter describes how the problem of finding the tool center point was solved. The chapter starts by describing the proposed method for determining the camera to tool, tool to robot wrist and camera to robot base coordinate system transformations respectively. The chapter also describes the proposed iterative method for finding the tool center point. The iterative method is used to remove uncertainties due to camera calibration errors and robot calibration errors.

4.1 Overview of the system

Figure 4.1: Overview of the system.

Figure 4.1 shows the arrangement of the system. A computer is connected to a camera by a USB cable and to the robot by a TCP/IP cable.


4.1.1 Equipment

The robot used during the project was an ABB IRB 1400 robot controlled by an IRC5 computer. A laptop computer with an Intel Pentium M 1600 MHz processor and 1 GB RAM was connected to the IRC5 computer by a TCP/IP cable. A program was developed in C# and C/C++ making it possible to control the robot movements directly from the laptop. The laptop computer was also used for the image segmentation of the tool. The images were retrieved at 50 frames per second by an uEye (UI221x-M V2.10) CCD grayscale USB camera with a resolution of 640 × 480 pixels.

Besides the equipment, four different transformations $T_1$, $T_2$, $X_1$ and $X_2$ are displayed in figure 4.1.

By defining a tool coordinate system TF (tool frame) with its origin at the tool center point, the problem of finding the TCP becomes equivalent to finding the transformation $X_1$ from the tool frame TF to the TFF (tool flange frame) coordinate system fixed at the last axis of the robot manipulator.

To make it possible to determine the transformation $X_1$, a closed loop of homogeneous transformation matrices can be written as an equation:

\[
T_2 X_1 = X_2 T_1 \tag{4.1}
\]

The transformation matrix $T_1$ in equation 4.1 is the robot's forward kinematics. Consequently $T_1$ is known, making the problem of finding $X_1$ equivalent to determining $X_2$ and $T_2$.

4.2 Determine T2

By calibrating the camera using the method proposed by Zhengyou Zhang [6], the intrinsic parameters of the camera were determined and the lens distortion coefficients were retrieved, see Appendix A.

Let $m = [u \; v]^T$ denote a 2D image point and $M = [X \; Y \; Z]^T$ denote a 3D point. The augmented representations of these vectors are obtained by adding 1 as the last element of the vector: $m = [u \; v \; 1]^T$ and $M = [X \; Y \; Z \; 1]^T$. By modeling the camera as a perfect pinhole camera, the relation between a 3D point $M$ and its projected image point $m$ fulfills equation 4.2:

\[
s m = A T M \tag{4.2}
\]

where $A$ holds the intrinsic parameters, $s$ is a scale factor and $T$ holds the extrinsic parameters,

\[
T = \begin{pmatrix} r_1 & r_2 & r_3 & t \end{pmatrix}
\]

By setting up a plane in the world, the homography between the plane and its image can be determined. Let the plane fulfill the constraint $Z = 0$. Equation 4.2 is then rewritten as

\[
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
= A \begin{pmatrix} r_1 & r_2 & r_3 & t \end{pmatrix}
\begin{pmatrix} X \\ Y \\ 0 \\ 1 \end{pmatrix}
= A \begin{pmatrix} r_1 & r_2 & t \end{pmatrix}
\begin{pmatrix} X \\ Y \\ 1 \end{pmatrix}
\]

Let the homography be $H = [h_1 \; h_2 \; h_3] = A [r_1 \; r_2 \; t]$. According to [6] this implies that the extrinsic parameters can be determined as

\[
\begin{aligned}
r_1 &= \lambda A^{-1} h_1 \\
r_2 &= \lambda A^{-1} h_2 \\
r_3 &= r_1 \times r_2 \\
t &= \lambda A^{-1} h_3
\end{aligned}
\]

where $\lambda = \frac{1}{\|A^{-1} h_1\|} = \frac{1}{\|A^{-1} h_2\|}$.

By letting the plane be expressed in the tool coordinate system TF, finding the transformation $T_2$ in figure 4.1 becomes equivalent to finding the extrinsic parameters $T = [r_1 \; r_2 \; r_3 \; t]$. From equation 4.2 it is obvious that $T$ is completely determined by the intrinsic parameters $A$ and the homography $H$. The Camera Calibration Toolbox for Matlab was used to determine the intrinsic parameters $A$ of the camera, see Appendix A. This toolbox uses the same technique as mentioned in this section and in [6]. The homography $H = [h_1 \; h_2 \; h_3]$ can be obtained according to [6] by minimizing

\[
\sum_i \left( m_i - \frac{1}{h_3^T M_i} \begin{pmatrix} h_1^T M_i \\ h_2^T M_i \end{pmatrix} \right)^T
\left( \sigma_i^2 I \right)^{-1}
\left( m_i - \frac{1}{h_3^T M_i} \begin{pmatrix} h_1^T M_i \\ h_2^T M_i \end{pmatrix} \right)
\]

where $h_j$ is the $j$:th row of $H$, $I$ is the identity matrix and $\sigma_i$ is the standard deviation of the Gaussian noise that is assumed to affect $m_i$. The problem can be solved as a non-linear least squares problem:

\[
\min_H \sum_i \left\| m_i - \frac{1}{h_3^T M_i} \begin{pmatrix} h_1^T M_i \\ h_2^T M_i \end{pmatrix} \right\|^2
\]

Since a plane is by definition defined by at least three points, at least three corresponding points $m_i$ and $M_i$ belonging to the plane need to be determined. An image segmentation method for finding three distinct points in the plane lying in the tool coordinate system TF is described in chapter 5. The tool used for the segmentation was a calibration tool with an awl-like shape. The TCP of the tool was the tip of the awl, and the TCP was one of the points determined by the method.
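A compact NumPy sketch of these relations (the function name is illustrative). Note that with noisy data the resulting $[r_1 \; r_2 \; r_3]$ is generally not an exact rotation matrix and is usually re-orthogonalized in practice.

```python
import numpy as np

def extrinsics_from_homography(A, H):
    """Recover T = [r1 r2 r3 t] from the plane-to-image homography
    H = [h1 h2 h3] and the intrinsic matrix A, as described above ([6])."""
    A_inv = np.linalg.inv(A)
    lam = 1.0 / np.linalg.norm(A_inv @ H[:, 0])
    r1 = lam * A_inv @ H[:, 0]
    r2 = lam * A_inv @ H[:, 1]
    r3 = np.cross(r1, r2)
    t = lam * A_inv @ H[:, 2]
    return np.column_stack([r1, r2, r3, t])    # 3x4 extrinsic matrix
```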


4.3 Finding X1 and X2 transformation matrices

When the transformations $T_1$ and $T_2$ were found, only $X_2$ remained to be determined. Jintao and Daokui proposed a method in [7] where the transformation $X_2$ was determined by moving the robot to a fixed position, e.g. the tip of a nail, with a known relation to a calibration pattern. Zhuang et al. [8] proposed a method that solves $X_1$ and $X_2$ simultaneously by applying quaternion algebra to derive a linear solution. Fadi Dornaika and Radu Horaud [9] proposed one closed form method and one method based on non-linear constrained minimization for solving the homogeneous matrix equation 4.1.

The method implemented was first described by Roger Y. Tsai and Reimar K. Lenz [10]. By moving the robot to a second position and finding $T_1$ and $T_2$ at the new location of the robot, a system of two equations, 4.3 and 4.4, is obtained, see figure 4.2:

\[
T_{21} X_1 = X_2 T_{11} \tag{4.3}
\]

\[
T_{22} X_1 = X_2 T_{12} \tag{4.4}
\]

By multiplying the inverse of equation 4.4 by equation 4.3, equation 4.6 is obtained:

\[
(T_{22} X_1)^{-1} T_{21} X_1 = (X_2 T_{12})^{-1} X_2 T_{11} \tag{4.5}
\]
\[
\Leftrightarrow \quad X_1^{-1} T_{22}^{-1} T_{21} X_1 = T_{12}^{-1} X_2^{-1} X_2 T_{11}
\]
\[
\Leftrightarrow \quad T_{22}^{-1} T_{21} X_1 = X_1 T_{12}^{-1} T_{11}
\]
\[
A X_1 = X_1 B \tag{4.6}
\]

where $A = T_{22}^{-1} T_{21}$ and $B = T_{12}^{-1} T_{11}$ are known. Observe that $A$ and $B$ are the transformations from position 1 to position 2 of the tool flange frame and the tool frame respectively, see figure 4.2.

The transformation matrices $A$, $B$ and $X_1$ are all homogeneous matrices. A homogeneous matrix consists of one rotation matrix $R$ and one translation vector $t$.


Figure 4.2: Transformations $A X_1 = X_1 B$. RB is the robot base frame, CF is the camera frame, TF is the tool frame and TFF is the tool flange frame. A and B are the transformations from position 1 to position 2 of the tool flange frame and the tool frame respectively.

\[
X_1 = \begin{pmatrix} R_{X_1} & t_{X_1} \\ 0\;0\;0 & 1 \end{pmatrix}
\qquad
A = \begin{pmatrix} R_A & t_A \\ 0\;0\;0 & 1 \end{pmatrix}
\qquad
B = \begin{pmatrix} R_B & t_B \\ 0\;0\;0 & 1 \end{pmatrix}
\]

The rotation matrix $R$ can be written as

\[
R = \begin{pmatrix}
n_1^2 + (1 - n_1^2)\cos\theta & n_1 n_2 (1 - \cos\theta) - n_3 \sin\theta & n_1 n_3 (1 - \cos\theta) + n_2 \sin\theta \\
n_1 n_2 (1 - \cos\theta) + n_3 \sin\theta & n_2^2 + (1 - n_2^2)\cos\theta & n_2 n_3 (1 - \cos\theta) - n_1 \sin\theta \\
n_1 n_3 (1 - \cos\theta) - n_2 \sin\theta & n_2 n_3 (1 - \cos\theta) + n_1 \sin\theta & n_3^2 + (1 - n_3^2)\cos\theta
\end{pmatrix} \tag{4.7}
\]

where $[n_1 \; n_2 \; n_3]$ is the rotation axis and $\theta$ is the rotation angle. Obviously it is possible to specify $R$ in equation 4.7 by specifying $[n_1 \; n_2 \; n_3]$ and $\theta$.


By using a modified version of Rodrigues's formula, see section 3.3.3, a function $P_r$ depending on $[n_1 \; n_2 \; n_3]$ and $\theta$ can be defined as

\[
P_r = 2 \sin\frac{\theta}{2} \begin{pmatrix} n_1 & n_2 & n_3 \end{pmatrix}, \qquad 0 \leq \theta \leq \pi \tag{4.8}
\]

Let $P_A$, $P_B$ and $P_{X_1}$ be the rotation axes defined according to equation 4.8 for $R_A$, $R_B$ and $R_{X_1}$ respectively, and let

\[
P'_{X_1} = \frac{1}{\sqrt{4 - |P_{X_1}|^2}} \, P_{X_1} \tag{4.9}
\]

For a vector $v = [v_x \; v_y \; v_z]$ let

\[
\mathrm{Skew}(v) = \begin{pmatrix}
0 & -v_z & v_y \\
v_z & 0 & -v_x \\
-v_y & v_x & 0
\end{pmatrix} \tag{4.10}
\]

By setting up a system of linear equations according to equation 4.11, $P'_{X_1}$ can be solved for by using linear least squares techniques:

\[
\mathrm{Skew}(P_B + P_A) \, P'_{X_1} = P_A - P_B \tag{4.11}
\]

Proof:

\[
P_A - P_B \perp P'_{X_1} \tag{4.12}
\]

\[
P_A - P_B \perp P_A + P_B \tag{4.13}
\]

From equations 4.12 and 4.13 follows

\[
P_A - P_B = s (P_A + P_B) \times P'_{X_1} \tag{4.14}
\]

where $s$ is a constant.

$P_A - P_B$ and $(P_A + P_B) \times P'_{X_1}$ have the same length, implying $s = 1$.


Let $\alpha$ be the angle between $P'_{X_1}$ and $(P_A + P_B)$. Then

\[
|(P_A + P_B) \times P'_{X_1}| = |P_A + P_B| \, |P'_{X_1}| \sin\alpha
\overset{\text{eq. 4.9}}{=}
|P_A + P_B| \, 2\sin\tfrac{\theta}{2} \left( 4 - 4\sin^2\tfrac{\theta}{2} \right)^{-\frac{1}{2}} \sin\alpha
= |P_A + P_B| \tan\tfrac{\theta}{2} \sin\alpha
= |P_A - P_B|
\]

This implies that equation 4.14 can be written as

\[
P_A - P_B = (P_A + P_B) \times P'_{X_1} \tag{4.15}
\]

From equation 4.15 and the relation $a \times b = \mathrm{Skew}(a)\, b$ follows

\[
\mathrm{Skew}(P_B + P_A) \, P'_{X_1} = P_A - P_B
\]

Q.E.D.

$\mathrm{Skew}(P_B + P_A)$ is always singular, and therefore at least two pairs of positions are needed to create the system of equations 4.11, i.e. three different positions of the robot are needed. When $P'_{X_1}$ is determined, $P_{X_1}$ can be retrieved by equation 4.16. The rotation $R_{X_1}$ is then determined by equation 4.17:

\[
P_{X_1} = \frac{2 P'_{X_1}}{\sqrt{1 + |P'_{X_1}|^2}} \tag{4.16}
\]

\[
R_{X_1} = \left( 1 - \frac{|P_{X_1}|^2}{2} \right) I + \frac{1}{2} \left( P_{X_1} P_{X_1}^T + \sqrt{4 - |P_{X_1}|^2} \, \mathrm{Skew}(P_{X_1}) \right) \tag{4.17}
\]

Let:

\[
T_{X_1} = \begin{pmatrix} I_{3 \times 3} & t_{X_1} \\ 0 \; 0 \; 0 & 1 \end{pmatrix} \tag{4.18}
\]


\[
T_A = \begin{pmatrix} I_{3 \times 3} & t_A \\ 0 \; 0 \; 0 & 1 \end{pmatrix} \tag{4.19}
\]

\[
T_B = \begin{pmatrix} I_{3 \times 3} & t_B \\ 0 \; 0 \; 0 & 1 \end{pmatrix} \tag{4.20}
\]

To find the translation $t_{X_1}$ of the homogeneous transformation $X_1$, the linear system of equations 4.21 can be solved by a linear least squares technique:

\[
(R_B - I) T_{X_1} = R_{X_1} T_A - T_B \tag{4.21}
\]

Proof:

\[
A X_1 = X_1 B
\]
\[
\Leftrightarrow \; (T_B + R_B - I)(T_{X_1} + R_{X_1} - I) = (T_{X_1} + R_{X_1} - I)(T_A + R_A - I)
\]
\[
\Leftrightarrow \; T_B T_{X_1} + T_B R_{X_1} - T_B + R_B T_{X_1} + R_B R_{X_1} - R_B - T_{X_1} - R_{X_1} + I = -R_{X_1} - T_A + R_{X_1} T_A + R_{X_1} R_A + T_{X_1} + T_A - I
\]
\[
\Leftrightarrow \; T_B + T_{X_1} - I + T_B + R_{X_1} - I - T_B + R_B T_{X_1} + R_B R_{X_1} - R_B - T_{X_1} - R_{X_1} + I = -R_{X_1} + R_{X_1} T_A + R_{X_1} R_A + T_{X_1} - I
\]
\[
\Leftrightarrow \; T_B - I + R_B T_{X_1} + R_B R_{X_1} - R_B = -R_{X_1} + R_{X_1} T_A + R_{X_1} R_A + T_{X_1} - I
\]
\[
\Leftrightarrow \; R_B T_{X_1} + R_B R_{X_1} - R_B - T_{X_1} = -R_{X_1} + R_{X_1} T_A + R_{X_1} R_A - T_B
\]

Removing all terms not containing a translation gives

\[
R_B T_{X_1} - T_{X_1} = R_{X_1} T_A - T_B
\quad\Leftrightarrow\quad
(R_B - I) T_{X_1} = R_{X_1} T_A - T_B
\]

Q.E.D.

At least three different positions of the robot are needed to solve equations 4.11 and 4.21 with a linear least squares technique. In the implementation, four different positions were used. The result varied considerably depending on how the positions were chosen. The best estimation of $X_1$ was achieved when the positions fulfilled the constraint of completely exciting all degrees of freedom. In practice this constraint is equivalent to ensuring that the first three positions differ both in a linear movement along orthogonal axes $x$, $y$, $z$ and in rotations around each axis. The fourth position was chosen to be a movement in $[x \; y \; z]^T = [1 \; 1 \; 1]^T$ and a rotation around the same axis.
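The sketch below transcribes the steps of this section (equations 4.8, 4.10, 4.11, 4.16, 4.17 and 4.21) into NumPy. The function names and the list-based interface are illustrative; at least two motion pairs (three robot positions) are required since $\mathrm{Skew}(P_B + P_A)$ is singular, and degenerate rotations ($\theta$ near 0 or $\pi$) are not handled.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix, equation 4.10: skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def modified_rodrigues(R):
    """P_r = 2 sin(theta/2) * n, equation 4.8, with n from equations 3.28-3.30."""
    theta = np.arccos((np.trace(R) - 1.0) / 2.0)
    n = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = n / (2.0 * np.sin(theta))
    return 2.0 * np.sin(theta / 2.0) * n

def solve_AX_XB(A_list, B_list):
    """Solve A X1 = X1 B for X1, given 4x4 motions A_i and B_i for each
    pair of robot positions, following the equations stated above."""
    # Rotation part: stack Skew(P_B + P_A) P'_X1 = P_A - P_B (eq. 4.11).
    M, d = [], []
    for A, B in zip(A_list, B_list):
        PA = modified_rodrigues(A[:3, :3])
        PB = modified_rodrigues(B[:3, :3])
        M.append(skew(PB + PA))
        d.append(PA - PB)
    Pp, *_ = np.linalg.lstsq(np.vstack(M), np.concatenate(d), rcond=None)
    PX = 2.0 * Pp / np.sqrt(1.0 + Pp @ Pp)                       # eq. 4.16
    RX = ((1.0 - (PX @ PX) / 2.0) * np.eye(3)                    # eq. 4.17
          + 0.5 * (np.outer(PX, PX) + np.sqrt(4.0 - PX @ PX) * skew(PX)))
    # Translation part: stack (R_B - I) t_X1 = R_X1 t_A - t_B (eq. 4.21).
    M, d = [], []
    for A, B in zip(A_list, B_list):
        M.append(B[:3, :3] - np.eye(3))
        d.append(RX @ A[:3, 3] - B[:3, 3])
    tX, *_ = np.linalg.lstsq(np.vstack(M), np.concatenate(d), rcond=None)
    X1 = np.eye(4)
    X1[:3, :3], X1[:3, 3] = RX, tX
    return X1
```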

4.4 Importance of measuring A and B with 6 DOF

Actually, if only five degrees of freedom can be measured in $T_1$ or $T_2$, no unique solution $X_1$ exists to the equation $A X_1 = X_1 B$. This situation occurs for instance if the tool has an axis of symmetry.

Proof: First, four lemmas have to be stated.

Lemma 1: Matrices $A$ and $B$ are similar if a matrix $X$ exists such that

\[
B = X^{-1} A X \quad\Leftrightarrow\quad A X = X B
\]

If $A$ and $B$ are similar, then $A$ and $B$ have the same trace, determinant and eigenvalues [11].

Lemma 2: The eigenvalues of a rotation matrix $R$ are

\[
\lambda_1 = 1, \qquad \lambda_2 = \cos\theta + i \sin\theta, \qquad \lambda_3 = \cos\theta - i \sin\theta
\]

where $\theta$ is the angle of rotation specified by equation 4.7 [12].

Page 44: Robot Tool Center Point Calibration using Computer Vision23964/FULLTEXT01.pdf · Robot Tool Center Point Calibration using Computer Vision By: Johan Hallenberg MSc, Linkoping University,

30 CHAPTER 4. DETERMINATION OF TOOL CENTER POINT

Lemma 3: A rotation matrix $R$ fulfills, according to [13], the orthogonality conditions

\[
R^{-1} = R^T \quad\text{and}\quad R^T R = I
\]

Lemma 4: For a matrix $R$ fulfilling the orthogonality conditions, the trace of $R$ is the sum of the eigenvalues of $R$ according to [12]:

\[
\mathrm{Tr}(R) = \sum_i \lambda_i
\]

Using Lemma 2 and Lemma 3 in Lemma 4 gives

\[
\mathrm{Tr}(R) = 1 + \cos\theta + i \sin\theta + \cos\theta - i \sin\theta
\]

\[
\theta = \arccos\left( \frac{\mathrm{Tr}(R) - 1}{2} \right) \tag{4.22}
\]

If the robot is told to perform a rotation $\theta$ around the axis of symmetry, the robot will measure a rotation $\theta_B = \theta$. However, the camera will not be able to distinguish any difference between the orientations before and after the rotation around the axis of symmetry. This implies that the camera will measure a rotation $\theta_A = 0$.

It is then obvious from equation 4.22 that in general $\mathrm{Tr}(A) \neq \mathrm{Tr}(B)$, implying, according to Lemma 1, that there is no solution $X_1$ to the equation $A X_1 = X_1 B$.

Q.E.D

Observe that once the transformation $X_1$ is determined, the transformation $X_2$ between the camera coordinate system and the robot base frame can be obtained as $X_2 = T_2 X_1 T_1^{-1}$.

The calibration tool described in chapter 5 had an axis of symmetry, making it impossible to find a solution $X_1$ to the equation $A X_1 = X_1 B$. Instead, the non-symmetric chessboard in figure 4.3 was used as a robot tool. The TCP was defined as the upper left corner of the chess pattern.

Even though it is impossible to find a solution $X_1$ to equation $A X_1 = X_1 B$ if the robot tool has an axis of symmetry, it is still possible to determine $X_1$ by using another approach, see section 4.6.

Page 45: Robot Tool Center Point Calibration using Computer Vision23964/FULLTEXT01.pdf · Robot Tool Center Point Calibration using Computer Vision By: Johan Hallenberg MSc, Linkoping University,

4.5. ITERATIVE METHOD FOR INCREASING THE ACCURACY 31

Figure 4.3: Chessboard pattern.

4.5 Iterative Method for increasing the accuracy

Although the transformation $X_1$ is known and the coordinate of the tool center point is found, the accuracy of the method is influenced by errors in the camera calibration and errors in the robot calibration. To ensure a high accuracy, an iterative method had to be used. Denote the tool center point coordinate found by the method as $TCP_{guess}$ and the correct coordinate of the TCP as $TCP_{correct}$.

Assume the robot reorients around $TCP_{guess}$. If $TCP_{guess} = TCP_{correct}$, then $TCP_{correct}$ stays at the same point $TCP_{guess}$ during the reorientation, while $TCP_{correct}$ moves away from $TCP_{guess}$ if $TCP_{guess} \neq TCP_{correct}$. This phenomenon can be used to achieve a high accuracy of the tool center point calibration.

Denote the error vector between $TCP_{guess}$ and $TCP_{correct}$ after a reorientation around $TCP_{guess}$ as $\varepsilon$. By measuring the coordinate of $TCP_{correct}$ after the reorientation around $TCP_{guess}$, $\varepsilon$ can be retrieved as

\[
\varepsilon = TCP_{correct} - TCP_{guess} \tag{4.23}
\]

Of course, the new measurement of $TCP_{correct}$ is only a guess, retrieved in the same way as $TCP_{guess}$ and with the same accuracy. This implies

\[
\varepsilon = TCP_{guess2} - TCP_{guess} \tag{4.24}
\]

The robot can then be told to perform a linear movement of $-\varepsilon$. Ideally this would imply $TCP_{guess} = TCP_{correct}$, but in most cases $TCP_{correct} \neq TCP_{guess2}$, implying $TCP_{guess} \neq TCP_{correct}$. Instead, $TCP_{correct}$ is measured again by the camera and a new $\varepsilon$ is calculated. This procedure is repeated iteratively until $|\varepsilon| < \beta$, where $\beta$ is the desired accuracy. Observe that to obtain a faster convergence of the iterative algorithm, the robot can be told to move $-\alpha\varepsilon$, where $\alpha$ is a constant $< 1$.

When the desired accuracy $\beta$ is reached, two positions of the tool are known where $TCP_{correct} = TCP_{guess}$. The orientation of the tool differs between the two positions, because only linear movements are used when moving the robot by $-\varepsilon$.

By performing reorientations around $TCP_{guess}$ in six different directions and using the iterative procedure to linearly move $TCP_{correct}$ back to $TCP_{guess}$, six different positions are obtained. All six robot positions ensure that $TCP_{correct}$ is at the same point $TCP_{guess}$, but with six different orientations.

Let $T_{1i}$, $i \in [1 \ldots 6]$, be the homogeneous transformations from the robot base frame to the tool flange coordinate system, i.e. the robot's forward kinematics for the six different positions. Let $[TCP_x \; TCP_y \; TCP_z]^T$ be the translation from the origin of the tool flange coordinate system to the tool center point, and let:

\[
Q = \begin{pmatrix}
1 & 0 & 0 & TCP_x \\
0 & 1 & 0 & TCP_y \\
0 & 0 & 1 & TCP_z \\
0 & 0 & 0 & 1
\end{pmatrix}
\]

Since the tool center point is at the same coordinate independent of which of the six robot positions is examined, the algorithm described in section 2.3 can be used to determine the true position of the tool center point. By using the iterative procedure and calculating the tool center point as the least squares solution described in section 2.3, the accuracy of the TCP calibration method becomes independent of the camera calibration errors and the robot calibration errors. Instead, only the accuracy of the camera when registering a movement between two different positions affects the final accuracy of the tool center point calibration achieved by the iterative method. This can be seen by considering that the camera is an external observer during the iterative procedure, and the method iterates until the camera measures (with the desired accuracy) that the true tool center point is at the same location as the initial guess point.
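A sketch of the iteration loop described above. The robot and camera interfaces are hypothetical stand-ins, not the thesis software, and the step size $\alpha$ and tolerance $\beta$ are illustrative values.

```python
import numpy as np

def refine_tcp_position(robot, camera, tcp_guess, beta=0.05, alpha=0.8, max_iter=20):
    """Reorient around tcp_guess, then iteratively move the robot by -alpha*eps
    until the camera sees the true TCP back at tcp_guess (|eps| < beta).
    'robot' and 'camera' are hypothetical interfaces; camera.measure_tcp() is
    assumed to return the true TCP position expressed in the robot base frame."""
    robot.reorient_about(tcp_guess)                      # hypothetical call
    for _ in range(max_iter):
        eps = camera.measure_tcp() - tcp_guess           # equation 4.24
        if np.linalg.norm(eps) < beta:
            break
        robot.move_linear(-alpha * eps)                  # hypothetical call
    return robot.flange_pose()   # T1 for this converged position, used in section 2.3
```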


4.6 Alternative method for rotation symmetric tools

If the transformation between the camera and the robot's base frame, X2, is known, the iterative method for increasing the accuracy will work even if the tool is symmetric.

During the iterative method, the robot performs a reorientation and the camera only needs to measure the translation error between the positions before and after the reorientation. This ensures that the transformation T2 between the camera frame and the tool frame only needs to be determined with five degrees of freedom during the iterative procedure.

However, the direction of the movement −ε must be expressed in relation to the robot's base frame. Therefore the rotation part of the transformation X2 between the camera frame and the robot base frame needs to be determined. The rotation part, RX2, of the homogeneous transformation X2 can be obtained according to [4] by moving the robot a certain distance in the X, Y and Z directions respectively in the robot's base frame.

Let u = [ux uy uz]^T, v = [vx vy vz]^T and w = [wx wy wz]^T be the vectors measured by the camera when the robot moves a specified distance in the X, Y and Z directions respectively. According to [4] the rotation part RX2 of the homogeneous transformation X2 is then obtained as

RX2 = [ ux vx wx
        uy vy wy
        uz vz wz ]    (4.25)

The translational scaling factor between the two coordinate systems is also possible to determine, because the distances the robot moves in the X, Y and Z directions are known and the distances of the same movements measured by the camera are also retrieved. This ensures that it is possible to find the translation error ε between TCPguess and the true coordinate of the tool center point TCPcorrect, and thereby to perform a tool center point calibration, even if the robot tool is rotation symmetric.
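A minimal sketch of how RX2 and the translational scale factor could be computed from the three measured vectors is given below. The normalization of u, v and w (so that the columns form a proper rotation matrix) and the use of the mean measured length for the scale are assumptions of this sketch, not prescribed by [4].

#include <cmath>

struct Vec3 { double x, y, z; };

static double len(const Vec3& v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }

static Vec3 normalize(const Vec3& v)
{
    double n = len(v);
    Vec3 r = { v.x / n, v.y / n, v.z / n };
    return r;
}

// u, v, w: camera-measured displacements for robot moves along X, Y and Z.
// robotDistance: the known length of each robot move (e.g. in mm).
void buildRX2(const Vec3& u, const Vec3& v, const Vec3& w,
              double robotDistance, double R[3][3], double* scale)
{
    Vec3 un = normalize(u), vn = normalize(v), wn = normalize(w);

    // Columns of RX2 are the measured directions of the robot's X, Y and Z axes
    // as seen by the camera (equation 4.25).
    R[0][0] = un.x; R[0][1] = vn.x; R[0][2] = wn.x;
    R[1][0] = un.y; R[1][1] = vn.y; R[1][2] = wn.y;
    R[2][0] = un.z; R[2][1] = vn.z; R[2][2] = wn.z;

    // Translational scale: known robot distance over the mean measured length.
    *scale = robotDistance / ((len(u) + len(v) + len(w)) / 3.0);
}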


Chapter 5

Image Segmentation of Robot Tool

This chapter describes the methods evaluated during this project for image segmentation of the robot tool.

5.1 Overview

Since one of the goals of this thesis was to deliver a software program written in C/C++ and C#, all of the image processing had to be written in either of these programming languages. The HMI part of the program was written in C# and the image processing functions were written in C/C++.

To keep the production costs as low as possible, only open source libraries were allowed to be used during this project. Due to this constraint the image processing functions in the Open Computer Vision Library 1.0 (OpenCV 1.0) were mainly used for image segmentation.

5.1.1 Open Computer Vision Library (OpenCV)

OpenCV 1.0 is an open source project written in C consisting of a large collection of image processing algorithms. The library is also compatible with Intel's IPL image package and can utilize the Intel Integrated Performance Primitives for better performance. [14] [15]

5.2 Creation of binary mask image

To retrieve a rough approximation of the image region where the object is located, numerous approaches can be used. In this section two different methods will be evaluated.


5.2.1 Image registration

The idea of image registration is to find the transfer field v(x) : R² → R², x = [x y]^T, making a reference image I2(x, y) fit as well as possible in a target image I1(x, y), i.e. finding the v(x) minimizing ε in equation 5.1

ε² = ‖I2(x + v(x)) − I1(x)‖²    (5.1)

The method outlined in this section is described in more detail in [16]. By letting the reference image be a picture of the desired tool and the target image be the image acquired by the camera, this method should iteratively be able to find the location of the tool in the target image. One constraint on the target image has to be fulfilled.

Constraint I: The image can locally be described as a sloping plane.

I(x, t) = I(x, t) − ∇I^T(x, t)v(x) + Δt

Δt = I(x, t + 1) − I(x, t)

∇I(x, t) = [ ∇x I(x, t + 1)
             ∇y I(x, t + 1) ]

Constraint I can be rewritten as

I(x, t) = I(x, t) − ∇I^T(x, t)v(x) + Δt
I(x, t) = I(x, t) − ∇I^T(x, t)v(x) + (I(x, t + 1) − I(x, t))
I(x, t) = −∇I^T(x, t)v(x) + I(x, t + 1)
∇I^T(x, t)v(x) = I(x, t + 1) − I(x, t)    (5.2)


Let:

I1 = I(x, t+1)
I2 = I(x, t)
v(x) = B(x)p

B(x) = [ 1 0 x y 0 0
         0 1 0 0 x y ]

p = [p1 ... p6]^T

Then 5.2 can be written as

∇I2^T B(x) p = I1 − I2    (5.3)

Let:

A = [ ∇I2(x1)^T B(x1)
      ...
      ∇I2(xn)^T B(xn) ]

b = [ I1(x1) − I2(x1)
      ...
      I1(xn) − I2(xn) ]

where n is the number of pixels in I2. Equation 5.3 is then rewritten as equation 5.4.

Ap = b    (5.4)

This implies that p and v(x) are determined as

p = (A^T A)⁻¹ A^T b = A†b    (5.5)
v(x) = B(x)p    (5.6)

The new location I2new of the reference image is then obtained by interpolating I2 from v(x). The method is then iterated with I2new as I2 until |v(x)| < β, where β is the desired accuracy.
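A minimal sketch of one iteration of equations 5.3 to 5.5, using the OpenCV 1.0 C API, could look as follows. It assumes single-channel floating point images of equal size and that the gradients of I2 are available as Ix and Iy; cvSolve with CV_SVD computes the least squares solution of equation 5.5.

#include <cv.h>

// Builds the linear system of equations 5.3/5.4 and solves for the six
// affine parameters p (equation 5.5).  I1, I2, Ix, Iy: 32-bit float CvMat.
void estimateAffineField(const CvMat* I1, const CvMat* I2,
                         const CvMat* Ix, const CvMat* Iy, CvMat* p /* 6x1 */)
{
    int rows = I2->rows, cols = I2->cols, n = rows * cols;
    CvMat* A = cvCreateMat(n, 6, CV_32FC1);
    CvMat* b = cvCreateMat(n, 1, CV_32FC1);

    for (int y = 0, i = 0; y < rows; ++y) {
        for (int x = 0; x < cols; ++x, ++i) {
            float gx = CV_MAT_ELEM(*Ix, float, y, x);
            float gy = CV_MAT_ELEM(*Iy, float, y, x);
            // Row of A = grad(I2)^T * B(x), with B(x) as defined above.
            CV_MAT_ELEM(*A, float, i, 0) = gx;
            CV_MAT_ELEM(*A, float, i, 1) = gy;
            CV_MAT_ELEM(*A, float, i, 2) = gx * x;
            CV_MAT_ELEM(*A, float, i, 3) = gx * y;
            CV_MAT_ELEM(*A, float, i, 4) = gy * x;
            CV_MAT_ELEM(*A, float, i, 5) = gy * y;
            CV_MAT_ELEM(*b, float, i, 0) = CV_MAT_ELEM(*I1, float, y, x)
                                         - CV_MAT_ELEM(*I2, float, y, x);
        }
    }
    cvSolve(A, b, p, CV_SVD);    // p = pseudo-inverse(A) * b  (equation 5.5)
    cvReleaseMat(&A);
    cvReleaseMat(&b);
}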

The method described in this section was implemented and evaluated in Matlab. Unfortunately the method was not reliable. The resulting position of the reference image I2 was heavily affected by the light conditions. However, when using the method on synthetic binary test images the method performed very well.

Of course it would be possible to threshold the image I1, retrieving a binary image Ibinary, and then apply the image registration method on the binary image. However, finding the threshold value that separates the background completely from the tool is a hard problem to solve.

Of course there might exist image registration methods better suited for solving the problem. A phase based image registration method would perhaps give a better result than the method described here. However, due to the fact that it is possible to move the robot to different positions, the background subtraction method was instead evaluated.

5.2.2 Background subtraction

If several image frames can be acquired and the object is moving between each frame, the background subtraction method can be used to retrieve a smaller search area for the object.

The background subtraction is done by subtracting two images from each other and performing a threshold. If the scene is completely static except for the moving object, all of the background will be eliminated, resulting in a binary mask image where background pixels are set to 0 and object pixels are set to 1.

Let

• f1(x, y) be the image acquired at the start position.

• f2(x, y) be the image acquired at the final position.

• b(x, y) be the binary result image.

• T be the threshold value.

b(x, y) = { 1 , ‖f1(x, y) − f2(x, y)‖ ≥ T
          { 0 , ‖f1(x, y) − f2(x, y)‖ < T

The resulting binary mask image will of course have its pixels set to one in both the start location and the final location of the object. To distinguish the object's current location from its previous locations, the object can be moved to a third position and a new image f3 can be retrieved.

By applying the background subtraction method to f1 − f3 and f2 − f3 separately and applying a pixel-wise logical AND operator to the resulting binary images, a new binary mask image only giving the last location of the object will be retrieved, as sketched below.
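A minimal sketch of this step with the OpenCV 1.0 C API is shown below, assuming 8-bit grayscale frames f1, f2, f3 of equal size and an illustrative threshold T.

#include <cv.h>

IplImage* objectMask(const IplImage* f1, const IplImage* f2,
                     const IplImage* f3, double T)
{
    CvSize size = cvGetSize(f1);
    IplImage* d13  = cvCreateImage(size, IPL_DEPTH_8U, 1);
    IplImage* d23  = cvCreateImage(size, IPL_DEPTH_8U, 1);
    IplImage* mask = cvCreateImage(size, IPL_DEPTH_8U, 1);

    cvAbsDiff(f1, f3, d13);                          // |f1 - f3|
    cvAbsDiff(f2, f3, d23);                          // |f2 - f3|
    cvThreshold(d13, d13, T, 255, CV_THRESH_BINARY); // change between positions 1 and 3
    cvThreshold(d23, d23, T, 255, CV_THRESH_BINARY); // change between positions 2 and 3
    cvAnd(d13, d23, mask);                           // keep only the last (third) location

    cvReleaseImage(&d13);
    cvReleaseImage(&d23);
    return mask;
}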

In practice the resulting binary mask image was not perfect; several holes occurred in the object region, see figure 5.1.


Figure 5.1: Binary mask image from background subtraction.

To fill the holes in the mask image, morphological operations were applied to the binary image. By gradually applying dilate and erode operators (cvDilate and cvErode in OpenCV) on the mask image the holes were filled.

However, due to the quadrangular structuring element used by the morphological operators, the tip structure of the object mask image was deteriorated. Later on during the segmentation process the damaged tip structure proved to be an obvious drawback.

Instead of using the morphological operators, a new method referred to as the Hough Filling Method (HFM) was devised.

5.2.2.1 The Hough filling method (HFM)

By applying the Progressive Probabilistic Hough Transform (PPHT) described in [17] to the binary mask image in figure 5.1, all probabilistic lines were found. If the permitted gap constraint was set to a suitable value and the minimum length of a pixel run to be considered a line was set to zero, all holes were perfectly filled when the lines found by the PPHT were drawn on the binary mask image. The HFM will not increase the mask image boundary as long as the curvature of the boundary remains low. The PPHT algorithm is implemented in the OpenCV function cvHoughLines2 [18].
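A minimal sketch of the Hough filling step is given below. The parameter values passed to cvHoughLines2 are illustrative only; the zero minimum line length and the generous permitted gap correspond to the settings described above.

#include <cv.h>

void houghFill(IplImage* mask /* binary 8-bit mask image */)
{
    CvMemStorage* storage = cvCreateMemStorage(0);
    IplImage* work = cvCloneImage(mask);   // cvHoughLines2 may modify its input
    CvSeq* lines = cvHoughLines2(work, storage, CV_HOUGH_PROBABILISTIC,
                                 1, CV_PI / 180, 20,
                                 0,    /* minimum line length */
                                 30);  /* maximum permitted gap */

    // Draw every probabilistic line back onto the mask to close the holes.
    for (int i = 0; i < lines->total; ++i) {
        CvPoint* pts = (CvPoint*)cvGetSeqElem(lines, i);
        cvLine(mask, pts[0], pts[1], cvScalarAll(255), 1, 8, 0);
    }
    cvReleaseImage(&work);
    cvReleaseMemStorage(&storage);
}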

By applying the background subtraction method and filling gaps and holes with the Hough filling method, almost perfect binary mask images were achieved for all three locations of the tool, see figure 5.2. After the binary mask was applied, only a small search area was left in the image.

Figure 5.2: Binary mask image from background subtraction after applying the HFM.


5.3 Edge Detection

An edge can be treated as a locally odd function in an image,

f(−x,−y) = −f(x, y)

and such a function will always have a gradient. The easiest way of finding the gradient is probably to convolve the image with the Sobel operators

Gx = [ 1 0 −1        Gy = [ −1 −2 −1
       2 0 −2                0  0  0
       1 0 −1 ]              1  2  1 ]

The edge image may then be retrieved as fedge = √((f ∗ Gx)² + (f ∗ Gy)²).

Unfortunately this method will not, according to [2], work properly in noise-affected environments. Irregular light conditions will also decrease the functionality of this method.

5.3.1 Canny’s Edge Detection1986 J.Canny presented a more robust technique in [19] for edge detection. Incontrast to the previous method, Canny´s edge detection algorithm makes use ofthe second derivates, which are zero when the first derivates reach their maxima.Let n = ∇f(x,y)√

f2x(x,y)+f2

y (x,y)be the unit vector in the gradient direction. Where fi(x, y)

is the image f(x, y) derived in the i direction.The edge image fcanny can be found by deriving the absolute value of the gradientimage in the direction of the gradient.fcanny = n · ∇‖∇f(x, y)‖ = f2

x(x,y)fxx(x,y)+2fx(x,y)fy(x,y)fxy(x,y)+fy(x,y)2fyy(x,y)f2

x(x,y)+f2y (x,y)

To be able to find the contour of the tool, the Canny edge detection algorithm [19] was used on the masked image

f(x, y)masked = { f(x, y) , if b(x, y) = 1
                { 0       , if b(x, y) = 0    (5.7)

where b(x, y) is the resulting image given by the background subtraction and HFM methods. To ensure that all of the tool was left in the masked image fmasked, the dilation operator was applied to b(x, y) before calculating fmasked according to equation 5.7.


After applying the Canny edge detection algorithm on the masked image fmasked, the binary edge image bedge was obtained. The Canny algorithm is implemented in the OpenCV library function cvCanny [18].

The threshold values in the cvCanny function had to be determined according to the current environment. If the image contained a lot of noise, the threshold had to be set to a high value to ensure the elimination of the noise. Unfortunately the high threshold sometimes resulted in discontinuities of the tool edges in the edge image bedge. To solve the problem of discontinuities of the tool edges, the Hough Filling Method (see section 5.2.2.1 on page 39) was used with an excellent result. Another disadvantage was the presence of light reflections in the metallic surface of the tool. The contour of a reflection had the same shape as the actual tool, resulting in possible false hits later in the segmentation procedure. To avoid the influence of light reflections the tool was painted in a non-reflecting colour.
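A minimal sketch of the masking and edge detection steps with the OpenCV 1.0 C API could look as follows; the threshold values and the number of dilation iterations are illustrative assumptions.

#include <cv.h>

IplImage* edgesOfMaskedTool(const IplImage* f, const IplImage* b /* binary mask */)
{
    CvSize size = cvGetSize(f);
    IplImage* mask    = cvCloneImage(b);
    IplImage* fmasked = cvCreateImage(size, IPL_DEPTH_8U, 1);
    IplImage* bedge   = cvCreateImage(size, IPL_DEPTH_8U, 1);

    cvDilate(mask, mask, NULL, 2);          // grow the mask so no tool pixels are cut away
    cvZero(fmasked);
    cvCopy(f, fmasked, mask);               // equation 5.7: keep f where b = 1
    cvCanny(fmasked, bedge, 50, 150, 3);    // binary edge image b_edge

    cvReleaseImage(&mask);
    cvReleaseImage(&fmasked);
    return bedge;
}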

5.4 Contour Retrieving

To retrieve the contours from the binary edge image, bedge, two different methods were evaluated. This section will illuminate the two methods, but first two different contour representations will be described.

Observe that it would be possible to retrieve an approximation of the tool contour from the binary mask image, see figure 5.2. In practice the result became more accurate when the tool contour was retrieved from the binary edge image bedge given by the Canny edge detection algorithm.

5.4.1 Freeman Chain Code Contour representation

The Freeman chain code representation is a compact representation of contours. By denoting the neighbours of the current pixel with different digits, see figure 5.3, the contour can be stored as a succession of digits. Each digit then contains all necessary information for finding the next pixel in the contour. [18]

See the example in figure 5.4.

Figure 5.3: The Freeman representation.


Figure 5.4: An example of the Freeman Chain Code.

5.4.2 Polygon Representation

The polygonal representation of a contour is, according to [18], a more suitable representation for contour manipulations. In the polygonal representation the sequence of points is stored as vertices, i.e. the pixel coordinates are stored.

5.4.3 Freeman Methods

The four scan line methods described in [20] are implemented in the OpenCV function cvFindContours [18]. The scan line methods scan the image line by line until an edge is found. When an edge is found the method starts a border following procedure until the current border is retrieved in the Freeman representation.

The first method described in [20] only finds the outermost contours, while the other methods discover contours on several levels. The methods were evaluated on the binary edge image with satisfactory results.

The methods with the ability to find contours on different levels had a disadvantage because unnecessary contours were found, increasing the risk of confusing the true object with a light reflection of the same shape.
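A minimal sketch of retrieving only the outermost contours in Freeman chain-code representation with cvFindContours is given below (cvFindContours modifies its input, so a copy of the edge image is used).

#include <cv.h>

CvSeq* outerContours(const IplImage* bedge, CvMemStorage* storage)
{
    IplImage* work = cvCloneImage(bedge);
    CvSeq* first = NULL;
    cvFindContours(work, storage, &first, sizeof(CvChain),
                   CV_RETR_EXTERNAL,      // outermost contours only
                   CV_CHAIN_CODE);        // Freeman chain-code representation
    cvReleaseImage(&work);
    return first;                          // linked list of retrieved contours
}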

5.4.4 Active Contour (The Snake algorithm)

Active contours, or snakes, are model-based methods for segmentation. The active contour is a spline function v(s) with several nodes.

v(s) = (x(s), y(s))    (5.8)

where x(s), y(s) are the x, y coordinates along the contour and s ∈ [0, 1]. Each node has an internal energy and the algorithm tries to minimize the total energy, E, of the spline function:

E = ∫₀¹ Esnake(v(s)) ds    (5.9)


The spline function is affected by internal forces Eint(v(s)) and external forces Eimg(v(s)) and Econ(v(s)).

Esnake = Eint(v(s)) + Eimg(v(s)) + Econ(v(s))

The internal forces affecting the active contour are divided into tension forces and rigidity forces. The tension forces imply a spring behaviour of the spline function, while the rigidity forces make the spline function resist bending.

Eint(v(s)) = α(s) |dv(s)/ds|² + β(s) |d²v(s)/ds²|²

where α(s) specifies the elasticity and β(s) specifies the rigidity of the spline function.

The external forces consist of image forces Eimg and user-specified constraint forces Econ. The image force is an image where each pixel value defines a force. The constraint forces can be used to guarantee that the active contour does not get stuck at a local minimum. The constraint forces might be set by a higher-level process. [21]

One can compare the snake algorithm to a rubber band expanded to its maximum when the algorithm is initialized. The rubber band then iteratively contracts until the energy function in each node has reached equilibrium.

By using the binary edge image as the external force and defining the initial contour to be the result of applying the Freeman scan line method to the binary mask image, the active contour will successively contract until it perfectly encloses the object, see figure 5.5.

Figure 5.5: The snake contour at the initial state to the left and at the final state to the right. The contour is randomly color coded from the start point to the end point of the contour.

In fact the mask image and the initial active contour have to be larger than the object itself to be able to enclose the object. If there are structures or patterns behind the object, the active contour might get stuck due to forces in the edge image belonging to these structures, see figure 5.6. This drawback, together with the fact that the algorithm is iterative and therefore time consuming, made the Freeman method better suited for retrieving the contour.

Figure 5.6: The snake contour could get stuck at the background pattern. The contour is randomly color coded from the start point to the end point of the contour.

The snake algorithm is implemented in the OpenCV function cvSnakeImage [18].
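A minimal sketch of how cvSnakeImage could be called is given below; the coefficient values, window size and termination criteria are illustrative assumptions, and the initial contour points are assumed to come from the Freeman scan line step.

#include <cv.h>

void refineWithSnake(const IplImage* edge /* external force image */,
                     CvPoint* pts, int n   /* initial contour nodes */)
{
    float alpha = 0.1f;   // elasticity (tension)
    float beta  = 0.4f;   // rigidity (resistance to bending)
    float gamma = 0.5f;   // weight of the image (external) energy

    cvSnakeImage(edge, pts, n, &alpha, &beta, &gamma, CV_VALUE,
                 cvSize(15, 15),                              // search window per node
                 cvTermCriteria(CV_TERMCRIT_ITER, 100, 0.0),
                 1 /* use the image gradient as external force */);
}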

5.5 Finding the contour matching the tool shape

When all contours were retrieved by the Freeman method, some logic had to be applied for actually finding a contour, or a part of a contour, matching the wanted tool object. Two different methods were implemented and evaluated.

5.5.1 Convex Hull and its defects

A widely used methodology for finding objects with a shape reminiscent of a hand or a finger is to analyse the convex hull of the object. The awl shape of the tool object is quite reminiscent of a finger shape. In the OpenCV library a function named cvConvexHull2 exists for finding the convex hull of a contour. There is also a function for finding the deepest defects of the convex hull, i.e. the points farthest away from every line segment of the convex hull. The two functions were used to retrieve all points outlining the convex hull and the points constituting the deepest defects of the hull. All five points defining the tool were found by these functions. By finding the two deepest defect points, two of the five desired points could be determined (the two points closest to the robot's wrist), see figure 5.7. To determine the three remaining points of the tool, lines were drawn between these points and their nearest neighbours, resulting in four different lines. The four lines were then compared two by two to find the two lines being most parallel to each other, see figure 5.8. By this method four out of five of the desired points were easily determined. Finding the last point of interest (the TCP) was then easy, since it was the only point lying on the convex hull fulfilling the constraint of being a nearest neighbour of two of the points already found. The method was fast and performed well as long as the robot only made reorientations around axis five during the background subtraction method. When more complex robot movements were used, a huge number of defect points in the convex hull were found, resulting in the method becoming time consuming. Instead the polygonal approximation method was evaluated.
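A minimal sketch of the convex hull analysis with the OpenCV 1.0 C API is given below; the printout is only for illustration, and selecting the two deepest defects from the returned sequence is left out.

#include <cv.h>
#include <stdio.h>

void hullAndDefects(CvSeq* contour, CvMemStorage* storage)
{
    // Convex hull returned as pointers into the contour sequence.
    CvSeq* hull    = cvConvexHull2(contour, storage, CV_CLOCKWISE, 0);
    CvSeq* defects = cvConvexityDefects(contour, hull, storage);

    for (int i = 0; i < defects->total; ++i) {
        CvConvexityDefect* d = (CvConvexityDefect*)cvGetSeqElem(defects, i);
        // d->depth_point is the contour point farthest from the hull segment,
        // d->depth is its distance; the two largest depths are kept.
        printf("defect %d: depth %.1f at (%d, %d)\n",
               i, d->depth, d->depth_point->x, d->depth_point->y);
    }
}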

Figure 5.7: The green lines are the convex hull found by the method, and the red dots are the points defining the convex hull and the defect points.

Figure 5.8: The four lines are illustrated in this figure. The green lines are the two lines being most parallel, found by the logic. The red lines are not parallel and will therefore be neglected.

5.5.2 Polygonal Approximation of Contours

To further compress the retrieved contour several methods can be used, such as Run Length Encoding compression and polygonal approximation. In this implementation the Douglas-Peucker polygonal approximation was used (cvApproxPoly in the OpenCV library). The reason for using a polygon approximation method was the possibility to set the approximation accuracy to a level where the tool object only consists of five points, see figure 5.9.

Figure 5.9: Five points polygonal approximation of the tool.

5.5.2.1 Douglas-Peucker Approximation Method

The method starts with the two points p1 and p2 on the contour having the largest internal distance. The algorithm loops through all points on the contour to retrieve the largest distance from the line p1p2. If the maximum distance is lower than the desired accuracy threshold, the process is finished. If not, the point p3 at the longest distance from the line p1p2 is added to the resulting contour. The line p1p2 is then split into the two line segments p1p3 and p3p2. The same methodology is then applied recursively until the desired accuracy constraint is fulfilled. [18]
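A minimal sketch of the approximation step is given below; the contour is assumed to be stored as a point sequence, and the strategy of raising the accuracy threshold until only five vertices remain is one possible way to obtain the five point representation described above, with illustrative start and stop values.

#include <cv.h>

CvSeq* fivePointTool(CvSeq* contour, CvMemStorage* storage)
{
    double accuracy = 1.0;
    CvSeq* poly = cvApproxPoly(contour, sizeof(CvContour), storage,
                               CV_POLY_APPROX_DP, accuracy, 0);
    // Increase the accuracy threshold until the contour is reduced to five points.
    while (poly->total > 5 && accuracy < 100.0) {
        accuracy += 1.0;
        poly = cvApproxPoly(contour, sizeof(CvContour), storage,
                            CV_POLY_APPROX_DP, accuracy, 0);
    }
    return poly;
}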

5.6 Geometric constraints

When the five points were found, some logic was applied to make sure the five points fulfilled the geometry of the tool shape. The angles α1, α2 and α3 between the four different lines L1, L2, L3 and L4 of the shape, illustrated in figure 5.10, were determined and the area A of the tool was calculated. The longest sides L1 and L2 of the tool had to fulfill the constraint of being parallel, and the length of the long sides L1 and L2 had to be at least twice the length of the short sides L3 and L4. If all these constraints were fulfilled, the five points were classified as part of the tool shape. To ensure that only one tool was found, the constraints were initially set to be very tough. If the method did not find a tool when the toughest constraints were used, the constraints were gradually slackened.
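A minimal sketch of the parallelism and length tests is given below, using plain 2D vector arithmetic. The labelling of the five points into the lines L1 to L4 and the tolerance value are assumptions made for the sake of the example.

#include <cmath>

struct Pt { double x, y; };

static double angleBetween(Pt a0, Pt a1, Pt b0, Pt b1)
{
    double ax = a1.x - a0.x, ay = a1.y - a0.y;
    double bx = b1.x - b0.x, by = b1.y - b0.y;
    double c = (ax * bx + ay * by) /
               (std::sqrt(ax * ax + ay * ay) * std::sqrt(bx * bx + by * by));
    return std::acos(std::fabs(c));           // 0 when the lines are parallel
}

static double length(Pt a, Pt b) { return std::hypot(b.x - a.x, b.y - a.y); }

// One assumed labelling: L1 = (p0,p1), L2 = (p2,p3), L3 = (p1,p2), L4 = (p3,p4).
bool looksLikeTool(Pt p0, Pt p1, Pt p2, Pt p3, Pt p4, double maxAngleRad)
{
    bool longParallel = angleBetween(p0, p1, p2, p3) < maxAngleRad;
    bool longEnough   = length(p0, p1) > 2.0 * length(p1, p2) &&
                        length(p2, p3) > 2.0 * length(p3, p4);
    return longParallel && longEnough;
}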

Figure 5.10: Definition of geometric constraints of the tool.


5.7 Corner Detection

When the tool contour was found by the polygonal approximation algorithm, five points defining the object were obtained. To use the camera as a measurement unit, at least three points had to be found. Unfortunately only three of the five points defining the contour of the tool could be found with high accuracy. Due to occlusion the two points closest to the robot wrist were hard to detect. Instead the tool center point (the tip of the awl) and the two points where the cylindrical part of the awl starts were used, see figure 5.11.

Figure 5.11: Calibration tool with the three points found.

All three points had one thing in common: their locations were at corners. By matching the three points given by the polygonal approximation algorithm with the closest points having a high certainty of being a corner, the three points in figure 5.11 were found.

To find the points with a high probability of being corners in the neighbourhood of the points given by the polygonal approximation, three different methods were evaluated.

5.7.1 Harris corner detector

The idea of the Harris corner detection algorithm described in [22] is to use a function

E(x, y) = Σ w(x, y)(I(x + u, y + v) − I(x, y))²    (5.10)

where w(x, y) is a window function, typically a Gaussian function, and I(x + u, y + v) is a shifted version of the intensity image I(x, y). At a corner pixel the difference between I(x + u, y + v) and I(x, y) will be high, resulting in a high value of E(x, y).

For a small shift (the u and v are small) a bilinear approximation can be used.

E(x, y) ≈ [u v] Σx,y w(x, y) [ Ix²   IxIy
                               IxIy  Iy²  ] [u
                                             v]    (5.11)


where Ix is the x-derivative and Iy is the y-derivative of the image I.

Let M = Σx,y w(x, y) [ Ix²   IxIy
                       IxIy  Iy²  ]    (5.12)

Equation 5.11 and equation 5.12 give

E(x, y) ≈ [u v] M [u
                   v]    (5.13)

The eigenvalues λ1 and λ2 of M then contain the information about the directions of the fastest and the slowest changes. As mentioned earlier in this section, a corner has fast intensity changes in all directions, resulting in high λ1 and λ2 values.

λ1 ≈ λ2, both large     Corner
λ1 < λ2                 Edge
λ1 > λ2                 Edge
λ1 ≈ λ2, both small     Flat

Table 5.1: Interpretation of the eigenvalues.

The Harris corner detection algorithm makes use of the information stored in the eigenvalues of M.

R = det(M) − k(trace(M))² = λ1λ2 − k(λ1 + λ2)², where k is a parameter.    (5.14)

R      |R|      Result
> 0    Large    Corner
< 0    Large    Edge
−      Small    Flat

Table 5.2: Interpretation of R.

The Harris corner detection algorithm was implemented in Matlab. Figure 5.13 shows the R image when the algorithm was applied to figure 5.12. By thresholding the R image, only points with a very high certainty of being a corner were left.

The Harris corner detection algorithm is also implemented in the OpenCV function cvGoodFeaturesToTrack.
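A minimal sketch of calling cvGoodFeaturesToTrack with the Harris measure enabled is given below; the quality level, minimum distance and k are illustrative values.

#include <cv.h>

int harrisCorners(const IplImage* gray, CvPoint2D32f* corners, int maxCorners)
{
    CvSize size = cvGetSize(gray);
    IplImage* eig  = cvCreateImage(size, IPL_DEPTH_32F, 1);
    IplImage* temp = cvCreateImage(size, IPL_DEPTH_32F, 1);

    int count = maxCorners;
    cvGoodFeaturesToTrack(gray, eig, temp, corners, &count,
                          0.05,   // quality level relative to the strongest corner
                          10.0,   // minimum distance between corners [pixels]
                          NULL,   // no mask
                          3,      // block size
                          1,      // use the Harris measure R of equation 5.14
                          0.04);  // the parameter k
    cvReleaseImage(&eig);
    cvReleaseImage(&temp);
    return count;
}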


Figure 5.12: Image of robot tool used for Harris corner detection.

Figure 5.13: The R image in Harris corner detection algorithm.

5.7.1.1 Sub-pixel Accuracy

By using the positions of the corners found by the Harris detector as input to the OpenCV function cvFindCornerSubPix, the positions of the corners could be retrieved with sub-pixel accuracy.

Let εi be a minimization function, pi a point in the neighborhood of q, ∇Ii the gradient at the point pi, and q the corner with sub-pixel accuracy, see figure 5.14.

εi = ∇Ii^T · (q − pi)    (5.15)

Figure 5.14: Accurate sub-pixel corner detection. The blue arrows are the image gradients and the green vectors are the vectors from q to pi.


Observe that all vectors from q to pi are orthogonal to ∇Ii. To retrieve the corner point q with sub-pixel accuracy, εi in equation 5.15 was minimized for several points pi, resulting in a system of equations giving q.

q = (Σi ∇Ii∇Ii^T)⁻¹ (Σi ∇Ii∇Ii^T pi)    (5.16)

Please refer to [18] for more information about sub-pixel accuracy.
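A minimal sketch of the refinement call is given below; the window size and termination criteria are illustrative assumptions.

#include <cv.h>

void refineCorners(const IplImage* gray, CvPoint2D32f* corners, int count)
{
    cvFindCornerSubPix(gray, corners, count,
                       cvSize(5, 5),     // half size of the search window
                       cvSize(-1, -1),   // no dead zone in the middle
                       cvTermCriteria(CV_TERMCRIT_ITER | CV_TERMCRIT_EPS, 30, 0.01));
}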

5.7.2 Fast radial

G. Loy and A. Zelinsky proposed a method in [23] based on radial symmetry for finding points of interest. A Matlab implementation by the authors was found. See the result image in figure 5.15 when the algorithm was applied to figure 5.12.

Figure 5.15: Result of the Fast Radial algorithm.

The fast radial algorithm resulted in more points of interest than the Harris corner detector. In this particular case the algorithm found many false hits. Due to this effect, the fast radial algorithm was rejected.

5.7.3 Orientation Tensor

To be able to find the three points of the tool with sub-pixel accuracy, the orientations of the tool sides can be used. The orientation of every pixel at the tool's contour was found in double angle representation [24]. To find the double angle representation the orientation tensor was determined.

Several ways of determining the orientation tensor have been proposed. In [24] G. Granlund and H. Knutsson propose a method based on quadrature filter responses. C. Mota, I. Stuke, and E. Barth describe a method in [25] for creating the orientation tensor as an outer product of derivatives. G. Farneback proposed another method for retrieving the orientation tensor in [26], based on polynomial expansion.


The method based on outer products of derivatives was implemented in Matlab.

T = [ T11 T12     = [ Ix²   IxIy
      T21 T22 ]       IxIy  Iy²  ]    (5.17)

Ix², Iy² and IxIy were retrieved by 2D convolutions of the image I(x, y) with differentiated 2D Gaussian functions.

Let:

g(x) = e^(−X²/2σ²)
g(y) = e^(−Y²/2σ²)
∂g(x)/∂x = (−X/σ²) e^(−X²/2σ²)
∂g(y)/∂y = (−Y/σ²) e^(−Y²/2σ²)
h1(x, y) = ∂g(x)/∂x ∗ g(y)
h2(x, y) = ∂g(y)/∂y ∗ g(x)

Figure 5.16 shows h1(x, y) and h2(x, y). By performing a 2D convolution of h1(x, y) and the image I(x, y), the image Ix was retrieved. Observe that a faster way to implement the function is to perform a sequence of 1D convolutions on the image I(x, y),

Ix = I(x, y) ∗ ∂g(x)/∂x ∗ g(y)

The double angle representation of orientation can be retrieved from the orientation tensor T.

Let v be the double angle orientation vector.

v = [ v1    = [ T11 − T22    = [ Ix² − Iy²
      v2 ]      2T12      ]      2IxIy     ]

To ensure v is the double angle representation, let Ix = cos(φ) and Iy = sin(φ):

v = [ Ix² − Iy²    = [ cos²(φ) − sin²(φ)    = [ cos(2φ)
      2IxIy     ]      2 cos(φ) sin(φ)   ]      sin(2φ) ]

Figure 5.16: Gaussian derived in x and y directions respectively.

The advantage of using a double angle representation of orientation is the possibility to avoid branch cuts in the representation. If only the angle between the x-axis and an image line is used as orientation information, no average between neighboring points can be computed. For instance, a line with an angle close to 0 rad and a line with an angle close to π rad will have very different orientations. When using the double angle representation these two lines will instead have orientations close to each other. This implies the possibility of computing an average of orientations when using the double angle representation. By color coding the double angle and letting the intensity be |v|, an orientation image can be retrieved, please refer to [24]. Figure 5.18 shows the orientation image of figure 5.17.

Figure 5.17: Test image for double angle representation.


Figure 5.18: Orientation image of figure 5.17

By computing the orientation tensor of the image acquired by the camera and calculating the orientation image, figure 5.19 was retrieved. For every pixel a line was drawn in the direction specified by the orientation image. When all lines were accumulated, the three points could be found by applying a high threshold.
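A minimal sketch of computing the double angle image with the OpenCV 1.0 C API is given below. It uses cvSobel instead of the Gaussian derivative filters described above, which is an assumption of the sketch; the per-pixel double angle vector (Ix² − Iy², 2IxIy) is the same.

#include <cv.h>

void doubleAngle(const IplImage* gray, IplImage* v1, IplImage* v2 /* 32F, 1 channel */)
{
    CvSize size = cvGetSize(gray);
    IplImage* g32 = cvCreateImage(size, IPL_DEPTH_32F, 1);
    IplImage* Ix  = cvCreateImage(size, IPL_DEPTH_32F, 1);
    IplImage* Iy  = cvCreateImage(size, IPL_DEPTH_32F, 1);
    IplImage* Ix2 = cvCreateImage(size, IPL_DEPTH_32F, 1);
    IplImage* Iy2 = cvCreateImage(size, IPL_DEPTH_32F, 1);

    cvConvert(gray, g32);
    cvSobel(g32, Ix, 1, 0, 3);     // derivative in x
    cvSobel(g32, Iy, 0, 1, 3);     // derivative in y

    // v1 = Ix^2 - Iy^2,  v2 = 2*Ix*Iy  (the double angle vector per pixel)
    cvMul(Ix, Ix, Ix2);
    cvMul(Iy, Iy, Iy2);
    cvSub(Ix2, Iy2, v1);
    cvMul(Ix, Iy, v2, 2.0);

    cvReleaseImage(&g32); cvReleaseImage(&Ix);  cvReleaseImage(&Iy);
    cvReleaseImage(&Ix2); cvReleaseImage(&Iy2);
}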

Observe that by calculating the histogram of the orientation image it is possible to tell whether the tool was found in the image. Figure 5.20 shows the histogram. Three peaks are observed; the highest peak is the orientation of the two longest sides of the tool, while the two lower peaks are the orientations of the two shortest sides of the tool. The relations between these three orientations are specific for the tool. Of course the tool has to be segmented before calculating the histogram, otherwise the tool specific orientations would be impossible to determine among all the orientation information in the image. The histogram can also be used to find the orientation of the tool (the highest peak in the histogram) after the tool has been segmented.

Figure 5.19: Double angle representation of the robot tool.


Figure 5.20: Histogram of orientations.


Chapter 6

Results

In this chapter the results of different tests will be presented. The tests were performed to determine the accuracy of the method.

6.1 Repetition Accuracy of the Camera

Figure 6.1 shows the configuration of the camera in relation to the robot tool flange coordinate system. The optical axis of the camera is directed in the −X direction, where X is the X axis of the robot base coordinate system.

Figure 6.1: Configuration during the test.

To be able to determine the repetition accuracy of the camera, the start position of the TCP found by the method was stored in a vector POS1 = [x1 y1 z1]^T. The robot was then randomly moved to a second position. The new position of the TCP was measured by the method several times and stored as POSi = [xi yi zi]^T,


Figure 6.2: Movement mi plotted for all i.

where i is the estimation number. The tool used during this test was the chess board in figure 4.3 and 24 points of the chess board pattern were used to determine the transformation between the camera and the tool (chess board). For all i the movement

mi = √((x1 − xi)² + (y1 − yi)² + (z1 − zi)²)

was calculated. Figure 6.2 shows the result of mi. From the graph the difference between the maximum and minimum interval can be determined to approximately 0.03 mm, which means that the camera is not able to measure distances shorter than 0.03 mm.

6.1.1 Averaging to achieve higher accuracy

By letting every estimate mj be the average of several mi, see equation 6.1, a test was performed to investigate whether a better accuracy could be obtained.

mj = (Σ_{i=1}^{10} mi) / 10    (6.1)

During the test every mj was the average of ten different mi. The result is illustrated in figure 6.3. The difference between the maximum and minimum interval was still determined to 0.03 mm. It is then obvious that averaging the result does not increase the repetition accuracy of the camera when 24 points are used for measuring the transformation between the camera and the tool (chess board).


Figure 6.3: Movement mj plotted for all j.

6.1.2 How does the number of points affect the repetition accuracy

To investigate how the number of points in the chess board pattern affects the repetition accuracy of the camera, the same test (without averaging) was done again. However, this time only the three upper left points of the chess board pattern were used to measure the transformation between the camera and the tool (chess board).

Observe that three points is the minimum for finding the transformation between the camera and the tool. This is due to the fact that a plane needs to be observed by the camera to determine the homography, see section 4.2. A plane is by definition defined by at least three points.

The result of the new test is illustrated in figure 6.4. The difference between the maximum and minimum interval was determined to approximately 0.58 mm. The greater difference between the maximum and minimum interval indicates that the number of points used during the test affects the repetition accuracy. However, even when using only three points the camera repetition accuracy is better than 1 mm.


Figure 6.4: Movement mi plotted for all i when only 3 points of the chess board were observed.


6.2 Camera versus Robot measurements

Both the camera and the robot measurements are affected by errors from the camera calibration and the robot calibration respectively. To investigate the effect of these errors different tests were performed. The configuration of the devices is illustrated in figure 6.1. All twenty-four points of the chess board were observed during these tests.

6.2.1 Rotation measurements

The robot was told to perform a rotation of θ = 20° around a random axis, and the forward kinematics transformation matrix T1 and the measured transformation matrix between the camera and the tool T2 were saved before and after the rotation, see figure 4.1 on page 21. The transformation A from the start position to the final position measured by the camera is then A = T2after⁻¹ T2before, and the transformation B from the start position to the final position measured by the robot is then B = T1after⁻¹ T1before, see figure 4.2 on page 25. Since A and B are homogeneous matrices, the rotation parts of the transformations A and B can be obtained as the upper left 3 × 3 sub-matrices of A and B respectively.

Denote these sub-matrices RA and RB respectively. The angle of rotation θrobot measured by the robot and the angle of rotation θcamera measured by the camera were then obtained by using the screw axis rotation representation described in section 3.3.2 on page 17 and equation 3.27 on page 18.
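Assuming that equation 3.27 is the standard axis-angle relation cos(θ) = (trace(R) − 1)/2, the rotation angle of the relative transformation can be extracted as in the following minimal sketch, where only the 3 × 3 rotation parts of the transforms before and after the movement are needed.

#include <cmath>

// Rotation parts of the transforms after (Ra) and before (Rb) the movement.
// The rotation of the relative transform A = T_after^-1 * T_before is
// R = Ra^T * Rb, and trace(Ra^T * Rb) equals the elementwise inner product
// of Ra and Rb, so R itself never needs to be formed explicitly.
double relativeRotationAngle(const double Ra[3][3], const double Rb[3][3])
{
    double trace = 0.0;
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            trace += Ra[i][j] * Rb[i][j];

    double c = (trace - 1.0) / 2.0;
    if (c > 1.0)  c = 1.0;           // guard against numerical round-off
    if (c < -1.0) c = -1.0;
    return std::acos(c);             // rotation angle in radians
}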

The difference γ between θrobot and θcamera was then calculated according to the following equation.

γ = |θrobot| − |θcamera|    (6.2)

The test was repeated several times and the resulting γ values were plotted in figure 6.5. The difference between the maximum and minimum interval was determined to 0.06° and the maximum deviation was 0.04°.


Figure 6.5: Difference of rotation measured by the robot versus the camera.


6.2.2 Translation measurements

The robot was told to perform linear movements of 10 mm along the three orthogonal axes of the robot base coordinate system. The positions of the robot measured by the robot and the camera respectively were stored after every movement.

Denote the positions of the robot measured by the robot as robposi = [rxi ryi rzi]^T and the positions of the robot measured by the camera and transformed to the robot's base frame by applying X2 as camposi = [cxi cyi czi]^T, where i ∈ [1, N] and N is the number of movements.

Let:

robdiffi = robposi+1 − robposi
camdiffi = camposi+1 − camposi
diffi = robdiffi − camdiffi = [diffxi diffyi diffzi]^T

Figure 6.6 illustrates |diff| during movements in the X direction of the robot base coordinate system. Figure 6.7 and figure 6.8 illustrate |diff| during movements in the Y direction and the Z direction of the robot base coordinate system respectively. Table 6.1 shows the resulting differences between maximum and minimum intervals and the maximum deviations. Observe that the accuracy is higher in the Y and Z directions than in the X direction.

Figure 6.6: Difference of translation in X direction measured by the robot versus the camera.


Figure 6.7: Difference of translation in Y direction measured by the robot versus the camera.

Figure 6.8: Difference of translation in Z direction measured by the robot versus the camera.


Movement in direction    Difference between maximum and minimum interval [mm]    Maximum deviation [mm]
X direction              2.72                                                     1.45
Y direction              1.49                                                     1.16
Z direction              1.66                                                     1.05

Table 6.1: Table showing the differences between maximum and minimum intervals and the maximum deviations.


6.2.2.1 Averaging to achieve higher accuracy

The same procedure was repeated, but this time every robot position camposj measured by the camera was the average of ten measurements.

camposj = (Σ_{i=1}^{10} camposi) / 10

The results are illustrated in figure 6.9, figure 6.10 and figure 6.11. Table 6.2 shows the resulting differences between maximum and minimum intervals and the maximum deviations.

Note that the accuracy has increased in all directions compared to the test when no averaging was used. However, the accuracy of movements in the X direction is still lower than the accuracy in the Y or Z directions.

Observe that the accuracy of these tests is approximately ten times lower than the camera repetition accuracy. This shows the importance of using the iterative method for increasing the accuracy of the TCP calibration method, since the iterative method is independent of the robot calibration error, see section 4.5 on page 31.

Figure 6.9: Difference of translation in X direction measured by the robot versus the camera, averaging has been done.


Figure 6.10: Difference of translation in Y direction measured by the robot versus the camera, averaging has been done.

Figure 6.11: Difference of translation in Z direction measured by the robot versus the camera, averaging has been done.


Movement in direction    Difference between maximum and minimum interval [mm]    Maximum deviation [mm]
X direction              0.56                                                     0.33
Y direction              0.26                                                     0.49
Z direction              0.21                                                     0.22

Table 6.2: Table showing the differences between maximum and minimum intervals and the maximum deviations. Averaging has been done.


6.3 Accuracy of the TCP Calibration method

The accuracy of the proposed method is limited by the smallest distance between two points which the camera is able to register, see section 4.5. Due to the fact that the repetition accuracy of the camera is 0.03 mm, see section 6.1, there is no need to try to register a shorter distance.

A test was performed to find the shortest distance between two points which the camera is able to register. Denote the initial position of the tool center point, determined by the camera and transformed to the robot base coordinate system, as p1. This position was calculated as the mean value of 100 different measurements. The robot was then told to move a distance d and the new position, p2, was measured by the camera and transformed to the robot base coordinate system.

The test was performed until the smallest distance d between p1 and p2 was found for which the camera was still able to register a movement. To ensure that the camera registered a movement, the distance dcamera measured by the camera had to be greater than the repetition accuracy of the camera (of course dcamera was also expressed in the robot base coordinate system).

Figure 6.12, figure 6.13 and figure 6.14 illustrate the distance dcamera when the robot was told to move a distance d = 0.10 mm in the X, Y and Z directions respectively. From these figures it is possible to tell that the accuracy in the X direction limits the total accuracy of the method. When the robot was told to move a distance shorter than 0.1 mm, some of the measurements dcamera became lower than the camera repetition accuracy, meaning that the system was not able to determine a significant movement. Therefore the accuracy of the proposed method is assumed to be 0.1 mm.

Figure 6.12: Distance dcamera registered by the camera when a movement of 0.1 mm in X-direction was performed by the robot. The dashed line is the camera repetition accuracy.


Figure 6.13: Distance dcamera registered by the camera when a movement of 0.1 mm in Y-direction was performed by the robot.

Figure 6.14: Distance dcamera registered by the camera when a movement of 0.1 mm in Z-direction was performed by the robot.


6.3.1 Averaging to obtain a higher accuracy

The same tests were performed once again, but this time both positions of the tool, p1 and p2, were determined as an average of 100 measurements. Table 6.3 shows the results of the distance dcamera measured by the camera when a movement d = 0.06 mm was performed by the robot. If the distance d was chosen as d < 0.06 mm, no significant movement was registered by the camera. The result of this test shows an increased accuracy of the method if the average of several measurements is used.

d                                          dcamera
The robot moves 0.06 mm in X direction     0.04 mm
The robot moves 0.06 mm in Y direction     0.04 mm
The robot moves 0.06 mm in Z direction     0.04 mm

Table 6.3: Significant movements registered by the camera.

6.4 Repetition Accuracy of the TCP Calibration method

To test the repetition accuracy of the TCP calibration method presented in this thesis, the method was executed several times for the same tool. The tool used was the chess board pattern and all 24 points of the chess pattern were observed. The desired accuracy of the iterative method was set to 1 mm. Denote the translation from the tool flange to the TCP found by the method as TCP = [TCPx TCPy TCPz]^T. The results of TCPx, TCPy and TCPz for the different runs of the method are illustrated in figure 6.15, figure 6.16 and figure 6.17 respectively. Table 6.4 shows the differences between the maximum and minimum interval for each of the results. By observing the information in table 6.4, it is obvious that the desired accuracy of 1 mm was obtained.


Figure 6.15: Repetition accuracy of TCPx measured by the proposed TCP calibration method.

Figure 6.16: Repetition accuracy of TCPy measured by the proposed TCP calibration method.


Figure 6.17: Repetition accuracy of TCPz measured by the proposed TCP calibration method.

Element of TCP                  Difference between maximum and minimum interval [mm]
TCPx                            0.97
TCPy                            0.52
TCPz                            0.48
√(TCPx² + TCPy² + TCPz²)        0.66

Table 6.4: Difference between the max and min interval of |TCP| and of the elements of TCP respectively.


6.4.0.1 How does the number of points affect the repetition accuracy of the TCP calibration method

To investigate how the number of points affects the repetition accuracy of the proposed TCP calibration method, the same procedure was executed but only the three upper left points of the chess board pattern were observed. The method managed to give a satisfying result most of the time, but the method became unreliable and very time consuming. Sometimes the robot diverged from the correct TCP point during the iterative phase. This implies that more than three points have to be observed to make the method reliable.

However, the reason why the robot sometimes diverged during the iterative procedure might be that the wrong solution for the three points was found. If three points with almost the same distances to each other are observed, it might be difficult for the algorithm to distinguish between the points. This results in three possible solutions, but only one of the solutions is correct. If the mirrored solutions are also considered, six different solutions are possible to find. The mirrored solutions were removed during the test by only permitting solutions with Z > 0, where Z is the camera coordinate axis coinciding with the optical axis.


Chapter 7

Discussion

This chapter presents the conclusions of this Master's thesis. Problems and difficulties are described and the chapter concludes with a discussion of future development.

7.1 Conclusion

This thesis has proposed a new method for tool center point calibration. The method can be split into three different parts.

1. Calculation of a good guess of the Tool Center Point, by solving a closed loop of transformations resulting in a homogeneous equation AX1 = X1B.

2. Image Segmentation of the tool.

3. Iterative method for increasing the accuracy.

The method proposed is completely automated, resulting in no variations in accuracy due to how skilled the robot operator is. The method is also very easy to configure; only a USB camera connected to a laptop computer with the software developed during this project is needed. No fixed devices with known relations to the robot's base frame are needed, implying that the calibration setup used in this method is completely portable.

7.1.1 Accuracy

Tests have been performed to investigate the accuracy of the proposed method. The tests showed the need for the iterative method to achieve a satisfying accuracy. Since only the accuracy of the camera when registering a movement between two different positions limits the accuracy of the iterative method, an accuracy of approximately 0.1 mm is possible to achieve with this method if a camera of the same model is used, see section 6.3. By that remark, this method is able to perform a tool center point calibration with an accuracy equivalent to, or even better than, the manual four point TCP calibration method.

7.1.2 Major Problems/known difficulties

During the project two main problems have occurred.

1. Finding distinct points

2. Light reflections in the tool

7.1.2.1 Finding distinct points

To be able to determine the transformation between the camera and the tool, at least three distinct points on the tool have to be observed by the camera. In this thesis results have been presented showing that it is possible to perform a calibration by only observing three points. However, to make the method reliable and fast, more points are needed. If the tool has a very simple geometry it might be difficult to find several points and to ensure that the same points are found in every frame.

This problem can be solved by, for instance, painting the tool with a certain pattern. Another possibility would be to press a matrix pattern onto the tool. The matrix pattern could then also store information about what kind of tool it is.

The best idea, though, would probably be to ensure that the tool has some kind of structure, i.e. not a simple geometric shape.

7.1.2.2 Light reflections in the tool

A major problem during the development of the image segmentation part of the method was the influence of light reflections in the tool. This is a well-known problem within the field of image segmentation. Several methods exist to make image segmentation robust against reflections, but no suitable functions for solving this problem exist in the Open Computer Vision Library. Instead the problem was solved by painting the tool in a non-reflecting color.

Before using the method in an industrial environment, this problem must be solved.


7.1.3 Fulfillment of objectives

The objective of the thesis was to investigate whether or not it is possible to create an automated method for tool center point calibration by making good use of computer vision and image processing technologies. To consider the method useful, an accuracy of ±1 mm had to be obtained.

In this thesis a method has been proposed and software has been delivered with the ability to successfully perform tool center point calibrations. The accuracy of the proposed method has been analyzed, showing the method's ability to perform TCP calibrations with an accuracy of approximately 0.1 mm, see the discussion in section 6.3. Therefore all objectives are considered to be fulfilled.

7.2 Future Development

In this section some future development will be presented.

1. Rotation symmetric tools

2. Light reflections in the tool

3. Image Segmentation controlled by CAD drawings

4. Neural network based methods

5. Online correction of the TCP calibration

6. Other image processing libraries

7.2.1 Rotation symmetric toolsOne problem of the proposed method is to calculate a good initial guess of thetool center point, if the robot tool has an axis of symmetry. The symmetry makesit impossible to measure the transformation between the camera and the tool by 6degrees of freedom. A mathematical proof, showing it is impossible to measurea good guess of TCP by using this method if the tool is rotation symmetric, ispublished in the thesis. Although, the iterative part of the method still works forrotation symmetric tools, if the transformation between the camera and the robotcoordinate system is known and an initial guess of the tool center point is obtainedby another procedure.

The alternative method for finding the transformation between the camera frame and the robot's base frame, presented in section 4.6, should be implemented and evaluated.


Note that other methods for solving this problem exist. For instance, the rotation symmetry can be removed by painting the tool with a non-symmetric pattern. Attaching a blinking light emitting diode (LED) to the tool, which is easy to find by image segmentation, can also break the symmetric geometry.

By fastening a calibration pattern, for instance a chessboard pattern, to the tool, it is possible to determine the transformation between the camera frame and the robot's base frame. The iterative method then works even though the tool is rotation symmetric. Instead of fastening a calibration pattern to the tool, a matrix pattern can be pressed onto the tool. Pressing a pattern onto the tool also solves the problem of finding distinct points, see section 7.1.2.1.
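
The sketch below illustrates the vision half of this idea under assumed board geometry, intrinsic parameters and file names: with a chessboard fastened to the tool, the board's pose in the camera frame can be measured in every robot pose with a standard PnP solver, and these measurements can then be combined with the flange poses reported by the controller, for instance in a hand-eye formulation. OpenCV's Python bindings are used for brevity.

import cv2
import numpy as np

square = 0.01                      # board square size in metres (assumed)
pattern_size = (8, 5)              # inner corners (assumed)
K = np.array([[615.2, 0.0, 337.3],
              [0.0, 615.7, 233.8],
              [0.0, 0.0, 1.0]])    # rounded intrinsics from table A.1
dist = np.array([-0.20033, 0.25346, 0.00009, -0.00143, 0.0])

# 3D board corners expressed in the board's own frame.
obj = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

img = cv2.imread("pose_01.png", cv2.IMREAD_GRAYSCALE)
found, corners = cv2.findChessboardCorners(img, pattern_size)
if found:
    ok, rvec, tvec = cv2.solvePnP(obj, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    # (R, tvec) is one measurement of the board pose in the camera frame.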

7.2.2 Light reflections in the tool

Before the method is used in an industrial environment, the problem of handling light reflections in the tool, see section 7.1.2.2, must be solved.

7.2.3 Image segmentation controlled by CAD drawings

To make the method general, it needs the ability to calibrate all kinds of tools. Therefore the next step is to control the image segmentation part of the method with data from, for instance, a CAD drawing of the actual tool.

Developing software with the ability to read 3D drawings of tools and analyze them to find distinct points or lines to search for during the image segmentation would ensure that the method could handle different kinds of tools. It would be even better if one general 3D model for each kind of tool were enough to control the image segmentation method. The user would then only need to choose what kind of tool he or she wants to calibrate.
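
As an illustration only (not the thesis implementation), 3D feature points extracted from a CAD model of the tool could be projected into the image with a rough pose guess in order to restrict where the segmentation searches; all point coordinates, pose values and names below are assumptions.

import cv2
import numpy as np

K = np.array([[615.2, 0.0, 337.3],
              [0.0, 615.7, 233.8],
              [0.0, 0.0, 1.0]])    # rounded intrinsics from table A.1
dist = np.zeros(5)

cad_points = np.array([[0.00, 0.00, 0.00],
                       [0.01, 0.00, 0.00],
                       [0.00, 0.00, 0.05]], np.float32)  # metres, from a CAD model
rvec = np.zeros(3)                   # rough orientation guess of the tool
tvec = np.array([0.0, 0.0, 0.4])     # roughly 0.4 m in front of the camera

projected, _ = cv2.projectPoints(cad_points, rvec, tvec, K, dist)
# 'projected' gives pixel positions around which to search for the CAD features.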

7.2.4 Neural network based methods

When the method is able to handle different kinds of tools, it would be possible to make the system distinguish between different tools by itself, i.e. classify tools. This can be done by making use of a learning-based approach such as principal component analysis (PCA).

As soon as a robot tool is located in front of the camera, the system would then be able to classify the tool and start the calibration procedure.
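
A minimal sketch of such a classifier, assuming a small database of tool images and using PCA in the spirit of eigenface-style recognition, is given below; the image size, file names and the nearest-neighbour decision are illustrative assumptions.

import cv2
import numpy as np

def to_vector(path, size=(64, 64)):
    # Load a grayscale image and flatten it to one row of the data matrix.
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.resize(img, size).astype(np.float32).ravel()

train = np.vstack([to_vector(p) for p in ["gripper_01.png", "gripper_02.png",
                                          "welder_01.png", "welder_02.png"]])
labels = ["gripper", "gripper", "welder", "welder"]

# Compute the PCA subspace of the training images and project them onto it.
mean, eigvecs = cv2.PCACompute(train, mean=np.empty((0)), maxComponents=3)
train_proj = cv2.PCAProject(train, mean, eigvecs)

query = to_vector("unknown_tool.png").reshape(1, -1)
query_proj = cv2.PCAProject(query, mean, eigvecs)

# The nearest neighbour in the PCA subspace decides the tool class.
dists = np.linalg.norm(train_proj - query_proj, axis=1)
print("Classified as:", labels[int(np.argmin(dists))])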

7.2.5 Online correction of the TCP calibration

There are several events that can cause misplacement of the tool center point during a robot program. The method proposed in this thesis could easily be


reconfigured for correction of the current TCP calibration. By placing the camera at a fixed position and determining the relation between the camera and the robot base frame once and for all, only the iterative part of the method is needed to determine the correct placement of the tool center point. The initial guess of the TCP is then, of course, chosen to be the current tool center point.

7.2.6 Other image processing libraries

The image processing library used during this project (the Open Computer Vision Library) does not include suitable functions for handling light reflections in metallic tools. Therefore it could be of great interest to try other image processing libraries, for instance the Cognex vision library or MVTec's HALCON library.


Appendix A

Appendix

This appendix presents the results of the camera calibration performed during this project.

A.1 Finding intrinsic parameters

The Camera Calibration Toolbox for Matlab by Jean-Yves Bouguet was used to perform the camera calibration. Fifty-two images of a chessboard pattern were acquired with the camera. The chessboard was rotated and translated between every frame. Figure A.1 illustrates the positions of the chessboard. The camera calibration toolbox was then used to determine the intrinsic parameters presented in table A.1. Figure A.2 illustrates the reprojection errors in pixels after the camera calibration was done.
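
The calibration reported here was produced with Bouguet's Matlab toolbox. Purely for illustration, the sketch below shows how an equivalent estimate of the intrinsic parameters could be obtained with OpenCV's Python bindings from the same kind of chessboard images; the pattern dimensions, square size and file names are assumptions.

import glob
import cv2
import numpy as np

pattern_size = (8, 5)        # inner corners per row and column (assumed)
square = 0.01                # square size in metres (assumed)

# 3D corner coordinates in the chessboard's own frame, reused for every image.
obj = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(obj)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Intrinsic matrix:\n", K)
print("Distortion coefficients:", dist.ravel())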

The distortion coefficients were also calculated and the result can be seen in table A.1. Figure A.3 and figure A.4 illustrate the radial and the tangential lens distortion, respectively. From the figures it can be seen that the radial distortion is greater than the tangential distortion. Figure A.5 illustrates the complete lens distortion.
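
For reference, the lens distortion model used by the toolbox is the standard Brown model, with the coefficient vector ordered as kc = (k_1, k_2, p_1, p_2, k_3) in table A.1 and (x, y) denoting normalized pinhole image coordinates:

\[
\begin{aligned}
x_d &= x\left(1 + k_1 r^2 + k_2 r^4 + k_3 r^6\right) + 2 p_1 x y + p_2\left(r^2 + 2x^2\right), \\
y_d &= y\left(1 + k_1 r^2 + k_2 r^4 + k_3 r^6\right) + p_1\left(r^2 + 2y^2\right) + 2 p_2 x y,
\end{aligned}
\qquad r^2 = x^2 + y^2 .
\]

Here the first, second and last coefficients describe the radial distortion and the third and fourth the tangential distortion, which is consistent with the observation above that the radial part dominates.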


Calibration results (with uncertainties):

Focal length:     fc = [615.17386, 615.69958] ± [3.32053, 3.41099]

Principal point:  cc = [337.27613, 233.77556] ± [4.32134, 4.42747]

Skew:             alpha_c = 0.00000 ± 0.00000
                  => angle of pixel axes = 90.00000 ± 0.00000 degrees

Distortion:       kc = [-0.20033, 0.25346, 0.00009, -0.00143, 0.00000]
                       ± [0.00863, 0.02484, 0.00164, 0.00166, 0.00000]

Pixel error:      err = [0.06992, 0.07343]

Note: The numerical errors are approximately three times the standard deviations (for reference).

Table A.1: The camera calibration result given by the Camera Calibration Toolbox for Matlab.


Figure A.1: Illustration of the chess board positions.


Figure A.2: Reprojection errors in pixels.


Figure A.3: Radial lens distortion.


Figure A.4: Tangential lens distortion.


Figure A.5: Complete lens distortion.



The publishers will keep this document online on the Internet - or its possible replacement - for a considerable time from the date of publication barring exceptional circumstances.

The online availability of the document implies a permanent permission for anyone to read, to download, to print out single copies for your own use and to use it unchanged for any non-commercial research and educational purpose. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional on the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility.

According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement.

For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its WWW home page: http://www.ep.liu.se/

© Johan Hallenberg