Course 12 Calibration


1. Introduction

In theoretical discussions, we have assumed:

----- Camera is located at the origin of coordinate system of scene.

----- Optic axis of camera is pointing in z-direction of scene coordinates.

----- Image plane is perpendicular to the z-axis, with its origin at (0, 0, f).

----- X and Y axes of image coordinates are parallel to x and y axes of scene coordinates, respectively.

----- No camera distortions, i.e., an ideal pinhole camera model.

1) What we need to calibrate:

----- Absolute orientation: between two coordinate systems, e.g., robot coordinates and model coordinates.

----- Relative orientation: between two camera systems.

----- Exterior orientation: between the camera and scene systems.

----- Interior orientation: within a camera, such as camera constant, principal point, lens distortion, etc.

2) Coordinate systems:

----- Scene coordinates (global coordinates, world coordinates, absolute coordinates).

----- Camera coordinates.

----- Image coordinates.

----- Pixel coordinates.

For an image of size m×n, the image center is at

\hat{c}_x = (m-1)/2,   \hat{c}_y = (n-1)/2,

and a pixel with indices [i, j] maps to image coordinates (X, Y) by

X = s_x (j - \hat{c}_x),   Y = s_y (i - \hat{c}_y),

where s_x and s_y are the pixel spacings in the x and y directions.
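As a concrete illustration, here is a minimal Python/NumPy sketch of this pixel-to-image conversion. The function name, the zero-based index convention, and the default spacings s_x = s_y = 1 are assumptions for the example, not part of the notes.

```python
import numpy as np

def pixel_to_image(i, j, m, n, sx=1.0, sy=1.0):
    """Map pixel indices [i, j] of an m x n image to image coordinates
    (X, Y) measured from the image center, with pixel spacings sx, sy."""
    cx = (m - 1) / 2.0   # image-center column (zero-based indices assumed)
    cy = (n - 1) / 2.0   # image-center row
    X = sx * (j - cx)
    Y = sy * (i - cy)
    return X, Y

# Example: the center of a 640 x 480 image maps to (0, 0).
print(pixel_to_image(239.5, 319.5, 640, 480))
```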

2. Absolute orientation:

To find the relationship between two coordinate systems from three or more 3D points that have been expressed in both coordinate systems.

Let a 3D point p be expressed by

p_m = (x_m, y_m, z_m) in the model coordinate system,

p_a = (x_a, y_a, z_a) in the absolute coordinate system.

Then:

p_a = R p_m + p_0

where R is the rotation of the model coordinates with respect to the absolute coordinates, and p_0 is the origin of the model coordinate system expressed in the absolute coordinate system.

Given: corresponding points p_{a,i}, p_{m,i}, i = 1, 2, 3, ...

We want to find: R and p_0.

This is the same expression as 3D motion estimation from 3D points. Therefore, all the algorithms of motion estimation (using 3D points) can be used to solve for absolute orientation.

One should remember that the rotation matrix is constrained to be orthonormal, i.e., writing R = [R^(1) | R^(2) | R^(3)] by columns,

R^(i) · R^(j) = 1 for i = j,   R^(i) · R^(j) = 0 for i ≠ j.

This adds 6 additional equations when solving for the rotation.

1) Solving for rotation with quaternions: the orthonormality constraint of the rotation matrix is absorbed in the quaternion expression.

In the absolute coordinate system, a set of points on a 3D object is

{p_{a1}, p_{a2}, ..., p_{aN}}.

In the model coordinate system, the same set of points is correspondingly measured as

{p_{m1}, p_{m2}, ..., p_{mN}}.

They satisfy

p_{ai} = R(q) p_{mi} + p_0.

In the absolute coordinate system, the centroid of the point set is

\bar{p}_a = (1/N) \sum_{i=1}^{N} p_{ai},

and the ray from the set centroid to point p_{ai} is

r_{ai} = p_{ai} - \bar{p}_a.

In the same way, in the model coordinate system:

\bar{p}_m = (1/N) \sum_{i=1}^{N} p_{mi},   r_{mi} = p_{mi} - \bar{p}_m.

Since r_{ai} and R r_{mi} are parallel, we can solve for the rotation by least squares.

Using a quaternion q to express the rotation, and recalling that

R(q) r = q r q*,

we seek the rotation that minimizes

\sum_{i=1}^{N} || r_{ai} - R(q) r_{mi} ||^2,

which is equivalent to maximizing

\sum_{i=1}^{N} r_{ai} · R(q) r_{mi} = \sum_{i=1}^{N} r_{ai} · (q r_{mi} q*) = \sum_{i=1}^{N} (q r_{mi}) · (r_{ai} q).

Writing the quaternion products as matrix-vector products, q r_{mi} = \bar{N}_{mi} q and r_{ai} q = N_{ai} q, where

\bar{N}_{mi} = [ 0  -x_{mi}  -y_{mi}  -z_{mi} ;  x_{mi}  0  z_{mi}  -y_{mi} ;  y_{mi}  -z_{mi}  0  x_{mi} ;  z_{mi}  y_{mi}  -x_{mi}  0 ]

N_{ai} = [ 0  -x_{ai}  -y_{ai}  -z_{ai} ;  x_{ai}  0  -z_{ai}  y_{ai} ;  y_{ai}  z_{ai}  0  -x_{ai} ;  z_{ai}  -y_{ai}  x_{ai}  0 ],

the objective becomes

\sum_{i=1}^{N} (\bar{N}_{mi} q)^T (N_{ai} q) = q^T ( \sum_{i=1}^{N} \bar{N}_{mi}^T N_{ai} ) q = q^T N q,

i.e.

q^T N q → max   subject to ||q|| = 1,

where

N = \sum_{i=1}^{N} \bar{N}_{mi}^T N_{ai}.

The solution is the eigenvector of the symmetric 4×4 matrix N associated with its largest eigenvalue, which can be computed by an eigen-decomposition or SVD. After R = R(q) is found, the camera position can be easily calculated by

p_0 = \bar{p}_a - R(q) \bar{p}_m.
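The derivation above translates directly into a small implementation. The following Python/NumPy sketch is my own transcription (the function name and the use of numpy.linalg.eigh are choices, not from the notes): it builds N from the centroid-relative rays, takes the eigenvector of the largest eigenvalue as q, and recovers R and p_0.

```python
import numpy as np

def absolute_orientation(p_m, p_a):
    """Absolute orientation by the quaternion method sketched above.
    p_m, p_a: (N, 3) arrays of corresponding points in the model and
    absolute systems.  Returns (R, p0) with p_a ~ R p_m + p0."""
    pm_bar = p_m.mean(axis=0)
    pa_bar = p_a.mean(axis=0)
    r_m = p_m - pm_bar                     # rays from the model centroid
    r_a = p_a - pa_bar                     # rays from the absolute centroid

    # Accumulate the 4x4 matrix N = sum_i Nbar_mi^T N_ai
    N = np.zeros((4, 4))
    for (xm, ym, zm), (xa, ya, za) in zip(r_m, r_a):
        Nm = np.array([[0, -xm, -ym, -zm],
                       [xm,  0,  zm, -ym],
                       [ym, -zm,  0,  xm],
                       [zm,  ym, -xm,  0]])
        Na = np.array([[0, -xa, -ya, -za],
                       [xa,  0, -za,  ya],
                       [ya,  za,  0, -xa],
                       [za, -ya,  xa,  0]])
        N += Nm.T @ Na

    # q^T N q -> max with ||q|| = 1: eigenvector of the largest eigenvalue
    w, V = np.linalg.eigh(N)
    q0, qx, qy, qz = V[:, np.argmax(w)]

    # Rotation matrix of the unit quaternion q = (q0, qx, qy, qz)
    R = np.array([
        [q0*q0+qx*qx-qy*qy-qz*qz, 2*(qx*qy-q0*qz),         2*(qx*qz+q0*qy)],
        [2*(qy*qx+q0*qz),         q0*q0-qx*qx+qy*qy-qz*qz, 2*(qy*qz-q0*qx)],
        [2*(qz*qx-q0*qy),         2*(qz*qy+q0*qx),         q0*q0-qx*qx-qy*qy+qz*qz]])

    p0 = pa_bar - R @ pm_bar               # origin of the model frame
    return R, p0
```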

2) Scale Problem

If the absolute coordinate system and the model coordinate system use different units of measurement, a scale problem is introduced:

p_a = s R p_m + p_0.

Noticing that distances between points of the 3D scene are not affected by the choice of coordinate system, we can easily solve for the scale factor:

s = [ \sum_{i=1}^{n} || p_{ai} - \bar{p}_a ||^2 / \sum_{i=1}^{n} || p_{mi} - \bar{p}_m ||^2 ]^{1/2}

or

s = [ \sum_{i≠j} || p_{ai} - p_{aj} ||^2 / \sum_{i≠j} || p_{mi} - p_{mj} ||^2 ]^{1/2}.

Once the scale factor is found, the problem becomes an ordinary absolute orientation problem:

p_a = s (R p_m) + p_0 = R (s p_m) + p_0.
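As a quick illustration, here is a one-function Python/NumPy sketch of the first scale formula, assuming the corresponding points are stacked as rows of arrays (names are illustrative):

```python
import numpy as np

def scale_factor(p_m, p_a):
    """Scale s between the model and absolute systems, computed from the
    centroid-relative spreads of corresponding points ((N, 3) arrays)."""
    r_a = p_a - p_a.mean(axis=0)   # rays from the absolute centroid
    r_m = p_m - p_m.mean(axis=0)   # rays from the model centroid
    return np.sqrt(np.sum(r_a**2) / np.sum(r_m**2))
```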

3. Relative Orientation

To determine the relationship between two camera coordinate systems from the projections of corresponding points in the two cameras (e.g., in the stereo case). That is, given pairs of image point correspondences, find the rotation and translation of the camera from one position to the other.

1) Solving the relative orientation problem by motion estimation.

In motion estimation, the camera is stationary and the object is moving:

t_1:  p = (x, y, z) → P_1 = (X_1, Y_1)
t_2:  p = (x, y, z) → P_2 = (X_2, Y_2)

Using the correspondences P_1 → P_2, we find R and T.

In the stereo case, we can imagine that a camera first takes an image of the scene at position O_l, and then moves to O_r to take the second image of the same scene. The scene is stationary and the camera is moving:

left:   p = (x, y, z) → P_l = (X_l, Y_l)
right:  p = (x, y, z) → P_r = (X_r, Y_r)

Using P_l → P_r to find R and the baseline b is then the same as the stationary-camera case with P_1 = P_l and P_2 = P_r. Using the 8-point method, one can solve for the rotation R and the translation (baseline) b.
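The notes do not spell the 8-point method out here, so the following Python/NumPy sketch only illustrates the idea under common assumptions: rays r_l = (X_l, Y_l, f) and r_r = (X_r, Y_r, f) are given for N ≥ 8 correspondences, and the essential matrix E with r_l^T E r_r = 0 is estimated linearly (R and b would then be factored out of E).

```python
import numpy as np

def essential_8point(rl, rr):
    """Linear 8-point estimate of the essential matrix E from (N, 3)
    arrays of left/right rays, N >= 8, with rl_i^T E rr_i = 0."""
    # Each correspondence gives one linear equation in the 9 entries of E.
    A = np.stack([np.outer(l, r).ravel() for l, r in zip(rl, rr)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)              # null vector of A, up to scale
    # Project onto the essential-matrix structure: two equal singular
    # values, third one zero.
    U, s, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt
```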

2) Iterative method: Let r_l and r_r be the directions from the left and right camera centers to a scene point, respectively:

r_l = (X_l, Y_l, f)^T,   r_r = (X_r, Y_r, f)^T.

Since the baseline of the stereo system is perpendicular to the normal of the epipolar plane, we have:

b · (r_l × R r_r) = 0.

It can be seen that the baseline can only be solved up to a scale factor, so we impose the constraint

||b||^2 = 1.

By least squares:

\sum_{i=1}^{n} w_i [ b · (r_{li} × R r_{ri}) ]^2 + (1 - b^T b)^2 → min.

Five or more stereo image points are needed to solve the relative orientation problem. In updating, we use an increment δb for the baseline and an increment quaternion δq for the rotation, which keeps the rotation matrix orthonormal. Linearizing the coplanarity condition about the current estimate,

b · (r_{li} × R r_{ri}) + c_i · δb + d_i · δω ≈ 0,

where

c_i = r_{li} × R r_{ri},   d_i = (R r_{ri}) × (b × r_{li}),

and δω is the small rotation vector of the update. In quaternion form, the rotation update is

δq = ( (1 - ||δω||^2 / 4)^{1/2}, δω / 2 ),   q^{(n+1)} = δq^{(n)} q^{(n)},

where R(q) r = q r q*. For the baseline, the iterative formula is

b^{(n+1)} = b^{(n)} + δb^{(n)},   subject to δb · b = 0,

so that ||b|| = 1 is preserved to first order.
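One building block of this iteration can be written compactly: for a fixed rotation estimate R, the unit baseline minimizing Σ_i [b · (r_li × R r_ri)]² is the right singular vector, for the smallest singular value, of the matrix whose rows are c_i = r_li × R r_ri. A Python/NumPy sketch (function name and array layout are assumptions):

```python
import numpy as np

def baseline_for_rotation(rl, rr, R):
    """Unit baseline b minimizing sum_i [b . (rl_i x R rr_i)]^2 for a
    fixed rotation R; rl, rr are (N, 3) arrays of rays, N >= 5."""
    C = np.cross(rl, rr @ R.T)        # rows c_i = rl_i x (R rr_i)
    _, _, Vt = np.linalg.svd(C)
    return Vt[-1]                      # smallest right singular vector
```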

4. Exterior Orientation

Using image points (X, Y) in image coordinates and the corresponding 3D points (x, y, z) of the scene in the absolute coordinate system, determine the relationship between the camera system and the absolute system. We assume that the image plane and the camera interior are well calibrated.

Let a scene point be expressed in the absolute coordinate system as

p_a = (x_a, y_a, z_a)^T

and in the camera coordinate system as

p_c = (x_c, y_c, z_c)^T.

Expressing p_a in the camera coordinate system:

p_c = R p_a + T,

i.e.

x_c = r_11 x_a + r_12 y_a + r_13 z_a + t_x      (1)
y_c = r_21 x_a + r_22 y_a + r_23 z_a + t_y      (2)
z_c = r_31 x_a + r_32 y_a + r_33 z_a + t_z      (3)

Taking (1)/(3) and (2)/(3), and considering

X = f x_c / z_c,   Y = f y_c / z_c,

we have:

X = f (r_11 x_a + r_12 y_a + r_13 z_a + t_x) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)
Y = f (r_21 x_a + r_22 y_a + r_23 z_a + t_y) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)

Or:

X (r_31 x_a + r_32 y_a + r_33 z_a + t_z) − f (r_11 x_a + r_12 y_a + r_13 z_a + t_x) = 0
Y (r_31 x_a + r_32 y_a + r_33 z_a + t_z) − f (r_21 x_a + r_22 y_a + r_23 z_a + t_y) = 0

As (X, Y) and (x_a, y_a, z_a) are known, there are 12 unknowns, 9 for the rotation and 3 for the translation, so at least 6 points are needed to provide 12 equations. However, considering the 6 constraints on the rotation, the minimum number of points required to calibrate the exterior orientation is 3.
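Ignoring the six orthonormality constraints, the 12 unknowns can be estimated linearly from these homogeneous equations. The Python/NumPy sketch below (a DLT-style solution; the function name and scaling convention are assumptions) stacks two equations per point and takes the null vector via SVD, so the result is [R | T] only up to scale and without R forced to be a true rotation.

```python
import numpy as np

def exterior_orientation_linear(XY, xyz, f):
    """Linear solution of the two homogeneous equations above.
    XY: (N, 2) image points, xyz: (N, 3) scene points, f: camera constant.
    Returns the 3x4 matrix [R | T] up to scale (N >= 6 points); the
    orthonormality of R is NOT enforced here."""
    rows = []
    for (X, Y), (x, y, z) in zip(XY, xyz):
        # X*(r31 x + r32 y + r33 z + tz) - f*(r11 x + r12 y + r13 z + tx) = 0
        rows.append([-f*x, -f*y, -f*z, -f, 0, 0, 0, 0, X*x, X*y, X*z, X])
        # Y*(r31 x + r32 y + r33 z + tz) - f*(r21 x + r22 y + r23 z + ty) = 0
        rows.append([0, 0, 0, 0, -f*x, -f*y, -f*z, -f, Y*x, Y*y, Y*z, Y])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)
    p = Vt[-1]                          # 12 unknowns, defined up to scale
    return p.reshape(3, 4)              # rows: [r11 r12 r13 tx], ...
```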

5. Interior Orientation

To determine the internal geometry of the camera, such as:

----- Camera constant: the distance from image plane to projection center.

----- Principal point: the origin location of image plane coordinate systems.

----- Lens distortion coefficients: optic property of camera.

----- Scale factors: the spacing between rows and between columns.

1) Calibration model of the camera:

----- (x̃, ỹ): uncorrected image coordinates, computed from the pixel indices as x̃ = j − ĉ_x, ỹ = i − ĉ_y.

----- (x, y): true image coordinates.

----- (x_p, y_p): principal point of the image, expressed in the (x̃, ỹ) system.

The corrections are

δx = x − x̃,   δy = y − ỹ.

Radial lens distortion:

δx = (x̃ − x_p)(k_1 r^2 + k_2 r^4 + k_3 r^6)
δy = (ỹ − y_p)(k_1 r^2 + k_2 r^4 + k_3 r^6)

where

r^2 = (x̃ − x_p)^2 + (ỹ − y_p)^2.
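Applying this correction is only a few lines of Python; the function below is a direct transcription of the model (argument names are illustrative):

```python
def correct_radial_distortion(xt, yt, xp, yp, k1, k2, k3):
    """Apply the radial correction model above to uncorrected image
    coordinates (xt, yt) = (x~, y~); returns the true coordinates (x, y)."""
    r2 = (xt - xp) ** 2 + (yt - yp) ** 2
    factor = k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x = xt + (xt - xp) * factor
    y = yt + (yt - yp) * factor
    return x, y
```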

2) Calibration method

----- Straight lines in the scene should be straight lines in the image.

• Put straight lines in the scene, e.g., a sheet of paper printed with lines.

• Take an image of the straight lines in the scene.

• In the image, use the Hough transform to group edge points into lines:

x cos θ_l + y sin θ_l − ρ_l = 0,   l = 1, 2, ..., n.

• Substitute the calibration model

x = x̃ + δx,   y = ỹ + δy

into the equations of the straight lines:

f(x̃_{kl}, ỹ_{kl}, x_p, y_p, k_1, k_2, k_3, θ_l, ρ_l) = 0,

where (x̃_{kl}, ỹ_{kl}) denotes the k-th point on the l-th line.

• Use least squares to find the unknown parameters x_p, y_p, k_1, k_2, k_3:

(1/n) \sum_{k,l} f^2(x̃_{kl}, ỹ_{kl}, x_p, y_p, k_1, k_2, k_3, θ_l, ρ_l) → min   (with n ≥ 5).
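A sketch of this straight-line (plumb-line) fit using scipy.optimize.least_squares is shown below. The parameter packing (x_p, y_p, k_1, k_2, k_3 followed by θ_l, ρ_l per line) and the crude initial line guesses are assumptions for the example; the notes only define the residual f.

```python
import numpy as np
from scipy.optimize import least_squares

def plumb_line_calibrate(lines):
    """Sketch of the straight-line method above.  `lines` is a list of
    (K_l, 2) arrays of uncorrected image points (x~, y~), one per line."""
    n_lines = len(lines)

    def residuals(params):
        xp, yp, k1, k2, k3 = params[:5]
        res = []
        for l, pts in enumerate(lines):
            theta, rho = params[5 + 2*l], params[6 + 2*l]
            xt, yt = pts[:, 0], pts[:, 1]
            r2 = (xt - xp)**2 + (yt - yp)**2
            fac = k1*r2 + k2*r2**2 + k3*r2**3
            x = xt + (xt - xp) * fac          # corrected coordinates
            y = yt + (yt - yp) * fac
            # f(...) = distance of each corrected point from its line
            res.append(x*np.cos(theta) + y*np.sin(theta) - rho)
        return np.concatenate(res)

    # Initial guess: no distortion, each line fitted through its endpoints.
    x0 = np.zeros(5 + 2*n_lines)
    for l, pts in enumerate(lines):
        d = pts[-1] - pts[0]
        theta = np.arctan2(d[0], -d[1])       # direction of the line normal
        x0[5 + 2*l] = theta
        x0[6 + 2*l] = pts[0] @ [np.cos(theta), np.sin(theta)]
    return least_squares(residuals, x0)
```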

3) Example: affine method for camera calibration

----- Combines interior and exterior calibrations.

----- Models scale error, translation error, rotation, skew error, and differential scaling.

----- Cannot correct lens distortion.

----- 2D-3D point correspondences of calibration points are given.

Correction model (affine transformation):

x̃ = a_11 x + a_12 y + b_x
ỹ = a_21 x + a_22 y + b_y

Since

A = [ a_11  a_12 ;  a_21  a_22 ]

is an arbitrary 2×2 matrix, the transformation involves rotation, translation, scaling, and skew.

Using the projection model

x = f x_c / z_c,   y = f y_c / z_c,

the affine transformation becomes:

x̃ = f ( a_11 x_c / z_c + a_12 y_c / z_c ) + b_x
ỹ = f ( a_21 x_c / z_c + a_22 y_c / z_c ) + b_y

From the exterior orientation of the camera:

x_c = r_11 x_a + r_12 y_a + r_13 z_a + T_x
y_c = r_21 x_a + r_22 y_a + r_23 z_a + T_y
z_c = r_31 x_a + r_32 y_a + r_33 z_a + T_z

Substituting the exterior orientation relation into the affine transformation model gives

x̃ / f = (s_11 x_a + s_12 y_a + s_13 z_a + t_x) / (s_31 x_a + s_32 y_a + s_33 z_a + t_z)
ỹ / f = (s_21 x_a + s_22 y_a + s_23 z_a + t_y) / (s_31 x_a + s_32 y_a + s_33 z_a + t_z)

where the coefficients a_ij, r_ij, b_x, b_y are absorbed into s_ij and t_x, t_y, t_z. Equivalently,

x̃ (s_31 x_a + s_32 y_a + s_33 z_a + t_z) − f (s_11 x_a + s_12 y_a + s_13 z_a + t_x) = 0
ỹ (s_31 x_a + s_32 y_a + s_33 z_a + t_z) − f (s_21 x_a + s_22 y_a + s_23 z_a + t_y) = 0

With known 3D-2D point correspondences

(x_a, y_a, z_a) ↔ (x̃, ỹ)

and the camera constant f, each correspondence gives two homogeneous linear equations, and the coefficients s_ij and t_x, t_y, t_z can be computed by SVD. After they are obtained, the affine transformation matrix can be formed:

S = [ s_11  s_12  s_13 ;  s_21  s_22  s_23 ;  s_31  s_32  s_33 ].

Virtual scene points can then be calculated from the uncorrected image points:

(x̂_i, ŷ_i, ẑ_i)^T = S^{-1} [ (x̃_i, ỹ_i, f)^T − (t_x, t_y, t_z)^T ].

Thus, calibrated image points are obtained by

x_i = f x̂_i / ẑ_i,   y_i = f ŷ_i / ẑ_i.
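Putting the last two steps together in Python/NumPy (the function name and array shapes are assumptions for the sketch):

```python
import numpy as np

def correct_points_affine(xy_tilde, S, t, f):
    """Map uncorrected image points (x~, y~) to calibrated image points:
    back-project with S^{-1} to virtual scene points, then reproject
    through the ideal pinhole model.  xy_tilde: (N, 2); S: 3x3; t: (3,)."""
    n = xy_tilde.shape[0]
    rays = np.column_stack([xy_tilde, np.full(n, float(f))])  # (x~, y~, f)
    virtual = (np.linalg.inv(S) @ (rays - t).T).T             # virtual scene points
    x = f * virtual[:, 0] / virtual[:, 2]
    y = f * virtual[:, 1] / virtual[:, 2]
    return np.column_stack([x, y])
```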

4) Example: nonlinear method of calibration

----- Combines interior and exterior calibrations.

----- Can correct lens distortions.

----- 2D-3D point correspondences of calibration points are given.

Since

x = f x_c / z_c = f (r_11 x_a + r_12 y_a + r_13 z_a + t_x) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)
y = f y_c / z_c = f (r_21 x_a + r_22 y_a + r_23 z_a + t_y) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)

and the interior calibration model is

x = x̃ + (x̃ − x_p)(k_1 r^2 + k_2 r^4 + k_3 r^6)
y = ỹ + (ỹ − y_p)(k_1 r^2 + k_2 r^4 + k_3 r^6),

we have:

x̃ + (x̃ − x_p)(k_1 r^2 + k_2 r^4 + k_3 r^6) = f (r_11 x_a + r_12 y_a + r_13 z_a + t_x) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)
ỹ + (ỹ − y_p)(k_1 r^2 + k_2 r^4 + k_3 r^6) = f (r_21 x_a + r_22 y_a + r_23 z_a + t_y) / (r_31 x_a + r_32 y_a + r_33 z_a + t_z)

The orthonormality of the rotation matrix is enforced by expressing the rotation through the three Euler angles θ, φ, and ω, so that the r_ij are functions of (θ, φ, ω).

With a set of point correspondences (x̃, ỹ) ↔ (x_a, y_a, z_a), we can solve iteratively for the unknowns

t_x, t_y, t_z, θ, φ, ω, f, x_p, y_p, k_1, k_2, k_3

with the initial guesses:

θ = φ = ω = 0

(t_x, t_y, t_z) = nominal camera location in the absolute coordinate system

f = nominal value of the focal length of the camera

x_p = y_p = 0,   k_1 = k_2 = k_3 = 0.
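A sketch of this nonlinear solution with scipy.optimize.least_squares is given below. The Euler-angle convention (rotations about x, y, z in turn), the parameter ordering, and the function names are assumptions; the residual simply compares the distortion-corrected image points with the projected scene points, matching the two equations above.

```python
import numpy as np
from scipy.optimize import least_squares

def euler_to_R(theta, phi, omega):
    """Rotation matrix from three Euler angles (convention assumed here:
    rotations about the x, y and z axes in turn)."""
    cx, sx = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(omega), np.sin(omega)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def nonlinear_calibrate(xy_tilde, xyz, t0, f0):
    """Solve for (tx, ty, tz, theta, phi, omega, f, xp, yp, k1, k2, k3)
    from 2D-3D correspondences.  xy_tilde: (N, 2) uncorrected image
    points, xyz: (N, 3) scene points, t0: nominal camera translation,
    f0: nominal focal length (the initial guesses listed above)."""
    def residuals(p):
        tx, ty, tz, theta, phi, omega, f, xp, yp, k1, k2, k3 = p
        R = euler_to_R(theta, phi, omega)
        pc = xyz @ R.T + [tx, ty, tz]          # points in camera coordinates
        proj = f * pc[:, :2] / pc[:, 2:3]      # ideal pinhole projection
        xt, yt = xy_tilde[:, 0], xy_tilde[:, 1]
        r2 = (xt - xp)**2 + (yt - yp)**2
        fac = k1*r2 + k2*r2**2 + k3*r2**3
        corrected = np.column_stack([xt + (xt - xp)*fac, yt + (yt - yp)*fac])
        return (corrected - proj).ravel()

    p0 = np.concatenate([t0, [0, 0, 0], [f0], np.zeros(5)])
    return least_squares(residuals, p0)
```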