
Intelligent Vehicles Symposium 2006, June 13-15, 2006, Tokyo, Japan

9-2

Camera Calibration Method for Far Range Stereovision Sensors Used in Vehicles

Tiberiu Marita, Florin Oniga, Sergiu Nedevschi

Computer Science Department, Technical University of Cluj-Napoca, Cluj-Napoca, 400020, ROMANIA

Abstract

This paper presents a camera calibration method for far-range stereo-vision used for driving environment perception on highways. For a high accuracy stereovision system the most critical camera parameters are the relative extrinsic parameters, which describe the geometry of the stereo-rig. Experiments proved that even a drift of a few seconds of arc in the relative camera angles can have disastrous consequences for the whole stereo-vision process: incorrect epipolar lines and, consequently, a lack of reconstructed 3D points. Therefore we propose an off-line method able to give a very accurate estimation of these parameters, and on-line methods for monitoring their stability in time under real-world automotive conditions, which can introduce drifts to the initial parameters due to vibrations, temperature variations etc.

1 Introduction

Optical sensors, such as visible-light cameras, are usually referred to as passive sensors because they acquire data in a non-intrusive way. They are low cost sensors but, unlike active ones, do not perform direct measurements. The measurement is done indirectly from the 2D image; this process can be time consuming and its accuracy depends on the vision sensor setup. However, the use of a high resolution, high accuracy stereovision algorithm provides accurate results in 3D estimation while delivering a larger amount of data, thus making obstacle detection tasks such as grouping and tracking easier, and allowing a subsequent classification of the obstacles. Moreover, lane detection is a key problem in any driving assistance system, and one of the privileges of computer vision.

Thorsten Graf, Rolf Schmidt

Electronic Research, Volkswagen AG

Wolfsburg, 38436, GERMANY

The measuring capabilities of vision sensors with monocular cameras are reduced because they rely on some prior knowledge of the scene, such as a flat road assumption or a constant pitch angle, which cannot be met in real scenarios. With stereo-vision systems the full 3D reconstruction of the scene is possible, but a high accuracy of the camera parameters is required.

The calibration of a stereovision system deals with the estimation of the following parameters:

- Internal parameters of the stereo-rig: intrinsic parameters (focal length and the principal point position) and the relative position and orientation of the two cameras (relative extrinsic parameters).

- Absolute extrinsic parameters of the stereo-rig: position and orientation of the stereo-rig (usually the left camera coordinate system) relative to a world coordinate system in which the measurements are reported.

The quality of the internal parameters influences the accuracy of the essential and fundamental matrices and consequently the positions of the epipolar lines [1], which are essential in the stereo correlation process. Their wrong estimation can lead to a lack of correlated points or to false correlations, with disastrous consequences for the whole stereo reconstruction process. The quality of the absolute extrinsic parameters determines the position of the reconstructed 3D points in the world coordinate system.

Regarding the intrinsic parameters calibration, there are many general purpose methods in the literature [2]-[5]. Most of them estimate the parameters by minimizing the projection error of a set of control points from a calibration object/pattern with known structure against their detected 2D


image positions. Multiple views of the calibration object are taken and the accuracy of the results usually increases with the total number of control points. A special note must be made of the Bouguet method implemented in the Caltech Camera Calibration Toolbox [5], which was intensively tested and used in our experiments for the estimation of the intrinsic parameters.

Regarding the extrinsic parameters estimation, the principle is the same: minimizing the projection error of some 3D control points with known positions in the world coordinate system (measurement coordinate system). General purpose methods [5], [6] use the same calibration object as for the intrinsic parameters estimation. These approaches can be satisfactory for monocular or near range indoor applications, but are completely unsuited for far range stereo-vision (because in the estimation process the error is minimized for a small calibration object and for near distances). Therefore, using a calibration scene with sizes comparable with the working range of the stereo reconstruction application has proved a reliable solution. Such methods [8], [9] use as control points markers painted on a flat surface of a few meters or tens of meters in length.

In this paper we present a method for estimating the extrinsic camera parameters suited for far range and high accuracy stereo-vision. As control points, 'X'-shaped targets are placed vertically in a flat calibration scene, up to 40 m in depth. The 3D coordinates of the targets are measured in the world coordinate system. Each camera is calibrated individually (so the method can also be applied to mono systems). The 2D image projections of the targets' central points are detected automatically and the projection errors of their 3D coordinates are minimized against the position and orientation of the camera in the world coordinate system using the Gauss-Newton method. The obtained results proved to be very accurate regarding both the absolute extrinsic parameters of each camera individually and especially the relative extrinsic parameters, allowing a very accurate stereo reconstruction procedure. Furthermore, two original methods for on-line self checking of the stereo system geometry are proposed. These methods allow monitoring the stability of the internal parameters in time under the real-world automotive conditions, which can introduce drifts to the initial parameters due to vibrations or temperature variations.

2 Camera Model

The intrinsic parameters of each camera are the principal point's position (xc, yc) [pixels], the focal length of the camera f [pixels] and the first two radial (k1, k2) and tangential (p1, p2) distortion coefficients [4], [5]. The intrinsic parameters are assumed to be correctly computed, using the algorithm presented in [5].

The stereovision sensor consists of two cameras mounted on a rigid frame (stereo-rig). The position and orientation of the cameras are completely determined by the absolute extrinsic parameters: the translation vectors TL and TR (1), and the rotation matrices RL and RR (2) relative to the world coordinate system, which coincides with the ego-car coordinate system. The car/world coordinate system has its origin on the ground in front of the car, and its Z axis points in the direction of travel (fig. 1). The relative extrinsic parameters can be computed from the absolute ones (3).

T = [XCW, YCW, ZCW]^T (1)

R = [rij], i, j = 1, 2, 3 (2)

Trel = RL^T (TR - TL)
Rrel = RL^T RR (3)
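As a minimal numerical sketch of eq. (3), assuming the rotation-matrix convention used in the projection equations of Section 3.4 (the function name is illustrative, not from the paper):

```python
import numpy as np

def relative_extrinsics(R_L, T_L, R_R, T_R):
    """Relative extrinsic parameters of the stereo-rig, eq. (3):
    the right camera expressed in the left camera's frame."""
    T_rel = R_L.T @ (T_R - T_L)  # right camera centre in left-camera axes
    R_rel = R_L.T @ R_R          # relative rotation between the two cameras
    return R_rel, T_rel
```

For two parallel cameras separated by a 0.3 m baseline, this yields an identity relative rotation and a purely lateral relative translation.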

Fig. 1. The cameras' and the car coordinate systems; the arrow marks the car heading direction.


3 Extrinsic Parameters Calibration

3.1 The calibration scenario

A calibration scenario on a flat surface of 30-40 m depth and 5-10 m width was chosen (fig. 4). As control points, a set of 'X'-shaped targets were placed vertically in the calibration scene.

There are some aspects influencing the quality of the obtained extrinsic parameters which were taken into account when setting up this calibration scenario:

- the 3D control points should be uniformly spread over the calibration field;

- their 3D coordinates should be measured as accurately as possible;

- the projection of the control points onto the image plane should be easily detectable with sub-pixel accuracy.

The 'X'-shape of the targets was chosen due to some major advantages in the automated detection process:

- it has a low rate of false target detections, because most of the structures in our 3D scenarios are '+'-shaped (due to the predominance of horizontal and vertical features) while the 'X'-shape is based on oblique features;

- the center of the target can be easily computed;

- the centers of the vertical 'X'-shaped targets are easier to detect than lines or other markers painted on the road [8], [9], which are poorly visible at long distances due to perspective projection effects, especially when the cameras are mounted at low pitch angles as in vehicles.

3.2 Correction of the lens distortion effects

First, a correction algorithm is applied to remove the lens distortion effects encoded in the four distortion coefficients, using the distortion model of the cameras [3], [5]. This is a compulsory step since targets which are far from the image center can suffer a distortion drift of up to a few pixels, depending on the quality and geometry of the lenses. The destination un-distorted images are obtained by remapping the original images to the undistorted coordinates using bilinear interpolation. The correction of the lens distortion simplifies the projection model of the 3D points onto the image plane used in the extrinsic parameters estimation. The lens distortion correction is also applied to every processed image in the stereo-reconstruction application.
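The remapping step can be sketched as follows. This is an illustrative NumPy implementation of the standard radial-tangential distortion model with bilinear interpolation, not the authors' code; it assumes a grayscale image and a single focal length f:

```python
import numpy as np

def undistort_image(img, f, xc, yc, k1, k2, p1, p2):
    """Remove radial (k1, k2) and tangential (p1, p2) lens distortion by
    remapping with bilinear interpolation (sketch of the model in [3], [5])."""
    h, w = img.shape
    # grid of destination (undistorted) pixel coordinates, normalised
    v, u = np.mgrid[0:h, 0:w]
    x = (u - xc) / f
    y = (v - yc) / f
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    # distorted source coordinates, back in pixels
    us = xd * f + xc
    vs = yd * f + yc
    # bilinear interpolation from the original (distorted) image
    u0 = np.clip(np.floor(us).astype(int), 0, w - 2)
    v0 = np.clip(np.floor(vs).astype(int), 0, h - 2)
    a = us - u0
    b = vs - v0
    return ((1 - a) * (1 - b) * img[v0, u0] + a * (1 - b) * img[v0, u0 + 1]
            + (1 - a) * b * img[v0 + 1, u0] + a * b * img[v0 + 1, u0 + 1])
```

With all four distortion coefficients set to zero the remap is the identity, which is a convenient sanity check.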

3.3 Automatic detection of the targets' 2D image projections

Detection of the targets is performed in 3 stages, using multiple validation constraints in order to eliminate the false positives which can occur in scenes with a non-uniform background.

1. First, the grayscale image is segmented using an automated global threshold detection method [7] computed in the region of interest (the smallest rectangle delimiting the image area occupied by the 'X'-shaped targets). Then concavities of the black areas are searched (fig. 2) and groups of four concavities (left/right/upper/lower) are formed and marked based on vicinity criteria.

Fig. 2. Concavities detection: zoomed area - upper/lower/left/right concavities; complete image - centers of detected groups of 4 concavities are marked.

2. Validation by searching for oblique edge points, obtained with the Roberts-cross operator, in the neighborhoods of the previously found concavities. The areas occupied by oblique edge points around each concavity are marked (fig. 3) and used in the next validation stage.

Fig. 3. Validation by oblique edges.


3. A final validation step is performed by applying a classical pattern-matching method with synthetically generated 'X'-shaped patterns of different sizes in the surrounding region of the already found targets. The final results of the target detection are shown in figure 4.

Fig. 4. Calibration scenario - the 3D control points are situated in the centers of the 'X'-shaped targets.

To detect the sub-pixel position of a target's center, the image area bounding the current target is zoomed-in using bi-cubic interpolation (fig. 5-left). To detect the target's center position along the X axis, the sum of the pixel values along each column is computed (fig. 5-center). The maximum of these sums is considered (in the center of the target the number of light pixels along the row and the column is maximal). To minimize the influence of noise and of the discrete nature of the image, instead of taking exactly the position of the maximum, we take the mid point of the interval having values above 90% of the maximum. The computed position is then mapped back, by downscaling and translation, to the original image coordinates. To detect the target's center position along the Y axis the procedure is similar, except that it is applied along the vertical direction (fig. 5-right).

Fig. 5. Zoomed 'X-shape' image (left); the sum of intensities along the columns for each row (center); the sum of intensities along the rows for each column (right).
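The plateau-midpoint idea above can be sketched as follows. This is a hedged NumPy sketch: nearest-neighbour upscaling via np.kron stands in for the bi-cubic interpolation used in the paper, and the function names are illustrative:

```python
import numpy as np

def subpixel_peak(profile, frac=0.9):
    """Mid-point of the interval whose values exceed frac * max,
    as in Sec. 3.3 (more noise-robust than the arg-max alone)."""
    idx = np.flatnonzero(profile >= frac * profile.max())
    return 0.5 * (idx[0] + idx[-1])

def target_center(patch, zoom=4):
    """Sub-pixel centre of a target patch: upscale, sum intensities
    along columns (for x) and rows (for y), take the plateau mid-points,
    then map back to the original patch coordinates."""
    z = np.kron(patch, np.ones((zoom, zoom)))                 # upscale
    cx = (subpixel_peak(z.sum(axis=0)) - (zoom - 1) / 2) / zoom
    cy = (subpixel_peak(z.sum(axis=1)) - (zoom - 1) / 2) / zoom
    return cx, cy
```

On a symmetric cross-shaped patch the recovered centre coincides with the geometric centre of the pattern.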

3.4 Estimation of the extrinsic parameters

Considering a known 3D point PA(XA, YA, ZA) and its image projection p(xi, yi), due to perspective projection [1] the following relations hold:

xi - xc = f * [r11(XA - XCW) + r21(YA - YCW) + r31(ZA - ZCW)] / [r13(XA - XCW) + r23(YA - YCW) + r33(ZA - ZCW)]

yi - yc = f * [r12(XA - XCW) + r22(YA - YCW) + r32(ZA - ZCW)] / [r13(XA - XCW) + r23(YA - YCW) + r33(ZA - ZCW)]

(4)

For each known 3D point there are two equations involving the 12 unknowns. The 9 unknowns representing the rotation matrix have only 3 degrees of freedom (represented by the 3 rotation angles). Thus it is necessary to add several constraints that model the dependencies between the rotation matrix coefficients. These constraints are obtained easily considering the fact that the rotation matrix is orthogonal [1], and therefore six supplementary equations can be added:

r11*r11 + r12*r12 + r13*r13 - 1 = 0
r21*r21 + r22*r22 + r23*r23 - 1 = 0
r31*r31 + r32*r32 + r33*r33 - 1 = 0
r11*r21 + r12*r22 + r13*r23 = 0
r11*r31 + r12*r32 + r13*r33 = 0
r21*r31 + r22*r32 + r23*r33 = 0

(5)

By using a number n >= 3 of 3D control points, from (4) and (5) a non-linear system of 2n+6 equations is built (6), where the first 2n equations are obtained by applying (4) for each control point, and the last 6 equations are the constraints defined in (5):

F(u)=0 (6)

where u is the vector of the 12 unknowns (XCW, YCW, ZCW, rij), for i, j = 1, 2, 3.

To solve this system, the Gauss-Newton iterative method was used. This method starts from an arbitrary initial solution (the constraint that the initial solution must be close to the real one is not required) and converges by iteratively applying correction steps. As initial solution, the translation vector was set to the null vector and the rotation matrix to the identity matrix. Suppose that, after a number of correction steps, the current solution is ui and, by applying another correction step, the new


solution is ui+1 = ui + du; then the correction du is obtained by solving the system:

J du = -F(ui) (7)

where J is the Jacobian matrix associated with the equation system F(u) = 0.
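A generic Gauss-Newton loop of the kind described by eqs. (6)-(8) might look as follows. This sketch uses a numerical Jacobian for brevity, which is an assumption on my part, not necessarily what the authors implemented:

```python
import numpy as np

def gauss_newton(F, u0, iters=50, tol=1e-10, eps=1e-7):
    """Gauss-Newton iteration for F(u) = 0: solve J du = -F(u)
    in the least-squares sense and update u until du is small."""
    u = np.asarray(u0, dtype=float)
    for _ in range(iters):
        f = F(u)
        # numerical Jacobian, one column per unknown
        J = np.empty((f.size, u.size))
        for j in range(u.size):
            up = u.copy()
            up[j] += eps
            J[:, j] = (F(up) - f) / eps
        du, *_ = np.linalg.lstsq(J, -f, rcond=None)
        u = u + du
        if np.linalg.norm(du) < tol:
            break
    return u
```

On a small nonlinear system such as u0^2 = 4, u0*u1 = 6, the loop converges to (2, 3) in a handful of iterations.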

To solve the over-determined system (7), a least-squares method was used (8). The correction process is stopped after a number of iterations, when the norm of the correction du becomes smaller than a threshold:

du = ui+1 - ui = -(J(ui)^T J(ui))^-1 J(ui)^T F(ui) (8)

4 On-line Self-Checking of the Stereo-Rig Internal Parameters

Real-world automotive conditions can lead to drifts of the initial camera parameters due to vibrations, temperature variations etc. Therefore some methods to check the geometry of the stereo-rig during exploitation would be very useful. The checking should be applied from time to time and should not require a specific calibration scene. In the following, two such methods which can be used on-line during the exploitation of the stereovision sensor are proposed.

4.1 Self-checking using infinity points

On highways or country roads a visible horizon line is very likely. Usually the mid points of the detected horizon line are at a far distance (a few hundred meters) and can be considered at infinity in comparison with the depth range of the stereovision sensor (<100 m). Thus, these points can be considered to lie in a plane situated at infinity. Let us consider the disparity equation for points situated at infinity:

mR = Hinf mL = AR Rrel AL^-1 mL (9)

The above equation says that for the left image projection (mL) of a point at infinity, its correspondent (mR) in the right view can be obtained using the homography matrix of the plane at infinity Hinf, which depends only on the internal parameters of the stereo rig: the intrinsic parameters of the two cameras encoded in the two intrinsic matrices AL and AR [3] and the relative rotation between the two cameras Rrel.

By setting the homogeneous coordinates of the point mL to [0, 0, 1]^T in (9), we define the expected disparity of infinity points as:

dinf = [dx, dy]^T = AR Rrel AL^-1 [0, 0, 1]^T (10)

The method computes the expected disparity for the camera parameters in use. When we want to check whether the stereo-system's geometrical configuration still fulfils the initial values, we compute the average disparity davg for a set of horizon points (if a valid horizon line is available):

davg = (1/n) * SUM_i ([xi^R, yi^R]^T - [xi^L, yi^L]^T) (11)

Then, if the difference between the computed average disparity of the horizon points and the expected disparity is below a threshold, we can say that the geometrical configuration is unchanged. If it is above the threshold, then a geometrical change in the parameters AL, AR or Rrel has occurred (12). The proper value of the threshold is determined experimentally.

Dd = |davg - dinf|:  Dd < Threshold : geometry OK;  Dd > Threshold : geometry wrong (12)
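Under this reading of eqs. (9)-(12), the infinity-point self-check might be sketched as follows; the function names, the threshold default and the [0, 0, 1]^T principal-ray choice are illustrative assumptions:

```python
import numpy as np

def expected_infinity_disparity(A_L, A_R, R_rel):
    """Expected disparity of infinity points, eqs. (9)-(10): map a
    reference ray through the infinite-plane homography A_R R_rel A_L^-1."""
    H = A_R @ R_rel @ np.linalg.inv(A_L)
    m = H @ np.array([0.0, 0.0, 1.0])
    return m[:2] / m[2]                      # (dx, dy) in pixels

def geometry_ok(horizon_L, horizon_R, d_inf, threshold=2.0):
    """Eqs. (11)-(12): compare the average disparity of matched horizon
    points with the expected infinity disparity."""
    d_avg = np.mean(np.asarray(horizon_R) - np.asarray(horizon_L), axis=0)
    return bool(np.all(np.abs(d_avg - d_inf) < threshold))
```

For two identical cameras with identity relative rotation the expected infinity disparity is zero, so horizon points that coincide in both views pass the check.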

The horizon points are matched between the left and right horizon line segments, detected as the lower borders of the sky regions. The classical way to find the right image correspondents of left image features is stereo-matching using epipolar geometry constraints [1]. But in our case this approach is not applicable, because infinity points have "zero" disparity and we assume that the camera geometry might have changed (and therefore the epipolar constraints


are not valid any more). Therefore a set of horizon points is selected from the left image and their correspondents in the right image are found using the following algorithm:

1. Feature selection, in both the left and right horizon line segments, of the local extreme points of the horizon line (local minima or maxima in the vertical direction).

2. For every feature from the left image a window-based matching is performed on the set of right image features, and the feature pair of the best match is selected.
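The two steps above can be sketched as follows: an illustrative SSD window matcher over the horizon height profiles, where the function names and the window size are assumptions rather than the paper's exact procedure:

```python
import numpy as np

def local_extrema(ys):
    """Indices of local minima/maxima of a horizon-line height profile."""
    ys = np.asarray(ys, dtype=float)
    return [i for i in range(1, len(ys) - 1)
            if (ys[i] - ys[i - 1]) * (ys[i + 1] - ys[i]) < 0]

def match_features(prof_L, prof_R, win=3):
    """For each left extremum, pick the right extremum whose surrounding
    window of the height profile matches best (smallest SSD)."""
    prof_L = np.asarray(prof_L, dtype=float)
    prof_R = np.asarray(prof_R, dtype=float)
    pairs = []
    for i in local_extrema(prof_L):
        if i - win < 0 or i + win + 1 > len(prof_L):
            continue
        wl = prof_L[i - win:i + win + 1]
        best, best_cost = None, np.inf
        for j in local_extrema(prof_R):
            if j - win < 0 or j + win + 1 > len(prof_R):
                continue
            cost = np.sum((wl - prof_R[j - win:j + win + 1]) ** 2)
            if cost < best_cost:
                best, best_cost = j, cost
        if best is not None:
            pairs.append((i, best))
    return pairs
```

Matching a profile against itself returns each extremum paired with itself, which is a quick sanity check of the SSD criterion.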

4.2 Self-checking using epipolar line constraints

A measure of the stereo system's wrong geometry can be the average vertical drift from the true position of the computed epipolar line in the right image. Considering the theory of the fundamental matrix [1], it is easy to deduce that the epipolar lines will have a vertical drift when small modifications are present in the focal lengths, in the vertical coordinates of the principal points (1-2 pixels) or in the relative pitch angle of the two cameras (0.02-0.05 degrees), or when large modifications of the other parameters occur.

Usually, when a drift appears, the effect on reconstruction is a decrease in the number of 3D points. Therefore the drift can be checked by setting different drift values between -2...+2 pixels with a step of 0.25, and testing the variation of the number of 3D points (as in fig. 6). The algorithm for computing the epipolar drift is described below:

1. For each drift value, the histogram of the number of reconstructed 3D points is computed (only edge points were considered).

2. The maximum value of the histogram is determined.

3. Because it is not a strong maximum, a region is taken around the maximum for values higher than 95%.

4. The middle of the interval is considered as the correct position for the corresponding drift.
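Steps 1-4 can be sketched as follows, assuming a callable that returns the number of reconstructed 3D points for a trial drift (a hypothetical interface, standing in for a full reconstruction run):

```python
import numpy as np

def detect_epipolar_drift(points_for_drift, frac=0.95):
    """Epipolar-drift check of Sec. 4.2: evaluate the number of
    reconstructed 3D points at each trial vertical drift (-2..+2 px,
    0.25 px step) and return the middle of the plateau above frac * max."""
    drifts = np.arange(-2.0, 2.0 + 1e-9, 0.25)
    counts = np.array([points_for_drift(d) for d in drifts])
    idx = np.flatnonzero(counts >= frac * counts.max())
    return 0.5 * (drifts[idx[0]] + drifts[idx[-1]])
```

With a synthetic count curve peaking at a drift of 0.5 px, the plateau midpoint recovers that value.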

For a well calibrated system the drift should have values between -1...+1 pixel. Outside this range the errors of the stereo correlation process increase dramatically. Considering the measurement error range (which is proportional to the matching accuracy of 1/4 pixels), a maximum allowed drift range of -0.7...+0.7 is considered. If a small de-calibration appears, then the drift will have higher values (more than 1 pixel in absolute value).

5 Results

The calibration method was tested in a large number of experiments with various types of stereovision systems: from individual high quality mega-pixel CCD cameras mounted on a rigid rig to integrated stereo-heads with CMOS sensors.

The intrinsic parameters of each camera were calibrated individually. The parameters were estimated using Bouguet's algorithm [5], by minimizing the projection errors of a set of control points placed on a coplanar calibration object viewed from different positions. The accuracy of the obtained parameters depended on several factors: the number of views, the number of control points in each view, the size of the calibration object, the coverage of the views by the calibration object, the accuracy and planarity of the calibration object etc. By using the proper calibration methodology, the obtained uncertainties of the intrinsic parameters were below one pixel. Regarding the stability of the intrinsic parameters in time, the parameters remained unchanged during in-vehicle exploitation of the stereo system over a period of more than 2 years.

The extrinsic parameters of each camera were calibrated using the method presented above. The accuracy of the automated target-center detection

Fig. 6. Detection of the epipolar drift (number of reconstructed 3D points vs. drift; the detected drift is the middle of the plateau).


method was compared with the positions of hand-extracted points (using a zoomed window applied on the targets). The obtained average error was below 0.1 pixels. The uncertainties of the extrinsic parameters, roughly estimated from the residual of the Gauss-Newton minimization method, were usually below 10^-9 for the terms of T and 10^-11 for the terms of R.

The overall accuracy of the estimated parameters was assessed offline, by measuring the reconstruction errors of 'X'-shaped targets placed at known 3D coordinates (similar to fig. 4). The target centers were computed with sub-pixel accuracy using the above described algorithm. In such a setup, the only error sources were the sub-pixel detection errors of the targets' centers (about 0.1 pixels accuracy) and the targets' 3D measuring errors. The error plots of the reconstructed coordinates for one of the calibration experiments are shown in figure 7. The lateral offset errors were below 4 cm, the height errors were below 1 cm and the depth errors were below 22 cm (below 5% relative errors).

Fig. 7. Absolute errors of the reconstructed control points expressed in [mm] vs. the depth of the control point [m] (a. X errors; c. Z errors).

The same scenario was used to validate the self-checking method based on epipolar line constraints. From the sub-pixel left-right image coordinates of the detected targets' centers an average vertical epipolar drift was computed and compared with the drift measured with the on-line method. The errors were below 0.1...0.2 pixels.

Regarding the self-checking method of the camera geometry using the infinity points' constraints, for the proper estimation of the error threshold (12) the following study was made: a set of good calibration parameters was taken as reference and the expected infinity disparity dinf was computed for the initial set of parameters. Then the value of each internal parameter was modified around its initial value to simulate changes in the stereo system geometry (table I).

TABLE I
SIMULATED DISPARITY ERRORS Dd

Parameter       | Simulated drift | Dd [pixels]
----------------|-----------------|------------
Principal point | 1 pixel         | 1
Focal length    | 10 pixels       | 0.2
Rotation angle  | 0.1 deg         | 2

By assuming the following uncertainties in the estimation of the system parameters: 1-2 pixels for the principal point and the focal length, 0.01 [deg] for the relative camera angles and 1 pixel in the detection of the horizon points' positions, we can consider the threshold value of Dd as the sum of the consequences of all these uncertainties, obtaining an approximate value of [2, 2]^T [pixels].

The self-checking method using the infinity points' constraints was tested on different scenarios. If a proper horizon line was detected (beyond 100 m, with enough local maxima) the method proved to be reliable, providing for a well calibrated system an average disparity of infinity points very close to the expected one.

The calibration parameters were tested in a stereovision based lane and obstacle detection application for traffic environment perception in highway scenarios [11]-[13]. The system was able to


detect and track objects in a range of up to 100 m with depth errors of up to 5%. For comparison, the results of the extrinsic parameters calibration method from [5] were also tested, but the current approach proved far more accurate and reliable.

6 Conclusions

A dedicated calibration method for far range stereo-vision was developed. The method performs the estimation of the cameras' absolute extrinsic parameters with high accuracy. Consequently, very accurate relative extrinsic parameters of the cameras are determined, allowing a precise estimation of the epipolar lines (approximately 0 pixel drift) with beneficial effects for the whole stereo reconstruction. Therefore the method is suited for the calibration of any stereo-vision system used for far range 3D reconstruction, such as outdoor robot vision applications or vision based driving assistance systems.

Also, a set of original methods for on-line self-checking of the stereo rig internal parameters was proposed. These methods allow the on-line monitoring of the stability of the internal parameters over time, taking into account the real-world automotive conditions which can introduce drifts to the initial parameters.

Further work will focus on developing an auto-calibration method for the stereo rig's absolute extrinsic parameters, given the internal parameters calibrated off-line and taking advantage of the available stereo reconstruction of road features in the stereo-rig coordinate system.

References

[1] E. Trucco, "Introductory Techniques for 3D Computer Vision", Prentice Hall, 1998.

[2] R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses", IEEE Journal of Robotics and Automation, RA-3(4), 1987, pp. 323-344.

[3] Z. Zhang, "Flexible Camera Calibration By Viewing a Plane From Unknown Orientations", International Conference on Computer Vision (ICCV'99), Corfu, Greece, September 1999, pp. 666-673.

[4] J. Heikkila, O. Silven, "A four-step camera calibration procedure with implicit image correction", Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition, 1997, pp. 1106-1112.

[5] J.-Y. Bouguet, Camera Calibration Toolbox for Matlab, www.vision.caltech.edu/bouguetj.

[6] N. Kampchen, U. Franke, R. Ott, "Stereo vision based pose estimation of parking lots using 3D vehicle models", IEEE Intelligent Vehicles Symposium, 2002, pp. 459-464.

[7] R. C. Gonzalez, R. E. Woods, Digital Image Processing, 2nd edition, Prentice Hall, 2002.

[8] S. Ernst, C. Stiller, J. Goldbeck, C. Roessig, "Camera Calibration for Lane and Obstacle Detection", IEEE Intelligent Transportation Systems, 1999, pp. 356-361.

[9] A. Broggi, M. Bertozzi, A. Fascioli, "Self-Calibration of a Stereo Vision System for Automotive Applications", IEEE Conf. on Robotics and Automation, Seoul, Korea, May 2001, vol. 4, pp. 3698-3703.

[10] C. Zeller, O. Faugeras, "Camera Self-Calibration from Video Sequences: the Kruppa Equations Revisited", Technical Report No. 2793, INRIA, 1996.

[11] S. Nedevschi, R. Schmidt, T. Graf, R. Danescu, D. Frentiu, T. Marita, F. Oniga, C. Pocol, "High Accuracy Stereo Vision System for Far Distance Obstacle Detection", IEEE Intelligent Vehicles Symposium, Parma, Italy, June 14-17, 2004, pp. 161-166.

[12] S. Nedevschi, R. Schmidt, T. Graf, R. Danescu, D. Frentiu, T. Marita, F. Oniga, C. Pocol, "3D Lane Detection System Based on Stereovision", IEEE Intelligent Transportation Systems Conference, Washington, USA, October 4-6, 2004, pp. 292-297.

[13] S. Nedevschi, R. Danescu, T. Marita, F. Oniga, C. Pocol, S. Sobol, T. Graf, R. Schmidt, "Driving Environment Perception Using Stereovision", IEEE Intelligent Vehicles Symposium, June 2005, Las Vegas, USA, pp. 331-336.
