Hindawi Publishing Corporation
ISRN Machine Vision
Volume 2013, Article ID 874084, 7 pages
http://dx.doi.org/10.1155/2013/874084

Research Article
Vision Measurement Scheme Using Single Camera Rotation
Shidu Dong
College of Computer Science and Engineering, Chongqing University of Technology, Chongqing 400050, China
Correspondence should be addressed to Shidu Dong; shidu_dong@yahoo.com
Received 10 April 2013; Accepted 8 May 2013

Academic Editors: A. Nikolaidis and J. P. Siebert
Copyright © 2013 Shidu Dong. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
We propose a vision measurement scheme for estimating the distance or size of an object in a static scene, which requires a single camera with a 3-axis accelerometer sensor rotating around a fixed axis. First, we formulate the rotation matrix and translation vector from one coordinate system of the camera to another in terms of the rotation angle, which can be figured out from the readouts of the sensor. Second, with the camera calibration data and through coordinate system transformation, we propose a method for calculating the orientation and position of the rotation axis relative to the camera coordinate system. Finally, given the rotation angle and the images of an object in a static scene at two different positions, one before and the other after camera rotation, the 3D coordinates of the points on the object can be determined. Experimental results show the validity of our method.
1. Introduction
Nowadays, digital cameras and mobile phones with cameras are very popular. It would be appealing and convenient if they could be used to estimate the distance or size of an object. For this purpose, stereo images with disparity should be taken [1, 2]. One obvious method for stereo image acquisition is to use two cameras with different view angles. With two images of an object from two cameras, the relative orientation and position of the two viewpoints, and the correspondence between image points in the two views, the 3D coordinates of the points on the object can be determined [3, 4]. But in general, a mobile phone or professional camera has only one camera and cannot acquire two images from different views simultaneously. Fortunately, there are many methods for building a stereo vision system with a single camera. These methods may be broadly divided into three categories. First, to obtain virtual images from different viewpoints, additional optical devices are introduced, such as two planar mirrors [5], a biprism [6], convex mirrors [1, 7], or double-lobed hyperbolic mirrors [8]. But these optical devices are expensive and space consuming. Second, the 3D information of an object is inferred directly from a still image under the knowledge of some geometrical scene constraints, such as planarity of points and parallelism of lines and planes [9–11], or prior knowledge about the scene obtained from supervised learning [12]. Nevertheless, these methods require constrained scenes or extra computation for training the depth models. Third, 3D information is extracted from sequential images with respect to camera movement, which is often adopted in the robotics area. Due to the uncertainties in the sequential camera positions, however, it is difficult to get accurate 3D information with that method [1].
In this paper, we propose a novel vision measurement method for estimating the distance or size of an object in a static scene, which requires a single camera with a 3-axis accelerometer rotating around a fixed axis. Through the 3-axis accelerometer sensor, the slope angle of the camera relative to the gravity direction can be obtained [13]. This angle can uniquely determine the position of the camera if the camera is rotated around a fixed axis which is not parallel to gravity. Moreover, the relative position and orientation of the camera between two positions, one before and the other after camera rotation, can be determined by the rotation angle. Therefore, at the given two positions, if the camera is calibrated, the classical binocular view methods [4] can be adopted to extract the 3D coordinates of the points on objects. Unfortunately, it is very difficult for the user to rotate the camera to exactly the same slope angle as the calibrated one.
To deal with this problem, we first formulate the rotation matrix and translation vector from one coordinate system of the camera to another in terms of the rotation angle. Then, with camera calibration data at various positions, we calculate the orientation and position of the rotation axis relative to the camera coordinate system. Accordingly, at two given positions, the rotation matrix and translation vector from one coordinate system of the camera to another can be calculated from the rotation angle, and with the two images collected at the two positions, the 3D coordinates of the points on the object can be determined.

Figure 1: Position and rotation angle of the camera. $G'$ denotes the gravity division on the $x$-$y$ plane of the sensor coordinate system; $G_x$ and $G_y$ denote the divisions of gravity on the $x$ and $y$ axes of the sensor coordinate system, respectively; $\theta_1$ and $\theta_2$ denote the camera positions; $\beta_{21}$ indicates the rotation angle.
The paper is organized as follows. Section 2 presents a stereo vision system obtained from a single camera through rotation, and Section 3 describes the calculation of the rotation matrix and translation vector from one coordinate system of the camera to another by the rotation angle. The calculation method for the position and orientation of the rotation axis relative to the camera coordinate system is proposed in Section 4, and the complete calibration and 3D coordinate calculation of a point on an object are presented in Section 5. Experimental results are given in Section 6, and some conclusions and discussions are drawn in Section 7.
2. Stereo Vision through Camera Rotation
In what follows, we adopt the following hypotheses.

(H1) The camera is provided with a 3-axis accelerometer sensor whose readouts $G_x$, $G_y$, and $G_z$ are the divisions of gravity on the $x$, $y$, and $z$ axes of the sensor coordinate system, respectively.

(H2) In the calibration and measurement processes, the camera rotates around a fixed rotation axis which is parallel to the $z$ axis of the sensor coordinate system and is not parallel to the direction of gravity.
Thus, in the course of the rotation, the readout $G_z$ keeps steady. As a result, the division of gravity on the $x$-$y$ plane of the sensor coordinate system, $G'$, also keeps steady (see Figure 1). Therefore, the slope angle with respect to the gravity division $G'$ (position for short) can be determined by

$$\theta = \arctan\left(\frac{G_x}{G_y}\right). \quad (1)$$
From position $\theta_1$ to $\theta_2$, the rotation angle of the device, $\beta_{21}$, is governed by

$$\beta_{21} = \theta_2 - \theta_1. \quad (2)$$
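Equations (1) and (2) reduce to a few lines of code. A minimal sketch (function names are ours, not the paper's); `atan2` is used instead of a raw arctangent so the quadrant is kept and a zero $G_y$ does not divide by zero:

```python
import math

def slope_angle(gx: float, gy: float) -> float:
    """Camera position per (1): slope angle of the gravity division G'."""
    # atan2 keeps the correct quadrant and tolerates gy == 0;
    # it agrees with arctan(gx / gy) whenever gy > 0.
    return math.atan2(gx, gy)

def rotation_angle(theta1: float, theta2: float) -> float:
    """Device rotation angle per (2)."""
    return theta2 - theta1

# Example: readouts before and after tilting the device by roughly 30 degrees
theta1 = slope_angle(0.0, 9.81)      # gravity entirely on the y axis
theta2 = slope_angle(4.905, 8.496)   # tilted by about 30 degrees
beta21 = rotation_angle(theta1, theta2)
```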
Two images, $I_1$ and $I_2$, of an object are collected at the positions $\theta_1$ and $\theta_2$, respectively. $O_1\text{-}xyz$ and $O_2\text{-}xyz$ denote the camera coordinate systems at $\theta_1$ and $\theta_2$, respectively; $(x_1, y_1, z_1)$ and $(x_2, y_2, z_2)$ denote the coordinates of a point on the object relative to $O_1\text{-}xyz$ and $O_2\text{-}xyz$, respectively; and $(X_1, Y_1)$ and $(X_2, Y_2)$ denote the image coordinates of the point of the object on $I_1$ and $I_2$ (image coordinates for short), respectively.

The projection of the object coordinates relative to $O_1\text{-}xyz$ and $O_2\text{-}xyz$ into image coordinates is summarized by the following forms:

$$s_1 \begin{bmatrix} X_1 \\ Y_1 \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix}, \qquad s_2 \begin{bmatrix} X_2 \\ Y_2 \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix}. \quad (3)$$
Let

$$\begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix} = \begin{bmatrix} R_{21} & T_{21} \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \\ 1 \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \\ 1 \end{bmatrix}, \quad (4)$$

where $R_{21}$ and $T_{21}$ denote the rotation matrix and translation vector between $O_2\text{-}xyz$ and $O_1\text{-}xyz$.
Substituting (3) into (4), we get

$$\rho_2 \begin{bmatrix} X_2 \\ Y_2 \\ 1 \end{bmatrix} = \begin{bmatrix} f_x r_1 + c_x r_7 & f_x r_2 + c_x r_8 & f_x r_3 + c_x r_9 & f_x t_x + c_x t_z \\ f_y r_4 + c_y r_7 & f_y r_5 + c_y r_8 & f_y r_6 + c_y r_9 & f_y t_y + c_y t_z \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} (X_1 - c_x)\dfrac{z_1}{f_x} \\ (Y_1 - c_y)\dfrac{z_1}{f_y} \\ z_1 \\ 1 \end{bmatrix}. \quad (5)$$
Thus, the 3D coordinates of the point on the object relative to $O_1\text{-}xyz$ (the object coordinates) can be determined by

$$x_1 = (X_1 - c_x)\frac{z_1}{f_x}, \qquad y_1 = (Y_1 - c_y)\frac{z_1}{f_y}, \qquad z_1 = \frac{f_x f_y \left( f_x t_x + c_x t_z - X_2 t_z \right)}{B}, \quad (6)$$

where

$$B = (X_2 - c_x)\left[ r_7 f_y (X_1 - c_x) + r_8 f_x (Y_1 - c_y) + r_9 f_y f_x \right] - f_x \left[ f_y r_1 (X_1 - c_x) + f_x r_2 (Y_1 - c_y) + f_x r_3 f_y \right]. \quad (7)$$
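Given the intrinsics and $(R_{21}, T_{21})$, equations (6)-(7) form a closed-form triangulation. A NumPy sketch (the function name is ours; it assumes $B \neq 0$, i.e., the point produces nonzero disparity, and it uses only $X_2$ from the second view, as (6) does):

```python
import numpy as np

def triangulate(X1, Y1, X2, fx, fy, cx, cy, R21, T21):
    """3D point in O1-xyz from matched image coordinates, via (6)-(7).
    R21: 3x3 rotation with rows (r1 r2 r3), (r4 r5 r6), (r7 r8 r9);
    T21: translation (tx, ty, tz)."""
    r1, r2, r3 = R21[0]
    r7, r8, r9 = R21[2]
    tx, _, tz = T21
    # denominator B of (7)
    B = (X2 - cx) * (r7 * fy * (X1 - cx) + r8 * fx * (Y1 - cy) + r9 * fy * fx) \
        - fx * (fy * r1 * (X1 - cx) + fx * r2 * (Y1 - cy) + fx * r3 * fy)
    z1 = fx * fy * (fx * tx + cx * tz - X2 * tz) / B          # (6)
    x1 = (X1 - cx) * z1 / fx
    y1 = (Y1 - cy) * z1 / fy
    return np.array([x1, y1, z1])
```

A quick sanity check is to project a known 3D point into both views and confirm that the function recovers it.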
From (6) we can see that the object coordinates can be found, provided that the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$, the rotation matrix $R_{21}$, and the translation vector $T_{21}$ are available.

With the camera calibration method proposed by Zhang [14], the camera intrinsic parameters and the extrinsic parameters describing the camera motion around a static scene can be figured out. Let $R_1$ and $T_1$ denote the extrinsic parameters at $\theta_1$, and $R_2$ and $T_2$ the extrinsic parameters at $\theta_2$. For simplicity, let $P_w = (x, y, z)$, $P_1 = (x_1, y_1, z_1)$, and $P_2 = (x_2, y_2, z_2)$ stand for the coordinates of a point on the object relative to the world coordinate system, $O_1\text{-}xyz$, and $O_2\text{-}xyz$, respectively. Thus,
$$P_1 = R_1 P_w + T_1, \qquad P_2 = R_2 P_w + T_2. \quad (8)$$

From (8) we get

$$P_w = R_1^{-1} P_1 - R_1^{-1} T_1. \quad (9)$$

Substituting (9) into (8), we get

$$P_2 = R_2 R_1^{-1} P_1 - R_2 R_1^{-1} T_1 + T_2. \quad (10)$$

Equation (4) can be rewritten as

$$P_2 = R_{21} P_1 + T_{21}. \quad (11)$$

Thus,

$$R_{21} = R_2 R_1^{-1}, \qquad T_{21} = T_2 - R_2 R_1^{-1} T_1. \quad (12)$$
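Equation (12) composes the two calibrated extrinsics into the relative pose between the camera positions. A short sketch (our own naming):

```python
import numpy as np

def relative_pose(R1, T1, R2, T2):
    """R21 and T21 between two camera poses, per (12)."""
    R21 = R2 @ np.linalg.inv(R1)   # R2 R1^-1 (the inverse is the transpose for a rotation)
    T21 = T2 - R21 @ T1            # T2 - R2 R1^-1 T1
    return R21, T21
```

The defining property (11), $P_2 = R_{21} P_1 + T_{21}$, can be checked on any world point.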
Based on the previous discussions, the object coordinates can be determined by the following process.

(1) The camera is calibrated, and the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$ are obtained.

(2) At the positions $\theta_1$ and $\theta_2$, the camera is calibrated, and the corresponding extrinsic parameters $R_1$, $T_1$, $R_2$, and $T_2$ are obtained. Then $R_{21}$ and $T_{21}$ can be acquired from (12).

(3) In the course of measurement, the camera is rotated carefully to the positions $\theta_1$ and $\theta_2$. At these two positions, the two images of the object, $I_1$ and $I_2$, are collected. Once the image coordinates of the object, $(X_1, Y_1)$ and $(X_2, Y_2)$, are known, the object coordinates can be figured out by (6).

It should be noticed that it is rather difficult for the user to rotate the camera accurately to the positions $\theta_1$ and $\theta_2$ where the camera was calibrated.
3. Calculation Method of Rotation Matrix and Translation Vector
Intuitively, the orientation and position of $O_2\text{-}xyz$ relative to $O_1\text{-}xyz$, namely $R_{21}$ and $T_{21}$, depend on the camera rotation angle $\beta_{21}$. Thus, $R_{21}$ and $T_{21}$ may be figured out from $\beta_{21}$.

Let $O_R\text{-}xyz$ be the coordinate system associated with the rotation axis, where the axis is the unit vector along the $z$ axis of this coordinate system. Let $R_X$ and $T_X$ denote the orientation and position of $O_R\text{-}xyz$ relative to $O_1\text{-}xyz$ (the orientation and position of the rotation axis). Its homogeneous matrix can be written as

$$M_X = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix}. \quad (13)$$
The rotation matrix for the camera rotation around the $z$ axis of $O_R\text{-}xyz$ from $\theta_1$ to $\theta_2$ (the device rotation matrix) can be modeled as

$$R_{\beta_{21}} = \begin{bmatrix} \cos\beta_{21} & -\sin\beta_{21} & 0 \\ \sin\beta_{21} & \cos\beta_{21} & 0 \\ 0 & 0 & 1 \end{bmatrix}. \quad (14)$$

Its homogeneous matrix can be written as

$$\mathrm{Rot}(\beta_{21}) = \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}. \quad (15)$$
The coordinate system transformation can be represented as a graph (Figure 2). A directed edge represents a relationship between two coordinate systems and is associated with a homogeneous transformation. From Figure 2 we can get

$$M_{21} M_X = M_X \,\mathrm{Rot}(\beta_{21}), \quad (16)$$

where

$$M_{21} = \begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix}. \quad (17)$$
Substituting (13), (15), and (17) into (16), we get

$$\begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}, \quad (18)$$

$$\begin{bmatrix} R_{21} R_X & R_{21} T_X + T_{21} \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X R_{\beta_{21}} & T_X \\ 0 & 1 \end{bmatrix}, \quad (19)$$

$$R_{21} = R_X R_{\beta_{21}} R_X^{-1}, \quad (20)$$

$$T_{21} = T_X - R_{21} T_X. \quad (21)$$
Figure 2: Coordinate system transformation represented as a graph; the nodes are the coordinate systems $O_1\text{-}xyz$ and $O_2\text{-}xyz$, and the edges carry the homogeneous transformations $M_{21}$, $M_X$, and $\mathrm{Rot}(\beta_{21})$.
From (20) and (21), we can see that $R_{21}$ and $T_{21}$ can be calculated from the rotation angle $\beta_{21}$, provided that $R_X$ and $T_X$ are available.
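Equations (14), (20), and (21) give $(R_{21}, T_{21})$ from a single scalar angle once the axis pose $(R_X, T_X)$ is known. A sketch (our naming; `np.linalg.inv` could equally be a transpose, since $R_X$ is a rotation):

```python
import numpy as np

def pose_from_angle(beta21, RX, TX):
    """R21 and T21 from the rotation angle, via (14), (20), (21)."""
    c, s = np.cos(beta21), np.sin(beta21)
    R_beta = np.array([[c, -s, 0.0],
                       [s,  c, 0.0],
                       [0.0, 0.0, 1.0]])     # (14)
    R21 = RX @ R_beta @ np.linalg.inv(RX)    # (20)
    T21 = TX - R21 @ TX                      # (21)
    return R21, T21
```

A useful consistency check: any point on the rotation axis (a point $T_X + \lambda R_X e_z$) must be left fixed by the resulting motion.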
4. Calculation Method of Position and Orientation of Rotation Axis
4.1. Orientation of Rotation Axis. Suppose that the camera is calibrated at several different positions. At position $\theta_i$, we get camera calibration data. Let $\beta_{i1} = \theta_i - \theta_1$, let $R_{\beta_{i1}}$ denote the device rotation matrix with respect to the rotation angle $\beta_{i1}$, and let $R_{i1}$ and $T_{i1}$ denote the rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$. Equation (20) can be written as

$$R_{i1} R_X = R_X R_{\beta_{i1}}. \quad (22)$$

When the rotation axis is fixed, $R_X$ is constant. Thus, given the values of $R_{i1}$ and $\beta_{i1}$, we can solve (22) for $R_X$.
Using a normalized quaternion to define the rotation between two coordinate systems provides a simple and elegant way to formulate successive rotations [15–17]. Given the rotation matrix

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \quad (23)$$

it can be transformed to a quaternion with the following equations [18]:

$$q_0 = \frac{\sqrt{1 + r_1 + r_5 + r_9}}{2}, \qquad q_x = \frac{r_8 - r_6}{4 q_0}, \qquad q_y = \frac{r_3 - r_7}{4 q_0}, \qquad q_z = \frac{r_4 - r_2}{4 q_0}. \quad (24)$$
Similarly, the quaternion can be transformed back to the rotation matrix with the following equation [18]:

$$R = \begin{bmatrix} q_0^2 + q_x^2 - q_y^2 - q_z^2 & 2(q_x q_y - q_0 q_z) & 2(q_x q_z + q_0 q_y) \\ 2(q_x q_y + q_0 q_z) & q_0^2 + q_y^2 - q_x^2 - q_z^2 & 2(q_y q_z - q_0 q_x) \\ 2(q_x q_z - q_0 q_y) & 2(q_y q_z + q_0 q_x) & q_0^2 + q_z^2 - q_x^2 - q_y^2 \end{bmatrix}. \quad (25)$$
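The conversions (24)-(25) are easy to verify numerically as a round trip. A sketch (our naming; (24) as written takes the $q_0 > 0$ branch, so it becomes ill-conditioned near 180° rotations, a case Bar-Itzhack's full method [18] handles):

```python
import numpy as np

def quat_from_rot(R):
    """Quaternion [q0, qx, qy, qz] from a rotation matrix, per (24)."""
    q0 = np.sqrt(1.0 + R[0, 0] + R[1, 1] + R[2, 2]) / 2.0
    qx = (R[2, 1] - R[1, 2]) / (4.0 * q0)
    qy = (R[0, 2] - R[2, 0]) / (4.0 * q0)
    qz = (R[1, 0] - R[0, 1]) / (4.0 * q0)
    return np.array([q0, qx, qy, qz])

def rot_from_quat(q):
    """Rotation matrix from a unit quaternion, per (25)."""
    q0, qx, qy, qz = q
    return np.array([
        [q0*q0 + qx*qx - qy*qy - qz*qz, 2*(qx*qy - q0*qz),             2*(qx*qz + q0*qy)],
        [2*(qx*qy + q0*qz),             q0*q0 + qy*qy - qx*qx - qz*qz, 2*(qy*qz - q0*qx)],
        [2*(qx*qz - q0*qy),             2*(qy*qz + q0*qx),             q0*q0 + qz*qz - qx*qx - qy*qy]])
```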
Let $q_{Ai}$, $q_X$, and $q_{Bi}$ denote the quaternions of $R_{i1}$, $R_X$, and $R_{\beta_{i1}}$, respectively. With quaternions, the sequence of rotations can be formulated as an equation without involving rotation matrices [15]. As a result, the problem of solving $R_{i1} R_X = R_X R_{\beta_{i1}}$ can be transformed into an equivalent problem involving the corresponding quaternions, as follows [15]:

$$q_{Ai} \otimes q_X = q_X \otimes q_{Bi}. \quad (26)$$

Since quaternion multiplication can be written in matrix form, with the notations introduced in [19] we have the following [16]:

$$q_{Ai} \otimes q_X = Q(q_{Ai})\, q_X, \qquad q_X \otimes q_{Bi} = W(q_{Bi})\, q_X, \quad (27)$$
where, letting $q = [q_0, q_x, q_y, q_z]^T$,

$$Q(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & -q_z & q_y \\ q_y & q_z & q_0 & -q_x \\ q_z & -q_y & q_x & q_0 \end{bmatrix}, \qquad W(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & q_z & -q_y \\ q_y & -q_z & q_0 & q_x \\ q_z & q_y & -q_x & q_0 \end{bmatrix}. \quad (28)$$
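The matrices of (28) can be coded directly; a good self-check is that $Q(q_a)\,q_b$ and $W(q_b)\,q_a$ both evaluate to the same quaternion product $q_a \otimes q_b$. A sketch (our naming):

```python
import numpy as np

def Q_mat(q):
    """Left-multiplication matrix Q(q) of (28): Q(a) b = a ⊗ b."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0, -qz,  qy],
                     [qy,  qz,  q0, -qx],
                     [qz, -qy,  qx,  q0]])

def W_mat(q):
    """Right-multiplication matrix W(q) of (28): W(b) a = a ⊗ b."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0,  qz, -qy],
                     [qy, -qz,  q0,  qx],
                     [qz,  qy, -qx,  q0]])
```

For a unit quaternion, both matrices are orthogonal, which is exactly property (29).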
Moreover, these two matrices are orthogonal [16]; that is,

$$[Q(q)]^T Q(q) = [W(q)]^T W(q) = I. \quad (29)$$

Thus,

$$Q(q_{Ai})\, q_X = W(q_{Bi})\, q_X,$$

$$\left\| Q(q_{Ai}) q_X - W(q_{Bi}) q_X \right\|^2 = \left[ Q(q_{Ai}) q_X - W(q_{Bi}) q_X \right]^T \left[ Q(q_{Ai}) q_X - W(q_{Bi}) q_X \right] = q_X^T \left( 2I - [Q(q_{Ai})]^T W(q_{Bi}) - [W(q_{Bi})]^T Q(q_{Ai}) \right) q_X = q_X^T \left[ 2I - C_i \right] q_X, \quad (30)$$

where

$$C_i = [Q(q_{Ai})]^T W(q_{Bi}) + [W(q_{Bi})]^T Q(q_{Ai}). \quad (31)$$
Thus, the total error function allowing us to compute $q_X$ becomes

$$f(q_X) = \sum_{i=1}^{n} q_X^T \left[ 2I - C_i \right] q_X = q_X^T \left[ 2nI - \sum_{i=1}^{n} C_i \right] q_X = q_X^T \left[ 2nI - S \right] q_X, \quad (32)$$
where $n$ is the number of positions of the camera and

$$S = \sum_{i=1}^{n} C_i. \quad (33)$$

Since $q_X$ is a unit quaternion, it can be obtained by solving the following problem:

$$\min f(q_X) = \min\, q_X^T \left[ 2nI - S \right] q_X \quad \text{subject to } q_X^T q_X = 1. \quad (34)$$
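Problem (34) is a Rayleigh-quotient minimization: the optimal unit $q_X$ is the eigenvector of $[2nI - S]$ belonging to its smallest eigenvalue. A self-contained sketch (our naming). Note that with all rotations sharing one axis, $q_X$ is determined only up to a rotation about that axis; any such minimizer yields the same $R_{21}$ through (20):

```python
import numpy as np

def solve_qX(qA_list, qB_list):
    """Unit quaternion qX minimizing (32), i.e. solving problem (34),
    as the eigenvector of [2nI - S] for the smallest eigenvalue."""
    def Q(q):  # left quaternion-product matrix, per (28)
        q0, qx, qy, qz = q
        return np.array([[q0, -qx, -qy, -qz], [qx, q0, -qz, qy],
                         [qy, qz, q0, -qx], [qz, -qy, qx, q0]])
    def W(q):  # right quaternion-product matrix, per (28)
        q0, qx, qy, qz = q
        return np.array([[q0, -qx, -qy, -qz], [qx, q0, qz, -qy],
                         [qy, -qz, q0, qx], [qz, qy, -qx, q0]])
    n = len(qA_list)
    # S per (31) and (33)
    S = sum(Q(a).T @ W(b) + W(b).T @ Q(a) for a, b in zip(qA_list, qB_list))
    M = 2.0 * n * np.eye(4) - S
    vals, vecs = np.linalg.eigh(M)       # symmetric, so eigh applies
    return vecs[:, np.argmin(vals)]      # eigenvectors are already unit-norm
```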
4.2. Position of Rotation Axis. At position $\theta_i$, (21) can be written as

$$(I - R_{i1})\, T_X = T_{i1}. \quad (35)$$

When the rotation axis is fixed, $T_X$ is constant. Thus, given the values of $R_{i1}$ and $T_{i1}$, we can solve (35) for $T_X$.

Let $E_i = \left[ -T_{i1},\; I - R_{i1} \right]$ and $T_Y = \left[ 1,\; T_X^T \right]^T$; thus

$$E_i T_Y = 0, \qquad \left\| E_i T_Y \right\|_2^2 = T_Y^T E_i^T E_i T_Y. \quad (36)$$
Let $n$ denote the number of positions of the camera. The total error function is

$$g(T_Y) = \sum_{i=1}^{n} T_Y^T E_i^T E_i T_Y = T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y. \quad (37)$$

Since the camera rotation axis is approximately perpendicular to $T_X$, the value of $t_z$ approaches zero. Thus, $T_Y = [t_{y0}, t_{y1}, t_{y2}, t_{y3}]$ can be obtained by solving the following problem:

$$\min g(T_Y) = \min\, T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y \quad \text{subject to } t_{y0} = 1,\; t_{y3} < 1. \quad (38)$$
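Problem (38) can also be approached as a rank-deficient linear least-squares problem on the stacked copies of (35): each $(I - R_{i1})$ annihilates the axis direction, so the component of $T_X$ along the axis is unobservable. A sketch (our naming): the minimum-norm solution returned by `lstsq` suppresses that unobservable component, playing a role comparable to the paper's constraint that the $t_z$ component stay small:

```python
import numpy as np

def solve_TX(R_list, T_list):
    """Least-squares position of the rotation axis from (35): (I - Ri1) TX = Ti1."""
    A = np.vstack([np.eye(3) - R for R in R_list])
    b = np.concatenate([np.asarray(T).ravel() for T in T_list])
    # lstsq returns the minimum-norm solution of the rank-deficient system
    TX, *_ = np.linalg.lstsq(A, b, rcond=None)
    return TX
```

Any returned $T_X$ differs from the true axis point only by a shift along the axis direction, which does not change the motion predicted by (20)-(21).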
5. Calculation of Position and Orientation of Rotation Axis and 3D Coordinate
5.1. Calculation of Position and Orientation of Rotation Axis. Based on the previous discussions, the complete process of calculating the position and orientation of the rotation axis is outlined below.

(1) The chessboard (see Figure 3) is printed and pasted on a plane. By rotating and moving the camera properly, a set of chessboard images is collected. Then the values of the intrinsic parameters of the camera, $f_x$, $f_y$, $c_x$, and $c_y$, are obtained by calling the method proposed in [14].

Figure 3: Chessboard.

(2) The camera is fixed on a camera tripod, and the rotation axis of the camera, which is parallel to the $z$ axis of the sensor coordinate system, lies in an approximately horizontal plane.

(3) By rotating the camera around the fixed axis to different positions, another set of chessboard images is collected.

(4) The extrinsic parameters of the camera at the positions $\theta_i$, $R_i$ and $T_i$, are obtained by calling the function cvFindExtrinsicCameraParams2 in OpenCV [20].

(5) The rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$, $R_{i1}$ and $T_{i1}$, are figured out by (12). The rotation angle $\beta_{i1}$ and its corresponding rotation matrix $R_{\beta_{i1}}$ are also calculated by (2) and (14).

(6) $R_{i1}$ and $R_{\beta_{i1}}$ are converted into quaternions $q_{Ai}$ and $q_{Bi}$, respectively, by using the method proposed by Bar-Itzhack [18].

(7) $q_X$ is found by solving problem (34). As a result, $R_X$ can be obtained.

(8) $T_X$ is obtained by solving problem (38).
5.2. 3D Coordinate Calculation. Based on the previous discussions, we present the complete process of 3D coordinate calculation as follows.

(1) The camera is fixed on a camera tripod whose rotation axis lies in an approximately horizontal plane.

(2) At a certain position $\theta_1$, an image of the object, $I_1$, is collected.

(3) By rotating the camera around the fixed axis to another position $\theta_2$, we get another image of the object, $I_2$.

(4) The rotation angle $\beta_{21}$ and its corresponding rotation matrix $R_{\beta_{21}}$ are figured out by (2) and (14).

(5) With $R_{\beta_{21}}$, $R_{21}$ and $T_{21}$ are calculated by (20) and (21).

(6) The image coordinates of a point of the object, $(X_1, Y_1)$, on the image $I_1$ are appointed manually.

(7) The corresponding image coordinates in the image $I_2$, $(X_2, Y_2)$, can be determined by a stereo correspondence method, for example, the function FindStereoCorrespondenceBM in OpenCV [20], or manually.

(8) The 3D coordinates of the point on the object relative to $O_1\text{-}xyz$, $(x_1, y_1, z_1)$, can be figured out by (6).
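Measurement steps (4)-(8) chain the earlier formulas into one closed-form computation. A self-contained sketch (our naming), assuming the calibration products $f_x, f_y, c_x, c_y, R_X, T_X$ are already available:

```python
import numpy as np

def measure_point(X1, Y1, X2, beta21, fx, fy, cx, cy, RX, TX):
    """3D point in O1-xyz from matched image coordinates and the rotation
    angle, chaining (14), (20), (21), and (6)-(7)."""
    c, s = np.cos(beta21), np.sin(beta21)
    R_beta = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # (14)
    R21 = RX @ R_beta @ RX.T                                          # (20)
    T21 = TX - R21 @ TX                                               # (21)
    r1, r2, r3 = R21[0]
    r7, r8, r9 = R21[2]
    tx, tz = T21[0], T21[2]
    B = (X2 - cx) * (r7*fy*(X1 - cx) + r8*fx*(Y1 - cy) + r9*fy*fx) \
        - fx * (fy*r1*(X1 - cx) + fx*r2*(Y1 - cy) + fx*r3*fy)         # (7)
    z1 = fx * fy * (fx*tx + cx*tz - X2*tz) / B                        # (6)
    return np.array([(X1 - cx)*z1/fx, (Y1 - cy)*z1/fy, z1])
```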
6. Experiments

Since a digital camera with a 3-axis accelerometer sensor was not available to us, an iPhone 4, which has the sensor, was adopted. To simulate a digital camera, the phone was placed in a box fixed on a camera tripod, and it was ensured that the $z$ axis of the sensor (experimental results show that this axis is parallel to the optical axis of the camera) was parallel to the rotation axis of the camera tripod, so that the value of $G_z$ kept steady in the course of the rotation. In the calibration course, the distance between the camera and the chessboard was about 1000 mm.

Figure 4 illustrates the curve of the quaternion of the rotation matrix between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The quaternion $q' = [q'_0, q'_x, q'_y, q'_z]$ was calculated by the proposed method from the rotation angle, while $q = [q_0, q_x, q_y, q_z]$ was converted directly from the rotation matrix $R_{i1}$, which came from the calibration data. From the graph, one can see that the proposed method can calculate the rotation matrix from the rotation angle.

Figure 5 plots the curve of the translation vector between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The vector $T' = [T'_x, T'_y, T'_z]$ was calculated by the proposed method from the rotation angle, while $T = [T_x, T_y, T_z]$ came directly from the calibration data $T_{i1}$. From the graphs, one can see that the proposed method can effectively calculate the translation vector from the rotation angle.

In order to estimate the accuracy of the 3D coordinates calculated by the proposed method, a chessboard with a bigger block than the one used for calibration was printed. The width of a block is 46.6 mm, and the distance between the chessboard and the camera is about 1200 mm. The distance between two neighboring corners, rather than the distance between the camera and the chessboard, is calculated, because manual measurement of the former is easier. For simplicity, the corners of the blocks in the images are automatically detected by calling the OpenCV function cvFindChessboardCorners [20]. Figure 6 depicts the measurement error ratio of the distance between two corners with respect to the rotation angle.
Figure 4: Curve of the quaternion of the rotation matrix with respect to the rotation angle. $q = [q_0, q_x, q_y, q_z]$ is converted from the rotation matrix, and $q' = [q'_0, q'_x, q'_y, q'_z]$ is calculated by the proposed method.

Figure 5: Curve of the translation vector with respect to the rotation angle. $T = [T_x, T_y, T_z]$ came directly from calibration data, and $T' = [T'_x, T'_y, T'_z]$ was calculated by (21) from the rotation angle.

7. Conclusions

This paper proposed a stereo vision system with a single camera, which requires a digital camera with a 3-axis accelerometer sensor rotating around a fixed axis that is parallel to the $z$ axis of the sensor. Under these conditions, the slope angle relative to gravity, which can be figured out from the readouts of the sensor, can determine the camera position, and the rotation angle between two positions can determine the rotation matrix and translation vector from one coordinate system of the camera to another. Accordingly, given the rotation angle and the images of the object at two different positions, one before and the other after camera rotation, the 3D coordinates of the points on the object can be determined. Theoretical analysis and experimental results show the validity of our method.
Figure 6: Measurement error ratio (%) of the distance between two corners with respect to the rotation angle.

It should be noticed that few digital cameras are provided with a 3-axis accelerometer sensor. However, to obtain stereo vision, we believe that embedding this inexpensive sensor in a digital camera is worthwhile. Moreover, due to the higher image quality and larger focus range of such a camera, higher accuracy and a larger range of measurement may be obtained. Furthermore, smart phones which have the sensor are popular. If a mini fixed rotation axis is built into a corner of the phone and does not move in the course of rotation, then with the proposed method the phone may estimate the size of the object being focused on and the distance between the phone and the object.
Acknowledgment

This work is supported by the Science and Technology Research Project of Chongqing's Education Committee (KJ110806).
References

[1] Y. Sooyeong and N. Ahuja, "An omnidirectional stereo vision system using a single camera," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), pp. 861–865, August 2006.
[2] D. Marr and T. Poggio, "A computational theory of human stereo vision," Proceedings of the Royal Society of London, vol. 204, no. 1156, pp. 301–328, 1979.
[3] F. Tombari, S. Mattoccia, L. D. Stefano, and E. Addimanda, "Classification and evaluation of cost aggregation methods for stereo correspondence," in Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR '08), pp. 1–8, June 2008.
[4] H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, no. 5828, pp. 133–135, 1981.
[5] J. Gluckman and S. K. Nayar, "Catadioptric stereo using planar mirrors," International Journal of Computer Vision, vol. 44, no. 1, pp. 65–79, 2001.
[6] D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 528–541, 2000.
[7] T. Svoboda and T. Pajdla, "Epipolar geometry for central catadioptric cameras," International Journal of Computer Vision, vol. 49, no. 1, pp. 23–37, 2002.
[8] D. Southwell, A. Basu, M. Fiala, and J. Reyda, "Panoramic stereo," in Proceedings of the International Conference on Pattern Recognition, vol. 1, pp. 378–382, 1996.
[9] A. Criminisi, I. Reid, and A. Zisserman, "Single view metrology," International Journal of Computer Vision, vol. 40, no. 2, pp. 123–148, 2000.
[10] A. Criminisi, "Single-view metrology: algorithms and applications," in Proceedings of the 24th DAGM Symposium on Pattern Recognition, vol. 2449 of Lecture Notes in Computer Science, pp. 224–239, 2002.
[11] G. Wang, Z. Hu, F. Wu, and H. Tsui, "Single view metrology from scene constraints," Image and Vision Computing, vol. 23, no. 9, pp. 831–840, 2005.
[12] A. Saxena, S. H. Chung, and A. Y. Ng, "3-D depth reconstruction from a single still image," International Journal of Computer Vision, vol. 76, no. 1, pp. 53–69, 2008.
[13] J. Rekimoto, "Tilting operations for small screen interfaces," in Proceedings of the 9th ACM Annual Symposium on User Interface Software and Technology, pp. 167–168, November 1996.
[14] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[15] J. C. K. Chou and M. Kamel, "Finding the position and orientation of a sensor on a robot manipulator using quaternions," International Journal of Robotics Research, vol. 10, no. 3, pp. 240–254, 1991.
[16] F. Dornaika and R. Horaud, "Simultaneous robot-world and hand-eye calibration," IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617–622, 1998.
[17] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 4647–4653, October 2006.
[18] I. Y. Bar-Itzhack, "New method for extracting the quaternion from a rotation matrix," Journal of Guidance, Control, and Dynamics, vol. 23, no. 6, pp. 1085–1087, 2000.
[19] M. W. Walker, L. Shao, and R. A. Volz, "Estimating 3-D location parameters using dual number quaternions," Image Understanding, vol. 54, no. 3, pp. 358–367, 1991.
[20] G. Bradski and A. Kaehler, Learning OpenCV, O'Reilly Media, 2008.
ISRN Machine Vision
Figure 1: Position and rotation angle of the camera. $G'$ denotes the division of gravity on the $x$-$y$ plane of the sensor coordinate system; $G_x$ and $G_y$ denote the divisions of gravity on the $x$ and $y$ axes of the sensor coordinate system, respectively; $\theta_1$ and $\theta_2$ denote the camera positions; and $\beta_{21}$ indicates the rotation angle.
angle. Then, with camera calibration data at various positions, we calculate the orientation and position of the rotation axis relative to the camera coordinate system. Accordingly, at two given positions, the rotation matrix and translation vector from one coordinate system of the camera to the other can be calculated from the rotation angle, and with the two images collected at the two positions, the 3D coordinates of the points on the object can be determined.

The paper is organized as follows. Section 2 presents a stereo vision system built from a single camera through rotation, and Section 3 describes the method for calculating the rotation matrix and translation vector from one coordinate system of the camera to another by the rotation angle. The method for calculating the position and orientation of the rotation axis relative to the camera coordinate system is proposed in Section 4, and the complete calibration and 3D coordinate calculation of a point on an object are presented in Section 5. Experimental results are given in Section 6, and conclusions are drawn in Section 7.
2 Stereo Vision through Camera Rotation
In what follows we adopt the following hypotheses.

(H1) The camera is provided with a 3-axis accelerometer sensor whose readouts $G_x$, $G_y$, and $G_z$ are the divisions of gravity on the $x$, $y$, and $z$ axes of the sensor coordinate system, respectively.

(H2) In the calibration and measurement processes, the camera rotates around a fixed rotation axis which is parallel to the $z$ axis of the sensor coordinate system and is not parallel to the direction of gravity.

Thus, in the course of the rotation, the readout $G_z$ keeps steady. As a result, the division of gravity on the $x$-$y$ plane of the sensor coordinate system, $G'$, also keeps steady (see Figure 1). Therefore, the slope angle with respect to the gravity division $G'$ (position for short) can be determined by

$$\theta = \arctan\left(\frac{G_x}{G_y}\right). \quad (1)$$
From position $\theta_1$ to $\theta_2$, the rotation angle of the device, $\beta_{21}$, is governed by

$$\beta_{21} = \theta_2 - \theta_1. \quad (2)$$
Two images $I_1$ and $I_2$ of an object are collected at the positions $\theta_1$ and $\theta_2$, respectively. $O_1$-$xyz$ and $O_2$-$xyz$ denote the camera coordinate systems at $\theta_1$ and $\theta_2$, respectively; $(x_1, y_1, z_1)$ and $(x_2, y_2, z_2)$ denote the coordinates of a point on the object relative to $O_1$-$xyz$ and $O_2$-$xyz$, respectively; and $(X_1, Y_1)$ and $(X_2, Y_2)$ denote the image coordinates of the point of the object on $I_1$ and $I_2$ (image coordinates for short), respectively.

The projection of the object coordinates relative to $O_1$-$xyz$ and $O_2$-$xyz$ into image coordinates takes the following form:
$$s_1 \begin{bmatrix} X_1 \\ Y_1 \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix}, \qquad s_2 \begin{bmatrix} X_2 \\ Y_2 \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix}. \quad (3)$$
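A minimal sketch of the pinhole projection in (3), assuming made-up values for the intrinsics $f_x$, $f_y$, $c_x$, $c_y$:

```python
import numpy as np

def project(K, p_cam):
    """Eq. (3): project a camera-frame point to pixel coordinates."""
    s = K @ p_cam        # s * [X, Y, 1]^T
    return s[:2] / s[2]  # divide out the scale factor s

# Hypothetical intrinsic matrix (fx = fy = 800, cx = 320, cy = 240)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
X, Y = project(K, np.array([0.1, -0.05, 1.0]))  # -> (400.0, 200.0)
```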
Let

$$\begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix} = \begin{bmatrix} R_{21} & T_{21} \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \\ 1 \end{bmatrix} = \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \\ 1 \end{bmatrix}, \quad (4)$$

where $R_{21}$ and $T_{21}$ denote the rotation matrix and translation vector between $O_2$-$xyz$ and $O_1$-$xyz$.
Substituting (3) into (4) we get
1205882[
[
1198832
1198842
1
]
]
= [
[
1198911199091199031+ 1198881199091199037
1198911199091199032+ 1198881199091199038
1198911199091199033+ 1198881199091199039
119891119909119905119909+ 119888119909119905119911
1198911199101199034+ 1198881199101199037
1198911199101199035+ 1198881199101199038
1198911199101199036+ 1198881199101199039
119891119910119905119910+ 119888119910119905119911
1199037
1199038
1199039
119905119911
]
]
times
[[[[[[[
[
(1198831minus 119888119909)
119911
119891119909
(1198841minus 119888119910)
119911
119891119910
1199111
1
]]]]]]]
]
(5)
Thus the 3D coordinate of the point on the object relative to $O_1$-$xyz$ (the object coordinate) can be determined by

$$x_1 = \frac{(X_1 - c_x) z_1}{f_x}, \qquad y_1 = \frac{(Y_1 - c_y) z_1}{f_y}, \qquad z_1 = \frac{f_x f_y (f_x t_x + c_x t_z - X_2 t_z)}{B}, \quad (6)$$

where

$$B = (X_2 - c_x)\left[r_7 f_y (X_1 - c_x) + r_8 f_x (Y_1 - c_y) + r_9 f_y f_x\right] - f_x\left[f_y r_1 (X_1 - c_x) + f_x r_2 (Y_1 - c_y) + f_x r_3 f_y\right]. \quad (7)$$
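A sketch of (6)-(7) in Python; the check uses a pure sideways translation ($R_{21} = I$), for which the two pixel pairs below were generated by hand from a known 3D point:

```python
import numpy as np

def object_coordinate(K, R21, T21, pt1, pt2):
    """Eqs. (6)-(7): 3D point in O1-xyz from a matched pixel pair."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    (X1, Y1), (X2, Y2) = pt1, pt2
    r1, r2, r3 = R21[0]
    r7, r8, r9 = R21[2]
    tx, ty, tz = T21
    B = ((X2 - cx) * (r7 * fy * (X1 - cx) + r8 * fx * (Y1 - cy) + r9 * fy * fx)
         - fx * (fy * r1 * (X1 - cx) + fx * r2 * (Y1 - cy) + fx * r3 * fy))  # Eq. (7)
    z1 = fx * fy * (fx * tx + cx * tz - X2 * tz) / B                         # Eq. (6)
    return np.array([(X1 - cx) * z1 / fx, (Y1 - cy) * z1 / fy, z1])

K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R21 = np.eye(3)
T21 = np.array([-0.5, 0.0, 0.0])  # pure sideways translation for the check
# Both pixel pairs were produced by projecting the point (0.1, 0.2, 2.0)
P = object_coordinate(K, R21, T21, (360.0, 320.0), (160.0, 320.0))
```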
From (6), we can see that the object coordinate can be found provided that the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$, the rotation matrix $R_{21}$, and the translation vector $T_{21}$ are available.

With the camera calibration method proposed by Zhang [14], the camera intrinsic parameters and the extrinsic parameters describing the camera motion around a static scene can be figured out. Let $R_1$ and $T_1$ denote the extrinsic parameters at $\theta_1$, and $R_2$ and $T_2$ the extrinsic parameters at $\theta_2$. For simplicity, $P_w = (x, y, z)$, $P_1 = (x_1, y_1, z_1)$, and $P_2 = (x_2, y_2, z_2)$ stand for the coordinates of the point on the object relative to the world coordinate system, $O_1$-$xyz$, and $O_2$-$xyz$, respectively. Thus

$$P_1 = R_1 P_w + T_1, \qquad P_2 = R_2 P_w + T_2. \quad (8)$$
From (8) we get

$$P_w = R_1^{-1} P_1 - R_1^{-1} T_1. \quad (9)$$

Substituting (9) into (8), we get

$$P_2 = R_2 R_1^{-1} P_1 - R_2 R_1^{-1} T_1 + T_2. \quad (10)$$

Equation (4) can be rewritten as

$$P_2 = R_{21} P_1 + T_{21}. \quad (11)$$

Thus

$$R_{21} = R_2 R_1^{-1}, \qquad T_{21} = T_2 - R_2 R_1^{-1} T_1. \quad (12)$$
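Equation (12) in code; since $R_1$ is orthonormal, its transpose serves as the inverse. The usage below verifies (11) on a synthetic pair of poses (the pose values are made up):

```python
import numpy as np

def relative_pose(R1, T1, R2, T2):
    """Eq. (12): R21, T21 between the camera poses at theta_1 and theta_2."""
    R21 = R2 @ R1.T          # R1 orthonormal, so R1^{-1} = R1^T
    T21 = T2 - R21 @ T1
    return R21, T21

def rz(a):
    """Rotation by angle a about the z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Synthetic extrinsics at two positions and one world point
R1, T1 = rz(0.3), np.array([1.0, 2.0, 3.0])
R2, T2 = rz(0.8), np.array([0.0, 1.0, 2.0])
R21, T21 = relative_pose(R1, T1, R2, T2)
Pw = np.array([0.5, -0.2, 1.5])
P1, P2 = R1 @ Pw + T1, R2 @ Pw + T2  # Eq. (8); P2 must equal R21 @ P1 + T21
```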
Based on the previous discussion, the object coordinate can be determined by the following process.

(1) The camera is calibrated and the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$ are obtained.

(2) At the positions $\theta_1$ and $\theta_2$, the camera is calibrated and the corresponding extrinsic parameters $R_1$, $T_1$, $R_2$, and $T_2$ are obtained. Then $R_{21}$ and $T_{21}$ can be acquired from (12).

(3) In the course of measurement, the camera is rotated carefully to the positions $\theta_1$ and $\theta_2$. At these two positions, the two images of the object, $I_1$ and $I_2$, are collected. Once the image coordinates of the object $(X_1, Y_1)$ and $(X_2, Y_2)$ are known, the object coordinate can be figured out by (6).

It should be noticed that it is rather difficult for the user to rotate the camera accurately to the positions $\theta_1$ and $\theta_2$ where the camera was calibrated.
3 Calculation Method of Rotation Matrix and Translation Vector

Intuitively, the orientation and position of $O_2$-$xyz$ relative to $O_1$-$xyz$, $R_{21}$ and $T_{21}$, depend on the camera rotation angle $\beta_{21}$. Thus $R_{21}$ and $T_{21}$ may be figured out from $\beta_{21}$.

Let $O_R$-$xyz$ be the coordinate system associated with the rotation axis, which is a unit vector on the $z$ axis of that coordinate system. Let $R_X$ and $T_X$ denote the orientation and position of $O_R$-$xyz$ relative to $O_1$-$xyz$ (the orientation and position of the rotation axis). Its homogeneous matrix can be written as

$$M_X = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix}. \quad (13)$$
The rotation matrix for the camera rotation around the $z$ axis of $O_R$-$xyz$ from $\theta_1$ to $\theta_2$ (the device rotation matrix) can be modeled as

$$R_{\beta_{21}} = \begin{bmatrix} \cos\beta_{21} & -\sin\beta_{21} & 0 \\ \sin\beta_{21} & \cos\beta_{21} & 0 \\ 0 & 0 & 1 \end{bmatrix}. \quad (14)$$

Its homogeneous matrix can be written as

$$\mathrm{Rot}(\beta_{21}) = \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}. \quad (15)$$

The coordinate system transformation can be represented as a graph (Figure 2). A directed edge represents a relationship between two coordinate systems and is associated with a homogeneous transformation. From Figure 2 we can get

$$M_{21} M_X = M_X \mathrm{Rot}(\beta_{21}), \quad (16)$$

where

$$M_{21} = \begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix}. \quad (17)$$

Substituting (13), (15), and (17) into (16), we get

$$\begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}, \quad (18)$$

$$\begin{bmatrix} R_{21} R_X & R_{21} T_X + T_{21} \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X R_{\beta_{21}} & T_X \\ 0 & 1 \end{bmatrix}, \quad (19)$$

$$R_{21} = R_X R_{\beta_{21}} R_X^{-1}, \quad (20)$$

$$T_{21} = T_X - R_{21} T_X. \quad (21)$$
Figure 2: Coordinate system transformation represented by a graph. Directed edges $M_{21}$, $M_X$, and $\mathrm{Rot}(\beta_{21})$ relate the coordinate systems $O_1$-$xyz$ and $O_2$-$xyz$.
From (20) and (21), we can see that $R_{21}$ and $T_{21}$ can be calculated from the rotation angle $\beta_{21}$, provided that $R_X$ and $T_X$ are available.
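Equations (14), (20), and (21) combine into a short routine (a sketch; the axis pose $R_X$, $T_X$ would come from the calibration of Section 4, and the values below are made up for the check):

```python
import numpy as np

def pose_from_angle(RX, TX, beta21):
    """Eqs. (14), (20), (21): R21, T21 from the rotation angle beta21."""
    c, s = np.cos(beta21), np.sin(beta21)
    R_beta = np.array([[c, -s, 0.0],
                       [s, c, 0.0],
                       [0.0, 0.0, 1.0]])   # Eq. (14)
    R21 = RX @ R_beta @ RX.T               # Eq. (20), RX orthonormal
    T21 = TX - R21 @ TX                    # Eq. (21)
    return R21, T21

# With RX = I and the axis offset along x, a 90-degree rotation
# moves the origin of O2-xyz by (1, -1, 0) relative to O1-xyz.
R21, T21 = pose_from_angle(np.eye(3), np.array([1.0, 0.0, 0.0]), np.pi / 2)
```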
4 Calculation Method of Position and Orientation of Rotation Axis

4.1 Orientation of Rotation Axis. Suppose that the camera is calibrated at different positions. At position $\theta_i$, we get the camera calibration data. Let $\beta_{i1} = \theta_i - \theta_1$, let $R_{\beta_{i1}}$ denote the device rotation matrix with respect to the rotation angle $\beta_{i1}$, and let $R_{i1}$ and $T_{i1}$ denote the rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$. Equation (20) can be written as

$$R_{i1} R_X = R_X R_{\beta_{i1}}. \quad (22)$$

When the rotation axis is fixed, $R_X$ is constant. Thus, given the values of $R_{i1}$, $T_{i1}$, and $\beta_{i1}$, we can solve (22) for $R_X$.
Using a normalized quaternion to define the rotation between two coordinate systems provides a simple and elegant way to formulate successive rotations [15–17]. Given a rotation matrix

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \quad (23)$$

it can be transformed to the quaternion with the following equation [18]:

$$q_0 = \frac{\sqrt{1 + r_1 + r_5 + r_9}}{2}, \qquad q_x = \frac{r_8 - r_6}{4 q_0}, \qquad q_y = \frac{r_3 - r_7}{4 q_0}, \qquad q_z = \frac{r_4 - r_2}{4 q_0}. \quad (24)$$
Similarly, the quaternion can be transformed to the rotation matrix with the following equation [18]:

$$R = \begin{bmatrix} q_0^2 + q_x^2 - q_y^2 - q_z^2 & 2(q_x q_y - q_0 q_z) & 2(q_x q_z + q_0 q_y) \\ 2(q_x q_y + q_0 q_z) & q_0^2 + q_y^2 - q_x^2 - q_z^2 & 2(q_y q_z - q_0 q_x) \\ 2(q_x q_z - q_0 q_y) & 2(q_y q_z + q_0 q_x) & q_0^2 + q_z^2 - q_x^2 - q_y^2 \end{bmatrix}. \quad (25)$$
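Equations (24) and (25) as a round trip in Python (valid when $1 + r_1 + r_5 + r_9 > 0$, i.e., the trace of $R$ exceeds $-1$):

```python
import numpy as np

def quat_from_rot(R):
    """Eq. (24): rotation matrix -> unit quaternion (assumes trace(R) > -1)."""
    q0 = np.sqrt(1.0 + R[0, 0] + R[1, 1] + R[2, 2]) / 2.0
    qx = (R[2, 1] - R[1, 2]) / (4.0 * q0)
    qy = (R[0, 2] - R[2, 0]) / (4.0 * q0)
    qz = (R[1, 0] - R[0, 1]) / (4.0 * q0)
    return np.array([q0, qx, qy, qz])

def rot_from_quat(q):
    """Eq. (25): unit quaternion -> rotation matrix."""
    q0, qx, qy, qz = q
    return np.array([
        [q0**2 + qx**2 - qy**2 - qz**2, 2 * (qx*qy - q0*qz), 2 * (qx*qz + q0*qy)],
        [2 * (qx*qy + q0*qz), q0**2 + qy**2 - qx**2 - qz**2, 2 * (qy*qz - q0*qx)],
        [2 * (qx*qz - q0*qy), 2 * (qy*qz + q0*qx), q0**2 + qz**2 - qx**2 - qy**2],
    ])

# Round trip through a rotation of 0.4 rad about the z axis
c, s = np.cos(0.4), np.sin(0.4)
Rz04 = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
q = quat_from_rot(Rz04)
```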
Let $q_{A_i}$, $q_X$, and $q_{B_i}$ denote the quaternions of $R_{i1}$, $R_X$, and $R_{\beta_{i1}}$, respectively. With quaternions, a sequence of rotations can be formulated as an equation without involving rotation matrices [15]. As a result, the problem of solving $R_{i1} R_X = R_X R_{\beta_{i1}}$ can be transformed into an equivalent problem involving the corresponding quaternions, as follows [15]:

$$q_{A_i} \otimes q_X = q_X \otimes q_{B_i}. \quad (26)$$

Since quaternion multiplication can be written in matrix form, with the notations introduced in [19] we have the following [16]:

$$q_{A_i} \otimes q_X = Q(q_{A_i}) q_X, \qquad q_X \otimes q_{B_i} = W(q_{B_i}) q_X, \quad (27)$$
where, letting $q = [q_0, q_x, q_y, q_z]$,

$$Q(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & -q_z & q_y \\ q_y & q_z & q_0 & -q_x \\ q_z & -q_y & q_x & q_0 \end{bmatrix}, \qquad W(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & q_z & -q_y \\ q_y & -q_z & q_0 & q_x \\ q_z & q_y & -q_x & q_0 \end{bmatrix}. \quad (28)$$

Moreover, these two matrices are orthogonal [16]; that is,

$$[Q(q)]^T Q(q) = [W(q)]^T W(q) = I. \quad (29)$$
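The matrices of (28) and the identities (27) and (29) can be checked directly (a sketch; `Q` and `W` mirror the definitions above, and the two unit quaternions are arbitrary):

```python
import numpy as np

def Q(q):
    """Left multiplication matrix of Eq. (28): a (x) p = Q(a) p."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0, -qz,  qy],
                     [qy,  qz,  q0, -qx],
                     [qz, -qy,  qx,  q0]])

def W(q):
    """Right multiplication matrix of Eq. (28): p (x) b = W(b) p."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0,  qz, -qy],
                     [qy, -qz,  q0,  qx],
                     [qz,  qy, -qx,  q0]])

# Two unit quaternions: Q(a) @ b and W(b) @ a both evaluate the product
# a (x) b (Eq. 27), and both matrices are orthogonal for unit q (Eq. 29).
a = np.array([np.cos(0.3), np.sin(0.3), 0.0, 0.0])
b = np.array([np.cos(0.7), 0.0, np.sin(0.7), 0.0])
```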
Thus $Q(q_{A_i}) q_X = W(q_{B_i}) q_X$, and

$$\begin{aligned} \left\|Q(q_{A_i}) q_X - W(q_{B_i}) q_X\right\|^2 &= \left[Q(q_{A_i}) q_X - W(q_{B_i}) q_X\right]^T \left[Q(q_{A_i}) q_X - W(q_{B_i}) q_X\right] \\ &= q_X^T \left\{2I - [Q(q_{A_i})]^T W(q_{B_i}) - [W(q_{B_i})]^T Q(q_{A_i})\right\} q_X \\ &= q_X^T \left[2I - C_i\right] q_X, \end{aligned} \quad (30)$$

where

$$C_i = [Q(q_{A_i})]^T W(q_{B_i}) + [W(q_{B_i})]^T Q(q_{A_i}). \quad (31)$$

Thus the total error function allowing us to compute $q_X$ becomes

$$f(q_X) = \sum_{i=1}^{n} q_X^T \left[2I - C_i\right] q_X = q_X^T \left[2nI - \sum_{i=1}^{n} C_i\right] q_X = q_X^T \left[2nI - S\right] q_X, \quad (32)$$
where $n$ is the number of positions of the camera and

$$S = \sum_{i=1}^{n} C_i. \quad (33)$$

Since $q_X$ is a unit quaternion, it can be obtained by solving the following problem:

$$\min f(q_X) = \min q_X^T \left[2nI - S\right] q_X \quad \text{s.t.} \quad q_X^T q_X = 1. \quad (34)$$
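Since $2nI - S$ is symmetric and positive semidefinite, problem (34) is a Rayleigh-quotient minimization: the optimum is a unit eigenvector for the smallest eigenvalue. A self-contained sketch (the helpers `qmul`, `Qm`, `Wm` restate the Hamilton product and (28); the ground-truth axis and angle list are made up for the check):

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of two quaternions."""
    a0, ax, ay, az = a
    b0, bx, by, bz = b
    return np.array([a0*b0 - ax*bx - ay*by - az*bz,
                     a0*bx + ax*b0 + ay*bz - az*by,
                     a0*by - ax*bz + ay*b0 + az*bx,
                     a0*bz + ax*by - ay*bx + az*b0])

def Qm(q):  # left multiplication matrix, Eq. (28)
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz], [qx, q0, -qz, qy],
                     [qy, qz, q0, -qx], [qz, -qy, qx, q0]])

def Wm(q):  # right multiplication matrix, Eq. (28)
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz], [qx, q0, qz, -qy],
                     [qy, -qz, q0, qx], [qz, qy, -qx, q0]])

def solve_qx(C_list):
    """Eq. (34): minimize q^T [2nI - S] q over unit quaternions."""
    M = 2 * len(C_list) * np.eye(4) - sum(C_list)
    w, v = np.linalg.eigh(M)       # symmetric eigendecomposition
    return v[:, np.argmin(w)]      # unit eigenvector, smallest eigenvalue

# Synthetic data: a known axis orientation qX and device rotations about z
qX = np.array([np.cos(0.5), 0.0, np.sin(0.5), 0.0])
qX_conj = qX * np.array([1.0, -1.0, -1.0, -1.0])
C_list = []
for ang in (0.2, 0.5, 0.9):
    qB = np.array([np.cos(ang / 2), 0.0, 0.0, np.sin(ang / 2)])
    qA = qmul(qmul(qX, qB), qX_conj)                      # so qA (x) qX = qX (x) qB
    C_list.append(Qm(qA).T @ Wm(qB) + Wm(qB).T @ Qm(qA))  # Eq. (31)
q_est = solve_qx(C_list)
# q_est satisfies Eq. (26) for every collected pair
```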
4.2 Position of Rotation Axis. At position $\theta_i$, (21) can be written as

$$(I - R_{i1}) T_X = T_{i1}. \quad (35)$$

When the rotation axis is fixed, $T_X$ is constant. Thus, given the values of $R_{i1}$ and $T_{i1}$, we can solve (35) for $T_X$. Let $E_i = [-T_{i1}, I - R_{i1}]$ and $T_Y = [1, T_X^T]^T$; thus

$$E_i T_Y = 0, \qquad \left\|E_i T_Y\right\|_2^2 = T_Y^T E_i^T E_i T_Y. \quad (36)$$
Let $n$ denote the number of positions of the camera. The total error function is

$$g(T_Y) = \sum_{i=1}^{n} T_Y^T E_i^T E_i T_Y = T_Y^T \left[\sum_{i=1}^{n} E_i^T E_i\right] T_Y. \quad (37)$$

Since the camera rotation axis is approximately perpendicular to $T_X$, the value of $t_z$ approaches zero. Thus $T_Y = [t_{y0}, t_{y1}, t_{y2}, t_{y3}]$ can be obtained by solving the following problem:

$$\min g(T_Y) = \min T_Y^T \left[\sum_{i=1}^{n} E_i^T E_i\right] T_Y \quad \text{s.t.} \quad t_{y0} = 1, \; t_{y3} < 1. \quad (38)$$
5 Calculation of Position and Orientation of Rotation Axis and 3D Coordinate

5.1 Calculation of Position and Orientation of Rotation Axis. Based on the previous discussion, the complete process of calculating the position and orientation of the rotation axis is outlined below.

(1) The chessboard (see Figure 3) is printed and pasted on a plane. By rotating and moving the camera properly, a set of chessboard images is collected. Then the values of the intrinsic parameters of the camera, $f_x$, $f_y$, $c_x$, and $c_y$, are obtained by the method proposed in [14].

Figure 3: Chessboard.

(2) The camera is fixed on a camera tripod, and the rotation axis of the camera, which is parallel to the $z$ axis of the sensor coordinate system, lies in an approximately horizontal plane.

(3) By rotating the camera around the fixed axis to different positions, another set of chessboard images is collected.

(4) The extrinsic parameters of the camera at the positions $\theta_i$, $R_i$ and $T_i$, are obtained by calling the function cvFindExtrinsicCameraParams2 in OpenCV [20].

(5) The rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$, $R_{i1}$ and $T_{i1}$, are figured out by (12). The rotation angle $\beta_{i1}$ and its corresponding rotation matrix $R_{\beta_{i1}}$ are also calculated by (2) and (14).

(6) $R_{i1}$ and $R_{\beta_{i1}}$ are converted into quaternions $q_{A_i}$ and $q_{B_i}$, respectively, by using the method proposed by Bar-Itzhack [18].

(7) $q_X$ is found by solving problem (34). As a result, $R_X$ can be obtained.

(8) $T_X$ is obtained by solving problem (38).
5.2 3D Coordinate Calculation. Based on the previous discussion, we present the complete process of 3D coordinate calculation as follows.

(1) The camera is fixed on a camera tripod whose rotation axis lies in an approximately horizontal plane.

(2) At a certain position $\theta_1$, the image of the object, $I_1$, is collected.

(3) By rotating the camera on the fixed axis to another position $\theta_2$, we get another image of the object, $I_2$.

(4) The rotation angle $\beta_{21}$ and its corresponding rotation matrix $R_{\beta_{21}}$ are figured out by (2) and (14).

(5) With $R_{\beta_{21}}$, $R_{21}$ and $T_{21}$ are calculated by (20) and (21).

(6) The image coordinates of the point of the object, $(X_1, Y_1)$, on the image $I_1$ are appointed manually.
(7) The corresponding image coordinates in the image $I_2$, $(X_2, Y_2)$, can be determined by a stereo correspondence method, for example, the function FindStereoCorrespondenceBM in OpenCV [20], or by hand.

(8) The 3D coordinate of the point on the object relative to $O_1$-$xyz$, $(x_1, y_1, z_1)$, can be figured out by (6).
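Steps (4)-(8) can be sketched end to end; the intrinsics and the axis pose $R_X$, $T_X$ below are made-up stand-ins for real calibration output, and the two pixel observations are synthesized from a known 3D point so that the result can be checked:

```python
import numpy as np

def measure_point(K, RX, TX, beta21, pt1, pt2):
    """Steps (4)-(8): rotation angle + two pixel measurements -> 3D point in O1-xyz."""
    c, s = np.cos(beta21), np.sin(beta21)
    R_beta = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])  # Eq. (14)
    R21 = RX @ R_beta @ RX.T                                         # Eq. (20)
    T21 = TX - R21 @ TX                                              # Eq. (21)
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    (X1, Y1), (X2, Y2) = pt1, pt2
    r1, r2, r3 = R21[0]
    r7, r8, r9 = R21[2]
    tx, ty, tz = T21
    B = ((X2 - cx) * (r7 * fy * (X1 - cx) + r8 * fx * (Y1 - cy) + r9 * fy * fx)
         - fx * (fy * r1 * (X1 - cx) + fx * r2 * (Y1 - cy) + fx * r3 * fy))  # Eq. (7)
    z1 = fx * fy * (fx * tx + cx * tz - X2 * tz) / B                         # Eq. (6)
    return np.array([(X1 - cx) * z1 / fx, (Y1 - cy) * z1 / fy, z1])

K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
RX, TX = np.eye(3), np.array([50.0, 0.0, 0.0])  # made-up axis pose
beta = 0.3
# Synthesize a consistent pair of pixel observations of the point P1
c, s = np.cos(beta), np.sin(beta)
R21 = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
T21 = TX - R21 @ TX
P1 = np.array([5.0, -3.0, 400.0])
P2 = R21 @ P1 + T21
u1 = (K @ P1)[:2] / P1[2]
u2 = (K @ P2)[:2] / P2[2]
P_est = measure_point(K, RX, TX, beta, u1, u2)
```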
6 Experiments

Since a digital camera with a 3-axis accelerometer sensor is not available to us, an iPhone 4, which has such a sensor, is adopted. To simulate a digital camera, the phone is placed in a box which is fixed on a camera tripod, and it is ensured that the $z$ axis of the sensor (experimental results show that this axis is parallel to the optical axis of the camera) is parallel to the rotation axis of the camera tripod, so that the value of $G_z$ keeps steady in the course of the rotation. In the calibration course, the distance between the camera and the chessboard is about 1000 mm.
Figure 4 illustrates the curve of the quaternion of the rotation matrix between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The quaternion $q' = [q'_0, q'_x, q'_y, q'_z]$ was calculated by the proposed method from the rotation angle, while $q = [q_0, q_x, q_y, q_z]$ was converted directly from the rotation matrix $R_{i1}$, which was obtained from the calibration data. From the graph, one can see that the proposed method can calculate the rotation matrix from the rotation angle.
Figure 5 plots the curve of the translation vector between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The vector $T' = [T'_x, T'_y, T'_z]$ was calculated by the proposed method from the rotation angle, while $T = [T_x, T_y, T_z]$ was taken directly from the calibration data $T_{i1}$. From the graphs, one can see that the proposed method can effectively calculate the translation vector from the rotation angle.

In order to estimate the accuracy of the 3D coordinates calculated by the proposed method, a chessboard with a bigger block than the one used for calibration is printed. The width of the block is 466 mm, and the distance between the chessboard and the camera is about 1200 mm. The distance between two neighboring corners, rather than the distance between the camera and the chessboard, is calculated, because manual measurement of the former is easier. For simplicity, the corners of the blocks in the images are automatically detected by calling the OpenCV function cvFindChessboardCorners [20]. Figure 6 depicts the measurement error ratio of the distance between two corners with respect to the rotation angle.
7 Conclusions

This paper proposed a stereo vision system with a single camera, which requires a digital camera with a 3-axis accelerometer sensor rotating around a fixed axis that is parallel to the $z$ axis of the sensor. Under these conditions, the slope angle relative to gravity, which can be figured out from the readouts of the sensor, determines the camera position, and the rotation angle between two positions determines the rotation matrix and translation vector from
Figure 4: Curve of the quaternion of the rotation matrix with respect to the rotation angle. $q = [q_0, q_x, q_y, q_z]$ is converted from the rotation matrix, and $q' = [q'_0, q'_x, q'_y, q'_z]$ is calculated by the proposed method.
Figure 5: Curve of the translation vector (mm) with respect to the rotation angle. $T = [T_x, T_y, T_z]$ was taken directly from the calibration data, and $T' = [T'_x, T'_y, T'_z]$ was calculated by the proposed method with the rotation angle.
one coordinate system of the camera to another. Accordingly, given the rotation angle and the images of the object at two different positions, one before and the other after camera rotation, the 3D coordinates of the points on the object can be determined. Theoretical analysis and experimental results show the validity of our method.

It should be noticed that few digital cameras are provided with a 3-axis accelerometer sensor. However, to obtain stereo vision, we believe that the inexpensive sensor embedded in a
Figure 6: Measurement error ratio (%) of the distance between two corners with respect to the rotation angle.
digital camera is worthwhile. Moreover, due to higher image quality and a larger focus range, higher accuracy and a larger measurement range may be obtained. Furthermore, smart phones which have the sensor are popular. If a mini fixed rotation axis were built into a corner of the phone, and the phone did not move relative to it in the course of rotation, then with the proposed method the phone could estimate the size of the object being focused on and the distance between the phone and the object.
Acknowledgment

This work is supported by the Science and Technology Research Project of Chongqing's Education Committee (KJ110806).
References

[1] Y. Sooyeong and N. Ahuja, "An omnidirectional stereo vision system using a single camera," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), pp. 861-865, August 2006.
[2] D. Marr and T. Poggio, "A computational theory of human stereo vision," Proceedings of the Royal Society of London, vol. 204, no. 1156, pp. 301-328, 1979.
[3] F. Tombari, S. Mattoccia, L. D. Stefano, and E. Addimanda, "Classification and evaluation of cost aggregation methods for stereo correspondence," in Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR '08), pp. 1-8, June 2008.
[4] H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, no. 5828, pp. 133-135, 1981.
[5] J. Gluckman and S. K. Nayar, "Catadioptric stereo using planar mirrors," International Journal of Computer Vision, vol. 44, no. 1, pp. 65-79, 2001.
[6] D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 528-541, 2000.
[7] T. Svoboda and T. Pajdla, "Epipolar geometry for central catadioptric cameras," International Journal of Computer Vision, vol. 49, no. 1, pp. 23-37, 2002.
[8] D. Southwell, A. Basu, M. Fiala, and J. Reyda, "Panoramic stereo," International Conference on Pattern Recognition, vol. 1, pp. 378-382, 1996.
[9] A. Criminisi, I. Reid, and A. Zisserman, "Single view metrology," International Journal of Computer Vision, vol. 40, no. 2, pp. 123-148, 2000.
[10] A. Criminisi, "Single-view metrology: algorithms and applications," in Proceedings of the 24th DAGM Symposium on Pattern Recognition, vol. 2449 of Lecture Notes in Computer Science, pp. 224-239, 2002.
[11] G. Wang, Z. Hu, F. Wu, and H. Tsui, "Single view metrology from scene constraints," Image and Vision Computing, vol. 23, no. 9, pp. 831-840, 2005.
[12] A. Saxena, S. H. Chung, and A. Y. Ng, "3-D depth reconstruction from a single still image," International Journal of Computer Vision, vol. 76, no. 1, pp. 53-69, 2008.
[13] J. Rekimoto, "Tilting operations for small screen interfaces," in Proceedings of the 9th ACM Annual Symposium on User Interface Software and Technology, pp. 167-168, November 1996.
[14] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000.
[15] J. C. K. Chou and M. Kamel, "Finding the position and orientation of a sensor on a robot manipulator using quaternions," International Journal of Robotics Research, vol. 10, no. 3, pp. 240-254, 1991.
[16] F. Dornaika and R. Horaud, "Simultaneous robot-world and hand-eye calibration," IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617-622, 1998.
[17] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 4647-4653, October 2006.
[18] I. Y. Bar-Itzhack, "New method for extracting the quaternion from a rotation matrix," Journal of Guidance, Control, and Dynamics, vol. 23, no. 6, pp. 1085-1087, 2000.
[19] M. W. Walker, L. Shao, and R. A. Volz, "Estimating 3-D location parameters using dual number quaternions," Image Understanding, vol. 54, no. 3, pp. 358-367, 1991.
[20] G. Bradski and A. Kaehler, Learning OpenCV, O'Reilly Media, 2008.
ISRNMachine Vision 3
Thus the 3D coordinate of the point on the object relativeto 1198741-119909119910119911 (object coordinate) can be determined by
1199091= (1198831minus 119888119909)1199111
119891119909
1199101= (1198841minus 119888119910)1199111
119891119910
1199111=
119891119909119891119910(119891119909119905119909+ 119888119909119905119911minus 1198832119905119911)
119861
(6)
where
119861 = (1198832minus 119888119909) [1199037119891119910(1198831minus 119888119909) + 1199038119891119909(1198841minus 119888119910) + 1199039119891119910119891119909]
minus 119891119909[1198911199101199031(1198831minus 119888119909) + 1198911199091199032(1198841minus 119888119910) + 1198911199091199033119891119910]
(7)
From (6) we can see that the object coordinate can be found provided that the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$, the rotation matrix $R_{21}$, and the translation vector $T_{21}$ are available.

With the camera calibration method proposed by Zhang [14], the camera intrinsic parameters and the extrinsic parameters describing the camera motion around a static scene can be figured out. Let $R_1$ and $T_1$ denote the extrinsic parameters at $\theta_1$, and $R_2$ and $T_2$ the extrinsic parameters at $\theta_2$. For simplicity, $P_w = (x, y, z)$, $P_1 = (x_1, y_1, z_1)$, and $P_2 = (x_2, y_2, z_2)$ stand for the coordinates of the point on the object relative to the world coordinate system, $O_1\text{-}xyz$, and $O_2\text{-}xyz$, respectively. Thus

$$P_1 = R_1 P_w + T_1, \quad P_2 = R_2 P_w + T_2. \tag{8}$$

From (8) we get

$$P_w = R_1^{-1} P_1 - R_1^{-1} T_1. \tag{9}$$

Substituting (9) into (8), we get

$$P_2 = R_2 R_1^{-1} P_1 - R_2 R_1^{-1} T_1 + T_2. \tag{10}$$

Equation (4) can be rewritten as

$$P_2 = R_{21} P_1 + T_{21}. \tag{11}$$

Thus

$$R_{21} = R_2 R_1^{-1}, \quad T_{21} = T_2 - R_2 R_1^{-1} T_1. \tag{12}$$
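Equation (12) is a one-liner in code. The sketch below assumes the helper name `relative_motion` (mine, not the paper's) and exploits the fact that a rotation matrix is orthonormal, so $R_1^{-1} = R_1^T$.

```python
import numpy as np

def relative_motion(R1, T1, R2, T2):
    """Inter-view motion from two sets of extrinsics, as in (12).

    R1, T1 and R2, T2 map world coordinates into the camera frames at
    theta_1 and theta_2; the returned pair maps frame 1 into frame 2.
    """
    R21 = R2 @ R1.T            # R1 is orthonormal, so inv(R1) == R1.T
    T21 = T2 - R21 @ T1        # equals T2 - R2 @ inv(R1) @ T1
    return R21, T21
```

With these outputs, any world point projected into frame 1 satisfies $P_2 = R_{21} P_1 + T_{21}$, which is exactly the relation (11) that the triangulation in (6) relies on.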
Based on the previous discussions, the object coordinate can be determined by the following process:

(1) The camera is calibrated, and the intrinsic parameters $f_x$, $f_y$, $c_x$, and $c_y$ are obtained.

(2) At the positions $\theta_1$ and $\theta_2$, the camera is calibrated, and the corresponding extrinsic parameters $R_1$, $T_1$, $R_2$, and $T_2$ are obtained. Then $R_{21}$ and $T_{21}$ can be acquired from (12).

(3) In the course of measurement, the camera is rotated carefully to the positions $\theta_1$ and $\theta_2$. At these two positions, the two images of the object, $I_1$ and $I_2$, are collected. Once the image coordinates of the object, $(X_1, Y_1)$ and $(X_2, Y_2)$, are known, the object coordinate can be figured out by (6).

It should be noticed that it is rather difficult for the user to rotate the camera accurately to the positions $\theta_1$ and $\theta_2$ where the camera is calibrated.
3 Calculation Method of Rotation Matrix and Translation Vector

Intuitively, the orientation and position of $O_2\text{-}xyz$ relative to $O_1\text{-}xyz$, $R_{21}$ and $T_{21}$, depend on the camera rotation angle $\beta_{21}$. Thus $R_{21}$ and $T_{21}$ may be figured out from $\beta_{21}$.

Let $O_R\text{-}xyz$ be the coordinate system associated with the rotation axis, which is a unit vector on the $z$ axis of the coordinate system. Let $R_X$ and $T_X$ denote the orientation and position of $O_R\text{-}xyz$ relative to $O_1\text{-}xyz$ (the orientation and position of the rotation axis). Its homogeneous matrix can be written as

$$M_X = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix}. \tag{13}$$

The rotation matrix for the camera rotation around the $z$ axis of $O_R\text{-}xyz$ from $\theta_1$ to $\theta_2$ (the device rotation matrix) can be modeled as

$$R_{\beta_{21}} = \begin{bmatrix} \cos\beta_{21} & -\sin\beta_{21} & 0 \\ \sin\beta_{21} & \cos\beta_{21} & 0 \\ 0 & 0 & 1 \end{bmatrix}. \tag{14}$$

Its homogeneous matrix can be written as

$$\mathrm{Rot}(\beta_{21}) = \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}. \tag{15}$$
The coordinate system transformation can be represented as a graph (Figure 2). A directed edge represents a relationship between two coordinate systems and is associated with a homogeneous transformation. From Figure 2 we can get

$$M_{21} M_X = M_X \,\mathrm{Rot}(\beta_{21}), \tag{16}$$

where

$$M_{21} = \begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix}. \tag{17}$$

Substituting (13), (15), and (17) into (16), we get

$$\begin{bmatrix} R_{21} & T_{21} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X & T_X \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{\beta_{21}} & 0 \\ 0 & 1 \end{bmatrix}, \tag{18}$$

$$\begin{bmatrix} R_{21} R_X & R_{21} T_X + T_{21} \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_X R_{\beta_{21}} & T_X \\ 0 & 1 \end{bmatrix}, \tag{19}$$

$$R_{21} = R_X R_{\beta_{21}} R_X^{-1}, \tag{20}$$

$$T_{21} = T_X - R_{21} T_X. \tag{21}$$
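Equations (14), (20), and (21) together turn a single measured rotation angle into the full inter-view motion. A short sketch, with the helper name `motion_from_axis` assumed by me:

```python
import numpy as np

def motion_from_axis(RX, TX, beta21):
    """Camera motion induced by rotating beta21 about the fixed axis.

    RX, TX give the pose of the axis frame O_R-xyz in O1-xyz; the axis
    itself is the z axis of O_R-xyz. Implements (14), (20), and (21).
    """
    c, s = np.cos(beta21), np.sin(beta21)
    R_beta = np.array([[c, -s, 0.0],
                       [s,  c, 0.0],
                       [0.0, 0.0, 1.0]])      # (14)
    R21 = RX @ R_beta @ RX.T                  # (20), inv(RX) == RX.T
    T21 = TX - R21 @ TX                       # (21)
    return R21, T21
```

A useful self-check: every point on the physical rotation axis, i.e. $T_X + R_X (0, 0, t)^T$, must be a fixed point of the returned motion, since the camera merely pivots about that line.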
Figure 2: Coordinate system transformation represented by a graph.
From (20) and (21) we can see that $R_{21}$ and $T_{21}$ can be calculated from the rotation angle $\beta_{21}$, provided that $R_X$ and $T_X$ are available.
4 Calculation Method of Position and Orientation of Rotation Axis

4.1 Orientation of Rotation Axis. Suppose that the camera is calibrated at different positions. At position $\theta_i$, we get camera calibration data. Let $\beta_{i1} = \theta_i - \theta_1$, let $R_{\beta_{i1}}$ denote the device rotation matrix with respect to the rotation angle $\beta_{i1}$, and let $R_{i1}$ and $T_{i1}$ denote the rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$. Equation (20) can be written as

$$R_{i1} R_X = R_X R_{\beta_{i1}}. \tag{22}$$

When the rotation axis is fixed, $R_X$ is constant. Thus, given the values of $R_{i1}$, $T_{i1}$, and $\beta_{i1}$, we can solve (22) for $R_X$.
Using a normalized quaternion to define the rotation between two coordinate systems provides a simple and elegant way to formulate successive rotations [15–17]. Given a rotation matrix

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \tag{23}$$

it can be transformed to the quaternion with the following equation [18]:

$$q_0 = \frac{1}{2}\sqrt{1 + r_1 + r_5 + r_9}, \quad q_x = \frac{r_8 - r_6}{4 q_0}, \quad q_y = \frac{r_3 - r_7}{4 q_0}, \quad q_z = \frac{r_4 - r_2}{4 q_0}. \tag{24}$$
Similarly, the quaternion can be transformed to the rotation matrix with the following equation [18]:

$$R = \begin{bmatrix} q_0^2 + q_x^2 - q_y^2 - q_z^2 & 2\,(q_x q_y - q_0 q_z) & 2\,(q_x q_z + q_0 q_y) \\ 2\,(q_x q_y + q_0 q_z) & q_0^2 + q_y^2 - q_x^2 - q_z^2 & 2\,(q_y q_z - q_0 q_x) \\ 2\,(q_x q_z - q_0 q_y) & 2\,(q_y q_z + q_0 q_x) & q_0^2 + q_z^2 - q_x^2 - q_y^2 \end{bmatrix}. \tag{25}$$
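The two conversions (24) and (25) can be sketched directly; the function names are mine, and (24) as written assumes $1 + r_1 + r_5 + r_9 > 0$ (true for the small rotation angles used here).

```python
import numpy as np

def mat_to_quat(R):
    """Rotation matrix -> unit quaternion [q0, qx, qy, qz], as in (24)."""
    q0 = 0.5 * np.sqrt(1.0 + R[0, 0] + R[1, 1] + R[2, 2])
    qx = (R[2, 1] - R[1, 2]) / (4.0 * q0)   # (r8 - r6) / 4q0
    qy = (R[0, 2] - R[2, 0]) / (4.0 * q0)   # (r3 - r7) / 4q0
    qz = (R[1, 0] - R[0, 1]) / (4.0 * q0)   # (r4 - r2) / 4q0
    return np.array([q0, qx, qy, qz])

def quat_to_mat(q):
    """Unit quaternion -> rotation matrix, as in (25)."""
    q0, qx, qy, qz = q
    return np.array([
        [q0*q0 + qx*qx - qy*qy - qz*qz, 2*(qx*qy - q0*qz),             2*(qx*qz + q0*qy)],
        [2*(qx*qy + q0*qz),             q0*q0 + qy*qy - qx*qx - qz*qz, 2*(qy*qz - q0*qx)],
        [2*(qx*qz - q0*qy),             2*(qy*qz + q0*qx),             q0*q0 + qz*qz - qx*qx - qy*qy]])
```

A rotation by angle $a$ about the $z$ axis should round-trip exactly, with $q_0 = \cos(a/2)$ and $q_z = \sin(a/2)$.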
Let $q_{A_i}$, $q_X$, and $q_{B_i}$ denote the quaternions of $R_{i1}$, $R_X$, and $R_{\beta_{i1}}$, respectively. With quaternions, the sequence of rotations can be formulated as an equation without involving rotation matrices [15]. As a result, the problem of solving $R_{i1} R_X = R_X R_{\beta_{i1}}$ can be transformed into an equivalent problem involving the corresponding quaternions as follows [15]:

$$q_{A_i} \otimes q_X = q_X \otimes q_{B_i}. \tag{26}$$

Since quaternion multiplication can be written in matrix form, with the notations introduced in [19] we have the following [16]:

$$q_{A_i} \otimes q_X = Q(q_{A_i})\, q_X, \quad q_X \otimes q_{B_i} = W(q_{B_i})\, q_X, \tag{27}$$
where, letting $q = [q_0, q_x, q_y, q_z]$,

$$Q(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & -q_z & q_y \\ q_y & q_z & q_0 & -q_x \\ q_z & -q_y & q_x & q_0 \end{bmatrix}, \quad W(q) = \begin{bmatrix} q_0 & -q_x & -q_y & -q_z \\ q_x & q_0 & q_z & -q_y \\ q_y & -q_z & q_0 & q_x \\ q_z & q_y & -q_x & q_0 \end{bmatrix}. \tag{28}$$
Moreover, these two matrices are orthogonal [16]; that is,

$$[Q(q)]^T Q(q) = [W(q)]^T W(q) = I. \tag{29}$$
Thus

$$\begin{aligned}
Q(q_{A_i})\, q_X &= W(q_{B_i})\, q_X, \\
\left\| Q(q_{A_i})\, q_X - W(q_{B_i})\, q_X \right\|^2
&= \left[ Q(q_{A_i})\, q_X - W(q_{B_i})\, q_X \right]^T \left[ Q(q_{A_i})\, q_X - W(q_{B_i})\, q_X \right] \\
&= q_X^T \left\{ 2I - [Q(q_{A_i})]^T W(q_{B_i}) - [W(q_{B_i})]^T Q(q_{A_i}) \right\} q_X \\
&= q_X^T \left[ 2I - C_i \right] q_X,
\end{aligned} \tag{30}$$

where

$$C_i = [Q(q_{A_i})]^T W(q_{B_i}) + [W(q_{B_i})]^T Q(q_{A_i}). \tag{31}$$
Thus, the total error function allowing us to compute $q_X$ becomes

$$f(q_X) = \sum_{i=1}^{n} q_X^T \left[ 2I - C_i \right] q_X = q_X^T \left[ 2nI - \sum_{i=1}^{n} C_i \right] q_X = q_X^T \left[ 2nI - S \right] q_X, \tag{32}$$
where $n$ is the number of positions of the camera,

$$S = \sum_{i=1}^{n} C_i, \tag{33}$$

and $q_X$ is a unit quaternion. Therefore, $q_X$ can be obtained by solving the following problem:

$$\min f(q_X) = \min\, q_X^T \left[ 2nI - S \right] q_X \quad \text{s.t. } q_X^T q_X = 1. \tag{34}$$
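Minimizing a quadratic form over unit vectors, as in (34), is a standard eigenvalue problem: the minimizer is the eigenvector of $2nI - S$ with the smallest eigenvalue. A sketch follows, with function names of my own choosing; note that with a single fixed axis the minimizer is only unique up to a rotation about that axis, so the sketch simply returns one unit minimizer and the test checks that it satisfies (26) for every position.

```python
import numpy as np

def Q_mat(q):
    """Left-multiplication matrix of quaternion q, as in (28)."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0, -qz,  qy],
                     [qy,  qz,  q0, -qx],
                     [qz, -qy,  qx,  q0]])

def W_mat(q):
    """Right-multiplication matrix of quaternion q, as in (28)."""
    q0, qx, qy, qz = q
    return np.array([[q0, -qx, -qy, -qz],
                     [qx,  q0,  qz, -qy],
                     [qy, -qz,  q0,  qx],
                     [qz,  qy, -qx,  q0]])

def solve_axis_orientation(qA_list, qB_list):
    """Solve (34): the unit quaternion minimizing q^T (2nI - S) q.

    qA_list, qB_list hold the quaternions of R_i1 and R_beta_i1.
    Builds S from (31)/(33) and takes the eigenvector of 2nI - S
    associated with the smallest eigenvalue.
    """
    n = len(qA_list)
    S = sum(Q_mat(qA).T @ W_mat(qB) + W_mat(qB).T @ Q_mat(qA)
            for qA, qB in zip(qA_list, qB_list))
    M = 2 * n * np.eye(4) - S
    w, V = np.linalg.eigh(M)      # eigenvalues in ascending order
    return V[:, 0]                # unit eigenvector, smallest eigenvalue
```

When exact solutions of (26) exist, the smallest eigenvalue is (numerically) zero, and any returned eigenvector makes every residual $Q(q_{A_i})q - W(q_{B_i})q$ vanish.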
4.2 Position of Rotation Axis. At position $\theta_i$, (21) can be written as

$$\left( I - R_{i1} \right) T_X = T_{i1}. \tag{35}$$

When the rotation axis is fixed, $T_X$ is constant. Thus, given the values of $R_{i1}$ and $T_{i1}$, we can solve (35) for $T_X$. Let $E_i = \left[ -T_{i1},\; I - R_{i1} \right]$ and $T_Y = \left[ 1, T_X^T \right]^T$; thus

$$E_i T_Y = 0, \quad \left\| E_i T_Y \right\|_2^2 = T_Y^T E_i^T E_i T_Y. \tag{36}$$
Let $n$ denote the number of positions of the camera. The total error function is

$$g(T_Y) = \sum_{i=1}^{n} T_Y^T E_i^T E_i T_Y = T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y. \tag{37}$$

Since the camera rotation axis is approximately perpendicular to $T_X$, the value of $t_z$ approaches zero. Thus $T_Y = \left[ t_{y0}, t_{y1}, t_{y2}, t_{y3} \right]$ can be obtained by solving the following problem:

$$\min g(T_Y) = \min T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y \quad \text{s.t. } t_{y0} = 1, \; t_{y3} < 1. \tag{38}$$
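A convenient way to see what (35)-(38) compute: $(I - R_{i1})$ is singular along the rotation axis, so the component of $T_X$ along the axis is unobservable, which is exactly why the paper constrains $t_z$ toward zero. The sketch below (my own simplification, not necessarily the authors' exact solver) stacks (35) over all positions and takes the minimum-norm least-squares solution, which likewise zeroes the unobservable component.

```python
import numpy as np

def solve_axis_position(R_list, T_list):
    """Least-squares solution of the stacked system (35):
    (I - R_i1) T_X = T_i1 over all calibrated positions.

    (I - R_i1) annihilates the axis direction, so lstsq's minimum-norm
    answer sets the along-axis component of T_X to zero, matching the
    paper's assumption that t_z approaches zero.
    """
    A = np.vstack([np.eye(3) - R for R in R_list])
    b = np.concatenate(T_list)
    TX, *_ = np.linalg.lstsq(A, b, rcond=None)
    return TX
```

In the simplest configuration, where the axis frame is axis-aligned with the camera frame ($R_X = I$) and $T_X$ has no $z$ component, the translation data generated by (21) lead straight back to the true $T_X$.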
5 Calculation of Position and Orientation of Rotation Axis and 3D Coordinate

5.1 Calculation of Position and Orientation of Rotation Axis. Based on the previous discussions, the complete process of calculating the position and orientation of the rotation axis is outlined below:

(1) The chessboard (see Figure 3) is printed and pasted on a plane. By rotating and moving the camera properly, a set of chessboard images is collected. Then the values of the intrinsic parameters of the camera, $f_x$, $f_y$, $c_x$, and $c_y$, are obtained by the method proposed in [14].

Figure 3: Chessboard.

(2) The camera is fixed on a camera tripod, and the rotation axis of the camera, which is parallel to the $z$ axis of the sensor coordinate system, lies in an approximately horizontal plane.

(3) By rotating the camera around the fixed axis to different positions, another set of chessboard images is collected.

(4) The extrinsic parameters of the camera at the positions $\theta_i$, $R_i$ and $T_i$, are obtained by calling the function cvFindExtrinsicCameraParams2 in OpenCV [20].

(5) The rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$, $R_{i1}$ and $T_{i1}$, are figured out by (12). The rotation angle $\beta_{i1}$ and its corresponding rotation matrix $R_{\beta_{i1}}$ are also calculated by (2) and (14).

(6) $R_{i1}$ and $R_{\beta_{i1}}$ are converted into quaternions $q_{A_i}$ and $q_{B_i}$, respectively, by using the method proposed by Bar-Itzhack [18].

(7) $q_X$ is found by solving problem (34). As a result, $R_X$ can be obtained.

(8) $T_X$ is obtained by solving problem (38).
5.2 3D Coordinate Calculation. Based on the previous discussions, we present the complete process of 3D coordinate calculation as follows:

(1) The camera is fixed on a camera tripod whose rotation axis lies in an approximately horizontal plane.

(2) At a certain position $\theta_1$, the image of the object, $I_1$, is collected.

(3) By rotating the camera on the fixed axis to another position $\theta_2$, we get another image of the object, $I_2$.

(4) The rotation angle $\beta_{21}$ and its corresponding rotation matrix $R_{\beta_{21}}$ are figured out by (2) and (14).

(5) With $R_{\beta_{21}}$, $R_{21}$ and $T_{21}$ are calculated by (20) and (21).

(6) The image coordinate of the point of the object, $(X_1, Y_1)$, on the image $I_1$ is appointed manually.
(7) The corresponding image coordinate in the image $I_2$, $(X_2, Y_2)$, can be determined by a stereo correspondence method, for example, the function FindStereoCorrespondenceBM in OpenCV [20], or manually.

(8) The 3D coordinate of the point on the object relative to $O_1\text{-}xyz$, $(x_1, y_1, z_1)$, can be figured out by (6).
6 Experiments

Since a digital camera with a 3-axis accelerometer sensor is not available to us, an iPhone 4, which has the sensor, is adopted. To simulate a digital camera, the phone is placed in a box which is fixed on a camera tripod. It is ensured that the $z$ axis of the sensor (experimental results show that this axis is parallel to the optical axis of the camera) is parallel to the rotation axis of the camera tripod, so that the value of $G_z$ keeps steady in the course of the rotation. In the calibration course, the distance between the camera and the chessboard is about 1000 mm.
Figure 4 illustrates the curve of the quaternion of the rotation matrix between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The quaternion $q' = [q'_0, q'_x, q'_y, q'_z]$ was calculated by the proposed method from the rotation angle, while $q = [q_0, q_x, q_y, q_z]$ was converted directly from the rotation matrix $R_{i1}$, which was from calibration data. From the graph, one can see that the proposed method can calculate the rotation matrix from the rotation angle.
Figure 5 plots the curve of the translation vector between the camera coordinate systems at the $i$th position and the 1st position with respect to the rotation angle. The vector $T' = [T'_x, T'_y, T'_z]$ was calculated by the proposed method from the rotation angle, while $T = [T_x, T_y, T_z]$ was directly from the calibration data $T_{i1}$. From the graphs, one can see that the proposed method can effectively calculate the translation vector from the rotation angle.

In order to estimate the accuracy of the 3D coordinates calculated by the proposed method, a chessboard with bigger blocks than the one used for calibration is printed. The width of a block is 46.6 mm, and the distance between the chessboard and the camera is about 1200 mm. The distance between two neighboring corners, rather than the distance between the camera and the chessboard, is calculated, because manual measurement of the former is easier. For simplicity, the corners of the blocks in the images are automatically detected by calling the OpenCV function "cvFindChessboardCorners" [20]. Figure 6 depicts the measurement error ratio of the distance between two corners with respect to the rotation angle.
7 Conclusions

This paper proposed a stereo vision system with a single camera, which requires a digital camera with a 3-axis accelerometer sensor rotating around a fixed axis that is parallel to the $z$ axis of the sensor. Under these conditions, the slope angle relative to gravity, which can be figured out from the readouts of the sensor, can determine the camera position, and the rotation angle between two positions can determine the rotation matrix and translation vector from
Figure 4: Curve of the quaternion of the rotation matrix with respect to rotation angle. $q = [q_0, q_x, q_y, q_z]$ is converted from the rotation matrix, and $q' = [q'_0, q'_x, q'_y, q'_z]$ is calculated by the proposed method.
Figure 5: Curve of the translation vector with respect to rotation angle. $T = [T_x, T_y, T_z]$ was directly from calibration data, and $T' = [T'_x, T'_y, T'_z]$ was calculated by (23) with the rotation angle.
one coordinate system of the camera to another. Accordingly, given the rotation angle and the images of the object at two different positions, one before and the other after camera rotation, the 3D coordinates of the points on the object can be determined. Theoretical analysis and experimental results show the validity of our method.

It should be noticed that few digital cameras are provided with a 3-axis accelerometer sensor. However, to obtain stereo vision, we believe that the inexpensive sensor embedded in a
Figure 6: Measurement error ratio (%) of the distance between two corners with respect to rotation angle.
digital camera is worthwhile. Moreover, due to higher image quality and a larger focus range, higher accuracy and a larger range of measurement may be obtained. Furthermore, smart phones which have the sensor are popular. If a mini fixed rotation axis is built into a corner of the phone, and it does not move in the course of rotation, then with the proposed method the phone may estimate the size of the object being focused on and the distance between the phone and the object.
Acknowledgment

This work is supported by the Science and Technology Research Project of Chongqing's Education Committee (KJ110806).
References

[1] Y. Sooyeong and N. Ahuja, "An omnidirectional stereo vision system using a single camera," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), pp. 861–865, August 2006.
[2] D. Marr and T. Poggio, "A computational theory of human stereo vision," Proceedings of the Royal Society of London, vol. 204, no. 1156, pp. 301–328, 1979.
[3] F. Tombari, S. Mattoccia, L. D. Stefano, and E. Addimanda, "Classification and evaluation of cost aggregation methods for stereo correspondence," in Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR '08), pp. 1–8, June 2008.
[4] H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, no. 5828, pp. 133–135, 1981.
[5] J. Gluckman and S. K. Nayar, "Catadioptric stereo using planar mirrors," International Journal of Computer Vision, vol. 44, no. 1, pp. 65–79, 2001.
[6] D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 528–541, 2000.
[7] T. Svoboda and T. Pajdla, "Epipolar geometry for central catadioptric cameras," International Journal of Computer Vision, vol. 49, no. 1, pp. 23–37, 2002.
[8] D. Southwell, A. Basu, M. Fiala, and J. Reyda, "Panoramic stereo," in Proceedings of the International Conference on Pattern Recognition, vol. 1, pp. 378–382, 1996.
[9] A. Criminisi, I. Reid, and A. Zisserman, "Single view metrology," International Journal of Computer Vision, vol. 40, no. 2, pp. 123–148, 2000.
[10] A. Criminisi, "Single-view metrology: algorithms and applications," in Proceedings of the 24th DAGM Symposium on Pattern Recognition, vol. 2449 of Lecture Notes in Computer Science, pp. 224–239, 2002.
[11] G. Wang, Z. Hu, F. Wu, and H. Tsui, "Single view metrology from scene constraints," Image and Vision Computing, vol. 23, no. 9, pp. 831–840, 2005.
[12] A. Saxena, S. H. Chung, and A. Y. Ng, "3-D depth reconstruction from a single still image," International Journal of Computer Vision, vol. 76, no. 1, pp. 53–69, 2008.
[13] J. Rekimoto, "Tilting operations for small screen interfaces," in Proceedings of the 9th ACM Annual Symposium on User Interface Software and Technology, pp. 167–168, November 1996.
[14] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[15] J. C. K. Chou and M. Kamel, "Finding the position and orientation of a sensor on a robot manipulator using quaternions," International Journal of Robotics Research, vol. 10, no. 3, pp. 240–254, 1991.
[16] F. Dornaika and R. Horaud, "Simultaneous robot-world and hand-eye calibration," IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617–622, 1998.
[17] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 4647–4653, October 2006.
[18] I. Y. Bar-Itzhack, "New method for extracting the quaternion from a rotation matrix," Journal of Guidance, Control, and Dynamics, vol. 23, no. 6, pp. 1085–1087, 2000.
[19] M. W. Walker, L. Shao, and R. A. Volz, "Estimating 3-D location parameters using dual number quaternions," CVGIP: Image Understanding, vol. 54, no. 3, pp. 358–367, 1991.
[20] G. Bradski and A. Kaehler, Learning OpenCV, O'Reilly Media, 2008.
7 Conclusions
This paper proposed a stereo vision system with singlecamera which requires digital camera with 3-axis accelerom-eter sensor rotating around the fixed axis which is parallelto the axis 119911 of the sensor Under these conditions theslope angle relative to gravity which can be figured outfrom the readouts of the sensor can determine the cameraposition and the rotation angle between two positions candetermine the rotation matrix and translation vector from
0
02
04
06
08
1
12
0 01 02 03 04 05 06 07 08 09Rotation angle
minus02
q0qxqyqz
q0998400
qx998400
qy998400
qz998400
Figure 4 Curve of the quaternion of rotationmatrix with respect torotation angle 119902 = [119902
0 119902119909 119902119910 119902119911] is converted from rotation matrix
and 1199021015840= [1199021015840
0 1199021015840
119909 1199021015840
119910 1199021015840
119911] is calculated by the proposed method
0
10
20
30
40
50
60
0 01 02 03 04 05 06 07 08 09
(mm
)
Rotation angleminus10
TxTyTz
Tx998400
Ty998400
Tz998400
Figure 5 Curve of translation vector with respect to rotation angle119879 = [119879
119909 119879119910 119879119911] was directly from calibration data and 119879
1015840=
[1198791015840
119909 1198791015840
119910 1198791015840
119911] was calculated by (23) with rotation angle
one coordinate system of the camera to another Accordinglygiven the rotation angle and the images of the object at twodifferent positions one before and the other after camerarotation the 3D coordinates of the points on the object canbe determined Theoretical analysis and experimental resultsshow the validity of our method
It should be noticed that few digital cameras are providedwith 3-axis accelerometer sensor However to obtain stereovision we believe that the inexpensive sensor embedded in
ISRNMachine Vision 7
0
2
4
6
8
10
12
0 01 02 03 04 05 06 07 08
Erro
r rat
io (
)
Rotation angle
Figure 6 Measurement error ratio () of the distance between twocorners with respect to rotation angle
digital camera is worthy Moreover due to higher imagequality and larger focus range higher accuracy and largerrange of measurement may be obtained Furthermore thesmart phone which has the sensor is popular If a mini fixedrotation axis is built in a corner of the phone and it does notmove in the course of rotation with the proposed methodthe phone may estimate the size of the object being focusedon and distance between the phone and the object
Acknowledgment
This work is supported by Science and Technology ResearchProject of Chongqingrsquos Education Committee (KJ110806)
References
[1] Y Sooyeong and N Ahuja ldquoAn omnidirectional stereo visionsystem using a single camerardquo in Proceedings of the 18thInternational Conference on Pattern Recognition (ICPR rsquo06) pp861ndash865 August 2006
[2] D Marr and T Poggio ldquoA computational theory of humanstereo visionrdquo Proceedings of the Royal Society of London vol204 no 1156 pp 301ndash328 1979
[3] F Tombari S Mattoccia L D Stefano and E AddimandaldquoClassification and evaluation of cost aggregation methodsfor stereo correspondencerdquo in Proceedings of the 26th IEEEConference on Computer Vision and Pattern Recognition (CVPRrsquo08) pp 1ndash8 June 2008
[4] H C Longuet-higgins ldquoA computer algorithm for reconstruct-ing a scene from two projectionsrdquoNature vol 293 no 5828 pp133ndash135 1981
[5] J Gluckman and S K Nayar ldquoCatadioptric stereo using planarmirrorsrdquo International Journal of Computer Vision vol 44 no1 pp 65ndash79 2001
[6] D Lee and I Kweon ldquoA novel stereo camera system by abiprismrdquo IEEE Transactions on Robotics and Automation vol16 no 5 pp 528ndash541 2000
[7] T Svoboda and T Pajdla ldquoEpipolar geometry for centralcatadioptric camerasrdquo International Journal of Computer Visionvol 49 no 1 pp 23ndash37 2002
[8] D Southwell A Basu M Fiala and J Reyda ldquoPanoramicstereordquo International Conference on Pattern Recognition vol 1pp 378ndash382 1996
[9] A Criminisi I Reid and A Zisserman ldquoSingle view metrol-ogyrdquo International Journal of Computer Vision vol 40 no 2 pp123ndash148 2000
[10] A Criminisi ldquoSingle-view metrology algorithms and applica-tionsrdquo in Proceedings of the 24th DAGM Symposium on PatternRecognition vol 2449 of Lecture Notes in Computer Science pp224ndash239 2002
[11] G Wang Z Hu F Wu and H Tsui ldquoSingle view metrologyfrom scene constraintsrdquo Image and Vision Computing vol 23no 9 pp 831ndash840 2005
[12] A Saxena S H Chung and A Y Ng ldquo3-D depth reconstruc-tion from a single still imagerdquo International Journal of ComputerVision vol 76 no 1 pp 53ndash69 2008
[13] J Rekimoto ldquoTilting operations for small screen interfacesrdquo inProceedings of the 9thACMAnnual Symposium onUser InterfaceSoftware and Technology pp 167ndash168 November 1996
[14] Z Zhang ldquoA flexible new technique for camera calibrationrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 22 no 11 pp 1330ndash1334 2000
[15] J C K Chou and M Kamel ldquoFinding the position and orien-tation of a sensor on a robot manipulator using quaternionsrdquoInternational Journal of Robotics Research vol 10 no 3 pp 240ndash254 1991
[16] F Dornaika and R Horaud ldquoSimultaneous robot-world andhand-eye calibrationrdquo IEEE Transactions on Robotics andAutomation vol 14 no 4 pp 617ndash622 1998
[17] K H Strobl and G Hirzinger ldquoOptimal hand-eye calibrationrdquoin Proceedings of the IEEERSJ International Conference onIntelligent Robots and Systems (IROS rsquo06) pp 4647ndash4653October 2006
[18] I Y Bar-Itzhack ldquoNew method for extracting the quaternionfrom a rotation matrixrdquo Journal of Guidance Control andDynamics vol 23 no 6 pp 1085ndash1087 2000
[19] M W Walker L Shao and R A Volz ldquoEstimating 3-Dlocation parameters using dual number quaternionsrdquo ImageUnderstanding vol 54 no 3 pp 358ndash367 1991
[20] BGary andKAdrian LearningOpenCV OrsquoReillyMedia 2008
ISRN Machine Vision
where $n$ is the number of positions of the camera and

$$S = \sum_{i=1}^{n} C_i. \qquad (33)$$

$q_X$ is a unit quaternion. Therefore, $q_X$ can be obtained by solving the following problem:

$$\min f(q_X) = \min q_X^T \left[ 2nI - S \right] q_X \quad \text{s.t.} \quad q_X^T q_X = 1. \qquad (34)$$
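Problem (34) is a Rayleigh-quotient minimization over unit vectors, so its solution is the eigenvector of $2nI - S$ associated with the smallest eigenvalue. A minimal numpy sketch (assuming $S$, accumulated from the per-position quaternion pairs, is symmetric as in the standard hand-eye formulation; the helper name is ours):

```python
import numpy as np

def solve_unit_quaternion(S, n):
    """Minimize q^T (2nI - S) q subject to ||q|| = 1.

    By the Rayleigh-quotient argument, the minimizer is the eigenvector
    of the symmetric matrix (2nI - S) belonging to its smallest eigenvalue.
    """
    M = 2 * n * np.eye(4) - S
    w, V = np.linalg.eigh(M)   # eigenvalues in ascending order for symmetric M
    q = V[:, 0]                # eigenvector of the smallest eigenvalue
    return q / np.linalg.norm(q)
```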
4.2. Position of Rotation Axis. At position $\theta_i$, (21) can be written as

$$(I - R_{i1}) T_X = T_{i1}. \qquad (35)$$

When the rotation axis is fixed, $T_X$ is constant. Thus, given the values of $R_{i1}$ and $T_{i1}$, we can solve (35) for $T_X$. Let $E_i = [-T_{i1},\; I - R_{i1}]$ and $T_Y = [1,\; T_X^T]^T$; thus

$$E_i T_Y = 0, \qquad \| E_i T_Y \|_2^2 = T_Y^T E_i^T E_i T_Y. \qquad (36)$$
Let $n$ denote the number of positions of the camera. The total error function is

$$g(T_Y) = \sum_{i=1}^{n} T_Y^T E_i^T E_i T_Y = T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y. \qquad (37)$$

Since the camera rotation axis is approximately perpendicular to $T_X$, the value of $t_z$ approaches zero. Thus $T_Y = [t_{y0}, t_{y1}, t_{y2}, t_{y3}]$ can be obtained by solving the following problem:

$$\min g(T_Y) = \min T_Y^T \left[ \sum_{i=1}^{n} E_i^T E_i \right] T_Y \quad \text{s.t.} \quad t_{y0} = 1, \; |t_{y3}| < 1. \qquad (38)$$
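Fixing the first component of $T_Y$ at 1 turns problem (38) into linear normal equations for the remaining components. A sketch under that reading (names are ours): because all rotations share one axis, the accumulated matrix is singular along the axis direction, so a least-squares solve is used; it returns the minimum-norm solution, consistent with the paper's constraint that the component along the nearly $z$-aligned axis stays small.

```python
import numpy as np

def solve_axis_position(E_list):
    """Minimize T_Y^T (sum E_i^T E_i) T_Y with the first entry of T_Y fixed at 1.

    Partitioning M = [[m00, m01], [m10, M11]] and writing T_Y = [1, t]^T,
    the minimizer satisfies the normal equations M11 t = -m10.
    M11 is singular along the common rotation axis, so lstsq picks the
    minimum-norm t (zero component along the axis).
    """
    M = sum(E.T @ E for E in E_list)
    m10 = M[1:, 0]
    M11 = M[1:, 1:]
    t, *_ = np.linalg.lstsq(M11, -m10, rcond=None)  # translation part T_X
    return np.concatenate(([1.0], t))
```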
5. Calculation of Position and Orientation of Rotation Axis and 3D Coordinate
5.1. Calculation of Position and Orientation of Rotation Axis. Based on the previous discussion, the complete process of calculating the position and orientation of the rotation axis is outlined below.
(1) The chessboard (see Figure 3) is printed and pasted on a plane. By rotating and moving the camera properly, a set of chessboard images is collected. Then the intrinsic parameters of the camera, $f_x$, $f_y$, $c_x$, and $c_y$, are obtained by the method proposed in [14].

Figure 3: Chessboard.
(2) The camera is fixed on a camera tripod, and the rotation axis of the camera, which is parallel to the $z$ axis of the sensor coordinate system, lies in an approximately horizontal plane.

(3) By rotating the camera around the fixed axis to different positions, another set of chessboard images is collected.
(4) The extrinsic parameters of the camera at the positions $\theta_i$, $R_i$ and $T_i$, are obtained by calling the function cvFindExtrinsicCameraParams2 in OpenCV [20].

(5) The rotation matrix and translation vector between the camera coordinate systems at the positions $\theta_i$ and $\theta_1$, $R_{i1}$ and $T_{i1}$, are figured out by (12). The rotation angle $\beta_{i1}$ and its corresponding rotation matrix $R_{\beta_{i1}}$ are also calculated by (2) and (14).

(6) $R_{i1}$ and $R_{\beta_{i1}}$ are converted into quaternions $q_{Ai}$ and $q_{Bi}$, respectively, by using the method proposed by Bar-Itzhack [18].

(7) $q_X$ is found by solving problem (34); as a result, $R_X$ can be obtained.

(8) $T_X$ is obtained by solving problem (38).
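Step (6) relies on Bar-Itzhack's observation [18] that, for a rotation matrix $R$ with quaternion $q$, the symmetric matrix $K = (4qq^T - I)/3$ can be assembled directly from the entries of $R$, so $q$ is recovered as the eigenvector of $K$ for the largest eigenvalue; this is robust when $R$ is noisy. A sketch using the scalar-first convention $[w, x, y, z]$ (the ordering and sign conventions may differ from the paper's):

```python
import numpy as np

def quat_from_rotmat(R):
    """Quaternion [w, x, y, z] from a rotation matrix via the eigenvalue
    method of Bar-Itzhack: build K = (4 q q^T - I)/3 from the entries of R
    and take the eigenvector of the largest eigenvalue.
    """
    K = np.array([
        [R[0,0]+R[1,1]+R[2,2], R[2,1]-R[1,2],       R[0,2]-R[2,0],       R[1,0]-R[0,1]],
        [R[2,1]-R[1,2],        R[0,0]-R[1,1]-R[2,2], R[1,0]+R[0,1],       R[2,0]+R[0,2]],
        [R[0,2]-R[2,0],        R[1,0]+R[0,1],        R[1,1]-R[0,0]-R[2,2], R[2,1]+R[1,2]],
        [R[1,0]-R[0,1],        R[2,0]+R[0,2],        R[2,1]+R[1,2],        R[2,2]-R[0,0]-R[1,1]],
    ]) / 3.0
    w, V = np.linalg.eigh(K)
    q = V[:, -1]                   # eigenvector of the largest eigenvalue
    return q if q[0] >= 0 else -q  # fix overall sign (q and -q are equivalent)
```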
5.2. 3D Coordinate Calculation. Based on the previous discussion, we present the complete process of 3D coordinate calculation as follows.
(1) The camera is fixed on a camera tripod whose rotation axis lies in an approximately horizontal plane.

(2) At a certain position $\theta_1$, the image of the object, $I_1$, is collected.

(3) By rotating the camera around the fixed axis to another position $\theta_2$, another image of the object, $I_2$, is acquired.

(4) The rotation angle $\beta_{21}$ and its corresponding rotation matrix $R_{\beta_{21}}$ are figured out by (2) and (14).

(5) With $R_{\beta_{21}}$, $R_{21}$ and $T_{21}$ are calculated by (20) and (21).

(6) The image coordinate of a point of the object, $(X_1, Y_1)$, on image $I_1$ is appointed manually.

(7) The corresponding image coordinate $(X_2, Y_2)$ in image $I_2$ can be determined by a stereo correspondence method, for example the function FindStereoCorrespondenceBM in OpenCV [20], or manually.

(8) The 3D coordinate of the point on the object relative to $O_1$-$xyz$, $(x_1, y_1, z_1)$, can be figured out by (6).
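Equation (6), which recovers the 3D point, is not reproduced in this excerpt; step (8) can be sketched with a standard linear (DLT) triangulation from the two views, given the intrinsic matrix $K$ and the computed $R_{21}$, $T_{21}$ (function and variable names are ours):

```python
import numpy as np

def triangulate(x1, y1, x2, y2, K, R, T):
    """Linear (DLT) triangulation of one point from two views.

    P1 = K [I | 0] models the camera at the first position and
    P2 = K [R | T] the camera after rotation; the returned point is
    expressed in the first camera's coordinate system (O1-xyz).
    """
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, np.asarray(T, dtype=float).reshape(3, 1)])
    # Each view contributes two homogeneous linear constraints on X.
    A = np.array([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector of A (homogeneous 3D point)
    return X[:3] / X[3]
```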
6. Experiments
Since a digital camera with a 3-axis accelerometer sensor is not available to us, an iPhone 4, which has such a sensor, is adopted. To simulate a digital camera, the phone is placed in a box fixed on a camera tripod, and it is ensured that the $z$ axis of the sensor (experimental results show that this axis is parallel to the optical axis of the camera) is parallel to the rotation axis of the camera tripod, so that the value of $G_z$ stays steady in the course of the rotation. In the calibration course, the distance between the camera and the chessboard is about 1000 mm.
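Equation (2), which recovers the slope angle from the accelerometer readouts, is not shown in this excerpt; a plausible computation, assuming the rotation axis stays parallel to the sensor $z$ axis so that only $G_x$ and $G_y$ vary during rotation, is:

```python
import math

def slope_angle(gx, gy):
    """Slope angle relative to gravity from accelerometer readouts.

    With the rotation axis parallel to the sensor z axis, Gz stays
    constant during rotation, and the tilt is recovered from Gx, Gy.
    """
    return math.atan2(gx, gy)

def rotation_angle(g1, g2):
    """Rotation angle between two camera positions; g = (Gx, Gy) readouts."""
    return slope_angle(*g2) - slope_angle(*g1)
```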
Figure 4 illustrates the curve of the quaternion of the rotation matrix between the camera coordinate systems at the $i$th and 1st positions with respect to the rotation angle. The quaternion $q' = [q'_0, q'_x, q'_y, q'_z]$ was calculated by the proposed method from the rotation angle, while $q = [q_0, q_x, q_y, q_z]$ was converted directly from the rotation matrix $R_{i1}$ obtained from the calibration data. From the graph, one can see that the proposed method can calculate the rotation matrix from the rotation angle.
Figure 5 plots the curve of the translation vector between the camera coordinate systems at the $i$th and 1st positions with respect to the rotation angle. The vector $T' = [T'_x, T'_y, T'_z]$ was calculated by the proposed method from the rotation angle, while $T = [T_x, T_y, T_z]$ was taken directly from the calibration data $T_{i1}$. From the graphs, one can see that the proposed method can effectively calculate the translation vector from the rotation angle.
In order to estimate the accuracy of the 3D coordinates calculated by the proposed method, a chessboard with larger blocks than the one used for calibration is printed. The width of each block is 46.6 mm, and the distance between the chessboard and the camera is about 1200 mm. The distance between two neighboring corners, rather than the distance between the camera and the chessboard, is calculated because the former is easier to measure manually. For simplicity, the corners of the blocks in the images are automatically detected by calling the OpenCV function cvFindChessboardCorners [20]. Figure 6 depicts the measurement error ratio of the distance between two corners with respect to the rotation angle.
Figure 4: Curve of the quaternion of the rotation matrix with respect to the rotation angle. $q = [q_0, q_x, q_y, q_z]$ is converted from the rotation matrix, and $q' = [q'_0, q'_x, q'_y, q'_z]$ is calculated by the proposed method.

Figure 5: Curve of the translation vector with respect to the rotation angle. $T = [T_x, T_y, T_z]$ was taken directly from the calibration data, and $T' = [T'_x, T'_y, T'_z]$ was calculated by (23) with the rotation angle.

7. Conclusions

This paper proposed a stereo vision system with a single camera, which requires a digital camera with a 3-axis accelerometer sensor rotating around a fixed axis parallel to the $z$ axis of the sensor. Under these conditions, the slope angle relative to gravity, which can be figured out from the readouts of the sensor, determines the camera position, and the rotation angle between two positions determines the rotation matrix and translation vector from one coordinate system of the camera to another. Accordingly, given the rotation angle and the images of the object at two different positions, one before and the other after the camera rotation, the 3D coordinates of the points on the object can be determined. Theoretical analysis and experimental results show the validity of our method.
Figure 6: Measurement error ratio (%) of the distance between two corners with respect to the rotation angle.

It should be noted that few digital cameras are provided with a 3-axis accelerometer sensor. However, to obtain stereo vision, we believe that embedding this inexpensive sensor in a digital camera is worthwhile. Moreover, due to the higher image quality and larger focus range of a digital camera, higher accuracy and a larger measurement range may be obtained. Furthermore, smart phones equipped with the sensor are popular. If a mini fixed rotation axis is built into a corner of the phone and does not move in the course of rotation, then with the proposed method the phone may estimate the size of the object being focused on and the distance between the phone and the object.
Acknowledgment

This work is supported by the Science and Technology Research Project of Chongqing's Education Committee (KJ110806).
References

[1] Y. Sooyeong and N. Ahuja, "An omnidirectional stereo vision system using a single camera," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), pp. 861–865, August 2006.
[2] D. Marr and T. Poggio, "A computational theory of human stereo vision," Proceedings of the Royal Society of London, vol. 204, no. 1156, pp. 301–328, 1979.
[3] F. Tombari, S. Mattoccia, L. D. Stefano, and E. Addimanda, "Classification and evaluation of cost aggregation methods for stereo correspondence," in Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR '08), pp. 1–8, June 2008.
[4] H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, no. 5828, pp. 133–135, 1981.
[5] J. Gluckman and S. K. Nayar, "Catadioptric stereo using planar mirrors," International Journal of Computer Vision, vol. 44, no. 1, pp. 65–79, 2001.
[6] D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 528–541, 2000.
[7] T. Svoboda and T. Pajdla, "Epipolar geometry for central catadioptric cameras," International Journal of Computer Vision, vol. 49, no. 1, pp. 23–37, 2002.
[8] D. Southwell, A. Basu, M. Fiala, and J. Reyda, "Panoramic stereo," in Proceedings of the International Conference on Pattern Recognition, vol. 1, pp. 378–382, 1996.
[9] A. Criminisi, I. Reid, and A. Zisserman, "Single view metrology," International Journal of Computer Vision, vol. 40, no. 2, pp. 123–148, 2000.
[10] A. Criminisi, "Single-view metrology: algorithms and applications," in Proceedings of the 24th DAGM Symposium on Pattern Recognition, vol. 2449 of Lecture Notes in Computer Science, pp. 224–239, 2002.
[11] G. Wang, Z. Hu, F. Wu, and H. Tsui, "Single view metrology from scene constraints," Image and Vision Computing, vol. 23, no. 9, pp. 831–840, 2005.
[12] A. Saxena, S. H. Chung, and A. Y. Ng, "3-D depth reconstruction from a single still image," International Journal of Computer Vision, vol. 76, no. 1, pp. 53–69, 2008.
[13] J. Rekimoto, "Tilting operations for small screen interfaces," in Proceedings of the 9th ACM Annual Symposium on User Interface Software and Technology, pp. 167–168, November 1996.
[14] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[15] J. C. K. Chou and M. Kamel, "Finding the position and orientation of a sensor on a robot manipulator using quaternions," International Journal of Robotics Research, vol. 10, no. 3, pp. 240–254, 1991.
[16] F. Dornaika and R. Horaud, "Simultaneous robot-world and hand-eye calibration," IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617–622, 1998.
[17] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 4647–4653, October 2006.
[18] I. Y. Bar-Itzhack, "New method for extracting the quaternion from a rotation matrix," Journal of Guidance, Control, and Dynamics, vol. 23, no. 6, pp. 1085–1087, 2000.
[19] M. W. Walker, L. Shao, and R. A. Volz, "Estimating 3-D location parameters using dual number quaternions," CVGIP: Image Understanding, vol. 54, no. 3, pp. 358–367, 1991.
[20] G. Bradski and A. Kaehler, Learning OpenCV, O'Reilly Media, 2008.
International Journal of
AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Active and Passive Electronic Components
Control Scienceand Engineering
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
RotatingMachinery
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation httpwwwhindawicom
Journal ofEngineeringVolume 2014
Submit your manuscripts athttpwwwhindawicom
VLSI Design
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Shock and Vibration
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Civil EngineeringAdvances in
Acoustics and VibrationAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Electrical and Computer Engineering
Journal of
Advances inOptoElectronics
Hindawi Publishing Corporation httpwwwhindawicom
Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
SensorsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Chemical EngineeringInternational Journal of Antennas and
Propagation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Navigation and Observation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
DistributedSensor Networks
International Journal of
6 ISRNMachine Vision
(7) The corresponding image coordinate in the image 1198682
(1198832 1198842) can be determined by stereo correspondence
method for example the function FindStereoCorre-spondenceBM in Opencv [20] or by manual
(8) The 3D coordinate of the point on the object relativeto 1198741-119909119910119911 (119909
1 1199101 1199111) can be figured out by (6)
6 Experiments
Since digital camera with 3-axis accelerometer sensor is notavailable for us IPhone 4 which has the sensor is adopted Tosimulate digital camera the phone is placed in a box which isfixed on a camera tripod And it is ensured that the 119911 axis ofthe sensor (experiment results show that the axis is parallel tothe optical axis of the camera) is parallel to the rotation axisof the camera tripod so that the value of 119866
119911keeps steady in
the course of the rotation In calibration course the distancebetween the camera and the chessboard is about 1000mm
Figure 4 illustrates the curve of the quaternion of therotation matrix between the camera coordinate systems atthe 119894th position and 1st position with respect to the rotationangle The quaternion 119902
1015840= [1199021015840
0 1199021015840
119909 1199021015840
119910 1199021015840
119911] was calculated
by the proposed method with rotation angle while 119902 =
[1199020 119902119909 119902119910 119902119911] was converted directly from rotation matrix
1198771198941 which was from calibration data From the graph one
can see that the proposed method can calculate the rotationmatrix by rotation angle
Figure 5 plots the curve of translation vector between thecamera coordinate systems at the 119894th position and 1st positionwith respect to rotation angle The vector 119879
1015840= [1198791015840
119909 1198791015840
119910 1198791015840
119911]
was calculated by the proposed method with rotation anglewhile 119879 = [119879
119909 119879119910 119879119911] was directly from calibration data 119879
1198941
From the graphs one can see that the proposed method cancalculate effectively translation vector by rotation angle
In order to estimate the accuracy of 3D coordinatecalculated by the proposed method the chessboard whichhas a bigger block than the one for calibration is printed Thewidth of the block is 466mm and the distance between thechessboard and the camera is about 1200mm The distancebetween two neighbor corners rather than the distancebetween the camera and the chessboard is calculated becausethe measurement of the former by manual is easier For sim-plicity the corners of the blocks in images are automaticallydetected by calling the Opencv function ldquocvFindChessboard-Cornersrdquo [19] Figure 6 depicts the measurement error ratioof the distance between two corners with respect to rotationangle
7 Conclusions
This paper proposed a stereo vision system with singlecamera which requires digital camera with 3-axis accelerom-eter sensor rotating around the fixed axis which is parallelto the axis 119911 of the sensor Under these conditions theslope angle relative to gravity which can be figured outfrom the readouts of the sensor can determine the cameraposition and the rotation angle between two positions candetermine the rotation matrix and translation vector from
0
02
04
06
08
1
12
0 01 02 03 04 05 06 07 08 09Rotation angle
minus02
q0qxqyqz
q0998400
qx998400
qy998400
qz998400
Figure 4 Curve of the quaternion of rotationmatrix with respect torotation angle 119902 = [119902
0 119902119909 119902119910 119902119911] is converted from rotation matrix
and 1199021015840= [1199021015840
0 1199021015840
119909 1199021015840
119910 1199021015840
119911] is calculated by the proposed method
0
10
20
30
40
50
60
0 01 02 03 04 05 06 07 08 09
(mm
)
Rotation angleminus10
TxTyTz
Tx998400
Ty998400
Tz998400
Figure 5 Curve of translation vector with respect to rotation angle119879 = [119879
119909 119879119910 119879119911] was directly from calibration data and 119879
1015840=
[1198791015840
119909 1198791015840
119910 1198791015840
119911] was calculated by (23) with rotation angle
one coordinate system of the camera to another Accordinglygiven the rotation angle and the images of the object at twodifferent positions one before and the other after camerarotation the 3D coordinates of the points on the object canbe determined Theoretical analysis and experimental resultsshow the validity of our method
It should be noticed that few digital cameras are providedwith 3-axis accelerometer sensor However to obtain stereovision we believe that the inexpensive sensor embedded in
ISRNMachine Vision 7
0
2
4
6
8
10
12
0 01 02 03 04 05 06 07 08
Erro
r rat
io (
)
Rotation angle
Figure 6 Measurement error ratio () of the distance between twocorners with respect to rotation angle
ISRN Machine Vision 7
[Figure 6: Measurement error ratio (%) of the distance between two corners with respect to rotation angle. y-axis: error ratio (%), 0 to 12; x-axis: rotation angle, 0 to 0.8.]
digital camera is worthy. Moreover, due to the higher image quality and larger focus range, higher accuracy and a larger range of measurement may be obtained. Furthermore, smart phones equipped with this sensor are popular: if a mini fixed rotation axis is built into a corner of the phone and does not move in the course of rotation, then with the proposed method the phone can estimate the size of the object being focused on and the distance between the phone and the object.
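The pipeline summarized above (rotation angle from the accelerometer, relative pose of the two camera positions from the calibrated rotation axis, then triangulation of corresponding image points) can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's implementation: the axis direction `n` and axis point `p` stand in for the calibrated axis parameters, `K` for the intrinsics obtained from camera calibration, and a standard linear (DLT) triangulation is used in place of the paper's solver.

```python
import numpy as np

def axis_rotation(theta, n, p):
    """Rigid transform produced by rotating the camera by angle theta
    (radians) about a fixed axis with direction n passing through point p,
    both expressed in the first camera's coordinate system."""
    n = n / np.linalg.norm(n)
    # Rodrigues' formula: R = I + sin(theta) K + (1 - cos(theta)) K^2,
    # where K is the skew-symmetric cross-product matrix of n.
    Kx = np.array([[0.0, -n[2], n[1]],
                   [n[2], 0.0, -n[0]],
                   [-n[1], n[0], 0.0]])
    R = np.eye(3) + np.sin(theta) * Kx + (1.0 - np.cos(theta)) * (Kx @ Kx)
    # Points on the axis are fixed by the motion: R p + t = p.
    t = p - R @ p
    return R, t

def triangulate(K, R, t, x1, x2):
    """Linear (DLT) triangulation of one correspondence x1 <-> x2
    (pixel coordinates) given intrinsics K and relative pose (R, t)."""
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t.reshape(3, 1)])
    # Each view contributes two rows: x * (P[2].X) - P[0].X = 0, etc.
    A = np.array([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to a 3D point in the first frame
```

In use, one would read theta from the 3-axis accelerometer before and after rotation, take `n` and `p` from the axis-calibration step, and feed matched corner coordinates from the two images into `triangulate`; distances and sizes then follow from the recovered 3D coordinates.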
Acknowledgment
This work is supported by the Science and Technology Research Project of Chongqing's Education Committee (KJ110806).
References
[1] Y. Sooyeong and N. Ahuja, "An omnidirectional stereo vision system using a single camera," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), pp. 861–865, August 2006.
[2] D. Marr and T. Poggio, "A computational theory of human stereo vision," Proceedings of the Royal Society of London, vol. 204, no. 1156, pp. 301–328, 1979.
[3] F. Tombari, S. Mattoccia, L. D. Stefano, and E. Addimanda, "Classification and evaluation of cost aggregation methods for stereo correspondence," in Proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR '08), pp. 1–8, June 2008.
[4] H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, no. 5828, pp. 133–135, 1981.
[5] J. Gluckman and S. K. Nayar, "Catadioptric stereo using planar mirrors," International Journal of Computer Vision, vol. 44, no. 1, pp. 65–79, 2001.
[6] D. Lee and I. Kweon, "A novel stereo camera system by a biprism," IEEE Transactions on Robotics and Automation, vol. 16, no. 5, pp. 528–541, 2000.
[7] T. Svoboda and T. Pajdla, "Epipolar geometry for central catadioptric cameras," International Journal of Computer Vision, vol. 49, no. 1, pp. 23–37, 2002.
[8] D. Southwell, A. Basu, M. Fiala, and J. Reyda, "Panoramic stereo," in Proceedings of the International Conference on Pattern Recognition, vol. 1, pp. 378–382, 1996.
[9] A. Criminisi, I. Reid, and A. Zisserman, "Single view metrology," International Journal of Computer Vision, vol. 40, no. 2, pp. 123–148, 2000.
[10] A. Criminisi, "Single-view metrology: algorithms and applications," in Proceedings of the 24th DAGM Symposium on Pattern Recognition, vol. 2449 of Lecture Notes in Computer Science, pp. 224–239, 2002.
[11] G. Wang, Z. Hu, F. Wu, and H. Tsui, "Single view metrology from scene constraints," Image and Vision Computing, vol. 23, no. 9, pp. 831–840, 2005.
[12] A. Saxena, S. H. Chung, and A. Y. Ng, "3-D depth reconstruction from a single still image," International Journal of Computer Vision, vol. 76, no. 1, pp. 53–69, 2008.
[13] J. Rekimoto, "Tilting operations for small screen interfaces," in Proceedings of the 9th ACM Annual Symposium on User Interface Software and Technology, pp. 167–168, November 1996.
[14] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[15] J. C. K. Chou and M. Kamel, "Finding the position and orientation of a sensor on a robot manipulator using quaternions," International Journal of Robotics Research, vol. 10, no. 3, pp. 240–254, 1991.
[16] F. Dornaika and R. Horaud, "Simultaneous robot-world and hand-eye calibration," IEEE Transactions on Robotics and Automation, vol. 14, no. 4, pp. 617–622, 1998.
[17] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06), pp. 4647–4653, October 2006.
[18] I. Y. Bar-Itzhack, "New method for extracting the quaternion from a rotation matrix," Journal of Guidance, Control, and Dynamics, vol. 23, no. 6, pp. 1085–1087, 2000.
[19] M. W. Walker, L. Shao, and R. A. Volz, "Estimating 3-D location parameters using dual number quaternions," Image Understanding, vol. 54, no. 3, pp. 358–367, 1991.
[20] G. Bradski and A. Kaehler, Learning OpenCV, O'Reilly Media, 2008.