Bundle Adjustment: Sparse Estimation in Multi-View Geometry
Manmohan Krishna Chandraker
CSE 252C, Fall 2004, UCSD
Overview
• Recap
• Exploiting structure in bundle adjustment
• Partitioned Levenberg-Marquardt
• Solving sparse linear systems
• Banded linear systems
Notation
For an entity q:
• q - true value of the entity
• q̄ - measured value of the entity
• q̂ - estimated value of the entity
Gauss-Newton
• Given measurement X ∈ R^N and function f, estimate parameter P̂ ∈ R^M such that

‖ǫ‖ = ‖f(P̂) − X‖

is minimized.
• Solution:
  • Initial guess: P0.
  • Assume linearity around P0.
  • ǫ0 = X − f(P0).
  • Find P1 = P0 + ∆ to minimize ‖X − f(P1)‖ = ‖X − f(P0) − J∆‖ = ‖ǫ0 − J∆‖.
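As a concrete sketch of the steps above (the exponential model, iteration count, and all variable names are illustrative choices, not from the slides):

```python
import numpy as np

# Toy nonlinear model: f(P) = p0 * exp(p1 * t), fit to synthetic data X.
t = np.array([0.0, 0.3, 0.6, 0.9])
P_true = np.array([2.0, 0.5])

def f(P):
    return P[0] * np.exp(P[1] * t)

def jacobian(P):
    # J[i, k] = d f_i / d P_k  (N x M, here 4 x 2), evaluated analytically.
    return np.column_stack([np.exp(P[1] * t), P[0] * t * np.exp(P[1] * t)])

X = f(P_true)                  # synthetic noiseless measurements
P = np.array([1.8, 0.4])       # initial guess P0
for _ in range(20):
    eps = X - f(P)             # residual eps = X - f(P)
    J = jacobian(P)
    delta = np.linalg.solve(J.T @ J, J.T @ eps)   # normal equations
    P = P + delta              # P_{k+1} = P_k + delta

assert np.allclose(P, P_true, atol=1e-6)
```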
Jacobian
• J = ∂X̂/∂P evaluated at P̂:

J = [ ∂X̂1/∂P1  ∂X̂1/∂P2  ···  ∂X̂1/∂PM ]
    [    ⋮          ⋮      ⋱      ⋮     ]
    [ ∂X̂N/∂P1  ∂X̂N/∂P2  ···  ∂X̂N/∂PM ]    (N × M)

• Update equations: J∆ = ǫ
• Normal equations: J⊤J∆ = J⊤ǫ
Levenberg-Marquardt
• Augmented normal equations:

(J⊤J + λI)∆ = J⊤ǫ

• O(n³) in the number of parameters
• Homography computation
  • Noiseless case
  • Noisy correspondences
• Bundle adjustment
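A small numerical illustration of the augmented normal equations (the J and ǫ values below are made up): as λ grows the damped step shrinks toward a short gradient-descent-like step, and as λ → 0 it approaches the Gauss-Newton step.

```python
import numpy as np

# Made-up 3x2 Jacobian and residual, purely for illustration.
J = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
eps = np.array([0.1, 0.3, 0.2])

def lm_step(lam):
    # Solve the augmented normal equations (J'J + lam*I) delta = J'eps.
    return np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ eps)

gn = lm_step(0.0)         # lambda -> 0: pure Gauss-Newton step
damped = lm_step(100.0)   # large lambda: short, gradient-like step
assert np.allclose(gn, [0.15, 0.05])
assert np.linalg.norm(damped) < np.linalg.norm(gn)
```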
Bundle Adjustment : Jacobian
Primary & Secondary Structure
Exploiting Structure
Observations:
• Hessian coarsely divided into 4 blocks:

H = [ A  B ]
    [ C  D ]

• Hessian is symmetric.
• Blocks A and D are sparse and block diagonal.
Exploiting Partitioning
• Schur complement.
• Reduced bundle system.
Exploiting Symmetry
• A square, non-singular matrix A can be factorized as LDU, where L is lower triangular with unit diagonal entries, U is upper triangular, and D is a diagonal matrix.
• If A is also symmetric, U = L⊤.
• If A is also positive definite, the entries of D are positive, so √D can be absorbed into L to give the Cholesky factorization A = LL⊤ (i.e. D = I).
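The positive definite case can be checked numerically: absorbing √D into L gives the Cholesky factorization A = LL⊤. The matrix below is an arbitrary SPD example, not from the slides.

```python
import numpy as np

# Arbitrary symmetric positive definite matrix for illustration.
A = np.array([[4.0, 2.0, 0.0],
              [2.0, 5.0, 1.0],
              [0.0, 1.0, 3.0]])

L = np.linalg.cholesky(A)          # lower-triangular Cholesky factor
assert np.allclose(L @ L.T, A)     # A = L L^T (the D = I case)
assert np.allclose(L, np.tril(L))  # L really is lower triangular
```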
Symmetric System Solver
Given Ax = LDL⊤x = b, solve for x:
• Forward substitution: solve Lx′ = b in order for the components of x′:

x′_i = b_i − Σ_{j=1}^{i−1} L_ij x′_j

• Scaling: solve Dx″ = x′.
• Back-substitution: solve L⊤x = x″ in order for the components of x:

x_i = x″_i − Σ_{j=i+1}^{n} L_ji x_j
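A sketch of the three-stage solve, with a hand-rolled unit-lower-triangular LDL⊤ factorization (no pivoting, which is safe for the symmetric positive definite example used here; all names are illustrative):

```python
import numpy as np

def ldl(A):
    # Unit-lower-triangular L and diagonal D (as a vector) with A = L D L^T.
    n = A.shape[0]
    L, D = np.eye(n), np.zeros(n)
    for i in range(n):
        D[i] = A[i, i] - np.sum(L[i, :i] ** 2 * D[:i])
        for k in range(i + 1, n):
            L[k, i] = (A[k, i] - np.sum(L[k, :i] * L[i, :i] * D[:i])) / D[i]
    return L, D

def solve_ldl(L, D, b):
    n = len(b)
    xp = np.zeros(n)                       # forward substitution: L x' = b
    for i in range(n):
        xp[i] = b[i] - L[i, :i] @ xp[:i]
    xpp = xp / D                           # scaling: D x'' = x'
    x = np.zeros(n)                        # back-substitution: L^T x = x''
    for i in range(n - 1, -1, -1):
        x[i] = xpp[i] - L[i + 1:, i] @ x[i + 1:]
    return x

A = np.array([[4.0, 2.0, 0.0], [2.0, 5.0, 1.0], [0.0, 1.0, 3.0]])
b = np.array([1.0, 2.0, 3.0])
L, D = ldl(A)
x = solve_ldl(L, D, b)
assert np.allclose(L @ np.diag(D) @ L.T, A)   # factorization is correct
assert np.allclose(A @ x, b)                  # three-stage solve is correct
```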
Sparse Factorization
• Arrowhead matrices.
• Block tridiagonal systems.
• Divide and conquer: recursive partitioning.
Arrowhead Matrices
Trivial LDL⊤ decomposition.
Block Tridiagonal Systems
• L and U factors are also block tridiagonal.
• Reduced system obtained by recursive 2 × 2 Schur complementation.
Recursive Partitioning
• Construct elimination graph.
• Find a separating vertex cut.
• Re-order into connected components, separating ones last.
Partitioned L-M
• Parameter vector, P = [a⊤, b⊤]⊤ ∈ R^M.
• Measurement vector, X ∈ R^N (given).
• Objective: find P that minimizes the squared Mahalanobis distance

‖ǫ‖²_ΣX = ǫ⊤ Σ_X⁻¹ ǫ

where ǫ = X − X̂.
Partitioned Jacobian
• Jacobian sub-matrices: A = [∂X̂/∂a], B = [∂X̂/∂b]
• Update equation:

J∆ = [A | B] (∆a⊤, ∆b⊤)⊤ = ǫ

• Normal equations:

J⊤J∆ = J⊤ǫ
Normal Equations
• Normal equations:

[ A⊤A  A⊤B ] ( ∆a )   ( A⊤ǫ )
[ B⊤A  B⊤B ] ( ∆b ) = ( B⊤ǫ )

• Augmented normal equations:

[ U*   W  ] ( ∆a )   ( ǫA )
[ W⊤   V* ] ( ∆b ) = ( ǫB )
Solution
• Pre-multiply by

[ I  −WV*⁻¹ ]
[ 0     I   ]

to get

[ U* − WV*⁻¹W⊤   0  ] ( ∆a )   ( ǫA − WV*⁻¹ǫB )
[      W⊤        V* ] ( ∆b ) = (      ǫB      )

• Solve for ∆a:

(U* − WV*⁻¹W⊤) ∆a = ǫA − WV*⁻¹ǫB

• Back-substitute for ∆b:

V*∆b = ǫB − W⊤∆a
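The partitioned solve can be checked against a direct dense solve on a random symmetric positive definite system (block sizes and all names below are arbitrary illustrations):

```python
import numpy as np

# Random SPD "Hessian" partitioned as [U W; W^T V].
rng = np.random.default_rng(0)
M = rng.standard_normal((7, 7))
H = M @ M.T + 7 * np.eye(7)
U, W, V = H[:3, :3], H[:3, 3:], H[3:, 3:]
eA, eB = rng.standard_normal(3), rng.standard_normal(4)

# Schur-complement solve: reduced system for da, then back-substitution.
WVinv = W @ np.linalg.inv(V)
da = np.linalg.solve(U - WVinv @ W.T, eA - WVinv @ eB)
db = np.linalg.solve(V, eB - W.T @ da)

# Agrees with solving the full system directly.
full = np.linalg.solve(H, np.concatenate([eA, eB]))
assert np.allclose(np.concatenate([da, db]), full)
```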
Update
• Pnew = P + (∆a⊤, ∆b⊤)⊤.
• ǫnew = X − f(Pnew).
• If ‖ǫnew‖ ≤ ‖ǫ‖: λ ← λ/10; else: λ ← 10λ.
• Re-iterate until convergence.
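The full accept/reject loop might look like this sketch (the toy exponential model, initial λ, and iteration cap are illustrative assumptions, not from the slides):

```python
import numpy as np

# Toy model f(P) = p0 * exp(p1 * t), fit to synthetic measurements.
t = np.linspace(0.0, 1.0, 5)
X = 2.0 * np.exp(0.5 * t)

def f(P):
    return P[0] * np.exp(P[1] * t)

def jac(P):
    return np.column_stack([np.exp(P[1] * t), P[0] * t * np.exp(P[1] * t)])

P, lam = np.array([1.0, 0.0]), 1e-3
for _ in range(50):
    eps = X - f(P)
    J = jac(P)
    delta = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ eps)
    P_new = P + delta
    eps_new = X - f(P_new)
    if np.linalg.norm(eps_new) <= np.linalg.norm(eps):
        P, lam = P_new, lam / 10   # accept the step, trust the model more
    else:
        lam = lam * 10             # reject the step, increase damping

assert np.allclose(P, [2.0, 0.5], atol=1e-6)
```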
Sparse Levenberg-Marquardt
• Measurement vector, X = (X1⊤, · · · , Xn⊤)⊤ ∈ R^N.
• Independent measurements ⇒ block-diagonal covariance matrix:

Σ_X = diag(Σ_X1, Σ_X2, · · · , Σ_Xn)
Sparseness
• Parameter vector, P = (a⊤, b1⊤, · · · , bn⊤)⊤ ∈ R^M.
• Sparseness assumption: X̂i depends on a and bi only:

∂X̂i/∂bj = 0, for i ≠ j
Sparse Jacobian
• Jacobian sub-matrices: Ai = [∂X̂i/∂a], Bi = [∂X̂i/∂bi]
• Error vector: ǫ = X − X̂ = (ǫ1⊤, · · · , ǫn⊤)⊤.
• Update equations:

     [ A1  B1          ] ( ∆a  )   ( ǫ1 )
J∆ = [ A2      B2      ] ( ∆b1 ) = ( ⋮  )
     [ ⋮          ⋱    ] (  ⋮  )   ( ǫn )
     [ An          Bn  ] ( ∆bn )
Solution
U = Σ_i Ai⊤Ai
V = diag(V1, · · · , Vn), where Vi = Bi⊤Bi
W = [W1, · · · , Wn], where Wi = Ai⊤Bi
ǫB = (ǫB1⊤, · · · , ǫBn⊤)⊤, where ǫBi = Bi⊤ǫi
ǫA = Σ_i Ai⊤ǫi,  Yi = Wi V*i⁻¹

• Compute ∆a: (U* − Σ_i Yi Wi⊤) ∆a = ǫA − Σ_i Yi ǫBi.
• Compute ∆b: ∆bi = V*i⁻¹ (ǫBi − Wi⊤ ∆a).
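A sketch of this update on random blocks, verified against the dense augmented normal equations (the sizes and the damping value λ = 1e-3 are illustrative; V* is inverted one small block at a time):

```python
import numpy as np

# Illustrative sizes: a is 4-dim, each b_i is 3-dim, each measurement 2-dim.
rng = np.random.default_rng(1)
n, ka, kb, lam = 6, 4, 3, 1e-3
A = [rng.standard_normal((2, ka)) for _ in range(n)]
B = [rng.standard_normal((2, kb)) for _ in range(n)]
eps = [rng.standard_normal(2) for _ in range(n)]

Ustar = sum(Ai.T @ Ai for Ai in A) + lam * np.eye(ka)     # U* = U + lam*I
Vstar = [Bi.T @ Bi + lam * np.eye(kb) for Bi in B]        # V_i* blocks
W = [Ai.T @ Bi for Ai, Bi in zip(A, B)]
eA = sum(Ai.T @ e for Ai, e in zip(A, eps))
eB = [Bi.T @ e for Bi, e in zip(B, eps)]
Y = [Wi @ np.linalg.inv(Vi) for Wi, Vi in zip(W, Vstar)]  # per-block inverse

# Reduced ka x ka system for delta_a, then per-block back-substitution.
S = Ustar - sum(Yi @ Wi.T for Yi, Wi in zip(Y, W))
da = np.linalg.solve(S, eA - sum(Yi @ ei for Yi, ei in zip(Y, eB)))
db = [np.linalg.solve(Vi, ei - Wi.T @ da)
      for Vi, Wi, ei in zip(Vstar, W, eB)]

# Consistency check against the dense augmented normal equations.
J = np.zeros((2 * n, ka + kb * n))
for i in range(n):
    J[2*i:2*i+2, :ka] = A[i]
    J[2*i:2*i+2, ka+kb*i:ka+kb*(i+1)] = B[i]
delta = np.linalg.solve(J.T @ J + lam * np.eye(ka + kb * n),
                        J.T @ np.concatenate(eps))
assert np.allclose(da, delta[:ka])
assert np.allclose(np.concatenate(db), delta[ka:])
```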
Sparseness Advantage
The matrix V* is block diagonal, so:
• Inversion is easy.
• The update equations can be solved one block at a time.
• Each step of the algorithm is linear, so total complexity is O(n).
2D Homography Estimation
• Measurements: (xi, x′i).
• Measurements are noisy versions of the true points (x̄i, x̄′i).
• Objective: minimize

Σ_i d(xi, x̄i)² + d(x′i, Hx̄i)²

w.r.t. the parameter vector P = (h, x̄1, · · · , x̄n)⊤.
Sparsity Structure
• Estimated measurement vector, f(P) = (X̂1⊤, X̂2⊤, · · · , X̂n⊤)⊤.
• Sparseness assumption: X̂i depends only on H and x̂i, and is independent of x̂j for j ≠ i.
• Sparse Jacobian:

     [ A1  B1          ] ( ∆a  )
J∆ = [ A2      B2      ] ( ∆b1 )
     [ ⋮          ⋱    ] (  ⋮  )
     [ An          Bn  ] ( ∆bn )
More ...
• X̂i = (x̂i⊤, (Hx̂i)⊤)⊤ = (x̂i⊤, x̂′i⊤)⊤.
• x̂i is independent of h:

Ai = ∂X̂i/∂h = [    0     ]
              [ ∂x̂′i/∂h ]

• Similarly,

Bi = ∂X̂i/∂x̂i = [     I     ]
               [ ∂x̂′i/∂x̂i ]
Bundle Adjustment
• m cameras, n points.
• xij : image of the i-th point in the j-th camera.
• Measurement: X = (X1⊤, · · · , Xn⊤)⊤, where Xi = (xi1⊤, xi2⊤, · · · , xim⊤)⊤.
• Parameter vector: (a⊤, b⊤)⊤
  • Camera parameters, a = (a1⊤, · · · , am⊤)⊤.
  • 3D point parameters, b = (b1⊤, · · · , bn⊤)⊤.
• Objective: minimize total reprojection error.
Sparsity Structure
• x̂ij depends only on the j-th camera:

∂x̂ij/∂ak = 0 for j ≠ k

• Jacobian submatrix:

Ai = [∂X̂i/∂a] = diag(Ai1, · · · , Aim), where Aij = ∂x̂ij/∂aj.
Sparsity Structure (contd.)
• Image x̂ij depends only on the i-th 3D point, Xi:

∂x̂ij/∂bk = 0 for i ≠ k

• Jacobian submatrix:

Bi = [∂X̂i/∂bi] = [Bi1⊤, · · · , Bim⊤]⊤, where Bij = ∂x̂ij/∂bi.
Sparsity Structure (contd.)
• Completely uncorrelated measurements:
  • Σ_X = diag(Σ_X1, · · · , Σ_Xn).
  • Σ_Xi = diag(Σ_xi1, · · · , Σ_xim).
Sparse Jacobian

Figure 1: Bundle adjustment Jacobian: 3 cameras, 4 features. Columns correspond to a1, a2, a3, b1, b2, b3, b4.
Sparse Hessian

Figure 2: Bundle adjustment Hessian: 3 cameras and 4 features.
Algorithm: Bundle Adjustment
• Compute the error vectors ǫij = xij − x̂ij and derivative matrices

Aij = [∂x̂ij/∂aj]    Bij = [∂x̂ij/∂bi]

• Compute, for i = 1, · · · , n and j = 1, · · · , m:

Uj = Σ_i Aij⊤Aij      Vi = Σ_j Bij⊤Bij
ǫaj = Σ_i Aij⊤ǫij     ǫbi = Σ_j Bij⊤ǫij
Wij = Aij⊤Bij         Yij = Wij V*i⁻¹
Algorithm (contd.)
• Compute ∆a = (∆a1⊤, · · · , ∆am⊤)⊤ from

S∆a = (ǫ1⊤, · · · , ǫm⊤)⊤

where S is an m × m block matrix with blocks

Sjk = −Σ_i Yij Wik⊤ + U*j δjk,    ǫj = ǫaj − Σ_i Yij ǫbi.

• Back-substitute for ∆b = (∆b1⊤, · · · , ∆bn⊤)⊤:

∆bi = V*i⁻¹ (ǫbi − Σ_j Wij⊤ ∆aj)
Missing Data
• Each point is visible only in some arbitrary subset of views.
• Suppose the i-th point is not visible in the j-th image (xij not defined).
• Simply ignore terms subscripted by ij.
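A sketch of assembling the reduced camera matrix S under missing data (the visibility pattern, block sizes, and damping value are illustrative assumptions): terms for invisible (i, j) pairs are simply skipped, and the block Sjk vanishes whenever cameras j and k share no point.

```python
import numpy as np

# Illustrative sizes: n points, m cameras, kc camera / kp point dims.
rng = np.random.default_rng(2)
n, m, kc, kp = 5, 4, 6, 3
visible = np.array([[1, 1, 0, 0],
                    [1, 1, 0, 0],
                    [0, 1, 1, 0],
                    [0, 0, 1, 1],
                    [0, 0, 1, 1]], dtype=bool)   # visible[i, j]

A = {(i, j): rng.standard_normal((2, kc))
     for i in range(n) for j in range(m) if visible[i, j]}
B = {(i, j): rng.standard_normal((2, kp))
     for i in range(n) for j in range(m) if visible[i, j]}

Ustar = [sum(A[i, j].T @ A[i, j] for i in range(n) if visible[i, j])
         + 1e-3 * np.eye(kc) for j in range(m)]
Vstar = [sum(B[i, j].T @ B[i, j] for j in range(m) if visible[i, j])
         + 1e-3 * np.eye(kp) for i in range(n)]
W = {(i, j): A[i, j].T @ B[i, j] for (i, j) in A}
Y = {(i, j): W[i, j] @ np.linalg.inv(Vstar[i]) for (i, j) in W}

# Assemble S_jk = -sum_i Y_ij W_ik^T + U_j* delta_jk over visible pairs.
S = np.zeros((m * kc, m * kc))
for j in range(m):
    for k in range(m):
        Sjk = -sum(Y[i, j] @ W[i, k].T
                   for i in range(n) if visible[i, j] and visible[i, k])
        if j == k:
            Sjk = Sjk + Ustar[j]
        S[j*kc:(j+1)*kc, k*kc:(k+1)*kc] = Sjk

# Cameras 0 and 2 share no point, so that block of S is exactly zero;
# cameras 0 and 1 share points 0 and 1, so their block is non-zero.
assert np.allclose(S[0:kc, 2*kc:3*kc], 0)
assert not np.allclose(S[0:kc, kc:2*kc], 0)
```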
Sparse Tracks

Figure 3: Track lifetimes. In a tracking sequence, most tracks survive for only a fraction of the total sequence length.
Banded Normal Equations
• Limited-bandwidth tracking leads to a banded structure for the matrix S.
• Update equations are of the form

S∆a = ǫ

where S is symmetric, positive definite.
Banded Structure
Block Sjk ≠ [0] ⇔ some point is visible in both the j-th and k-th images.
Proof:
• Sjk = −Σ_i Wij V*i⁻¹ Wik⊤ (for j ≠ k).
• Wij = [∂x̂ij/∂aj]⊤ [∂x̂ij/∂bi].
• Wij ≠ [0] ⇔ feature i is visible to camera j.
• Sjk ≠ [0] ⇔ (Wij ≠ [0] and Wik ≠ [0]) for some i.
Banded Sparse Systems

Figure 4: A banded sparse matrix. Black shows non-zero entries.
Skyline Structure

Figure 5: The skyline structure of a sparse matrix. All non-zero entries lie within the shaded region.
Skyline Decomposition
Let A be a symmetric matrix such that Aij = 0 for j < mi, and let A = LDL⊤. Then Lij = 0 for j < mi: the skyline structure of a matrix is preserved under LDL⊤ decomposition.
Solving Banded Systems
• Forward substitution:

x′_i = b_i − Σ_{j=mi}^{i−1} L_ij x′_j

• Back substitution (L⊤ inherits the skyline column-wise, so Lji = 0 for i < mj):

x_i = x″_i − Σ_{j=i+1}^{n} L_ji x_j, where only terms with i ≥ mj contribute.
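A sketch of forward substitution that exploits the skyline (the matrix and the row-start indices mi below are an illustrative example, not from the slides): row i of L first becomes non-zero at column mi[i], so earlier terms are skipped entirely.

```python
import numpy as np

# Unit-lower-triangular L with a skyline: row i starts at column mi[i].
L = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.5, 1.0, 0.0],
              [0.2, 0.1, 0.3, 1.0]])
mi = np.array([0, 1, 1, 0])        # first non-zero column in each row
b = np.array([1.0, 2.0, 3.0, 4.0])

xp = np.zeros(4)
for i in range(4):
    j0 = mi[i]                     # skip the guaranteed-zero prefix
    xp[i] = b[i] - L[i, j0:i] @ xp[j0:i]

assert np.allclose(L @ xp, b)      # agrees with the full forward solve
```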
Summary
• Structure and sparsity help in optimization.
• Bundle adjustment is inherently sparse at various levels.
• A carefully structured Levenberg-Marquardt algorithm can exploit these sparsities at all levels.