
Page 1: Image Segmentation

Image Segmentation

February 27, 2007

Page 2: Image Segmentation

The implicit scheme handles topological changes considerably better.

• Transition from Active Contours:

– contour v(t) → front Γ(t)

– contour energy → forces FA, FC

– image energy → speed function kI

• Level set:

– The level set c0 at time t of a function φ(x,y,t) is the set of arguments { (x,y) , φ(x,y,t) = c0 }

– Idea: define a function φ(x,y,t) so that, at any time, Γ(t) = { (x,y) , φ(x,y,t) = 0 }

• there are many such φ

• φ has many other level sets, more or less parallel to Γ

• only Γ has a meaning for segmentation, not any other level set of φ

Page 3: Image Segmentation

Need to figure out how to evolve the level set function!

Page 4: Image Segmentation

Usual choice for φ: signed distance to the front Γ(0)

φ(x,y,0) = − d(x,y, Γ(0)) if (x,y) is inside the front

φ(x,y,0) = 0 if (x,y) is on the front

φ(x,y,0) = + d(x,y, Γ(0)) if (x,y) is outside the front

[Figure: grid of φ(x,y,0) values, the signed distance to the front Γ(t): negative inside, zero on the front, positive outside.]

Level Set Framework

Page 5: Image Segmentation

[Figure: grid of φ(x,y,t) values before the update.]

φ(x,y,t+1) = φ(x,y,t) + ∆φ(x,y,t)

[Figure: grid of φ(x,y,t+1) values after the update.]

• no movement, only change of values

• the front may change its topology

• the front location may be between samples

Level Set

Page 6: Image Segmentation

Segmentation with LS:

• Initialise the front Γ(0)

• Compute φ(x,y,0)

• Iterate:

φ(x,y,t+1) = φ(x,y,t) + ∆φ(x,y,t)

until convergence

• Mark the front Γ(t_end)

Level Set
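A minimal sketch of this loop in Python/NumPy. The helper names, the convergence tolerance and the `evolve_step` callable (a stand-in for the update rule of the following slides) are assumptions, not part of the slides:

```python
import numpy as np
from scipy import ndimage

def signed_distance(inside_mask):
    """phi(x,y,0): negative inside the front Gamma(0), positive outside, ~0 on it."""
    inside = inside_mask.astype(bool)
    dist_outside = ndimage.distance_transform_edt(~inside)  # distance to the region, for outside pixels
    dist_inside = ndimage.distance_transform_edt(inside)    # distance to the background, for inside pixels
    return dist_outside - dist_inside

def segment(inside_mask, evolve_step, n_iter=500, tol=1e-3):
    phi = signed_distance(inside_mask)            # compute phi(x,y,0) from the initial front
    for _ in range(n_iter):                       # iterate phi(t+1) = phi(t) + delta_phi(t)
        delta_phi = evolve_step(phi)
        phi = phi + delta_phi
        if np.max(np.abs(delta_phi)) < tol:       # until convergence
            break
    return phi <= 0                               # the region enclosed by the front Gamma(t_end)
```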

Page 7: Image Segmentation

∂φ/∂t + k̂I ( FA + FG(κ) ) |∇φ| = 0

• ∂φ/∂t: discretised as φ(x,y,t+1) − φ(x,y,t); links the spatial and temporal derivatives, but this is not the same type of motion as for explicit contours!

• FA: constant "force" (balloon pressure)

• FG(κ): smoothing "force" depending on the local curvature κ = div( ∇φ / |∇φ| ) (contour influence)

• k̂I: extension of the speed function kI (image influence)

• |∇φ|: spatial derivative of φ

• the propagation speed is a product of these influences

Equation for Front Propagation
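A sketch of one discrete update of this equation using central differences (np.gradient). The time step dt, the balloon force FA and the common smoothing choice FG(κ) = −εκ are assumptions, and a careful implementation would use an upwind scheme rather than plain central differences:

```python
import numpy as np

def curvature(phi, eps=1e-8):
    """kappa = div( grad(phi) / |grad(phi)| ), with central differences."""
    gy, gx = np.gradient(phi)                     # axis 0 = y, axis 1 = x
    norm = np.sqrt(gx**2 + gy**2) + eps
    dny_dy, _ = np.gradient(gy / norm)
    _, dnx_dx = np.gradient(gx / norm)
    return dnx_dx + dny_dy

def evolve_step(phi, k_hat, F_A=1.0, epsilon=0.5, dt=0.1):
    """delta_phi = -dt * k_hat * (F_A + F_G(kappa)) * |grad(phi)|, with F_G(kappa) = -epsilon * kappa."""
    gy, gx = np.gradient(phi)
    grad_mag = np.sqrt(gx**2 + gy**2)
    speed = k_hat * (F_A - epsilon * curvature(phi))   # product of influences
    return -dt * speed * grad_mag
```

A closure such as `lambda phi: evolve_step(phi, k_hat)` could be plugged into the loop sketched on the previous slide.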

Page 8: Image Segmentation

• Speed function:

– kI is meant to stop the front on the object's boundaries

– similar to the image energy: kI(x,y) = 1 / ( 1 + ||∇I(x,y)|| )

– only makes sense for the front (level set 0)

– yet, the same equation is used for all level sets

⇒ extend kI to all level sets, defining k̂I

– possible extension: k̂I(x,y) = kI(x',y'), where (x',y') is the point of the front closest to (x,y)

( such a k̂I(x,y) depends on the front location )

Equation for Front Propagation
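A sketch of this image term, kI = 1 / (1 + ||∇I||), here with an optional Gaussian pre-smoothing whose sigma is an assumed parameter:

```python
import numpy as np
from scipy import ndimage

def speed_k_I(image, sigma=1.0):
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)  # reduce noise before differentiating
    gy, gx = np.gradient(smoothed)
    grad_mag = np.sqrt(gx**2 + gy**2)
    return 1.0 / (1.0 + grad_mag)   # ~1 in flat regions, ~0 on strong edges: the front stops at boundaries
```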

Page 9: Image Segmentation

[Figure: φ grids illustrating the three steps below.]

1. compute the speed kI on the front, and extend it to all other level sets (k̂I)

2. compute φ(x,y,t+1) = φ(x,y,t) + ∆φ(x,y,t)

3. find the front location (for the next iteration) and modify φ(x,y,t+1) by linear interpolation

[Figure: the φ(x,y,t+1) grid after the update.]

Algorithm
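A sketch of the extension in step 1, copying to every grid point the kI value of the nearest front point. Treating pixels with |φ| below a small tolerance as "the front" is an assumption about the discretisation:

```python
import numpy as np
from scipy import ndimage

def extend_speed(k_I, phi, front_tol=0.5):
    front = np.abs(phi) <= front_tol      # pixels lying (approximately) on the zero level set
    # For every pixel, indices of the nearest front pixel (the zeros of ~front).
    _, inds = ndimage.distance_transform_edt(~front, return_indices=True)
    rows, cols = inds
    return k_I[rows, cols]                # k_hat_I: same value as at the closest front point
```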

Page 10: Image Segmentation

• Weaknesses of algorithm 1:

– update of all φ(x,y,t): inefficient, we only care about the front

– speed extension: computationally expensive

• Improvement:

– narrow band: only update a few level sets around Γ

– other extended speed: k̂I(x,y) = 1 / ( 1 + ||∇I(x,y)|| )

[Figure: narrow band of φ values (a few level sets around the front).]

Narrow band extension
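A sketch of the narrow-band restriction: only grid points with |φ| below a band half-width are updated, so the per-iteration cost scales with the length of the front rather than with the image size. The band half-width is an assumed parameter:

```python
import numpy as np

def narrow_band_update(phi, delta_phi, half_width=3.0):
    band = np.abs(phi) <= half_width      # a few level sets around the front
    phi_new = phi.copy()
    phi_new[band] += delta_phi[band]      # pixels outside the band keep their old values
    return phi_new
```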

Page 11: Image Segmentation

• Caution:

– extrapolate the curvature at the edges of the band

– re-select the narrow band regularly: an empty pixel cannot get a value, which may restrict the evolution of the front


Narrow band extension

Page 12: Image Segmentation

• Level sets:

– function φ : [ 0 , Iwidth ] x [ 0 , Iheight ] x N → R, ( x , y , t ) ↦ φ(x,y,t)

– embeds a curve: Γ(t) = { (x,y) , φ(x,y,t) = 0 }; Γ(0) is provided externally, φ(x,y,0) is computed from it, and φ(x,y,t+1) is computed by changing the values of φ(x,y,t)

– φ changes using a product of influences

– on convergence, Γ(t_end) is the border of the object

• Issue:

– computation time (improved with the narrow band)

Summary

Page 13: Image Segmentation
Page 14: Image Segmentation

Segmentation: a region in the image

1. with some homogeneous properties (intensity, colors, texture, …)

2. with cohesion (its parts move in a similar way; motion segmentation)

Active contours will have difficulties with natural images such as the one shown on this slide.

Image Segmentation

Similarity (intensity difference), Proximity, Continuity

Page 15: Image Segmentation

Image Segmentation

• The first step towards higher-level vision (object recognition, etc.)

• There may not be a single correct answer.

• Segmentation can be considered as a partition problem.

• Literature on this topic is tremendous.

• Many approaches:

• Cues such as color, regions, contours, texture, motion, etc.

• Automatic vs. user-assisted

Page 16: Image Segmentation

Image Segmentation Results

Page 17: Image Segmentation

1. Histogram-based segmentation

2. Region-based segmentation

• Edge detection

• Region growing, splitting and merging.

3. Clustering

• K-means

4. Graph based clustering

Main Approaches

Page 18: Image Segmentation

Simple Example (text segmentation)

Thresholding

How to choose the threshold value? (128/256, median/mean, etc.)
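A minimal thresholding sketch for this example; the threshold choices listed above (fixed 128, mean, median) are exposed as a parameter:

```python
import numpy as np

def threshold_segment(gray, how="mean"):
    if how == "mean":
        t = gray.mean()
    elif how == "median":
        t = np.median(gray)
    else:
        t = 128                      # fixed mid-range value for 8-bit images
    return gray > t                  # binary mask (which side is "text" depends on the image polarity)
```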

Page 19: Image Segmentation

• Break images into K regions.

• Reduce intensity values to K different levels.

[Figure: intensity histogram with the threshold value marked.]

Histogram-based Methods
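A sketch of the simplest K-level reduction, splitting the intensity range into K equal-width bins. Equal-width bins are an assumption; placing thresholds at histogram valleys is the usual refinement:

```python
import numpy as np

def quantize_k_levels(gray, K=4):
    edges = np.linspace(gray.min(), gray.max(), K + 1)       # K bins over the intensity range
    labels = np.digitize(gray, edges[1:-1])                  # each pixel mapped to a level 0..K-1
    return labels
```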

Page 20: Image Segmentation

Consider the image as a set of points in N-dimensional feature space:

1. Intensity values or color values [ (x, y, I) or (x, y, r, g, b) ]

2. Texture and other features

Segmentation as a clustering problem

Work directly in the feature space and cluster the points there.

Require:

1. A good definition of feature space

2. Distance between feature points should be meaningful
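A sketch of this clustering view using an (x, y, r, g, b) feature space and K-Means (scikit-learn); the spatial weight that balances proximity against colour similarity is an assumed parameter:

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(rgb, K=5, spatial_weight=0.5):
    h, w, _ = rgb.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = spatial_weight * np.column_stack([xs.ravel(), ys.ravel()])  # proximity cue (x, y)
    colours = rgb.reshape(-1, 3).astype(float)                           # similarity cue (r, g, b)
    feats = np.hstack([coords, colours])
    labels = KMeans(n_clusters=K, n_init=10).fit_predict(feats)
    return labels.reshape(h, w)                                          # one region label per pixel
```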

Page 21: Image Segmentation
Page 22: Image Segmentation

What is the “correct” grouping?

Page 23: Image Segmentation

Segmentation as a clustering problem

Page 24: Image Segmentation

Two Sub-problems

Page 25: Image Segmentation

K-means clustering

Page 26: Image Segmentation

Segmentation Result

Page 27: Image Segmentation

• Set of points of the feature space represented as a weighted, undirected graph, G = (V, E)

• The points of the feature space are the nodes of the graph.

• Edge between every pair of nodes.

• Weight on each edge, w(i, j), is a function of the similarity between the nodes i and j.

• Partition the set of vertices into disjoint sets where similarity within the sets is high and across the sets is low.

Graph Partitioning Model

Page 28: Image Segmentation

• Weight measure (reflects likelihood of two pixels belonging to the same object)

Weight function

Note: the function is based on local similarity
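A sketch of such a weight matrix for a small grayscale image, combining feature similarity and spatial proximity and keeping only local edges. The parameters sigma_f, sigma_x and the radius are assumptions, and the dense N×N matrix limits this to small images:

```python
import numpy as np

def weight_matrix(gray, sigma_f=0.1, sigma_x=4.0, radius=5.0):
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    X = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)   # pixel positions
    F = gray.ravel().astype(float)[:, None]                       # pixel features (here: intensity)
    d_x = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise spatial distances
    d_f = np.linalg.norm(F[:, None, :] - F[None, :, :], axis=-1)  # pairwise feature distances
    W = np.exp(-d_f**2 / sigma_f**2) * np.exp(-d_x**2 / sigma_x**2)
    W[d_x > radius] = 0.0                                         # local similarity only (see note above)
    return W                                                      # N x N, with N = h * w
```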

Page 29: Image Segmentation

Images as Graphs

Page 30: Image Segmentation
Page 31: Image Segmentation

• A graph can be partitioned into two disjoint sets by simply removing the edges connecting the two parts

• The degree of dissimilarity between these two pieces can be computed as total weight of the edges that have been removed

• More formally, it is called the ‘cut’
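In symbols (the standard definition, matching the bullets above): cut(A, B) = Σ_{u∈A, v∈B} w(u, v).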

Global Criterion Selection

Page 32: Image Segmentation
Page 33: Image Segmentation

• Minimize the cut value

• The number of such partitions is exponential (2^N), but the minimum cut can be found efficiently

Reference: Z. Wu and R. Leahy, “An Optimal Graph Theoretic Approach to Data Clustering: Theory and Its Application to Image Segmentation”. IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 11, pp. 1101-1113, Nov. 1993.

Subject to the constraints:

Optimization Problem

Page 34: Image Segmentation
Page 35: Image Segmentation

• Picking an appropriate criterion to minimize which would result in a “good” segmentation

• Finding an efficient way to achieve the minimization

Challenges

Page 36: Image Segmentation

• Set of points in the feature space with similarity (relation) defined for pairs of points.

• Problem: Partition the feature points into disjoint sets where similarity within the sets is high and across the sets is low.

• Construct a complete graph whose nodes are the feature points.

• Edge between every pair of nodes.

• Weight on each edge, w(i, j), is a function of the similarity between the nodes i and j.

• A cut (of V) gives the partition.

Graph Partitioning Model

Page 37: Image Segmentation

• Minimize the cut value

• The number of such partitions is exponential (2^N), but the minimum cut can be found efficiently

Reference: Z. Wu and R. Leahy, “An Optimal Graph Theoretic Approach to Data Clustering: Theory and Its Application to Image Segmentation”. IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 15, no. 11, pp. 1101-1113, Nov. 1993.

Subject to the constraints:

Optimization Problem

Page 38: Image Segmentation

Weights defined as Wij = exp( −|si − sj|² / (2σ²) )

Page 39: Image Segmentation

• We must avoid unnatural bias for partitioning out small sets of points

• Normalized Cut - computes the cut cost as a fraction of the total edge connections to all the nodes in the graph

where
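Ncut(A, B) = cut(A, B) / assoc(A, V) + cut(A, B) / assoc(B, V), with assoc(A, V) = Σ_{u∈A, t∈V} w(u, t) the total connection from nodes in A to all nodes in the graph (definitions from the cited Shi-Malik paper).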

Normalized Cut

J. Shi and J. Malik, “Normalized Cuts and Image Segmentation,”

IEEE Trans. Pattern Analysis and Machine Intelligence,

vol. 22, no. 8, pp. 888-905, Aug. 2000.

Page 40: Image Segmentation

• Our criterion can also aim to tighten similarity within the groups

• Minimizing Ncut and maximizing Nassoc are equivalent (Ncut = 2 − Nassoc)
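Following the same paper, Nassoc(A, B) = assoc(A, A) / assoc(A, V) + assoc(B, B) / assoc(B, V); since cut(A, B) = assoc(A, V) − assoc(A, A) (and likewise for B), substituting gives the stated identity Ncut = 2 − Nassoc.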

Normalized Cut

Page 41: Image Segmentation

[Figure: two example clusterings with weights Wij = exp( −|si − sj|² / (2σ²) ); a larger Nassoc(A, B) value reflects a tighter cluster, a smaller Nassoc(A, B) value a looser one.]

Page 42: Image Segmentation

Let e be an indicator vector (of dimension N):

ei = 1 if i belongs to A, and 0 otherwise

• Assoc(A, A) = eᵀWe

• Assoc(A, V) = eᵀDe

• Cut(A, V−A) = eᵀ(D − W)e

Matrix Formulation

Find two indicator vectors e1, e2 (with e1ᵀe2 = 0) such that

is minimized.
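Using the identities above, the minimised quantity is presumably the normalized cut written in matrix form:

Ncut = e1ᵀ(D − W)e1 / (e1ᵀ D e1) + e2ᵀ(D − W)e2 / (e2ᵀ D e2)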

Page 43: Image Segmentation

• Exact solution to minimizing normalized cut is an NP-complete problem

• However, approximate discrete solutions can be found efficiently

• Normalized cut criterion can be computed efficiently by solving a generalized eigenvalue problem

Computational Issues

Page 44: Image Segmentation

Relaxation

Page 45: Image Segmentation

1. Construct the weighted graph representing the image. Summarize the information into matrices, W and D. Edge weight is an exponential function of feature similarity as well as distance measure.

2. Solve for the eigenvector with the second smallest eigenvalue of:

(D − W) x = λ D x   (i.e. L x = λ D x, with L = D − W)

Algorithm (for image segmentation)

Page 46: Image Segmentation

3. Partition the graph into two pieces using the second smallest eigenvector. Signs tell us exactly how to partition the graph.

4. Recursively run the algorithm on the two partitioned parts. Recursion stops once the Ncut value exceeds a certain limit. This maximum allowed Ncut value controls the number of groups segmented.

Algorithm (cont.)
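A sketch of steps 2-3 for a small, dense W (a real implementation would use sparse matrices and the paper's splitting-point search; here the sign of the second eigenvector is used directly, as stated above):

```python
import numpy as np
from scipy.linalg import eigh

def two_way_ncut(W):
    d = W.sum(axis=1)                    # node degrees (assumed all positive)
    D = np.diag(d)
    eigvals, eigvecs = eigh(D - W, D)    # generalized problem (D - W) x = lambda D x, ascending order
    second = eigvecs[:, 1]               # eigenvector with the second smallest eigenvalue
    return second >= 0                   # its sign gives the two-way partition A vs. V - A
```

Combined with the weight_matrix sketch from the earlier slide, `two_way_ncut(weight_matrix(gray)).reshape(gray.shape)` would give a two-region labelling of a small image.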

Page 47: Image Segmentation
Page 48: Image Segmentation

Example

Page 49: Image Segmentation
Page 50: Image Segmentation

Results

Page 51: Image Segmentation

Results

Page 52: Image Segmentation

K-way Normalized Cut

One can recursively apply the normalized cut to obtain the desired number of subgraphs. Better still, we can do the following.

Let e1, e2, …, eK denote the indicator vectors of a K-partition of the graph.

We want to maximize this cost function.

The trick is to rewrite this optimization problem
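Given the proposition on the next slide, the cost function in question is presumably the K-way normalized association,

CW(e1, …, eK) = Σ_{k=1..K} ekᵀ W ek / (ekᵀ D ek) = Σ_k assoc(Ak, Ak) / assoc(Ak, V).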

Page 53: Image Segmentation

K-way Normalized Cut

Proposition (Bach-Jordan 2003): CW(e1, …, eK) equals tr( Yᵀ D^(-1/2) W D^(-1/2) Y ) for any N-by-K matrix Y (N is the number of data points) such that:

1. the columns of D^(-1/2) Y are piecewise constant with respect to the clusters;

2. Y has orthonormal columns (Yᵀ Y = I).

Why this? Because it reduces (and relaxes) the problem to a very solvable form that appears frequently:

Maximize tr( Yᵀ M Y ) subject to Yᵀ Y = I (with M symmetric!)

Answer: take the columns of Y to be the K largest eigenvectors of M.

Page 54: Image Segmentation

Spectral Clustering (Ng, Jordan & Weiss, 2001)

Use a spectral method (i.e., the eigenstructure of a symmetric matrix) to compute clusters.

Given a set of points s1, s2, …, sn:

1. Form the affinity matrix A (the W from before): Aij = exp( −|si − sj|² / (2σ²) ) if i ≠ j, and Aii = 0.

2. Let D be the diagonal matrix whose (i, i) element is the sum of A's i-th row, and let L = D^(-1/2) A D^(-1/2).

3. Find x1, x2, …, xk, the k largest eigenvectors of L, and form X = [ x1 x2 … xk ].

4. Renormalize each of X's rows to have unit length, giving Y.

5. Treating each row of Y as a point in R^k, cluster the rows into k clusters via K-means or another algorithm.

6. Point si is assigned to cluster j if and only if row i of Y was assigned to cluster j.
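A sketch of this procedure for points stored as rows of an array S; the value of sigma and the use of scikit-learn's K-Means in step 5 are implementation choices, not part of the listed algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans

def spectral_clustering(S, k, sigma=1.0):
    # 1. affinity matrix A with zero diagonal
    d2 = np.sum((S[:, None, :] - S[None, :, :])**2, axis=-1)    # pairwise squared distances
    A = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(A, 0.0)
    # 2. L = D^(-1/2) A D^(-1/2), with D the diagonal matrix of row sums (assumed positive)
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # 3. X = the k largest eigenvectors of L (eigh returns eigenvalues in ascending order)
    _, vecs = np.linalg.eigh(L)
    X = vecs[:, -k:]
    # 4. renormalize the rows of X to unit length, giving Y
    Y = X / np.linalg.norm(X, axis=1, keepdims=True)
    # 5.-6. cluster the rows of Y; point s_i gets the cluster assigned to row i
    return KMeans(n_clusters=k, n_init=10).fit_predict(Y)
```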

Page 55: Image Segmentation

Spectral Clustering

In the ideal case when there is no inter-cluster affinity, the matrix A is block-diagonal (after some row, column permutation).

Computed Affinity matrix

Page 56: Image Segmentation

Spectral Clustering

In this ideal case, each block has 1 as its largest eigenvalue, so the top K eigenvalues of L are all 1 and the top K eigenvectors can be chosen one per block: X = [ X1 X2 X3 … XK ].

X's rows (after normalization) give a 'projection' of the data points onto the unit sphere in R^K.

Page 57: Image Segmentation

Spectral Clustering

Page 58: Image Segmentation

[Figure: clustering results compared, K-Means vs. the spectral technique.]