Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 2
Slide 3
Grayscale coloring background Colorization definition: The
process of adding color to a monochrome image. 3
Slide 4
Grayscale coloring background Colorization is a term introduced
by Wilson Markle in 1970 to describe the computer-assisted process
he invented for adding color to black and white movies or TV
programs. 4
Slide 5
Grayscale coloring background Black Magic (PC tool) Motion
video and film colorization Color transfer between images (Reinhard
et al.) Transfers the color palette from one color image to
another Transferring color to greyscale images (Welsh et al.)
Colorizes an image by matching small pixel neighborhoods in the
image to those in a reference image Unsupervised colorization of
black-and-white cartoons (Sykora et al.) Colorization of segmented
black-and-white cartoons using patch-based sampling and
probabilistic reasoning. 5
Slide 6
6 Reinhard et al. Black magic (tool)
Slide 7
7 Welsh et al. Sykora et al.
Slide 8
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 8
Slide 9
Luminance / Luminance channel Luminance: the amount of light
that passes through or is emitted from a particular area Luminance
channel: Y - a full-resolution plane that represents the
luminance information only U, V - full-resolution, or lower, planes
that represent the chroma (color) information only 9
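As a concrete illustration of the Y/U/V split, here is a minimal sketch of a per-pixel RGB-to-YUV conversion. It assumes the standard BT.601 coefficients; the slides do not specify which YUV variant is used.

```python
def rgb_to_yuv(r, g, b):
    # BT.601 luma: a weighted average of R, G, B reflecting the
    # eye's differing sensitivity to each primary
    y = 0.299 * r + 0.587 * g + 0.114 * b
    # Chroma planes are scaled differences from luma; they are
    # zero for any neutral gray (r == g == b)
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v
```

Grayscale colorization works in this space: Y is taken directly from the input image, and only the U and V planes need to be estimated.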
Slide 10
Luminance / Luminance channel 10
Slide 11
11 Luminance / Luminance channel
Slide 12
12 Luminance / Luminance channel
Slide 13
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 13
Slide 14
Segmentation The process of partitioning a digital image into
multiple segments (sets of pixels, also known as superpixels)
14
Slide 15
Segmentation Makes the image more meaningful and easier to
analyze Locates objects and boundaries Assigns a label to every
pixel in an image 15
Slide 16
Segmentation Superpixel - A polygonal part of a digital image,
larger than a normal pixel, that is rendered in the same color and
brightness 16
Slide 17
Segmentation A possible implementation is mean-shift segmentation
17
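The slides do not detail the algorithm, but the idea can be sketched in one dimension on pixel intensities. This is a hedged illustration only; real mean-shift segmentation also uses spatial coordinates and color channels.

```python
import math

def mean_shift_1d(values, bandwidth=0.1, iters=30):
    """Shift each point toward the Gaussian-weighted mean of all
    points, repeatedly; points converge to the modes of the
    intensity density. Segments are the groups of pixels that end
    up at the same mode."""
    modes = list(values)
    for _ in range(iters):
        updated = []
        for m in modes:
            weights = [math.exp(-((m - v) ** 2) / (2 * bandwidth ** 2))
                       for v in values]
            updated.append(sum(w * v for w, v in zip(weights, values))
                           / sum(weights))
        modes = updated
    return modes
```

With two well-separated intensity clusters, each pixel's mode identifies its segment.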
Slide 18
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 18
Slide 19
Discrete Cosine Transform Expresses a finite sequence of data
points in terms of a sum of cosine functions oscillating at
different frequencies The DCT is a Fourier-related transform
similar to the discrete Fourier transform (DFT), but using only
real numbers
19
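A naive sketch of the 1-D DCT-II (one common DCT variant, assumed here for illustration) makes the "sum of cosines, real numbers only" point concrete:

```python
import math

def dct2(signal):
    """Naive DCT-II: coefficient k measures how much of the signal
    is explained by a cosine at frequency k. O(N^2), for
    illustration only; real codecs use fast transforms."""
    n = len(signal)
    return [
        sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
            for i, x in enumerate(signal))
        for k in range(n)
    ]
```

A constant signal has all its energy in the zero-frequency coefficient, which is why the DCT compacts smooth image patches into a few numbers and can be used for compression.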
Slide 20
Discrete Cosine Transform 20
Slide 21
Discrete Cosine Transform Can be used for compression 21
Slide 22
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 22
Slide 23
K-nearest neighbor (k-NN) In pattern recognition, the k-nearest
neighbor algorithm (k-NN) is a non-parametric method for
classifying objects based on the closest training examples in the
feature space. 23
Slide 24
K-nearest neighbor (k-NN) All instances are points in
n-dimensional space Closeness between points is determined by some
distance measure Classification is made by majority vote among
the k nearest neighbors 24
Slide 25
K-nearest neighbor - 2D example 25 [Figure: n given points of two
classes, a and b, plotted by location in the plane; a new point is
classified by majority vote, with the classification shown for
k=2 and for k=5]
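The voting scheme above can be sketched as follows; this is a minimal illustration, and the point coordinates and labels are made up.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (point, label) pairs. Classify query by a
    majority vote among its k nearest training points, using
    squared Euclidean distance (monotone in true distance)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    nearest = sorted(train, key=lambda t: dist(t[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
```

A query near the "a" cluster is voted "a", one near the "b" cluster "b"; as on the slide, the result can change with k.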
Slide 26
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 26
Slide 27
Linear discriminant analysis (LDA) In the field of machine
learning, the goal of statistical classification is to use an
object's characteristics to identify which class (or group) it
belongs to 27 Background
Slide 28
Linear discriminant analysis (LDA) A linear classifier achieves
this by making a classification decision based on the value of a
linear combination of the characteristics An object's
characteristics are also known as feature values and are typically
presented to the machine in a vector called a feature vector. 28
Background
Slide 29
Linear discriminant analysis (LDA) There are two broad classes
of methods for determining the parameters of a linear classifier
Generative models (conditional density functions) LDA (or Fisher's
linear discriminant) Discriminative models Support vector machine
(SVM) 29 Background
Slide 30
Linear discriminant analysis (LDA) Discriminative training
often yields higher accuracy than modeling the conditional density
functions. However, handling missing data is often easier with
conditional density models 30 Background
Slide 31
Linear discriminant analysis (LDA) LDA seeks to reduce
dimensionality while preserving as much of the class discriminatory
information as possible LDA finds a linear subspace that maximizes
class separability among the feature vector projections in this
space 31
Slide 32
LDA two classes Given a set of D-dimensional samples The
samples are divided into 2 groups: N1 belong to class w1, N2
belong to class w2 We seek to obtain a scalar y by projecting the
samples x onto a line: y = w^T x 32 http://research.cs.tamu.edu
Slide 33
LDA two classes Of all the possible lines we would like to
select the one that maximizes the separability of the scalars
33
Slide 34
Try to separate the two classes by projecting them onto different
lines: 34 Unsuccessful separation LDA two classes
Slide 35
Try to separate the two classes by projecting them onto different
lines: 35 Successful separation Reducing the problem dimensionality
from two features (x1,x2) to only a scalar value y. LDA two
classes
Slide 36
In order to find a good projection vector, we need to define a
measure of separation One possible measure: the distance between
the mean vectors 36 This axis yields better class separability
This axis has a larger distance between means
Slide 37
LDA two classes - Fisher's solution Fisher suggested maximizing
the difference between the means, normalized by a measure of the
within-class scatter For each class we define the scatter, an
equivalent of the variance The Fisher linear discriminant is
defined as the linear function that maximizes the criterion
function 37 Within-class scatter Scatter (per class)
Slide 38
LDA two classes - Fisher's solution Therefore, we are looking
for a projection where samples from the same class are projected
very close to each other and, at the same time, the projected means
are as far apart as possible 38 w hyperplane
Slide 39
2 sample classes X1, X2 39 Two Classes - Example
Slide 40
The mean vectors of each class 40 Two Classes - Example S1,
S2 are the covariance matrices of X1 & X2 (the scatter)
Slide 41
41 Two Classes - Example Sb is the between-class scatter
matrix Sw is the within-class scatter matrix
Slide 42
42 Two Classes - Example Finding the eigenvalues and
eigenvectors of Sw^-1 Sb
Slide 43
43 Two Classes - Example LDA projection found by Fisher's linear
discriminant The projection vector with the highest
eigenvalue provides the highest discrimination power between
classes
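The example's pipeline (class means, per-class scatter, within-class scatter, projection vector) can be sketched as below. For two classes, the top eigenvector of Sw^-1 Sb reduces to the closed form w = Sw^-1 (m1 - m2), which this sketch uses; the sample points are made up.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher LDA: the direction w maximizing
    between-class separation over within-class scatter."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - m1).T @ (X1 - m1)   # scatter of class 1
    S2 = (X2 - m2).T @ (X2 - m2)   # scatter of class 2
    Sw = S1 + S2                   # within-class scatter matrix
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

# Two made-up 2D classes whose best separating line is not axis-aligned
X1 = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0], [4.0, 5.0], [5.0, 5.0]])
X2 = np.array([[1.0, 0.0], [2.0, 1.0], [3.0, 1.0], [3.0, 2.0], [5.0, 3.0]])
```

Projecting both classes onto w reduces each 2D sample to a scalar y, and the two sets of scalars do not overlap, which is exactly the "successful separation" case in the slides.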
Slide 44
LDA limitation LDA is a parametric method, since it assumes
Gaussian conditional density models Therefore, if the sample
distributions are non-Gaussian, LDA will have difficulty
classifying complex structures 44
Slide 45
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 45
Slide 46
Colorization using optimization The user scribbles desired colors
inside regions Colors are propagated to all pixels Working in the
YUV space Premise: neighboring pixels with similar intensities
should have similar colors 46 Levin et al.
Slide 47
Colorization using optimization Input: Y(x, y, t) - intensity
volume Output: U(x, y, t), V(x, y, t) - color volumes
w_rs is a weighting function that sums to one; the mean and
variance of the intensities in a window around the pixel are used
to compute it 47 Levin et al.
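One reading of the weighting function is the squared-difference affinity used by Levin et al., w_rs proportional to exp(-(Y(r) - Y(s))^2 / (2 sigma_r^2)); the normalization shown here, and the epsilon guard, are assumptions for the sketch.

```python
import math

def neighbor_weights(y_r, neighbors, eps=1e-6):
    """Weights for the neighbors of pixel r: neighbors with
    intensity similar to y_r get large weight, so similar
    intensities end up with similar colors. sigma_r^2 is the
    intensity variance in the window around r (eps avoids division
    by zero in flat regions)."""
    mu = sum(neighbors) / len(neighbors)
    var = sum((y - mu) ** 2 for y in neighbors) / len(neighbors) + eps
    w = [math.exp(-((y_r - y) ** 2) / (2 * var)) for y in neighbors]
    total = sum(w)
    return [wi / total for wi in w]
```

The normalized weights sum to one, as the slide requires, so each pixel's chrominance is constrained to a convex combination of its neighbors' chrominances.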
Slide 48
48 Colorization using optimization
Slide 49
49 Colorization using optimization
Slide 50
Contents Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis (LDA) Colorization using
optimization Colorization by Example (i) Training (ii)
Classification (iii) Color transfer (iv) Optimization 50
Slide 51
Colorization by Example The main disadvantage of the Levin et al.
process is the need to add colored scribbles manually. If we could
place the colored scribbles automatically, the Levin et al. process
could do the rest. Given a reference color image and a grayscale
one, the new process should output a colored image. 51 R. Irony, D.
Cohen-Or and D. Lischinski
Slide 52
Colorization by Example 52 Input grayscale image Automatically
created scribbled image Input reference colored image Output
colored image
Slide 53
Training stage 53 Segment reference image
Slide 54
Training stage 54 Use the reference image in the luminance
space (the Y dimension)
Slide 55
Training stage 55 Randomly extract the k x k pixels surrounding a
single pixel (match them to the pixel's given label)
Slide 56
Training stage 56 Extract the DCT of each k x k patch and add
it to the training set as a feature vector
Slide 57
Colorization by Example Colorization by example has four stages
I. Training The luminance channel of the reference image, along
with the accompanying partial segmentation, is provided as a
training set Construct a feature space and a corresponding
classifier The classifier must be able to distinguish between the
different classes, mainly based on texture. 57
Slide 58
Classification stage This classifier examines the K nearest
neighbors of the feature vector and chooses the label by a majority
vote. Extract the K x K pixels surrounding a single pixel from the
grayscale image Apply the DCT transform to the K x K pixels to
obtain its feature vector Feed the vector to the classifier 58
Slide 59
Classification stage 59 Sometimes k-NN is not enough
Slide 60
Classification stage For better results - a discriminating
subspace found by LDA 60
Slide 61
Classification stage Applying k-NN in a discriminating subspace
61
Slide 62
Classification stage The result of this process is a
transformation T, which maps the vector of k^2 DCT
coefficients to a point in the low-dimensional subspace
Let f(p) be the feature vector (DCT coefficients) of pixel p
Define the distance between pixels p and q as
D(p, q) = ||T f(p) - T f(q)|| 62
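Under these definitions the distance computation is direct; T is whatever projection LDA produced, and the matrix and vectors below are illustrative stand-ins.

```python
import numpy as np

def subspace_distance(T, f_p, f_q):
    """D(p, q) = ||T f(p) - T f(q)||: distance between two pixels'
    DCT feature vectors, measured after projecting both into the
    low-dimensional discriminating subspace. Directions discarded
    by T (those that don't help separate classes) contribute
    nothing to the distance."""
    return float(np.linalg.norm(T @ f_p - T @ f_q))
```

For example, a T that keeps only the first two coordinates ignores any difference in the third, however large.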
Slide 63
Classification stage 63 Grayscale image Naive nearest neighbor
Voting in feature space
Slide 64
Classification stage Replace the label of p with the dominant
label in its k x k neighborhood 64 The dominant label is the label
with the highest confidence conf(p, l )
Slide 65
Classification stage 65 p - the middle pixel N(p) - pixels from
p's K x K neighborhood N(p, l ) - pixels from the K x K
neighborhood with label l W x (x some pixel) - a weight function,
depending on the distance between pixel x and its best match M x
M x (x some pixel) - the nearest neighbor of x in the feature
space that has the same label as x
Note: this improvement is done in image space
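These definitions suggest the following hedged sketch of the confidence computation, treating conf(p, l) as the fraction of the total neighborhood weight carried by label l; the exact normalization in the paper may differ, and the weights here stand in for the feature-space match quality W_x.

```python
def confidence(labels, weights, label):
    """conf(p, l) sketch: labels[i] and weights[i] describe the
    i-th pixel in p's K x K neighborhood. The confidence in label l
    is the weight of the neighbors voting for l, normalized by the
    total neighborhood weight."""
    total = sum(weights)
    return sum(w for lab, w in zip(labels, weights) if lab == label) / total

def dominant_label(labels, weights):
    """The label with the highest confidence in the neighborhood."""
    return max(set(labels), key=lambda l: confidence(labels, weights, l))
```

A pixel keeps its color source only when the dominant label's confidence is high; low-confidence pixels are handled later by the optimization stage.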
Slide 66
Classification stage 66
Slide 67
Classification stage 67
Slide 68
Colorization by Example II. Classification Attempt to
robustly determine, for each grayscale image pixel, which region
should be used as a color reference for that pixel Using each
pixel's nearest neighbors in the feature space for classification
68
Slide 69
Color transfer stage Getting the color for each grayscale pixel
69 C(p) - the chrominance coordinates of pixel p Mq(p) - the
pixel in the colored reference image whose position with respect
to Mq is the same as the position of p with respect to q
Slide 70
Color transfer stage Each neighbor of p has a matching
neighborhood in the reference image (Mq and Mr respectively), which
predicts a different color for p The color of p is a result of a
weighted average between these predictions 70
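That weighted average can be sketched as follows; the prediction and weight values are illustrative, standing in for the chrominances C(Mq(p)) proposed by each neighbor's match and for that match's weight.

```python
def transfer_chroma(predictions, weights):
    """C(p) as a weighted average of the chroma pairs (u, v)
    predicted for p by each neighbor's matching neighborhood in
    the reference image."""
    total = sum(weights)
    u = sum(w * uv[0] for w, uv in zip(weights, predictions)) / total
    v = sum(w * uv[1] for w, uv in zip(weights, predictions)) / total
    return (u, v)
```

Averaging the predictions, rather than trusting a single match, smooths over individual mismatches; the remaining errors are left to the optimization stage.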
Slide 71
Colorization by Example III. Color transfer The matches
found for each pixel and its image-space neighbors also determine
the color that should be assigned to each pixel, along with a
measure of confidence in that choice 71
Slide 72
Optimization stage Transferring color in this manner produces a
colorized result Since some areas might still be misclassified, the
colorization will be wrong in such areas 72
Slide 73
Optimization stage To improve the colorization, color is
transferred only to pixels whose confidence in their label is
sufficiently large, conf(p, l ) > 0.5 Those pixels are
considered micro-scribbles 73
Slide 74
Colorization by Example IV. Optimization Pixels with a high level
of confidence are given as micro-scribbles to the
optimization-based colorization algorithm of Levin et al. 74
Slide 75
Results 75
Slide 76
Let's review Grayscale coloring background Luminance / Luminance
channel Segmentation Discrete Cosine Transform K-nearest neighbor
(k-NN) Linear Discriminant Analysis Levin et al. Colorization by
Example (i) Training (ii) Classification (iii) Color transfer (iv)
Optimization 76