Colorization by Example

Image Recoloring
Ron Yanovich & Guy Peled

Contents
- Grayscale coloring background
- Luminance / luminance channel
- Segmentation
- Discrete Cosine Transform
- K-nearest-neighbor (kNN)
- Linear Discriminant Analysis (LDA)
- Colorization using optimization
- Colorization by Example: (i) Training, (ii) Classification, (iii) Color transfer, (iv) Optimization

Grayscale coloring background
Colorization definition: the process of adding color to a monochrome image.

Grayscale coloring background
Colorization is a term introduced by Wilson Markle in 1970 to describe the computer-assisted process he invented for adding color to black-and-white movies or TV programs.

Grayscale coloring background
Black Magic (PC tool): motion video and film colorization.

Color transfer between images (Reinhard et al.): transfers the color palette from one color image to another.

Transferring color to grayscale images (Welsh et al.): colorizes an image by matching small pixel neighborhoods in the image to those in a reference image.

Unsupervised colorization of black-and-white cartoons (Sykora et al.): colorization of segmented black-and-white cartoons using patch-based sampling and probabilistic reasoning.

(Example images: Reinhard et al.; Black Magic tool)

(Example images: Welsh et al.; Sykora et al.)

Luminance / Luminance channel
Luminance: the amount of light that passes through or is emitted from a particular area.

Luminance channel (YUV)
Y: a full-resolution plane that represents the luminance information only.
U, V: full-resolution (or lower) planes that represent the chroma (color) information only.
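The Y/U/V planes above can be computed directly from RGB. A minimal numpy sketch, assuming the BT.601 luma weights (the slides do not say which YUV variant is used, so the constants are illustrative):

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert an H x W x 3 RGB image (floats in [0, 1]) to YUV.

    BT.601-style weights are assumed here for illustration.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance plane
    u = 0.492 * (b - y)                     # blue-difference chroma
    v = 0.877 * (r - y)                     # red-difference chroma
    return np.stack([y, u, v], axis=-1)
```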

Luminance / Luminance channel (example images)

Segmentation
The process of partitioning a digital image into multiple segments (sets of pixels, also known as superpixels).

Segmentation
Makes the image more meaningful and easier to analyze: locating objects and boundaries, and assigning a label to every pixel in the image.

Segmentation
Superpixel: a polygonal part of a digital image, larger than a normal pixel, that is rendered in the same color and brightness.

Segmentation
One possible implementation is mean-shift segmentation.

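As a rough illustration of the mean-shift idea, here is a sketch using scikit-learn's MeanShift on (row, column, intensity) features; this is an assumption for illustration, not the segmentation code used by the authors:

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def mean_shift_segments(gray):
    """Rough mean-shift segmentation of a grayscale image in [0, 1].

    Each pixel is described by (row, col, scaled intensity); pixels that
    converge to the same mode get the same segment label.
    """
    h, w = gray.shape
    rows, cols = np.mgrid[0:h, 0:w]
    feats = np.stack([rows.ravel(), cols.ravel(), gray.ravel() * 255.0], axis=1)
    bandwidth = estimate_bandwidth(feats, quantile=0.1, n_samples=500)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(feats)
    return labels.reshape(h, w)   # one segment id per pixel
```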


Discrete Cosine Transform
Expresses a finite sequence of data points in terms of a sum of cosine functions oscillating at different frequencies.
The DCT is a Fourier-related transform similar to the discrete Fourier transform (DFT), but uses only real numbers.


Discrete Cosine Transform
Can be used for compression.
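A minimal sketch of the 2-D DCT feature used later in the deck (the DCT coefficients of a k x k grayscale patch), here via SciPy's dctn; the orthonormal normalization is an assumption:

```python
import numpy as np
from scipy.fft import dctn

def dct_feature(patch):
    """Return the 2-D DCT coefficients of a k x k grayscale patch
    as a flat feature vector of k^2 values (type-II DCT, 'ortho' norm)."""
    coeffs = dctn(patch.astype(float), norm='ortho')
    return coeffs.ravel()
```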


K-nearest-neighbor (kNN)
In pattern recognition, the k-nearest-neighbor algorithm (k-NN) is a non-parametric method for classifying objects based on the closest training examples in the feature space.
All instances are points in n-dimensional space, and closeness between points is determined by some distance measure.

Classification is made by a majority vote among the neighbors.

K-nearest-neighbor: 2D example
Given n labeled points:

(Figure: points of class a and points of class b, plus a new point to classify; its classification can differ for k = 2 and k = 5, depending on which labels dominate its neighborhood.)
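A minimal numpy sketch of the majority-vote rule described above; the point coordinates and labels are made up for illustration:

```python
import numpy as np
from collections import Counter

def knn_classify(train_pts, train_labels, query, k):
    """Classify `query` by a majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_pts - query, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D example (made-up coordinates): two classes 'a' and 'b'.
pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
labels = np.array(['a', 'a', 'a', 'b', 'b'])
print(knn_classify(pts, labels, np.array([0.8, 0.8]), k=3))  # -> 'b'
```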

Linear discriminant analysis (LDA)
In the field of machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to.


Linear discriminant analysis (LDA)
A linear classifier achieves this by making a classification decision based on the value of a linear combination of the characteristics.
An object's characteristics are also known as feature values and are typically presented to the machine in a vector called a feature vector.


Linear discriminant analysis (LDA)
There are two broad classes of methods for determining the parameters of a linear classifier:

Generative models (conditional density functions): LDA (or Fisher's linear discriminant)

Discriminative models: support vector machine (SVM)
Discriminative training often yields higher accuracy than modeling the conditional density functions. However, handling missing data is often easier with conditional density models.

Linear discriminant analysis (LDA)
LDA (Fisher's linear discriminant) is a generative method: it assumes Gaussian conditional density models. Discriminative models, such as the support vector machine, instead attempt to maximize the quality of the output on a training set; the SVM, for example, maximizes the margin between the decision hyperplane and the examples in the training set.

Linear discriminant analysis (LDA)
LDA seeks to reduce dimensionality while preserving as much of the class-discriminatory information as possible.

LDA finds a linear subspace that maximizes class separability among the feature-vector projections in this space.


LDA two classes
Given a set of D-dimensional samples:

The samples are divided into two groups: N1 samples belong to class w1 and N2 samples belong to class w2.

We seek to obtain a scalar y by projecting the samples x onto a line, y = wᵀx (source: http://research.cs.tamu.edu).
Of all possible lines, we would like to select the one that maximizes the separability of the projected scalars.

Try to separate the two classes by projecting them onto different lines:
(Figure: one projection yields an unsuccessful separation; another yields a successful separation.)
This reduces the problem dimensionality from two features (x1, x2) to a single scalar value y.

LDA two classes
In order to find a good projection vector, we need to define a measure of separation.

One candidate measure is the distance between the projected mean vectors.

(Figure: one axis has a larger distance between the projected means, while the other axis yields better class separability.)

LDA two classes - Fisher's solution
Fisher suggested maximizing the difference between the means, normalized by a measure of the within-class scatter.
For each class we define the scatter, an equivalent of the variance.
The Fisher linear discriminant is defined as the linear function wᵀx that maximizes the criterion function
J(w) = |μ̃1 − μ̃2|² / (s̃1² + s̃2²),
where μ̃i and s̃i² are the mean and scatter of the projected samples of class i.

The within-class scatter is the sum of the per-class scatters.
LDA two classes - Fisher's solution
Therefore, we are looking for a projection where samples from the same class are projected very close to each other and, at the same time, the projected means are as far apart as possible.

(Figure: the projection direction w and the separating hyperplane for two sample classes X1 and X2.)

Two Classes - Example

μ1 and μ2 are the mean vectors of each class.
S1 and S2 are the covariance matrices of X1 and X2 (the scatter).

Two Classes - Example
Sb is the between-class scatter matrix; Sw is the within-class scatter matrix.
The projection direction is found from the eigenvalues and eigenvectors of Sw⁻¹Sb.
LDA projection found by Fisher's linear discriminant: the eigenvector with the highest eigenvalue provides the highest discrimination power between the classes.

LDA Limitation
LDA is a parametric method, since it assumes Gaussian conditional density models. Therefore, if the sample distributions are non-Gaussian, LDA will have difficulty classifying complex structures.
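A minimal numpy sketch of the two-class Fisher discriminant summarized above, using the closed-form solution w ∝ Sw⁻¹(μ1 − μ2), which is equivalent to taking the leading eigenvector of Sw⁻¹Sb; variable names are illustrative:

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher linear discriminant.

    X1, X2: (n_i, d) arrays of samples from class 1 and class 2.
    Returns the unit projection vector w that maximizes
    J(w) = (w^T Sb w) / (w^T Sw w).
    """
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices.
    S1 = (X1 - mu1).T @ (X1 - mu1)
    S2 = (X2 - mu2).T @ (X2 - mu2)
    Sw = S1 + S2
    # Closed-form two-class solution: w ~ Sw^{-1} (mu1 - mu2).
    w = np.linalg.solve(Sw, mu1 - mu2)
    return w / np.linalg.norm(w)

# Projected scalar for a sample x: y = w @ x
```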


Colorization using optimization (Levin et al.)
The user scribbles desired colors inside regions; the colors are then propagated to all pixels.
The method works in YUV space. Key assumption: neighboring pixels with similar intensities should have similar colors.

Colorization using optimization (Levin et al.)
Input: Y(x, y, t), the intensity volume.
Output: U(x, y, t) and V(x, y, t), the color volumes.
The method minimizes J(U) = Σ_r ( U(r) − Σ_{s in N(r)} w_rs U(s) )², where w_rs is a weighting function that sums to one and is large when Y(r) is similar to Y(s), e.g. w_rs ∝ exp( −(Y(r) − Y(s))² / (2 σ_r²) ); μ_r and σ_r are the mean and variance of the intensities in a window around the pixel.
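A sketch of one common form of the weights w_rs (a Gaussian of the intensity difference, normalized over the window); Levin et al. also describe a normalized-correlation variant, so treat the details here as an assumption:

```python
import numpy as np

def levin_weights(Y, r, neighbors, eps=1e-6):
    """Affinity weights w_rs for the center pixel r and its neighbors s.

    Y: 2-D luminance image, r: (row, col), neighbors: list of (row, col).
    """
    yr = Y[r]
    ys = np.array([Y[s] for s in neighbors])
    window = np.append(ys, yr)
    sigma2 = window.var() + eps                       # variance in the window around r
    w = np.exp(-((yr - ys) ** 2) / (2.0 * sigma2))    # Gaussian of intensity difference
    return w / w.sum()                                # weights sum to one
```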

(Colorization using optimization: example results)

Colorization by Example
The main disadvantage of Levin et al.'s process is the need to manually add colored scribbles.

If we could place colored scribbles automatically, we could let Levin et al.'s process do the rest.

Given a reference color image and a grayscale one, the new process should output a colored image. (R. Irony, D. Cohen-Or and D. Lischinski)

Colorization by Example

(Pipeline figure: input grayscale image and input reference colored image → automatically created scribbled image → output colored image)

Training stage
Segment the reference image.

Training stage
Use the reference image in the luminance space (the Y dimension).

Training stage
Randomly extract k x k pixel patches, each surrounding a single pixel, and match each patch to its given label.

Training stage
Extract the DCT of each k x k patch and add it to the training set; this is the patch's feature vector.

Colorization by Example
Colorization by example has four stages.
Training: the luminance channel of the reference image, along with the accompanying partial segmentation, is provided as a training set. Construct a feature space and a corresponding classifier; to classify pixels, the classifier must be able to distinguish between different classes mainly based on texture.

Classification stage
Extract the K x K pixels surrounding a single pixel of the grayscale image, apply the DCT transform to the patch to obtain its feature vector, and feed the vector to the classifier. The classifier examines the K nearest neighbors of the feature vector and chooses the label by a majority vote.

(Classification stage: example figure)
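A compact sketch tying the training and classification steps together: sample k x k patches from the reference luminance image, use their DCT coefficients as feature vectors, and fit a kNN classifier. Function names, the random sampling scheme, and the parameter values are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.fft import dctn
from sklearn.neighbors import KNeighborsClassifier

def patch_feature(img, row, col, k):
    """DCT coefficients of the k x k patch centered at (row, col)."""
    h = k // 2
    patch = img[row - h:row - h + k, col - h:col - h + k]
    return dctn(patch.astype(float), norm='ortho').ravel()

def train_classifier(ref_luma, ref_labels, k=7, n_samples=2000, seed=0):
    """Build a DCT-feature training set from the reference luminance image
    and its partial segmentation, then fit a kNN classifier."""
    rng = np.random.default_rng(seed)
    h = k // 2
    feats, labels = [], []
    for _ in range(n_samples):
        r = rng.integers(h, ref_luma.shape[0] - h)
        c = rng.integers(h, ref_luma.shape[1] - h)
        feats.append(patch_feature(ref_luma, r, c, k))
        labels.append(ref_labels[r, c])
    return KNeighborsClassifier(n_neighbors=5).fit(np.array(feats), np.array(labels))
```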

Classification stage
Sometimes kNN alone is not enough. For better results, find a discriminating subspace with LDA and apply kNN in that subspace.

Classification stage
The result of this process is a transformation T, which maps the vector of k² DCT coefficients to a point in the low-dimensional subspace. Let f(p) denote a pixel's feature vector (its DCT coefficients); the distance between pixels p and q is then defined as
D(p, q) = || T f(p) − T f(q) ||.
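A sketch of the discriminating-subspace distance, using scikit-learn's LinearDiscriminantAnalysis as the transformation T; this is a stand-in for the LDA projection described above, not the authors' implementation:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fit_subspace(train_feats, train_labels):
    """Learn a projection T from the labeled DCT training set via LDA."""
    lda = LinearDiscriminantAnalysis()
    lda.fit(train_feats, train_labels)
    return lda

def feature_distance(lda, f_p, f_q):
    """D(p, q) = || T f(p) - T f(q) || in the discriminating subspace."""
    tp = lda.transform(f_p.reshape(1, -1))
    tq = lda.transform(f_q.reshape(1, -1))
    return float(np.linalg.norm(tp - tq))
```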

(Figure: grayscale image; naive nearest-neighbor labeling vs. voting in feature space)

Classification stage
Replace the label of p with the dominant label in its K x K image-space neighborhood. The dominant label is the label with the highest confidence conf(p, l).

conf(p, l) = Σ_{q in N(p, l)} W_q / Σ_{r in N(p)} W_r, where:
p: the middle pixel
N(p): the pixels in p's K x K image-space neighborhood
N(p, l): the pixels in N(p) that carry label l
W_x (for a pixel x): a weight that depends on the distance between pixel x and its best match
M_x (for a pixel x): the nearest neighbor of x in the feature space that has the same label as x

Note: this improvement is done in image space.
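A sketch of conf(p, l) following the definitions above: the weighted fraction of p's image-space neighbors that carry label l. Border handling and array layout are assumptions:

```python
import numpy as np

def confidence(p, label, labels, weights, k=5):
    """conf(p, l) for pixel p = (row, col).

    labels: 2-D array of per-pixel labels; weights: 2-D array of the
    per-pixel match-quality weights W_q; k: neighborhood size.
    """
    h = k // 2
    r0, c0 = p
    rs = slice(max(r0 - h, 0), r0 + h + 1)
    cs = slice(max(c0 - h, 0), c0 + h + 1)
    nbr_labels = labels[rs, cs].ravel()
    nbr_weights = weights[rs, cs].ravel()
    total = nbr_weights.sum()
    if total == 0:
        return 0.0
    return float(nbr_weights[nbr_labels == label].sum() / total)
```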

(Classification stage: example figure)

Colorization by Example - Classification
Attempt to robustly determine, for each grayscale image pixel, which region should be used as a color reference for that pixel, using the pixel's nearest neighbors in feature space for classification.

Color transfer stage
Getting the color for each grayscale pixel.

C(p): the chrominance coordinates of a pixel p.
M_q(p): the pixel in the colored reference image whose position with respect to M_q is the same as the position of p with respect to q.

Color transfer stage
Each neighbor q of p has a matching neighborhood in the reference image (around M_q), which predicts a different color for p. The color of p is a weighted average of these predictions.
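A sketch of the color-transfer step as described above: each image-space neighbor q of p predicts the color C(M_q(p)), and the predictions are averaged with weights W_q. The data-structure choices (dicts keyed by pixel position) are assumptions:

```python
import numpy as np

def transfer_color(p, neighbors, best_match, ref_chroma, weights):
    """Weighted average of the colors predicted by p's neighbors.

    p: (row, col) in the grayscale image.
    neighbors: image-space neighbors q of p (including p itself).
    best_match[q]: position M_q of q's best match in the reference image.
    ref_chroma: H x W x 2 array of reference (U, V) values.
    weights[q]: match-quality weight W_q.
    """
    num = np.zeros(2)
    den = 0.0
    for q in neighbors:
        mq = best_match[q]
        offset = (p[0] - q[0], p[1] - q[1])
        ref_pos = (mq[0] + offset[0], mq[1] + offset[1])   # M_q(p)
        num += weights[q] * ref_chroma[ref_pos]
        den += weights[q]
    return num / den if den > 0 else np.zeros(2)
```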

Colorization by Example - Color transfer
The matches found for each pixel and its image-space neighbors also determine the color that should be assigned to each pixel, along with a measure of confidence in that choice.

Optimization stage
Transferring color in this manner produces a colorized result.

Since some areas might still be misclassified, the colorization will be wrong in those areas.

Optimization stage
To improve the colorization, color is transferred only to pixels whose confidence in their label is sufficiently large, conf(p, l) > 0.5. Those pixels are considered micro-scribbles.
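A minimal sketch of the micro-scribble selection: keep the transferred chrominance only where conf(p, l) > 0.5. The array layout is an assumption:

```python
import numpy as np

def micro_scribbles(conf, transferred_uv, threshold=0.5):
    """Keep transferred colors only where the label confidence exceeds the
    threshold; these pixels become the micro-scribbles fed to the optimizer.

    conf: H x W confidence map; transferred_uv: H x W x 2 chrominance map.
    """
    mask = conf > threshold                     # confident pixels only
    scribbles = np.zeros_like(transferred_uv)
    scribbles[mask] = transferred_uv[mask]
    return mask, scribbles
```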

Colorization by Example - Optimization
Pixels with a high level of confidence are given as micro-scribbles to the optimization-based colorization algorithm of Levin et al.

Results

Let's review
- Grayscale coloring background
- Luminance / luminance channel
- Segmentation
- Discrete Cosine Transform
- K-nearest-neighbor (kNN)
- Linear Discriminant Analysis (LDA)
- Colorization using optimization (Levin et al.)
- Colorization by Example: (i) training, (ii) classification, (iii) color transfer, (iv) optimization

References
- http://www.realtypin.com/news/story/1066-how-to-choose-the-perfect-paint-color
- http://www.gimp.org/tutorials/Selective_Color/
- http://www.inf.ufrgs.br/~eslgastal/DomainTransform/Colorization/index.html
- http://www.kyivpost.com/guide/people/ukrainian-puts-color-into-soviet-movies-320761.html?flavour=mobile
- http://www.blackmagic-color.com/?xcmpx=2557
- http://maginmp.blogspot.co.il/2012/11/rec601709-and-luminance-range-explained.html
- http://www.pixiq.com/article/luminance-vs-luminosity
- http://www.umiacs.umd.edu/~mishraka/activeSeg.html
- http://www.na-mic.org/Wiki/index.php/Projects:BayesianMRSegmentation
- http://ivrg.epfl.ch/research/superpixels
- http://en.wiktionary.org/wiki/superpixel
- http://en.wikipedia.org/wiki/YUV
- http://www.slidefinder.net/n/nearest_neighbor_locality_sensitive_hashing/nearestniegborlecture/15562362
- http://webee.technion.ac.il/~lihi/Teaching/048983/Colorization.pdf