Image Recoloring Ron Yanovich & Guy Peled
Contents • Grayscale coloring background • Luminance / Luminance channel • Segmentation • Discrete Cosine Transform • K-nearest-neighbor (Knn) • Linear Discriminant Analysis (LDA) • Colorization using optimization • Colorization by Example • (i) Training • (ii) Classification • (iii) Color transfer • (iv) Optimization
Grayscale coloring background • Colorization definition: ‘The process of adding color to a monochrome image.’
Grayscale coloring background • Colorization is a term introduced by Wilson Markle in 1970 to describe the computer-assisted process he invented for adding color to black and white movies or TV programs.
Grayscale coloring background • Black Magic (PC tool) • Motion video and film colorization • “Color transfer between images” (Reinhard et al.) • Transferring the color palette from one color image to another • “Transferring color to greyscale images” (Welsh et al.) • Colorizes an image by matching small pixel neighborhoods in the image to those in the reference image • “Unsupervised colorization of black-and-white cartoons” (Sykora et al.) • Colorization of black-and-white cartoons (segmented), patch-based sampling and probabilistic reasoning.
Example results: Black Magic (tool); Reinhard et al.
Example results: Welsh et al.; Sykora et al.
Luminance / Luminance channel • Luminance • The amount of light that passes through or is emitted from a particular area • Luminance Channel • Y - Full resolution plane that represents the mean luminance information only • U, V - Full resolution, or lower, planes that represent the chroma (color) information only
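A minimal Python sketch of this split, assuming a float RGB array with values in [0, 1] and the standard BT.601 conversion coefficients (the function name is only illustrative):

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Split an RGB image (H x W x 3, floats in [0, 1]) into Y, U, V planes
    using the BT.601 coefficients: Y carries luminance, U/V carry chroma."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    u = 0.492 * (b - y)                     # blue-difference chroma
    v = 0.877 * (r - y)                     # red-difference chroma
    return y, u, v
```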
Segmentation • The process of partitioning a digital image into multiple segments (sets of pixels, also known as superpixels)
Segmentation • Making the image more meaningful and easier to analyze • Locating objects and boundaries • Assigning a label to every pixel in an image
Segmentation • ‘Superpixel’ - A polygonal part of a digital image, larger than a normal pixel, that is rendered in the same color and brightness
Segmentation • A possible implementation is mean-shift segmentation, as sketched below
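A minimal sketch of that idea, using scikit-learn's MeanShift on per-pixel features (normalized position plus intensity); the feature design and parameters are illustrative assumptions, not the exact segmentation used in the paper:

```python
import numpy as np
from sklearn.cluster import MeanShift

def meanshift_segment(gray, bandwidth=0.15, spatial_weight=0.5):
    """Rough mean-shift segmentation of a grayscale image (H x W, floats in [0, 1]).
    Each pixel is described by (x, y, intensity); pixels whose features converge
    to the same mode receive the same segment label."""
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([spatial_weight * xs.ravel() / w,   # normalized column
                      spatial_weight * ys.ravel() / h,   # normalized row
                      gray.ravel()], axis=1)             # intensity
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(feats)
    return labels.reshape(h, w)
```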
Discrete Cosine Transform • Expresses a finite sequence of data points in terms of a sum of cosine functions oscillating at different frequencies • The DCT is a Fourier-related transform similar to the discrete Fourier transform (DFT), but using only real numbers
Discrete Cosine Transform • Can be used for compression
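For instance, a short sketch of computing the 2-D DCT of an 8×8 grayscale patch with SciPy; the low-frequency coefficients of such a transform are commonly flattened into a compact texture feature (the helper name is illustrative):

```python
import numpy as np
from scipy.fft import dctn

def dct_feature(patch):
    """2-D DCT (type II, orthonormal) of a square grayscale patch,
    flattened into a feature vector."""
    return dctn(patch.astype(float), norm='ortho').ravel()

# Example: 8x8 vertical gradient patch
patch = np.outer(np.arange(8), np.ones(8)) / 7.0
print(dct_feature(patch)[:5])   # first few (low-frequency) coefficients
```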
K-nearest-neighbor (Knn) • In pattern recognition, the k-nearest neighbor algorithm (k-NN) is a non-parametric method for classifying objects based on closest training examples in the feature space.
K-nearest-neighbor (Knn) • All instances are points in n-dimensional space • “Closeness” between points is determined by some distance measure • Classification is made by a majority vote among the neighbors
K-nearest-neighbor – 2D Example • Given n points with known locations and classes (a or b), a new point is assigned the majority class among its k nearest neighbors • The resulting label can differ for different k (e.g., k = 2 vs. k = 5) • (Figure: scatter of class-a and class-b points, with a new point classified for k = 2 and for k = 5)
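A minimal numpy sketch of this majority-vote rule; the toy coordinates and labels below are made up for illustration:

```python
import numpy as np
from collections import Counter

def knn_classify(train_pts, train_labels, query, k=5):
    """Classify `query` by a majority vote among its k nearest
    training points (Euclidean distance)."""
    dists = np.linalg.norm(train_pts - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D data in the spirit of the example above (values are made up)
pts = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]])
labels = np.array(['a', 'a', 'a', 'b', 'b', 'b'])
print(knn_classify(pts, labels, np.array([4.5, 4.5]), k=3))   # -> 'b'
```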
Linear discriminant analysis (LDA) Background • In the field of machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to
Linear discriminant analysis (LDA) Background • A linear classifier achieves this by making a classification decision based on the value of a linear combination of the characteristics • An object's characteristics are also known as feature values and are typically presented to the machine in a vector called a feature vector.
Linear discriminant analysis (LDA) Background • There are two broad classes of methods for determining the parameters of a linear classifier • Generative models (conditional density functions) • LDA (or Fisher’s linear discriminant) • Discriminative models • Support vector machine (SVM)
Linear discriminant analysis (LDA) Background • Discriminative training often yields higher accuracy than modeling the conditional density functions. • However, handling missing data is often easier with conditional density models
Linear discriminant analysis (LDA) • LDA seeks to reduce dimensionality while preserving as much of the class discriminatory information as possible • LDA finds a linear subspace that maximizes class separability among the feature vector projections in this space
LDA – two classes • Having a set of D-dimensional samples {x1, …, xN} • The samples are divided into two groups: N1 samples belong to class ω1 and N2 samples belong to class ω2 • We seek to obtain a scalar y by projecting the samples x onto a line: y = wᵀx (http://research.cs.tamu.edu)
LDA – two classes • Of all the possible lines, we would like to select the one that maximizes the separability of the projected scalars
LDA – two classes • Trying to separate the two classes by projecting them onto different lines: (Figure: unsuccessful separation)
LDA – two classes • Trying to separate the two classes by projecting them onto different lines: (Figure: successful separation) • The projection reduces the problem's dimensionality from two features (x1, x2) to a single scalar value y
LDA – two classes • In order to find a good projection vector, we need to define a measure of separation • A naive measure is the distance between the projected mean vectors • (Figure: the axis with the larger distance between means is not the axis that yields better class separability)
LDA – two classes - Fisher’s solution • Fisher suggested maximizing the difference between the projected means, normalized by a measure of the within-class scatter • For each class we define the scatter, an equivalent of the variance: s̃ᵢ² = Σ_{y∈ωᵢ} (y − μ̃ᵢ)², and s̃₁² + s̃₂² is the within-class scatter of the projected samples • The Fisher linear discriminant is defined as the linear function wᵀx that maximizes the criterion function J(w) = (μ̃₁ − μ̃₂)² / (s̃₁² + s̃₂²)
LDA – two classes - Fisher’s solution • Therefore, we are looking for a projection where samples from the same class are projected very close to each other and, at the same time, the projected means are as far apart as possible
Two Classes - Example • Two sample classes: X1, X2
Two Classes - Example • μ1 and μ2 are the mean vectors of each class • S1 and S2 are the covariance matrices (the scatter) of X1 and X2
Two Classes - Example • Sb is the between-class scatter matrix: Sb = (μ1 − μ2)(μ1 − μ2)ᵀ • Sw is the within-class scatter matrix: Sw = S1 + S2
Two Classes - Example • Finding the eigenvalues and eigenvectors of Sw⁻¹Sb (i.e., solving Sb·w = λ·Sw·w)
Two Classes - Example • The projection vector with the highest eigenvalue provides the greatest discrimination power between classes • (Figure: LDA projection found by Fisher’s linear discriminant)
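A compact numpy sketch of the two-class computation above, using the closed-form solution w ∝ Sw⁻¹(μ1 − μ2); the sample values are made up for illustration:

```python
import numpy as np

def fisher_lda_2class(X1, X2):
    """Fisher's linear discriminant for two classes: returns the unit
    projection vector w maximising J(w) = (w^T Sb w) / (w^T Sw w)."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = (X1 - mu1).T @ (X1 - mu1)          # scatter of class 1
    S2 = (X2 - mu2).T @ (X2 - mu2)          # scatter of class 2
    Sw = S1 + S2                            # within-class scatter
    w = np.linalg.solve(Sw, mu1 - mu2)      # closed form: Sw^-1 (mu1 - mu2)
    return w / np.linalg.norm(w)

# Toy 2-D classes (values made up for illustration)
X1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)
w = fisher_lda_2class(X1, X2)
print("projection vector:", w)
print("projected class means:", (X1 @ w).mean(), (X2 @ w).mean())
```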
LDA Limitation • LDA is a parametric method, since it assumes Gaussian conditional density models • Therefore, if the sample distributions are non-Gaussian, LDA will have difficulty classifying complex structures
Colorization using optimization (Levin et al.) • The user scribbles desired colors inside regions • Colors are propagated to all pixels • Working in the YUV color space • Remember: neighboring pixels with similar intensities should have similar colors
Colorization using optimization (Levin et al.) • Input: Y(x, y, t) intensity volume • Output: U(x, y, t) and V(x, y, t) color volumes • Each chroma value should be close to a weighted average of its neighbors, i.e., the method minimizes J(U) = Σᵣ ( U(r) − Σ_{s∈N(r)} w_rs U(s) )² • w_rs is a weighting function that sums to one and is large when Y(r) is similar to Y(s), e.g., w_rs ∝ exp(−(Y(r) − Y(s))² / (2σᵣ²)) • μᵣ and σᵣ are the mean and variance of the intensities in a window around the pixel r
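A rough single-frame sketch of the optimization step (an illustration under simplifying assumptions, not Levin et al.'s exact implementation): pin the scribbled pixels, require every other pixel's chroma to equal a weighted average of its neighbors with weights derived from luminance similarity, and solve the resulting sparse linear system once per chroma channel:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def colorize_channel(Y, scribble_mask, scribble_vals, win=1, eps=1e-6):
    """Propagate one chroma channel (U or V) from scribbled pixels across a
    grayscale image Y.  Scribbled pixels are fixed; every other pixel must
    equal a luminance-weighted average of its neighbours."""
    h, w = Y.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)
    rows, cols, vals = [], [], []
    for r in range(h):
        for c in range(w):
            i = idx[r, c]
            if scribble_mask[r, c]:
                rows.append(i); cols.append(i); vals.append(1.0)   # pin scribble
                continue
            r0, r1 = max(0, r - win), min(h, r + win + 1)
            c0, c1 = max(0, c - win), min(w, c + win + 1)
            nbrs = idx[r0:r1, c0:c1].ravel()
            nbrs = nbrs[nbrs != i]
            sigma2 = max(Y[r0:r1, c0:c1].var(), eps)               # local variance
            wgt = np.exp(-(Y.ravel()[nbrs] - Y[r, c]) ** 2 / (2 * sigma2))
            wgt /= wgt.sum()                                       # weights sum to one
            rows.append(i); cols.append(i); vals.append(1.0)
            rows.extend([i] * len(nbrs)); cols.extend(nbrs.tolist()); vals.extend((-wgt).tolist())
    A = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))
    b = np.where(scribble_mask.ravel(), scribble_vals.ravel(), 0.0)
    return spsolve(A, b).reshape(h, w)
```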