An Introduction to Texture Synthesis 2008/05/29
Desirable Properties • Result looks like the input • Efficient • General • Easy to use • Extensible
The Challenge Need to model the whole spectrum of textures: from repeated to stochastic, and textures that are both at once.
[Shannon, '48] proposed a way to generate English-looking text using N-grams: • Assume a generalized Markov model • Use a large text to compute probability distributions of each letter given the N-1 previous letters • Starting from a seed, repeatedly sample this Markov chain to generate new letters • Also works for whole words
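To make the N-gram idea concrete, here is a minimal sketch of letter-level Markov text generation. It only illustrates the sampling loop described above; the corpus file name, the order N = 3, and the output length are arbitrary choices, not anything from the original slides.

```python
import random
from collections import defaultdict

def build_model(text, n=3):
    """Map each (n-1)-letter context to the list of letters that follow it."""
    model = defaultdict(list)
    for i in range(len(text) - n + 1):
        context, nxt = text[i:i + n - 1], text[i + n - 1]
        model[context].append(nxt)
    return model

def generate(model, seed, length=300, n=3):
    """Repeatedly sample the Markov chain: extend the text with a random
    continuation of the current (n-1)-letter context."""
    out = seed
    while len(out) < length:
        context = out[-(n - 1):]
        choices = model.get(context)
        if not choices:                      # dead end: fall back to the seed context
            choices = model[seed[:n - 1]]
        out += random.choice(choices)
    return out

corpus = open("corpus.txt").read()           # hypothetical large English text
model = build_model(corpus, n=3)
print(generate(model, seed=corpus[:2]))
```

The same loop works for whole words: tokenize the corpus into words and use (N-1)-word contexts instead of letters.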
Synthesizing a pixel (non-parametric sampling): assuming the Markov property, compute P(p | N(p)), the distribution of pixel p given its neighborhood N(p) • Building explicit probability tables is infeasible • Instead, search the input image for all neighborhoods similar to N(p); those matches are our histogram for p • To synthesize p, just pick one match at random and copy its center pixel
Basic Algorithm Exhaustively search neighborhoods
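A minimal sketch of this exhaustive search for one output pixel, assuming a grayscale float image, an output pixel far enough from the border, and a boolean mask marking which neighbors are already synthesized. The names, window half-width, and error tolerance are illustrative, not the authors' code.

```python
import numpy as np

def synthesize_pixel(input_img, out_img, known_mask, y, x, half=2, eps=0.1):
    """Sample output pixel (y, x): compare its partially known neighborhood
    against every full neighborhood in the input and copy the center of a
    randomly chosen good match."""
    patch = out_img[y - half:y + half + 1, x - half:x + half + 1]
    mask = known_mask[y - half:y + half + 1, x - half:x + half + 1]

    H, W = input_img.shape
    errors, centers = [], []
    for iy in range(half, H - half):
        for ix in range(half, W - half):
            cand = input_img[iy - half:iy + half + 1, ix - half:ix + half + 1]
            # Masked sum of squared differences over the known neighbors only.
            d = (((cand - patch) ** 2) * mask).sum() / mask.sum()
            errors.append(d)
            centers.append((iy, ix))

    errors = np.array(errors)
    # Keep every candidate within (1 + eps) of the best match, pick one at random.
    good = np.flatnonzero(errors <= errors.min() * (1 + eps))
    iy, ix = centers[np.random.choice(good)]
    return input_img[iy, ix]
```

In the full algorithm this is called for every unfilled pixel, growing the output outward from a small seed copied from the input.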
Neighborhood Neighborhood size determines the quality & cost. [Figure: results for neighborhood sizes 3×3, 5×5, 7×7, 9×9, 11×11, and 41×41, with reported synthesis times between 423 s and 24350 s]
Efros & Leung ’99 The algorithm • Very simple • Surprisingly good results • Synthesis is easier than analysis! • …but very slow Optimizations and Improvements • Multi-resolution Pyramid • Jump Map • Chaos Mosaic
Multi-resolution Pyramid [Figure: image pyramid from high resolution down to low resolution]
Multi-resolution Pyramid Better image quality & faster computation. [Figure: results for 1 level with a 5×5 neighborhood, 1 level with 11×11, and 3 levels with 5×5]
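A small sketch of how such a pyramid might be built by repeated blur-and-downsample; the Gaussian sigma and the number of levels are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def build_pyramid(img, levels=3, sigma=1.0):
    """Low-pass filter and downsample by 2 at each step:
    pyramid[0] is the full-resolution image, pyramid[-1] the coarsest."""
    pyramid = [np.asarray(img, dtype=float)]
    for _ in range(levels - 1):
        blurred = gaussian_filter(pyramid[-1], sigma)
        pyramid.append(blurred[::2, ::2])
    return pyramid
```

Synthesis then runs coarse-to-fine: each level is synthesized with a small neighborhood that also includes the corresponding pixels of the coarser level, which is why 3 levels with a 5×5 window can cover roughly the spatial extent of a much larger single-level window.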
What is a Jump Map? • Same size as the input • A set of jumps per pixel • Jumps are weighted according to similarity • Weights need not sum to 1
Synthesis with Jump Maps [Figure: input texture and synthesized output]
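A rough sketch, in the spirit of jump-map synthesis, of how the jump map defined above can be used at synthesis time: walk through the output copying input pixels, and occasionally follow a weighted jump to a similar input location so the walk stays coherent and inside the input. The data layout, jump probability, and row handling here are assumptions, not the published algorithm verbatim.

```python
import random

def synthesize_with_jump_map(input_img, jump_map, out_h, out_w, p_jump=0.1):
    """jump_map[(y, x)] -> list of (weight, (ny, nx)) pairs: input pixels whose
    neighborhoods are similar to pixel (y, x)'s."""
    H, W = len(input_img), len(input_img[0])
    out = [[None] * out_w for _ in range(out_h)]
    sy, sx = random.randrange(H), random.randrange(W)      # current input position
    for oy in range(out_h):
        for ox in range(out_w):
            out[oy][ox] = input_img[sy][sx]
            sx += 1                                         # walk with the output scan
            if sx >= W or random.random() < p_jump:         # fell off, or chose to jump
                jumps = jump_map.get((sy, min(sx, W - 1)), [])
                if jumps:
                    weights = [w for w, _ in jumps]
                    targets = [t for _, t in jumps]
                    sy, sx = random.choices(targets, weights=weights, k=1)[0]
                else:
                    sy, sx = random.randrange(H), random.randrange(W)
        sy, sx = (sy + 1) % H, random.randrange(W)          # start a fresh input row
    return out
```

Because all neighborhood comparisons happen when the jump map is built, the synthesis loop touches each output pixel only once, which is what makes the method fast.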
Chaos Mosaic [Xu, Guo & Shum, '00] Process: 1) tile the input image; 2) pick random blocks and place them at random locations; 3) smooth the edges. Used in Lapped Textures [Praun et al., '00].
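A minimal sketch of those three steps on a numpy image; the block size and the number of random pastes are illustrative, and the edge-smoothing step is only indicated by a comment.

```python
import numpy as np

def chaos_mosaic(input_img, out_h, out_w, block=32, n_blocks=200, rng=None):
    """Chaos Mosaic sketch: 1) tile the input over the output, 2) paste randomly
    chosen input blocks at random output locations.  Step 3 (smoothing the
    pasted edges) is omitted here."""
    rng = rng or np.random.default_rng()
    H, W = input_img.shape[:2]        # assumes H, W and out_h, out_w exceed `block`
    # 1) periodic tiling of the input
    ys = np.arange(out_h) % H
    xs = np.arange(out_w) % W
    out = input_img[np.ix_(ys, xs)]
    # 2) random blocks in random locations
    for _ in range(n_blocks):
        sy, sx = rng.integers(0, H - block), rng.integers(0, W - block)
        ty, tx = rng.integers(0, out_h - block), rng.integers(0, out_w - block)
        out[ty:ty + block, tx:tx + block] = input_img[sy:sy + block, sx:sx + block]
    return out
```

The random pastes break up the obvious periodicity of the tiling, which works well for stochastic textures but, as the next slide shows, not for structured ones.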
Chaos Mosaic [Xu, Guo & Shum, '00] Of course, this doesn't work for structured textures. [Figure: input vs. result]
Image Quilting Idea: combine the random block placement of Chaos Mosaic with the spatial constraints of Efros & Leung
Efros & Leung '99 extended: synthesizing a block B by non-parametric sampling • Idea: the unit of synthesis is a block • Exactly the same as before, but now we want P(B | N(B)) • Much faster: synthesize all pixels in a block at once • Works because neighboring pixels are highly correlated • Not the same as multi-scale!
[Figure: blocks B1 and B2 from the input texture, placed three ways: (1) random placement of blocks, (2) neighboring blocks constrained by overlap, (3) minimal error boundary cut]
Minimal error boundary The overlap error between two overlapping blocks is e = (B1_ov - B2_ov)^2, computed per pixel over the overlap region. For a vertical boundary, dynamic programming finds the minimal-error cut through this error surface: E(i, j) = e(i, j) + min(E(i-1, j-1), E(i-1, j), E(i-1, j+1)), and the cut is traced back from the minimum of the last row. [Figure: overlapping blocks, overlap error surface, minimal-error boundary]
Algorithm • Pick the block size and the overlap size • Synthesize blocks in raster order • Search the input texture for a block that satisfies the overlap constraints (above and left); this is easy to accelerate with nearest-neighbor search • Paste the new block into the resulting texture, using dynamic programming to compute the minimal error boundary cut (a condensed sketch follows below)
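A condensed sketch of one quilting step, assuming grayscale float blocks and only a vertical (left) overlap; the full algorithm also handles the horizontal overlap and the L-shaped region for interior blocks, and the paper picks randomly among candidates within a tolerance of the best rather than taking the strict minimum.

```python
import numpy as np

def vertical_min_cut(err):
    """Dynamic-programming seam through an overlap-error surface `err`
    (rows x overlap width); returns the column of the cut in each row."""
    h, w = err.shape
    E = err.copy()
    for i in range(1, h):
        for j in range(w):
            lo, hi = max(j - 1, 0), min(j + 1, w - 1)
            E[i, j] += E[i - 1, lo:hi + 1].min()
    seam = np.zeros(h, dtype=int)
    seam[-1] = int(np.argmin(E[-1]))
    for i in range(h - 2, -1, -1):            # backtrack from the cheapest last-row cell
        j = seam[i + 1]
        lo, hi = max(j - 1, 0), min(j + 1, w - 1)
        seam[i] = lo + int(np.argmin(E[i, lo:hi + 1]))
    return seam

def quilt_block(left_block, candidates, overlap):
    """Choose the candidate whose left overlap best matches the right overlap
    of `left_block`, then blend the two along the minimal-error cut."""
    left_ov = left_block[:, -overlap:]
    errs = [((c[:, :overlap] - left_ov) ** 2).sum() for c in candidates]
    best = candidates[int(np.argmin(errs))]   # the paper samples among near-best candidates
    seam = vertical_min_cut((best[:, :overlap] - left_ov) ** 2)
    out = best.copy()
    for i, j in enumerate(seam):
        out[i, :j] = left_ov[i, :j]           # keep the old block's pixels left of the cut
    return out
```

Each synthesized block is then pasted so that its overlap columns coincide with the previous block's, and the process repeats in raster order.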
Failures (Chernobyl Harvest)
Image Quilting for Texture Transfer [Figure: source texture + target image = transferred result]
Texture Transfer Take the texture from one object and “paint” it onto another object • This requires separating texture and shape • That’s HARD, but we can cheat • Assume we can capture shape by boundary and rough shading Then, just add another constraint when sampling: similarity to underlying image at that spot
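In block terms, that extra constraint is a weighted sum: the block error becomes alpha times the usual overlap error plus (1 - alpha) times the disagreement between the candidate's correspondence-map patch and the target image at that position. A tiny sketch with illustrative names and an assumed alpha:

```python
import numpy as np

def transfer_error(candidate, left_overlap, corr_patch, target_patch,
                   overlap, alpha=0.8):
    """Block score for texture transfer: alpha trades off texture fidelity
    (overlap error) against agreement with the target correspondence map."""
    texture_err = ((candidate[:, :overlap] - left_overlap) ** 2).sum()
    corresp_err = ((corr_patch - target_patch) ** 2).sum()
    return alpha * texture_err + (1 - alpha) * corresp_err
```

Lower alpha lets the target image dominate (the texture deforms more to follow the target); higher alpha keeps the result faithful to the source texture.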
[Examples: parmesan texture transferred onto a target image; rice texture transferred onto a target image]
[Figure: source texture, source correspondence image, target correspondence image, and target image]
[Result comparisons: input images synthesized by Portilla & Simoncelli, Xu, Guo & Shum, Wei & Levoy, and Image Quilting]
Homage to Shannon! [The same comparison on an image of text, synthesized as a texture]