Digital image basics • Human vision perception system • Image formation • Human vision property and model • Image acquisition • Image transform • Image quality • Connected components • Image sensing • Image formats
1. Human vision (cont’d) • Two types of receptors • -- Cones (concentrated in the fovea): sensitive to brightness and color • - about 7 million • - cone vision (photopic, bright-light vision) • -- Rods: sensitive to low-level illumination • - about 100 million • - rod vision (scotopic, dim-light vision)
3. Vision property (cont’d) • Brightness adaptation • -- There is a range of intensity levels to which the human eye can adapt • - photopic: 10^(-3) (mL) – 10^(3) (mL) • - scotopic: 10^(-3) (mL) – 10^(-1) (mL) • -- The eye has a brightness adaptation level; it cannot adapt to the whole range simultaneously
3. Vision property (cont’d) • Brightness discrimination • -- The ability to discriminate between different intensity levels • - Weber ratio: the just-noticeable difference in intensity relative to the background intensity • -- The intensity defined in a digital image is not the real intensity; it is a contrast scale (e.g., a gray scale)
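As a brief note, the Weber ratio can be written as delta_Ic / I, where delta_Ic is the just-noticeable increment in intensity over a uniform background of intensity I; a small Weber ratio indicates good brightness discrimination.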
3. Vision property (cont’d) • Contrast • -- Absolute contrast • C = Bmax / Bmin • where Bmax is the maximum brightness intensity and Bmin is the minimum brightness intensity • -- Relative contrast • Cr = (B – B0) / B0 • where B is the brightness of the object and B0 is the background brightness • -- Mach band: an over-shooting effect perceived near intensity boundaries
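For example (illustrative numbers only): an object of brightness B = 120 on a background of B0 = 100 has relative contrast Cr = (120 – 100) / 100 = 0.2.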
3. Vision property (cont’d) • Spatial discrimination (SD) • -- the minimum viewing angle at which two points on the viewed object can still be discriminated • -- for two points separated by d at viewing distance L: theta / 360 = d / (2 * Pi * L)
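For instance (illustrative numbers only): two points separated by d = 1 mm viewed from L = 1 m subtend theta = 360 * d / (2 * Pi * L) ≈ 0.057 degrees, i.e., roughly 3.4 arc-minutes.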
3. Vision property (cont’d) • Spatial discrimination (SD) • -- low illumination (SD decreases) • -- low contrast (SD decreases) • -- very high illumination (SD does not increase much further) • -- SD for color is weaker than SD for brightness • -- projection onto the fovea (SD increases)
3. Vision property (cont’d) • Human vision model • -- block diagram: input image f(x, y) → optical system H(u, v) ~ h(x, y) → output image g(x, y) • -- g(x, y) = T[f(x, y)] • -- T: transforms the input optical scene into the output image • - linear or non-linear transform • - H(u, v) acts as a low-pass filter (limited spatial discrimination; linear) • - logarithmic response to brightness (non-linear) • - time-delay effect (e.g., the “image-remain” after-image effect)
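A minimal sketch of this model, assuming NumPy and SciPy are available; the function name and the parameter values (sigma, eps) are illustrative choices, not a standard implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vision_model(f, sigma=1.5, eps=1.0):
    """Toy version of g = T[f]: spatial low-pass filtering followed by a log response."""
    # H(u,v) approximated in the spatial domain by a Gaussian blur (limited discrimination)
    low_passed = gaussian_filter(f.astype(float), sigma=sigma)
    # non-linear, roughly logarithmic response to brightness
    return np.log(low_passed + eps)
```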
4. Image acquisition • Wavelength • -- electromagnetic spectrum
4. Image acquisition (cont’d) • Principle of imaging sensors • -- transform illumination energy into a digital image • -- the output voltage waveform is proportional to the incident light • -- e.g., a single sensor, a sensor strip (line sensors, as in CT/MRI), or a 2-D sensor array (CCD)
4. Image acquisition (cont’d) • Image digitizing • -- Sampling: digitizing the coordinate values (spatially) • - Nyquist rate: 2*F(max) • - limited by the number of sensors • - spatial sampling: uniform and non-uniform (e.g., fovea-based, fish-eye based) • -- Quantization: digitizing the amplitude values • - uniform • - non-uniform (based on image characteristics)
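A minimal sketch of uniform quantization of an 8-bit image to k bits, assuming NumPy; the mid-bin reconstruction value is just one common choice:

```python
import numpy as np

def quantize_uniform(f, k):
    """Uniformly quantize an 8-bit gray-level image f to 2**k levels."""
    levels = 2 ** k
    step = 256 / levels                       # width of each quantization bin
    q = np.floor(f.astype(float) / step)      # bin index in [0, levels - 1]
    return (q * step + step / 2).astype(np.uint8)  # mid-bin reconstruction value
```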
4. Image acquisition (cont’d) • Image digitizing • -- f(x, y) is the gray level at pixel location (x, y) • -- the gray level is not the real illumination intensity (it is an index on the gray scale) • -- f(x, y) is in the range [0, 255] for an 8-bit image • -- an image of size M*N with k bits per pixel requires M*N*k bits in total
4. Image acquisition (cont’d) • Spatial resolution • -- the number of pixels with respect to the image size • -- line pairs per unit distance: a measure of the smallest discernible detail in an image • - e.g., 100 lp/mm
4. Image acquisition (cont’d) • Relationship between spatial resolution N and gray-level resolution K • -- increasing N and K improves image quality • -- reducing N and K degrades quality and perceived contrast • -- when N is high (much fine detail), K (the number of gray levels) can be reduced (e.g., half-tone images)
4. Image acquisition (cont’d) • Aliasing problem • -- jagged or staircase effect • -- occurs in image acquisition (e.g., image processing) • -- occurs in display (e.g., computer graphics) • -- Reason: the sampling or display resolution is lower than the minimum rate 2*F(max), the Nyquist rate • -- Possible solution: • - smooth the image before sampling to reduce F(max) • - side effect: the image is blurred, so quality decreases
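A minimal sketch of the “smooth before subsampling” idea, assuming NumPy and SciPy; the Gaussian sigma used here is only a rough heuristic, not a tuned anti-aliasing filter:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def downsample(f, factor, smooth=True):
    """Subsample an image by an integer factor, optionally low-pass filtering first."""
    g = f.astype(float)
    if smooth:
        # blur roughly in proportion to the subsampling factor to reduce F(max)
        g = gaussian_filter(g, sigma=factor / 2.0)
    return g[::factor, ::factor]   # keep every factor-th row and column
```

Calling downsample(f, 4, smooth=False) typically shows the staircase artifacts; with smooth=True they are reduced at the cost of some blur.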
5. Image transform • Size change • -- Zoom-in • -- Zoom-out • -- pixel replication • -- pixel interpolation • -- super-resolution • Shape change • -- geometric transformation
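A minimal sketch of zoom-in by pixel replication and by bilinear interpolation, assuming NumPy and SciPy; the function names are illustrative:

```python
import numpy as np
from scipy.ndimage import zoom

def zoom_replicate(f, factor):
    """Zoom in by an integer factor using pixel (nearest-neighbor) replication."""
    return np.repeat(np.repeat(f, factor, axis=0), factor, axis=1)

def zoom_bilinear(f, factor):
    """Zoom by an arbitrary factor using bilinear (order-1) interpolation."""
    return zoom(f.astype(float), factor, order=1)
```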
6. Image quality • Subjective • -- Rating (e.g., R = 1, 2, …, 5) • -- overall score: the average of the individual ratings, e.g., (1/N) · SUM over i of R_i, where N is the number of evaluators and R_i is the rating given by evaluator i • -- applications in image enhancement, restoration, compression, etc.
6. Image quality • Objective • -- Mean square error: E = (1/(M·N)) · SUM over (x, y) of [f(x, y) – f^(x, y)]^2 • -- dB value: -10·Log(E) • -- f(x, y) is the image to be evaluated; f^(x, y) is the reference image it is compared with • -- applications in image coding, etc.
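A minimal sketch of these measures, assuming NumPy; with images normalized to [0, 1] (peak = 1) the dB value reduces to the -10·Log(E) form above:

```python
import numpy as np

def mse(f, f_ref):
    """Mean square error E between the image under test f and the reference image f_ref."""
    d = f.astype(float) - f_ref.astype(float)
    return np.mean(d ** 2)

def error_db(f, f_ref, peak=255.0):
    """PSNR-style dB value; with peak = 1 this equals -10 * log10(E)."""
    return 10 * np.log10(peak ** 2 / mse(f, f_ref))
```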
7. Connected components • Relationship of pixels • -- N4(P): the four horizontal/vertical neighbors of pixel P (strong neighbors) • -- ND(P): the four diagonal neighbors of P (weak neighbors) • -- N8(P) = N4(P) ∪ ND(P): the eight neighbors of P
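A minimal sketch of the neighbor sets, using (x, y) pixel coordinates; bounds checking against the image size is omitted:

```python
def n4(p):
    """N4(p): the four strong (horizontal/vertical) neighbors of p = (x, y)."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """ND(p): the four weak (diagonal) neighbors of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """N8(p) = N4(p) union ND(p): the eight neighbors of p."""
    return n4(p) | nd(p)
```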
7. Connected components (cont’d) • Adjacency • -- 4-adjacency: p and q are adjacent if q is in N4(p) • -- 8-adjacency: p and q are adjacent if q is in N8(p) • -- m-adjacency (mixed adjacency): q is in N4(p), or q is in ND(p) and N4(p) ∩ N4(q) is empty • -- m-adjacency removes the ambiguity of multiple 8-connected paths (a pair p, q can be 8-connected yet not m-connected)
7. Connected components (cont’d) • Path • -- If p and q are connected, there is a path between p and q • -- m-path: a path between p and q built from m-connected pixels • -- closed path: a path whose starting and ending pixels are the same
7. Connected components (cont’d) • Connected component • -- a set of pixels that are all connected to each other • -- the set is also called a connected set • Concepts • -- R is a region if R is a connected set • -- the boundary of R is a closed path • -- edge: a gray-level discontinuity at a point • - linking edge points forms an edge segment
7. Connected components (cont’d) • Distance • -- D(p, q) is the distance between p and q; a distance measure satisfies: • D(p, q) >= 0 • D(p, q) = D(q, p) • D(p, q) <= D(p, z) + D(z, q) • -- Euclidean distance (disk-shaped contours) • De(p, q) = sqrt[(xp – xq)^2 + (yp – yq)^2]
7. Connected components (cont’d) • Distance • -- D4 distance (city-block distance; diamond-shaped contours) • D4(p, q) = |xp – xq| + |yp – yq| • -- pixels with D4 <= 2 from the center point form a diamond:
        2
      2 1 2
    2 1 0 1 2
      2 1 2
        2
7. Connected components (cont’d) • Distance • -- D8 distance (chessboard distance; square-shaped contours) • D8(p, q) = max(|xp – xq|, |yp – yq|) • -- pixels with D8 <= 2 from the center point form a square:
    2 2 2 2 2
    2 1 1 1 2
    2 1 0 1 2
    2 1 1 1 2
    2 2 2 2 2
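A minimal sketch of the three distance measures for points p = (xp, yp) and q = (xq, yq):

```python
import math

def d_e(p, q):
    """De: Euclidean distance (disk-shaped equidistance contours)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d_4(p, q):
    """D4: city-block distance (diamond-shaped contours)."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d_8(p, q):
    """D8: chessboard distance (square-shaped contours)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))
```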
7. Connected components (cont’d) • Distance • -- Dm distance: the length of the shortest m-path between two points (it depends on the pixel values along the path as well as their positions) • -- example: a zig-zag chain of five 1-valued pixels linked by a single m-path of four steps gives Dm = 4
    1 - 1
        |
        1 - 1
            |
            1
8. Pixel operation • Point-wise operation • -- an M*N image is represented as a bound matrix • -- (r, t) are the coordinates of its upper-left component • -- each component is either defined (represented by a certain intensity value) or undefined (represented by “*”)
8. Pixel operation (Cont’d) • Arithmetic operations • (1) ADD[f, g](i, j) • = f(i, j) + g(i, j) IF f(i, j) and g(i, j) are both defined (condition C1) • = * otherwise • (2) MULT[f, g](i, j) • = f(i, j) · g(i, j) IF C1 • = * otherwise • (3) SCALAR[t; f](i, j) • = t · f(i, j) IF f(i, j) is defined • = * otherwise
8. Pixel operation (Cont’d) • Arithmetic operations • (4) MAX[f, g](i, j) • = max[f(i, j), g(i, j)] IF C1 • = * otherwise • (5) MIN[f, g](i, j) • = min[f(i, j), g(i, j)] IF C1 • = * otherwise • (6) SUB[f](i, j) • = -f(i, j) IF f(i, j) is defined • = * otherwise
8. Pixel operation (Cont’d) • Arithmetic operations • (7) EXTEND[f, g](i, j) • = f(i, j) IF f(i, j) is defined • = g(i, j) otherwise • (8) EXTADD[f, g](i, j) • = ADD[f, g](i, j) IF C1 • = f(i, j) IF f(i, j) is defined and g(i, j) = * • = g(i, j) IF g(i, j) is defined and f(i, j) = * • = * where both f and g are undefined
8. Pixel operation (Cont’d) • Arithmetic operations • (9) THRESH[f, t](i, j) • = 1 IF f(i, j) >= t • = 0 IF f(i, j) < t • = * IF f(i, j) = * • (10) TRUNC[f, t](i, j) • = f(i, j) IF f(i, j) >= t • = 0 IF f(i, j) < t • = * IF f(i, j) = * • TRUNC[f, t](i, j) = MULT[f, THRESH[f, t]](i, j)
8. Pixel operation (Cont’d) • Arithmetic operations • (11) EQUAL[f, t](i, j) • = 1 IF f(i, j) = t • = 0 otherwise • = * on the undefined domain • (12) similar definitions for • GREATER[f, t](i, j) • BETWEEN[f, t1, t2](i, j) • (13) operations with masking: • AND, OR, NOT
8. Pixel operation (Cont’d) • Arithmetic operations • (14) PIXSUM(f): the sum of all pixels on the defined domain • (15) DOT(f, g) = SUM[f(i, j) · g(i, j)] over the common defined domain • (16) NORM(f) = [SUM f(i, j)^2]^(1/2) • NORM(f) = [DOT(f, f)]^(1/2)
8. Pixel operation (Cont’d) • Arithmetic operations • (17) REST[f, g](i, j) • = f(i, j) IF g(i, j) is defined • = * IF g(i, j) = * • (18) Note on linearity: • a linear operation satisfies H(af + bg) = aH(f) + bH(g) • otherwise it is a non-linear operation (e.g., |f – g|) • H: operator; f, g: images; a, b: scalar values
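A minimal sketch of a few of the operations above, assuming NumPy and representing the undefined value “*” by NaN in a float array; only ADD, SCALAR, THRESH and EXTADD are shown:

```python
import numpy as np

UNDEF = np.nan  # stands for the undefined value "*"

def add(f, g):
    """ADD[f,g]: defined only where both f and g are defined (condition C1)."""
    return np.where(np.isnan(f) | np.isnan(g), UNDEF, f + g)

def scalar(t, f):
    """SCALAR[t; f]: t * f wherever f is defined."""
    return np.where(np.isnan(f), UNDEF, t * f)

def thresh(f, t):
    """THRESH[f,t]: 1 where f >= t, 0 where f < t, undefined where f is undefined."""
    return np.where(np.isnan(f), UNDEF, (f >= t).astype(float))

def extadd(f, g):
    """EXTADD[f,g]: ADD where both defined, otherwise the defined operand, else '*'."""
    out = np.full(f.shape, UNDEF, dtype=float)
    both = ~np.isnan(f) & ~np.isnan(g)
    only_f = ~np.isnan(f) & np.isnan(g)
    only_g = np.isnan(f) & ~np.isnan(g)
    out[both] = f[both] + g[both]
    out[only_f] = f[only_f]
    out[only_g] = g[only_g]
    return out
```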
Image Sensing • Single Image Sensor • Line Sensor (Sensor strip) • Array Sensor
Image Sensing • Linear motion • Rotation • Sensing Ring for CT (x-ray) to create cross-sectional images
Image Format • TIF (LZW – lossless coding) • GIF • JPEG • BMP
Image Format • TIF (LZW – lossless coding) • Tagged Image File Format • Image header: • each field = tag + value • image size • compression • color depth • location of data • bits per sample • …
Image Format • JPEG • 8*8 blocks → DCT → coefficient quantization → zig-zag scan → run-length coding → Huffman coding
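A minimal sketch of the block-DCT and quantization steps (the zig-zag scan, run-length and Huffman coding stages are omitted), assuming SciPy; the flat quantization table Q is a placeholder, not the standard JPEG table:

```python
import numpy as np
from scipy.fft import dctn, idctn

Q = np.full((8, 8), 16)  # placeholder quantization table; real JPEG uses standard tables

def encode_block(block):
    """Level-shift, 2-D DCT, then quantize the coefficients of one 8x8 block."""
    coeffs = dctn(block.astype(float) - 128, norm='ortho')
    return np.round(coeffs / Q).astype(int)

def decode_block(qcoeffs):
    """Dequantize and apply the inverse 2-D DCT to recover approximate pixel values."""
    return idctn(qcoeffs * Q, norm='ortho') + 128
```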
Image Format • BMP • PBM – portable bitmap file format (binary) • PGM – portable graymap (gray scale) • PPM – portable pixmap (color)