DIGITAL IMAGE PROCESSING PRESENTED BY : RITESH ROHAN (BRANCH- E.T.C. , REG. NO. - 0501222276) & MUKESH KUMAR (BRANCH – E.T.C. , REG. NO. - 0501222074) I.A.C.R. ENGG. COLLEGE, RAYAGADA
TOPICS • 1. Different stages of image processing • 2. Components of an image processing system • 3. A review of various mathematical transforms • 4. Perception of color • 5. Image formation • 6. Image digitization • 7. Wavelet transforms
COMPONENTS OF AN IMAGE PROCESSING SYSTEM • 1. IMAGE SENSORS • 2. DIGITIZERS • 3. MASS STORAGE • 4. PROCESSORS • 5. HARD COPIERS • 6. DISPLAY DEVICES
IMAGE SENSORS: 1. Photochemical 2. Photoelectronic • DIGITIZER: produces a digital image composed of discrete intensity values at discrete positions • PROCESSORS: 1. CISC/RISC 2. MIMD 3. Pipelined 4. SIMD • DISPLAY UNIT
APPLICATIONS • OFFICE AUTOMATION • INDUSTRIAL AUTOMATION • BIOMEDICAL • REMOTE SENSING • SCIENTIFIC APPLICATIONS • CRIMINOLOGY • ASTRONOMY & SPACE APPLICATIONS • METEOROLOGY • INFORMATION TECHNOLOGY • ENTERTAINMENT & CONSUMER ELECTRONICS • PRINTING & GRAPHIC ARTS • MILITARY APPLICATIONS
MATHEMATICAL TRANSFORMS • MATRIX • FUZZY SETS • MATHEMATICAL MORPHOLOGY • WAVELET TRANSFORM
What is an image? • Ideally, we think of an image as a 2-dimensional light intensity function, f(x,y), where x and y are spatial coordinates, and f at (x,y) is related to the brightness or color of the image at that point. • In practice, most images are defined over a rectangle. • Continuous in amplitude ("continuous-tone") • Continuous in space: no pixels!
IMAGE FORMATION • TRANSFORMING A 3-DIMENSIONAL SCENE TO A 2-DIMENSIONAL PLANE • ASSIGNING AN INTENSITY TO AN IMAGE POINT THAT CORRESPONDS TO A PARTICULAR POINT OF THE SCENE
IMAGE FORMATION • TWO BASIC MODELS • GEOMETRIC MODEL(BASIC TRANSFORMATION,TRANSLATION,SCALING,ROTATION,CAMERA CALIBRATION) • PHOTOMETRIC MODEL(INTENSITY,TRANSFORMATION OF ENERGY)
Image Formation • For natural images we need a light source (λ: wavelength of the source). – E(x, y, z, λ): incident light on a point ((x, y, z): world coordinates of the point) • Each point in the scene has a reflectivity function. – r(x, y, z, λ): reflectivity function • Light reflects from a point and the reflected light is captured by an imaging device. – c(x, y, z, λ) = E(x, y, z, λ) × r(x, y, z, λ): reflected light.
Digital Images and Pixels • A digital image is the representation of a continuous image f(x,y) by a 2-d array of discrete samples. The amplitude of each sample is quantized to be represented by a finite number of bits. • Each element of the 2-d array of samples is called a pixel or pel (from "picture element") • Pixels are point samples, without extent. • A pixel is not: • Round, square, or rectangular • An element of an image sensor • An element of a display
A digital image can be represented as a matrix. The pixel values f(x,y) are sorted into the matrix, with x corresponding to the column and y to the row index. Matlab uses this convention. For a color image, f might be one of the components.
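The row/column convention above can be sketched in a few lines of Python, assuming NumPy is available; the array values here are made up purely for illustration.

```python
import numpy as np

# A tiny 3x4 grayscale "image" as a matrix of 8-bit samples.
# Following the slide's convention: y selects the row, x the column,
# so the sample f(x, y) is stored at f[y, x].
f = np.array([[ 10,  20,  30,  40],
              [ 50,  60,  70,  80],
              [ 90, 100, 110, 120]], dtype=np.uint8)

y, x = 1, 2           # row index 1, column index 2
print(f[y, x])        # the sample at column x = 2, row y = 1
print(f.shape)        # (rows, columns) = (3, 4)
```

Note that indexing is `f[row, column]`, i.e. `f[y, x]`, which is exactly the Matlab-style convention the slide describes.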
D.I.P. "theme park" overview (tour guide): Image Acquisition, Image Generation, Image Compression, Image Analysis, Image Manipulation, Image Display, Image Perception
Why do we process images? • Enhancement and restoration • remove artifacts and scratches from an old photo/movie • improve contrast and correct blurred images • Transmission and storage • images from oversea via Internet, or from a remote planet • Information analysis and automated recognition • providing “human vision” to machines • Security and rights protection • encryption and watermarking
Why Digital? • “Exactness” • Perfect reproduction without degradation • Perfect duplication of processing result • Convenient & powerful computer-aided processing • Can perform rather sophisticated processing through hardware or software • Even kindergartners can do it! • Easy storage and transmission • 1 CD can store hundreds of family photos! • Paperless transmission of high quality photos through network within seconds
Human Vision System • Image is to be seen. • Perceptual Based Image Processing • Focus on perceptually significant information • Discard perceptually insignificant information • Issues: • Biological • Psychophysical
Color • Color is the perceptual result of light having wavelength 400 nm to 700 nm that is incident upon the retina. • “Power distribution exists in the physical world, but color exists only in the eye and the brain.” • Does “red” mean the same to different people?
BRIGHTNESS & CONTRAST • Brightness is the sensation produced by light intensity. • Contrast may be defined as the difference in perceived brightness. • Visual spectrum: 0.4 micrometer → 0.7 micrometer • The human eye can distinguish about 350,000 colors.
COLOR MODELS • A color model is a unique 3-D representation of a color. There are different color models, and the choice of one over another is problem oriented. • RGB • YIQ • CMY • HLS • HSI
RGB • Useful for color generation • Color cube – Cartesian system – Corners: primary and secondary colors – Grayscale: black-white diagonal – All values normalized to [0,1] • Pixel depth: number of bits used to represent each pixel in the RGB space – e.g., full-color image: 24-bit RGB color image • Additive primaries • The RGB color model is used in hardware applications like PC monitors, cameras and scanners.
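The pixel-depth arithmetic mentioned above is easy to check; a minimal sketch for the 24-bit full-color case:

```python
bits_per_channel = 8
channels = 3                                 # R, G, B
pixel_depth = bits_per_channel * channels    # 24-bit "full color"
num_colors = 2 ** pixel_depth                # distinct representable colors

print(pixel_depth)    # 24
print(num_colors)     # 16777216, i.e. ~16.7 million colors
```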
CMY (CMYK) Color Models • For color printing • Secondary colors of light • [C M Y] = [1 1 1] − [R G B] • Subtractive primaries • CMYK: in order to have "true black" instead of the "muddy black" produced from the CMY combination. • The CMY color model is used in color printers.
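The relation [C M Y] = [1 1 1] − [R G B] on this slide can be written directly as a small function (values assumed normalized to [0, 1]):

```python
def rgb_to_cmy(r, g, b):
    """Convert normalized RGB in [0, 1] to CMY via [C M Y] = [1 1 1] - [R G B]."""
    return 1.0 - r, 1.0 - g, 1.0 - b

# Pure red needs no cyan ink, but full magenta and yellow:
print(rgb_to_cmy(1.0, 0.0, 0.0))   # (0.0, 1.0, 1.0)
```

This also makes the "subtractive primaries" point concrete: white RGB (1, 1, 1) maps to zero ink in all three channels.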
HSI • Color description; closest to human perception • Hue – pure color • Saturation – amount of dilution by white light • Brightness – subjective measure; ~intensity • RGB→HSI • HSI→RGB (see Book) • In color image manipulation, the two models widely used are HSI and HSV.
Understanding HSI from RGB Turn the RGB cube so that Black-White axis is vertical • Each plane containing the B-W axis and any color point contains all the colors of the same hue • Hue can be represented as angle between the plane and a reference plane (e.g. Red) • Color of the same hue can be made less saturated by mixing more grey colors • Intensity can be measured by intersection with the B-W axis.
Formation of the HSI color cone • Cross sections of the RGB cube along the B-W axis • The cross-section shape changes from triangle to hexagon to triangle • Hue is represented by the angle from the Red line • Saturation is represented by the distance from the origin • The hexagonal shape can be approximated by a circle or a triangle.
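The RGB→HSI conversion referred to on the HSI slides can be sketched with the standard textbook formulas (not taken from these slides, whose formula slide did not survive extraction): I is the channel mean, S measures dilution by white, and H is the angle from the red axis.

```python
import math

def rgb_to_hsi(r, g, b):
    """RGB (each in [0, 1]) -> (H in degrees, S, I), standard textbook formulas."""
    i = (r + g + b) / 3.0                                  # intensity: channel mean
    s = 0.0 if (r + g + b) == 0 else 1.0 - 3.0 * min(r, g, b) / (r + g + b)
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    theta = math.degrees(math.acos(num / den)) if den != 0 else 0.0
    h = theta if b <= g else 360.0 - theta                 # hue: angle from red
    return h, s, i

# Pure red: hue 0 degrees, fully saturated, intensity 1/3.
print(rgb_to_hsi(1.0, 0.0, 0.0))
# Pure green lands 120 degrees around the hue circle.
print(rgb_to_hsi(0.0, 1.0, 0.0))
```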
YIQ • The YIQ color model is used in television broadcasting (NTSC).
DIGITIZATION • The method of converting an image, which is continuous in space as well as in its values, into a discrete numerical form is called "IMAGE DIGITIZATION". • This includes 2 processes: • Sampling • Quantization
DIGITIZATION • Image sampling refers to discretization of the spatial coordinates, whereas quantization refers to discretization of the gray levels. • Normally, sampling and quantization deal with integer values. After the image is sampled with respect to the x & y coordinates, the numbers of samples used along the x & y directions are denoted as M & N respectively. M & N are usually integer powers of 2, and hence are represented as • M = 2^n & N = 2^k
DIGITIZATION • Similarly, when we discretize the gray levels, we use integer values, and the number of integer values is denoted as G. The number of gray level values used to represent an image is usually an integer power of 2: • G = 2^m • where m represents the number of bits used to represent a gray level value in the image. An image of size M×N consists of MN pixels, and the number of bits required to store a digital image is given by the following equation: • b = M × N × m
DIGITIZATION • The number of pixels that can be accommodated in a unit area is called the resolution of an image, which depends strongly on: • The values of M and N. • The number of bits used to represent the gray levels. When we increase the M and N values, the resolution increases and the storage requirement also increases.
DIGITIZATION • The number of bits required to store an image of size 64×64 with 16 gray levels is • b = 64 × 64 × 4 = 16384 bits, or 2048 bytes • If the image size is increased, with N = 256 and m = 6, then the number of bits required = ?
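The storage formula b = M × N × m is easy to evaluate for both cases; the sketch below assumes the enlarged image is square (M = N = 256), which the slide leaves implicit.

```python
def storage_bits(M, N, m):
    """Bits needed for an M x N image with 2**m gray levels: b = M * N * m."""
    return M * N * m

# Worked example from the slide: 64x64 image, 16 = 2**4 gray levels.
print(storage_bits(64, 64, 4))         # 16384 bits
print(storage_bits(64, 64, 4) // 8)    # 2048 bytes

# The follow-up question, assuming M = N = 256 and m = 6:
print(storage_bits(256, 256, 6))       # 393216 bits
print(storage_bits(256, 256, 6) // 8)  # 49152 bytes
```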
WAVELET TRANSFORM • A wavelet is a "small wave" which has its energy concentrated in time. • It is used in the analysis of transient, non-stationary or time-varying phenomena. • This tool allows simultaneous time & frequency analysis. • It offers a novel approach to analyzing signals that have abrupt transitions superimposed on lower frequencies, such as speech, music & bioelectric signals. • It has a multi-resolution capability.
• Stationary signals, whose spectral characteristics do not change with time, are represented as a function of time or of frequency. • Non-stationary signals, which involve both time & frequency (especially auditory & visual perceptions), require time-frequency analysis. • Time-frequency analysis involves mapping a signal, a one-dimensional function of time, into an image, a two-dimensional function of time & frequency, that displays the temporal localization of the signal's spectral components. • The STFT (short-time Fourier transform) maps a one-dimensional function ƒ(t) into a two-dimensional function STFT(τ, ƒ).
STFT • STFT of the signal x(t), computed for each window centered at t = t′: STFT_x(t′, f) = ∫ x(t) · w(t − t′) · e^(−j2πft) dt • x(t): signal to be analyzed • w(t − t′): windowing function centered at t = t′ • e^(−j2πft): FT kernel (basis function) • t′: time parameter • f: frequency parameter
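A minimal discrete sketch of the STFT idea, assuming NumPy is available: slide a window along the signal and take the FFT of each windowed segment. The window length, hop size, and test tone below are illustrative choices, not values from the slides.

```python
import numpy as np

def stft(x, window, hop):
    """Naive STFT: slide `window` along x by `hop` samples, FFT each segment.

    Each column is the spectrum of x around one window position t',
    mirroring STFT(t', f) = integral of x(t) * w(t - t') * e^(-j2*pi*f*t) dt.
    """
    L = len(window)
    starts = range(0, len(x) - L + 1, hop)
    return np.column_stack([np.fft.fft(x[s:s + L] * window) for s in starts])

# A 50 Hz tone sampled at 1 kHz, analyzed with a 128-sample Hann window.
fs = 1000
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * 50 * t)
S = stft(x, np.hanning(128), hop=64)

# Frequency resolution is fs / 128 ~ 7.8 Hz, so the energy in every frame
# concentrates near bin 50 / 7.8 ~ 6.4.
peak_bin = int(np.abs(S[:64, 0]).argmax())
print(peak_bin)   # near bin 6
```

Each column of `S` is one "time slice" of the spectrum, which is exactly the temporal localization of spectral components that the previous slide describes.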
APPLICATIONS OF WAVELET TRANSFORM • DATA COMPRESSION • SOUND SYNTHESIS • COMPUTER & HUMAN VISION • FINGERPRINT COMPRESSION • DENOISING NOISY DATA