REMOTE SENSORS – An Overview • Name of Speaker: Mrs. S. S. Palsule • Name of Course: Master In Geomatics • Date: 6-JAN-2010 • Venue: NEW CEPT CLASS-ROOM
Remote Sensors – An Overview Remote Sensors: These sensors are instruments that measure the properties of electromagnetic radiation leaving a surface or medium due to scattering or emission. The primary measurement is radiance, as a function of wavelength, collected over a spatial extent, including the angular dependence of the observations. Apart from radiance, microwave and optical sensors can also measure the state of polarization.
Classification of Sensors Remote sensors are classified as passive and active sensors. Since the technology development is not the same for all sensors, they are further classified as OIR and microwave sensors. These sensors can be imaging or non-imaging.
Passive Sensors: Sensors that sense natural radiation, either emitted or reflected from the Earth, are called passive sensors. Active Sensors: Sensors that produce their own electromagnetic radiation of a specific wavelength or band of wavelengths as part of the sensor system, and study the interaction of this radiation with the target for target identification, are called active sensors. OIR Sensors: Sensors operating in the optical and infra-red regions are OIR sensors.
Microwave Sensors: These sensors operate at the microwave frequencies of the E.M. spectrum. Imaging Sensors: These sensors give a two-dimensional spatial distribution of the emitted or reflected intensities of the E.M. radiation. Non-Imaging Sensors: These sensors measure the intensity of radiation within the field of view and, in some cases, as a function of distance along the line of sight of the instrument (e.g. spectro-radiometer, vertical temperature profile radiometer).
Selection of Sensor Parameters The information collected by a remote sensor is meant to identify and map various earth resources, which means that sensor performance is evaluated against classification as well as mapping accuracy requirements. The classification and mapping accuracy depend on the instrument's ability to detect small differences in the emittance/reflectance of the Earth's surface, in a number of spectral bands, for as small an object as possible, and as often as possible.
The question is: what is the optimum set of specifications for a remote sensor? There is no unique answer, as the choice of parameters depends on THE THEME UNDER STUDY. The realization of the identified parameters in a sensor system is a complex problem due to the interrelationship of these parameters. The sensor parameters fall under four domains: • SPATIAL • SPECTRAL • RADIOMETRIC • TEMPORAL
Basic Elements of OIR imaging system • The basic elements of an imaging system in the OIR region can be broadly classified as • Collecting optics / imaging system • Scanning mechanism for mechanical scanning • Color separation system • Detectors • In-flight calibration system • Associated electronics
Basic Elements of Satellite-borne Microwave Radiometer • An antenna, which receives the incoming radiation. • A scanning mechanism (mechanical or electrical). • A receiver and associated electronics, which detect and amplify the received radiation and produce a voltage output. • In-flight calibration system (hot body, sky horn). • Auxiliary logic system. • Housekeeping system.
Space-borne SAR System • Active antenna system • RF & baseband system • On-board calibration system • The active antenna system has the following elements: • Antenna electronics • RF power distribution • TR modules • TR control module
Contd. • The RF & baseband system has the following elements: • Feeder SSPA • Receiver • Control frequency generator • Digital chirp generator • Data acquisition & compression system
Spatial Resolution Spatial resolution is a measure of the sensor's ability to image (record) closely spaced objects so that they are distinguishable as separate objects. A sensor with 1 meter spatial resolution can reproduce finer details of the scene compared to a sensor with 10 meter resolution. The sensor design parameters that influence spatial resolution are analyzed and discussed below.
Concept of resolving two objects • The theoretical limit on resolving two objects with an imaging system, say a lens, is set by diffraction, i.e. • the phenomenon of EMR bending at the edge of an object in its path. • An ideal aberration-free lens system would image a point object as a point in the image plane (which is not practically possible).
The image of a point object formed by a diffraction-limited imaging system consists of a bright disc surrounded by concentric bright and dark rings, called the Airy pattern. The central disc, called the Airy disc, has an energy distribution with a central peak followed by a series of minima and maxima.
The energy contained in the central peak is 84% of the total energy, and the diameter of the Airy disc is 2.44 (λ/D) f, where λ is the wavelength of observation, f is the focal length of the lens, and D is the aperture of the lens. Two objects can just be resolved if the peak of the Airy pattern of one falls on the first minimum of the other, i.e. two image points can be resolved if their separation in the image plane is 1.22 (λ/D) f; the corresponding separation on the ground is 1.22 (λ/D) H, where H is the observation height.
For the IRS-1C PAN sensor, λ = 0.5 μm, D = 22 cm and H = 810 km; using the above formula for object separation on the ground, the spatial resolution limit is about 2 meters. The corresponding resolution element in the PAN camera's focal plane is 5.6 μm, which means image points separated by 5.6 μm can be imaged faithfully. If the same optics is used for SWIR at 2 μm, the spatial resolution limit becomes 8.0 m.
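As a quick check of these numbers, here is a minimal sketch that evaluates the Rayleigh criterion quoted above; the function name and rounding are ours, and only the λ, D and H values come from the text:

```python
def rayleigh_ground_resolution(wavelength_m, aperture_m, altitude_m):
    """Diffraction-limited ground separation: 1.22 * (lambda / D) * H."""
    return 1.22 * (wavelength_m / aperture_m) * altitude_m

# IRS-1C values from the text: lambda = 0.5 um, D = 22 cm, H = 810 km
vnir = rayleigh_ground_resolution(0.5e-6, 0.22, 810e3)
swir = rayleigh_ground_resolution(2.0e-6, 0.22, 810e3)
print(f"VNIR limit: {vnir:.1f} m")  # ~1.8 m, i.e. about 2 m
print(f"SWIR limit: {swir:.1f} m")  # ~7.4 m, i.e. about 8 m
```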
Resolving Power / Resolution • The distinction between resolution and resolving power is important: • resolving power applies to optical components such as the lens or film, while resolution applies to the image produced by the system. • Resolution is the combined effect of the resolving power of all associated components that form the image.
Contrast Ratio The difference in radiance of objects in a scene plays a crucial role in target detection. Contrast is usually defined as the ratio of the maximum to the minimum radiance between adjacent areas. Contrast modulation is another term used for measuring system performance.
Modulation Transfer Function The contrast of the image and of the object will be different, and the reduction or degradation of contrast by the imaging system is termed the MTF: MTF = (contrast modulation in image) / (contrast modulation in object). This term strictly applies to sinusoidal inputs; if a square-wave function is considered, the term contrast transfer function is used instead.
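As an illustration, a minimal sketch with hypothetical radiance values, assuming the standard modulation definition m = (Lmax − Lmin)/(Lmax + Lmin):

```python
def modulation(l_max, l_min):
    """Contrast modulation m = (Lmax - Lmin) / (Lmax + Lmin)."""
    return (l_max - l_min) / (l_max + l_min)

# Hypothetical radiances for a bar target (object) and its image
m_object = modulation(100.0, 20.0)  # scene modulation
m_image = modulation(80.0, 40.0)    # contrast degraded by the imaging chain
print(f"MTF = {m_image / m_object:.2f}")  # < 1: the system reduces contrast
```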
Clear/Hazy Atmosphere An object imaged in a clear atmosphere is more easily resolvable than the same object imaged in a scattering/hazy atmosphere. For example, an image taken in a clear atmosphere may have a contrast ratio of 2.5, while the same object imaged in a hazy atmosphere has a contrast ratio of only 1.4.
Therefore the contrast ratio is a critical parameter in determining the ability to resolve and detect objects. An object with a high contrast ratio is more easily identified than one with a low contrast ratio. For a photographic camera: Ground resolution = (1 / 2Rs) × (H / f), where Rs is the resolution of the system in line pairs per unit length, f is the focal length of the lens, and H is the observation height.
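A minimal sketch of this relation; the Rs, f and H values below are hypothetical, not from the text:

```python
def photo_ground_resolution(rs_lp_per_mm, focal_length_m, altitude_m):
    """Ground resolution = (1 / (2 * Rs)) * (H / f), with Rs in line pairs/mm."""
    line_width_on_film_m = 1.0 / (2.0 * rs_lp_per_mm) / 1000.0  # mm -> m
    return line_width_on_film_m * (altitude_m / focal_length_m)

# Hypothetical aerial camera: Rs = 50 lp/mm, f = 300 mm, H = 5 km
print(f"{photo_ground_resolution(50.0, 0.30, 5000.0):.2f} m")  # ~0.17 m
```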
Scale: The map scale is the ratio of a distance measured on the map to the corresponding distance measured on the ground between the same two points. Instantaneous Field of View: When electronic sensors using discrete detectors are used for generating imagery, the spatial resolution denotes the projection of a detector element on the ground through the optics.
The 5.8 meter resolution of the IRS-1C PAN camera means that the projection of one CCD element on the ground through the imaging optics, from the satellite orbit, is 5.8 meters. The footprint of the detector element on the ground is referred to as the spatial resolution (IGFOV), and the same quantity in angular measure is referred to as the instantaneous field of view (IFOV): IFOV = d / f (radians), where d is the detector dimension and f is the focal length of the imaging system. The IFOV thus characterizes the sensor irrespective of the altitude from which the image is taken.
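A minimal sketch of the IFOV relation; the detector size, focal length and altitude below are illustrative assumptions chosen to land near the quoted 5.8 m, not values stated in the text:

```python
def ifov_rad(detector_size_m, focal_length_m):
    """IFOV = d / f, in radians."""
    return detector_size_m / focal_length_m

def ground_footprint_m(detector_size_m, focal_length_m, altitude_m):
    """IGFOV (ground footprint) = IFOV * H."""
    return ifov_rad(detector_size_m, focal_length_m) * altitude_m

d, f, H = 7e-6, 0.98, 817e3  # assumed values
print(f"IFOV = {ifov_rad(d, f):.2e} rad")
print(f"footprint = {ground_footprint_m(d, f, H):.1f} m")  # ~5.8 m
```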
Definition of Pixel/Swath • Pixel: The image contained within one IFOV is represented by a picture element in the image plane, referred to as a pixel. • Swath: The total field of view of the camera defines the swath of the system.
Effective Instantaneous Field of View IFOV alone cannot give an idea of the 'detection' capability of a sensor; it is a design parameter, while the effective IFOV (EIFOV) is used to compare sensor performance. The EIFOV is the resolution for which the MTF is 50%, i.e. the spatial resolution at which the sensor system still delivers 50% MTF for target detection/identification.
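To make the 50%-MTF definition concrete, here is a sketch that assumes a Gaussian system MTF (our modelling assumption, not stated in the text) and solves for the EIFOV:

```python
import math

def eifov_from_gaussian_mtf(sigma_m):
    """Assume MTF(k) = exp(-2 * pi^2 * sigma^2 * k^2) for a Gaussian blur of
    std-dev sigma (metres on ground); find k50 where MTF = 0.5 and report
    EIFOV = 1 / (2 * k50), the half-period at that spatial frequency."""
    k50 = math.sqrt(math.log(2.0) / (2.0 * math.pi ** 2)) / sigma_m
    return 1.0 / (2.0 * k50)

print(f"EIFOV = {eifov_from_gaussian_mtf(3.0):.1f} m")  # ~8 m for a 3 m blur
```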
GSD • Ground Sampling Distance: • The technology is capable of sampling at a chosen along-track ground distance by controlling/measuring the platform velocity. • This means that data can be generated by sampling at specified ground distances smaller than the IGFOV, as the sketch below illustrates.
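A minimal sketch of the idealized along-track relation; the ground speed and sampling interval are hypothetical values:

```python
def along_track_gsd_m(ground_speed_m_s, sampling_interval_s):
    """Along-track GSD = ground-track speed * sampling interval (idealized)."""
    return ground_speed_m_s * sampling_interval_s

# Hypothetical LEO values: ~6.8 km/s ground-track speed, one sample every 0.4 ms
print(f"GSD = {along_track_gsd_m(6800.0, 0.4e-3):.2f} m")  # 2.72 m
```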
For achieving better image quality with a high spatial resolution sensor, the along-track radiometry is improved by reducing the smear effect, while the across-track IGFOV remains the same. • Having discussed all these interrelated terms, the question arises: what is the optimum spatial resolution? • There is no unique answer.
The spatial resolution at which data are acquired has the following two effects. • The ability to identify various features, • which relates to classification accuracy. • The ability to quantify their extent, • which relates to the accuracy of measurements. • An important aspect for classification accuracy is the contribution of boundary pixels, which cover the boundary of two features, so that the radiance measurement is a combination of the two features.
Boundary-pixel error is reduced with improved resolution, but at the cost of increased scene noise. Scene noise is the variability in reflectance within a scene even when it contains similar objects (mainly due to leaf orientations and backgrounds). Clearly, the accuracy of an area measurement will depend on the accuracy of locating the boundary, which is possible only to within a fraction of a pixel. The larger the pixel size, the larger the error in area estimation, and the percentage error is greater for features with small area, as the rough sketch below indicates.
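A back-of-envelope sketch (ours, not from the text) of how the share of boundary pixels grows with pixel size for a square feature:

```python
def boundary_pixel_fraction(side_m, pixel_m):
    """Rough fraction of a square feature's pixels that straddle its boundary:
    perimeter pixels / total pixels ~ 4 * (s/p) / (s/p)^2 = 4 * p / s."""
    return 4.0 * pixel_m / side_m

for side in (100.0, 1000.0):   # feature side length, metres
    for pixel in (5.8, 23.5):  # PAN-like vs LISS-III-like pixel sizes
        frac = boundary_pixel_fraction(side, pixel)
        print(f"side={side:6.0f} m, pixel={pixel:4.1f} m -> ~{100 * frac:.1f}% boundary pixels")
```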
Spectral Resolution • In multispectral remote sensing, the variation in reflected/emitted radiation with wavelength is used to distinguish various features. • It is difficult to obtain continuous spectral information, • therefore observations are made by sampling a few selected wavelengths. • The wavelength regions of observation are called spectral bands, defined in terms of a central wavelength (λc) and a bandwidth (Δλ).
Selection of Spectral Bands • Spectral band selection depends on • the location of the central wavelength, • the bandwidth, • the total number of bands. • The bandwidth BW is given by the lower (λ1) and upper (λ2) cut-off wavelengths. • The spectral resolution is given by (λ2 − λ1), which describes how finely the wavelength sampling is made (the smaller the Δλ, the higher the spectral resolution).
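A trivial sketch of these two quantities; the 620–680 nm cut-offs are assumed for illustration:

```python
def band_parameters(lambda1_nm, lambda2_nm):
    """Central wavelength and bandwidth from the cut-off wavelengths."""
    centre = (lambda1_nm + lambda2_nm) / 2.0
    bandwidth = lambda2_nm - lambda1_nm
    return centre, bandwidth

centre, bw = band_parameters(620.0, 680.0)  # a red band, values assumed
print(f"centre = {centre:.0f} nm, bandwidth = {bw:.0f} nm")
```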
The selection of bandwidth is a trade-off between the energy to be collected and the spectral shape of the feature to be observed. For land observation in the VNIR, bandwidths of a few tens of nanometers are usually used, while for ocean observation (10–20) nanometers are used.
Location of Spectral Bands • The most important criteria are that • (i) the bands should lie in an atmospheric window, • (ii) away from the absorption bands of atmospheric constituents. • The selected bands should be uncorrelated to the extent possible, since correlated bands give redundant information.
Contd. • Selection of the optimum number of bands is essential and is primarily based on the theme of application; detailed studies have shown that adding the middle-IR band to any other band combination gives improved separability in agricultural classification.
Radiometric Resolution Radiometric resolution is a measure of the capability of the sensor to differentiate the smallest change in spectral reflectance/emittance between various targets. In practical terms it is represented as the noise-equivalent reflectance change, NEΔρ (or noise-equivalent temperature difference, NEΔT).
Radiometric Parameters • The radiometric resolution depends on a number of parameters such as the S/N ratio, the saturation radiance setting and the number of quantization bits. • A larger number of bits increases the dynamic range, so that objects with radiances ranging from ocean to snow can be measured without changing the gain setting.
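A minimal sketch of how the quantization step shrinks with bit depth, assuming a linear quantizer and a hypothetical saturation radiance:

```python
def radiance_step(saturation_radiance, n_bits):
    """Radiance represented by one count for a linear n-bit quantizer."""
    return saturation_radiance / (2 ** n_bits - 1)

# Hypothetical saturation radiance of 50 mW/cm^2/sr/um
for bits in (7, 10, 12):
    print(f"{bits:2d} bits -> step = {radiance_step(50.0, bits):.4f} mW/cm^2/sr/um")
```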
Radiometric Quality The radiometric quality of the image depends primarily on the radiometric resolution, the calibration accuracy and the modulation transfer function. Resolution, in general, is the minimum difference between two discrete values that can be distinguished by a measuring device. However, high resolution does not mean high accuracy. Accuracy is a measure of how close the measurement is to the true value.
Absolute Accuracy • There are two kinds of radiometric accuracy: • Absolute accuracy • How close the sensor-measured data (radiance in mW/cm²/sr/μm, or temperature in degrees) • is to a primary standard of radiance is the measure of absolute accuracy.
Relative Accuracy This refers to the relative accuracy among bands with respect to a primary standard. E.g., in the four bands of the MSS camera, although each band does not represent the radiance value accurately, their ratios with reference to one band are the same as the true-value ratios. For digital classification, the relative differences of radiance values among bands are more important than the absolute radiance values.
Radiometric Errors • Radiometric errors are introduced by the MTF of the camera system. The MTF is a measure of the contrast in the image plane relative to the object plane. • The radiance measured by the sensor is affected by two factors: (i) signals from adjacent pixels spill over, known as the adjacency effect; • (ii) atmospheric effects, which corrupt the actual reflectance reaching the sensor. • Radiometric errors generally lead to poor classification accuracy.
Temporal Resolution The main advantage of satellite remote sensing is its ability to repeatedly observe a scene at regular intervals. Temporal resolution refers to the frequency with which a given scene can be imaged, usually expressed in days; it is also called repetivity. Repetivity depends on the orbit characteristics and the sensor swath: the larger the swath, the higher the temporal resolution. SeaWiFS and OceanSat, for example, have a repetivity of 2 days; the rough sketch below shows why a wide swath yields such frequent revisits.
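A back-of-envelope estimate (ours, not from the text): the number of days needed for adjacent equatorial swaths to tile the equator, ignoring overlap, off-nadir viewing and latitude effects; the swath widths and orbits-per-day figure are assumed values:

```python
def rough_revisit_days(swath_km, orbits_per_day=14.2, equator_km=40075.0):
    """Days for adjacent swaths to cover the equator once (idealized)."""
    return equator_km / (swath_km * orbits_per_day)

print(f"wide-swath ocean sensor (1420 km): {rough_revisit_days(1420.0):.1f} days")  # ~2 days
print(f"narrow-swath land sensor (140 km): {rough_revisit_days(140.0):.0f} days")   # ~20 days
```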