

Imaging Science for Archivists - 101. Don Williams, Image Science Associates, d.williams@imagescienceassociates.com (Terms of Use).





Presentation Transcript


  1. Imaging Science for Archivists - 101. Don Williams - Image Science Associates - d.williams@imagescienceassociates.com (Terms of Use). Today’s standards for characterizing imaging performance and capability are based on an image science architecture. I begin by introducing this perspective and then describe its application to scanner and digital camera performance in an archiving environment. The standards and accompanying tools will help the reader understand tone and color response, resolution, and noise. “Anecdotal thinking comes naturally; science requires training.” - Michael Shermer

  2. Table of Contents • What You’ll See • General Imaging Steps • Digital Capture Systems • Image Processing • Measurement Requirements • OECF • Spatial Resolution • SFR • Noise • Artifacts

  3. Introduction / Background

  4. What you’ll see • Introduce imaging science concepts • Signal and noise • Quality, performance, capability • The imaging chain • Key imaging performance metrics: OECF, Spatial Frequency Response, noise, artifacts • Software execution, results interpretation… and corrective action?

  5. Important Imaging Characteristics - Primary Imaging Performance Functions. A number of these categories have ISO standards that define the metrology practice. Though intended for digital imaging devices, their basis was derived from decades of analog (e.g., film) imaging experience. • Signal - any response that provides valued information: large-area response to light (Opto-Electronic Conversion Function, OECF); spatial proximity behavior (Spatial Frequency Response, SFR, or MTF). • Noise - any response that detracts from a desired signal: light intensity distortions (total noise); geometric/spatial distortions.

  6. What is an image… and how is it characterized? A two-dimensional spatial structure (x, y) of varying light levels and colors. It is characterized by measuring physically realizable light intensities over a two-dimensional space. These variations can occur over short distances, like edges (high frequencies), or over larger distances or areas, like sky or facial features (low frequencies).

  7. The Big Picture - General Imaging Steps. Acquire → Process → Interpret and/or Display. CAPTURE: encode - converting light to numbers. DISPLAY: decode - converting numbers to light. “The Digital Image” is an encoded proxy image. Imaging performance metrics indicate how an imaging system or component acts on, modifies, or limits the effective optical characteristics of an input scene. Once digitally captured, the image ceases to exist as light intensities; it is now encoded as an N-bit (M-channel) digital file. Because the encoding is easily manipulated, ISO imaging capture performance metrics attempt to trace the data back to the original scene input intensities (input-referred) or expected output intensities (output-referred).

  8. What is a Digital Imaging System? (for capture) A collection of optical, software, or electronic functions that convert, encode, or otherwise act upon images or their optical or digital derivatives. The chain runs from the illumination optics and source, to the sample, image-forming optics, and detector, through processing, to the digital image file. The performance of a digital capture system is influenced by all of the above in addition to operator training and environment.

  9. How are the optical properties of objects characterized? - Reflectance and Transmission - For reflective or transmissive hardcopy under 1000 units of incident illumination: 1000 units reflected / 1000 units incident = 1.000 reflectance = 0.0 density; 100 units reflected / 1000 units incident = 0.100 reflectance = 1.0 density; 10 units reflected / 1000 units incident = 0.010 reflectance = 2.0 density; 1 unit reflected / 1000 units incident = 0.001 reflectance = 3.0 density. Density = -log10(reflectance). *In practice, it is often more convenient to specify in terms of density, especially for densities greater than 1.0.
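
The density/reflectance relationship above is easy to compute directly. A minimal Python sketch (function names are illustrative, not from the presentation):

```python
import math

def density_from_reflectance(reflectance):
    """Optical density from reflectance: D = -log10(R)."""
    return -math.log10(reflectance)

def reflectance_from_density(density):
    """Reflectance from optical density: R = 10 ** (-D)."""
    return 10 ** (-density)

# The four worked examples from the slide, assuming 1000 units of incident illumination.
for units_reflected in (1000, 100, 10, 1):
    r = units_reflected / 1000.0
    d = density_from_reflectance(r)
    # "+ 0.0" only normalizes -0.0 to 0.0 for display
    print(f"{units_reflected:>4} units reflected -> reflectance {r:.3f}, density {d + 0.0:.1f}")
```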

  10. Know your Collection - Typical density ranges for collection content - (chart of density ranges for B&W photographic paper, B&W photographic film, Kodachrome film, color photographic paper, the Q-13 target, Munsell papers, motion picture print film, and non-photographic reflective material).

  11. Know your Collection - Targets: object surrogates • Intended for both calibration and performance testing • Should be well characterized, and durable to light and handling • Got software? • How many of these colors are in your collection?

  12. Types of Digital Capture Systems - Scanners vs. Cameras - Fully integrated (all capture components combined into a single plug-and-play scanner unit): flatbed scanners - Epson ($), Creo IQSmart3 ($$$); copy stands - Zeutschel, I2S products, Stokes; special purpose - Kirtas, Treventus, and Qidenus. D-I-Y copy stands (i.e., camera-on-a-stick): camera backs - Betterlight, Hasselblad, Sinar; digital SLRs - Canon EOS, Nikon.

  13. Popular Detector Arrangements • Linear arrays - one-dimensional ordered arrangement of single detectors • Area arrays - two-dimensional ordered arrangement of single detectors • Step-and-repeat (macro or micro) area strategy - capture and combine several area captures into a single large image

  14. Technology - Detectors: Linear Arrays (tri-linear) • Three filtered (Red, Green, Blue) rows of sensors • The sensor stares at the object in the row dimension and scans past the object in the other direction; sometimes called a pushbroom • Used in scan-back cameras (e.g., BetterLight, PhaseOne FX, Seitz), flatbed scanners, and most film scanners • Frequently have different performance behaviors in the two directions. From: The Focal Encyclopedia of Photography

  15. Technology - Detectors: 2-D Area Arrays - Color Filter Array (CFA) pattern - • Three-filter mosaic pattern (Red, Green, Blue) of sensors. Sometimes other patterns are used, but the “Bayer” pattern (shown here) is by far the most popular; used in virtually all consumer and professional digital cameras • One-shot/one-layer, sparsely populated color capture. Fully populated color is achieved by interpolation algorithms (demosaicing) or micro-stepping • Color filters are integrated onto the sensor chip at manufacturing • Occasionally will show subtle checkerboard artifacts or color-aliasing rainbows in the final delivered image. From: The Focal Encyclopedia of Photography

  16. Image Processing - the brains - Image processing functions will absolutely affect imaging performance measurements. These include: image resizing and rotation; sharpening (unsharp masking) and blurring; tone/color adjustments; color profile changes; noise reduction. Two approaches for performance measurement: • Practice - measure the “as delivered” image file consistent with workflow practices • Benchmarking - measure the raw image file with as many functions as possible set to “null”.

  17. Imaging Performance vs. Image Quality. Imaging Performance: objectively measured behaviors of an imaging system in preserving information of the original object, for example tone response, resolution, noise, color error, white balance, light falloff (uniformity). Imaging Capability: imaging performance under optimal conditions (operator, environment, ease of use). Both accuracy and precision are measurement requirements. Image Quality: a task-, appearance-, or use-case-dependent measure, almost always some weighted combination of imaging performance metrics such as sharpness, graininess, colorfulness, and naturalness. Examples: aerial reconnaissance - high resolution and low noise; health imaging - tone control, low noise, resolution dependent on task; document imaging - OCR accuracy; consumer imaging - memory-color saturation, moderate resolution.

  18. Measurement Requirements. Aim - the point or set of points intended to be hit. Measurements usually require some level of both accuracy and precision. • Accuracy: average error from an aim • Precision: variability about the average reading. Factors that influence measurements: location on platen, image processing, spatial sampling, image noise, environment, operator skill. Performance is more about consistency (precision) than accuracy. In imaging, accuracy is often not absolute but rather a preference. (A minimal numeric sketch of these two statistics follows below.)
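
A minimal sketch, not from the presentation, of how accuracy and precision could be computed for repeated readings of the same target against an aim value; the names and data are illustrative:

```python
import statistics

def accuracy_and_precision(measurements, aim):
    """Accuracy: mean error from the aim. Precision: variability (std dev) about the mean."""
    mean_value = statistics.mean(measurements)
    accuracy = mean_value - aim                 # average error from the aim
    precision = statistics.stdev(measurements)  # spread about the average reading
    return accuracy, precision

# Example: repeated density readings of the same patch, aim density 1.00
readings = [1.02, 0.99, 1.01, 1.03, 1.00]
acc, prec = accuracy_and_precision(readings, aim=1.00)
print(f"accuracy (mean error): {acc:+.3f}, precision (std dev): {prec:.3f}")
```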

  19. N bits - M channels? - eh? - Light intensities (i.e., the analog signal) reaching the detector are assigned one of 2^N digital count values for each of M color channels. For grayscale and high-quality color imaging, N = 14-16 bits and M = 3 color (RGB) channels. This is referred to as N-bit quantization. 2^8 = 256 values (0-255); 2^16 = 65536 values (0-65535). High light intensity = high reflectance = low density → high count value. Low light intensity = low reflectance = high density → low count value.
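
As a rough illustration of N-bit quantization, the sketch below assumes a simple linear mapping from normalized intensity to count value (real devices apply an OECF, so this is illustrative only):

```python
def quantize(intensity, n_bits=16, max_intensity=1.0):
    """Map a normalized light intensity (0.0-1.0) to one of 2**n_bits count values."""
    levels = 2 ** n_bits
    count = round((intensity / max_intensity) * (levels - 1))
    return max(0, min(levels - 1, count))  # clip to the valid code-value range

print(quantize(1.0, n_bits=8))    # 255   : high intensity -> high count value
print(quantize(0.001, n_bits=8))  # 0     : low intensity  -> low count value
print(quantize(0.5, n_bits=16))   # 32768 : roughly mid-scale at 16 bits
```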

  20. N bits - M channels? - utility - Raw image files are typically delivered at 16 bits/channel, or a total of 48-bit color: 2^16 = 65536 levels for each of three colors. These files are often maintained as digital or safety masters because they allow maximum future utility. More commonly, a scanner’s or camera’s software will use the higher number of raw bits to provide a “finished file” with a smaller number of bits consistent with a user’s specific application; future utility of this finished file is compromised, though. 2^1 = 2 levels (0-1): one-bit or bi-tonal image. 2^8 = 256 levels (0-255): 8-bit or grayscale image. 2^8 = 256 levels for each of three colors: true or 24-bit color.

  21. - Image Speak -

  22. Vocabulary - staying dry in a storm of vernacular idioms - Acutance, geometric distortion, lateral color error, white balance, wobble, depth of field, sharpness, delta E, shading, exposure, gamma, sharpening, aliasing, dynamic range, noise, flare, resolution, signal.

  23. Imaging Performance Framework: http://www.digitizationguidelines.gov/stillimages/documents/imaging.html

  24. SIGNAL - Any response that provides valued information. ISO 22028-1

  25. SIGNAL - Any response that provides valued information

  26. NOISE - Any response that detracts from a desired signal

  27. NOISE - Any response that detracts from a desired signal

  28. Opto-Electronic Conversion Function (OECF), ISO 14524, aka Tone Transfer Curve or Tone Reproduction Curve (TRC)

  29. How is the OECF created? (1) Scan a grayscale step target with known optical reflectances*. (2) Compute the average count value for each patch from the digital image file of the scanned grayscale target. (3) Build a table of reflectance / average-count-value data pairs for each color channel (the slide shows such a table: red, green, and blue average count values listed against the neutral optical reflectance of each patch). *Most grayscale targets come with documentation on the reflectances/densities of each patch. Over time, these may fade or change, though; a densitometer can be used to verify any change.
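
A minimal sketch of steps (2) and (3), assuming NumPy, an RGB image array of the scanned target, and a user-supplied list of patch regions; the function and argument names are illustrative:

```python
import numpy as np

def oecf_table(image, patch_rois, reflectances):
    """Pair each patch's known reflectance with its average count value per channel.

    image       : H x W x 3 array of count values (the scanned grayscale target)
    patch_rois  : list of (row0, row1, col0, col1) regions, one per gray patch
    reflectances: list of known optical reflectances, one per patch
    """
    table = []
    for (r0, r1, c0, c1), reflectance in zip(patch_rois, reflectances):
        patch = image[r0:r1, c0:c1, :]
        avg_rgb = patch.reshape(-1, 3).mean(axis=0)  # average count value per channel
        table.append((reflectance, *avg_rgb))
    return table  # rows of (reflectance, avg_R, avg_G, avg_B)
```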

  30. Graphical Representation of the OECF (count value plotted against reflectance and against density). Using density is more revealing since the data values tend to be more evenly distributed.

  31. Standards, Guidelines, Exposure - efficiently assigning density to count values - If most measured data values are above a selected OECF aim curve, over-exposure is indicated; conversely, if most measured data values are below a selected OECF aim curve, under-exposure is indicated.
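
A hedged sketch of that exposure check: compare measured count values against an aim OECF evaluated at the same patch densities (the aim_curve callable is assumed to be supplied by the user, e.g., from a published guideline):

```python
def exposure_check(measured_counts, densities, aim_curve):
    """Flag likely over- or under-exposure by comparing measured count values
    to an aim OECF (aim_curve maps density -> expected count value)."""
    diffs = [measured - aim_curve(d) for measured, d in zip(measured_counts, densities)]
    above = sum(1 for d in diffs if d > 0)
    below = len(diffs) - above
    if above > below:
        return "most values above aim curve: over-exposure indicated"
    if below > above:
        return "most values below aim curve: under-exposure indicated"
    return "measured values straddle the aim curve"
```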

  32. How do software driver settings affect the OECF? • Shadow and Highlight settings provide upper and lower boundaries for controlling the darkest and lightest light values encoded in the digital file. Some of these behave very much like the “Levels” function in Photoshop by allowing control of the boundary values of the OECF. • Auto-contrast features are usually proprietary algorithms that provide custom OECFs through internal gamma, shadow, and highlight settings. Because they do this on an image-by-image basis, inconsistent OECF behaviors occur within a collection. • A higher gamma setting at capture will increase the brightness of a displayed image… but do not assume that a given gamma setting will yield the same OECF behavior for all scanners or software drivers.

  33. White Balance / Neutrality - keep all of the neutrals neutral - OECFs can be measured for each color channel using a target’s neutral gray patches. 85% of good color imaging performance is keeping the Red, Green, and Blue OECFs the same… really! This is a good example of a well white-balanced capture; note that all color channel OECFs lie on top of each other.

  34. White Balance / Neutrality-Keep all of the Neutrals neutral - This is an example where the white balance performance is poor. Note how the blue channel OECF departs from the red and green OECF

  35. White Balance Assessment using Photoshop Histograms. Though the gray patches appear gray, one patch is actually slightly colored. All three histograms should align if white balance is correct.
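
The same check can be done numerically rather than by eye. A minimal sketch, assuming NumPy and a cropped patch that should be neutral (the tolerance value is an illustrative choice, not a standard):

```python
import numpy as np

def neutral_spread(patch):
    """Max difference between the R, G, B mean count values of a nominally gray patch.

    patch: H x W x 3 array of count values cropped from the scanned target.
    """
    means = patch.reshape(-1, 3).mean(axis=0)
    return means.max() - means.min(), means

# Example: flag the patch if the channel means differ by more than ~3 count values
spread, means = neutral_spread(np.random.randint(118, 122, size=(50, 50, 3)))
print(spread, means, "neutral" if spread <= 3.0 else "color cast suspected")
```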

  36. A few words onColor Encoding Error ( ∆E)- erroneously called color accuracy - • Digital cameras/ scanners do not reproduce colors; they encode them. • Color accuracy implies that two physically realizable colors are at hand to compare. • Assumptions on how RGB code values will ultimately be interpreted ( i.e. rendered or decoded) as physically reproduced color are often very wrong. • Delta E ( i.e. color error ) measures for digital capture devices are suitable for performance consistency monitoring but not as absolute measures of color accuracy
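
For context, a sketch of the simplest color-difference formula (CIE76 ΔE*ab, the Euclidean distance in CIELAB); the presentation does not say which ΔE variant its tools use, so this is illustrative only:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Example: reference patch Lab value vs. the value recovered from the encoded file
print(delta_e_cie76((52.0, 10.0, -8.0), (50.0, 12.0, -6.0)))  # ~3.46
```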

  37. Example ∆E Analysis for Adobe RGB encoding Summary measures for color encoding error

  38. OECF - Summary and Recommendations • There is no single best or standardized OECF. Most are based on policy-maker or user preferences for the “look” of the image with respect to a particular use case. • Generally, any gamma setting between 1.5 and 2.5 is fine for capturing all significant image information for common reflection media densities. • The important practice is to document what the OECF is, for ease of future image reproduction and rendering. • 85% of good color management relies on keeping the neutrals neutral. Do not assume that individual color channel OECFs for a grayscale target will be identical. By documenting the OECF, though, any mismatch can usually be corrected (called “corrective action”). • Leave a sufficient count-value (CV) “buffer” in the OECF for future tone and spatial image manipulations: Dmax ≈ 15 CV, Dmin ≈ 240 CV. • An alternative to documenting the OECF is to include an image of a grayscale target in each digital image file and maintain it as part of the image file. Be sure the optical densities of each patch are somehow indicated.

  39. Spatial Resolution / Spatial Frequency Response (SFR) - ISO 16067-1, 16067-2, 12233, 15529

  40. Recall the context…

  41. What is Spatial Resolution… and what are all those related terms? • The ability of an imaging component or system to distinguish finely spaced detail; specifically, the ability to maintain the relative contrast of finely spaced detail • The highest frequency (smallest distance) at which light and dark parts of an image are reliably distinguishable; sometimes called limiting resolution. Related metrics: number of pixels; sampling frequency (dpi, ppi); *limiting resolution / resolving power; *Spatial Frequency Response (SFR); *Modulation Transfer Function (MTF). *Related standards - ISO 16067-1, 16067-2, 12233, and 15529 all employ these metrics.

  42. Space and Frequency - two ways to look at image details. Space: size of the smallest important (signal) feature (mm). Frequency: how many small important features per unit area the image will store (number/mm, cycles/mm, dots/inch, pixels/inch). Small size implies high frequency. Why use space vs. frequency descriptions? They are compatible with engineering descriptions of information and bandwidth, simplify some forms of system analysis, and are compatible with several visual image quality descriptions.

  43. Spatial Resolution. Two factors in a digital imaging system dictate the level of spatial resolution or detail that can ultimately be captured. • Quantity: sampling frequency (i.e., samples per mm or inch) sets a capability limit that can be achieved. • Quality: effective optical quality defines the level of optical “blur” that the imaging optics, environmental factors, hardware, and image processing impose on the captured image. These are necessary but, by themselves, insufficient components for capturing spatial detail. Simply improving one to reconcile deficiencies in the other will not enhance real resolution performance.

  44. Sampling Frequency… Spot Size… and Resolution. Modulation ≡ (peak-to-peak difference) / (peak-to-peak sum). For the cross section of two spots just resolved according to the Rayleigh criterion (c. 1879): modulation = (1.0 - 0.81) / (1.0 + 0.81) ≈ 0.10. (The slide illustrates spots that are clearly resolved, just resolved, and not resolved.)
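
A minimal sketch of that modulation calculation, assuming the peak and trough intensities of the cross section are already known:

```python
def modulation(peak, trough):
    """Modulation (Michelson contrast): (max - min) / (max + min)."""
    return (peak - trough) / (peak + trough)

# Rayleigh criterion example from the slide: peaks at 1.0, dip between them at 0.81
print(round(modulation(1.0, 0.81), 2))  # ~0.10
```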

  45. Limiting Resolution - …so how does one measure it? Traditional method: bar targets and visual evaluation of the imaged target at increasing spatial frequency (lp/mm or cycles/mm). Limiting resolution = that spatial frequency at which finely spaced features are no longer visually detectable (“yes, the features are readable” vs. “no, the features are illegible”). PROS: simple to measure; easily understood; suitable for binary scanners. CONS: relies on subjective human evaluation; results are target-contrast dependent; provides little diagnostic or image-quality insight; purely an ON/OFF (threshold) criterion.

  46. Sampling Frequency is not Resolution. Resolution wedge test feature example with an increasing number of lines per inch (line widths of 1/200", 1/300", 1/400"). DPI (PPI) → quantity of data. Resolution → quality of data (information).

  47. Sampling rate (DPI) is not Resolution - real data example: results are not simulated. Resolution limit = the sampling frequency at which all five lines are no longer detectable. Though the sampling rate is increased to 600 dpi, the true resolution for this scanner is only 300 dpi.

  48. Spatial Frequency Response* (SFR) - the better alternative. Modulation Transfer = (target feature modulation after being imaged) / (target feature modulation before being imaged). Instead of using a subjective YES/NO criterion along the vertical axis, an objective contrast (i.e., modulation) transfer ratio is adopted. The SFR is an indicator of modulation (i.e., contrast) loss as a function of spatial frequency (lp/mm or cycles/mm). Limiting resolution will generally correspond to the spatial frequency having a 0.10 modulation transfer value (a minimal sketch of this calculation follows below). *Also referred to as the Modulation Transfer Function (MTF).
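
A hedged sketch of the modulation-transfer idea and the 0.10 limiting-resolution rule; the data and function names are illustrative, not from an ISO measurement tool:

```python
def modulation_transfer(output_modulation, input_modulation):
    """Ratio of a feature's modulation after imaging to its modulation before imaging."""
    return output_modulation / input_modulation

def limiting_resolution(frequencies, mtf_values, threshold=0.10):
    """First spatial frequency at which the SFR/MTF drops to or below the threshold."""
    for freq, mtf in zip(frequencies, mtf_values):
        if mtf <= threshold:
            return freq
    return None  # threshold never reached over the measured range

# Illustrative data: frequencies in cycles/mm and measured SFR values
freqs = [1, 2, 4, 6, 8, 10]
sfr = [0.95, 0.85, 0.55, 0.30, 0.12, 0.08]
print(limiting_resolution(freqs, sfr))  # 10
```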

  49. How limiting resolution fails to predict quality & why the SFR is better for doing so.

  50. Anatomy of the SFR - regions of behavior - Sharpening operators (over-sharpening, halo-ing); aliasing artifacts (jaggies, moiré); limiting resolution; flare (haze); acutance and sharpness.
