
Dr. John R. Jensen Department of Geography University of South Carolina Columbia, SC 29208



Presentation Transcript


  1. Digital Image Processing Hardware and System Considerations Dr. John R. Jensen Department of Geography University of South Carolina Columbia, SC 29208 Jensen, 2004

  2. Image Processing System Considerations • Digital remote sensor data are analyzed using a digital image processing system that consists of computer hardware and special-purpose image processing software. This lecture describes: • fundamental digital image processing system hardware characteristics, • digital image processing software (computer program) requirements, and • public and commercial sources of digital image processing hardware and software. Jensen, 2004

  3. Image Processing System Considerations • A digital image processing system should: • have a reasonable learning curve and be easy to use, • have a reputation for producing accurate results (ideally the company has ISO certification), • produce the desired results in an appropriate format (e.g., map products in a standard cartographic data structure compatible with most GIS), and • be within the department’s budget. Jensen, 2004

  4. Computer Systems and Peripheral Devices in A Typical Digital Image Processing Laboratory Jensen, 2004

  5. Image Processing System Hardware/Software Considerations • Number and speed of Central Processing Unit(s) (CPU) • Operating system (e.g., Microsoft Windows, UNIX, Linux, Macintosh) • Amount of random access memory (RAM) • Number of image analysts that can use the system at one time and mode of operation (e.g., interactive or batch) • Serial or parallel image processing • Arithmetic coprocessor or array processor • Software compiler(s) (e.g., C++, Visual Basic, Java) Jensen, 2004

  6. Computer Systems and Peripheral Devices in A Typical Digital Image Processing Laboratory Jensen, 2004

  7. Image Processing System Hardware/Software Considerations • Type of mass storage (e.g., hard disk, CD-ROM, DVD) and amount (e.g., gigabytes) • Monitor display spatial resolution (e.g., 1024 × 768 pixels) • Monitor color resolution (e.g., 24-bits of image processing video memory yields 16.7 million displayable colors) • Input devices (e.g., optical-mechanical drum or flatbed scanners, area array digitizers) • Output devices (e.g., CD-ROM, CD-RW, DVD-RW, film-writers, line plotters, dye sublimation printers) • Networks (e.g., local area, wide area, Internet) Jensen, 2004

  8. Central Processing Unit • The central processing unit (CPU) is the computing part of the computer. It consists of a control unit and an arithmetic logic unit. The CPU: • performs numerical integer and/or floating-point calculations, and • directs input and output to and from mass storage devices, color monitors, digitizers, plotters, etc. Jensen, 2004

  9. Central Processing Unit • A CPU's efficiency is often measured in terms of how many millions-of-instructions-per-second (MIPS) it can process, e.g., 500 MIPS. • It is also customary to describe a CPU in terms of the number of cycles it can process in 1 second, measured in megahertz, e.g., 1000 MHz (1 GHz). • Manufacturers market computers with CPUs faster than 4 GHz, and this speed will increase. The system bus connects the CPU with the main memory, managing data transfer and instructions between the two. Therefore, another important consideration when purchasing a computer is bus speed.

  10. Moore’s Law • In 1965, Gordon Moore was preparing a speech and made an observation. He realized that each new computer CPU contained roughly twice as much capacity as its predecessor, and each CPU was released within 18 to 24 months of the previous chip. If this trend continued, he reasoned, computing power would rise exponentially over relatively brief periods of time. Moore’s law described a trend that has continued and is still remarkably accurate. It is the basis for many planners’ performance forecasts. MIPS has also increased exponentially.
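
The doubling trend described above can be sketched with a few lines of Python; the starting transistor count and time span below are illustrative, not historical data:

```python
# A minimal sketch of the trend Moore described: capacity roughly
# doubles every 18 to 24 months, i.e. it grows exponentially with time.
def projected_capacity(start_transistors, years, months_per_doubling=24):
    """Project transistor count after `years`, doubling every
    `months_per_doubling` months (both values are assumptions)."""
    doublings = (years * 12) / months_per_doubling
    return start_transistors * 2 ** doublings

# e.g., 2,300 transistors (roughly the Intel 4004 era) projected 20 years out
print(projected_capacity(2_300, 20))  # exponential, not linear, growth
```

With a 24-month doubling period, twenty years yields ten doublings, a factor of 1024, which is why a linear forecast badly underestimates the trend.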

  11. History of Intel Microprocessors Jensen, 2004

  12. Busicom Calculator; the Intel 4004 microprocessor used in the Busicom Calculator; the Intel Pentium 4

  13. Image Processing System Considerations Type of Computer: • Personal Computers (32- to 64-bit CPU) • Workstations (> 64-bit CPU) • Mainframes (> 64-bit CPU) Jensen, 2004

  14. Type of Computer Personal computers (16- to 64-bit CPUs) are the workhorses of digital image processing and GIS analysis. Personal computers are based on microprocessor technology where the entire CPU is placed on a single chip. These inexpensive complex-instruction-set-computers (CISC) generally have CPUs with 32- to 64-bit registers (word size) that can compute integer arithmetic expressions at greater clock speeds and process significantly more MIPS than their 1980s – 1990s 8-bit predecessors. The 32-bit CPUs can process four 8-bit bytes at a time and 64-bit CPUs can process eight bytes at a time. Jensen, 2004
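
The byte-packing point above can be illustrated in Python; the pixel brightness values below are made up:

```python
# A 32-bit register spans four 8-bit pixels at once; a 64-bit register
# spans eight. The bytes here stand in for 8-bit image brightness values.
pixels = bytes([10, 20, 30, 40, 50, 60, 70, 80])  # eight 8-bit brightness values

word32 = int.from_bytes(pixels[:4], "big")  # four bytes fill one 32-bit word
word64 = int.from_bytes(pixels, "big")      # all eight fill one 64-bit word

assert word32 < 2 ** 32 and word64 < 2 ** 64
print(f"32-bit word: {word32:#010x}, 64-bit word: {word64:#018x}")
```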

  15. Computer Systems and Peripheral Devices in A Typical Digital Image Processing Laboratory Jensen, 2004

  16. Type of Computer Workstations usually consist of a >64-bit reduced-instruction-set-computer (RISC) CPU that can address more random access memory than personal computers. The RISC chip is typically faster than the traditional CISC chip. Application software and hardware maintenance costs for RISC workstations are usually higher than for personal computer-based image processing systems. The most common workstation operating systems are UNIX and various Microsoft Windows products. Jensen, 2004

  17. Type of Computer Mainframe computers (>64-bit CPU) perform calculations more rapidly than PCs or workstations and are able to support hundreds of users simultaneously, especially parallel mainframe computers such as a CRAY. This makes mainframes ideal for intensive, CPU-dependent tasks (e.g., image rectification, mosaicking, filtering, classification, hyperspectral image analysis, and GIS modeling). If desired, the output from intensive mainframe processing can be passed to a workstation or personal computer for subsequent less intensive, inexpensive processing. Mainframe computer systems are expensive to purchase and maintain, and mainframe applications software is more expensive. Jensen, 2004

  18. Operating System The operating system is the first program loaded into random access memory (RAM) when the computer is turned on. It controls the computer’s higher-order functions. The operating system kernel resides in memory at all times. The operating system provides the user interface and controls multitasking. It handles the input and output to the hard disk and all peripheral devices such as compact disks, scanners, printers, plotters, and color displays. All digital image processing application programs must communicate with the operating system. The operating system sets the protocols for the application programs that are executed by it. Jensen, 2004

  19. Operating Systems • The difference between a single-user operating system and a network operating system is the latter’s multi-user capability. • Microsoft Windows XP (home edition) and the Macintosh OS are single-user operating systems designed for one person at a desktop computer working independently. • Various Microsoft Windows, UNIX, and Linux network operating systems are designed to manage multiple user requests at the same time and complex network security. Jensen, 2004

  20. Read Only Memory and Random Access Memory Read-only memory (ROM) retains information even after the computer is shut down. (The related battery-backed memory that stores system configuration settings requires a battery that must be replaced occasionally.) Most computers have sufficient ROM for digital image processing applications; therefore, it is not a serious consideration. Random access memory (RAM) is the computer’s primary temporary workspace. It requires power to maintain its content. Therefore, all of the information that is temporarily placed in RAM while the CPU is performing digital image processing must be saved to a hard disk (or other media such as a CD) before turning the computer off. Jensen, 2004

  21. Read Only Memory and Random Access Memory • Computers should have sufficient RAM for the operating system, image processing applications software, and any remote sensor data that must be held in temporary memory while calculations are performed. Computers with 64-bit CPUs can address more RAM than 32-bit machines. • It seems that one can never have too much RAM for image processing applications. RAM prices continue to decline while RAM speed continues to increase. Jensen, 2004
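
As a rough illustration of the RAM sizing point above, the memory footprint of a scene held entirely in RAM can be computed directly; the 6000 × 6000 pixel, 7-band, 8-bit scene below is an assumed example, not a specific sensor product:

```python
# Back-of-the-envelope RAM estimate: rows x columns x bands x bytes-per-pixel.
def scene_bytes(rows, cols, bands, bytes_per_pixel=1):
    """Bytes needed to hold every band of a scene in memory at once."""
    return rows * cols * bands * bytes_per_pixel

mb = scene_bytes(6_000, 6_000, 7) / 2 ** 20  # convert bytes to megabytes
print(f"{mb:.0f} MB needed just for the image data")  # ~240 MB
```

Intermediate products (rectified copies, classification output, display buffers) multiply this figure, which is why the slide notes one can never have too much RAM.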

  22. Interactive Graphical User Interface (GUI) One of the best scientific visualization environments for the analysis of remote sensor data occurs when the analyst communicates with the digital image processing system interactively using a point-and-click graphical user interface (GUI). Most sophisticated image processing systems are now configured with a friendly GUI that allows rapid display of images and the selection of important image processing functions. Jensen, 2004

  23. Graphical User Interface • Several effective digital image processing graphical user interfaces include: • ERDAS Imagine’s intuitive point-and-click icons, • Research Systems’ Environment for Visualizing Images (ENVI) hyperspectral data analysis interface, • ER Mapper, • IDRISI, • ESRI ArcGIS Image Analyst, and • Adobe Photoshop. Jensen, 2004

  24. ENVI Interface Jensen, 2004

  25. ENVI Interface Jensen, 2004

  26. ERDAS Interface Jensen, 2004

  27. Interactive versus Batch Processing • Non-interactive, batch processing is of value for time-consuming processes such as image rectification, mosaicking, orthophoto creation, filtering, etc. • Batch processing frees up lab PCs or workstations during peak demand because the jobs can be stored and executed when the computer is idle (e.g., early morning hours). • Batch processing can also be useful during peak hours because it allows the analyst to set up a series of operations that can be executed in sequence without operator intervention. • Digital image processing also can now be performed interactively over the Internet at selected sites. Jensen, 2004

  28. Storage and Archiving Considerations • Digital image processing of remote sensing and related GIS data requires substantial mass storage resources. Mass storage media should have: • rapid access time, • longevity (i.e., last for a long time), and be • inexpensive. Jensen, 2004

  29. Serial and Parallel Image Processing It is possible to obtain PCs, workstations, and mainframe computers that have multiple CPUs that operate concurrently. Specially written parallel processing software can parse (distribute) the remote sensor data to specific CPUs to perform digital image processing. This can be much more efficient than processing the data serially. Jensen, 2004

  30. Serial Versus Parallel Processing • Requires more than one CPU • Requires software that can parse (distribute) the digital image processing to the various CPUs by • - task, and/or • - line, and/or • - column. Jensen, 2004

  31. Serial and Parallel Image Processing Consider performing a per-pixel classification on a 1024 row by 1024 column remote sensing dataset. In the first example, each pixel is classified by passing the spectral data to the CPU and then progressing to the next pixel. This is serial processing. Conversely, suppose that instead of just one CPU we had 1024 CPUs. In this case the class of each of the 1024 pixels in the row could be determined using 1024 separate CPUs. The parallel image processing would classify the line of data about 1024 times faster than would processing it serially. Jensen, 2004

  32. Serial and Parallel Image Processing Each of the 1024 CPUs could also be allocated an entire row of the dataset. Finally, each of the CPUs could be allocated a separate band if desired. For example, if 224 bands of AVIRIS hyperspectral data were available, 224 of the 1024 processors could be allocated to evaluate the 224 brightness values associated with each individual pixel with 800 additional CPUs available for other tasks. Jensen, 2004
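
The row-partitioning idea in the last two slides can be sketched in Python. The toy minimum-distance classifier and class means below are invented for illustration, and Python threads only demonstrate how rows are parsed out to separate workers; a real speedup requires multiple CPUs (e.g., via multiprocessing):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical single-band class means for a toy per-pixel classifier.
MEANS = {0: 40, 1: 200}

def classify_row(row):
    # Assign each pixel to the class whose mean brightness is nearest.
    return [min(MEANS, key=lambda c: abs(bv - MEANS[c])) for bv in row]

image = [[30, 50, 190], [210, 45, 180]]  # tiny 2 x 3 single-band scene

# Serial: one row after another on a single worker.
serial = [classify_row(r) for r in image]

# "Parallel": each row is handed to a separate worker, as in the slide's
# one-CPU-per-row scenario.
with ThreadPoolExecutor(max_workers=2) as pool:
    parallel = list(pool.map(classify_row, image))

assert serial == parallel  # same classification either way
print(parallel)
```

The result is identical either way; only the wall-clock time changes when the workers are genuinely independent CPUs.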

  33. Serial versus Parallel Digital Image Processing to Perform Per-pixel Classification Jensen, 2004

  34. Compiler A computer software compiler translates instructions programmed in a high-level language such as C++ or Visual Basic into machine language that the CPU can understand. A compiler usually generates assembly language first and then translates the assembly language into machine language. The compilers most often used in the development of digital image processing software are C++, Assembler, and Visual Basic. Many digital image processing systems provide a toolkit that programmers can use to compile their own digital image processing algorithms (e.g., ERDAS, ER Mapper, ENVI). The toolkit consists of subroutines that perform very specific tasks such as reading a line of image data into RAM or modifying a color look-up table to change the color of a pixel (RGB) on the screen. Jensen, 2004

  35. Compiler It is often useful for remote sensing analysts to program in one of the high-level languages just listed. Very seldom will a single digital image processing system perform all of the functions needed for a given project. Therefore, the ability to modify existing software or integrate newly developed algorithms with the existing software is important. Jensen, 2004

  36. Rapid Access Mass Storage Digital remote sensor data (and ancillary raster GIS data) are often stored in a matrix band sequential (BSQ) format in which each spectral band of imagery (or GIS data) is stored as an individual file. Each picture element of each band is typically represented in the computer by a single 8-bit byte with values from 0 to 255. The best way to make brightness values rapidly available to the computer is to place the data on a hard disk, CD-ROM, DVD, or DVD-RAM where each pixel of the data matrix may be accessed at random (not serially) and at great speed (e.g., within microseconds). The cost of hard disk, CD-ROM, or DVD storage per gigabyte continues to decline. Jensen, 2004
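
The random-access arithmetic behind the BSQ layout described above can be sketched as follows; the tiny two-band image is synthetic:

```python
# In a band-sequential (BSQ) file each band is a contiguous rows x cols
# block of 8-bit values, so any pixel's byte offset is computed directly
# rather than read serially.
def bsq_offset(band, row, col, rows, cols):
    """Byte offset of pixel (row, col) in `band` for an 8-bit BSQ layout."""
    return band * rows * cols + row * cols + col

rows, cols = 2, 3
# Two bands of a tiny 2 x 3 image, flattened band by band.
data = bytes([1, 2, 3, 4, 5, 6,          # band 0
              10, 20, 30, 40, 50, 60])   # band 1

print(data[bsq_offset(1, 0, 2, rows, cols)])  # band 1, row 0, col 2 -> 30
```

The same arithmetic, applied to a file on disk with `seek()`, is what lets a hard disk or optical disk deliver any pixel within microseconds.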

  37. Rapid Access Mass Storage It is common for digital image processing laboratories to have gigabytes of hard-disk mass storage associated with each workstation. Many image processing labs now use RAID (redundant arrays of inexpensive hard disks) technology in which two or more drives working together provide increased performance and various levels of error recovery and fault tolerance. Other storage media, such as magnetic tapes, are usually too slow for real-time image retrieval, manipulation, and storage because they do not allow random access of data. However, given their large storage capacity, they remain a cost-effective way to archive digital remote sensor data. Jensen, 2004

  38. Rapid Access Mass Storage Companies are now developing new mass storage technologies based on atomic resolution storage (ARS), which holds the promise of storage densities of close to 1 terabit per square inch—the equivalent of nearly 50 DVDs on something the size of a credit card. The technology uses microscopic probes less than one-thousandth the width of a human hair. When the probes are brought near a conducting material, electrons write data on the surface. The same probes can detect and retrieve data and can be used to write over old data. Jensen, 2004

  39. Archiving Considerations and Longevity Storing remote sensor data is no trivial matter. Significant sums of money are spent purchasing remote sensor data by commercial companies, natural resource agencies, and universities. Unfortunately, most of the time not enough attention is given to how the expensive data are stored or archived to protect the long-term investment. The diagram depicts several types of analog and digital remote sensor data mass storage devices and the average time to physical obsolescence, that is, when the media begin to deteriorate and information is lost. Jensen, 2004

  40. Potential Longevity of Remote Sensor Data Storage Media Jensen, 2004

  41. Archiving Considerations and Longevity • Properly exposed, washed, and fixed analog black & white aerial photography negatives have considerable longevity, often more than 100 years. • Color negatives with their respective dye layers have longevity, but not as much as the black-and-white negatives. • Black & white paper prints have greater longevity than color prints (Kodak, 1995). • Hard and floppy magnetic disks have relatively short longevity, often less than 20 years. • Magnetic tape media (e.g., 3/4-in. tape, 8-mm tape) can become unreadable within 10 to 15 years if not rewound and properly stored in a cool, dry environment. Jensen, 2004

  42. Potential Longevity of Remote Sensor Data Storage Media Jensen, 2004

  43. Archiving Considerations and Longevity Optical disks can now be written to, read, and written over again at relatively high speeds and can store much more data than other portable media such as floppy disks. The technology used in rewriteable optical systems is magneto-optics, in which data are recorded magnetically, as on disks and tapes, but the bits are much smaller because a laser is used to etch each bit. The laser heats the bit to 150 °C, at which temperature the bit is realigned when subjected to a magnetic field. To record new data, existing bits must first be set to zero. Jensen, 2004

  44. Archiving Considerations and Longevity Only the optical disk provides relatively long-term storage potential (>100 years). In addition, optical disks store large volumes of data on relatively small media. Advances in optical compact disc (CD) technology promise to increase the storage capacity to >17 GB using new rewriteable digital video disc (DVD) technology. In most remote sensing laboratories, rewritable CD-RWs or DVD-RWs have supplanted tapes as the backup system of choice. DVD drives are backward compatible and can read data from CDs. Jensen, 2004

  45. Archiving Considerations and Longevity • It is important to remember when archiving remote sensor data that sometimes it is the loss of the read-write software and/or the read-write hardware (the drive mechanism and heads) that is the problem, not the digital media itself. • Therefore, as new computers are purchased, it is a good idea to set aside a single computer system that is representative of a certain computer era so that one can always read any data stored on archived mass storage media. Jensen, 2004

  46. Computer Display Spatial and Color Resolution • The display of remote sensor data on a computer screen is one of the most fundamental elements of digital image analysis. Careful selection of the computer display characteristics will provide the optimum visual image analysis environment for the human interpreter. The two most important characteristics are computer: • display spatial resolution, and • color resolution. Jensen, 2004

  47. Computer Screen Display Resolution The image processing system should be able to display at least 1024 rows by 1024 columns on the computer screen at one time. This allows larger geographic areas to be examined and places the terrain of interest in its regional context. Most Earth scientists prefer this regional perspective when performing terrain analysis using remote sensor data. Furthermore, it is disconcerting to have to analyze four 512 × 512 images when a single 1024 × 1024 display provides the information at a glance. An ideal screen display resolution is 1600 × 1200 pixels. Jensen, 2004

  48. Computer Systems and Peripheral Devices in A Typical Digital Image Processing Laboratory Jensen, 2004

  49. Computer Screen Color Resolution The computer screen color resolution is the number of gray-scale tones or colors (e.g., 256) that can be displayed on a CRT monitor at one time out of a palette of available colors (e.g., 16.7 million). For many applications, such as high-contrast black-and-white linework cartography, only 1 bit of color is required [i.e., either the line is black or white (0 or 1)]. For more sophisticated computer graphics for which many shades of gray or color combinations are required, up to 8 bits (or 256 colors) may be required. Most thematic mapping and GIS applications may be performed quite well by systems that display just 64 user-selectable colors out of a palette of 256 colors. Jensen, 2004

  50. Computer Screen Color Resolution The analysis and display of remote sensor image data generally requires much higher CRT screen color resolution than cartographic and GIS applications. For example, most relatively sophisticated digital image processing systems can display a tremendous number of unique colors (e.g., 16.7 million) from a large color palette (e.g., 16.7 million). The primary reason for these color requirements is that image analysts must often display a composite of several images at one time on a CRT. This process is called color compositing. Jensen, 2004
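
The color-resolution figures quoted in the last two slides follow from simple powers of two:

```python
# Bits of video memory per pixel determine how many distinct tones or
# colors a display can show at one time.
def displayable_colors(bits_per_pixel):
    return 2 ** bits_per_pixel

assert displayable_colors(1) == 2            # black-or-white linework
assert displayable_colors(8) == 256          # 256 grays or colors
assert displayable_colors(24) == 16_777_216  # the "16.7 million" palette
print(displayable_colors(24))
```

A 24-bit display allocates 8 bits to each of the red, green, and blue channels, which is what makes three-band color compositing of remote sensor imagery possible without quantization artifacts.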
