This thesis focuses on the identification of tree locations in geographic images using various image processing techniques and virtual reality tools. The purpose is to create a utility for accurately and efficiently placing trees in virtual wildfire simulations.
Identification of Tree Locations in Geographic Images A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Engineering By David Brown Dr. Frederick C. Harris, Jr., Thesis Advisor December, 2008
Committee • Dr. Frederick C. Harris, Jr. • Dr. Sergiu M. Dascalu • Dr. Timothy J. Brown
Overview • Purpose • Background • Methods • Software Specification and Design • Implementation and Results • Conclusion and Future Work
Purpose • To create an item placement utility for VFIRE (Virtual Fire In Realistic Environments) • VFIRE is a virtual reality application for visualizing wildfire simulations. • Current area of interest is Kyle Canyon in Southern Nevada.
Purpose Main uses of VFIRE: • Fire Training • Fire Planning • Fire Model Verification Wildfire Visualization [25]
Purpose • The placement of items in the visualization should correspond to their locations in the real environment. • This utility is intended to place large numbers of trees with reasonable speed and accuracy. • It can also be used to place a small number of houses with reasonable speed and accuracy.
Background – Geographic Images Photography • Standard Color • Panchromatic • Multispectral • Hyperspectral • Color Infrared (CIR)
Background – Photography • CIR images look different from true-color images: the near-infrared band, which healthy vegetation reflects strongly, is mapped to red in the false-color rendering. True Color Image [5, page 45] False Color Image [5, page 45]
Background – Vegetation Maps • Vegetation data can be displayed as a map. • Vegetation maps have been created by LANDFIRE to show various attributes. Map of Vegetation Cover Map of Vegetation Type
Background – Point Operations • Each output pixel is based on a single input pixel. • Changing the brightness of an image is a point operation. Original Image Image After Increasing Brightness
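A minimal sketch of a point operation (an illustration, not the thesis code), assuming 8-bit pixel data and an arbitrary brightness offset:

```python
# Point operation: each output pixel depends only on the corresponding
# input pixel. Here, brightness is increased by a constant offset.
import numpy as np

def increase_brightness(image: np.ndarray, offset: int = 40) -> np.ndarray:
    """Add a constant to every pixel, clipping to the valid 8-bit range."""
    return np.clip(image.astype(np.int16) + offset, 0, 255).astype(np.uint8)
```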
Background – Neighborhood Operations • Blurring an image is a neighborhood operation. • The blur filter is applied to each input neighborhood. Original Image Blurred Image
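A minimal sketch of a neighborhood operation, a 5 × 5 mean blur using SciPy (illustrative only; the window size is an assumption):

```python
# Neighborhood operation: each output pixel is computed from a window of
# input pixels. A uniform (box) filter averages the window, blurring the image.
import numpy as np
from scipy.ndimage import uniform_filter

def box_blur(image: np.ndarray, size: int = 5) -> np.ndarray:
    """Replace each pixel with the mean of its size x size neighborhood."""
    return uniform_filter(image.astype(np.float32), size=size)
```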
Background – Edge Detection • The LoG (Laplacian of Gaussian) Filter is an edge detection filter in which the level of detail can be controlled. Laplacian of Gaussian (LoG) filter [36]
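A hedged sketch of LoG filtering with SciPy; sigma controls the level of detail (a larger sigma suppresses fine edges), and the value shown is only an example:

```python
# Laplacian of Gaussian (LoG) edge detection. Zero crossings of the
# response mark edges; sigma sets the scale of detail that is kept.
import numpy as np
from scipy.ndimage import gaussian_laplace

def log_edges(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Apply a LoG filter at scale sigma and return the filter response."""
    return gaussian_laplace(image.astype(np.float32), sigma=sigma)
```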
Background – Template Matching • Can be used as the first step in image analysis • Used to find the location of a known item • Neighborhood operation where the filter mask is a template of the desired item • Filtering produces a correlation image that can be scanned for bright spots.
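A generic sketch of this idea using scikit-image (not the thesis implementation; the threshold and minimum peak spacing are assumptions):

```python
# Template matching: correlate the image with a template of the desired item,
# then scan the correlation image for bright spots (local peaks).
import numpy as np
from skimage.feature import match_template, peak_local_max

def find_candidates(image: np.ndarray, template: np.ndarray,
                    threshold: float = 0.6, min_distance: int = 5) -> np.ndarray:
    """Return (row, col) coordinates where the template correlates strongly."""
    correlation = match_template(image, template, pad_input=True)
    return peak_local_max(correlation, min_distance=min_distance,
                          threshold_abs=threshold)
```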
Background – Virtual Reality [2] Requirements: • Virtual World • Immersion • Sensory Feedback • Interactivity
Background – HMDs [2] Head Mounted Displays • 100% Field of Regard • May cause dizziness • Only one person can view at one time A Head Mounted Display [2, page 14]
Background – Multi-Sided Projection Displays [2] • Field of regard depends on the number of sides. • Wider field of view than HMDs • No dizziness • Many people can view at once • Bulkier and more expensive than HMDs Three-Sided Projection Display [25]
Background – Head Tracking • View must be adjusted for location and orientation of head. • Stereoscopic display can be used to create depth perception. Head-Tracking Active Stereo Goggles [17]
Background – Input Devices • A wand is a commonly used input device for virtual reality systems. Virtual Reality Wand [17]
Background – Related Work Applications • Plantation Management • Assessing Forest Health • Harvestable Lumber Estimation • Fuel Load Estimation
Background – Related Work Culvenor [6] • Even-Aged Mountain Ash (Eucalyptus) • NIR Selected from CIR • Identify Local Maxima • Identify Local Minima • Cluster Intermediate Pixels
Background – Related Work Pouliot et al. [33] • Uniformly Spaced Spruce Trees in an Arboretum • Absolute Difference of NIR and Red Bands • Moving Window, Local Maximum Filtering
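A rough sketch of this style of detection (my reading of the cited approach, not its actual code; the window size is an assumption):

```python
# Local maximum filtering on the absolute difference of the NIR and red
# bands: pixels that equal the maximum of their moving window are kept
# as candidate tree tops.
import numpy as np
from scipy.ndimage import maximum_filter

def local_maxima(nir: np.ndarray, red: np.ndarray, window: int = 5) -> np.ndarray:
    """Boolean mask of local maxima of |NIR - red| within a window x window region."""
    diff = np.abs(nir.astype(np.float32) - red.astype(np.float32))
    # Note: flat plateaus are also marked; a real implementation would break ties.
    return diff == maximum_filter(diff, size=window)
```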
Background – Related Work Brandtberg and Walter [11] • 80-Year-Old Stands of Scots pine, Norway spruce, birch, and aspen • Perform Scale-Space Edge Detection to Extract Tree Crown Perimeters • Analyze Perimeter Curvatures to Estimate Centroids 10-cm, CIR Brightness Scale Space Image After Edge Detection Estimating Centers
Background – Related Work Larsen [23] • Image of Norway spruce • Template Created from Ray-Traced 3D Tree Model • Model Incorporates Aircraft Position, Sun Position, and Species-Specific Light-Scattering Parameters Norway Spruce 3D Template Model
Background – Related Work Image Analysis Software [4] • Will perform template matching
Background – Related Work Image Analysis Software • Not likely to output locations in geospatial coordinates • Not likely to provide geospatially aligned overlays of vegetation maps • Not likely to display vegetation map data for selected locations • Not likely to make placements based on vegetation maps
Methods Goals: • Achieve adequate tree-placement accuracy using whatever images (if any) are available. • Enhance accuracy using vegetation maps. • Make tree placements using vegetation maps alone if no photographic image is available.
Methods System: • Interactive (not fully automatic) • Template Matching (no image constraints) • Templates Created at Runtime (quickly create multiple templates) • Vegetation maps provide information about terrain. • Placements can be made based on vegetation maps alone.
Methods System: • The algorithm is not tailored to any particular image. • The user-defined templates are tailored to the image. • The algorithm is tailored to the correlation image produced using the templates.
Methods Data for Kyle Canyon • 4-Meter Photographic GeoTIFFs • Red, Green, Blue, NIR • 1-Meter Photographic GeoTIFF • Panchromatic • 5-Meter Vegetation Maps Sampled from 30-Meter Data • Vegetation Cover, Vegetation Type, Vegetation Height
Methods 1-Meter Panchromatic Image • Trees look like blobs. • Species, size, shadow, and density vary across different parts of the image.
Software Specification and Design Use Cases
Software Specification and Design The system consists of five groups of global functions and uses two existing libraries.
Implementation and Results Detection Process • The user selects a tree to use as a template. • The tree is the gray blob. • The shadow is the dark, elongated region. Site of First Template (Zoomed In)
Implementation and Results Detection Process • The user draws a highlighting mark over the tree and shadow. Template Defined by User
Implementation and Results Detection Process • Area Near Template, Zoomed Out Area Near Template (Zoomed Out)
Implementation and Results Detection Process • Correlation Image Stored in Red Buffer of Workspace Image • Other buffers are used for intermediate processing. Correlation Image
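A hedged sketch of the workspace-image idea; only the use of the red channel for the correlation image comes from the slide, while the normalization and the empty green/blue channels are assumptions:

```python
# Pack the correlation result into the red channel of an RGB workspace image,
# leaving the other channels free for intermediate processing.
import numpy as np

def make_workspace(correlation: np.ndarray) -> np.ndarray:
    """Store a normalized correlation image in the red channel of an RGB workspace."""
    h, w = correlation.shape
    workspace = np.zeros((h, w, 3), dtype=np.float32)
    span = correlation.max() - correlation.min()
    workspace[..., 0] = (correlation - correlation.min()) / (span if span else 1.0)
    return workspace
```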
Implementation and Results Detection Results • Detection Results Using a Single Template Result Using One Template
Implementation and Results Detection Process • User controls tuning parameters for tree detection. Tuning Parameter Window
Implementation and Results Detection Process • User specifies data for trees associated with each template. • Locations, types, etc. are then written to a file. Preparation to Create Output File
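A hedged sketch of the output step; the actual file format consumed by VFIRE is not shown in the slides, so the whitespace-delimited record below is purely an assumption:

```python
# Write one placement record per line: easting, northing, and tree type.
def write_placements(path: str, placements) -> None:
    """placements: iterable of (easting, northing, tree_type) tuples."""
    with open(path, "w") as out:
        for easting, northing, tree_type in placements:
            out.write(f"{easting:.2f} {northing:.2f} {tree_type}\n")
```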
Implementation and Results Detection Process • Entire Image of Kyle Canyon (8km × 6km) Entire Kyle Canyon Image
Implementation and Results Detection Process • Vegetation Map of Same Area (8km × 6km) Entire Vegetation Map
Implementation and Results Detection Process • Vegetation Map As Overlay onto Image (8km × 6km) Overlay of Vegetation Map onto Photographic Image
Implementation and Results Detection Process • Text Output Resulting When User Clicks on Image Text Output from Clicking on Image
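A hedged sketch of the lookup behind such a click: the pixel position is converted to map coordinates using the GeoTIFF origin and pixel size, then the coarser vegetation map is indexed at that point. Function and parameter names here are illustrative assumptions:

```python
# Convert an image pixel to map coordinates (north-up image assumed),
# then look up the vegetation-map cell covering that map point.
import numpy as np

def pixel_to_map(col: int, row: int, origin_x: float, origin_y: float,
                 pixel_size: float) -> tuple:
    """Map a (col, row) pixel to (x, y) map coordinates."""
    return origin_x + col * pixel_size, origin_y - row * pixel_size

def lookup_vegetation(veg_map: np.ndarray, x: float, y: float,
                      veg_origin_x: float, veg_origin_y: float,
                      veg_pixel_size: float):
    """Return the vegetation-map value at map point (x, y)."""
    col = int((x - veg_origin_x) / veg_pixel_size)
    row = int((veg_origin_y - y) / veg_pixel_size)
    return veg_map[row, col]
```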
Implementation and Results Partially Random Placement • Tree Placements Made According to Map of Vegetation Coverage, Without Using Image Partially Random Placements
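A minimal sketch of partially random placement, assuming placements are drawn cell by cell in proportion to percent vegetation cover (the density scaling and uniform scatter are assumptions, not the thesis algorithm):

```python
# For each vegetation-map cell, draw a number of trees proportional to its
# percent cover and scatter them uniformly inside the cell.
import numpy as np

def random_placements(cover_map: np.ndarray, cell_size: float,
                      trees_per_full_cell: int = 4, seed: int = 0) -> list:
    """cover_map holds percent cover (0-100); returns (x, y) placements."""
    rng = np.random.default_rng(seed)
    placements = []
    rows, cols = cover_map.shape
    for r in range(rows):
        for c in range(cols):
            n = int(round(trees_per_full_cell * cover_map[r, c] / 100.0))
            xs = (c + rng.random(n)) * cell_size
            ys = (r + rng.random(n)) * cell_size
            placements.extend(zip(xs, ys))
    return placements
```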
Implementation and Results Partially Random Placement • User can control how placements are made when no image is available. Options for Partially Random Placements