Experiments with National Digital Elevation Models
Yaron A. Felus, Robert C. Burtch, and Chad Schaeding
Surveying Engineering Department, Ferris State University, MI
http://btcsure1.ferris.edu/NGA/
915 Campus Dr., Swan 314, Big Rapids, MI 49307
E-mail: felusy@ferris.edu or burtchr@ferris.edu
Presentation outline • Introduction • Existing National Digital Elevation Models • National Elevation Dataset (NED) by USGS • Shuttle Radar Topography Mission (SRTM) • Experiments with the data • Accuracy analysis with respect to standards • Applications: • Using free data to ortho-rectify aerial photographs • The FSU-NGA project • Spatial interpolation, kriging and co-kriging • Conclusions
National Elevation Dataset (NED) • The National Elevation Dataset is a new elevation product assembled by the U.S. Geological Survey (USGS). • Development of the NED began in the early 1990s, and it was completely assembled in 1999 by merging and processing the individual 7.5-minute DEMs (10- and 30-meter resolution, referenced to NAVD88). • It was designed to provide national elevation data in a seamless form with a consistent datum, elevation unit, and projection. • Data corrections were made during assembly to minimize artifacts, permit edge matching, and fill sliver areas of missing data.
Shuttle Radar Topography Mission (SRTM) Launched on February 11, 2000, the Space Shuttle gathered topographic data over approximately 80% of the Earth's land surface.
Shuttle Radar Topography Mission (SRTM) data • The SRTM data were acquired by the National Geospatial-Intelligence Agency (NGA) and the National Aeronautics and Space Administration (NASA) using a radar system that flew onboard the Space Shuttle Endeavour during an 11-day mission in February 2000. • Two products are currently available: • One arc-second (about 30 meters) resolution for the United States and its territories • Three arc-second (about 90 meters) resolution for all areas between 60° North and 56° South latitude • The radar data underwent extensive processing and noise filtering before they were released to the public. The SRTM DEM uses the WGS84 datum and the EGM96 geoid model.
How to obtain the data • Download data from http://seamless.usgs.gov/
Accuracy of NED and SRTM • NED (left) vs. SRTM (right) • SRTM is a Digital Surface Model and was filtered extensively
Accuracy of NED and SRTM • Smith and Sandwell (2003) performed a spectral analysis of the 1-arc-second SRTM and NED data and found the following: • the root-mean-square (RMS) error of the SRTM data is 2.7 m • the root-mean-square (RMS) error of the NED data is 3.5 m • Reinartz et al. (2005) concluded that SRTM data accuracy decreases drastically in forest areas, since there the data represent neither the tree canopy nor the ground.
Case study, the FSU golf course • Evaluating the accuracy of SRTM/NED data • Comparing • SRTM • NED • Photogrammetry • GPS
Case study, the FSU golf course • USGS standards for DEM testing require only 20 check points, with at least eight scattered around the edges. • Nevertheless, more than 500 points were collected in Real-Time Kinematic (RTK) mode using the Big Rapids Continuously Operating Reference Station (CORS), located less than 1 mile away.
Accuracy standards National Map Accuracy Standards (NMAS) • The NMAS defines the following two criteria to test the vertical accuracy of a topographic map: • “Vertical accuracy, as applied to contour maps on all publication scales, shall be such that not more than 10 percent of the elevations tested shall be in error by more than one-half the contour interval.” • “The accuracy of any map may be tested by comparing the positions of points whose locations or elevations are shown upon it with corresponding positions as determined by surveys of a higher accuracy.”
Accuracy standards American Society for Photogrammetry and Remote Sensing (ASPRS) • The ASPRS standard uses the Root-Mean-Square Error (RMSE) statistic to evaluate the accuracy of spatial data. • The RMSE is defined as RMSE = sqrt( Σ (z_i − z_check,i)² / n ), where z_i is the dataset elevation, z_check,i is the corresponding check-point elevation, and n is the number of check points. • A Class 1 map should have a vertical RMSE of 1/3 the contour interval for well-defined points and 1/6 the contour interval for spot elevations. Maps compiled within limiting RMSE errors of twice or three times those allowed for a Class 1 map shall be designated as Class 2 or Class 3, respectively.
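As a brief illustration of the RMSE test just defined, here is a minimal Python sketch, assuming the DEM elevations and the corresponding check-point elevations are already stored in two arrays; the array names and numeric values are hypothetical.

```python
import numpy as np

def vertical_rmse(dem_elev, check_elev):
    """Root-Mean-Square Error between DEM elevations and surveyed check-point elevations."""
    diff = np.asarray(dem_elev, float) - np.asarray(check_elev, float)
    return np.sqrt(np.mean(diff ** 2))

# Hypothetical values: DEM elevations sampled at the check points vs. RTK GPS elevations.
dem_elev = np.array([276.41, 277.12, 275.88, 276.95])    # meters, from the DEM
check_elev = np.array([276.30, 277.25, 275.80, 277.10])  # meters, from RTK GPS
print(f"Vertical RMSE: {vertical_rmse(dem_elev, check_elev):.3f} m")
```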
COMPATIBLE MAP SCALES & CONTOUR INTERVALS FOR AVERAGE TERRAIN
Is it good for a floodplain plan? Section 142 of Act No. 59 of the Public Acts of 1978, as amended, being S559.242 of the Michigan Compiled Laws, requires a flood plain plan when the condominium lies within or abuts a flood plain area, showing all of the following: the location of all condominium buildings and improvements…; the contours over the entire project shown at 2-foot intervals. NO!
Using NED and SRTM data for orthophoto creation • An orthophoto is a geometrically corrected photograph created from either aerial or satellite imagery (an orthographic projection of the terrain). • The most expensive part of producing an orthophoto is generally the creation of the DEM.
Case study, the FSU golf course • Two orthophotographs were created using the Leica Photogrammetry Suite from 1:10,000-scale photography taken at a flight height of 1,582 meters above the average terrain and scanned at a ground resolution of 0.15 meters. • The initial NED and SRTM DEMs were projected from their native geographic coordinates to the Michigan State Plane coordinate system to create 35 x 35 meter resolution DEMs, as sketched below.
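The reprojection and resampling step could look roughly like the following sketch, assuming the GDAL Python bindings are available; the file names are placeholders, and UTM zone 16N (EPSG:32616) is used only as a stand-in for the Michigan State Plane zone actually used in the study.

```python
from osgeo import gdal

gdal.UseExceptions()

# Reproject a DEM from its native geographic coordinates to a projected system
# and resample it to a 35 x 35 m grid (file names and target CRS are placeholders).
gdal.Warp(
    "dem_projected_35m.tif",     # output DEM
    "ned_geographic.tif",        # input NED (or SRTM) tile in geographic coordinates
    dstSRS="EPSG:32616",         # stand-in CRS; the study used Michigan State Plane
    xRes=35.0,
    yRes=35.0,
    resampleAlg="bilinear",      # smooth resampling suited to continuous elevation data
)
```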
Orthophotography accuracy The errors were larger on the edges of the orthophotograph and very small near the center of the image (nadir point).
Concluding remarks for the experiments with DEMs From the results of the experiments undertaken in this study, it is clear that these government datasets can be used to create orthophotos at a scale of 1:10,000 that meet accepted industry standards such as those developed by ASPRS. This study found that the SRTM data had slightly better accuracy than the NED data, but because SRTM is a surface model it may not represent the bare-earth terrain properly and may produce larger errors when slope and aspect are computed from it. It is also important to note that the SRTM data are a DSM, while the NED data are a DEM measuring ground topography. The SRTM data are more current, which is an important advantage: they provide a model that can be used for many applications, even for updating the NED.
Multisource Data Fusion • Strategies and methods for integrating data from different (and possibly diverse) sensors. • The fusion process should produce results that maintain the highest accuracy and resolution present in the original data.
Interpolation of the Geoid Undulation Surface • The geoid separation, N, is also termed the geoid undulation. • Ho = He − N (orthometric height = ellipsoidal height − geoid undulation). • An interpolation procedure must be employed to obtain the geoid undulation surface from measurements made at specific points.
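A one-line numeric illustration of the relation Ho = He − N; the values below are hypothetical, chosen only to be plausible for the study area.

```python
# Orthometric height from ellipsoidal height and geoid undulation: Ho = He - N
He = 241.85   # ellipsoidal height from GPS, meters (hypothetical)
N = -34.60    # geoid undulation from a geoid model, meters (hypothetical)

Ho = He - N   # 241.85 - (-34.60) = 276.45 m above the geoid
print(f"Ho = {Ho:.2f} m")
```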
What is the geoid undulation in Michigan? Geoid 2003
SPATIAL INTERPOLATION Interpolation is the procedure of predicting the value of an attribute at an unsampled site from measurements made at point locations within the same area or region.
SPATIAL INTERPOLATION • Data close together in space (e.g., elevations, geoid undulations) or time (e.g., temperatures) are likely to be correlated (related). • Many interpolation procedures and methods are used in different fields of science. These methods can be classified into a few categories: • Global/local interpolators • Exact/approximate interpolators • Stochastic/deterministic interpolators • Gradual/abrupt interpolators
Geoid 2003 • USGG2003 is a gravimetric geoid file covering the conterminous United States. • It improves the gravimetric geoid primarily along the East Coast, and especially in Florida (a reduction in misfit from 40 to 30 cm). • The USGG2003 geoid undulations refer to a geocentric GRS-80 ellipsoid. • USGG2003 was computed on a 1 x 1 arc-minute grid (about 1 mile), so the geoid undulation at an arbitrary point must be interpolated between the grid values, as sketched below.
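Since the geoid model is delivered as a regular grid, the undulation at a station has to be interpolated from the surrounding nodes. Below is a sketch using bilinear interpolation via SciPy; the grid values and query coordinates are fabricated for illustration, and real work would read the USGG2003/GEOID03 grid files and use the interpolation scheme recommended by NGS.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical 1 x 1 arc-minute geoid grid covering a small area near Big Rapids, MI.
lats = np.arange(43.60, 43.75, 1.0 / 60.0)      # grid latitudes (degrees)
lons = np.arange(-85.60, -85.40, 1.0 / 60.0)    # grid longitudes (degrees)
undulations = -34.5 + 0.01 * np.add.outer(lats - 43.60, lons + 85.60)  # fake N values (m)

# Bilinear interpolation between the four surrounding grid nodes.
interp = RegularGridInterpolator((lats, lons), undulations, method="linear")
N = interp([[43.6965, -85.4837]])[0]            # query point (hypothetical station)
print(f"Interpolated geoid undulation N = {N:.3f} m")
```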
Global/Local Interpolations • Global: • global interpolators determine a single function which is mapped across the whole region • a change in one input value affects the entire map • Local: • local interpolators apply an algorithm repeatedly to a small portion of the total set of points • a change in an input value only affects the result within the window
Exact/Approximate Interpolations • Exact: • exact interpolators honor the data points upon which the interpolation is based; the surface passes through all points whose values are known • Approximate: • approximate interpolators are used when there is some uncertainty about the given surface values • this reflects the belief that in many data sets there are global trends, which vary slowly, overlain by local fluctuations, which vary rapidly and produce uncertainty (error) in the recorded values • the effect of smoothing is therefore to reduce the effects of error on the resulting surface
Stochastic/Deterministic Interpolations • Stochastic: • stochastic methods incorporate the concept of randomness • the interpolated surface is conceptualized as one of many that might have been observed, all of which could have produced the known data points • Deterministic: • deterministic methods do not use probability theory
Gradual/Abrupt Interpolations • Gradual: • a typical example of a gradual interpolator is the distance-weighted moving average (see the sketch below) • Abrupt: • it may be necessary to include barriers in the interpolation process
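As a concrete example of a gradual, local interpolator, the following sketch implements a simple inverse-distance-weighted (IDW) moving average with a local search radius; the sample points, search radius, and power parameter are all illustrative choices.

```python
import numpy as np

def idw_interpolate(xy_known, z_known, xy_query, power=2.0, radius=500.0):
    """Inverse-distance-weighted moving average using points inside a search radius."""
    xy_known = np.asarray(xy_known, float)
    z_known = np.asarray(z_known, float)
    z_pred = []
    for q in np.atleast_2d(np.asarray(xy_query, float)):
        d = np.hypot(*(xy_known - q).T)           # distances to all known points
        near = d < radius                         # local window around the query point
        if not near.any():
            z_pred.append(np.nan)                 # nothing inside the search radius
            continue
        if np.any(d[near] == 0.0):                # query coincides with a data point
            z_pred.append(z_known[near][d[near] == 0.0][0])
            continue
        w = 1.0 / d[near] ** power                # inverse-distance weights
        z_pred.append(np.sum(w * z_known[near]) / np.sum(w))
    return np.array(z_pred)

# Hypothetical elevation points (x, y in meters; z in meters) and one query location.
pts = [(0, 0), (100, 0), (0, 100), (100, 100)]
z = [276.1, 276.8, 275.9, 277.2]
print(idw_interpolate(pts, z, [(50, 50)]))        # equal weights here: all points equidistant
```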
Kriging techniques (Geostatistics) Professor Georges Matheron (1930-2000) developed the formal foundation of geostatistics, centered initially on estimating changes in ore grade within a mine. The principles have since been applied to a variety of areas in geology and to other scientific disciplines. Geostatistical interpolation is known as kriging, after D. G. Krige.
Kriging assumptions • Some spatial surfaces cannot be modeled using deterministic methods that rely on smooth mathematical functions, specifically when data are sparse, as in ground-water modeling, gravity data, soil mapping, water toxicity, air pollution, bathymetric data, etc. (dense data => deterministic methods; sparse data => kriging). • Kriging is a stochastic interpolation method, in contrast to deterministic methods (TIN, inverse distance, trend estimation). It attempts to statistically obtain the optimal prediction, i.e., to provide the Best Linear Unbiased Estimation (BLUE), specifically when data are sparse.
Kriging assumptions The basic assumption is that the spatial variation can be expressed by the following summation: z(s0) = m(s0) + x(s0) + e where • m(s0) = a deterministic function describing the 'structural' component (trend) of z • x(s0) = a stochastic, spatially dependent residual from m(s0) • e = observational noise
Variogram / Covariance function • Spatial dependence is usually expressed mathematically in the form of a spatial coherency function such as the semi-variogram or the covariance function. • The semi-variogram and the covariance function are valuable tools in exploratory data analysis. Moreover, these functions control the way in which kriging weights are assigned to data points during interpolation.
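A compact sketch of estimating the empirical semivariogram (the classical Matheron estimator) from scattered data, which supports the exploratory analysis described above; the binning parameters and the synthetic sample data are illustrative.

```python
import numpy as np

def empirical_semivariogram(xy, z, n_bins=10, max_lag=None):
    """Classical estimator: gamma(h) = 0.5 * mean[(z_i - z_j)^2] for pairs separated by ~h."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    i, j = np.triu_indices(len(z), k=1)                 # indices of all point pairs
    lags = np.hypot(*(xy[i] - xy[j]).T)                 # pair separation distances
    sq_diff = (z[i] - z[j]) ** 2
    max_lag = max_lag if max_lag is not None else lags.max()
    edges = np.linspace(0.0, max_lag, n_bins + 1)
    h, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (lags >= lo) & (lags < hi)
        if in_bin.any():
            h.append(lags[in_bin].mean())               # average lag in the bin
            gamma.append(0.5 * sq_diff[in_bin].mean())  # semivariance for the bin
    return np.array(h), np.array(gamma)

# Synthetic demonstration data (replace with the RTK check-point elevations in practice).
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(200, 2))                # point coordinates, meters
z = 275.0 + 0.002 * xy[:, 0] + rng.normal(0, 0.3, 200)  # elevations with a mild trend
h, gamma = empirical_semivariogram(xy, z, n_bins=12)
print(np.round(gamma, 3))
```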
Ordinary kriging, Basic Steps Steps in the kriging interpolation process: 1. Exploratory data analysis; identify and eliminate outliers and trend (compute m(s0) using trend estimation) 2. Estimation of the variogram, 2γ(h) 3. Using the semi-variogram to perform the kriging prediction: ẑ(s0) = Σ λi z(si)  (1), where ẑ(s0) is the interpolated value, z(si) are the sample points, and λi are the kriging coefficients 4. MSPE (mean squared prediction error) calculation and error analysis (cross-validation)
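The prediction step can be sketched in a few lines of NumPy by solving the ordinary kriging system for the weights λi; a spherical semivariogram model with assumed nugget, sill, and range parameters stands in for the model that would normally be fitted to the empirical semivariogram in step 2. This is an illustration, not production code.

```python
import numpy as np

def spherical_gamma(h, nugget=0.02, sill=0.30, a_range=400.0):
    """Spherical semivariogram model; nugget, sill, and range are assumed, not fitted."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / a_range - 0.5 * (h / a_range) ** 3)
    return np.where(h >= a_range, sill, np.where(h == 0.0, 0.0, g))

def ordinary_kriging(xy, z, s0):
    """Predict z(s0) = sum(lambda_i * z(s_i)) by solving the ordinary kriging system."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    n = len(z)
    d = np.hypot(xy[:, None, 0] - xy[None, :, 0], xy[:, None, 1] - xy[None, :, 1])
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_gamma(d)      # semivariances between the data points
    A[n, n] = 0.0                       # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical_gamma(np.hypot(*(xy - np.asarray(s0, float)).T))
    sol = np.linalg.solve(A, b)
    lam = sol[:n]                       # kriging weights lambda_i (they sum to 1)
    return lam @ z, sol @ b             # prediction and kriging (prediction) variance

# Hypothetical sample points and a single prediction location.
xy = np.array([(0, 0), (120, 30), (60, 150), (200, 100)], float)
z = np.array([276.1, 276.9, 275.8, 277.3])
z_hat, krig_var = ordinary_kriging(xy, z, (100, 80))
print(f"z_hat = {z_hat:.2f} m, kriging variance = {krig_var:.3f} m^2")
```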
Interpolation Summary • There is no 'best' interpolation algorithm that is clearly superior to all others and appropriate for all applications. • The quality of the resulting DTM is determined by the distribution and accuracy of the original data points and by the adequacy of the underlying interpolation model (i.e., a hypothesis about the behavior of the terrain surface). • The most important criteria for selecting a DTM interpolation method are the degree to which (1) structural features can be taken into account and (2) the interpolation function can be adapted to the varying character of the terrain.
Interpolation Summary • Other criteria that may influence the selection of a particular method are the degree of accuracy desired and the computational effort involved. • Cross-validation is the procedure in which one data point is removed and the rest of the data are used to predict it; an estimate of the accuracy is then obtained from: • Error = predicted value − known value
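A short leave-one-out sketch of the cross-validation check described above; the `predict` argument stands for any interpolator with a (known points, known values, query point) signature, and the nearest-neighbour predictor used in the example is only a stand-in for IDW or kriging.

```python
import numpy as np

def leave_one_out_errors(xy, z, predict):
    """For each point: remove it, predict it from the others, return predicted - known."""
    xy, z = np.asarray(xy, float), np.asarray(z, float)
    errors = []
    for k in range(len(z)):
        keep = np.arange(len(z)) != k                 # every point except the k-th
        z_hat = predict(xy[keep], z[keep], xy[k])     # predict the withheld point
        errors.append(z_hat - z[k])                   # Error = predicted value - known value
    return np.array(errors)

# Demonstration with a trivial nearest-neighbour predictor (hypothetical data).
def nearest_neighbour(xy_known, z_known, q):
    return z_known[np.argmin(np.hypot(*(xy_known - q).T))]

xy = np.array([(0, 0), (100, 0), (0, 100), (100, 100), (50, 60)], float)
z = np.array([276.1, 276.8, 275.9, 277.2, 276.4])
errs = leave_one_out_errors(xy, z, nearest_neighbour)
print("Cross-validation RMSE:", np.sqrt(np.mean(errs ** 2)))
```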
Ferris State and Height Modernization The Best Surveying Students! The support for this research from the National Geospatial-Intelligence Agency under contract no. HM1582-04-1-2026 is gratefully acknowledged.