CEOS Working Group on Cal/Val, Land Product Subgroup Report
Jeff Morisette, Jeffrey.T.Morisette@nasa.gov, (301) 614-6676
GOFC/GOLD Global Geostationary Fire Monitoring
LPV outline
• Review of subgroup’s status and goals
• Validation activities
• Conclusions
Land Product Validation Subgroup
• Established in 2000
• Followed the Terrain Mapping subgroup as a topic-specific (non-wavelength-specific) subgroup
• Jeff Morisette (NASA) started as chair in February 2003
• Agreement from Fred Baret (VALERI, INRA-CSE) to be “chair-elect” (potential chair in 2006)
GOFC/GOLD and LPV
[Diagram: IGOS-P/IGOS and the CEOS Plenary connect through WGCV to LPV, and through GTOS to GOFC/GOLD. The GOFC/GOLD implementation teams (ITs) and regional networks contribute user accuracy requirements and local expertise; LPV returns product evaluation, feedback, and validation protocols via topical meetings and product-specific intercomparisons, supporting informed use of CEOS products.]
Why validate global land products?
• The WGCV definition implies validation = “estimating uncertainty”
• Good science and resource management require understanding of product accuracy/uncertainty
• Explicit statements of uncertainty foster an informed user community and improved use of data
• International environmental protocols and agreements imply that products may be independently evaluated and possibly challenged
• As more, and similar, global products are produced by CEOS members (via programs such as GEO), inter-use will require characterization of each product’s uncertainty
LPV: mission statement & objectives • Mission • To foster quantitative validation of higher-level global land products derived from remote sensing data and relay results so they are relevant to users • Objectives • (a) To work with users to define uncertainty objectives • (b) To identify and support global test sites for both systematic and episodic measurements (WGCV / WGISS Test Facility) • (c) To identify opportunities for coordination and collaboration • (d) To develop consensus “best practice” protocols for data collection and description • (e) To develop procedures for validation, data exchange and management (with WGISS)
Big Picture LPV provides a validation service to the Integrated Global Observation Strategy’s Global Terrestrial Observation System. Implications: • Focus Products: Biophysical, Land Cover, & Fire • Working in conjunction with GOFC/GOLD’s regional networks • Need to integrate with TEMS & GT-Net
Current/upcoming topical workshops
• Follow-up Land Cover/Change: aiming toward a “best practices” document; early 2004, Boston University, USA
• Fire and burn scar: Global Geostationary Fire Monitoring Applications, a joint GOFC/GOLD Fire and CEOS LPV workshop, March 23-25, 2004, EUMETSAT, Darmstadt, Germany (related to Action WGCV 20-8)
• “Results” workshop for the LAI intercomparison: fruition of the LAI intercomparison; 16 August 2004, University of Montana, Missoula, USA
• Surface reflectance and albedo/BRDF: possibly in conjunction with the next BSRN meeting
Five listservs established
• ceos_lpv_gen@listserv.gsfc.nasa.gov: general information regarding LPV activity, both scientific and administrative
• ceos_lpv_rad@listserv.gsfc.nasa.gov: surface RADiation products, including surface reflectance/atmospheric correction, land surface temperature, albedo, and BRDF
• ceos_lpv_bio@listserv.gsfc.nasa.gov: BIOphysical parameters, including vegetation indices, leaf area index, FPAR, and vegetation productivity
• ceos_lpv_lc@listserv.gsfc.nasa.gov: Land Cover and land cover change products
• ceos_lpv_fire@listserv.gsfc.nasa.gov: FIRE, burn scar, and fire emissions products (related to Action WGCV 20-11)
IEEE Transactions on Geoscience and Remote Sensing special issues
Purpose: lay out the current suite of higher-level global land products and quantitatively establish their accuracy. Provide a user’s perspective on the implications of a product’s accuracy, to understand:
• how accurate the product needs to be
• why it is important to quantify the uncertainty
• how close currently available data come to meeting those needs
Proposed TGARS special issues Open invitation (suggestions or volunteers for reviews are also welcome) Papers due October 2004 Anticipated publication date early 2006 Ultimate objective is to provide not “mandatory protocols” but an “acceptable standard”
CEOS definition
Validation: the process of assessing, by independent means, the quality of the data products derived from the system outputs.
(LPV operates under this definition, but with the understanding that validation activities should also consider user accuracy needs and feed back into algorithm improvements.)
MODIS validation “hierarchy” • Stage 1 Validation: Product accuracy has been estimated using a small number of independent measurements obtained from selected locations and time periods and ground-truth/field program effort. • Stage 2 Validation: Product accuracy has been assessed over a widely distributed set of locations and time periods via several ground-truth and validation efforts. • Stage 3 Validation: Product accuracy has been assessed and the uncertainties in the product well established via independent measurements in a systematic and statistically robust way representing global conditions.
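As an illustration only, the three-stage hierarchy can be caricatured as a decision rule. The thresholds below (such as the ten-site cutoff) are invented for this sketch; in practice, stage assignment is a qualitative judgement by the product team.

```python
def validation_stage(n_sites: int, globally_representative: bool,
                     uncertainties_established: bool) -> int:
    """Map validation evidence to a MODIS-style stage (0-3).

    The ten-site threshold is invented for illustration; real stage
    assignment is a qualitative judgement by the product team.
    """
    if n_sites == 0:
        return 0  # no independent validation yet
    if globally_representative and uncertainties_established:
        return 3  # systematic, statistically robust, global
    if n_sites >= 10:
        return 2  # widely distributed locations and time periods
    return 1  # small number of selected locations and periods
```
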
Biome map
• Exhaustive and mutually exclusive global biome map
• Close match to GOFC/GOLD regional networks
• Published independently of LPV: Gurney et al. (2002), Towards robust regional estimates of CO2 sources and sinks using atmospheric transport models. Nature, 415, 626-630, 7 Feb. 2002.
Accuracy statements
• Should be “user-oriented” and supported with peer-reviewed literature
• Augment the validation “stage hierarchy”
• Standardize/summarize information for each product
• MODIS land team plans to update CEOS information for MODIS land products
Example: MODIS accuracy statements
MODLAND validation home page: http://landval.gsfc.nasa.gov/MODIS
…a page for each product, linking to an accuracy statement for that product:
• overall accuracy statement
• link to QA information
• list of support material
…pages for supporting materials:
• title, author, abstract
• figures/captions
• tables/captions
LPV outline
• Review of subgroup’s status and goals
• Validation activities
• Conclusions
LPV Validation examples • CEOS “Core Sites” • Leaf Area Index “Intercomparison” • Albedo/BRDF • Active Fire/Burnt Area • Opportunities for Geostationary
CEOS “Core Sites” goals • Provide a focus for ongoing satellite, aircraft, and ground data collection for validation of CEOS member satellite/sensor products • Provide scientists with sets of readily accessible in-situ and CEOS member instrument data for algorithm validation and improvement • Build on infrastructure of existing scientific networks and validation sites • Realize international cost-sharing opportunities
CEOS “Core Sites” implementation
• Utilize existing networks of validation sites for joint/multiple validation of CEOS member global land products
• EOS Land Validation Core Sites
• VALERI network of sites
• GTOS “Terrestrial Ecosystem Monitoring Sites”
• CEOP/NASDA, CSIRO, GT-Net, ILTER sites
• Provide easy access to high-resolution data and subsets of global land products from CEOS member sensors
• Conducted as a joint project with the CEOS Working Group on Information Systems and Services
CEOS “Core Sites”: Phase 1 • Simple, Web-Based Interface to CEOS Core Sites • Allow Users to Specify Conversion Parameters • Provide Access to EO and in-situ Data • Begin With MODIS, SPOT Veg, & Landsat • Allow Reprojecting • Allow Reformatting into GeoTIFF
Core test sites data distribution
[Diagram: investigators and science users access the WGISS Test Facility, which provides a data catalog, limited storage, data tools, and data subsetting, reprojection, formatting, and QC. Inputs include MODIS subsets, ETM+, and ASTER (EDC DAAC); SeaWiFS subsets (GSFC); MAVT (MERIS, AATSR) and IVOS data; and in-situ data (PIs; ORNL).]
http://edcsgs16.cr.usgs.gov/wgiss/ (user code = calval99, password = wgiss03)
Feedback welcome: Jeff Morisette (jeff.morisette@nasa.gov), John Dwyer (dwyer@usgs.gov)
NDVI time series for Harvard Forest LTER [plot: year-by-year close-up, 2000-2004]
NDVI time series for Mongu, Zambia [plot: year-by-year close-up, 2000-2004]
“Difference” analysis: SPOT minus AVHRR NDVI, 1998-2004 [plot: autocorrelation function vs. lag in months]. The average difference is 0.034, with significant half-year and one-year lags.
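The half-year and one-year lags can be reproduced on synthetic data. This sketch is not the actual SPOT/AVHRR analysis: it builds an invented monthly difference series with a 0.034 bias plus an annual cycle, then computes its sample autocorrelation.

```python
import math

def autocorr(x, lag):
    """Sample autocorrelation of series x at a given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag))
    return cov / var

# Synthetic monthly "SPOT - AVHRR" NDVI differences over six years:
# a small constant bias plus a 12-month seasonal component.
diff = [0.034 + 0.05 * math.cos(2 * math.pi * m / 12) for m in range(72)]

bias = sum(diff) / len(diff)              # recovers the ~0.034 average
r6, r12 = autocorr(diff, 6), autocorr(diff, 12)
# A 12-month cycle produces a strong positive autocorrelation at lag 12
# and a strong negative one at lag 6 (the half-year lag on the slide).
```
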
LPV intercomparison
• Site contacts provide “vital statistics”
• LPV provides subsets of global LAI product(s)
• LPV creates a link to the site from the LAI-intercomparison page
• Field campaign(s): LPV acquires and posts relevant high-resolution multispectral imagery
• Site contacts collect field data and register these in the Mercury system
• Site contacts provide an Internet link to a locally maintained high-resolution LAI surface, with proper documentation of how the surface was derived
• LPV posts a link to the LAI surface on the LAI-intercomparison page
• Data are shared among fellow “LAI intercomparison” participants for research comparing both validation results and methods
Sites added to this international activity are those that help create a globally representative sample (across biomes and continents) AND have a strong need or intention to utilize global, coarse-resolution LAI products.
Albedo “intercomparison”
• Albedo intercomparison is gaining momentum
• Discussed at meeting in Fall ’02
• Consider the area of overlap between MSG and MODIS
• Utilize Baseline Surface Radiation Network (BSRN) sites with albedometers
• MODIS albedo and aerosol products subset and available through Oak Ridge National Lab
• Concept will be brought to BSRN at their July meeting (interaction between Crystal Schaaf, Atsumu Ohmura, and Andreas Roesch)
• “Kick-off” will be a joint Working Group on Cal/Val meeting / albedo validation campaign in Argentina, early 2005
The CEOS “Intercomparison” concept • LAI will serve as pilot study (August workshop and article in special issue) • Albedo intercomparison is gaining momentum • Burnt area intercomparison has strong potential • GBA2000 and MODIS in Africa
“Intercomparison” general timeline (applies to LAI, albedo, and burnt area)
• Topical meeting to establish data requirements
• Decide on sites
• Develop data-sharing infrastructure
• Field campaigns and individual product analysis
• Synthesis of results
MODIS burnt area validation • Courtesy of David Roy et al., UMd • Botswana, Okavango Delta, 2001 • Landsat ETM+ path 175 row 073 • Cloud-free scenes acquired 32 days apart: • September 4th • October 6th
Landsat ETM+ Sept. 4th
Landsat ETM+ Oct. 6th Yellow vectors = ETM+ interpreted burned areas occurring between the two ETM+ acquisitions
MODIS 500m Burned Areas Sept. 4 to Oct. 6 White vectors = ETM+ interpreted burned areas occurring between the two ETM+ acquisitions
Burnt area: MODIS vs. ETM+. Each point illustrates the proportion of a 5.0 × 5.0 km cell mapped as burned.
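The scatter-plot construction (proportion burned per coarse cell, for each product) can be sketched as follows. The masks, grid size, and cell size here are toy values standing in for the actual MODIS and ETM+ data.

```python
def burned_proportions(mask, cell):
    """Aggregate a binary burn mask (list of lists, 1 = burned) into
    cell x cell blocks and return the burned proportion of each block."""
    rows, cols = len(mask), len(mask[0])
    props = []
    for r0 in range(0, rows, cell):
        for c0 in range(0, cols, cell):
            block = [mask[r][c]
                     for r in range(r0, min(r0 + cell, rows))
                     for c in range(c0, min(c0 + cell, cols))]
            props.append(sum(block) / len(block))
    return props

# Toy 20x20 "ETM+" and "MODIS" burn masks, compared in 10x10 blocks
# standing in for the 5.0 x 5.0 km cells of the scatter plot.
etm   = [[1 if r < 5 and c < 5 else 0 for c in range(20)] for r in range(20)]
modis = [[1 if r < 4 and c < 6 else 0 for c in range(20)] for r in range(20)]

p_etm, p_modis = burned_proportions(etm, 10), burned_proportions(modis, 10)
rmse = (sum((a - b) ** 2 for a, b in zip(p_etm, p_modis)) / len(p_etm)) ** 0.5
```

Each (p_etm, p_modis) pair is one point of the scatter; a summary statistic such as the RMSE (or a regression slope) then quantifies the agreement.
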
GOFC/GOLD Fire implementation team: Active Fire Validation with ASTER
MODIS fire detection: INPE and UMd [maps: UMd and INPE detections over Acre, 29 Aug 2003]
…probably not an issue of which is best, but rather of how to combine them. GOES ABBA fire detections could also be included.
MODIS fire detection: Brazil, February 2004. Average pixel sizes (X = 1.6, 1.4, 1.5, 1.3; Y = 1.2, 1.1, 1.2, 1.1): an indication that non-coincident fires are farther from nadir.
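A hedged geometric sketch of why off-nadir detections show larger mean pixel sizes: under a flat-Earth approximation, a cross-track scanner's footprint grows as 1/cos²θ along-scan and 1/cosθ along-track relative to nadir. Real MODIS geometry also involves Earth curvature, so treat these factors as indicative only.

```python
import math

def pixel_growth(scan_angle_deg):
    """Approximate growth of a cross-track scanner's ground pixel
    away from nadir, relative to the nadir footprint.

    Flat-Earth approximation only: along-scan size grows as
    1/cos^2(theta), along-track as 1/cos(theta). Real MODIS geometry
    also accounts for Earth curvature.
    """
    t = math.radians(scan_angle_deg)
    return 1 / math.cos(t) ** 2, 1 / math.cos(t)

# Near nadir both factors are ~1; at a 45-degree scan angle a nominal
# 1 km pixel is roughly 2 km along-scan and 1.4 km along-track, which
# is why larger mean pixel sizes suggest detections farther from nadir.
```
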
View-angle dependency: NOAA-12 hot-spot detection for Tocantins State, Brazil, July-August 2001 [plot: x axis = date, dd/m].
Curve fitting for regional summaries
Fit a curve to the fire counts, filtered to the extent of the ASTER viewing angle. Then take the area under the curve to represent total fire counts, fitting the curve with consideration of the “error bars” estimated for near-nadir counts from the ASTER analysis.
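The area-under-the-curve step can be sketched with a simple trapezoid rule. The counts, four-day spacing, and 20% per-day error bars below are invented for illustration, not taken from the Tocantins analysis.

```python
def seasonal_total(days, counts, errors):
    """Trapezoid-rule area under a daily fire-count curve, with a
    crude error band propagated from per-day count uncertainties."""
    total, err = 0.0, 0.0
    for i in range(len(days) - 1):
        w = days[i + 1] - days[i]
        total += w * (counts[i] + counts[i + 1]) / 2
        err += w * (errors[i] + errors[i + 1]) / 2
    return total, err

# Near-nadir fire counts every 4 days over a 40-day window, with
# assumed 20% per-day uncertainties standing in for the "error bars"
# derived from the ASTER near-nadir analysis.
days = list(range(0, 44, 4))
counts = [5, 12, 30, 55, 70, 66, 48, 25, 10, 4, 2]
errors = [c * 0.2 for c in counts]

total, err = seasonal_total(days, counts, errors)
```
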
Daily fire counts [plot: number of hot spots vs. time of day]
If we validate using sun-synchronous observations, what can we say about the accuracy of a “diurnal curve”? For example, when adjusting the diurnal signal to four (or more) points from sun-synchronous data, can we estimate the accuracy of the “non-linearities” between sun-synchronous observations?
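One way to pose the question concretely: fit a simple diurnal model to four sun-synchronous samples and ask how far it can be trusted between overpasses. The cosine model, 14:00 peak hour, and overpass times below are assumptions for this sketch, not a validated fire diurnal model.

```python
import math

PEAK_HOUR = 14  # assumed local-time peak of the fire diurnal cycle

def fit_diurnal(times, counts):
    """Least-squares fit of counts(t) = a + b*cos(2*pi*(t-PEAK_HOUR)/24)
    to a handful of sun-synchronous overpass samples."""
    x = [math.cos(2 * math.pi * (t - PEAK_HOUR) / 24) for t in times]
    n = len(x)
    sx, sy = sum(x), sum(counts)
    sxx = sum(v * v for v in x)
    sxy = sum(v * c for v, c in zip(x, counts))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Four overpasses (roughly Terra/Aqua day and night, local solar time)
# sampling an assumed "true" cycle counts(t) = 50 + 40*cos(...).
truth = lambda t: 50 + 40 * math.cos(2 * math.pi * (t - PEAK_HOUR) / 24)
times = [1.5, 10.5, 13.5, 22.5]
a, b = fit_diurnal(times, [truth(t) for t in times])
# Noise-free samples from the model recover a and b; the open question
# on the slide is how well such a fit captures "non-linearities" between
# overpass times when the true cycle is not this simple.
```
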
LPV outline
• Review of subgroup’s status and goals
• Validation activities
• Conclusions
Conclusions
• LPV is available to coordinate validation activities
• A burnt-area “intercomparison” seems attractive
• Validation of geostationary products is certainly possible, but somewhat limited