GOES-R AWG Product Validation Tool Development Aerosol Optical Depth/Suspended Matter and Aerosol Particle Size Mi Zhou (IMSG) Pubu Ciren (DELL) Hongqing Liu (DELL) Istvan Laszlo (STAR) Shobha Kondragunta (STAR)
OUTLINE • Products • Validation Strategies & Tools • Selected Examples • Ideas for Further Enhancement and Utility of Validation Tools • Summary
Products
• Aerosol Optical Depth (AOD) / Suspended Matter (SM)
• Aerosol Particle Size (APS)
  • Ångström Exponent is reported
  • at 2 km every 15 minutes for CONUS and FD
Validation Strategy and Tools
Strategy
• Spatial and temporal match-up with ground “truth” and independent satellite retrievals
  • ground data are temporally averaged within a 1-hour window around the satellite overpass time; satellite data are spatially averaged in a 50x50-km box centered on the ground station (see the sketch after this slide)
• Quality control of the comparison data set
• Comparison and calculation of appropriate statistics
Tools
• Monitor operational Level-2 aerosol products
  • displays images of the product and quality flags
  • plots histograms for specified granules
• Collocate aerosol products with reference (“truth”) observations
• Compare with ground truth
  • time series for collocated AERONET stations
  • frequency scatter plot and linear regression
  • statistics for comparison with F&PS requirements
• IDL is used for visualization and calculation of statistics
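As an illustration of the match-up step above, the following is a minimal IDL sketch (IDL being the language the tools are written in) that averages AERONET AOD within the 1-hour window and satellite AOD within the 50x50-km box. All input names and the degree equivalent of the box half-width are assumptions for illustration, not the operational code.

```idl
; Minimal sketch of the spatio-temporal match-up described above.
; All inputs (sat_lat, sat_lon, sat_aod, aer_time, aer_aod, ...) are
; hypothetical names; the 0.225-deg half-width is an assumed ~25-km value.
function matchup_aod, sat_lat, sat_lon, sat_aod, $
                      site_lat, site_lon, aer_time, aer_aod, overpass_time
  half_box = 0.225      ; half-width of the 50x50-km box, in degrees (assumed)
  half_win = 0.5 / 24.0 ; half of the 1-hour window, in days (Julian time)

  ; spatial average of valid satellite retrievals around the station
  isat = where(abs(sat_lat - site_lat) le half_box and $
               abs(sat_lon - site_lon) le half_box and $
               finite(sat_aod), nsat)

  ; temporal average of AERONET values around the overpass time
  iaer = where(abs(aer_time - overpass_time) le half_win, naer)

  if (nsat eq 0) or (naer eq 0) then return, [!values.f_nan, !values.f_nan]
  return, [mean(sat_aod[isat]), mean(aer_aod[iaer])]
end
```

From a set of such match-up pairs, the linear regression and correlation for the frequency scatter plot can be obtained with the standard IDL routines LINFIT and CORRELATE.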
Reference Data • AERONET aerosol product • ground-based remote sensing network providing a comprehensive dataset of aerosol properties http://aeronet.gsfc.nasa.gov/cgi-bin/combined_data_access_new • MODIS aerosol product • AOD, Ångström Exponent, fine-mode weight over ocean, surface reflectance over land, aerosol type over land from MODIS collection 5 aerosol products ftp://ladsweb.nascom.nasa.gov/allData/5/ • CALIPSO • Level 2 AOD and aerosol type profile from 5-km aerosol layer data http://eosweb.larc.nasa.gov/cgi-bin/searchTool.cgi?Dataset=CAL_LID_L2_05kmALay-Prov-V3-01 • MAPSS • A Multi-sensor Aerosol Products Sampling System providing co-located MODIS-AERONET, CALIPSO-AERONET, and MISR-AERONET data http://disc.sci.gsfc.nasa.gov/aerosols/services/mapss/mapssdoc.html#caliop 09/2007
Example use of “Deep-Dive” Tools
[Panels: ABI retrieval, MODIS retrieval, ABI snow mask, corrected ABI snow mask, reprocessed ABI retrieval]
• The ABI image has far fewer valid retrievals than the MODIS image
• Plotting the algorithm inputs suggests this is due to a bad snow mask
• The snow mask is corrected and the granule reprocessed (a sketch of this check follows below)
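The diagnosis above can be sketched as a simple count of retrievable pixels under the original and corrected snow masks. The array names and the 0/1 mask conventions below are assumptions for illustration, not the operational definitions.

```idl
; Sketch: quantify how many land pixels the snow mask removes from retrieval,
; before and after the correction. Names and the conventions 0 = "no snow"
; and 1 = "land" are illustrative assumptions.
pro check_snow_mask_impact, snow_mask, corrected_mask, land_mask
  ok_orig = where((land_mask eq 1) and (snow_mask eq 0), n_orig)
  ok_fix  = where((land_mask eq 1) and (corrected_mask eq 0), n_fix)
  print, 'Retrievable land pixels, original snow mask : ', n_orig
  print, 'Retrievable land pixels, corrected snow mask: ', n_fix
end
```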
Summary
• Current tools perform three functions:
  • routine monitoring of the product
  • routine validation with reference data
  • deep-dive validation with reference and intermediate data
• Validation truth data have been identified and processed
• Planned enhancements include:
  • additional statistics
  • automatic detection of problems
GOES-R AWG Product Validation Tool Development Aerosol Detection Product Team Shobha Kondragunta (STAR) and Pubu Ciren (DELL)
Products
[Images: Dust plume over China, 05/10/2011; Smoke over eastern Asia, 04/13/2011]
Product generated using MODIS L1B radiances at 1 km resolution. Accuracy requirements: 80% (dust), 80% (smoke over land), and 70% (smoke over water)
Validation Strategies
• The product is qualitative in nature, so finding “truth data” is not straightforward. The strategy is therefore:
• to use whatever coincident, similar satellite data exist. These are:
  • CALIPSO Vertical Feature Mask (VFM)
  • OMI (OMPS in the future) Aerosol Index product (positive for dust and smoke)
• to infer the presence of smoke and dust in the atmosphere from in situ observations and use those data. These are:
  • AERONET Ångström Exponent (itself an unvalidated product)
  • IMPROVE speciated aerosol measurements (daily average)
  • Visual inspection of GOES-R ABI RGB imagery (same temporal scale as dust/smoke detection); for internal consistency checks when diagnosing issues and may not be released to the public
  • SEARCH speciated aerosol measurements (hourly)
  • Hazard Mapping System (HMS) fire and smoke analysis
  • Field campaign data containing visual reports of dust and smoke, where they exist
• to compare with the aerosol model (type) identified by the GOES-R ABI Suspended Matter/AOD algorithm, assigning a quality flag that indicates any mismatch; for internal consistency checks when diagnosing issues and may not be released to the public
Routine Validation Tools • ABI aerosol detection algorithm run in near real time using MODIS L1B data as proxy: • Global (2-week delay) • CONUS (1-day delay) • Product and quality flags displayed http://www.orbit2.nesdis.noaa.gov/smcd/spb/pubu/validation_new/adp_modis_db.php
Routine Validation Tools
• Product validation: using the CALIPSO Vertical Feature Mask (VFM) as truth data (retrospective analysis, not near real time; data downloaded from NASA/LaRC)
• Tools (IDL):
  • Generate the match-up dataset between ADP and VFM along the CALIPSO track, spatially (5 km by 5 km) and temporally (coincident)
  • Visualize the vertical distribution of VFM and the horizontal distribution of both ADP and VFM
  • Generate the statistics matrix (a sketch of typical categorical statistics follows below)
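As one hedged example of what such a statistics matrix can contain, the IDL sketch below computes probability of detection, false-alarm ratio, and overall accuracy (the quantity the 80%/70% requirements refer to) from collocated ADP and VFM dust/smoke flags. The 0/1 flag convention and the input names are assumptions.

```idl
; Sketch: 2x2 contingency statistics for the ADP dust/smoke flag against
; collocated CALIPSO VFM truth. The 0/1 flag convention and the input names
; (adp_flag, vfm_flag) are illustrative assumptions.
pro adp_vfm_stats, adp_flag, vfm_flag
  hits    = total((adp_flag eq 1) and (vfm_flag eq 1))
  misses  = total((adp_flag eq 0) and (vfm_flag eq 1))
  falses  = total((adp_flag eq 1) and (vfm_flag eq 0))
  correct = total((adp_flag eq 0) and (vfm_flag eq 0))

  pod      = hits / (hits + misses)          ; probability of detection
  far      = falses / (hits + falses)        ; false-alarm ratio
  accuracy = (hits + correct) / n_elements(adp_flag)

  print, 'POD: ', pod, '  FAR: ', far, '  Accuracy: ', accuracy
end
```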
“Deep-Dive” Validation Tools
[Time series plot: percentage of pixels (%) versus day of the year (2011)]
Summary
• Routine validation tools have been developed to validate the ABI aerosol detection product derived from MODIS Level-1B data:
  • Matchup tools for comparisons to AERONET, CALIPSO, and IMPROVE have been developed
  • Matchup tools for comparisons to OMI, SEARCH, and HMS datasets are in the R&D phase
• A web-based monitoring tool has been developed to identify issues with the algorithm:
  • A change in product performance, indicated by a low probability of detection or low confidence in the retrieval, can trigger a deep-dive analysis to examine:
    • Accuracy of cloud detection
    • Accuracy of the snow/ice mask
    • Etc.
GOES-R AWG Product Validation Tool Development Total Column Ozone Chris Schmidt (UW-Madison)
Ozone Requirements * The 10 km resolution is a legacy of the cancelled GOES-R sounder.
Validation Approach
There are three potential sources of validation data
Satellite collocations of ozone values
• SEVIRI proxy data allow validation against polar-orbiting, UV-based ozone retrievals from the Ozone Monitoring Instrument (OMI)
• OMI footprints are estimated by building rectangles around the OMI pixel locations in OMI space
• SEVIRI pixels within the footprints are averaged (a sketch of this step follows this slide)
• The method is applicable to ABI and future polar-orbiting ozone instruments
• OMI is extensively validated against ground-based Dobson spectrophotometers, with an RMSE of less than 2%
Ground-based detection
• Dobson spectrophotometers provide total column ozone at certain sites around the world, typically at solar noon
• The global network is shrinking
Model-generated proxy data
• Difficult to generate; numerical weather prediction models typically use climatological ozone
• Some model-generated proxy data are now available and will be tested
Approach: Focus on satellite collocations as the primary validation source.
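A minimal IDL sketch of the footprint averaging step, simplified to a latitude/longitude box around each OMI pixel centre rather than a true rectangle built in OMI scan geometry; the input names and half-width parameters are assumptions.

```idl
; Sketch of the footprint averaging described above, simplified to a
; lat/lon box around an OMI pixel centre. Input names and the half-width
; parameters are illustrative assumptions, not the operational method.
function seviri_in_omi_footprint, sev_lat, sev_lon, sev_tco, $
                                  omi_lat, omi_lon, half_lat, half_lon
  idx = where(abs(sev_lat - omi_lat) le half_lat and $
              abs(sev_lon - omi_lon) le half_lon and $
              finite(sev_tco), n)
  if n eq 0 then return, !values.f_nan
  return, mean(sev_tco[idx])   ; mean SEVIRI TCO within the OMI footprint
end
```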
Validation Approach
Long-term Validation Plan: Validate against UV-based satellites and the ground-based network as available
[Maps: August 2006 average total column ozone from Met-8 and OMI; colour scale 200 to 500 Dobson Units]
Desert surfaces pose a challenge for the regression
Ozone Validation Against Requirements
The total column ozone requirement is that precision be within 25 DU and accuracy within 15 DU. The regression is trained for each satellite; unaccounted-for biases in brightness temperatures (BTs) will lead to differences in performance between satellites.
For the Met-8 dates (August 2006, 1-14 Feb 2007, 1-10 April 2007) in the standard 10-week proxy dataset, the overall precision was 14.8 DU and the accuracy was 3.3 DU. For the Met-9 dates (11-13 April 2007, 1-13 October 2007) the precision was 18.6 DU and the accuracy was 8.1 DU. There appears to be a bias in the BTs that is not accounted for in the regression training.
The cloud mask is from the AWG Clouds team.
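For reference, the sketch below computes the two quoted metrics from collocated retrieved and reference total column ozone, assuming the common convention of accuracy as the mean bias and precision as the standard deviation of the differences; the input names are illustrative.

```idl
; Sketch: accuracy (mean bias) and precision (standard deviation of the
; differences) for retrieved vs. reference total column ozone, in DU.
; The convention and input names (retrieved, reference) are assumptions.
pro ozone_accuracy_precision, retrieved, reference
  diff      = retrieved - reference
  accuracy  = mean(diff)     ; mean bias, compared with the 15-DU requirement
  precision = stddev(diff)   ; scatter, compared with the 25-DU requirement
  print, 'Accuracy  (DU): ', accuracy
  print, 'Precision (DU): ', precision
end
```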
Ground Station Validation
Total column ozone from Dobson instruments at ground stations, primarily in Europe but with some in Africa, compared to SEVIRI TCO for August 2006, 1-14 Feb 2007, 1-13 April 2007, and 1-13 October 2007. Overall precision was 23.7 DU and accuracy was 2.1 DU.