
Development of a 103-Year High-Resolution Climate Data Set for the Conterminous United States


Presentation Transcript


  1. Development of a 103-Year High-Resolution Climate Data Set for the Conterminous United States

  Wayne Gibson (1), Christopher Daly (1), Tim Kittel (2), Doug Nychka (2), Craig Johns (2), Nan Rosenbloom (2), Alan McNab (3), and George Taylor (1)

  (1) Spatial Climate Analysis Service, Oregon State University, Corvallis, OR 97331, USA
  (2) National Center for Atmospheric Research, Boulder, CO 80307, USA
  (3) National Climatic Data Center, Asheville, NC 28801, USA

  2. Introduction
  • Why is the data set so useful?
    • Unique: complete in space and time over a long period (conterminous US, 103 years)
    • High resolution (4 km)
    • Spatial QC of the station data prior to modeling
    • Many applications need this type of data
  • Methodology used to create the grids
    • Statistical infilling of incomplete station data (NCAR)
    • PRISM model used to spatially map the station data
  • PRISM products
    • Official USDA 1961-1990 climate normals for the US
    • New NCDC Climate Atlas of the US (48 parameters)
    • Canada, China, European Alps, Pacific Islands, Puerto Rico

  3. Project Overview
  • Main objectives
    • Create serially complete, high-quality, topographically sensitive, high-resolution grids for the conterminous United States (precipitation, Tmin, and Tmax)
    • Create a serially complete, infilled station data set
  • Progression
    • Year 1: Preliminary precipitation grids created for 1948-1993
    • Year 2: Development of a semi-automated quality control (QC) system (ASSAY QC, based on PRISM)
    • Year 3: Development of a more robust methodology for station data infilling (National Center for Atmospheric Research)
    • Year 4: Creation of final grids for the period 1895-1997

  4. Collection of Station Data
  • HCN: Historical Climatology Network (1895-1997)
  • COOP: National Weather Service Cooperative Network (1895-1997)
  • MCC: COOP data from the Midwestern Climate Center (1895-1947)
  • SNOTEL: SNOwpack TELemetry network, Natural Resources Conservation Service (1978-1997)
  • AG: Agricultural climate data (1961-1993)
  • MISC: Miscellaneous data (storage gauges, snow courses)

  [Figure: Inconsistencies between station data networks]

  5. Observation Networks over Time

  6. Observation Networks vs Elevation

  7. Data QC: Station Metadata Checks
  • Elevation checked using Geographical Information Systems (GIS)
  • Latitude, longitude, and elevation: each station's metadata analyzed for changes over time (see the sketch below)
  • Total of 100 metadata errors found
    • Horizontal position errors as large as 2 degrees
    • Elevation errors as large as 1200 m
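
  As a rough illustration of the metadata-change check, here is a minimal Python sketch. The field names and tolerances are illustrative assumptions, not the project's actual code or thresholds; a full implementation would also compare reported elevations against a DEM.

      # Hypothetical sketch of a station metadata consistency check.
      # Field names and tolerances are assumptions for illustration only.

      MAX_HORIZ_JUMP_DEG = 0.1   # assumed tolerance for lat/lon changes between records
      MAX_ELEV_JUMP_M = 100.0    # assumed tolerance for elevation changes

      def suspect_metadata_records(history):
          """history: time-ordered list of dicts {'lat', 'lon', 'elev_m'} for one
          station. Returns indices of records that jump suspiciously far from
          the previous record and so warrant a manual look."""
          suspect = []
          for i in range(1, len(history)):
              prev, curr = history[i - 1], history[i]
              if (abs(curr['lat'] - prev['lat']) > MAX_HORIZ_JUMP_DEG
                      or abs(curr['lon'] - prev['lon']) > MAX_HORIZ_JUMP_DEG
                      or abs(curr['elev_m'] - prev['elev_m']) > MAX_ELEV_JUMP_M):
                  suspect.append(i)
          return suspect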

  8. Data QC: PRISM-Based QC System (ASSAY QC)
  • What is "bad" data?
    • Data with transcription errors
    • We are not attempting to identify errors such as gauge undercatch, changes in observation methods, or instrumentation changes
  • ASSAY QC: automated method (see the sketch after this list)
    • Jackknifed prediction: compare the predicted value to the observed value
    • Tag large differences as "candidate" outliers
    • Detection of outliers evaluated against actual station data (monthly and daily observations)
  • Post-processing
    • Additional checks applied to "candidates" based on the closest/highest-weighted station: distance, elevation, precipitation amount
    • Large outliers in the observations
    • Manual checks
    • Observations confirmed as "bad" are marked as missing
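
  A minimal sketch of the jackknife screen, with a generic spatial predictor standing in for PRISM: "predict_without", the threshold, and the data layout are all illustrative assumptions, not the ASSAY QC interface.

      # Sketch of a jackknifed outlier screen. `predict_without` is a placeholder
      # for a model run (PRISM in the project) that re-estimates a station's value
      # from its neighbors with the station itself withheld.

      def flag_candidate_outliers(stations, observations, predict_without, threshold):
          """Return station IDs whose observed value differs from the jackknifed
          prediction by more than `threshold`; these are only "candidates" and
          still go through the post-processing and manual checks above."""
          candidates = []
          for sid in stations:
              predicted = predict_without(sid, stations, observations)
              if abs(observations[sid] - predicted) > threshold:
                  candidates.append(sid)
          return candidates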

  9. QC Results: Precipitation
  • 2371 monthly data errors out of 6,345,675 station-months, a detection rate of 0.0374%
  • That is about 2 errors per monthly grid, which is not insignificant
  • Keep in mind that errors propagate in space (over a radius of 50 km or more)
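
  The per-grid figure follows from the record length: 1895-1997 is 103 years, i.e. 103 x 12 = 1236 monthly grids, and 2371 / 1236 is about 1.9 errors per grid. A quick check of both reported rates:

      # Worked check of the rates quoted on the slide.
      errors = 2371
      station_months = 6_345_675
      monthly_grids = 103 * 12                 # 1895-1997 inclusive = 103 years

      print(errors / station_months * 100)     # ~0.0374 (% detection rate)
      print(errors / monthly_grids)            # ~1.9 errors per monthly grid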

  10. Example Outlier

  11. Issues
  • Inconsistencies among observation networks
    • SNOTEL vs. COOP (precipitation)
    • HCN vs. COOP (adjusted vs. raw data)
  • Station data infilling errors
    • Climatologically aided interpolation (climate as predictor; see the sketch below)
  • ASSAY QC improvements
    • Easily detects outliers
    • Can be run iteratively
  • Independent evaluation
    • Long-term runoff
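
  A minimal sketch of climatologically aided interpolation as commonly practiced: interpolate station anomalies relative to a long-term climatology, then restore the climatology on the grid. The function names, the ratio-anomaly choice, and the NumPy data layout are assumptions, not the project's actual code.

      import numpy as np

      def cai_precip_grid(obs, clim_at_stations, clim_grid, interpolate):
          """obs and clim_at_stations map station_id -> value; clim_grid is a
          2-D NumPy array of the gridded long-term climatology; `interpolate`
          is any spatial interpolator (PRISM in the project) returning an
          array of the same shape as clim_grid."""
          # Ratio anomalies are the usual choice for precipitation
          # (differences would be used for temperature).
          anomalies = {sid: obs[sid] / clim_at_stations[sid] for sid in obs}
          anomaly_grid = interpolate(anomalies)   # grid of ratio anomalies
          return anomaly_grid * clim_grid         # restore the climatology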

  12. Fullerton: Location to Other Stations

  13. HCN vs COOP: Inconsistencies

  14. HCN vs COOP: Inconsistencies

  15. Summary
  • Important data set: high quality, high resolution, and long duration
  • Can be used to support a variety of research topics in many disciplines
  • Precipitation, Tmin, and Tmax grids available Summer 2002:
    • ftp://ftp.ncdc.noaa.gov/pub/data/prism100
    • http://www.ocs.oregonstate.edu/prism
