QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS
WMO TECHNICAL CONFERENCE ON METEOROLOGICAL AND ENVIRONMENTAL INSTRUMENTS AND METHODS OF OBSERVATION (TECO-2005)
C. Bruce Baker, NOAA, USA
The Backbone: QUALITY MANAGEMENT, CALIBRATION, TESTING AND COMPARISON OF INSTRUMENTS AND OBSERVING SYSTEMS (note the stability)
Functions of an International/National Backbone
• Infrastructure in place for quality measurements
• Collects open-access data and provides consistent quality assurance and control
• Distributes data and information (via multiple paths) in real time (varies with parameter) and ensures archival
• Abides by national/international standards and fosters the implementation of standards by local and regional observing systems
Key Components
• Management of Network Change
• Parallel Testing
• Metadata
• Data Quality and Continuity
• Integrated Environmental Assessment
• Complementary Data
• Continuity of Purpose
• Data and Metadata Access
VOCABULARY
• MANAGEMENT – documentation, performance measures, and requirements
• PROGRAM POLICY – determined by international or national policy and science-driven directives
• QUALITY MANAGEMENT SYSTEM – personnel, hardware, ingest, and dissemination
• QUALITY MANUAL – requirements documents
• QUALITY CONTROL – automated, manual, maintenance
• QUALITY ASSURANCE – documented metadata, performance measures
• RESEARCH – testing, intercomparisons, transfer functions, overlapping measurements
• IMPLEMENTATION – program infrastructure
Functional Requirements
• Systems – parameters, ranges, accuracies, resolutions, expandability, design life, maintainability
• Program – number of systems, cost and schedule targets, communications
• Commissioning – defines the decision point at which data become official
• Sustained operation – data received from each site within one hour 95% of the time, and/or successful entry into the archives within 30 days
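To make the sustained-operation criterion concrete, here is a minimal sketch in Python; the function name, the thresholds passed as arguments, and the idea of computing availability from per-observation receipt delays are illustrative assumptions, not part of the USCRN software.

```python
from datetime import timedelta

# Hypothetical sketch of the commissioning availability check described
# above: data from each site must arrive within one hour at least 95% of
# the time. `receipt_delays` would come from the ingest system's records.

def meets_availability_target(receipt_delays, threshold=timedelta(hours=1),
                              required_fraction=0.95):
    """Return True if enough observations arrived within the threshold.

    receipt_delays -- iterable of timedelta objects, one per expected
                      observation (time between valid time and receipt).
    """
    delays = list(receipt_delays)
    if not delays:
        return False
    on_time = sum(1 for d in delays if d <= threshold)
    return on_time / len(delays) >= required_fraction


# Example: 97 of 100 hourly observations arrived within the hour.
delays = [timedelta(minutes=20)] * 97 + [timedelta(hours=3)] * 3
print(meets_availability_target(delays))  # True (0.97 >= 0.95)
```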
Configuration Management
• Change management of hardware and software items; metadata management
• Responsibilities and procedures for the Configuration Control Board (CCB)
Test and Evaluation Phase
• Conducted by the Evaluation Team
• Reviewed by the Ad Hoc Science Working Group
• Six areas evaluated:
  • Site Selection
  • Site Installation
  • Field Equipment and Sensors
  • Communications
  • Data Processing and Quality Control
  • Maintenance
5 Components of Data Quality Assurance (QA)
• Laboratory Calibration
• Routine Maintenance and In-Field Comparisons
• Automated Quality Assurance
• Manual Quality Assurance
• Metadata, Metadata, Metadata
• Plus: the ability to integrate new technology
Laboratory Calibration
• Every sensor is calibrated before deployment to verify, and where possible improve upon, the manufacturer's specifications
• Sensors are routinely rotated from the field back into the lab for recalibration
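As an illustration of how a lab-derived calibration might be applied during processing, here is a minimal sketch; the linear slope/offset model, the serial numbers, and the coefficient values are hypothetical, not USCRN's actual calibration records.

```python
# Hypothetical sketch: applying a lab-derived linear calibration to raw
# sensor readings. The slope/offset values are illustrative only; in
# practice they would come from the laboratory calibration records
# keyed to each sensor's serial number.

CALIBRATIONS = {
    # serial number: (slope, offset) from the most recent lab calibration
    "PRT-001234": (1.0021, -0.05),
    "PRT-005678": (0.9987, +0.02),
}

def calibrate(serial, raw_value):
    """Convert a raw reading into a calibrated value for one sensor."""
    slope, offset = CALIBRATIONS[serial]
    return slope * raw_value + offset

print(calibrate("PRT-001234", 21.40))  # ~21.39 degC after correction
```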
Routine Maintenance and In-Field Comparisons
• Site Maintenance Passes
  • Three visits scheduled annually
• Trouble Ticket or Emergency Repairs
  • Malfunctioning sensor
  • Lightning strike
  • Communication problems
  • Theft and vandalism
Site Maintenance Passes: Sensor Inspection
Air temperature and humidity sensors are inspected for dust accumulation, spider webs, and wasp nests; the radiation shields of these sensors are also cleaned.
Trouble Ticket or Emergency Repairs
• Trouble Tickets are issued by the Data QA Manager
• Priorities range from 2 to 30 business days, depending on the sensor
• The QA Manager provides a description of the problem
• Technicians complete the form with the time of the fix, the serial numbers of the sensors, and a description of the repairs made
• Technicians may also generate tickets in the field and submit them to the QA Manager
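As a sketch of the ticket fields described above, the record below carries the QA Manager's problem description, the priority window, and the technician's completion details; the field names and types are assumptions, not the operational schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Sketch of a trouble-ticket record with the fields described on the
# slide. Field names and the priority representation are assumptions.

@dataclass
class TroubleTicket:
    station_id: str
    problem_description: str          # written by the Data QA Manager
    priority_business_days: int       # 2-30, depending on the sensor
    sensor_serials: list = field(default_factory=list)
    repair_description: Optional[str] = None   # completed by technician
    fixed_on: Optional[date] = None

ticket = TroubleTicket(
    station_id="USCRN-EXAMPLE",
    problem_description="Precipitation gauge reporting constant zero",
    priority_business_days=2,
)
ticket.sensor_serials.append("GAUGE-4321")
ticket.repair_description = "Replaced load cell; verified with test weights"
ticket.fixed_on = date(2005, 4, 12)
print(ticket)
```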
Quality Assurance of Instruments
• Documented in the Anomaly Tracking System Users Manual
• Reports of incidents collected and evaluated; maintenance performed as needed
• Metadata records updated
Quality Control of Data
• Documented in Data Management – Ingest to Access
• Data ingest – tests for proper message form, communication errors, etc.
• Automated (see the sketch below)
  • Limits – gross limits check
  • Variance – limits for individual parameters
  • Redundancy – data intercomparison relying on multiple sensors
• Manual – Handbook of Manual Monitoring
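Here is a minimal sketch of the three automated checks named above (limits, variance, redundancy); all thresholds are illustrative assumptions, not USCRN's operational limits.

```python
import statistics

# Illustrative sketch of the three automated checks. All thresholds are
# assumptions for demonstration, not USCRN's operational values.

def gross_limits_check(value, lo=-60.0, hi=60.0):
    """Limits: flag values outside a physically plausible range (degC)."""
    return lo <= value <= hi

def variance_check(recent_values, min_std=0.01):
    """Variance: flag a 'stuck' sensor whose readings barely change."""
    return statistics.pstdev(recent_values) >= min_std

def redundancy_check(sensor_values, max_spread=0.3):
    """Redundancy: flag disagreement among co-located sensors (degC)."""
    return max(sensor_values) - min(sensor_values) <= max_spread

triplet = [21.4, 21.5, 23.9]          # three redundant thermometers
print(gross_limits_check(21.4))        # True  -- within gross limits
print(variance_check([21.4] * 12))     # False -- sensor looks stuck
print(redundancy_check(triplet))       # False -- one sensor disagrees
```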
[Data flow diagram: field sites (instrument suite, processing unit, communications device) → communications network → ingest → quality control → processing → raw-data and flagged-data archives → Internet access for the user community, with online/offline maintenance notification to the maintenance provider]
Performance Measures: 114 CONUS Geographic Locations Required
• Captures 98% of the variance in monthly temperature and 95% in annual precipitation for the CONUS
• Average annual error <0.1 °C for temperature and <1.5% for precipitation
• Trend "errors" <0.05 °C per decade
• For context, the IPCC projects warming of 0.1–0.3 °C/decade and precipitation changes of 0–2%/decade for the CONUS
Determine the Actual Long-Term Changes in Temperature and Precipitation of the Contiguous U.S. (CONUS)
FY2005 target: capture more than 96.9% of the temperature trends and 91.1% of the precipitation trends.
[Photo: air temperature and RH monitoring intercomparison at the High Plains Regional Climate Center (Lincoln); labeled shields and sensors include DewTrack, MET2010 standard, RMY, USCRN shield, PMT, new ASOS standard, HMP243, ASOS, MMTS, CRS, and Gill]
Cross-Network Transfer Functions
[Map: Cooperative Observer Network (~10,000 stations)]
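As a minimal sketch of how a cross-network transfer function might be estimated from co-located observations by ordinary least squares; the paired values are illustrative placeholders, and the linear form is an assumption for demonstration.

```python
import statistics

# Sketch: estimate a simple linear transfer function mapping a COOP-style
# observation onto the co-located USCRN reference, via ordinary least
# squares. The paired temperature values below are placeholders.

coop  = [10.2, 15.1, 20.3, 25.4, 30.2]   # legacy-network temperatures
uscrn = [10.0, 14.9, 20.0, 25.0, 29.8]   # co-located reference values

mx, my = statistics.fmean(coop), statistics.fmean(uscrn)
slope = (sum((x - mx) * (y - my) for x, y in zip(coop, uscrn))
         / sum((x - mx) ** 2 for x in coop))
intercept = my - slope * mx

def transfer(coop_value):
    """Adjust a legacy observation onto the reference scale."""
    return slope * coop_value + intercept

print(f"slope={slope:.4f}, intercept={intercept:.3f}")
print(transfer(18.0))  # legacy reading mapped to the reference scale
```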
Planned USCRN Stations at the End of 2008 (114* stations)
[Map legend: installed paired locations; installed single locations]
* Does not include Alaska, Canada, Hawaii, and GCOS stations. As of April 26, 2005.
Siting Standards Documents (Representativeness)
• Network Plan
• Site Acquisition Plan
• Site Information Handbook
• Site Survey Plan
• Site Survey Handbook
• Site Survey Checklist
• Site Acquisition Checklist
Major Principles of Station Siting
• Site is representative of the climate of the region
• Minimal microclimatic influences
• Long-term (50–100 year) land tenure
• Minimal prospects for human development
• Avoids agriculture, major water bodies, major forested areas, and basin terrain
• Accessible for calibration and maintenance
• Stable host agency or organization
• Follows WMO climate station siting guidelines
Objective Site Scoring
• An objective scoring sheet was developed based on the Leroy method; the score for a station becomes part of that station's metadata
• Stations are re-scored as part of the annual maintenance visit, allowing changes over time in the representativeness of a station's siting to be tracked
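Here is a minimal sketch of an objective scoring sheet in the spirit of the Leroy-based method; the categories, weights, and ratings are illustrative assumptions, not the actual USCRN scoring sheet.

```python
# Hypothetical sketch of an objective site-scoring sheet. Categories,
# weights, and ratings are illustrative assumptions only.

WEIGHTS = {
    "distance_to_obstacles": 0.30,
    "distance_to_heat_sources": 0.30,
    "surface_cover": 0.25,
    "terrain_slope": 0.15,
}

def site_score(ratings):
    """Combine per-category ratings (0-100) into one weighted score."""
    return sum(WEIGHTS[cat] * ratings[cat] for cat in WEIGHTS)

# Re-scored at each annual maintenance visit; storing the result with a
# date lets the network track drift in site representativeness.
ratings_2005 = {
    "distance_to_obstacles": 90,
    "distance_to_heat_sources": 85,
    "surface_cover": 95,
    "terrain_slope": 100,
}
print(site_score(ratings_2005))  # 91.25
```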
International Cooperation, Collaboration and Partnerships
• A U.S. representative serves on the Canadian National Monitoring Change Management Board
• The Canadian Reference Climate Network program participates on the USCRN Science Review Panel
• USCRN hardware architecture has been incorporated into the Canadian Climate Monitoring Network
• The two nations will exchange and co-locate reference climate stations (FY04)
A first step in international cooperation: establishing commonality among the surface observing systems used to monitor climate change.
QUESTIONS
• How do we continue to expand international and national partnerships?
• What is the best way to exchange information?
• How do we glue the system of systems together?
E-mail: Bruce.Baker@noaa.gov
URL: http://www.ncdc.noaa.gov/oa/climate/uscrn/index.html
Network Characteristics
• Benchmark network for temperature and precipitation
• Anchor points for the USHCN and the full COOP network
• Long-term stability of observing sites (50+ years), likely to be free from human encroachment
• Sensors calibrated to traceable standards
• Planned redundancy of sensors and selected stations
• Network performance monitoring – hourly and daily
• Strong science and research component