Quality Assurance of Upper Air Meteorological Data
Robert A. Baxter, CCM
Parsons Engineering Science
Pasadena, CA
Upper Air QA Overview
• Field program QA
  - Scope of QA during the data collection effort
  - Implementation of the field QA program
  - Overall results of the field QA effort
• Data validation QA
  - Unresolved issues or unprocessed data
  - Post-processing algorithm analysis associated with data validation
• Lessons learned from the overall program
Field Program Quality Assurance
• Review candidate monitoring sites and aid in the site selection process
• Perform system and performance audits early in the program so potential problems can be identified and corrected quickly
• Assess the accuracy of the data collected
Field Program Quality Assurance: Overall scope
• Candidate site reviews (16 sites)
• System audits (26 stations)
• Performance audits
  - 25 surface meteorological stations
  - 4 sodars
  - 10 radar profiler/RASS systems
• Assess overall data quality from surface and upper air measurements
Field Program Quality Assurance: Equipment and variables audited
• Radar wind profilers with RASS
  - NOAA/ETL 915 MHz three-axis
  - Radian 915 MHz phased array
• Sodars
  - NOAA/ETL two-axis
  - Radian phased array
  - AeroVironment three-axis
• Surface meteorology (WS, WD, T, RH)
Audit Activities
• Candidate site reviews
• System audits
• Performance audits
Audit Activities: Candidate site reviews
• Exposure for measurements
• Noise sources
  - RF analysis
  - AF analysis
• Power, security, and communications
• Compatibility with neighbors
• Suitability for measurements
• Suitability for audit instrumentation
• Assessment of appropriate beam directions
Audit Activities: System audits
• System audit checklist
  - Observables, equipment, exposure, operations
  - Procedures, training, data chain of custody
  - Preventive maintenance
• Site vista evaluation
  - Orientation, level
  - Picture documentation
• Operating environment
  - Background noise
  - Potential interfering sources
Audit Activities: Performance audits (surface)
• Wind speed
  - Response
  - Starting threshold
• Wind direction
  - Alignment to true north
  - Response
  - Starting threshold
• Temperature
• Relative humidity
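To make the surface audit comparison concrete, here is a minimal Python sketch of a tolerance check against a collocated audit reference. The tolerance values and the function name audit_difference are illustrative assumptions, not the study's actual acceptance criteria.

```python
# Minimal sketch of a surface performance-audit check; the tolerances below
# are assumed example values, not the study's actual specifications.

AUDIT_TOLERANCES = {
    "wind_speed_ms": 0.25,
    "wind_direction_deg": 5.0,
    "temperature_c": 0.5,
    "relative_humidity_pct": 7.0,
}

def audit_difference(variable, station_value, reference_value):
    """Compare a station reading against the audit reference standard."""
    diff = station_value - reference_value
    if variable == "wind_direction_deg":
        # Wrap direction differences into the -180..180 degree range.
        diff = (diff + 180.0) % 360.0 - 180.0
    passed = abs(diff) <= AUDIT_TOLERANCES[variable]
    return diff, passed

# Example: vane alignment check against a true-north reference sighting.
diff, ok = audit_difference("wind_direction_deg", station_value=187.0,
                            reference_value=180.0)
print(f"direction offset {diff:+.1f} deg -> {'pass' if ok else 'fail'}")
```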
Audit Activities: Performance audits (upper air)
• Radar wind profiler (10 sites + ARB)
  - Portable sodar
  - Rawinsonde
• RASS (10 sites + ARB)
  - Rawinsonde
• Sodar (4 sites)
  - Simulated winds using APT
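For the upper air comparisons, the reference (rawinsonde or portable sodar) rarely reports at exactly the profiler's range-gate heights, so the profiles must first be matched in height. The sketch below assumes simple linear interpolation of a scalar variable such as wind speed; the study's actual matching method may differ.

```python
# Minimal sketch of matching a rawinsonde profile to a profiler range gate
# before an audit comparison; linear interpolation is an assumption here.
def interp_to_gate(gate_height_m, sonde_heights_m, sonde_values):
    """Linearly interpolate a rawinsonde profile to one profiler gate height."""
    for (h0, v0), (h1, v1) in zip(zip(sonde_heights_m, sonde_values),
                                  zip(sonde_heights_m[1:], sonde_values[1:])):
        if h0 <= gate_height_m <= h1:
            w = (gate_height_m - h0) / (h1 - h0)
            return v0 + w * (v1 - v0)
    return None  # gate outside the sounding's height range

# Example: wind speed (m/s) at a 500 m gate from hypothetical sonde levels.
print(interp_to_gate(500.0, [100.0, 400.0, 700.0], [3.0, 5.0, 6.5]))
```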
Field QA Data Analysis and Reporting
• In-field evaluation
  - Preliminary surface and upper air results
  - System audit review
  - Same-day reaudits for any correctable deficiencies
• Audit summary findings
  - Provided by e-mail to management in ~48 hours
  - Overall results with needed actions identified
• Completed audit summaries
  - Provided by e-mail to management in ~2 weeks
  - Detailed system and performance audit reports
• Audit follow-up
Field QA Overall Results
• Site operational differences between contractors
• Systematic problems with equipment alignment
• Equipment orientation errors in the data
• Differences in data QC and validation procedures between reporting groups
Data Validation QA
• Why are we going through these steps?
• What is the role of QA in the validation phase?
Data Validation QA: Why are we going through these steps?
• Variety of data formats and reporting conventions
• Questions about the post-processing algorithms
• Incorporation of additional available data
• Completion of the processing steps
Data Validation QA: What is the role of QA in the validation phase?
• Identification of system offsets
• Evaluation of post-processing algorithms
• Sodar data evaluation and validation
• Data quality descriptors
Data Validation QA: Identification of system offsets
• Audit report and data header information
• Antenna orientation
• Surface vane orientation
• Time zone differences
• Reporting interval differences
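A minimal sketch of how the orientation and time-zone offsets listed above might be corrected during validation follows; the orientation offset, its sign convention, and the PST-to-UTC example are assumptions for illustration, with the real values coming from the audit reports and data headers.

```python
# Minimal sketch of applying documented system-offset corrections; the
# example offset (6 degrees) and the PST time base are hypothetical.
from datetime import datetime, timedelta

def correct_wind_direction(direction_deg, orientation_offset_deg):
    """Apply the documented antenna/vane orientation offset, wrapping 0-360."""
    return (direction_deg + orientation_offset_deg) % 360.0

def to_utc(local_time, utc_offset_hours):
    """Shift a locally reported timestamp to UTC."""
    return local_time - timedelta(hours=utc_offset_hours)

# Example: antenna found 6 degrees clockwise of true north, data logged in PST.
print(correct_wind_direction(273.0, -6.0))        # corrected direction
print(to_utc(datetime(2000, 7, 15, 13, 0), -8))   # 13:00 PST -> 21:00 UTC
```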
Data Validation QA: Evaluation of post-processing algorithms
• Goal is to determine the most appropriate methods to process and validate the data
• Regional site classification
  - Coastal and offshore
  - Inland
  - Desert
• Data set comparisons
  - _0, _1, CNS, Sonde
• Comparison statistics
  - RMS and systematic differences
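The comparison statistics named above reduce to a mean (systematic) difference and a root-mean-square difference over paired samples. A minimal sketch follows; the paired wind-speed values are made up for illustration.

```python
# Minimal sketch of the RMS and systematic (mean) difference statistics for
# paired profiler vs. rawinsonde samples; the sample values are hypothetical.
import math

def comparison_stats(test_values, reference_values):
    """Return (systematic difference, RMS difference) for paired samples."""
    diffs = [t - r for t, r in zip(test_values, reference_values)]
    systematic = sum(diffs) / len(diffs)                    # mean (bias)
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return systematic, rms

profiler_ws   = [4.2, 5.1, 6.8, 7.0, 5.5]   # m/s, hypothetical
rawinsonde_ws = [4.0, 5.4, 6.5, 7.3, 5.2]   # m/s, hypothetical
bias, rms = comparison_stats(profiler_ws, rawinsonde_ws)
print(f"systematic difference {bias:+.2f} m/s, RMS difference {rms:.2f} m/s")
```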
Data Validation QA: Sodar data evaluation and validation
• Review all sodar data (six sites)
• Determine needed post-processing
  - Vertical velocity correction
  - Antenna rotations
  - Algorithm corrections
  - Interference problems (noise, reflections)
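A minimal sketch of the antenna-rotation step follows, assuming the horizontal wind components are rotated by a documented orientation error; the angle, its sign convention, and the function name are illustrative only.

```python
# Minimal sketch of an antenna-rotation correction for sodar u/v wind
# components; the 10-degree example rotation is hypothetical.
import math

def rotate_uv(u, v, rotation_deg):
    """Rotate the horizontal wind components by the documented antenna error."""
    theta = math.radians(rotation_deg)
    u_rot = u * math.cos(theta) - v * math.sin(theta)
    v_rot = u * math.sin(theta) + v * math.cos(theta)
    return u_rot, v_rot

# Example: antenna axes found rotated 10 degrees from the assumed orientation.
print(rotate_uv(3.0, -1.5, 10.0))
```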
Data Validation QA: Data quality descriptors
• Metadata
  - Site-by-site descriptors
  - Data qualifiers (minor offsets, limitations)
  - Pertinent information from audits and validation
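A minimal sketch of what a site-level descriptor record could look like follows; the field names and example entries are assumed for illustration and are not the study's actual metadata format.

```python
# Minimal sketch of a site-level data-quality descriptor record; the schema
# and example entries are assumptions, not the study's metadata format.
from dataclasses import dataclass, field

@dataclass
class SiteDataQualityDescriptor:
    site_id: str
    instrument: str                                   # e.g., "915 MHz radar profiler/RASS"
    qualifiers: list = field(default_factory=list)    # minor offsets, limitations
    audit_notes: list = field(default_factory=list)   # pertinent audit/validation findings

record = SiteDataQualityDescriptor(
    site_id="EXAMPLE_SITE",
    instrument="915 MHz radar profiler/RASS",
    qualifiers=["wind directions corrected for a 6 deg antenna orientation offset"],
    audit_notes=["performance audit vs. rawinsonde within assumed tolerances"],
)
print(record)
```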
Lessons Learned From the QA Program
• On-site review of each and every station
• Consistent procedures implemented by each audit group
• Implementation of, and adherence to, SOPs by all study organizations
• Consistent processing procedures implemented by groups with similar data sets
• Don't shortcut the on-site documentation process in either operations or QA