Regional monitoring programmes
- National programmes
- European Monitoring and Evaluation Programme (EMEP)
- Working Group on Effects (ICP-F, ICP-IM, ICP-V)
- World Meteorological Organisation - Global Atmosphere Watch (WMO-GAW) (6 WDCs)
- Marine Conventions (OSPAR, HELCOM)
- Arctic Monitoring and Assessment Programme (AMAP)
- EEA (EiO, EIONET)
- North American programmes (CAPMoN, NADP/NTN)
- Acid Deposition Monitoring Network in East Asia (EANET)
- Impacts project (China)
- others
EMEP-CCC tasks
- Recommendation of methods, preparation of manuals
- Field and laboratory intercomparisons
- Data and metadata collection, validation and storage
- Evaluation of QA/QC procedures
- Data interpretation, data reports and joint reports
  - comparison with CTM estimates
  - deposition and exposure estimates
- Representativeness
- Development of the network and the measurement programme
All in close cooperation with the parties, the other EMEP Centres, external bodies and the research community.
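As an illustration of the validation and QA/QC tasks listed above, the following is a minimal sketch of the kind of automated check a data centre might run on a submitted measurement series. The function name, thresholds and coverage criterion are hypothetical assumptions, not CCC procedures.

```python
def qc_flag(values, lower, upper, min_coverage=0.75):
    """Flag each value and report whether the series meets coverage.

    Returns (flags, valid): flags[i] is 'ok', 'missing' or 'out_of_range',
    and valid is True when the fraction of 'ok' values reaches min_coverage.
    """
    flags = []
    for v in values:
        if v is None:
            flags.append("missing")          # instrument gap or no report
        elif not (lower <= v <= upper):
            flags.append("out_of_range")     # outside the plausible range
        else:
            flags.append("ok")
    coverage = flags.count("ok") / len(flags) if flags else 0.0
    return flags, coverage >= min_coverage

# Example: concentrations in ug/m3 against a hypothetical valid range
flags, valid = qc_flag([1.2, None, 3.4, -0.5], lower=0.0, upper=500.0)
```

Flagging rather than deleting suspect values keeps the original submission intact for the joint validation step with the data originator.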
Databases handled by NILU
- National programmes (remote, rural, urban, hot-spot)
- EMEP; also TOR-2, PM-Nordic
- OSPAR
- HELCOM
- AMAP
- WDCSO3 (WMO-GAW)
- NADIR
- CALVAL (ESA - ENVISAT)
- Data handling for research projects, e.g. DG-RES
How data flows
Data are reported through many channels:
- by the person doing the measurements (official)
- by independent scientists (informal)
- by or through a national reference institution (official)
- by or through an appointed contact person (NFP) (official, but often not responsible for the actual measurement)
- by or through an official body (non-scientific)
- by exchange between databases, or similar
The route varies with compound, site and network, and with which database the data are reported to. Experience shows that the organisation of data flow within countries is often poor. Joint validation, correction and re-submission are crucial points where good procedures are essential.
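Because the same series can arrive through several of the channels above, a database benefits from recording the channel and re-submission version of every delivery, so the official copy can be selected deterministically. A minimal sketch, assuming a simple record structure and channel labels of my own invention:

```python
from dataclasses import dataclass

# Channels treated as official, following the list above (labels are assumed)
OFFICIAL_CHANNELS = {"measurer", "reference_institution", "nfp", "official_body"}

@dataclass
class Submission:
    site: str
    compound: str
    channel: str   # e.g. "measurer", "independent_scientist", "nfp"
    version: int   # incremented on each correction and re-submission

def pick_official(subs):
    """Prefer official channels; among those, take the latest re-submission."""
    official = [s for s in subs if s.channel in OFFICIAL_CHANNELS]
    pool = official or subs      # fall back to informal data if nothing official
    return max(pool, key=lambda s: s.version)

subs = [
    Submission("NO0001", "SO2", "independent_scientist", 2),
    Submission("NO0001", "SO2", "nfp", 1),
    Submission("NO0001", "SO2", "nfp", 3),
]
best = pick_official(subs)
```

Keeping the full submission history, rather than overwriting on re-submission, also answers the later question of how corrections are handled and by whom.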
Needs attention:
- Multiple storage: inconsistencies
- Joint validation with the data originator: by whom, and how are corrections handled?
- Which are the official data?
- Route of data flow: data suppliers are often not identical
- Formats
- Deadlines
Official data and its quality - a recent example from one site
Difficulties with "mega" databases
- Objectives may partly differ
- Many organisations with different requirements
- Rigid, with low flexibility
- Restrictions on data use, permits, acknowledgements
- Large resources required
- "Ownership" of, and knowledge about, the data may decrease
- Documentation and need for metadata differ
- Quality differs
Key reasons to harmonise
- Methodologies
- Algorithms
- Software
- QA/QC criteria, metadata
- Easy access (use of portals?)
- Capacity building
- Network improvement
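Metadata harmonisation in particular can start as a simple mapping exercise: each network keeps its own field names, and a per-network table translates them to one common vocabulary. A sketch with invented network and field names:

```python
# Per-network mappings from local field names to a common vocabulary.
# The networks ("net_a", "net_b") and all field names are hypothetical.
COMMON_KEYS = {
    "net_a": {"stat_name": "station_name", "lat_deg": "latitude", "lon_deg": "longitude"},
    "net_b": {"site": "station_name", "lat": "latitude", "lon": "longitude"},
}

def harmonise(record, network):
    """Rename a record's keys to the common vocabulary; unknown keys pass through."""
    mapping = COMMON_KEYS[network]
    return {mapping.get(k, k): v for k, v in record.items()}

# Two networks describing the same station in their own formats
a = harmonise({"stat_name": "Birkenes", "lat_deg": 58.4}, "net_a")
b = harmonise({"site": "Birkenes", "lat": 58.4}, "net_b")
```

The appeal of this approach is that it leaves the source networks untouched, preserving their flexibility, while still allowing merged access through a portal.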
Conclusions
- Harmonisation can be beneficial in many ways: cost savings, better utilisation, more data, improved knowledge
- Several steps have already been taken
- Several challenges remain to be resolved:
  - flexibility decreases
  - costs may rather increase (?)
- Is the potential for savings larger by other means? Cost of running sites, harmonisation of networks ("the ICP Forests example")