
CMAS Conference 2011

CMAS Conference 2011. Comparative analysis of CMAQ simulations of a particulate matter episode over Germany. V. Matthias , A. Aulinger, M. Quante, C. Chemel, J. L. Perez, R. San Jose, R. Sokhi. Chapel Hill, October 26, 2011. Case study on PM10 (Feb/March 2003).


Presentation Transcript


  1. CMAS Conference 2011 Comparative analysis of CMAQ simulations of a particulate matter episode over Germany V. Matthias, A. Aulinger, M. Quante, C. Chemel, J. L. Perez, R. San Jose, R. Sokhi Chapel Hill, October 26, 2011

  2. Case study on PM10 (Feb/March 2003): PM10 daily mean concentration (µg/m³) over Germany on March 2, 2003 (Stern et al., Atmos. Environ. 42, 4567-4588, 2008)

  3. COST 728 study • COST 728: • European cooperation, participants from more than 20 nations • "Enhancing Mesoscale Meteorological Modelling Capabilities for Air Pollution and Dispersion Applications"

  4. CMAQ model intercomparison
  Group                    Met-Model   CTM
  IfK, HZG                 MM5         CMAQ
  Uni Hertfordshire (UH)   WRF         CMAQ
  TU Madrid (UPM)          MM5         CMAQ

  5. What are the differences? • Preparation of the emissions • Meteorological fields • Initial and boundary conditions • Grids (horizontal and vertical structure) • Computing platforms • People who run the model

  6. Concept Round 1: • All groups provide input files (IC,BC,EMIS,METEO,GRIDDESC) • All groups use common CMAQ version (4.7) and chemistry mechanism (cb05_ae_aq) • All groups recalculate results of others Expected results: • Determination of simple (model user) errors (switches …) • Quantification of computing errors (compiler, platform, …)
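The expected results of round 1 (user and computing errors) come down to comparing a group's original time series with another group's recalculation of the same run. A minimal Python sketch of such difference statistics; the function name and the data are purely illustrative, not part of the study's actual evaluation code:

```python
import numpy as np

def run_difference_stats(original, recalculated):
    """Bias, RMSE and correlation between two concentration series
    (e.g. hourly sulfate at one station from two recalculated runs)."""
    original = np.asarray(original, dtype=float)
    recalculated = np.asarray(recalculated, dtype=float)
    diff = recalculated - original
    return {
        "bias": float(np.mean(diff)),                  # mean offset
        "rmse": float(np.sqrt(np.mean(diff ** 2))),    # typical magnitude of differences
        "corr": float(np.corrcoef(original, recalculated)[0, 1]),
    }
```

Small recalculation differences would show up here as near-zero bias and RMSE with correlation close to 1, in contrast to the larger spread of the "blind" runs.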

  7. Recalculations: Sulfate (time series at Westerland and Melpitz)

  8. Recalculations: Nitrate (time series at Westerland and Melpitz)

  9. Outcome: Recalculations • CMAQ model results can be reproduced by other groups on different computing platforms • Depending on the species, some differences exist, but they are much smaller than the differences in the "blind" runs

  10. Concept (2) Round 2: • Agree on a common grid • Use the same initial and boundary conditions (IC & BC) Expected results: • Influence of emissions • Influence of meteorological fields

  11. Impact of emissions: reconstruction of the UH run with emission files from UPM (sulfate and nitrate)

  12. Emissions in Central Europe (spatial average): time series of NO, SO2 and NH3 from the CMAQ input files of UH, UPM and HZG

  13. Outcome: Emissions • Emissions may be prepared in ways that differ in their temporal and spatial variation. • For short time series at certain grid points this may lead to significant differences in particle concentrations.
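The point about temporal variation can be illustrated with a toy example: the same annual emission total, distributed with two different hourly profiles, yields very different short-term emissions at a grid point. The profiles below are invented for illustration and are not any group's actual emission processing:

```python
def hourly_emissions(annual_total, hourly_profile):
    """Distribute an annual emission total over the 24 hours of one day.

    hourly_profile: 24 weights summing to 1; the daily share of the
    annual total is taken as a flat 1/365 for simplicity.
    """
    if len(hourly_profile) != 24:
        raise ValueError("profile must have 24 weights")
    if abs(sum(hourly_profile) - 1.0) > 1e-9:
        raise ValueError("profile weights must sum to 1")
    daily_total = annual_total / 365.0
    return [daily_total * w for w in hourly_profile]

# Two hypothetical profiles for the same source category:
flat = [1.0 / 24.0] * 24            # no diurnal variation
peaked = [0.025] * 8 + [0.05] * 16  # invented daytime-heavy profile
```

With identical annual totals, the peaked profile emits twice as much per daytime hour as per nighttime hour, which matters for short episodes even though the totals agree.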

  14. Impact of meteorology: MM5 from UPM, MM5 from HZG and WRF from UH (sulfate and nitrate)

  15. Outcome: Meteorology • Numerous meteorological parameters may influence particle concentrations. • The quality of the CTM results cannot be judged from the quality of the meteorological fields alone.

  16. Open questions • Do we see "typical" differences between "correct" model runs, or were there important errors in the input data? • Sensitivity study: annual runs (year 2000) with CMAQ 4.6 with different • Boundary conditions • Emission files • Meteorological data • Goal: quantify the variability of the hourly and daily concentrations at Melpitz
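The stated goal, quantifying the variability of concentrations at Melpitz across sensitivity runs, could be sketched as follows. The run names and values are invented placeholders, not results from the study:

```python
import statistics

def daily_spread(runs):
    """Given parallel daily concentration series from several sensitivity
    runs, return per-day min, max and population standard deviation
    across the runs."""
    out = []
    for values in zip(*runs.values()):  # one tuple of run values per day
        out.append({
            "min": min(values),
            "max": max(values),
            "std": statistics.pstdev(values),
        })
    return out

# Hypothetical daily means (µg/m³) at one station from three runs that
# differ only in one input (boundary conditions or emissions):
runs = {
    "bc_mozart": [12.0, 15.0, 18.0],
    "bc_tm4":    [10.0, 14.0, 20.0],
    "emis_emep": [11.0, 16.0, 19.0],
}
```

A model run whose values fall well outside this per-day envelope would be a candidate for the "important errors in the input data" case rather than normal inter-model variability.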

  17. Boundary conditions • BC from the global models Mozart and TM4 • Comparison shown for NO3, SO4 and SO2

  18. Emissions • SMOKE-EU emissions and EMEP emissions • Additional comparisons to other emission data sets gave similar results • Comparison shown for NO3, SO4 and SO2

  19. Meteorology • MM5 (FDDA with NCEP) and CCLM (spectral nudging with NCEP) • Hourly values • Comparison shown for NO3, SO4 and SO2

  20. CMAQ intercomparison: different emissions • Nitrate values with the UH emissions are lower than expected. • Comparison shown for SO4 and NO3

  21. CMAQ intercomparison: different meteorological fields • Sulfate values with the UPM and HZG meteorology are within the expected range. • Nitrate values are low, but this may be explained by variability due to the meteorological input. • Comparison shown for NO3 and SO4

  22. Summary • The CMAQ intercomparison within COST 728 showed: • Simulations are reproducible by other groups on other computing platforms • Emission data may be prepared in very different ways • The largest influence on the simulation results comes from the meteorology • Unreliable results may be detected by comparison to sensitivity runs • Acknowledgements: Emission data were prepared by Johannes Bieser. Most CMAQ sensitivity runs were set up by Johannes Bieser. Total gridded emissions were provided by TNO, IER and EMEP. Boundary conditions were provided by the RETRO project (TM4) and Ulrike Niemeier (Mozart).

  23. Thank you
