
MIPAS Validation Summary: Radiometric Accuracy and Geophysical Validation

Summary of the plenary session on MIPAS validation, covering radiometric accuracy and noise performance, temperature and ozone comparisons with correlative measurements and ECMWF analyses, cloud contamination effects, lessons learned, and recommendations for algorithm updates.


Presentation Transcript


  1. MIPAS Validation Summary - Plenary Session - Herbert Nett ESTEC / EEM-PPP

  2. Measured NESR0

  3. Radiometric Accuracy

     Wavenumber range [cm-1]      Accuracy requirement                         In-flight validation
     685 - 970                    2 * NESR_T + 5 % [true source radiance]      OK
     1570 - 2410                  2 * NESR_T + 1 % [true source radiance]      OK
     Dynamic range (BB source)    (0 - 230) K                                  OK

     Radiometric performance, critical aspects (the acceptance criterion is written out below):
     • ice contamination on the focal-plane optics
     • non-linearity of detectors in bands A, AB and B -> strongest impact in band A!
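
A minimal written-out form of the two in-band requirements, read as a bound on the absolute radiometric error; the symbols L_meas, L_true, NESR_T and the band-dependent fraction p are assumed names, and the inequality is an interpretation of the table above rather than a formula quoted in the presentation:

```latex
% Radiometric accuracy acceptance criterion (illustrative reading of the table above)
% L_meas(nu)  : measured spectral radiance at wavenumber nu
% L_true(nu)  : reference ("true") source radiance
% NESR_T(nu)  : noise-equivalent spectral radiance (target)
\[
  \bigl| L_{\mathrm{meas}}(\nu) - L_{\mathrm{true}}(\nu) \bigr|
  \;\le\; 2\,\mathrm{NESR}_T(\nu) + p\, L_{\mathrm{true}}(\nu),
  \qquad
  p =
  \begin{cases}
    0.05, & 685 \le \nu \le 970~\mathrm{cm}^{-1},\\
    0.01, & 1570 \le \nu \le 2410~\mathrm{cm}^{-1}.
  \end{cases}
\]
```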

  4. Level 2 algorithm verification: critical aspects
     • variabilities in target gas signatures (latitudinal and day/night dependencies, perturbed chemistry, ...)
     • knowledge of systematic error sources (modelling of instrumental errors, interfering species, spectroscopic errors, non-LTE, ...)
     • code robustness w.r.t. instrumental effects and extreme atmospheric conditions

  5. Results of orbit #504 retrievals: all species [figure: VMR (ppmv) vs. altitude (km) and orbital coordinate (deg) for water vapour, ozone, N2O, CH4, HNO3 and NO2]

  6. Error-source legend: GAIN - gain calibration error; GRA - horizontal temperature gradient; ILS - ILS shape error; Tem - temperature error; Hitran - spectroscopy error; NLTE - non-LTE effects [extracted from presentation by A. Dudhia / U. of Oxford]

  7. MIPAS-B2 vs. MIPAS-Envisat comparisons [figures: temperature and HNO3]. By H. Oelhaf / FZ-IMK

  8. Ground-based comparisons: NDSC O3 sondes & MW radiometer at Payerne (averages, after 13 Nov); NDSC lidar at OHP and Lauder (NZ). By J.C. Lambert, V. Soebijanta, BIRA-IASB

  9. MIPAS temperatures. By A. Dethof / ECMWF
     • Good agreement of MIPAS temperatures with ECMWF analysed temperatures over a large part of the stratosphere (diff < 2%)
     • Largest differences at 0.1 hPa (ECMWF model top)
     • MIPAS too cold at the bottom end of profiles, especially in the tropics (cloud contamination?); improvement after the upgrade on 13.11.2002
     • Very robust: the same features are seen every week (a sketch of the departure computation follows below)
     [figures: global averages, 11.-17.11.2002; departures MIPAS - ECMWF; improvement after 13.11.02]
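
A minimal sketch of how the weekly, global-mean departure statistics behind these panels could be formed, assuming co-located MIPAS and ECMWF temperature profiles already interpolated to a common pressure grid; the function name, array shapes and units are illustrative assumptions, not interfaces of the ECMWF system:

```python
import numpy as np

def temperature_departures(t_mipas, t_ecmwf, pressure):
    """Per-level MIPAS - ECMWF departure statistics over a set of co-locations.

    t_mipas, t_ecmwf : arrays of shape (n_profiles, n_levels), temperature in K,
                       interpolated to the common pressure grid `pressure` [hPa].
    """
    diff = t_mipas - t_ecmwf                      # absolute departure [K]
    rel = 100.0 * diff / t_ecmwf                  # relative difference [%]
    return {
        "pressure_hPa": pressure,
        "mean_departure_K": np.nanmean(diff, axis=0),
        "std_departure_K": np.nanstd(diff, axis=0),
        "mean_relative_pct": np.nanmean(rel, axis=0),
    }
```

Averaged over one week of co-locations, the mean relative difference per level is the kind of number behind the "diff < 2%" statement on the slide.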

  10. MIPAS ozone. By A. Dethof / ECMWF
     • Reasonable agreement with ECMWF ozone over a large part of the stratosphere
     • Some differences might be explained by known ECMWF model biases, e.g. the tropical O3 maximum is lower in ECMWF than in MIPAS, and at 90-65°N ECMWF > MIPAS over a large part of the stratosphere
     • Unrealistically large MIPAS ozone values in the lower stratosphere (cloud contamination?); improvement after the upgrade on 13.11.2002
     [figures: ECMWF vs. MIPAS, 4.-10.11.02 and 25.11.-1.12.02; improvement after 13.11.02]

  11. Cloud top height distribution in MIPAS measurements (~4100 profiles, 7 - 25 Sep 2002). By J. Remedios / U. of Leicester
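
The cloud tops in this statistic also motivate the cloud-detection update listed in the conclusions; below is a minimal sketch of a colour-ratio "cloud index" test of the kind commonly applied to MIPAS band A limb spectra. The microwindow boundaries, the threshold value of 1.8 and all function and variable names are illustrative assumptions, not values taken from this presentation:

```python
import numpy as np

def cloud_index(wavenumber, radiance,
                mw_co2=(788.2, 796.25),    # assumed CO2-dominated microwindow [cm^-1]
                mw_cloud=(832.3, 834.4)):  # assumed cloud/aerosol-sensitive microwindow [cm^-1]
    """Colour-ratio cloud index for one limb spectrum; larger values = clearer air."""
    in_co2 = (wavenumber >= mw_co2[0]) & (wavenumber <= mw_co2[1])
    in_cloud = (wavenumber >= mw_cloud[0]) & (wavenumber <= mw_cloud[1])
    return np.mean(radiance[in_co2]) / np.mean(radiance[in_cloud])

def cloud_top_height(tangent_heights, spectra, wavenumber, threshold=1.8):
    """Scan a limb sequence from the top down; return the first tangent height flagged cloudy."""
    ordered = sorted(zip(tangent_heights, spectra), key=lambda p: p[0], reverse=True)
    for height, radiance in ordered:
        if cloud_index(wavenumber, radiance) < threshold:
            return height       # cloud-affected sweep -> approximate cloud top height
    return None                 # no cloud detected in this profile
```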

  12. MIPAS L2: Geophysical validation

  13. MIPAS L2: Geophysical validation / summary
     • temperature: good agreement with correlative measurements & analyses
     • O3: generally good agreement, bias +1...2 ppmv (?) -> spectroscopic database
     • H2O: too high > 55 km -> non-LTE (?); too low < 20 km -> cloud contamination, MW choice, convergence thresholds
     • HNO3: negative bias -> updated spectroscopic data (mipas_hitran v2 -> v3) will yield ~10 % higher mixing ratios
     • CH4, N2O, NO2: tendency to vertical oscillations -> temperature error propagation ('F' - 'R' difference in detector NL correction / band A)
     • ACVT-MASI: "the MIPAS data set (including also HNO3, CH4, NO2) is the only data set that is self-consistent and can be included in existing assimilation systems"

  14. Retrieval altitude range [figure]

  15. Lessons learned / critical areas
     • potential for reduced total errors through extended MW selection (pT, H2O, ...)
     • perturbations in non-regularised profile retrievals due to oscillations in 'fw' - 'rev' sweep radiances (re-check after the enhanced NL correction scheme is in place; a diagnostic sketch follows below)
     • extension of the retrieval height range towards higher and lower limb heights will also improve profile accuracy within the nominal height interval
     • inaccuracies in spectroscopic line data (incl. error correlations) -> also essential for geophysical validation (HNO3, ...)
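
A minimal diagnostic sketch for the forward/reverse sweep imbalance referred to above, assuming calibrated limb radiances with a per-sweep 'F'/'R' direction flag; the array names and shapes are illustrative, not the MIPAS processor's interfaces:

```python
import numpy as np

def fwd_rev_imbalance(radiances, sweep_dir):
    """Mean forward-minus-reverse radiance difference per spectral channel.

    radiances : array (n_sweeps, n_channels) of calibrated limb radiances.
    sweep_dir : array (n_sweeps,) of 'F'/'R' interferometer sweep-direction flags.
    A systematically non-zero result points at residual differences in the
    detector non-linearity correction between the two sweep directions, which
    feed the oscillations seen in non-regularised retrievals.
    """
    sweep_dir = np.asarray(sweep_dir)
    fwd = radiances[sweep_dir == "F"].mean(axis=0)
    rev = radiances[sweep_dir == "R"].mean(axis=0)
    return fwd - rev
```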

  16. Baseline modifications: L2 algorithm

  17. MIPAS error budgets
     • Re-assess total budgets, taking into account:
       - mean profiles and variabilities of contaminants
       - impact of the assumed profile shape above the highest tangent altitude
       - impact of convergence thresholds
     • Reporting: ESD & temperature error propagation -> Level 2 products
     • Systematic components (HITRAN, NLTE, gain, ...) -> 'off-line' information; could be provided as TN & coded data sets (as done for the MIPAS averaging kernels); see the illustrative combination below
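
One way to read "total budget" is a quadrature sum of the reported random part (ESD), the propagated temperature error and the systematic components listed above; a minimal sketch, assuming independent error terms (the grouping is illustrative, not the processor's definition):

```latex
% Illustrative total error budget for a retrieved quantity x, assuming independent terms:
% sigma_ESD    : reported estimated standard deviation (noise/random part)
% sigma_T      : propagated temperature error
% sigma_sys,k  : systematic components (spectroscopy/HITRAN, non-LTE, gain calibration, ...)
\[
  \sigma_{\mathrm{tot}}(x) \;=\;
  \sqrt{\,\sigma_{\mathrm{ESD}}^{2} + \sigma_{T}^{2}
        + \sum_{k} \sigma_{\mathrm{sys},k}^{2}\,}
\]
```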

  18. Conclusions
     • MIPAS in-flight calibration & characterisation tasks completed (some activities & documentation under finalisation)
     • geophysical validation: first intercomparison results available (ground-based and balloon sensors, assimilation studies, ...)
     • instrument in excellent health; consolidated L0 -> L2 processing chain
     • stable algorithm baseline (May '02 ->), only minor changes in aux data
     • important update on 13 Nov (LOS pointing correction, L2 tuning parameters)
     • algorithm update in progress (pT/H2O loop, cloud detection)

  19. Recommendations
     • Level 1B: release OK
     • Level 2: release after update of the IPF (incl. cloud detection, pT/H2O iteration loop, mipas_hitran v3.0)
     • Essential: supply MIPAS data users with
       - total error budgets
       - averaging kernels
       - the spectroscopic database
       - reports on geophysical validation campaigns

  20. The MIPAS Calibration & Algorithm Verification Team
     • P. Mosner, R. Gessner - Astrium/D [instrument engineering / operations]
     • G. Perron, G. Aubertin - ABB BOMEM [ESL / Level 1B]
     • Th. Fiksel - DJO [L1B & L2 s/w engineering]
     • S. Bartha - Astrium/D [L2 s/w engineering]
     • B. Carli, P. Raspollini - IFAC-CNR [ESL / Level 2]
     • M. Carlotti, M. Ridolfi - U. of Bologna [ESL / Level 2]
     • B. Dinelli - ISAC-CNR [ESL / Level 2]
     • A. Dudhia, C.D. Rodgers - Univ. of Oxford [ESL / Level 2; project AO #323]
     • J.M. Flaud - LPM/Paris [ESL / Level 2]
     • M. Hoepfner, H. Oelhaf - FZ-IMK [ESL / Level 2]
     • T. v. Clarmann - FZ-IMK [AO #145 & AMIL2DA / L1B & L2 analysis]
     • M. Lopez-Puertas - IAA [project AO #304 / L2 analysis]
     • M. Birk, G. Wagner - DLR-IMF [project AO #652 / L1B analysis]
     • J.J. Remedios, R. Spang - Univ. of Leicester [project AO #357 / L1B & L2 analysis]
     • G. Schwarz - DLR-IMF [ESL / Level 2]
     • ESTEC: J.C. Debruyn, A. Burgess (now U. Oxford), J. Langen, M. Sanchez, H. Nett
     • ESOC: A. O'Connell (now EUMETSAT), D. Patterson, F. Diekmann, A. Rudolph
