
Evaluation of cloudiness simulated by climate models using CALIPSO and PARASOL

This presentation assesses cloud representation in climate models by evaluating cloud cover, vertical distribution, optical depth, and cloud-type relationships, using consistent comparisons of observational and simulated datasets.


Presentation Transcript


  1. Evaluation of cloudiness simulated by climate models using CALIPSO and PARASOL. H. Chepfer, Laboratoire de Météorologie Dynamique / Institut Pierre Simon Laplace, France. Contributors: S. Bony, G. Cesana, J-L. Dufresne, D. Konsta (LMD/IPSL); D. Winker (NASA/LaRC); D. Tanré (LOA); C. Nam (MPI); Y. Zhang, S. Klein (LLNL); J. Cole (U. Toronto); A. Bodas-Salcedo (UKMO).

  2. Clouds: the main source of uncertainty in climate change projections. Reducing this uncertainty requires a thorough evaluation of the cloud description in climate models.

  3. Evaluation of clouds in climate models
• Generally based on monthly mean TOA fluxes (ERBE, ScaRaB, CERES) and ISCCP (e.g. Zhang et al. 2005, Webb et al. 2001, Bony et al. 2004, …).
• Problem: a good reproduction of monthly mean TOA fluxes can be due to error compensations. The same TOA fluxes can result from compensating errors in (a) the cloud vertical distribution, (b) cloud optical thickness vs. total cloud cover (vertically integrated values within a lat × lon grid box), and (c) instantaneous vs. monthly time averaging. These errors impact the cloud feedbacks predicted by climate models.
• Need to unravel error compensations using: (1) independent, simultaneous observations of cloud cover, vertical structure and cloud optical depth; (2) instantaneous and monthly mean observations; (3) methods for consistent model/obs comparisons.
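The compensation between cloud cover and optical thickness can be illustrated with a toy calculation. The tau-to-albedo relation below (tau/(tau + 7.7)) is an illustrative approximation, not the radiation scheme of any of the models discussed here; it only shows how two very different cloud states can yield nearly identical mean TOA albedos.

```python
def cloud_albedo(tau, gamma=7.7):
    """Crude tau-to-albedo relation; gamma = 7.7 is an illustrative
    tuning constant, not any model's actual radiation scheme."""
    return tau / (tau + gamma)

def scene_albedo(cover, tau):
    """Grid-box albedo as cloud cover times cloud albedo (toy model)."""
    return cover * cloud_albedo(tau)

# Two very different cloud states:
a_thin = scene_albedo(0.8, 5.0)     # large cover, optically thin clouds
a_thick = scene_albedo(0.5, 13.11)  # small cover, optically thick clouds
assert abs(a_thin - a_thick) < 1e-3  # nearly identical TOA albedo
```

This is why a model can match monthly mean TOA fluxes while getting both cover and optical depth wrong, and why independent, simultaneous observations of each quantity are needed.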

  4. Method of comparison between observations and models
• Observational datasets: CALIPSO-GOCCP (see poster by G. Cesana) and PARASOL reflectances. Specific data processing starting from A-train Level 1, with a spatio-temporal resolution and detection threshold consistent with the simulator (Chepfer, Bony et al., submitted).
• Observation simulators: CFMIP-OSP / CALIPSO / PARASOL: subgridding, overlap, detection, statistics, etc. (Chepfer, Bony et al. 2008).
• Simulated datasets: CALIPSO-like and PARASOL-like.
Comparing the simulated datasets with these dedicated observations (consistent) rather than with standard retrievals (not consistent) ensures that model/obs differences are due to model deficiencies.

  5. A preliminary inter-comparison to evaluate cloudiness in climate models using four diagnostics:
(d1) cloud cover; (d2) cloud vertical distribution; (d3) cloud optical thickness; (d1)+(d3) cloud cover / optical thickness relationship; (d4) cloud "type".
Participating models: LMDZ4, IPSLnew, CAM3, CCCMA, ECHAM-5, ECMWF, NICAM, UKMO.
Model experiment: year 2007, forced SST; run the CALIPSO/PARASOL simulator; outputs as monthly means and daily; diagnostics d1 to d4.

  6. (d1) Cloud cover obs. CALIPSO-GOCCP, the GCM-Oriented CALIPSO Cloud Product (built from CALIOP Level 1), is an observational dataset fully consistent with the COSP/CALIPSO outputs. Cloud detection: horizontal resolution = 330 m. CALIPSO-GOCCP is compared with other cloud climatologies (see poster by G. Cesana; Chepfer et al., 2009, JGR, submitted).

  7. (d1) Cloud cover model evaluation: CALIPSO-GOCCP obs compared with ECHAM5, CCCMA, LMDZ4, LMDZnew and CAM3.5 (each run through the simulator).

  8. (d2) Cloud vertical distribution: CALIPSO-GOCCP obs compared with ECHAM5, CAM3.5, CCCMA and LMDZ4 (each run through the simulator). Models overestimate high clouds; they underestimate tropical low clouds, congestus, and mid-level mid-latitude clouds.

  9. (d2) High clouds model evaluation: CALIPSO-GOCCP obs compared with ECHAM5, CCCMA, LMDZ4, LMDZnew and CAM (each run through the simulator).

  10. (d2) Mid-level clouds model evaluation: CALIPSO-GOCCP obs compared with ECHAM5, CCCMA, LMDZ4, LMDZnew and CAM (each run through the simulator).

  11. (d2) Low clouds model evaluation: CALIPSO-GOCCP obs compared with ECHAM5, CCCMA, LMDZ4, LMDZnew and CAM3.5 (each run through the simulator).

  12. (d3) Cloud optical depth: method of model/obs comparison
Direct comparison of retrieved and modeled optical depths is not consistent:
• Model tau: 2° × 2° grid; iwc, lwc = f(day, lat, lon); hypotheses on PFice′ and PFliq′; result: tau-cloud-mod = f(day, lat, lon).
• Satellite tau retrievals: 1 to 10 km; Refl_obs = f(day, lat, lon), plus cloud detection, sunglint correction, and hypotheses on PFice, PFliq, …; result: tau-cloud-obs = f(day, lat, lon).
Another, consistent approach: use the reflectance as a proxy for tau.
• Observations: PARASOL reflectance (865 nm) restricted to a fixed "no-glint" viewing direction (thetaV = 30°, phiV = 320°).
• Model: diagnose the reflectance as a model output: sub-gridding of iwc and lwc (Klein and Jakob, 1999); the A-train orbit (lat, lon, day) gives thetaS; direct radiative transfer computation with PFice_mod and PFliq_mod.
• Build maps of reflectance: Refl_mod(thetaS, tau) and Refl_obs(thetaS, tau).
This ensures that model/obs differences are due to the model tau.
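The reflectance-as-proxy idea can be sketched as follows. The function below is a hypothetical stand-in for the Reflectance_1dir = f(thetaS, tau) relation, which the actual simulator parameterizes from exact doubling-adding radiative transfer computations; the observed reflectance values are likewise illustrative.

```python
import numpy as np

def parasol_like_reflectance(theta_s_deg, tau):
    """Hypothetical stand-in for Refl(thetaS, tau); the real simulator
    uses a relation fitted to doubling-adding computations."""
    mu0 = np.cos(np.radians(theta_s_deg))
    return mu0 * tau / (tau + 5.0)   # 5.0 is an illustrative constant

# Model side: diagnose a one-direction reflectance from the model tau
tau_model = np.array([2.0, 8.0, 30.0])
refl_model = parasol_like_reflectance(30.0, tau_model)

# Obs side: PARASOL Level-1 reflectances in the fixed no-glint direction
# are compared directly, so no tau retrieval (and no particle-shape
# hypothesis) is needed on the observation side; values are invented here.
refl_obs = np.array([0.25, 0.48, 0.71])
bias = refl_model - refl_obs
```

The key design choice is that both sides of the comparison are reflectances in the same viewing geometry, so retrieval hypotheses never enter the observational dataset.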

  13. (d3) Cloud optical thickness model evaluation: PARASOL reflectance obs compared with LMDZ4, ECHAM5, CCCMA and LMDZ-new-physics (each run through the simulator).

  14. (d1+d3) Cloud cover / cloud optical thickness relationship, monthly means. Scatter plots of reflectance vs. cloud fraction for the observations and for LMDZ4, LMDZ new physics, ECHAM-5 and CCCMA (each run through the simulator). Models roughly reproduce the relationship between the reflectance and the cloud fraction in the monthly mean.

  15. (d1+d3) Cloud cover / cloud optical thickness relationship, instantaneous. Scatter plots of instantaneous reflectance vs. cloud fraction for the observations (instantaneous and monthly) and for LMDZ4, LMDZ new physics and CCCMA (each run through the simulator). Models have difficulty reproducing the "instantaneous" relationship between tau and the cloud fraction, which is the one needed for cloud feedbacks.

  16. Focus on the Tropics: (d1) cloud cover and (d3) optical thickness in dynamical regimes. Composites of (d1) cloud cover and (d3) reflectance as a function of the vertical velocity at 500 hPa, w500 (hPa/day). Error compensations appear between cloud cover and optical depth.
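The regime-sorting used here can be sketched as follows, with synthetic data standing in for the model or observed fields (the w500 distribution and the cover-w500 slope are invented for illustration):

```python
import numpy as np

# Synthetic stand-ins for gridded monthly fields over tropical oceans
rng = np.random.default_rng(0)
w500 = rng.normal(0.0, 40.0, 5000)   # hPa/day; <0 ascent, >0 subsidence
cover = np.clip(0.5 - 0.003 * w500 + rng.normal(0.0, 0.05, 5000), 0.0, 1.0)

# Sort grid points into w500 regime bins and average cloud cover per bin
bins = np.arange(-100.0, 101.0, 10.0)            # regime bin edges (hPa/day)
idx = np.digitize(w500, bins)
composite = np.array([cover[idx == i].mean()
                      for i in range(1, len(bins)) if np.any(idx == i)])
```

The same binning applied to cloud cover and to reflectance exposes compensating errors: a model can be biased in opposite directions in the two composites within the same regime.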

  17. (d4) Cloud types: focus on the Tropics. "CFADs" (joint histograms of lidar scattering ratio SR, 5 to 80, vs. pressure) over the tropical warm pool and for Hawaii trade cumulus: CALIPSO-GOCCP obs compared with LMDZ4 and LMDZ New Physics (each run through the simulator).
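A CFAD is just a per-level normalized histogram of lidar scattering ratio; a minimal sketch with synthetic profiles (the SR bin edges loosely follow the diagnostics above, and the gamma-distributed SR values are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_prof, n_lev = 500, 19
sr = rng.gamma(2.0, 3.0, size=(n_prof, n_lev))   # synthetic scattering ratios
sr_bins = np.array([0.0, 1.2, 3.0, 5.0, 10.0, 20.0, 40.0, 80.0])  # SR edges

# CFAD: at each vertical level, histogram SR over all profiles and
# normalize by the number of profiles (values above 80 fall outside)
cfad = np.empty((n_lev, len(sr_bins) - 1))
for k in range(n_lev):
    counts, _ = np.histogram(sr[:, k], bins=sr_bins)
    cfad[k] = counts / n_prof
```

Plotted as a 2-D field (SR on the x-axis, pressure on the y-axis), such a histogram reveals the distinct low-cloud modes and congestus signatures discussed in the conclusions.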

  18. Conclusions. CALIPSO and PARASOL observations can help identify GCM error compensations: (1) between vertically integrated cloud cover and optical thickness; (2) between time scales (instantaneous vs. monthly mean); (3) in the cloud vertical distribution. All the models overestimate high cloud amount, underestimate total cloud amount, and underestimate tropical low-level oceanic cloud amount in subsidence regions. All models exhibit error compensations. None of the models reproduces the cloud "types" characterized by the lidar intensity, e.g. the two modes of low-level clouds and the congestus clouds. Physical interpretation of the model/obs differences and of the inter-model differences is just starting now.

  19. CALIPSO and PARASOL simulators are included in COSP.
Simulators (http://www.cfmip.net): the "CFMIP Observation Simulator Package" for ISCCP, CloudSat, CALIPSO/PARASOL and MISR (UKMO, LLNL, LMD/IPSL, CSU, UW). Reference: Chepfer H., S. Bony, D. Winker, M. Chiriaco, J-L. Dufresne, G. Sèze, 2008: Use of CALIPSO lidar observations to evaluate the cloudiness simulated by a climate model, Geophys. Res. Lett., 35, L15704, doi:10.1029/2008GL034207.
Observations (http://climserv.ipsl.polytechnique.fr/cfmip-obs.html): CALIPSO-GOCCP, PARASOL-REFL, CLOUDSAT-CFAD, CERES-EBAF, … (LMD/IPSL, UW, LOA, NASA/LaRC, …). CALIPSO-GOCCP, the "GCM Oriented CALIPSO Cloud Product": Chepfer H., S. Bony, D. Winker, G. Cesana, J-L. Dufresne, P. Minnis, C. J. Stubenrauch, S. Zeng, 2009: The GCM Oriented CALIPSO Cloud Product (CALIPSO-GOCCP), J. Geophys. Res., under revision.
This preliminary pilot inter-comparison will be extended to other climate models: the CFMIP-2 experiment (comparison with actual observations) and the WGCM/CMIP-5 experiment (Taylor et al. 2009; inter-model comparisons via simulators, CO2 doubling, long-term runs). Today, about 20 models use COSP (the CFMIP Observation Simulator Package).

  20. Some other studies using the lidar simulator:
• Dust: Chimère-dust (Africa, IPSL/LMD) vs. CALIPSO (Vuolo et al. 2009)
• Clouds: WRF/MM5 (Mediterranean, IPSL/SA) vs. SIRTA (Chiriaco et al. 2006)
• Clouds: MM5 vs. CALIPSO (Chepfer et al. 2007)
• Clouds: Méso-NH (Pinty, LA) vs. CALIPSO (Chaboureau et al., submitted)
• Pollution: Chimère (Europe, IPSL/LMD) vs. SIRTA (Hodzic et al. 2004)
• Clouds: LMDZ vs. CALIPSO (Chepfer et al. 2008 and 2009)
Perspectives:
• A generic simulator: ground-based, satellite, clouds, aerosols, regional
• Dedicated observations: GOCCP, CFMIP-OBS, … (G. Cesana)
• SIRTA, A-train + EarthCare: towards a quantitative, systematic and statistical evaluation of clouds and aerosols in global and regional models.

  21. End

  22. (d3) Cloud optical thickness obs. Background: tau_cloud retrieved from satellite observations is uncertain because of hypotheses on the particle size, sunglint in the tropics, and a backscattering direction that is highly sensitive to the shape of the particles. One PARASOL image (Level 1): the PARASOL reflectance in a "well suited" direction is a proxy for the cloud optical depth (JFM). This specific viewing direction avoids the sunglint, nadir and backscattering directions, and avoids uncertainties due to the hypotheses required in retrieval algorithms.

  23. Focus: Tropics. (d2) Cloud vertical distribution composited on w500 (hPa/day): CALIPSO-GOCCP, LMDZ4 + Sim. and LMDZ New Physics + Sim. Cloud fraction, seasonal mean.

  24. PARASOL simulator: observations vs. model
Model inputs: LMDZ GCM outputs on 19 vertical levels, 2.5° × 3.5° lat-lon: LWC(P, lat, lon), IWC(P, lat, lon), Re(P, lat, lon).
Observations: PARASOL Level 1: 6 × 6 km² directional reflectances, 14 viewing directions, 4 or 5 visible / near-IR wavelengths.
Model side:
• SCOPS (Subgrid Cloud Overlap Profile Sampler): cloud condensate / cloud fraction / cloud overlap assumption, giving 19 vertical levels at a lat/lon scale of a few km or m.
• Imposed by the model parameterizations: particle shape hypothesis (spheres for LMDZ), shape of the size distribution, parameterized optical properties P(p, z), Cdiff(z), Cabs(z) for liquid and ice.
• Simulated one-direction radiance: radiative transfer computation parameterized from an exact (doubling-adding) computation; parameterization of thetaS = f(lat, lon, date) for the PARASOL orbit; relation Reflectance_1dir = f(thetaS, tau). Result: Refl_GCM (6 km).
Observation side:
• Extraction of the reflectance at a single wavelength (865 nm, mainly sensitive to clouds) and in a single direction (mainly sensitive to tau_cloud, outside the glitter, backscattering and nadir directions). Result: Refl_obs (6 km).
Same cloud diagnostics for model and obs at the same resolution; obs and model diagnostics are then averaged spatially over 2.5° × 3.5° and temporally (month, season).
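The SCOPS step above distributes a grid-box cloud-fraction profile into binary subcolumns. A simplified sketch of the rank-based maximum-random overlap construction (a common SCOPS option, not the exact COSP code) might look like:

```python
import numpy as np

def scops_max_random(cf, n_sub, rng):
    """Generate n_sub binary subcolumns from a profile of layer cloud
    fractions cf (ordered top to bottom) under a maximum-random overlap
    assumption, using a rank-based construction."""
    n_lev = len(cf)
    sub = np.zeros((n_sub, n_lev), dtype=bool)
    x = rng.random(n_sub)                      # each subcolumn's rank
    for k in range(n_lev):
        if k > 0:
            # maximum overlap where the layer above was cloudy;
            # draw a fresh rank (scaled into the clear part) where clear
            clear_above = x >= cf[k - 1]
            x = np.where(clear_above,
                         cf[k - 1] + rng.random(n_sub) * (1.0 - cf[k - 1]),
                         x)
        sub[:, k] = x < cf[k]
    return sub

rng = np.random.default_rng(1)
cf = np.array([0.3, 0.3, 0.6])                 # layer cloud fractions
sub = scops_max_random(cf, 20000, rng)
frac = sub.mean(axis=0)   # subcolumn-mean cloudiness should recover cf
```

Adjacent layers with identical cloud fraction end up exactly maximally overlapped, while cloud in a layer below a clear layer is placed randomly, which is the intended overlap behavior.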

  25. Lidar simulator: observations vs. model
Model inputs: LMDZ GCM outputs on 19 vertical levels, 2.5° × 3.5° lat-lon: LWC(P, lat, lon), IWC(P, lat, lon), Re(P, lat, lon).
Observations: CALIOP Level 1: 583 vertical levels, 330 m along track: ATB(z, lat, lon), molecular density (33 levels, lat, lon).
Model side:
• SCOPS (Subgrid Cloud Overlap Profile Sampler): cloud condensate / cloud fraction / cloud overlap assumption, giving 19 vertical levels at a lat/lon scale of a few km or m.
• Imposed by the model parameterizations: particle shape hypothesis (spheres for LMDZ), shape of the size distribution, parameterized optical properties P(p, z), Cdiff(z), Cabs(z) for liquid and ice.
• Simulated lidar profile from the lidar equation, with a constant multiple scattering coefficient and the molecular signal computed from the model (P, T). Result: SR_GCM (19 levels, 330 m).
Observation side:
• Compute ATBmol: scale the molecular density to the ATB in cloud/aerosol-free regions (22-26 km) and average over 10 km (SNR).
• Convert altitude to pressure (GMAO): ATB (583 levels, lat, lon), ATBmol (583 levels, lat, lon).
• Average over the 19 model vertical levels (strong increase of the SNR). Result: SR_obs (19 levels, 330 m).
Same cloud diagnostics for model and obs at the same resolution: cloudy (SR > 3), clear (SR < 1.2), fully attenuated (SR < 0.01), unclassified (1.2 < SR < 3); spatial homogeneity index, cloud top height, phase index, etc. Obs and model diagnostics are then averaged spatially over 2.5° × 3.5° and temporally (month, season).
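The SR thresholds above translate directly into code; a minimal sketch applying the GOCCP-style classes to a vector of scattering ratios:

```python
import numpy as np

def classify_sr(sr):
    """Classify lidar scattering ratios following the thresholds above:
    cloudy (SR > 3), clear (SR < 1.2), fully attenuated (SR < 0.01),
    unclassified otherwise."""
    out = np.full(sr.shape, "unclassified", dtype=object)
    out[sr < 1.2] = "clear"
    out[sr < 0.01] = "fully_attenuated"   # overrides "clear" where SR < 0.01
    out[sr > 3.0] = "cloudy"
    return out

sr = np.array([0.005, 0.8, 2.0, 6.5])
labels = classify_sr(sr)
# → ['fully_attenuated', 'clear', 'unclassified', 'cloudy']
```

Applying the same classification to SR_GCM and SR_obs on the same 19-level grid is what makes the cloud-fraction diagnostics of the two sides directly comparable.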

  26. Comparisons CALIOP / LMDZ GCM: zonal mean cloud fraction (high, mid, low) for the GCM, the GCM + simulator, and CALIOP/GOCCP (Chepfer et al. 2008).

  27. Why is it "complicated" to interface satellite data and GCMs for clouds?
A cloud in the GCM world: looks like a cube or a sum of cubes; fills a grid box (about 2.5°) or a subgrid cell (0/1); obeys an overlap assumption (random, maximum, partial); clouds and aerosols are distinct; strict microphysical approximations, but each GCM has its own (!); etc.
A cloud in the satellite world: looks like whatever you want; can cover thousands of km² or less than 100 m; infinite overlap configurations; clouds and aerosols are difficult to distinguish; strict microphysical approximations (but different from the GCM's); clouds are sometimes difficult to detect.
Moreover: uncertainty on the measurements, signal-to-noise ratio, multiple scattering (or the RT equation), "tilt" vs. "no tilt".

  28. "State of the art": evaluating clouds in GCMs from satellite observations. How are models evaluated? Models are evaluated mainly from TOA fluxes (and ISCCP): basic, but fundamental, because this is the global energy budget of the ocean-atmosphere system. [Figure: cloud radiative forcing (W/m²) in climate models vs. ERBE observations; Zhang et al. 2005.] Consequences: all the models reproduce the fluxes correctly; other quantities can be evaluated provided the fluxes are not degraded, otherwise the ocean-atmosphere-biosphere-cryosphere system explodes/implodes (in the model)!

  29. Clouds and climate feedback (Dufresne and Bony, J. Clim., 2008). Reducing this uncertainty requires a thorough evaluation of the cloud description in climate models.
