  1. Validation of Coupled Models
  Richard M. Hodur, Naval Research Laboratory, Monterey, CA 93943-5502, hodur@nrlmry.navy.mil
  Short Course on Significance Testing, Model Evaluation, and Alternatives, 11 January 2004, Seattle, WA
  Outline: • Introduction • Atmospheric Models • Ocean Models • Concluding Remarks

  2. Validation of Coupled Models Acknowledgements • Dr. James D. Doyle (NRL MRY) • Dr. Timothy F. Hogan (NRL MRY) • Dr. Xiaodong Hong (NRL MRY) • Dr. John C. Kindle (NRL SSC) • Dr. Paul May (CSC/NRL MRY) • Dr. Jason E. Nachamkin (NRL MRY) • Dr. Randy Pauley (FNMOC) • Dr. Ruth H. Preller (NRL SSC) • Dr. Julie D. Pullen (NRL MRY) • Dr. Robert C. Rhodes (NRL SSC) • Dr. Douglas L. Westphal (NRL MRY)

  3. Validation of Coupled Models: Context of Talk
  • Validation is more than a skill score
  • Validation implies a learning process:
    • Develop the system
    • Measure the skill of the system
    • Seek ways to improve the skill of the system
  • Validation is a critical component of model development
  • Without validation there can be no improvement in model performance

  4. How Do We Measure the Validity and Usefulness of Atmosphere and Ocean Models? A Combination of Many Measures
  • Scientific Basis:
    • Record of Publications, Presentations, Patents, . . .
    • Equations, Grid Structure, Numerical Techniques, and Representation of Physical Processes Based on Well-Tested, Peer-Reviewed Principles
  • Reproduction of Analytic/Idealized Test Cases:
    • Validate Numerical Schemes (e.g., Topographic Flow, PGF Computation)
    • Validate Physical Parameterizations (e.g., Wangara, Convection)
  • Measure Real-Time Predictive Performance:
    • No One Simple “Metric” is Available
    • Objective Measurements are Useful (e.g., RMS, Bias, Anomaly Correlation, Tropical Cyclone Forecast Position Error, Precipitation Scores)
    • May be Difficult to Measure on the Mesoscale; Perform Subjective Evaluation of Differing Episodic Events (e.g., Patterns, Trends, Drifters, Transport, Tracers)
    • Measure Skill Over Long Time Periods (Months or More)
    • Transitions Require Measuring the Skill of the New Version of the System Relative to a Benchmark Version
  • Measure Utility to User(s):
    • Does the Output Meet User Needs?
    • User Feedback
    • Robustness
    • Efficiency (e.g., Wall Time, Flops, . . . )

  5. Validation of Coupled Models: Outline
  • Atmospheric Models
    • Global: Anomaly Correlation • RMS, Bias Errors • Tropical Cyclone Track • Scorecard
    • Mesoscale: Idealized Flow Studies • RMS, Bias • Qualitative Verification (Case Studies) • Event-Based Verification
  • Ocean Models
    • Global: Features/Positions • Sea Surface Height Validation • Anomaly Correlation • Sea Surface Temperature Validation
    • Mesoscale: Transport • Sea Level Height • T/S Profiles • Coastal Issues • Coupling Issues

  6. Validation of Coupled Models: Outline
  • Atmospheric Models
    • Global: Anomaly Correlation • RMS, Bias Errors • Tropical Cyclone Track • Scorecard
    • Mesoscale: Idealized Flow Studies • RMS, Bias • Qualitative Verification (Case Studies) • Event-Based Verification
  • Ocean Models
    • Global: Features/Positions • Sea Surface Height Validation • Anomaly Correlation • Sea Surface Temperature Validation
    • Mesoscale: Transport • Sea Level Height • T/S Profiles • Coastal Issues • Coupling Issues

  7. NOGAPS: Navy Operational Global Atmospheric Prediction System
  NOGAPS Annual Mean Forecast Statistics: Anomaly Correlation* of 500 mb Heights (values greater than 0.6 are considered skillful)
  [Figure: Anomaly correlation by year for 24-, 48-, 72-, 96-, and 120-hour forecasts; Northern Hemisphere 1988-2003 and Southern Hemisphere 1995-2003]
  *Key: f = forecast, c = climatology, a = analysis
  Results indicate that forecast skill is improving at a rate of about one day per decade.
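The key above defines the quantities in the anomaly correlation: with forecast f, climatology c, and verifying analysis a, the score is the (centered) correlation between the forecast anomaly f − c and the analysis anomaly a − c. A minimal sketch of this standard formula; the array names and synthetic fields are illustrative assumptions:

```python
import numpy as np

def anomaly_correlation(f, a, c):
    """Centered anomaly correlation: correlation of the forecast anomaly
    (f - c) with the verifying analysis anomaly (a - c)."""
    fa, aa = (f - c).ravel(), (a - c).ravel()
    fa, aa = fa - fa.mean(), aa - aa.mean()   # center the anomalies
    return np.dot(fa, aa) / np.sqrt(np.dot(fa, fa) * np.dot(aa, aa))

# Illustrative 500 mb height fields on a coarse lat/lon grid
rng = np.random.default_rng(0)
c = 5500.0 + rng.normal(0.0, 50.0, (73, 144))   # climatology (m)
a = c + rng.normal(0.0, 30.0, (73, 144))        # verifying analysis
f = a + rng.normal(0.0, 20.0, (73, 144))        # forecast with error
print(f"AC = {anomaly_correlation(f, a, c):.3f}")  # > 0.6 counts as skillful
```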

  8. NOGAPS: Navy Operational Global Atmospheric Prediction System
  RMS and Bias Errors
  [Figure: Temperature mean error and RMS error over Europe; wind speed error and vector error over CONUS]
  Errors can be stratified by latitude bands, hemisphere, or specified geographic area. The largest wind errors are typically found at jet level, where the winds are the strongest.
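Bias (mean error) and RMS error are the standard point-verification measures against matched analyses or observations; the vector wind error uses the magnitude of the (u, v) error vector. A short sketch; the synthetic data and names are illustrative:

```python
import numpy as np

def bias_and_rmse(forecast, observed):
    """Mean error (bias) and root-mean-square error of a forecast field
    against matched observations (e.g., temperature or wind speed)."""
    err = np.asarray(forecast) - np.asarray(observed)
    return err.mean(), np.sqrt((err ** 2).mean())

def vector_rmse(uf, vf, uo, vo):
    """RMS vector wind error: magnitude of the (u, v) error vector."""
    return np.sqrt(((uf - uo) ** 2 + (vf - vo) ** 2).mean())

# Illustrative use with synthetic 10 m winds
rng = np.random.default_rng(1)
uo, vo = rng.normal(5, 3, 1000), rng.normal(0, 3, 1000)
uf, vf = uo + rng.normal(0.5, 1.5, 1000), vo + rng.normal(0, 1.5, 1000)
bias, rmse = bias_and_rmse(np.hypot(uf, vf), np.hypot(uo, vo))
print(f"speed bias = {bias:.2f} m/s, speed RMSE = {rmse:.2f} m/s, "
      f"vector RMSE = {vector_rmse(uf, vf, uo, vo):.2f} m/s")
```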

  9. NOGAPS: Navy Operational Global Atmospheric Prediction System
  NOGAPS Operational 48 h TC Track Forecast Error, All Basins
  [Figure: Time series of the annual mean 48 hour tropical cyclone track error (nautical miles) with the number of forecasts per year; the DoD 48 hour goal is 100 nautical miles]
  In 2002 NOGAPS was the best performing global model for tropical cyclone prediction (JTWC). The improvements in skill were largely due to the transition of improvements to the cumulus convection scheme and the increase in resolution.
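Track error is the great-circle distance between the forecast position and the verifying best-track position, conventionally reported in nautical miles. A minimal haversine sketch; the positions are made up for illustration:

```python
import numpy as np

def track_error_nmi(lat_f, lon_f, lat_o, lon_o):
    """Great-circle distance (nautical miles) between a forecast TC
    position and the verifying best-track position (haversine formula)."""
    R_NMI = 3440.065  # mean Earth radius in nautical miles
    p1, p2 = np.radians(lat_f), np.radians(lat_o)
    dphi = np.radians(lat_o - lat_f)
    dlam = np.radians(lon_o - lon_f)
    h = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlam / 2) ** 2
    return 2 * R_NMI * np.arcsin(np.sqrt(h))

# Illustrative 48 h verification: forecast vs. best-track position
print(f"{track_error_nmi(24.5, -75.0, 25.3, -76.4):.0f} n mi")
```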

  10. NOGAPS Scorecard: Scoring System Used for Comparing a Benchmark Run with a New Version
  • Each element of the scorecard is measured over a period of at least three weeks and is required to meet a statistical significance test at the 95% level.
  • Each element is scored as a win, loss, or tie based on the 95% significance level and, in the case of RMS errors, a minimum 5% difference.
  • A net score of -1 (or higher) out of a possible +/-15 is considered a neutral (or better) overall result.
  • Implementation will not proceed if there is significant degradation in the TC tracks, even if the rest of the scorecard is positive.
  *Weighted double if no TC track verification is available
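The win/loss/tie logic above can be sketched with a paired significance test. The slide does not name the test, so the paired t-test below is an assumption; only the 95% level, the 5% RMS-difference margin, the 15 elements, and the -1 acceptance threshold come from the slide:

```python
import numpy as np
from scipy import stats

def score_element(rmse_new, rmse_bench, alpha=0.05, min_rel_diff=0.05):
    """Score one scorecard element (+1 win, -1 loss, 0 tie) from paired
    daily RMS errors of the new version and the benchmark.
    Assumed: a paired t-test on the daily differences."""
    rmse_new, rmse_bench = np.asarray(rmse_new), np.asarray(rmse_bench)
    _, p = stats.ttest_rel(rmse_new, rmse_bench)
    rel_diff = (rmse_bench.mean() - rmse_new.mean()) / rmse_bench.mean()
    if p < alpha and abs(rel_diff) >= min_rel_diff:
        return 1 if rel_diff > 0 else -1   # win if the new RMSE is lower
    return 0                                # tie: not significant or too small

# Net score over 15 elements; -1 or higher counts as neutral or better
rng = np.random.default_rng(2)
elements = [(rng.normal(2.0, 0.2, 30), rng.normal(2.2, 0.2, 30))
            for _ in range(15)]
net = sum(score_element(new, bench) for new, bench in elements)
print(f"net score: {net:+d} of +/-15 -> {'accept' if net >= -1 else 'reject'}")
```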

  11. Validation of Coupled Models: Outline
  • Atmospheric Models
    • Global: Anomaly Correlation • RMS, Bias Errors • Tropical Cyclone Track • Scorecard
    • Mesoscale: Idealized Flow Studies • RMS, Bias • Qualitative Verification (Case Studies) • Event-Based Verification
  • Ocean Models
    • Global: Features/Positions • Sea Surface Height Validation • Anomaly Correlation • Sea Surface Temperature Validation
    • Mesoscale: Transport • Sea Level Height • T/S Profiles • Coastal Issues • Coupling Issues

  12. Idealized Flow Studies
  COAMPS: Coupled Ocean/Atmosphere Mesoscale Prediction System; WRF: Weather Research and Forecast Model
  Mountain Wave Test Case: Linear Hydrostatic Gravity Waves
  T = 250 K, U = 20 m s-1, hm = 1 m, a = 10 km; dx = 2 km, dz = 250 m, 121 levels; Na/U = 5, Nhm/U = 1x10-4
  [Figure: Vertical velocity W (m/s x 10-3) at 8 h from the linear analytic solution, COAMPS, and WRF (EM)]
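The linear analytic solution referenced above is the classic steady, hydrostatic, constant-U, constant-N streamline displacement over a Witch of Agnesi ridge. A sketch of generating that reference field for comparison; the grid extents are arbitrary, and N = 0.01 s-1 is inferred from Na/U = 5 with a = 10 km and U = 20 m/s:

```python
import numpy as np

# Witch of Agnesi ridge h(x) = hm * a**2 / (x**2 + a**2), constant U and N
U, N = 20.0, 0.01            # m/s, s^-1 (N inferred from Na/U = 5)
hm, a = 1.0, 10.0e3          # mountain height (m) and half-width (m)
l = N / U                    # hydrostatic vertical wavenumber (1/m)

x = np.linspace(-100.0e3, 100.0e3, 401)
z = np.linspace(0.0, 30.0e3, 121)
X, Z = np.meshgrid(x, z)

# Steady linear hydrostatic (Boussinesq) streamline displacement
eta = hm * a * (a * np.cos(l * Z) - X * np.sin(l * Z)) / (X**2 + a**2)

# Vertical velocity w = U * d(eta)/dx, here by numerical differentiation
w = U * np.gradient(eta, x, axis=1)
print(f"max |w| = {1e3 * np.abs(w).max():.2f} x 10^-3 m/s")
```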

  13. COAMPS RMS, Bias Verification, Europe: 27 km Grid
  [Figure: COAMPS 24-h surface RMS and bias errors by year (November) for near-surface temperature (ºC), relative humidity (%), and wind (m s-1)]

  14. Subjective Evaluation of COAMPS (27 km) and NOGAPS (T159) Forecasts of Mistral
  27 Hour Forecasts of 10 meter Wind Valid at 0300 UTC 22 Aug 1998
  [Figure: COAMPS and NOGAPS 10 m wind speed (kts) over the full 27 km COAMPS grid, compared with SSM/I wind speeds at 0454 UTC 22 Aug 1998]

  15. Improvement of Aerosol Prediction Capability: Validation of NAAPS Using SeaWiFS and AERONET Data
  [Figure: SeaWiFS image of 30 October 2001; NAAPS vertical integral of extinction; AERONET, an observation network of sun photometers]

  16. Event-Based Verification: Why Verify Events?
  • The user wants a deterministic answer
  • The model produces a deterministic forecast
  • Unfortunately, the outcome is not deterministic!
  • Verification should communicate the nature of the variability

  17. Event-Based Verification: Composite Verification Method
  • Identify events of interest in the forecasts:
    • Rainfall greater than 25 mm
    • Wind greater than 10 m/s
    • Event contains between 50 and 500 grid points
  • Define a kernel and collect coordinated samples:
    • Square box located at the center of the event
    • 31x31 grid points (837x837 km for a 27 km grid)
  • Compare the forecast PDF to the observed PDF
  • Repeat the process for observed events (see the sketch below)
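A sketch of the composite method: threshold the forecast field, keep contiguous events covering 50-500 grid points, and extract the 31x31 kernel centered on each event; the same is then done for observed events. The thresholds and kernel size come from the slide; the use of scipy's connected-component labeling and the synthetic field are assumptions:

```python
import numpy as np
from scipy import ndimage

def collect_event_samples(field, threshold=25.0, min_pts=50, max_pts=500,
                          half_width=15):
    """Find contiguous events (e.g., rain > 25 mm) covering 50-500 grid
    points and return the 31x31 kernel centered on each event."""
    labels, n = ndimage.label(field > threshold)       # connected regions
    kernels = []
    for k in range(1, n + 1):
        pts = np.argwhere(labels == k)
        if not (min_pts <= len(pts) <= max_pts):
            continue                                    # outside size range
        ci, cj = pts.mean(axis=0).round().astype(int)   # event center
        i0, j0 = ci - half_width, cj - half_width
        i1, j1 = ci + half_width + 1, cj + half_width + 1
        if i0 >= 0 and j0 >= 0 and i1 <= field.shape[0] and j1 <= field.shape[1]:
            kernels.append(field[i0:i1, j0:j1])         # keep kernels on-grid
    return kernels

# Two synthetic rain cells exceeding the 25 mm threshold
y, x = np.mgrid[0:200, 0:200]
fcst = np.zeros((200, 200))
for cy, cx in [(60, 70), (140, 120)]:
    fcst += 40.0 * np.exp(-((y - cy)**2 + (x - cx)**2) / (2 * 8.0**2))
samples = collect_event_samples(fcst)
print(len(samples), np.mean(samples, axis=0).shape)     # -> 2 (31, 31)
```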

  18. Event-Based Verification: Collecting the Samples
  [Figure: Schematic of a forecast event and observations, with the event center (x) and the collection kernel]

  19. Event-Based Verification: Mistral Speed Statistics
  • 66-hour wind speed forecasts for 2000-01 over the Mediterranean Sea
  • Speed greater than 12 m/s, direction 270-70 deg., covering 50-500 grid points
  • Verified against SSM/I satellite observations
  [Figure: Composite forecast (shade) vs. observed (contour) wind speed, and RMS (shade) with bias (contour)]

  20. Event-Based Verification: CONUS Warm Season Precipitation
  • 24-hour precipitation forecasts for April-September 2003 over the full CONUS
  • Rain events greater than 25 mm covering 50-500 grid points
  • Verified against River Forecast Center precipitation analysis
  [Figure: Average rain (mm) given an event was predicted, and given an event was observed; forecast shaded, observations contoured]

  21. Validation of Coupled Models: Outline
  • Atmospheric Models
    • Global: Anomaly Correlation • RMS, Bias Errors • Tropical Cyclone Track • Scorecard
    • Mesoscale: Idealized Flow Studies • RMS, Bias • Qualitative Verification (Case Studies) • Event-Based Verification
  • Ocean Models
    • Global: Features/Positions • Sea Surface Height Validation • Anomaly Correlation • Sea Surface Temperature Validation
    • Mesoscale: Transport • Sea Level Height • T/S Profiles • Coastal Issues • Coupling Issues

  22. Validation of the Gulf Stream Position in the Navy Layered Ocean Model (NLOM)

  23. Validation of Eddy Kinetic Energy (EKE) in 1/8-degree Global NCOM
  NCOM: Navy Coastal Ocean Model
  [Figure: Mean EKE (cm2/s2) at 700 m depth during 1998-2000 from global NCOM with data assimilation and free-running; compared with climatological EKE near 700 m depth in the western North Atlantic, taken from Schmitz (1996), which adapted data from Owens (1984, 1991) and Richardson (1993)]
  In comparison to the free-running case, EKE at 700 m in the assimilative case is generally higher and in closer agreement with historical observations, showing the two regions of relatively high EKE south of Nova Scotia and Newfoundland.
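EKE is computed from velocity fluctuations about the time mean, EKE = ½(u'² + v'²), here in cm²/s² to match the figure. A minimal sketch; the array shapes and synthetic currents are illustrative:

```python
import numpy as np

def eddy_kinetic_energy(u, v):
    """Mean eddy kinetic energy (cm^2/s^2) from velocity time series
    u, v with shape (time, lat, lon), in cm/s, at a fixed depth."""
    up = u - u.mean(axis=0)     # velocity anomalies about the time mean
    vp = v - v.mean(axis=0)
    return 0.5 * (up**2 + vp**2).mean(axis=0)

# Illustrative use: three years of daily 700 m currents on a small grid
rng = np.random.default_rng(4)
u = 10.0 + 15.0 * rng.standard_normal((1095, 40, 60))  # cm/s
v = 15.0 * rng.standard_normal((1095, 40, 60))
print(eddy_kinetic_energy(u, v).mean())  # domain-mean EKE, cm^2/s^2
```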

  24. Validation of the Navy Layered Ocean Model (NLOM): Anomaly Correlation
  42 30-day forecasts from 20 Dec 2000 to 24 Oct 2001
  [Figure: SSH and SST anomaly correlation for the global domain, the Kuroshio, and the Gulf Stream; blue line: persistence, red line: NLOM]
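The persistence baseline in the figure carries the initial anomaly field forward unchanged, and the model shows skill where its anomaly correlation stays above that curve. A compact sketch of scoring one forecast case against persistence; the fields are synthetic, and anomaly_correlation is as in the earlier sketch:

```python
import numpy as np

def anomaly_correlation(x, a, c):
    """Centered anomaly correlation of field x vs. analysis a, climatology c."""
    xa, aa = (x - c).ravel(), (a - c).ravel()
    xa, aa = xa - xa.mean(), aa - aa.mean()
    return np.dot(xa, aa) / np.sqrt(np.dot(xa, xa) * np.dot(aa, aa))

# One 30-day SSH case: persistence = the initial analysis held fixed
rng = np.random.default_rng(5)
clim = rng.normal(0.0, 10.0, (50, 80))                # climatology (cm)
a0 = clim + rng.normal(0.0, 8.0, (50, 80))            # initial analysis
a30 = clim + 0.5 * (a0 - clim) + rng.normal(0.0, 6.0, (50, 80))  # verifying
f30 = a30 + rng.normal(0.0, 5.0, (50, 80))            # model forecast, day 30
print(f"model AC {anomaly_correlation(f30, a30, clim):.2f} vs "
      f"persistence AC {anomaly_correlation(a0, a30, clim):.2f}")
```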

  25. NOGAPS/POP Air-Ocean Coupling: Air-Ocean with Data Assimilation/Forecast Cycle
  OMVOI: Ocean Multivariate Optimum Interpolation Analysis; POP: Parallel Ocean Program Prediction Model
  [Figure: Schematic of the coupled cycle: NAVDAS and NOGAPS (atmosphere), OMVOI and POP (ocean)]
  • SST predictions from the POP model using NOGAPS forcing verify better than persistence forecasts
  • Analysis-only produces significant errors in coastal boundary currents
  • Reduced errors demonstrate the importance of the model to data assimilation

  26. Validation of Coupled Models: Outline
  • Atmospheric Models
    • Global: Anomaly Correlation • RMS, Bias Errors • Tropical Cyclone Track • Scorecard
    • Mesoscale: Idealized Flow Studies • RMS, Bias • Qualitative Verification (Case Studies) • Event-Based Verification
  • Ocean Models
    • Global: Features/Positions • Sea Surface Height Validation • Anomaly Correlation • Sea Surface Temperature Validation
    • Mesoscale: Transport • Sea Level Height • T/S Profiles • Coastal Issues • Coupling Issues

  27. Validation of the Intra-Americas Sea Nowcast/Forecast System (IASNFS)
  Run Daily to 72 h (http://www7320.nrlssc.navy.mil/IASNFS_WWW/)
  • MODAS: Modular Ocean Data Assimilation System
    • 2D Optimum Interpolation Analysis
    • Synthetic T/S profiles generated, used as observations
    • All observations assimilated during a 12-hour pre-forecast period
  • NCOM: Navy Coastal Ocean Model
    • 1/24-degree grid spacing
    • 40 vertical levels (20 sigma/20 z)
    • NOGAPS Forcing
  [Figure: Domain and bathymetry]

  28. Validation of the Intra-Americas Sea Nowcast/Forecast System (IASNFS)
  NCOM Predicted Transport (2001 Mean) vs. Observations
  [Figure: Map of the Intra-Americas Sea with transport pairs at each passage, keyed as IASNFS/Observation (e.g., 28.2/31.7, 25.2/28.8, 26.0/28.0)]
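The transport pairs above are volume transports: the integral of the velocity normal to a section over the section area, conventionally in Sverdrups (1 Sv = 10^6 m^3/s). A minimal sketch under the assumption of a velocity section on regular depth/width intervals; the sample jet is made up:

```python
import numpy as np

SV = 1.0e6  # 1 Sverdrup = 10^6 m^3/s

def section_transport(v_normal, dz, dy):
    """Volume transport (Sv) through a section: normal velocity
    (m/s, shape (depth, width)) integrated over the cell areas."""
    return (v_normal * dz[:, None] * dy[None, :]).sum() / SV

# Illustrative Florida Strait-like section: ~800 m deep, ~80 km wide
nz, ny = 40, 40
dz = np.full(nz, 20.0)             # 20 m layers
dy = np.full(ny, 2000.0)           # 2 km cells
z = np.cumsum(dz) - dz / 2
v = 1.5 * np.exp(-z[:, None] / 300.0) * np.ones(ny)[None, :]  # decaying jet
print(f"{section_transport(v, dz, dy):.1f} Sv")
```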

  29. Validation of the IASNFS Predictions of Sea Level Height Comparison to Tide Gauges

  30. Validation of the IASNFS Predictions of Sea Level Height Comparison to Persistence
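Slides 29 and 30 compare predicted sea level at tide-gauge sites against the gauge records and against persistence. A common summary is the correlation and RMS difference of the demeaned series; the sketch below assumes hourly series and treats persistence as the observation one forecast lead earlier (all values synthetic):

```python
import numpy as np

def sl_skill(pred, gauge):
    """Correlation and RMS difference (cm) between demeaned predicted
    and observed sea level time series at one station."""
    p, g = pred - pred.mean(), gauge - gauge.mean()
    return np.corrcoef(p, g)[0, 1], np.sqrt(((p - g) ** 2).mean())

# Synthetic hourly sea level with a fortnightly signal
rng = np.random.default_rng(6)
t = np.arange(24 * 30)
gauge = 10.0 * np.sin(2 * np.pi * t / (24 * 14)) + rng.normal(0, 2, t.size)
model = gauge + rng.normal(0, 3, t.size)   # model tracks the gauge with error

lead = 24                                  # 24 h forecast lead
gauge_v, model_v = gauge[lead:], model[lead:]
persist_v = gauge[:-lead]                  # persistence: gauge value 24 h ago
for name, series in [("model", model_v), ("persistence", persist_v)]:
    corr, rms = sl_skill(series, gauge_v)
    print(f"{name}: corr = {corr:.2f}, rms = {rms:.1f} cm")
```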

  31. Validation of IASNFS Temperature and Salinity (T/S) Profiles
  Comparisons to (non-assimilated) CTD data
  [Figure: T and S profiles at four stations (x); red line: CTD observation, blue line: IASNFS]
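Comparing against non-assimilated CTD casts gives an independent check. A minimal sketch: interpolate the model profile to the CTD sample depths and summarize the misfit (the profiles here are synthetic):

```python
import numpy as np

def profile_rmse(z_model, t_model, z_ctd, t_ctd):
    """RMS difference between a model temperature profile and a CTD cast,
    after interpolating the model to the CTD sample depths."""
    t_interp = np.interp(z_ctd, z_model, t_model)  # model at CTD depths
    return np.sqrt(((t_interp - t_ctd) ** 2).mean())

# Illustrative profiles: warm mixed layer over a thermocline
z_model = np.linspace(0, 500, 40)
t_model = 28.0 - 18.0 / (1 + np.exp(-(z_model - 120) / 40))
rng = np.random.default_rng(7)
z_ctd = np.linspace(0, 480, 97)
t_ctd = 28.0 - 18.0 / (1 + np.exp(-(z_ctd - 110) / 40)) + rng.normal(0, 0.3, 97)
print(f"T profile RMSE = {profile_rmse(z_model, t_model, z_ctd, t_ctd):.2f} degC")
```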

  32. Eastern Pacific Air-Ocean Coupling: Coastal Issues
  Validation of Wind Stress for the 9 km Nest in EPAC (81, 27, and 9 km grids)
  [Figure: Wind stress time series; black line: stress calculated from observations, blue line: stress from operational COAMPS interpolated to a lat/lon grid, red line: stress from the COAMPS reanalysis on its native grid. Figure courtesy of John Kindle, NRL SSC]
  Results indicate that unfiltered, native-grid fields are required for proper forcing and validation along coasts.

  33. Ocean-Atmosphere Nested Modeling of the Adriatic Sea during Winter and Spring 2001
  Meteorology and Oceanography in the Adriatic
  [Figure: Map of the Adriatic showing the Po River and the Bora wind region]
  • Atmosphere:
    • Bora: Strong, localized northeasterly winds around the Istrian peninsula
    • Scirocco: Strong, warm southeast winds
  • Ocean:
    • Cyclonic cells in the central and southern regions
    • River runoff and strong winds create large variability in the northern Adriatic

  34. Collaboration with the Adriatic Circulation Experiment (ACE)
  • Objectives:
    • Simulate Adriatic atmospheric and oceanic circulation at high resolution
    • Document and understand the response of the shallow northern Adriatic waters to forcing by the Bora and Po river run-off
    • Quantify the effects of coupling (e.g., one-way, two-way, frequency, resolution) on atmosphere and ocean forecasts
    • Aid in planning and interpreting Adriatic Circulation Experiment (ACE) observations
  1. Generate 27 km atmospheric forcing fields over the Med
  2. Generate a 6 km, 2-year spin-up of the Med using forcing from #1, then 12-hour data assimilation for October 1999
  3. Generate 4 km atmospheric forcing fields over the Adriatic Sea
  4. Generate 2 km Adriatic forecasts using initial conditions and inflow from #2 and atmospheric forcing from #3
  [Figure: Nesting schematic: 81 km and 36 km COAMPS™ grids with 27, 12, and 4 km nests provide momentum and heat fluxes, initial conditions, and lateral boundary forcing to the 6 km and 2 km NCOM grids]

  35. Ocean-Atmosphere Nested Modeling of the Adriatic Sea during Winter and Spring 2001
  COAMPS Wind Stress (Mean and RMS vector amplitude), 28 January - 4 June 2001
  [Figure: Mean and RMS wind stress from the 36 km and 4 km COAMPS grids]

  36. Ocean-Atmosphere Nested Modeling of the Adriatic Sea during Winter and Spring 2001
  COAMPS/NCOM Model Circulation: EOFs (NCOM: 2 km grid spacing)
  [Figure: Mode 1 EOFs of COAMPS wind stress curl and of NCOM 5 m and 25 m velocity, under 36 km vs. 4 km forcing]
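The mode 1 patterns in the figure are leading EOFs: the dominant spatial pattern of variability of a field such as wind stress curl or current velocity, obtainable from the SVD of the demeaned time-space data matrix. A minimal sketch with a synthetic field:

```python
import numpy as np

def leading_eof(field):
    """Leading EOF of a (time, lat, lon) field via SVD of the demeaned
    data matrix; returns the spatial pattern, its principal-component
    time series, and the fraction of variance explained."""
    nt = field.shape[0]
    X = field.reshape(nt, -1)
    X = X - X.mean(axis=0)                     # remove the time mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    pattern = Vt[0].reshape(field.shape[1:])   # mode 1 spatial pattern
    pc = U[:, 0] * s[0]                        # mode 1 time series
    return pattern, pc, s[0] ** 2 / np.sum(s ** 2)

# Illustrative daily currents dominated by one oscillating pattern
rng = np.random.default_rng(8)
base = rng.standard_normal((30, 40))
t = np.arange(120)
field = (np.sin(2 * np.pi * t / 30)[:, None, None] * base
         + 0.3 * rng.standard_normal((120, 30, 40)))
_, _, var = leading_eof(field)
print(f"mode 1 explains {100 * var:.0f}% of the variance")
```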

  37. Results:
  (1) 4 km and 36 km winds have similar correlation to observations
  (2) The ocean model performs better with 4 km winds
  [Figure: Comparison of 10 m winds with observations (top, atmosphere) and 25 m ocean currents with observations (bottom, ocean), using 36 km (blue) and 4 km (red) atmospheric forcing]
  Results suggest that the effect on an ocean model should be a metric in the validation of atmospheric models, and that high-resolution forcing fields improve ocean forecasts.

  38. Importance of Temporal Resolution of Ocean Forcing
  • Comparison of NCOM runs using 1 h, 6 h, and 12 h COAMPS™ forcing
  [Figure: 1 h and 6 h frequency runs vs. 12 h frequency runs]
  Preliminary results suggest that significant differences exist when forcing an ocean model at 12 h frequency as opposed to 1 h or 6 h frequency.
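The sensitivity to forcing frequency can be illustrated by subsampling a wind time series and linearly interpolating it back, which is effectively what an ocean model sees between forcing updates; 12 h sampling can alias away the diurnal signal entirely. A toy sketch (the diurnal sea-breeze signal and amplitudes are made up):

```python
import numpy as np

# How much wind-forcing variability is lost at coarser coupling intervals?
t = np.arange(0, 24 * 10, 1.0)                     # hours, 10 days
rng = np.random.default_rng(9)
wind = 6.0 + 3.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

for dt in (1, 6, 12):
    ts = t[::dt]                                   # forcing update times
    recon = np.interp(t, ts, wind[::dt])           # what the ocean model sees
    rms = np.sqrt(((recon - wind) ** 2).mean())
    print(f"{dt:2d} h forcing: RMS of missed variability = {rms:.2f} m/s")
```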

  39. Real-Time COAMPS Support for AOSN II
  AOSN II: Adaptive Ocean Sampling Network II
  Quadruple-nest grid (81, 27, 9, and 3 km) built for the AOSN area
  [Figure: Sample 10 m wind speeds (m/s) from the inner three meshes (27, 9, and 3 km)]
  • Twice Daily Forecasts to 72 h with Data Assimilation
  • NOGAPS Lateral Boundary Conditions
  • SGI Origin 3900 at FNMOC DoD HPC DC Facility
  • Real-Time Winds and Fluxes Used to Force Multiple Ocean Models

  40. Concluding Remarks: Atmospheric Model Validation
  • Many Tools Available:
    • RMS, Bias
    • Anomaly Correlation
    • Idealized Tests
    • Threat Scores
    • Event-Based Validation
    • Qualitative/Quantitative Case Studies
  • Long-Term Studies are Mandatory:
    • Avoid Simplistic Answers from a Single “Case Study”
  • Minimum Requirements for Evaluation of Systems:
    • 2-week Periods for Summer and Winter
    • Test Over Several Different Geographical Areas
  • Simple Questions/Complex Answers to Validation:
    • Grid Structures
    • Formulation of Dynamics
    • Physical Parameterizations/Interactions
    • Data Assimilation Issues (e.g., QC, Analysis Techniques, Initialization, First-Guess)
    • Sensitivity in Specific Grid-Point Validation
    • Representativeness of What is Being Validated (i.e., Resolution)
    • Bugs (Validation requires understanding of the code; no “black box” mentality)

  41. Concluding Remarks: Ocean/Coupled Model Validation
  • Many Tools Available:
    • RMS, Bias
    • Anomaly Correlation
    • Qualitative/Quantitative Case Studies
    • Idealized Test Cases
  • Validation/Performance Affected by the Atmospheric Model:
    • Resolution (Spatial and Temporal)
    • Grid: Native vs. Interpolated/Filtered
  • Long-Term Studies are Mandatory
  • Unique Validation Parameters:
    • Transport
    • Sea Surface Height
    • Tides
  • Simple Questions/Complex Answers to Validation, as in the Atmospheric Models

  42. Concluding Remarks: Challenges
  • Demonstrating improved skill is becoming more difficult:
    • Models have improved tremendously
    • Modeling systems are much more complex
    • Requires thorough understanding of the model(s); no “black box” mentality
  • More validation metrics are needed, especially for mesoscale modeling:
    • Higher resolution does not always translate to improved skill scores
    • Phase/pattern shifting validation?
  • Expect a dramatic increase in remotely-sensed data; how should it be applied to the validation of models?
  • Coupled modeling complicates the validation process:
    • Air/ocean interactions/feedbacks
    • What if atmosphere forecasts are better (worse) and ocean forecasts are worse (better)?
  • Additional resources needed:
    • Commitment of more resources to validation (and to preparing efficient code)

  43. Concluding Remarks: Lessons Learned from Model Validation/Development
  • Important, Do Right:
    • Listen to the Customer
    • Data Assimilation
    • Configuration Management
    • Lower Boundary Condition
    • Physical Parameterizations
    • Validation/Verification
    • Efficiency
  • Be Creative, Build Flexibility into the System
  • Important, Don’t Do Wrong:
    • Numerics
    • Grid Configuration/Flexibility/Relocatability
    • Upper, Lateral Boundary Conditions
    • Horizontal Diffusion
  • Database Issues:
    • Portability
    • Resolution (terrain, coastlines, etc.)
  • Plug-compatible code
  • Use Standard “Sane” FORTRAN, UNIX

  44. Validation of Coupled Models
  Richard M. Hodur, Naval Research Laboratory, Monterey, CA 93943-5502, hodur@nrlmry.navy.mil
  Short Course on Significance Testing, Model Evaluation, and Alternatives, 11 January 2004, Seattle, WA
  Outline: • Introduction • Atmospheric Models • Ocean Models • Concluding Remarks
