Update to COPC, 21 November 2013
Chuck Skupniewicz, FNMOC, UEO co-chair
Yuejian Zhu, EMC, UEO co-chair
Dave McCarren, NUOPC DPM

Agenda
• NUOPC UEO Committee Update
• NUOPC Verification Metrics through Oct 2013
• NUOPC CMA Committee Update
• Discussion
NUOPC and the North American Ensemble Forecast System (NAEFS)
Q: What's the difference? A: Not much.
• NAEFS focuses on the coordination and sharing of research, development, production, and distribution of ensemble members for weather forecasting at each agency.
• NUOPC focuses on the centralized (common) and de-centralized (mission-unique) post-processing of multi-model ensemble products.
According to the UEO Operational Management Plan signed by AFWA, FNMOC, and NCEP, Operational Prediction Centers will:
• Evaluate candidate NUOPC products
  – NCEP and FNMOC will execute reanalyses of agreed-upon periods
  – Reanalyses will include CMC ensemble members
  – At a minimum, reanalyses will use the NUOPC metrics
• Approve selection of NUOPC products to be made available for user access
• Approve final implementation decisions with oversight from the NUOPC ESG
Delivery Schedule (times after cycle start; some steps exist today, others are planned for 2014)
• +6:10 – Raw members delivered (ensemble model runs completed; raw members and partners' raw members transmitted)
• +7:00 – Level 1 single-model products: L1 debiased members and single-model statistics (post-processed and transmitted, along with partners' L1)
• +8:00 – Level 2 multi-model products: L2 group-debiased members and group statistics (post-processed and transmitted, along with partners' L2)
• +9:00 to +10:00 – Level 3 multi-model products: group statistics and downscaled products
Notes
1. "Statistics" include mean, standard deviation, and threshold probabilities
2. "Group statistics" are for the multi-model ensemble
3. "L1" is single-model bias correction by each center
4. "L2" is multi-model calibration based on L1 (e.g., adjustment with the NCEP analysis)
5. "L3" is user- (or OPC-) specified products based on L2 (e.g., downscaled probabilistic products)
• Partners may split responsibilities for group debiasing or statistics (a sketch of these statistics and the debiasing step follows below)
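The per-member statistics and the L1 debiasing step named in the notes are pointwise operations on the ensemble. The Python sketch below is illustrative only: the array shapes, the 2% decaying-average weight, and the function names are assumptions for the example, not the operational NAEFS/NUOPC configuration.

```python
import numpy as np

def ensemble_statistics(members, threshold=None):
    """Per-grid-point statistics for one model's ensemble.

    members: array of shape (n_members, ny, nx) for a single field and lead time.
    Returns the ensemble mean, standard deviation (spread), and, if a threshold
    is given, the exceedance probability (fraction of members above threshold).
    """
    mean = members.mean(axis=0)
    spread = members.std(axis=0, ddof=1)
    prob = (members > threshold).mean(axis=0) if threshold is not None else None
    return mean, spread, prob

def update_bias(prev_bias, forecast_mean, analysis, weight=0.02):
    """Decaying-average bias estimate: b_new = (1 - w)*b_old + w*(forecast - analysis)."""
    return (1.0 - weight) * prev_bias + weight * (forecast_mean - analysis)

def debias_members(members, bias):
    """L1-style correction: subtract the estimated bias from every member."""
    return members - bias
```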
2014 UEO Plans
• All three agencies have agreed to a multi-month, multi-model validation this winter using most of the agreed-upon NUOPC metrics.
• Each agency will share its results and make recommendations through the UEO committee; each agency's focus will be on the statistical products most useful to its customer base.
• The UEO will present consensus recommendations at the COPC spring meeting, including a recommendation on shared production responsibilities.
• COPC-approved multi-model products will be produced and distributed to the NOMADS server.
NUOPC Verification Metrics (EMC/NCEP, 1 November 2013): for all three individual bias-corrected ensemble forecasts (NCEP/GEFS, CMC/GEFS, and FNMOC/GEFS) and the combined (NUOPC) ensemble (equal weights), verified against the UKMet analysis.
[Figure: Northern Hemisphere 500 hPa height, 30-day running mean scores of day-5 forecasts — CRPS skill score, RMS error and ratio of RMS error to spread (ratio above 1 indicating under-dispersion, below 1 over-dispersion), and anomaly correlation. All other regions: http://www.emc.ncep.noaa.gov/gmb/yluo/naefs/VRFY_STATS/T30_P500HGT]
[Figure: Northern Hemisphere 500 hPa height, 30-day running mean scores of day-10 forecasts — CRPS skill score, RMS error and ratio of RMS error to spread, and anomaly correlation. All other regions: http://www.emc.ncep.noaa.gov/gmb/yluo/naefs/VRFY_STATS/T30_P500HGT]
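For reference, the quantities plotted in these verification figures (CRPS skill score, RMS error, ratio of RMS error to spread, and anomaly correlation) can be computed roughly as in the sketch below. The choice of reference forecast for the skill score and the omission of area weighting are simplifying assumptions for the example, not details specified by these slides.

```python
import numpy as np

def crps(members, obs):
    """Sample CRPS for one ensemble/observation pair:
    CRPS = E|X - y| - 0.5 * E|X - X'| over the empirical ensemble distribution."""
    x = np.asarray(members, dtype=float)
    return np.mean(np.abs(x - obs)) - 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))

def crps_skill_score(crps_fcst, crps_ref):
    """Skill relative to a reference forecast (e.g. climatology): 1 = perfect, 0 = no skill."""
    return 1.0 - crps_fcst / crps_ref

def rmse_spread_ratio(ens_mean, analysis, spread):
    """RMS error of the ensemble mean divided by mean spread; > 1 suggests under-dispersion."""
    rmse = np.sqrt(np.mean((ens_mean - analysis) ** 2))
    return rmse / np.mean(spread)

def anomaly_correlation(forecast, analysis, climatology):
    """Centered pattern correlation of forecast and analysis anomalies about climatology."""
    fa = (forecast - climatology).ravel().astype(float)
    aa = (analysis - climatology).ravel().astype(float)
    fa -= fa.mean()
    aa -= aa.mean()
    return float(np.dot(fa, aa) / np.sqrt(np.dot(fa, fa) * np.dot(aa, aa)))
```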
[Figure: Day-5 forecasts of surface temperature — CRPS skill scores and RMS errors for the Northern Hemisphere (NH) and North America (NA)]
[Figure: Day-5 and day-10 forecast verification for the surface wind U component]
[Figure: Day-5 and day-10 forecast verification for the surface wind V component]
[Figure: Tropical cyclone track error (NM) vs. forecast hour for AL01-13, EP01-17, and WP03-29, May-October 2013; case counts by forecast hour: 493, 448, 401, 356, 300, 206, 129, 81]
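Track errors such as those summarized in this figure are conventionally the great-circle distance, in nautical miles, between each forecast position and the verifying best-track position. The haversine-based sketch below is a generic illustration of that calculation, not the verification code behind these numbers.

```python
import math

def track_error_nm(fcst_lat, fcst_lon, best_lat, best_lon):
    """Great-circle distance (nautical miles) between a forecast TC position and the
    verifying best-track position, via the haversine formula.
    Earth radius taken as 6371 km; 1 NM = 1.852 km."""
    dlat = math.radians(best_lat - fcst_lat)
    dlon = math.radians(best_lon - fcst_lon)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(fcst_lat)) * math.cos(math.radians(best_lat))
         * math.sin(dlon / 2) ** 2)
    dist_km = 2 * 6371.0 * math.asin(math.sqrt(a))
    return dist_km / 1.852

# Example: a forecast position one degree of latitude off the best track is ~60 NM in error.
print(round(track_error_nm(25.0, -75.0, 26.0, -75.0)))  # ~60
```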
Common Model Architecture
• Draft whitepaper sent to Liaisons for review on 4 Sep
• Being developed into a BAMS article to address the National Research Council report "A National Strategy for Advancing Climate Modeling"
• Focus on advanced capability and interoperability through the Earth System Prediction Suite (ESPS)
  – ESPS is a collection of Earth system component and model codes that are interoperable, documented, and available for integration and use
  – ESPS implementation is part of a project awarded under ESPC entitled "An Integration and Evaluation Framework for ESPC Coupled Models"
  – ESPS website with draft inclusion criteria and list of candidate models (Coupled, Atmosphere, Ocean, Ice, and Wave): http://www.earthsystemcog.org/projects/esps/
NUOPC Layer Roadmaps
• The current set of roadmaps using the NUOPC Layer involves the following codes:
  – Navy NAVGEM and HYCOM coupled system
  – Navy COAMPS coupled system
  – NOAA Environmental Modeling System (NEMS) from NOAA NCEP EMC
  – NOAA Climate Forecast System (CFS) from NOAA NCEP EMC
  – WaveWatch 3 model from NOAA NCEP EMC and NRL
  – MOM5 ocean model from GFDL and CICE sea ice model from Los Alamos
  – GEOS-5 atmospheric general circulation model from NASA Goddard Space Flight Center
  – The Ionosphere Plasmasphere Electrodynamics model from the NOAA Space Weather Prediction Center
  – NASA Goddard Institute for Space Studies Model E
  – Community Earth System Model (CESM) from NCAR/DOE
• There are development pathways that traverse multiple groups, and outcomes that are interrelated:
  – Implementing GFDL MOM5 as a NUOPC component, coupling it to a NOAA NEMS atmosphere component, and exploring the use of this system as the architecture for the next version of CFS
  – Reconciling multiple versions of the HYCOM ocean model and using the resulting NUOPC HYCOM version in NEMS; a proposed activity would also couple this version of HYCOM to CESM
Common Model Architecture
• ESMF v6.3.0r expected release Dec 2013
• NUOPC Layer upgrades in ESMF v6.3.0r:
  – Developed new user orientation material with prototype codes: http://earthsystemcog.org/projects/nuopc/
  – Implemented standardization of component dependencies, establishing a standard way of assembling NUOPC-compliant components into a working application
  – Implemented NUOPC Layer compliance testing tools: NUOPC Compliance Checker and NUOPC Component Explorer
  – NUOPC Layer Reference and prototypes extended to include data dependencies during initialize, standardization of component dependencies, compliance, and multiple-time-level coupling
The Earth System Prediction Capability (ESPC) Inter-agency Project
Phase 0: Ongoing Collaborative Programs (operational short-range weather forecasting, research seasonal outlooks)
• Inter-agency global and mesoscale atmospheric model ensembles
  – Hurricane Forecast Improvement Program (HFIP: 3-7 days)
  – National Unified Operational Prediction Capability (NUOPC: 5-20 days)
  – National Multi-model Ensemble (NMME: 3-6 months)
  – Multi-model ensembles are more accurate at longer lead times
  – Distributed production centers leverage multi-agency and international computer infrastructure and investments
  – Skill improves with spatial resolution; all are run at sub-optimal but best affordable resolution
• Next-generation global atmospheric cloud-resolving models (GCRM) – DCMIP candidates
  – NMMB, FIM/NIM, Cubed Sphere, MPAS, NUMA, CAM-SE
  – High resolution for regional high-impact and extreme events
  – Adaptive/unstructured meshes allow computational efficiency
  – Potentially improved prediction at weather to short-term seasonal climate variability scales (5-100 days)
Phase I: ESPC Demonstrations (10 days to 1-2 years)
• Extreme Weather Events: Predictability of Blocking Events and Related High Impact Weather at Leads of 1-6 Weeks (Stan Benjamin)
• Seasonal Tropical Cyclone Threat: Predictability of Tropical Cyclone Likelihood, Mean Track, and Intensity from Weekly to Seasonal Timescales (Melinda Peng)
• Arctic Sea Ice Extent and Seasonal Ice Free Dates: Predictability from Weekly to Seasonal Timescales (Phil Jones)
• Coastal Seas: Predictability of Circulation, Hypoxia, and Harmful Algal Blooms at Lead Times of 1-6 Weeks (Gregg Jacobs)
• Open Ocean: Predictability of the Atlantic Meridional Overturning Circulation (AMOC) from Monthly to Decadal Timescales for Improved Weather and Climate Forecasts (Jim Richman)
Phase II: Decadal Prediction (5-30+ years)
• The decadal to multi-decadal prediction issue is more complex and more focused on the forced problem and limits of predictability:
  – Physical: solar variability, aerosols, volcanic activity, albedo, glacial and sea ice melt, ocean circulation and acidification, desertification…
  – Biogeochemical: ocean microbial activity, migrations including human, plant, and animal…
  – Societal: deforestation, agriculture, urbanization, industrial activity…
  – Political: carbon limits, economic cycles, policy, water resources, warfare…
• Leverage ongoing national and international efforts in defining an "operational" capability at these timescales: availability and reliability of information against decision requirements, and the format and mechanism for operational product generation, validation, and distribution.
Phase I: Demonstration Goals
• (2013) An Implementation Plan for each Demonstration Project
• (2013-2017) A better understanding of the bounds on prediction skill at various time and space scales in the current "skill nadir" at sub-seasonal to ISI lead times, for specific aspects of the Earth system important to decision makers
• (2018-2022) Improved operational prediction for informed decisions (Full Operational Capability (FOC) by 2025)
The Phase I Demonstrations seek to define:
• the current state of scientific understanding
• the current technological approach and maturity
• common skill metrics and case studies to explore areas of predictability that could lead to future operational prediction
• some measure of return on investment, i.e., computational cost vs. prediction skill of various approaches, resolutions, etc.