Global Change: Advancing the Understanding of Future Changes in Climate, Atmospheric Ozone, and Air Quality through Cyberinfrastructure
Donald J. Wuebbles, The Harry E. Preble Professor of Atmospheric Sciences, Department of Atmospheric Sciences, University of Illinois, Urbana, IL
Global Change: Not Just a Science Challenge but Also a Major Technology Challenge • Climate change ("global weirding") is one of the most important issues facing humanity • Over the next 5-10 years, models of the Earth's climate system will have much more complex representations of physical, chemical, and biological processes, will be run at much higher horizontal and vertical resolutions, and will be used to analyze hundreds of years of past and projected future climate • Extensive analyses will be needed to understand uncertainties • Extremely large observational and model datasets will need to be analyzed • Recovery of stratospheric ozone is expected under the Montreal Protocol but is not assured • Extensive modeling of coupled climate-chemistry processes will be required • Controlling local air quality is becoming a global issue
The Future of Global Change Modeling: Representing All Earth System Processes
Bottlenecks/Issues to Achieving Objectives • The science community is unsure how to effectively analyze the extensive amount of data already being produced • How do we deal with the even larger datasets expected in the future? • New "dynamical cores" are being developed to extend models to the petascale • Even with enhanced resolution, large issues remain in handling the many orders of magnitude of relevant spatial scales • Integration across disciplines is ever more important • Extensive interdisciplinary teams are necessary to address and analyze key issues in climate and in atmospheric chemistry • New challenges arise from the emphasis on impacts and adaptation
Cyberinfrastructure Challenges in Reaching the Objectives (1) • Petascale computing may not be sufficient to meet computing needs • Higher resolution (more grid boxes) is needed in climate models • Ensembles (6 or more) of model runs over many (100 or more) years are needed to represent the Earth system and its response (see the sketch after this list) • Runs to evaluate model sensitivity to initial conditions, emissions inputs, and alternative treatments of processes • Runs to evaluate uncertainties in model responses • "Real-time" analysis is likely to become more important • Meeting the needs of stakeholders and policymakers • Storage of, and access to, datasets for many decades
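To make the scale of such a campaign concrete, here is a minimal sketch (not from the talk; the scenario labels, `launch_run` helper, and perturbation scheme are all hypothetical) of enumerating an ensemble of perturbed-initial-condition runs across scenarios:

```python
# Minimal sketch: enumerating an ensemble campaign. launch_run is a
# stand-in for submitting one model integration to a batch queue;
# scenario names and the perturbation scheme are assumptions.
import itertools

SCENARIOS = ["scenario_low", "scenario_mid", "scenario_high"]
N_MEMBERS = 6     # "6 or more" ensemble members, per the slide
YEARS = 100       # "100 or more" simulated years per run

def launch_run(scenario: str, member: int, perturbation: float) -> str:
    """Stand-in for a job submission; returns a run identifier."""
    return f"{scenario}_m{member:02d}_dT{perturbation:+.0e}_{YEARS}yr"

jobs = [
    launch_run(s, m, 1e-14 * (m + 1))   # tiny initial-condition perturbation
    for s, m in itertools.product(SCENARIOS, range(N_MEMBERS))
]
print(f"{len(jobs)} runs x {YEARS} years = {len(jobs) * YEARS} simulated years")
```

Even this small matrix of 18 runs represents 1,800 simulated years of output to store and analyze.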
Cyberinfrastructure Challenges in Reaching the Objectives (2) • New techniques are required for data analysis • Challenges in data mining and visualization for both observed and model datasets, and in their comparison • "Current datasets are already too large to analyze," yet multiple datasets really need to be evaluated at once • "Latency" in accessing datasets from storage (one common mitigation is sketched after this list) • New visualization techniques for educating the public and policymakers
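One widely used mitigation for datasets too large for memory is lazy, chunked access, as in the xarray/dask idiom below; the file path and variable name are placeholders, not a real archive:

```python
# Sketch of out-of-core analysis with xarray + dask: the dataset is opened
# lazily in chunks, so an archive larger than memory can still be reduced.
# "ccsm_output.nc" and the variable name "tas" are placeholders.
import xarray as xr

ds = xr.open_dataset("ccsm_output.nc", chunks={"time": 120})  # lazy open
climatology = ds["tas"].mean(dim="time")  # builds a task graph; no I/O yet
result = climatology.compute()            # streams chunks through memory
```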
Cyberinfrastructure Challenges in Reaching the Objectives (3) • Enhanced interactions are needed between scientists and computer scientists to meet the specialized requirements of climate and atmospheric chemistry issues • Analysis software • Workflow tools • Advanced informatics • Network protocols • Coding languages (current climate models are largely written in Fortran) and the model build process • New distributed sensor networks will need to be integrated into the analysis system • Do all of this while reducing the carbon footprint of large-scale computing facilities
Data Integration Challenges Facing Climate Science • Models will generate more data in the near future than exist today • How best to collect, distribute, and find data on a much larger scale? • At each stage, tools must be developed to improve efficiency • Substantially more ambitious community modeling projects, at petabyte (PB, 10^15 bytes) and exabyte (EB, 10^18 bytes) scale, will require a distributed database • Metadata describing extended modeling simulations (e.g., atmospheric aerosols and chemistry, carbon cycle, etc.); a sketch of such a record follows below • How to make information understandable to end users so that they can interpret the data correctly • Integration of multiple analysis tools, formats, and data from unknown sources • Trust and security on a global scale • Courtesy: William Johnston et al., LBNL; slide info from Dean Williams, LLNL
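As an illustration of the metadata problem, here is a minimal sketch of the kind of record a distributed archive might attach to each simulation dataset; every field name and value below is illustrative, not an existing schema:

```python
# Illustrative metadata record for one simulation dataset in a distributed
# archive; field names and values are assumptions, not an existing schema.
import json

record = {
    "model": "CCSM",
    "experiment": "historical",              # assumed experiment label
    "ensemble_member": 1,
    "components": ["atmosphere", "ocean", "sea_ice", "land", "chemistry"],
    "resolution_deg": {"atmosphere": 0.5, "ocean": 0.1},
    "simulated_years": 100,
    "size_tb": 30,
    "checksum": "sha256:<placeholder>",      # integrity, for trust/security
    "archive_node": "example.host/ccsm",     # hypothetical location
}
print(json.dumps(record, indent=2))
```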
CCSM Coupling and Execution Flow • CAM: atmosphere model • CLM: land model (land/atm fluxes and land albedos) • CICE: sea ice model (ice/atm and ice/ocn fluxes and ice albedos) • POP: ocean model (sea surface temperature) • CPL: coupler (ocean/atm fluxes and ocean albedos) • Coupling of CAM/CLM/CICE/AOF occurs every CAM timestep (30 min) • Coupling of POP occurs every 24 hours (1° configuration) or every 6 hours (0.1° configuration) • CAM/CLM/CICE/AOF fluxes are averaged over the coupling interval (see the sketch below)
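A minimal sketch (not CCSM source code) of one simulated day of this asynchronous coupling cadence, with the fast components stepped every 30 minutes and fluxes averaged over each ocean coupling window:

```python
# Sketch of the coupling cadence described above: CAM/CLM/CICE exchange
# fluxes with the coupler every 30-minute step, and the 1-degree POP ocean
# receives fluxes averaged over each 24-hour window. Calls are stand-ins.
ATM_STEP_MIN = 30
OCN_WINDOW_MIN = 24 * 60       # use 6 * 60 for the 0.1-degree configuration

flux_sum, n_steps = 0.0, 0
for minute in range(0, 24 * 60, ATM_STEP_MIN):
    flux = 1.0                 # placeholder for the CAM/CLM/CICE exchange
    flux_sum += flux
    n_steps += 1
    if (minute + ATM_STEP_MIN) % OCN_WINDOW_MIN == 0:
        mean_flux = flux_sum / n_steps     # average over coupling interval
        # hand mean_flux to POP here; receive updated sea surface temperature
        flux_sum, n_steps = 0.0, 0
```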
Example: Storage Needs for NCAR CCSM • CCSM (100-year simulation): 0.5° atmosphere, 0.1° ocean ~30 TB; 0.125° atmosphere, 0.1° ocean ~70 TB • Typically 6 ensembles per scenario, plus 3-5 scenarios; a 1000-year past history requires 300-700 TB; 2x or more if chemistry and/or biogeochemistry are included (a rough consistency check follows below) • CCSM estimated storage for IPCC assessments (assuming a 1° ocean and monthly output): AR4 (2004-2005) 100 TB; AR5 (2010-2011) 1,000 TB; AR6 (2016-2017) 10,000 TB • NOTE: Current holdings in the entire NCAR MSS are ~10 PB
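A back-of-envelope reading of these figures (my arithmetic, not the slide's; the actual campaign total depends on how many runs use the finer resolution, which is why it lands near rather than exactly on the 300-700 TB range):

```python
# Rough consistency check of the figures above (assumed arithmetic, not
# from the slide; actual totals depend on the mix of resolutions used).
TB_PER_RUN = 30                   # 0.5-deg atm / 0.1-deg ocean, 100 years
for scenarios in (3, 5):          # 6 ensembles per scenario, 3-5 scenarios
    total = TB_PER_RUN * 6 * scenarios
    print(f"{scenarios} scenarios: {total} TB (x2 with chemistry: {2 * total} TB)")

# The IPCC assessment estimates grow roughly tenfold per cycle:
for ar, tb in [("AR4", 100), ("AR5", 1_000), ("AR6", 10_000)]:
    print(f"{ar}: {tb:,} TB")
```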