Multiscale SciDAC Process Integration: Progress and Prospects
Peter Caldwell, Phil Rasch, Hui Wan, Bereket Lebassi Habtezion, Carol Woodward, David Gardner
Thrust 1: Identify Problems • In Single Column Mode (yr 1) • In Short Forecasts (yr 2) • In Climate Simulations (yr 3) • Using UQ parameter sensitivity to resolution changes (yr 2)
Thrust 1a: Single-Column Tests
• We planned to use Sungsu Park's code, but he withdrew his support; creating our own code took 2 yrs.
• We will share our code with the Multiscale eval team, NCAR, and ACME.
• Bereket will submit a paper on aerosol initialization in the single column model next week.
• A paper on convergence tests is in prep (see fig; a toy illustration of the convergence-test procedure follows below).
Fig: Cloud fraction vs. height (km) profiles from DYCOMS RF02 (warm, drizzling stratocumulus) single-column runs with timesteps Δt of 20, 15, 10, and 5 min, for default and prescribed aerosol. Courtesy Bereket Habtezion.
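The convergence tests rerun the same case at successively shorter timesteps and measure differences against a fine-timestep reference. The snippet below is only an illustrative sketch of that procedure: a toy relaxation equation stands in for the single-column physics, and the timestep values are taken from the figure above; nothing here is project code.

```python
# Illustrative only: a toy relaxation ODE stands in for a single-column
# parameterization; a real test replaces integrate() with an SCM run.
def integrate(dt_s, t_end_s=3600.0, tau_s=900.0, c_eq=0.8, c0=0.2):
    """Forward-Euler solution of dc/dt = (c_eq - c)/tau at timestep dt_s."""
    c = c0
    for _ in range(int(round(t_end_s / dt_s))):
        c += dt_s * (c_eq - c) / tau_s
    return c

reference = integrate(dt_s=1.0)          # fine-timestep "truth"
for dt_min in (20, 15, 10, 5):           # timesteps shown in the figure
    err = abs(integrate(dt_s=60.0 * dt_min) - reference)
    print(f"dt = {dt_min:2d} min   |error| = {err:.3e}")
```

For a first-order scheme like this toy one, the printed error should roughly halve as the timestep halves; deviations from that behavior are the kind of signal the single-column tests look for.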
Thrust 1b: Short Forecasts
Performed by PNNL (see Wan et al., 2014a)
Thrust 1c: Climate Runs
• We performed these runs and presented them at AMWG 2013.
• We should write a paper comparing the impact of changing Δx versus Δt in CAM5!
Fig: Change in 2° CAM5 low cloud due to changing the physics timestep from 30 min to xx min
Thrust 1: Using UQ
• CAM includes many hard-coded iteration counts and numbers of substeps. Do these affect climate?
• Don did 270 runs based on Latin hypercube sampling to test this (a sketch of the sampling design follows below).
• Increasing the iteration/substep count *does* affect climate, so the approach seems fruitful.
• I gave Don a bad parameter that corrupted the results, so the runs need to be redone.
Fig: SWCF for each UQ run, stratified by the number of substeps taken within microphysics
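A minimal sketch of how such a Latin-hypercube design can be generated with SciPy is below. The parameter names and ranges are hypothetical placeholders, not the actual CAM5 iteration/substep controls; only the 270-member ensemble size comes from the slide above.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical integer-valued knobs standing in for CAM5's hard-coded
# iteration/substep counts (name: (low, high)); not real namelist variables.
params = {
    "micro_substeps":   (1, 8),
    "macro_iterations": (1, 6),
    "cldfrac_iters":    (1, 4),
}
lows  = [lo for lo, _ in params.values()]
highs = [hi for _, hi in params.values()]

sampler = qmc.LatinHypercube(d=len(params), seed=0)
design = qmc.scale(sampler.random(n=270), lows, highs)  # 270 runs, as on the slide
design = np.rint(design).astype(int)                    # substep counts are integers

for name, column in zip(params, design.T):
    print(name, "values sampled:", np.unique(column))
```

Each row of `design` would then define the substep/iteration settings for one model run, and quantities like SWCF can be stratified by any column, as in the figure.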
Thrust 2: Explore Coupling Strategies • Set up (yr 1) and run (yr 2) parallel-split physics • Look at impact of parameterization ordering (yr 4)
Thrust 2 Progress:
Motivation for task:
• CAM5 applies parameterizations sequentially, updating the model state after each one
• this can cause "pinballing between unrealistic states"
• Applying parameterizations in parallel avoids these oscillations
Doubts:
• So why does the canonical paper on physics coupling (Beljaars et al., 2004) say sequential splitting is better?
• Parallel splitting still applies a single process in isolation for the entire (~30 min) model timestep
Realization:
• Beljaars meant that processes should be ordered from longest timescale to shortest and applied sequentially, with the tendencies from previous processes used by subsequent parameterizations
• We have not implemented sequential-tendency splitting in CAM yet; doing so will require substantial software engineering.
Fig: Parallel, sequential-update split, and sequential-tendency split solutions to dx/dt = (dx/dt)_A + (dx/dt)_B, with (dx/dt)_A = -10 - x and (dx/dt)_B = 10 - x (a toy comparison of splittings follows below).
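The toy problem in the figure is easy to reproduce. The sketch below is one possible forward-Euler realization (not the CAM code) contrasting parallel and sequential-update splitting with a deliberately long step, to show the "pinballing" and the biased end state that motivate this thrust; the true solution of the combined equation simply decays to x = 0.

```python
# Toy coupling problem from the figure: dx/dt = f_A(x) + f_B(x),
# f_A = -10 - x, f_B = 10 - x.  Exact solution decays to x = 0.
f_A = lambda x: -10.0 - x
f_B = lambda x:  10.0 - x

def parallel_step(x, dt):
    """Both processes see the same beginning-of-step state; tendencies are summed."""
    return x + dt * (f_A(x) + f_B(x))

def sequential_update_step(x, dt):
    """Process B sees the state already updated by process A within the step."""
    x_mid = x + dt * f_A(x)            # unrealistic intermediate state
    return x_mid, x_mid + dt * f_B(x_mid)

x_par = x_seq = 1.0
dt = 0.4                               # long step relative to the unit process timescale
for n in range(6):
    x_par = parallel_step(x_par, dt)
    x_mid, x_seq = sequential_update_step(x_seq, dt)
    print(f"step {n+1}: parallel={x_par:+.3f}  sequential mid={x_mid:+.3f} end={x_seq:+.3f}")
```

With this step size the parallel-split solution decays toward the true equilibrium (0), while the sequential-update solution pinballs between a spuriously negative intermediate state and a positively biased end-of-step state.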
Thrust 3: Implement Improved Numerical Treatments
• Fix specific issues identified above (yrs 3-5)
• Create an implicit solver for clusters of physics parameterizations (yr 3)
• Evaluate impact of fixes (yrs 4-5)
Thrust 3 Progress:
Realization: A simple, idealized code is needed to bridge the gap between physics and FASTMath
• Approach: put CAM5 macro + microphysics into the Kinematic Driver (KiD) framework
   • Idealized dynamics: TVD advection from Leonard et al. (1993)
   • Code to optimize: macrophysics + microphysics
   • Interface: renames + reorders arrays
• Can leverage pre-existing test cases
• Macro + micro are a key source of error
• Macro + micro are taken directly from CAM, so improvements translate directly
Fig: Communicating across disciplines is hard!
Fig: Diagram showing how KiD works; blue parts are supplied by KiD, orange comes from CAM.
Fig: Liquid water content from the "warm1" test case, consisting of a constant updraft which dies after 600 s
Thrust 3 Progress:
KiD Progress:
• Implemented scheme
• Convergence requires a 4 s physics dt(!)
   • due to sedimentation substepping? (see the back-of-envelope sketch below)
• Implementing implicit solve now
Fig: Timeseries of |LWP - LWP(dt = 0.0625 s)| (kg m-2) for physics dt of 0.25 s, 4 s, 8 s, 16 s, 1 min, 4.3 min, and 17 min. In all cases, dynamics dt = 0.0625 s.
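A back-of-envelope way to see why sedimentation substepping could be the culprit: an explicit upwind sedimentation scheme is stable only when v·Δt/Δz ≤ 1, so the number of internal substeps changes with the physics timestep, which can mask formal convergence. The numbers below (fall speed, layer thickness) are illustrative assumptions, not the KiD/CAM settings.

```python
import math

def sedimentation_substeps(dt_phys_s, dz_m, fall_speed_ms):
    """CFL-limited substeps an explicit upwind sedimentation scheme would need
    within one physics step (illustrative assumptions only)."""
    dt_cfl = dz_m / fall_speed_ms                 # largest stable explicit substep
    return max(1, math.ceil(dt_phys_s / dt_cfl)), dt_cfl

# physics timesteps roughly spanning those in the figure above
for dt_phys in (1020.0, 60.0, 16.0, 4.0):
    n_sub, dt_cfl = sedimentation_substeps(dt_phys, dz_m=30.0, fall_speed_ms=5.0)
    print(f"dt_phys = {dt_phys:6.1f} s   CFL substep ≈ {dt_cfl:.1f} s   -> {n_sub} substeps")
```

With these illustrative numbers the substepping only disappears once the physics step drops below a few seconds, which is at least consistent with the observed ~4 s convergence threshold.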
Future Plans:
• Work with FASTMath people to improve the numerical implementation of macro, micro, and their coupling (using the KiD framework)
• Move the macro call into microphysics
• Implement an implicit solver for micro (with macro included); a minimal sketch of such a solve follows below
   • gets rid of the unphysical mass-fixer part of the code
   • fixes macro/micro coupling problems
• Use FASTMath technology to obtain the implicit solution more efficiently
• Explore other KiD cases (e.g. mixed-phase cloud)
• Finish KiD convergence testing (find/fix issues as needed)
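As a sketch of what an implicit solve for the coupled macro/microphysics could look like numerically, the snippet below applies backward Euler with a Newton iteration to a two-variable toy condensation system; the equations, timescales, and dimensions are stand-ins, not the CAM5 macro/microphysics or any FASTMath library interface.

```python
import numpy as np

def rhs(y, tau_fast=1.0, tau_slow=600.0, s_source=1e-3):
    """Toy stiff system: fast condensation of supersaturation s into cloud water qc,
    slow removal of qc (all quantities and timescales are illustrative)."""
    s, qc = y
    cond = s / tau_fast
    return np.array([s_source - cond, cond - qc / tau_slow])

def jac(y, tau_fast=1.0, tau_slow=600.0):
    """Analytic Jacobian of rhs (constant here because the toy system is linear)."""
    return np.array([[-1.0 / tau_fast, 0.0],
                     [ 1.0 / tau_fast, -1.0 / tau_slow]])

def backward_euler_step(y, dt, newton_iters=5):
    """Solve y_new = y + dt*rhs(y_new) with Newton's method."""
    y_new = y.copy()
    for _ in range(newton_iters):
        residual = y_new - y - dt * rhs(y_new)
        J = np.eye(2) - dt * jac(y_new)
        y_new -= np.linalg.solve(J, residual)
    return y_new

y = np.array([0.0, 0.0])
for _ in range(3):                    # stable even with dt much larger than tau_fast
    y = backward_euler_step(y, dt=1800.0)
print("state after three 30-minute steps:", y)
```

The point of the sketch is the structure: a single implicit update over the full physics step removes the need for tiny substeps (and mass fixers) that an explicit treatment of the fast term would require.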
Future Plans:
• Finish efforts to identify the main numerical errors in CAM physics
• Write a paper doing convergence tests in single-column mode
• Write a paper comparing timestep versus horizontal-resolution sensitivity in climate runs
• Expand numerics improvements beyond macro + micro
• Use the above experience to inform next steps:
   • Recast processes in sequential-tendency split form? (need coding help)
   • Multi-rate integrators? (a minimal subcycling sketch follows below)
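For the multi-rate idea floated above, one simple first-order realization is to evaluate the slow-process tendency once per outer step and hold it fixed as a forcing while the fast process is subcycled. The sketch below uses toy tendencies and made-up timescales; it only illustrates the control flow, not a proposed CAM design.

```python
def multirate_step(x, dt, n_fast, f_slow, f_fast):
    """One outer step: slow tendency frozen, fast process subcycled with it as forcing."""
    slow_tend = f_slow(x)              # evaluated once per outer step
    dt_fast = dt / n_fast
    for _ in range(n_fast):            # fast process sees the frozen slow forcing
        x = x + dt_fast * (slow_tend + f_fast(x))
    return x

# toy use: slow relaxation toward 1 (timescale 50) plus fast relaxation toward 0 (rate 5)
x = 1.0
for _ in range(10):
    x = multirate_step(x, dt=1.0, n_fast=20,
                       f_slow=lambda s: (1.0 - s) / 50.0,
                       f_fast=lambda s: -5.0 * s)
print(f"x after 10 outer steps: {x:.4f}")
```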
Thanks – Discussion?
(Milestones from Proposal Follow)
(Milestone tables from the proposal, by institution: LLNL and PNNL)
* Note: I'm including only BER Process Integration deliverables, not ASCR *
Tasks in Proposal:
• Single-column convergence tests (p. 25, yr 1)
• Set up parallel-split physics (yr 1) and test (yr 2)
• Use CAPT and UQ parameter sensitivity to dt and dx to identify numerical problems (yr 2)
• Test sensitivity in multi-yr GCM runs (yr 3)
• Implement and test fixes (yr 3)
• Create an implicit solver for clusters of physics parameterizations (yr 3)
• Continue improving physics, start evaluating changes (yr 4)
• Look at impact of changing parameterization ordering/coupling approach (yr 4)
• Finish development and evaluate impact (yr 5)
KiD Results – LWP