Advanced Simulation Capability for Environmental Management
Amanzi Overview
David Moulton (HPC Thrust Lead)
Mark Williamson (DOE EM Program Manager)
ASCEM Challenge and Impact

Challenge
• Current modeling and cleanup practices work for some sites, but many of the subsurface contamination problems at DOE sites have no practical remedy today
• Reduce the time and financial cost of remedial actions at sites within the EM complex by providing scientifically defensible modeling and simulation tools that accurately address complex environmental management situations
• Develop an integrated, high-performance computer modeling capability to simulate multiphase, multi-component, multi-scale flow and contaminant transport, waste degradation, and contaminant release
• Provide enhanced tools for decision making: parameter estimation, visualization, uncertainty quantification, data management, risk analysis, and decision support
• Leverage the significant investments already made by other DOE offices and Federal agencies to reduce ASCEM lifecycle development time and costs

Impact
• Near-term: technically underpin existing site remedial actions (RAs) and performance assessments (PAs)
• Inform strategic data collection for model improvement
• Scientifically defensible and standardized EM risk and performance assessments
ASCEM Is Delivered Through a National Laboratory Consortium
[Figure: map of consortium member laboratories, including Lawrence Livermore National Laboratory]
Advanced Simulation Capability for Environmental Management (ASCEM)

DOE Interaction Team
• Office of Science: (BER/BES) & (ASCR)
• Office of Nuclear Energy: (NEAMS) & (UFD)
• Office of Fossil Energy: (NRAP)

DOE-EM ASCEM Management Team
• Kurt Gerdes, Office Director EM-32
• Mark Williamson, Manager
• R. Patterson, Deputy

ASCEM Multi-Lab Management Team
• Paul Dixon, Multi-Lab Program Manager
• David Brown, Technical Integration Manager
• Roger Seitz, End User Interface
• Laboratory Leads

Thrust Areas
• Site Application Thrust: Mark Freshley, Mgr; Susan Hubbard, Deputy
• Platform and Integrated Toolsets Thrust: Ian Gorton, Mgr; Stefan Finsterle, Deputy
• Multi-Process HPC Simulator Thrust: David Moulton, Mgr; Carl Steefel, Deputy
• User Steering Committee

Partner Programs
• Applied Field Research Initiative: Attenuation-Based Remedies for the Subsurface, V. Freedman (PNNL)
• Applied Field Research Initiative: Deep Vadose Zone, S. Hubbard (LBNL)
• Applied Field Research Initiative: Remediation of Mercury and Industrial Contaminants, D. Watson (ORNL)
• Cementitious Barriers Partnership, D. Kosson (Vanderbilt U)
• Landfills Partnership, C. Benson (U of Wisconsin)

ASCEM website: http://ascemdoe.org/
ASCEM Organized Around Three Thrust Areas
ascemdoe.org
HPC Thrust Area Goals
• Modular HPC simulation capability for waste form degradation, multiphase flow, and reactive transport
• Efficient, robust simulation from supercomputers to laptops
• Design and build for emerging multi-core and accelerator-based systems
• Open-source project with strong community engagement

[Figure: the simulator targets a wide range of complexity and a wide range of platforms]
HPC Simulator Thrust Approach

Tasks:
• Process Models are described at a high level and driven by the Platform toolsets
• HPC Core Framework provides the necessary low-level services
• HPC Toolsets provide flexible access to advanced numerical methods and solvers
• Testing, V&V, and Benchmarking provides a hierarchical suite of automated tests
HPC High-Level Goals
• Build an interdisciplinary team from our collection of experts in earth science, applied mathematics, and computational science
• Establish an open-source tool chain supporting modern software development practices for the team
• Establish a common understanding of the problems of interest, scope, and requirements
• Launch development of a prototype open-source code
  • Designed and implemented parallel from the start
  • Leverage existing frameworks/tools
  • Looking ahead toward emerging architectures
Quality Assurance Graded Approach with Code Development
• CY10: ASCEM Phase I Demo
• CY12: ASCEM v1.0 User Release
• CY13: ASCEM v2.0 User Release
• CY14: ASCEM v3.0 User Release
• CY15: ASCEM v4.0, Full QA

Over this period, ASCEM model capabilities mature from R&D code, through community code, to regulatory code.
First Steps: Requirements and a Name
The Simulator is named Amanzi, which means "water" in Zulu, and the BSD open-source license was selected. (Photo courtesy of Carl Steefel)
Amanzi: Approach and Leveraging
• Leverage the Trilinos framework and supporting tools
• Leverage the Mesh-Oriented datABase (MOAB) library to develop support for unstructured meshes (API and mesh class)
• Explore/leverage structured AMR techniques and libraries (BoxLib)
• Develop the Multi-Process Coordinator (MPC) to manage the current state of the system, the coupling of processes, and the evolution through time (see the sketch below)
• Develop Process Kernels (PKs) for flow, transport, and geochemistry
• Using the new mesh class, develop the discrete operators needed by the flow and transport Process Kernels, building on advances in Mimetic Finite Difference (MFD) methods
• Develop a prototype of the Geochemistry Toolset that supports selected processes relevant to the F-Area
• Develop prototype input/output and error-handling capabilities
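The coordinator pattern above can be illustrated with a short sketch. Class and method names here are purely illustrative (Amanzi's actual MPC/PK interfaces are C++ and differ in detail); this shows only the operator-splitting control flow that the MPC manages:

```python
# Minimal sketch of a Multi-Process Coordinator (MPC) coupling process
# kernels (PKs) via first-order operator splitting. All names here are
# illustrative; they are not Amanzi's actual classes or API.

class ProcessKernel:
    """Interface each PK (flow, transport, geochemistry) would implement."""
    def advance(self, state, t, dt):
        raise NotImplementedError

class MultiProcessCoordinator:
    def __init__(self, kernels):
        self.kernels = kernels  # e.g., [flow_pk, transport_pk, chem_pk]

    def run(self, state, t0, t_final, dt):
        t = t0
        while t < t_final:
            step = min(dt, t_final - t)
            # Operator splitting: each PK advances the shared state in turn.
            for pk in self.kernels:
                state = pk.advance(state, t, step)
            t += step
        return state
```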
Unstructured Mesh Infrastructure
• Supports 2D/3D polyhedral meshes in large distributed settings
• Simple built-in mesh generation for parallel hexahedral meshes in rectangular domains
• Multiple external mesh frameworks under the hood for general meshes, including MOAB (ANL), MSTK (LANL), and STKmesh (SNL)
• Distributed meshes supported with one layer of ghost entities
• Simplified uniform API for accessing mesh functionality (sketched below) that supports:
  • Upward/downward connectivity queries
  • Membership of entities in sets (material region, boundary surface)
• Trilinos/Epetra compatible
• Accepts meshes in Exodus II (SNL), MOAB/HDF5 (ANL), and MSTK (LANL) formats, with plans to support others
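A hedged sketch of what such a uniform, framework-neutral mesh API looks like; the method names are hypothetical, not Amanzi's real mesh class:

```python
# Illustrative framework-neutral mesh interface of the kind described
# above. Method names are hypothetical; the real Amanzi mesh class
# (C++, Epetra-compatible) differs.

class MeshAPI:
    def __init__(self, backend):
        self.backend = backend  # e.g., a MOAB, MSTK, or STKmesh handle

    # Downward connectivity: cell -> faces -> nodes
    def cell_faces(self, cell_id): ...
    def face_nodes(self, face_id): ...

    # Upward connectivity: the cells sharing a given face
    def face_cells(self, face_id): ...

    # Set membership: material regions, boundary surfaces
    def entities_in_set(self, set_name, entity_kind): ...

    # Distributed-mesh queries (one layer of ghost entities)
    def num_owned_cells(self): ...
    def num_ghost_cells(self): ...
```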
HPC Unstructured Hexahedral Mesh
The unstructured hexahedral mesh of the F-Area with 25-ft horizontal resolution is shown. The hydrostratigraphic layers are colored in shades of brown, and the location of the largest F-basin is marked in cyan.
Flow Methods on Unstructured Meshes
Methods for the Richards and single-phase flow equations are based on mimetic finite difference (MFD) methods:
• No limitations on element shapes (pinch-outs)
• No computational degeneracies for flat dihedral angles (AMR)
• Efficiency: direct generation of the inverse of the mass matrices; common algorithmic design for flow, energy, and geomechanics equations
• Generality: the family includes mixed FEM and a few second-order FVMs
• Unique feature: the design is open to various optimization criteria (e.g., monotonicity)
• Leveraging developments from the ASCR, ASC, and SciDAC programs
• Future development of multiphase flow and thermal processes will build on these methods as well (the target equation is sketched below)
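For reference, a standard mixed (pressure-flux) statement of the Richards equation that these MFD discretizations target, written in conventional notation rather than quoted from the ASCEM requirements documents:

```latex
\frac{\partial(\phi\, s_\ell\, \rho)}{\partial t} + \nabla\cdot\mathbf{q} = Q,
\qquad
\mathbf{q} = -\frac{\rho\, k_r(s_\ell)}{\mu}\,\mathbf{K}\,\bigl(\nabla p - \rho\,\mathbf{g}\bigr),
```

where φ is porosity, s_ℓ liquid saturation, ρ and μ the liquid density and viscosity, k_r the relative permeability, K the absolute permeability tensor, and Q a source term. MFD methods build the mass matrix (and its inverse) for the flux equation cell by cell, which is why full permeability tensors and distorted polyhedral cells pose no special difficulty.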
Transport Methods on Unstructured Meshes
The TVD-MUSCL (Barth-Jespersen) method is used for the advective term, and linear (Lipnikov-Shashkov-Yotov) and nonlinear (Le Potier) MPFA-type methods for the dispersive term (a 1D illustration follows):
• Pros:
  • Discrete maximum principle for tracers
  • Compact formulas for advective and dispersive fluxes
  • Second-order accurate methods on polyhedral meshes
• Cons:
  • Linear MPFA cannot control monotonicity when the mesh is distorted
  • Newton methods for nonlinear MPFA have not been studied enough
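To make the limiting idea concrete, here is a minimal 1D reduction of a MUSCL advection step with Barth-Jespersen-style slope limiting. Amanzi applies the method on general polyhedral meshes, so this sketch only illustrates the principle:

```python
import numpy as np

def bj_limiter_slopes(u):
    """Limited slopes via a Barth-Jespersen-style rule, reduced to 1D:
    scale each cell's slope so face-extrapolated values stay within the
    min/max of the cell and its neighbors."""
    s = np.zeros_like(u)
    s[1:-1] = 0.5 * (u[2:] - u[:-2])                  # unlimited central slope
    phi = np.ones_like(u)
    for i in range(1, len(u) - 1):
        umax = max(u[i - 1], u[i], u[i + 1]) - u[i]   # >= 0
        umin = min(u[i - 1], u[i], u[i + 1]) - u[i]   # <= 0
        for du in (0.5 * s[i], -0.5 * s[i]):          # extrapolations to faces
            if du > 1e-14:
                phi[i] = min(phi[i], umax / du)
            elif du < -1e-14:
                phi[i] = min(phi[i], umin / du)
    return phi * s

def muscl_advect(u, vel, dx, dt, nsteps):
    """MUSCL update for u_t + vel*u_x = 0 (vel > 0) on a periodic 1D grid;
    dt must satisfy the CFL limit dt <= dx/vel."""
    for _ in range(nsteps):
        up = np.concatenate((u[-2:], u, u[:2]))       # two periodic ghost cells
        slope = bj_limiter_slopes(up)
        uface = up[1:-2] + 0.5 * slope[1:-2]          # upwind (left) face states
        flux = vel * uface                            # one flux per face
        u = u - dt / dx * (flux[1:] - flux[:-1])
    return u

# Example: advect a square pulse once around a periodic unit interval.
# x = np.linspace(0.0, 1.0, 101)[:-1]
# u0 = ((x > 0.2) & (x < 0.4)).astype(float)
# u1 = muscl_advect(u0, vel=1.0, dx=0.01, dt=0.005, nsteps=200)
```

With the CFL limit respected, the limited reconstruction keeps every cell value within the range of its neighbors, which is the discrete maximum principle noted above.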
Biogeochemical Reactions: Processes
• Equilibrium aqueous complexation
• Mineral precipitation-dissolution (transition state theory)
• Sorption
  • Isotherm-based (linear, Langmuir, Freundlich)
  • Ion exchange
  • Equilibrium surface complexation
• General kinetically controlled reactions
  • Sequential decay with daughter products (A → B → C → D)
  • Reversible (A + B ↔ C + D)
• Activity model: Debye-Hückel with B-dot extension
(A sketch of the isotherm models and decay chain follows.)
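A minimal sketch of the isotherm-based sorption models and the sequential decay chain listed above; function names and signatures are illustrative, not Amanzi's geochemistry API:

```python
import numpy as np

def linear_isotherm(c, kd):
    """Linear (Kd) sorption: s = Kd * c."""
    return kd * c

def langmuir_isotherm(c, k, s_max):
    """Langmuir sorption: s = s_max * K c / (1 + K c); saturates at s_max."""
    return s_max * k * c / (1.0 + k * c)

def freundlich_isotherm(c, kf, n):
    """Freundlich sorption: s = Kf * c**(1/n); no saturation limit."""
    return kf * np.power(c, 1.0 / n)

def decay_chain_step(c, rates, dt):
    """One explicit Euler step of sequential first-order decay
    A -> B -> C -> D:  dc_i/dt = k_{i-1} c_{i-1} - k_i c_i,
    where rates[i] is the decay constant of member i (0 if stable)."""
    c = np.asarray(c, dtype=float)
    rates = np.asarray(rates, dtype=float)
    dc = -rates * c                  # loss from each member
    dc[1:] += rates[:-1] * c[:-1]    # gain from the parent member
    return c + dt * dc
```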
HPC Simulation of UO2 and Tracer
• Isosurfaces of UO2 (yellow and red) and non-reactive tracer after 9.86 years
• Simulations used 256 cores on a Cray XT4 at NERSC
Structured Grid Integration Approach
• Based on a total velocity splitting approach (sketched below)
  • Solve the pressure equation for a reference pressure
  • Define the total velocity from Darcy's law and express phase velocities in terms of the total velocity
  • Solve conservation equations for fluid components (and energy for nonisothermal systems)
• Pressure equation
  • Elliptic for incompressible systems
  • Parabolic for compressible systems
• Component conservation equations
  • Typically advection-dominated nonlinear parabolic equations
  • Second-order Godunov treatment of the nonlinear hyperbolic component
• Overall treatment is second-order accurate in space and time
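Schematically, in standard notation (not quoted from ASCEM documents), the splitting reads:

```latex
\mathbf{u}_\alpha = -\lambda_\alpha\,\mathbf{K}\,\bigl(\nabla p_\alpha - \rho_\alpha\,\mathbf{g}\bigr),
\qquad \lambda_\alpha = \frac{k_{r\alpha}}{\mu_\alpha},
\qquad
\mathbf{u}_T = \sum_\alpha \mathbf{u}_\alpha,
\qquad
\nabla\cdot\mathbf{u}_T = q .
```

For incompressible systems the pressure equation ∇·u_T = q is elliptic in the reference pressure; the phase velocities are then re-expressed in terms of u_T through fractional-flow functions, and the component conservation laws are advanced with u_T as the advective field.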
Adaptive Mesh Refinement (AMR) Approach
AMR objective: exploit varying resolution requirements in space and time
• Block-structured hierarchical grids
  • Amortize irregular work
• Each grid patch (2D or 3D)
  • Logically structured, rectangular
  • Refined in space and time by evenly dividing coarse grid cells
  • Dynamically created/destroyed
• Subcycling (sketched below):
  • Advance level l, then
  • Advance level l + 1
    • Level l supplies boundary data
  • Synchronize levels l and l + 1
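The subcycling can be sketched as a short recursion; the hierarchy object and its methods are placeholders for illustration, not the BoxLib API:

```python
# Sketch of recursive AMR subcycling as described above. The hierarchy
# object, its patches, and advance() are illustrative placeholders.

def advance_level(hierarchy, level, t, dt):
    """Advance one AMR level by dt, then subcycle any finer level."""
    for patch in hierarchy.patches(level):
        patch.advance(t, dt)                    # coarse step also supplies
                                                # boundary data to level+1
    if level + 1 < hierarchy.num_levels():
        r = hierarchy.refinement_ratio(level)   # refinement in space and time
        dt_fine = dt / r
        for k in range(r):                      # r fine steps per coarse step
            advance_level(hierarchy, level + 1, t + k * dt_fine, dt_fine)
        hierarchy.synchronize(level)            # sync levels l and l+1
```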
Parallelization
• Hybrid parallelization strategy (sketched below)
  • Domain decomposition approach
• Coarse grained: distribute patches within the AMR hierarchy to nodes
• Fine grained: thread work across cores in a node using OpenMP
• Demonstrated scaling to 50K cores for single-phase problems with complex geochemistry
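A hedged sketch of the two levels of parallelism. Amanzi's fine-grained threading uses OpenMP in compiled code; Python threads here merely stand in for the pattern. Requires mpi4py, and the patch objects are placeholders:

```python
# Coarse grain: MPI ranks own AMR patches (domain decomposition).
# Fine grain: threads work within each rank's patches.

from concurrent.futures import ThreadPoolExecutor
from mpi4py import MPI

def distribute_patches(patches, comm):
    """Round-robin assignment of patches to MPI ranks."""
    rank, size = comm.Get_rank(), comm.Get_size()
    return [p for i, p in enumerate(patches) if i % size == rank]

def advance_local(patches, dt, nthreads=4):
    """Advance this rank's patches concurrently with a thread pool
    (standing in for OpenMP threading within a node)."""
    with ThreadPoolExecutor(max_workers=nthreads) as pool:
        list(pool.map(lambda p: p.advance(dt), patches))

# Usage: mine = distribute_patches(all_patches, MPI.COMM_WORLD)
#        advance_local(mine, dt)
```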
HPC Waste Tank Demonstration
Goal: demonstrate the ability to simulate flow and reactive transport through fine-scale man-made features such as waste tanks and engineered barriers.

Approach: a representative waste tank model
• Domain scale: 20 m x 20 m
• Fine-scale features: 1-2 cm
• High contrast in parameters, using representative values
• Consider time-dependent multiphase infiltration of water
• Use the structured-grid adaptive mesh refinement capability to simulate this infiltration accurately and efficiently
  • Capture the influence of fast flow paths
  • Cell size ranges from 0.5 cm to 8 cm

[Figure: representative waste tank geometry]
HPC Waste Tank Demonstration (results)
• Water builds up on top of the tank, infiltrates through the fast flow channel (e.g., a crack) into the tank, and begins pooling at the bottom corner
• Only 6% of the AMR grid is at the finest level (0.5 cm), using only 1 million cells instead of the 16 million a uniform grid would require

[Figure: water saturation after 25 days (left) and the AMR grid used in the simulation (right); the enlargement of the tank's top corner shows the ability of AMR to resolve fine-scale features]
ASCEM 2010 to 2015 Program
• 2010 Prototype: demonstration of individual ASCEM modules
  • Impact: engage end users in development of a prototype integrated, open-source PA capability
• 2011-2012 ASCEM Version 1: integration of ASCEM modules
  • Impact: first prototype of an integrated, open-source simulation capability for EM demonstrated
• 2013 ASCEM Version 2: applied phase and end-user engagement
  • Impact: Version 2.0 of an integrated, open-source simulation capability released to the science and EM community for application
• 2014 ASCEM Version 3: applied phase and initiation of regulatory quality assurance V&V testing
  • Impact: Version 3.0 of the integrated, open-source simulation capability demonstrated
• 2015 ASCEM Version 4: regulatory code release and training
  • Impact: fully integrated, open-source simulation capability released and maintained
Chemistry Time Scales to 50K Processors
• Geochemical reactions
  • Kinetically controlled mineral precipitation and dissolution
  • Aqueous complexation
  • Surface complexation (sorption)
• 17 component species: Na+, Ca2+, Fe2+, K+, Al3+, H+, N2, NO3-, HCO3-, Cl-, SO42-, PO43-, F-, SiO2, UO22+, O2, Tracer
• 121 secondary species
• Chemistry run times reduced from 40% to 20% of total time
HPC Phase I Demonstration Goals
Demonstrate a prototype of the HPC Simulator running a parallel, three-dimensional simulation with flow, transport, and reactions on a simplified model of a real site (the SRS F-Area seepage basin site):
• Three-dimensional, unstructured hexahedral mesh fit to the stratigraphy
• Parallel simulation on at least 100 cores
• A steady-state Richards model in the vadose zone
• Transport of a non-reactive species
• Reactive transport of species such as uranium
• Geochemical reactions: aqueous speciation, sorption, mineral precipitation, and dissolution

Scenario: start the simulation at the time dumping of waste in the F-Area basins began, and run for a period of at least two years.
Savannah River Site F-Area Background
• Disposal of low-level radioactive, acidic waste solutions (1955-1989) created a groundwater plume (pH 3-3.5; NO3-, U, 90Sr, 129I, 99Tc, tritium)
• Ongoing remediation includes capping (1989), active pump-and-treat (1997-2003), and pH manipulation since 2004
• Natural attenuation is desired as the long-term remediation strategy
High Performance Computing (cont'd)
Scoping studies with PFLOTRAN were developed in collaboration with the F-Basin Working Group and used to guide the selection of the domain and the development of the conceptual model. The selected domain:
• Has no-flow conditions on the sides
• Includes the region where flow enters above the basins
• Extends to Fourmile Branch, where flow exits
• Extends vertically to the Gordon Confining Zone
Unstructured Mesh Generated for HPC
• Overlays the SRS General Separations Area
• Based on 562,000 hexahedral cells with 25-ft horizontal resolution
• Simple representation of the hydrostratigraphic layers
• Partitioned the unstructured mesh for a 64-processor parallel simulation

[Figure: mesh with the largest F-basin marked]
HPC Advanced Multiphase Flow Simulations
• Block-structured mesh with adaptive refinement
• Leveraged BoxLib, developed by DOE/ASCR
• Geochemistry and transport weakly coupled through operator splitting
• Simulation used 2304 cores on Hopper II, a Cray XE6 at NERSC
• The high discharge rate to the trench led to flow that dispersed the tracer in all directions

[Figure: tracer snapshots at 39, 255, and 441 days]
HPC 17-Component Reactive Transport Simulation
• Geochemistry and transport weakly coupled (operator splitting)
• Reactions included aqueous speciation, mineral precipitation and dissolution, and surface complexation (sorption)
• Simulation used 2304 cores on Hopper II, a Cray XE6 at NERSC

[Figure: UO22+ and tracer at 24 and 105 days]
FY 2011 Work Scope
• Tool development and integration of components: User Release 1.0 and Phase II demonstration
• Working groups for the SRS F-Area, the Hanford Deep Vadose Zone, Waste Tank Performance Assessment, and DOE EM small sites
• Initiate ASCEM user and training facilities
• Continue communication and integration with other DOE simulation efforts

[Figure: AGNI Toolset Driver] ascemdoe.org
More Information about ASCEM
• ASCEM Site Applications Thrust, Site Selection Task "Select Phase I Demonstration" Milestone 2010, ASCEM-SITE-091310-01, 2010
  • Summary of the salient features of candidate sites and the selection process
• Mathematical Formulation Requirements and Specifications for the Process Models, ASCEM-HPC-101510-01, 2010
  • Contains the general mathematical description of the process models envisioned for the final ASCEM product
• System Requirements for ASCEM Platform and Integrated Toolsets, ASCEM-PIT-102710-03, 2010
  • Describes use cases and requirements for the Platform toolsets
• ASCEM Phase 1 Demonstration, ASCEM-SITE-102010-01, 2010
  • Phase 1 demonstration report describing the accomplishments from the first year

http://ascemdoe.org/
High Performance Computing Goals
• Dual structured/unstructured approach
• Design and implement parallel from the beginning for execution on emerging computer architectures

Some first-year results:
• Implemented unstructured hexahedral grids fit to the stratigraphy at the SRS F-Area
• Developed advanced discretizations capable of handling tensor material properties (e.g., permeability) on general grids
• Developed a prototype of the Reaction Toolset that includes aqueous speciation, mineral precipitation, dissolution, and sorption
• Performed parallel three-dimensional flow and transport simulations
ASCEM Leveraging
In addition to primary ASCEM code development, we significantly leverage investments by several DOE Office of Science and National Nuclear Security Administration programs. Early examples include:
• VisIt – visualization and graphical analysis tool developed by ASC and the ASCR SciDAC program
• Velo – data management framework
• PSUADE – uncertainty analysis tool developed by ASC
• Trilinos – services for parallel programming and integrated software packages developed by ASC and the ASCR SciDAC program
• PETSc – Portable, Extensible Toolkit for Scientific Computation, developed by the ASCR SciDAC program
• BoxLib – parallel AMR framework developed by ASCR base math and SciDAC
• MFD – mimetic finite difference discretization methods developed by the ASCR Applied Mathematics program
• Geochemistry Toolset – uses algorithms developed by computational scientists funded through DOE SC
Quality Assurance Accomplishments

QA (SQA) overview: now
• Over-arching Quality Assurance Plan based on:
  • NQA-1-2004 (1a-2005, 1b-2007), per EM QAPD, Subpart 4.2 (NQA in an R&D environment)
  • DOE O 414.1C
• Developed and released:
  • Basic Phase Software Quality Assurance Plan
  • Basic Phase Configuration Management Plan
  • Basic Phase Verification/Validation Plan
• Training on these procedures is in progress

QA (SQA) overview: remainder of FY11 - first half of FY12
• Refine implementation of the Basic Phase effort
• Increase understanding and implementation of the QA/SQA program
  • Incorporation of elements in Thrust-specific processes
  • Utilization of web processes
  • Review of implementation
• Develop and deploy Applied Phase processes
Why Do the Sites Need/Want ASCEM?
ASCEM is actively soliciting input from a variety of end-user perspectives:
• End-user interactions are helping to guide ASCEM development and encourage future site implementation
• Provide a vehicle for the EM user community to have input to ASCEM project requirements, design, and development
• Facilitate broad user-community participation in the development of Site Applications problems, test beds, and testing tools
• Feedback is reflected in project planning, requirements documents, and demonstrations

End-user perspectives:
• Practitioners: model setup and execution, decision support
• Regulatory: public interface, reviews, decision making
• Programmatic: project management, oversight, decision making