Optimization and Model Insight: Research Directions at Sandia National Laboratories. Scott A. Mitchell, INFORMS Chicago Chapter, CUSTOM: Managing Risk in an Uncertain World. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
Intro • Take home messages • Why we’re doing what we’re doing, not much of the how • Pose the questions, not all the answers • Sandia environment • Breadth of SNL mission • Unique applications • Physics variety • Extreme computations and simulations • Risks to manage, uncertainties to assess • Our response • Current R&D activities • New research directions • ( My “not-to-do” list • Policy and political issues )
Acknowledgements • Optimization and Uncertainty Estimation dept. • http://www.cs.sandia.gov/departments/9211/index.htm • Thanks to Urmila for invitation • Thanks to department staff… • Tim Trucano • Tony Giunta • Mike Eldred • Bart van Bloemen Waanders • Roscoe Bartlett • … and others throughout Sandia, from whose viewgraphs I have borrowed liberally • Marty Pilch • John Aidun • Garth Reese & Kendall Pierson
Sandia Mission Breadth • http://www.sandia.gov/about/vision/index.html • My division: • Nuclear Weapons: ensuring the stockpile is safe, secure, reliable, and can support the United States’ deterrence policy. • Optimization (Opt) and Uncertainty Quantification (UQ) are key to 2 out of 3!
Types of SNL problems • Every year, the Labs’ director signs off on the stockpile • Says testing not needed (so far) • Standard: < 1 in 10^6 chance that an accident results in “any” nuclear reaction • Conservatism in design • Almost no performance data • Very few controlled tests. Systems rarely actually used or in accidents (that’s good, but makes data scarce!) • Tests designed to demonstrate performance, not to probe the limits of failure. Rarely designed for parameter studies. (Aim for mid-points of parameter ranges, not extremes.) • Contrast with the automotive industry • Imagine building the Ford Expedition fleet and meeting individual and aggregate specs, if the last fleet actually driven on roads to failure were 10 Model T’s.
Pulsed Power & Inertial Confinement Fusion (ICF) • ICF is a goal at Sandia National Labs • Pulsed-power technique using the Z-machine • Wire arrays explode, creating a plasma sheath, which implodes and stagnates • X-rays hit the capsule, generating fusion • Capsule performance is very sensitive to variations!
Sandia: The Extreme Engineering Lab. To fulfill our National Security mission, we develop systems and components designed to perform extreme applications, under adverse conditions. The duty cycle can include damage, crush, and failure; melting; decomposition – or just plain old age. (Photo: extreme sports athlete Dave Mirra)
Example Applications I • Extremely Tough: earth-penetrator bombs; radiation-hard microelectronics; Explosive Destruction System • Extremely Sensitive: sensors; explosives sniffer • Extremely Powerful: ferroelectric ceramic power supply, explosively driven (HE charge drives an electrical load); pulsed-power Z-machine
Example Applications II • Extreme Insult Resistant: aluminum honeycomb energy absorber; Engineered Stress Profile Glass; architectural surety • Extreme Effectiveness: environmental remediation; decontamination foam (Figures: aluminum honeycomb; competitive Cs and Sr sorption on clay mineral)
Example Applications III • Extremely Small & Efficient: Microsystems & Nanotechnology; Solid State Lighting; Next Generation NG; the world’s smallest linear accelerator • Extremely Reliable: Electrical Connectors; Bonds; Solder Joints; Adhesive Joints; Brazes and Welds
Spin the wheel: our example du jour. Motivation: target simulation codes (electrical, fire, structural dynamics, solid mechanics, compressible fluid flow, incompressible fluid flow, heat transfer, geophysics, shock physics). Design tools: optimization, sensitivity, uncertainty analysis.
Large Structural Dynamics Sandia Calculations • Example large Sandia calculations • Gordon Bell Prize won by the ‘Salinas’ structural dynamics code team in the “special” category at SuperComputing 2002 • Sandia has had many wins in the past • Only non-Earth-Simulator winner in 2002 • Sandia Director of Engineering Sciences Tom Bickel: first time a “production code” won • C++ • Math and solver libraries • Systems of equations like: production eigensolver based on ARPACK, Ax = λBx • Linear statics and dynamics, Ax = b • Nonlinear statics and dynamics, A1x1 = b1, A2x2 = b2, …, Anxn = bn
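The generalized eigenproblem Ax = λBx above is what an ARPACK-based production eigensolver computes. A minimal sketch using SciPy’s ARPACK wrapper; the tiny 1-D stiffness/mass pair below is an illustrative stand-in, not Salinas:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

n = 50
# Toy stiffness (K) and lumped unit mass (M) matrices from a 1-D finite
# element chain -- stand-ins for the large structural matrices on the slide.
K = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
M = diags([1.0], [0], shape=(n, n), format="csc")

# ARPACK in shift-invert mode solves the generalized problem K x = lambda M x
# for the few eigenvalues nearest sigma -- here the low-frequency modes.
vals, vecs = eigsh(K, k=4, M=M, sigma=0.0)
```

For this toy matrix the answer is known in closed form (4 sin²(jπ/(2(n+1)))), which is how a small verification test of the solver setup can be written.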
Northrop Grumman Newport News Full Ship Model Description • Large-scale detailed model • 2 million equations • Shell & beam elements • Interior structure • 60 processors (Baby Q) • 20 Modes/1.5 hours
Increasing Levels of Model Complexity. Explosion in computer hardware and software technologies allows higher levels of structural dynamics modeling sophistication. • 15 years ago: Shellshock, 2.5D, 200 equations, pre-Cray • Recent past: NASTRAN electronics package, 30,000 equations, Cray vector supercomputer • Electronics package model 2 yrs ago: 400,000 equations, parallel supercomputer
Increasing Levels of Model Complexity (cont.) • Electronics package model 1 yr ago: 800,000 equations (Cubit: Sandia meshing tool) • Gordon Bell SC2002: 0.5M equations, 18 minutes on 128 processors of ASCI Red • Now: 10+ million equations (modeling at the circuit-board level)
Sandia Capabilities • Sandia considers itself (needing to be) an expert in • Advanced Manufacturing • Biosciences • Chemical and Earth Sciences • Computer Information Sciences • Electronics • Engineering • Homeland Security • Materials and Process • Microsystems • Nanotechnology • Pulsed Power • Modeling and Simulation of the above! • My dept’s role: tools for design optimization and reliability assessment of the above, in uncertain environments.
Computational Focus • Limitations of testing (I’m talking about non-nuclear engineering testing here… even so, for some things, we wouldn’t test even if we could…) • Too many possible designs / scenarios • 20-d parameter space: not going to do 2^20 tests, but could explore via sampling or SAND (later) • Could you instrument an experiment adequately? • Worst-case scenarios only identifiable by computational tools • A paradigm shift from “engineering intuition”
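The sampling alternative to 2^20 corner tests can be sketched with a space-filling design. A minimal example using SciPy’s Latin hypercube sampler; the run budget and parameter bounds are made-up illustrative values:

```python
import numpy as np
from scipy.stats import qmc

d = 20          # design parameters (the slide's 20-d space)
n_runs = 100    # simulation budget: far fewer than the 2**20 corner tests

sampler = qmc.LatinHypercube(d=d, seed=0)
unit = sampler.random(n=n_runs)          # samples in [0, 1)^d

# Scale to hypothetical parameter bounds (illustrative values only).
lower = np.zeros(d)
upper = np.ones(d) * 10.0
design = qmc.scale(unit, lower, upper)

# Each row is one simulation run. LHS stratifies every 1-D margin, so each
# parameter's full range is covered even with a small run budget.
```

The stratification property is what makes a 100-run study informative about each of the 20 parameters, which a 100-point random sample does not guarantee.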
Limits of Computation • In the past: • Calibrate computational models to observed experiments • Codes used for interpolating between tested points • “Verification” criterion is an approximate curve fit, up to the “view-graph norm,” after calibration • Underlying physics unimportant; any function that curve-fits is ok • In the future, we want: • Predictive capability • Codes used for extrapolating to unknown designs / environments • “Verification” criterion is error bars in the computed answer matching error bars in the experimental answer • Confidence in the underlying physics; “validation” that important phenomena are modeled
Key Challenges of this Approach • How much credibility is sufficient? • Variabilities and uncertainties must be acknowledged, and their impact in the decision context quantified • Historical context • Reactor safety, WIPP, Yucca Mountain • Sandia was leader in developing, applying, and defending methodologies • Decade of peer review at the highest levels
Sandia vs. Textbook Problems • A few details…
Functional Form • Textbook functions: • Functional form known • Differentiable, with analytic derivatives or reliable finite differences • Fast to calculate: ~1/4 second on a single-processor workstation • Sandia functions: • Function is an FEM solution over a large grid • May be a transient simulation involving multi-physics • Simulation may crash at some parameter values: worse than discontinuous, a non-existent value! • Noisy response may make finite differences meaningless • Slow to calculate: several days on 1000s of processors on one of the world’s top-10 MPP supercomputers
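The claim that noise can make finite differences meaningless is easy to demonstrate. A sketch assuming the “simulation” is a smooth function plus deterministic solver-level noise (incomplete convergence, mesh effects); the functions and noise model are hypothetical:

```python
def simulation(x, noise=1e-6):
    # Stand-in for an expensive simulation: a smooth response plus
    # high-frequency solver "noise" at roughly the convergence tolerance.
    import math
    return x**2 + noise * math.sin(1e6 * x)

def forward_diff(f, x, h):
    # One-sided finite-difference gradient estimate.
    return (f(x + h) - f(x)) / h

x0, true_grad = 1.0, 2.0  # d/dx of x^2 at x = 1

# With a step near the noise scale, the quotient is dominated by noise
# amplified by 1/h; with a much larger step, truncation error dominates
# but the estimate is still usable.
g_small = forward_diff(simulation, x0, 1e-7)
g_large = forward_diff(simulation, x0, 1e-2)

err_small = abs(g_small - true_grad)
err_large = abs(g_large - true_grad)
```

Counterintuitively, the *larger* step gives the far better gradient here, which is why gradient-free and surrogate-based methods matter for noisy simulations.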
Sandia Functions • Functional form unknown • Except: if the function is an FEM solution, then sometimes the functional form may be implicitly given by the PDE • Foreshadows sensitivities and SAND methods • Local analytic derivatives obtainable by hard work • Example: the incompressible fluid-flow code Premo, built on the discretized Euler equations. (Note: a huge system of these equations, not just a single scalar.)
Sandia vs. Textbook Functions • Typical textbook optimization problem: inexpensive to evaluate; gradients exist and are accurate • CTH shock physics application: very expensive to evaluate; gradients do not always exist and may be inaccurate (Figure: response surfaces f(x) over parameters x1, x2)
Sandia vs. Textbook Functions • ALEGRA ICF capsule implosion study: multiple minima; variability near the optimum is important; a range of sensitivity to variables • Typical textbook optimization problem: single minimum; variability near the optimum not important
DtoA - Function Evaluation • It’s worse than that • The prior slides assumed that it was possible to run a simulation at an arbitrary design point
Expanded Design Optimization View • Many steps currently require human intervention or interpretation, either initially (bother) or in the loop (oh dear!) • Design-to-Analysis (D2A): geometry creation & conditioning; mesh generation; model preparation & management; mesh decomposition; parametric model changes • Embedded (simulation): equation assembly; solution; error estimation; mesh refinement; load rebalancing • Simulation-to-Design (S2D): results filtering & decomposition; parallel output; visualization; parametric optimization; geometric optimization; design evaluation • SET = D2A + S2D + some embedded libraries
D2A (DtoA) • Problem set-up is a key issue • Set it up once: usually tedious • Re-set it up / close the optimization loop: can be very difficult, beyond current technology • Varying material properties is fairly easy: just a continuous variable in an input file • Simulation interfacing is ok: a development and design challenge, not a conceptual one
Problem set-up is a key issue • Varying geometry is difficult • Occasional shape optimization, with well-controlled changes, is ok • Unstructured hexahedral (brick) mesh dependence is hard • Brittle hex meshing structure, 2.5D algorithms • Several people-months for the initial “meshing strategy” script • Many constraints flow all the way forward in the process • Geometry movement may cause a discrete event in the decomposed model, even if not in the initial model! (Figure labels: “impact,” “foam,” “steel case,” symmetry-plane position; material, boundary conditions, initial conditions)
Problem set-up is a key issue: varying geometry fidelity is difficult • Designer: parametric design changes retain “idealized model” characteristics • Analyst: parameterize DSM -> ASM so that the ASM can be updated automatically after parametric DSM changes • Design Solid Model (DSM) vs. Analysis Solid Model (ASM) • Other ASM data: detail suppression; mesh scheme; mesh size; analysis BCs; decompositions • (Figure: “block_screw.asy” assembly with “block,” “screw,” “he,” “primer” parts; the designer adds detail and makes parametric changes, and the analyst responds with an updated idealized model)
My Dept’s Response Our response to these challenges Programmatic / Organizational Technical
Scope – dept’s local focus • Traditionally • Opt and UQ over normal-sized weapon components. Focus on: • Complex systems, full-system response; multiphysics: electrical, thermal, radiation, fluid flow and shock, magneto-hydrodynamics • Generation of and reaction to these physics • Reliability in “normal environments” • Vibration and shock of launch, re-entry • Electronic and mechanical component performance during impact • Safety in “abnormal environments” • Weapon in a fire, pool effects • Expanding to nano, MEMS, bio (not combinatoric DNA problems, but modeling chemistry at the sub-cellular level) • Also ties to Emerging Threats • Discrete and agent-based logistics and battlefield system simulations
Dept. Activity Map (applications push/pull technology, and vice versa) • Optimization: Sandia Eng Sci; LLNL A-Div; homeland-security inversion & control; Premo, Xyce; SNL designers • UQ (Uncertainty Quantification) & OUU (Opt. Under Uncertainty): Sandia Eng Sci; Sandia Pulsed Power/RES/ICF • Validation applications: Sandia Eng Sci; ALEGRA; MPSalsa reaction-diffusion eqs.; multi-scale modeling; nano systems • Technology / methodology: DAKOTA/UQ; validation methods & assessments; MOOCHO; TSFCore; core DAKOTA • Optimization & algorithms: multi-fidelity; surrogates; trust regions; sensitivity analysis; SAND; multi-disciplinary; parallel strategies • Uncertainty & algorithms: sampling; inference; characterization; calibration; probability, stats, analytic reliability • Validation science: credibility quantification; optimal decisions (OR); QMU; “predictability” research
Technical Themes Technical themes to meet these challenges • Validation & Verification • To trust the simulations • Sensitivity-based (intrusive) methods • Expensive simulations and/or large design spaces • Seven levels of intrusion • Large scale problem motivation • Inversion and homeland security projects • DAKOTA • Noisy and/or expensive simulations • Levels of parallelization • Strategies and methods • SBO, OUU -> SBOUU
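The surrogate and trust-region themes above can be sketched in miniature: fit a cheap model to a few expensive samples, optimize the surrogate inside a trust region, and shrink the region when the true function disagrees. This is a simplified flavor of SBO under made-up functions, not DAKOTA’s algorithm:

```python
import numpy as np

def expensive_sim(x):
    # Stand-in for a costly, mildly multimodal simulation (hypothetical).
    return (x - 1.7) ** 2 + 0.3 * np.cos(5 * x)

def fit_quadratic(xs, fs):
    # Least-squares quadratic surrogate f(x) ~ a x^2 + b x + c.
    return np.linalg.lstsq(np.vander(xs, 3), fs, rcond=None)[0]

center, radius = 0.0, 1.0
for _ in range(20):
    xs = np.array([center - radius, center, center + radius])
    fs = expensive_sim(xs)
    a, b, c = fit_quadratic(xs, fs)
    # Minimize the surrogate, restricted to the trust region.
    cand = -b / (2 * a) if a > 0 else center + np.sign(-b) * radius
    cand = float(np.clip(cand, center - radius, center + radius))
    # Accept the step if the *true* function improved; else shrink the region.
    if expensive_sim(cand) < expensive_sim(center):
        center = cand
    else:
        radius *= 0.5
```

The design choice the slide hints at: only a handful of expensive evaluations per iteration, with the trust region guarding against trusting the surrogate too far.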
Useful quotes to keep in mind. • Hamming – “The purpose of computing is insight, not numbers.” • ASCI – the purpose of computing is to provide “high-performance, full-system, high-fidelity-physics predictive codes to support weapon assessments, renewal process analyses, accident analyses, and certification.” (DOE/DP-99-000010592) • Philip Holmes – “…a huge simulation of the ‘exact’ equations…may be no more enlightening than the experiments that led to those equations…Solving the equations leads to a deeper understanding of the model itself. Solving is not the same as simulating.” (SIAM News, June 2002)
ASCI Program • ASCI Applications is developing a suite of simulation codes • Milestones like “simulate phenomenon X” • Major supporter of DAKOTA / our dept’s activities • ASCI Validation & Verification program • To verify those codes, and validate the model application, so we have confidence in our answer • Uncertainty Quantification seen as a key technology for validating codes, by overlapping computational and experimental error bars • DAKOTA is the tri-lab delivery vehicle for UQ technology
Validation and Verification • Validation definition • Is the computational model an accurate representation of reality (the reality I care about)? • Depends on phenomena of interest • Depends on decision you need to make • Verification definition • Given the computational model, does the code produce the right answer? • Depends on SQE, accurate solvers, allowing code to run to convergence, etc.
Consider the following “validation” exercise: • Hint: Validation is a “physics problem.” (Figure: “This is physics.”) • Hint: Verification is a “math problem.” (Figure: “This is math.”)
There is at least one essential problem with the previous comparison: there are no numerical error bars. • So, fundamentally, what does this comparison mean? • Please note that the calculation is not converged. • Stringently, “verification” for numerical PDE codes basically means: demonstrate convergence to the correct answer • …if not for the “code,” at least for the particular calculation(s) • Since it is unlikely that we will establish convergence, and since we don’t know what the correct answer is, this is quite a problem.
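What “demonstrate convergence” looks like in practice is a standard calculation-verification exercise: run three systematically refined grids, estimate the observed order of accuracy, and Richardson-extrapolate toward the grid-converged answer. A sketch with a manufactured “solver” whose error behavior is known, standing in for a real PDE code:

```python
import numpy as np

def solve_on_grid(h):
    # Stand-in for a PDE solve at mesh size h: the exact answer plus a
    # manufactured O(h^2) discretization error (not a real code).
    exact = 1.0
    return exact + 0.4 * h**2 + 0.01 * h**3

h = np.array([0.1, 0.05, 0.025])          # three systematically refined grids
f = np.array([solve_on_grid(v) for v in h])

r = 2.0                                    # grid refinement ratio
# Observed order of accuracy from three solutions (Richardson):
p = np.log((f[0] - f[1]) / (f[1] - f[2])) / np.log(r)

# Richardson-extrapolated estimate of the grid-converged answer.
f_exact_est = f[2] + (f[2] - f[1]) / (r**p - 1)

# A numerical error band for the finest grid (GCI-style, safety factor 1.25).
gci = 1.25 * abs(f[2] - f[1]) / (r**p - 1)
```

The `gci` band is exactly the kind of numerical error bar the slide says is missing from the comparison; without it the agreement with experiment is uninterpretable.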
“Uncertainty In Verification” • Uncertainty in verification arises from: • (Recall complexity of simulations = functions we hope to optimize over) • Software implementation errors – BUGS • Code crashes are the least of our problems. • Mutually reinforcing errors are also “easily” detectable. • Mutually canceling errors are of greater concern. • Inadequate algorithms • No amount of resolution will solve the problem. • Inadequate resolution • Resolution “solves” the problem but is probably unavailable. • The issue of “verifying” ASCI Level 1 milestones is becoming prominent.
Is Probabilistic Software Reliability (PSR) useful for computational science software? • We are test-fixated in building software, properly so: “Based on the software developer and user surveys, the national annual costs of an inadequate infrastructure for software testing is estimated to range from $22.2 to $59.5 billion.” (“The Economic Impacts of Inadequate Infrastructure for Software Testing,” NIST report, 2002.) • If we can’t test software perfectly, then testing alone does not solve the verification problem.
A view of software “reliability” is a decreasing number of “failures” and an increasing number of “users,” and they are correlated. A notional “reliability” diagram for a PDE code thus looks something like the following: (Figure: failure rate vs. number of users, falling through development and test, 1st use / validation, and application decisions, with a bump at each new Capability I, II, etc.; but WHAT IS A FAILURE?)
The bottom line in the previous example can be generalized: numerical accuracy is an important uncertainty. Qualitative: “I’m uncertain what the accuracy of this calculation is.” • I don’t know what the error is with certainty. • I still need to apply the calculation. • The alternative is analysis paralysis (in particular, NUMERICAL analysis paralysis). • “Global warming” is an example; keep it in mind. Quantitative leap: “I need to apply probabilistic language to describe my understanding of the accuracy of this calculation”
Are Probabilistic Error Models (PEM) useful for computational science software? • Suppose that we can neither “verify codes” nor “verify calculations.” • “When quantifying uncertainty, one cannot make errors small and then neglect them, as is the goal of classical numerical analysis; rather we must of necessity study and model these errors.” • “…most simulations of key problems will continue to be under resolved, and consequently useful models of solution errors must be applicable in such circumstances.” • “…an uncertain input parameter will lead not only to an uncertain solution but to an uncertain solution error as well.” • These quotes reflect a new view of “numerical error” expressed in B. DeVolder, J. Glimm, et al. (2001), “Uncertainty Quantification for Multiscale Simulations,” Los Alamos National Laboratory, LAUR-01-4022. • “All models are wrong, but some models are useful” – statistician George E. P. Box.
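The DeVolder–Glimm point that an uncertain input yields an uncertain solution *and* an uncertain solution error can be shown directly. A sketch with a deliberately under-resolved toy model (left-endpoint quadrature standing in for a coarse-grid PDE solve); the input distribution is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(k, n):
    # Stand-in "discretized model": left-endpoint quadrature of exp(-k x)
    # on [0, 1] with n cells. Its discretization error depends on k.
    x = np.linspace(0.0, 1.0, n, endpoint=False)
    return np.sum(np.exp(-k * x)) / n

def exact(k):
    # Closed-form answer, available here only because this is a toy.
    return (1.0 - np.exp(-k)) / k

# Uncertain input parameter (hypothetical uniform distribution).
k_samples = rng.uniform(0.5, 2.0, size=2000)

coarse = np.array([model(k, 20) for k in k_samples])
errors = np.array([model(k, 20) - exact(k) for k in k_samples])

# Not just the solution but the numerical error is a random variable:
sol_mean, sol_std = coarse.mean(), coarse.std()
err_mean, err_std = errors.mean(), errors.std()
```

Here `err_std` is nonzero: the discretization error itself has a distribution induced by the input uncertainty, which is exactly what a probabilistic error model must capture.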
Sensitivities Project • Sensitivities (derivatives, Jacobians, Hessians, etc.) can dramatically speed up optimization over large PDE-based codes • NAND: Nested Analysis and Design (black box) • SAND approach names: PDE-Constrained Optimization, or Simultaneous Analysis and Design (SAND), or the all-at-once approach
Large Scale PDE Constrained Optimization • Black box (NAND): the optimizer sends an input to the PDE simulation and receives an output back, re-solving the PDE every time • SAND: the MOOCHO optimizer works with the PDE simulation’s internals directly • Idea: add the PDE equations as constraints to the optimization problem
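The “add the PDE equations as constraints” idea can be sketched at toy scale: instead of solving the PDE for every trial design (NAND), optimize over the state and the design together with the discretized PDE residual as equality constraints. This uses a generic SQP solver on a hypothetical 1-D problem, not MOOCHO:

```python
import numpy as np
from scipy.optimize import minimize

# Toy SAND sketch: a 1-D "PDE" -u'' = s*sin(pi x) with Dirichlet BCs,
# discretized on n interior points; the source strength s is the design.
n = 9
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
u_target = np.sin(np.pi * x) / np.pi**2   # hypothetical target state

# Standard second-difference stiffness matrix.
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

def objective(z):
    u = z[:n]
    return 0.5 * np.sum((u - u_target) ** 2)

def pde_residual(z):
    u, s = z[:n], z[n]
    # Discretized PDE, imposed as equality constraints (all-at-once).
    return A @ u - s * np.sin(np.pi * x)

z0 = np.zeros(n + 1)                      # state and design start infeasible
res = minimize(objective, z0, method="SLSQP",
               constraints={"type": "eq", "fun": pde_residual})
u_opt, s_opt = res.x[:n], res.x[n]
```

The optimizer never calls a PDE solver; feasibility (the PDE holding) emerges only at convergence, which is the characteristic trade of the all-at-once approach.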