This presentation provides an overview of the SciDAC Accelerator Modeling Project, its applications in accelerator science, and collaborations with applied math and computer science. It also discusses future plans and the impact of SciDAC codes on existing and future accelerator projects.
SciDAC Accelerator Modeling Project
Kwok Ko and Robert D. Ryne
SciDAC PI meeting
Charleston, South Carolina
March 23, 2004
Outline • Project overview • Applications • Collaborations in Applied Math and Computer Science • Future Plans
DOE’s Facilities for the Future of Science, a 20-year outlook, is a testament to the importance of DOE/SC and of particle accelerators. Of the 28 priorities on the list, nearly half are accelerator facilities.
Accelerator projects on 20yr list • LCLS • RIA • CEBAF upgrade • BTeV • Linear Collider • SNS upgrade • RHIC II • NSLS upgrade • Super Neutrino Beam • ALS upgrade • APS upgrade • eRHIC • IBX
SciDAC Accelerator Modeling Project Goal: Create a comprehensive simulation environment, capable of modeling a broad range of physical effects, to solve the most challenging problems in 21st century accelerator science and technology Sponsored by: DOE/SC Office of High Energy Physics (formerly HENP) in collaboration w/ Office of Advanced Scientific Computing Research
SciDAC codes are having a major impact on existing accelerators and future projects • PEP-II interaction region heating analysis (Omega3P,Tau3P,T3P) • Simulation of beam-beam effects in Tevatron, PEP-II, RHIC, and LHC (BeamBeam3D) • Discovery that self-ionization can lead to meter-long high density plasma sources for plasma accelerators • NLC acc. structure design (Omega3P) & wakefield computation (Omega3P, S3P, Tau3P) • Beam loss studies at FNAL booster (Synergia) • Study of e-cloud instability in LHC (QuickPIC) • NLC peak surface fields and dark current simulations (Tau3P, Track3P) • Gas jet modeling (Chombo/EB) • RIA RFQ cavity modeling (Omega3P)
The SciDAC Accelerator Modeling Project team: a multidisciplinary, multi-institutional team producing comprehensive terascale accelerator design tools • BNL: space charge in rings; wakefield effects; Booster experiments • UC Davis: particle & mesh visualization • FNAL: space charge in rings; software integration; Booster experiments • LBNL (AFRD): beam-beam; space charge in linacs & rings; parallel Poisson solvers • U. Maryland: Lie methods in accelerator physics (M = exp(:f2:) exp(:f3:) exp(:f4:) …, N = A⁻¹MA) • SLAC: large-scale electromagnetic modeling • LANL: high-intensity linacs, computer model evaluation • SNL: mesh generation • Stanford, LBNL (CRD): parallel linear solvers, eigensolvers, PDE solvers, AMR • UCLA, USC, Tech-X, U. Colorado: plasma-based accelerator modeling; parallel PIC frameworks (UPIC)
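For readers unfamiliar with the map notation in the U. Maryland entry, it is the standard Lie-algebraic (Dragt-Finn) factorization of a transfer map together with its normal form under a similarity transformation; written out in standard accelerator-physics notation (a textbook form, not anything specific to one code here):

```latex
% Dragt-Finn factorization of a transfer map M into Lie transformations
% generated by homogeneous polynomials f_n of degree n, and the normal
% form N obtained by a similarity transformation A:
\mathcal{M} = e^{:f_2:}\, e^{:f_3:}\, e^{:f_4:} \cdots, \qquad
\mathcal{N} = \mathcal{A}^{-1}\,\mathcal{M}\,\mathcal{A}
```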
Code Development • Electromagnetics • Omega3P, Tau3P, T3P, S3P, Track3P • Beam Dynamics • BeamBeam3D, IMPACT, MaryLie/IMPACT, Synergia, Langevin3D • Advanced Accelerators • OSIRIS, VORPAL, QuickPIC, UPIC
IMPACT code suite User-Map • RAL • PSI • GSI • KEK • SLAC • LBNL • LANL • TX corp • FNAL • ORNL • MSU • BNL • JLab
Collaborations with Applied Math and Computer Science • SciDAC ISICs (TOPS, APDEC, TSTT), SAPP • Eigensolvers and linear solvers • Poisson solvers • AMR • Meshing & Discretization • Parallel PIC methods • Partitioning • Visualization • Stat methods
Outline • Project overview • Applications • Collaborations in Applied Math and Computer Science • Future Plans
Modeling the PEP-II Interaction Region. Courtesy K. Ko et al., SLAC. (Figure: full-scale Omega3P model from crotch to crotch; left crotch, center beam pipe, right crotch; 2.65 m on each side; e- and e+ beamlines.) Beam heating in the beamline complex near the IR limited PEP-II operation at high currents; Omega3P analysis helped in redesigning the IR for the upgrade.
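Omega3P is a parallel finite-element eigensolver, and the heating analysis relies on computing the electromagnetic modes of the IR geometry. Purely as a toy illustration of the class of generalized eigenvalue problem K x = λ M x involved (1D stand-in stiffness and mass matrices, not Omega3P's data or API), a shift-invert sparse eigensolve looks like this:

```python
# Toy generalized eigenproblem K x = lambda M x, the same class of problem
# Omega3P solves for cavity modes. K and M are 1D finite-element stand-ins
# on the unit interval, not anything produced by Omega3P itself.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200                       # interior grid points
h = 1.0 / (n + 1)
K = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc") / h
M = sp.diags([1.0, 4.0, 1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc") * (h / 6.0)

# Shift-invert about sigma = 0 targets the smallest eigenvalues, i.e. the
# analog of the lowest cavity modes; expect lambda_k ~ (k*pi)^2.
vals, vecs = spla.eigsh(K, k=5, M=M, sigma=0.0, which="LM")
print(np.sort(vals))
```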
Tevatron Modeling • Large computing requirement: each point requires 12 hours on 1024 processors • Recent result: good agreement for pbar lifetime vs. proton intensity • Courtesy Fermilab and LBNL
Beam-Beam Studies of PEP-II • Collaborative study/comparison of beam-beam codes • Predicted luminosity sensitive to # of slices used in simulation
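For orientation, the heart of a weak-strong beam-beam model is the incoherent kick a test particle receives from the opposing (strong) bunch; for a round Gaussian bunch this has a closed form. The sketch below is the generic textbook expression with illustrative parameters, not BeamBeam3D's implementation (which also handles flat beams, longitudinal slicing, and strong-strong coupling):

```python
# Weak-strong beam-beam kick from a round Gaussian strong bunch
# (generic closed form; illustrative parameters, not BeamBeam3D code).
import math

def beam_beam_kick(x, y, N=1.0e11, sigma=50.0e-6, r0=1.535e-18, gamma=1000.0):
    """Angular kick (rad) on a test particle at transverse offset (x, y) [m].
    N: particles in the strong bunch, sigma: its rms size [m],
    r0: classical radius of the test particle [m], gamma: Lorentz factor.
    Sign convention: attractive (opposite-charge beams)."""
    r2 = x * x + y * y
    if r2 == 0.0:
        factor = 1.0 / (2.0 * sigma**2)   # limit of (1 - exp(-r2/2s^2)) / r2
    else:
        factor = (1.0 - math.exp(-r2 / (2.0 * sigma**2))) / r2
    scale = -2.0 * N * r0 / gamma
    return scale * factor * x, scale * factor * y

# Kick experienced one rms beam size off axis:
print(beam_beam_kick(50.0e-6, 0.0))
```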
Modeling a Plasma Wakefield Accelerator with added realism in full 3D models (OSIRIS, VORPAL). Full EM PIC simulation of the drive beam ionizing lithium in a gas cell. Courtesy W. Mori et al., UCLA
Full-scale modeling of a 30-cell structure • Distributed model on a mesh of half a million hexahedral elements • Study of RF damage at high-power X-band operation using Tau3P & Track3P • Courtesy K. Ko et al., SLAC
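The RF-damage and dark-current studies track particles through the fields computed by the electromagnetic solvers. As a generic illustration of the particle-push step such tracking relies on (a standard non-relativistic Boris rotation with made-up field values, not the project's actual integrator):

```python
# Generic (non-relativistic) Boris particle push in given E and B fields.
# Illustrative only: field values are invented and this is not Track3P's integrator.
import numpy as np

def boris_push(x, v, E, B, q, m, dt):
    """Advance position x [m] and velocity v [m/s] by one step dt [s]."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                      # half electric kick
    t = qmdt2 * B                                # magnetic rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)     # rotation, step 1
    v_plus = v_minus + np.cross(v_prime, s)      # rotation, step 2
    v_new = v_plus + qmdt2 * E                   # second half electric kick
    return x + v_new * dt, v_new

# One step for an electron in made-up fields:
q, m = -1.602e-19, 9.109e-31
x = np.zeros(3)
v = np.array([0.0, 0.0, 1.0e7])
E = np.array([1.0e6, 0.0, 0.0])                  # V/m
B = np.array([0.0, 0.1, 0.0])                    # T
print(boris_push(x, v, E, B, q, m, dt=1.0e-12))
```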
QuickPIC calculations have resulted in up to a 500x increase in performance over fully electromagnetic PIC. (Figure: wake produced by an electron beam propagating through a plasma cell.)
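QuickPIC gains this speedup from its quasi-static formulation: the drive beam evolves over distances much longer than the plasma oscillations a fully electromagnetic PIC code must resolve step by step. A back-of-envelope look at that scale separation (density and cell length are purely illustrative, not the parameters of any particular run):

```python
# Scale separation exploited by quasi-static PIC (illustrative numbers only).
import math

eps0, e, m_e, c = 8.854e-12, 1.602e-19, 9.109e-31, 2.998e8
n0 = 1.0e23                 # plasma density [m^-3] (~1e17 cm^-3), illustrative
L_cell = 0.1                # plasma cell length [m], illustrative

omega_p = math.sqrt(n0 * e**2 / (eps0 * m_e))   # plasma frequency [rad/s]
lambda_p = 2.0 * math.pi * c / omega_p          # plasma wavelength [m]

print(f"plasma wavelength ~ {lambda_p * 1e6:.0f} micron")
print(f"plasma wavelengths per cell ~ {L_cell / lambda_p:.0f}")
# The beam envelope changes little over one plasma wavelength; a quasi-static
# code exploits that disparity instead of resolving every plasma period.
```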
Modeling beam loss in the Fermilab Booster using Synergia. Booster simulation and experimental results (P. Spentzouris and J. Amundson, FNAL).
Outline • Project overview • Applications • Collaborations in Applied Math and Computer Science • Future Plans
Collaboration w/ SciDAC ISICs • TOPS: linear algebra libraries, preconditioners, eigensolvers for better convergence & accuracy • APDEC: solvers based on block-structured AMR, and methods for AMR/PIC • TSTT: gridding and meshing tools
Collaboration with APDEC • AMR for particle-in-cell • Goal: develop a flexible suite of fast solvers for PIC codes, based on APDEC's Chombo framework for block-structured adaptive mesh refinement (AMR) • Block-structured adaptive mesh solvers • Fast infinite-domain boundary conditions • Flexible specification of the interaction between grid and particle data • Accurate representation of complex geometries
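One concrete piece of the grid-particle interaction listed above is charge deposition. A deliberately minimal single-level, 1D cloud-in-cell deposition is sketched below; Chombo's actual AMR/PIC interfaces are C++ and handle refinement levels and embedded boundaries, so everything here is an illustrative stand-in rather than APDEC code:

```python
# Minimal 1D cloud-in-cell (CIC) charge deposition onto a periodic grid.
# Illustrative stand-in for the particle-grid coupling an AMR/PIC code needs;
# not Chombo's API, and with no mesh refinement.
import numpy as np

def deposit_cic(positions, charges, n_cells, length):
    """Linear-weighting deposition of point charges onto a periodic 1D grid."""
    dx = length / n_cells
    rho = np.zeros(n_cells)
    s = positions / dx                      # positions in cell units
    i_left = np.floor(s).astype(int)        # nearest grid point to the left
    w = s - i_left                          # fraction assigned to the right point
    np.add.at(rho, i_left % n_cells, charges * (1.0 - w))
    np.add.at(rho, (i_left + 1) % n_cells, charges * w)
    return rho / dx                         # charge per unit length

rng = np.random.default_rng(0)
x = rng.random(10_000)                      # 10k particles on [0, 1)
rho = deposit_cic(x, np.full(x.size, 1.0e-3), n_cells=64, length=1.0)
print(rho.sum() * (1.0 / 64))               # total charge recovered (~10)
```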
Collaboration with APDEC: Benefits from the Heavy Ion Fusion program • AMR modeling of an HIF source and triode region in (r,z) geometry (Figure: fine grid patch around the source and tracking of the beam edge; high resolution vs. low resolution + AMR) • In this example, we obtain a ~4x savings in computational cost for approximately the same answer • Courtesy of A. Friedman, P. Colella et al., LBNL
Collaboration with APDEC: Embedded boundary methods for gas jet modeling
The SciDAC Accelerator Modeling Project provides challenging visualization problems. Courtesy K.-L. Ma et al., UC Davis
Simulating high-intensity beams & beam halos. Courtesy Andreas Adelmann (PSI), Cristina Siegerist (NERSC visualization group), and the PSI visualization group.
Parallel Performance and Parallel Implementation Issues • Example: BeamBeam3D • Scaling using the weak-strong option • Performance of different parallelization techniques in the strong-strong case • Milestone: first-ever million-particle, million-turn, strong-strong simulation performed for the LHC
High aspect ratio solver based on the Integrated Green Function (IGF): the new algorithm provides < 1% accuracy using a 64x64 grid (black curve). (Figure legend: standard solver at 64x1024, 64x2048, 64x4096, 64x8192, and 64x16384 vs. IGF at 64x64.)
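The IGF algorithm is a refinement of the standard FFT Green-function (Hockney-style) space-charge solver: instead of sampling the Green function at grid points, its analytic integral over each cell is used, which is what preserves accuracy on coarse grids for very flat beams. The sketch below shows only the ordinary point-sampled 2D convolution skeleton, with constants and units omitted, so the place where the integrated Green function would be substituted is visible; it is not the IMPACT/IGF implementation.

```python
# Skeleton of an FFT Green-function Poisson solve in 2D with open boundaries
# (Hockney grid doubling). Point-sampled G; the IGF method replaces G below
# with its analytic cell integral. Constants/units omitted; not IMPACT code.
import numpy as np

def poisson_green_fft(rho, dx, dy):
    ny, nx = rho.shape
    # Distances on a doubled grid, mirrored so the circular convolution
    # reproduces a free-space one.
    X = np.arange(2 * nx) * dx
    Y = np.arange(2 * ny) * dy
    X = np.minimum(X, 2 * nx * dx - X)
    Y = np.minimum(Y, 2 * ny * dy - Y)
    r = np.hypot(*np.meshgrid(X, Y))
    r[0, 0] = 0.5 * min(dx, dy)              # regularize the r = 0 point
    G = -np.log(r) / (2.0 * np.pi)           # 2D Laplacian Green function

    rho_pad = np.zeros((2 * ny, 2 * nx))
    rho_pad[:ny, :nx] = rho
    phi = np.fft.irfft2(np.fft.rfft2(G) * np.fft.rfft2(rho_pad), s=G.shape)
    return phi[:ny, :nx] * dx * dy           # discrete convolution sum

# High aspect ratio test charge: a Gaussian much flatter in y than in x.
nx, ny, dx, dy = 64, 64, 1.0e-3, 1.0e-4
x = (np.arange(nx) - nx / 2) * dx
y = (np.arange(ny) - ny / 2) * dy
rho = np.exp(-x[None, :]**2 / (2 * (8 * dx)**2) - y[:, None]**2 / (2 * (8 * dy)**2))
print(poisson_green_fft(rho, dx, dy).max())
```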
Comparisons with Experiments • LANL proton radiography (single-particle optics) • LANL LEDA beam halo experiment • J-PARC front end test (collab w/ KEK/JAERI) • FNAL booster • BNL booster • CERN PS (collab w/ CERN, GSI)
Statistical Methods for Calibration and Forecasting • Determining the initial phase-space distribution from 1D wire-scan data • Courtesy D. Higdon (LANL) et al. Simulation of a high-intensity proton beam through a series of quadrupole magnets. Statistical techniques were used to combine 1D profile monitor data with simulations to infer the 4D beam distribution. The figure shows the 90% intervals for the predicted profile at scanner #6 (shaded regions) and, for comparison, the observed data (black line). Only data from the odd-numbered scanners were used to make the prediction.
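The LANL work uses full Bayesian calibration to fold measurement and simulation uncertainty together. A much simpler classical cousin of the same idea, sketched below with invented transfer matrices and moments, is reconstructing a beam's initial second moments from rms widths measured at several wire scanners:

```python
# Toy version of inferring a beam from 1D profile data: least-squares
# reconstruction of initial second moments from rms widths at several
# stations. Transfer matrices and moments are invented for the sketch;
# the actual SciDAC/LANL work used Bayesian calibration against data.
import numpy as np

def drift(L):                       # 2x2 transfer matrix of a drift of length L
    return np.array([[1.0, L], [0.0, 1.0]])

stations = [drift(L) for L in (0.5, 1.0, 1.5, 2.0, 2.5)]

# "True" initial moment matrix, used only to synthesize the measurements.
sig0 = np.array([[4.0e-6, -1.0e-6],
                 [-1.0e-6, 1.0e-6]])              # <x^2>, <x x'>, <x'^2>
meas = np.array([(M @ sig0 @ M.T)[0, 0] for M in stations])  # measured <x^2>

# <x^2> at station i = m11^2*s11 + 2*m11*m12*s12 + m12^2*s22 (linear in s).
A = np.array([[M[0, 0]**2, 2 * M[0, 0] * M[0, 1], M[0, 1]**2] for M in stations])
s11, s12, s22 = np.linalg.lstsq(A, meas, rcond=None)[0]
print(s11, s12, s22)                # recovers the entries of sig0
```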
Outline • Project overview • Applications • Collaborations in Applied Math and Computer Science • Future Plans
3D First-Principles Fokker-Planck Modeling • Requires the analog of 1000s of space-charge calculations per step • “…it would be completely impractical (in terms of # of particles, computation time, and statistical fluctuations) to actually compute [the Rosenbluth potentials] as multiple integrals” (J. Math. Phys. 138, 1997). FALSE: feasibility demonstrated on parallel machines at NERSC and the ACL • (Figure: self-consistent diffusion coefficients vs. the Spitzer approximation) • Previous approximate calculations performed without parallel computation were not self-consistent • Courtesy J. Qiang (LBNL) and S. Habib (LANL)
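The reason the field-solve approach is feasible where brute-force multiple integrals are not: the Rosenbluth potentials obey Poisson-type equations in velocity space, so the same parallel Poisson-solver machinery used for space charge applies. In standard notation (a textbook identity, not a result specific to Langevin3D):

```latex
% Rosenbluth potentials of the velocity distribution f(v), and the
% velocity-space Poisson equations they satisfy:
h(\mathbf{v}) = \int \frac{f(\mathbf{v}')}{|\mathbf{v}-\mathbf{v}'|}\,d^3v', \qquad
g(\mathbf{v}) = \int f(\mathbf{v}')\,|\mathbf{v}-\mathbf{v}'|\,d^3v',
\\
\nabla_v^2\, h = -4\pi f, \qquad \nabla_v^2\, g = 2h .
```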
Optimization • Accelerator system design including space charge • Shape optimization • Plasma afterburner