SciDAC Accelerator Modeling Project
Kwok Ko and Robert D. Ryne
SciDAC PI meetingCharleston, South Carolina
March 23, 2004
Outline
• Project overview
• Applications
• Collaborations in Applied Math and Computer Science
• Future Plans
DOE’s Facilities for the Future of Science, 20-Year Outlook
is a testament to the importance of DOE/SC and the importance of particle accelerators
Of the 28 priorities on the list, nearly 1/2 are accelerator facilities.
Accelerator projects on the 20-year list:
• LCLS
• RIA
• CEBAF upgrade
• BTeV
• Linear Collider
• SNS upgrade
• RHIC II
• NSLS upgrade
• Super Neutrino Beam
• ALS upgrade
• APS upgrade
• eRHIC
• IBX
SciDAC Accelerator Modeling Project
Goal: Create a comprehensive simulation environment, capable of modeling a broad range of physical effects, to solve the most challenging problems in 21st century accelerator science and technology
Sponsored by: DOE/SC Office of High Energy Physics (formerly HENP) in collaboration w/ Office of Advanced Scientific Computing Research
SciDAC codes are having a major impact on existing accelerators and future projects
• PEP-II interaction region heating analysis (Omega3P, Tau3P, T3P)
• Simulation of beam-beam effects in Tevatron, PEP-II, RHIC, and LHC (BeamBeam3D)
• Discovery that self-ionization can lead to meter-long high density plasma sources for plasma accelerators
• NLC acc. structure design (Omega3P) & wakefield computation (Omega3P, S3P, Tau3P)
• Beam loss studies at FNAL booster (Synergia)
• Study of e-cloud instability in LHC (QuickPIC)
• NLC peak surface fields and dark current simulations (Tau3P, Track3P)
• Gas jet modeling (Chombo/EB)
• RIA RFQ cavity modeling (Omega3P)
The SciDAC Accelerator Modeling Project team: a multidisciplinary, multi-institutional team producing comprehensive terascale accelerator design tools
• SLAC: Large-Scale Electromagnetic Modeling
• Stanford, LBNL (CRD): Parallel Linear Solvers, Eigensolvers, PDE Solvers, AMR
• LBNL (AFRD): Beam-Beam; Space Charge in linacs & rings; parallel Poisson solvers
• UCLA, USC, Tech-X, U. Colorado: Plasma-Based Accelerator Modeling; Parallel PIC frameworks (UPIC)
• FNAL: Space-charge in rings; software integration; Booster expts
• BNL: Space-charge in rings; wakefield effects; Booster expts
• UC Davis: Particle & Mesh Visualization
• SNL: Mesh Generation
• U. Maryland: Lie Methods in Accelerator Physics (M = e^{:f2:} e^{:f3:} e^{:f4:} ⋯, N = A⁻¹ M A)
• LANL: High Intensity Linacs, Computer Model Evaluation
Code Development
• Electromagnetics: Omega3P, Tau3P, T3P, S3P, Track3P
• Beam Dynamics: BeamBeam3D, IMPACT, MaryLie/IMPACT, Synergia, Langevin3D
• Advanced Accelerators: OSIRIS, VORPAL, QuickPIC, UPIC
IMPACT code suite user map: SLAC, LBNL, LANL, TX Corp, FNAL, ORNL, MSU, BNL, JLab, RAL, PSI, GSI, KEK
Collaborations with Applied Math and Computer Science
SciDAC ISICs (TOPS, APDEC, TSTT), SAPP:
• Eigensolvers and linear solvers
• Poisson solvers
• AMR
• Meshing & discretization
• Parallel PIC methods
• Partitioning
• Visualization
• Statistical methods
Outline
• Project overview
• Applications
• Collaborations in Applied Math and Computer Science
• Future Plans
Modeling the PEP-II Interaction Region
[Figure: PEP-II IR beamline model from left crotch to right crotch, showing the center beam pipe and the e+ and e- beams, 2.65 m on either side]
Beam heating in the beamline complex near the IR prevented PEP-II from operating at high currents. Omega3P analysis helped in redesigning the IR for the upgrade.
FULL-SCALE OMEGA3P MODEL FROM CROTCH TO CROTCH
Courtesy K. Ko et al., SLAC
Modeling the PEP-II Interaction Region
Top: distributed mesh of the IR between the crotches only. Bottom: snapshot in time of the electric field due to two colliding beams, from a Tau3P time-domain simulation.
Left: trapped mode with highest power loss calculated by Omega3P (5.28 GHz, 230 W). Right: power loss distribution about the interaction point (17.2 kW total from 330 modes).
Tevatron Modeling
• Large computing requirement: each point requires 12 hrs × 1024 procs
• Recent result: good agreement for pbar lifetime vs. proton intensity
Courtesy Fermilab and LBNL
Beam-Beam Studies of PEP-II
• Collaborative study/comparison of beam-beam codes
• Predicted luminosity is sensitive to the # of slices used in simulation
Modeling a Plasma Wakefield Accelerator with added realism in full 3D models (OSIRIS, VORPAL)
Full EM PIC simulation of the drive beam ionizing lithium in a gas cell. Courtesy W. Mori et al., UCLA
Full-Scale Modeling of a 30-cell Structure
• Distributed model on a mesh of half a million hexahedral elements
• Study of RF damage at high-power X-band operation using Tau3P & Ptrack3D
Courtesy K. Ko et al., SLAC
NLC Accelerating Structure Design
Model of the 55-cell H60VG3 structure without damping manifolds, but with cell-to-cell variation to provide Gaussian detuning
QuickPIC calculations have resulted in up to a 500x increase in performance over fully EM PIC
[Plot: Ez (mc ωp/e) vs. z (c/ωp), comparing QuickPIC with OSIRIS (2D)]
Wake produced by an electron beam propagating through a plasma cell
Modeling beam loss in the Fermilab Booster using Synergia
Booster simulation and experimental results. (P. Spentzouris and J. Amundson, FNAL)
Outline
• Project overview
• Applications
• Collaborations in Applied Math and Computer Science
• Future Plans
Collaboration w/ SciDAC ISICs
TOPS: linear algebra libraries, preconditioners, eigensolvers for better convergence & accuracy
APDEC: solvers based on block-structured AMR, and methods for AMR/PIC
TSTT: gridding and meshing tools
Collaboration with TOPS: Eigensolvers and Linear Solvers
[Bar chart: largest problem size attempted (up to ~100,000,000 unknowns) with three different linear solvers, SuperLU, WSMP, and CG with a hierarchical preconditioner, on NERSC's IBM/SP]
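For orientation, finding cavity modes with a finite-element code such as Omega3P amounts to solving a large sparse generalized eigenproblem K x = λ M x for a handful of eigenvalues near a target frequency, which is why the TOPS shift-invert eigensolvers and preconditioned linear solvers matter. The sketch below only illustrates that shift-invert pattern with SciPy on a toy matrix pair; the matrices, size, and shift are placeholders, not the Omega3P discretization.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Toy stand-ins for the finite-element stiffness (K) and mass (M)
# matrices; the real Omega3P matrices come from a 3D cavity mesh.
n = 2000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")
M = sp.identity(n, format="csc")

# Shift-invert: ask for the few eigenvalues of K x = lam M x closest
# to a target shift sigma (analogous to modes near a design frequency).
sigma = 0.01
vals, vecs = eigsh(K, k=5, M=M, sigma=sigma, which="LM")
print("eigenvalues nearest the shift:", np.sort(vals))
```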
Collaboration w/ TOPS: Partitioning
Left: ParMETIS-partitioned model of a five-cell structure (top); the same structure partitioned with RCB1D, showing only two neighbors per partition but three partitions sharing one waveguide port (bottom). Right: comparison between ParMETIS and RCB1D on the 55-cell structure without waveguide ports.
Collaboration with APDEC: AMR for particle-in-cell
Goal: Develop a flexible suite of fast solvers for PIC codes, based on APDEC’s Chombo framework for block-structured adaptive mesh refinement (AMR).
• Block-structured adaptive mesh solvers.
• Fast infinite-domain boundary conditions.
• Flexible specification of interaction between grid and particle data.
• Accurate representation of complex geometries.
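To make the grid-particle interaction bullet above concrete, here is a minimal cloud-in-cell (CIC) charge-deposition sketch on a uniform 1D grid. It is the generic textbook scheme written in plain NumPy, not the Chombo/AMR implementation, and the grid, particle count, and bunch shape are made up for illustration.

```python
import numpy as np

def deposit_cic(x, q, grid_min, dx, n_cells):
    """Cloud-in-cell deposition of particle charge onto a uniform 1D grid.

    x: particle positions, q: particle charges (same length).
    Returns the charge density on the n_cells + 1 grid nodes.
    """
    rho = np.zeros(n_cells + 1)
    s = (x - grid_min) / dx            # fractional cell coordinate
    i = np.floor(s).astype(int)        # index of the node to the left
    w = s - i                          # weight toward the right node
    np.add.at(rho, i, q * (1.0 - w))   # share charge with the left node
    np.add.at(rho, i + 1, q * w)       # share charge with the right node
    return rho / dx                    # charge per unit length

# Example: a Gaussian bunch of 10,000 macro-particles on a 64-cell grid.
rng = np.random.default_rng(0)
x = np.clip(rng.normal(0.5, 0.1, 10_000), 0.0, 1.0 - 1e-9)
q = np.full(x.size, 1.0 / x.size)      # total charge normalized to 1
rho = deposit_cic(x, q, grid_min=0.0, dx=1.0 / 64, n_cells=64)
print("deposited charge:", rho.sum() / 64)   # ~1.0, up to rounding
```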
Collaboration with APDEC: Benefits from the Heavy Ion Fusion program
[Plot: comparison of a high-resolution run with a low-resolution + AMR run]
Fine grid patch around source, & tracking beam edge
AMR modeling of an HIF source and triode region in (r,z) geometry
• In this example, we obtain a ~4x savings in computational cost for essentially the same answer
Courtesy of A. Friedman, P. Colella et al., LBNL
Collaboration with APDEC: Embedded boundary methods for gas jet modeling
Axisymmetric jet expanding into vacuum (the axis of symmetry is at the bottom). This uses AMR and APDEC’s embedded boundary methods for gas dynamics.
Collaboration with TSTT: Meshing & Discretization
Hexahedral mesh of the PEP-II Interaction Region (excluding the crotches) generated with CUBIT for a Tau3P simulation using a transit beam to study wall heating
Collaboration with TSTT: AMR on Unstructured Grids
Three steps of AMR applied to the Trispal cavity to refine regions of high wall loss (in red) for accurate quality factor determination
The SciDAC Accelerator Modeling Project provides challenging visualization problems
Courtesy K.-L. Ma et al., UC Davis
Courtesy Andreas Adelmann (PSI) and Cristina Siegerist (NERSC viz group)
Courtesy Andreas Adelmann and PSI viz group
Simulating high intensity beams & beam halos
Parallel Performance and Parallel Implementation Issues
• Example: BeamBeam3D
PEs    time (sec)
128    1612
256    858
512    477
1024   303
2048   212
Scaling using weak-strong option
Performance of different parallelizationtechniques in strong-strong case
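As a quick cross-check on the weak-strong timings in the table above, speedup and parallel efficiency relative to the 128-processor baseline follow directly (values copied from the table):

```python
# Speedup and parallel efficiency of the BeamBeam3D weak-strong runs,
# computed from the timing table above; 128 PEs is the baseline.
pes = [128, 256, 512, 1024, 2048]
times = [1612, 858, 477, 303, 212]

base_pes, base_time = pes[0], times[0]
for p, t in zip(pes, times):
    speedup = base_time / t
    efficiency = speedup / (p / base_pes)
    print(f"{p:5d} PEs  speedup {speedup:5.2f}x  efficiency {efficiency:5.1%}")
```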
Milestone: First-ever million particle, million turn, strong-strong simulation performed for LHC
High Aspect Ratio solver based on Integrated Green Function (IGF): new algorithm provides < 1% accuracy using a 64x64 grid (black curve).
[Plot: standard solver at 64x1024, 64x2048, 64x4096, 64x8192, and 64x16384 grid resolutions compared with the IGF solver at 64x64]
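The IGF solver keeps the usual FFT-based open-boundary (Hockney) convolution for the space-charge potential but replaces the point-sampled Green function with its analytic integral over each grid cell, which is what preserves accuracy on high-aspect-ratio grids. The sketch below shows the generic Hockney convolution with a point-sampled 2D kernel and marks the line an IGF implementation would change; the grid sizes, units, and test charge distribution are illustrative guesses, not the IMPACT implementation.

```python
import numpy as np

# Generic Hockney-style open-boundary Poisson solve on a 2D grid
# (constants and units glossed over); the IGF variant would only
# change how the kernel G is built.
nx, ny = 64, 64
dx, dy = 1.0 / nx, 1.0 / ny

# Illustrative charge density: a round Gaussian bunch.
x = (np.arange(nx) + 0.5) * dx
y = (np.arange(ny) + 0.5) * dy
X, Y = np.meshgrid(x, y, indexing="ij")
rho = np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / (2 * 0.05 ** 2))

# Green function sampled on a doubled grid (open boundary conditions).
gx = np.arange(2 * nx) * dx
gy = np.arange(2 * ny) * dy
gx = np.where(gx > nx * dx, 2 * nx * dx - gx, gx)   # mirrored distances
gy = np.where(gy > ny * dy, 2 * ny * dy - gy, gy)
GX, GY = np.meshgrid(gx, gy, indexing="ij")
r = np.hypot(GX, GY)
r[0, 0] = 0.5 * min(dx, dy)           # regularize the singular point
G = -np.log(r) / (2.0 * np.pi)        # point-sampled 2D kernel
# An IGF solver would instead use the analytic integral of the kernel
# over each (possibly very elongated) grid cell here.

# Zero-pad the charge, convolve via FFT, keep the physical quadrant.
rho_pad = np.zeros((2 * nx, 2 * ny))
rho_pad[:nx, :ny] = rho
phi = np.fft.irfft2(np.fft.rfft2(G) * np.fft.rfft2(rho_pad),
                    s=(2 * nx, 2 * ny))[:nx, :ny] * dx * dy
print("potential grid:", phi.shape, "peak:", float(phi.max()))
```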
Comparisons with Experiments
• LANL proton radiography (single-particle optics)
• LANL LEDA beam halo experiment
• J-PARC front end test (collab w/ KEK/JAERI)
• FNAL booster
• BNL booster
• CERN PS (collab w/ CERN, GSI)
Statistical Methods for Calibration and Forecasting
Simulation of a high intensity proton beam through a series of quadrupole magnets. Statistical techniques were used to combine 1D profile monitor data with simulations to infer the 4D beam distribution. The figure shows the 90% intervals for the predicted profile at scanner #6 (shaded regions), and, for comparison, the observed data (black line). Only data from the odd numbered scanners were used to make the prediction.
• Determining initial phase space distribution from 1D wire scan data.
• Courtesy D. Higdon (LANL) et al.
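As a generic illustration of how shaded 90% bands like those in the figure are typically formed (not the actual LANL statistical machinery), a pointwise interval can be taken from percentiles of an ensemble of simulated profiles:

```python
import numpy as np

# Placeholder ensemble: 200 simulated 1D beam profiles at one wire
# scanner, each evaluated at 100 positions (made-up data, not the
# actual posterior-predictive runs).
rng = np.random.default_rng(1)
positions = np.linspace(-5.0, 5.0, 100)              # arbitrary units
widths = rng.uniform(1.0, 1.5, size=(200, 1))        # varied beam sizes
ensemble = np.exp(-0.5 * (positions / widths) ** 2)  # shape (200, 100)

# Pointwise 90% interval: 5th and 95th percentiles across the ensemble,
# plus the median as a central prediction.
lo, hi = np.percentile(ensemble, [5.0, 95.0], axis=0)
median = np.median(ensemble, axis=0)
print(positions.shape, lo.shape, median.shape, hi.shape)
```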
Outline
• Project overview
• Applications
• Collaborations in Applied Math and Computer Science
• Future Plans
NLC Accelerating Structure Design
Model of the 55-cell H60VG3 structure without damping manifolds, but with cell-to-cell variation to provide Gaussian detuning
Model of the H60VG3 DDS structure showing damping manifolds and HOM couplers
3D First-Principles Fokker-Planck Modeling
• Requires the analog of 1000s of space-charge calculations per step
• "…it would be completely impractical (in terms of # of particles, computation time, and statistical fluctuations) to actually compute [the Rosenbluth potentials] as multiple integrals" J. Math. Phys. 138 (1997).
Self-Consistent Diffusion Coefficients
Spitzer approximation
Previous approximate calculations, performed w/out parallel computation, were not self-consistent
FALSE. Feasibility demonstrated on parallel machines at NERSC and ACL
Courtesy J. Qiang (LBNL) and S. Habib (LANL)
Optimization
• Accelerator system design including space charge
• Shape optimization
• Plasma afterburner