ASCI/Alliance Center for Astrophysical Thermonuclear Flashes (University of Chicago)
Introduction to FLASH 2.x
Developers:
A. Bodoh-Creed, A. Caceres, A. Calder, L. J. Dursi, W. Freiss, B. A. Fryxell, T. Linde, K. M. Olson, P. M. Ricker, K. Riley, R. Rosner, D. Sheeler, A. Siegel, F. X. Timmes, N. Vladimirova, G. Weirs, K. Young, M. Zingale
Outline
• Current Status
• FLASH architecture
• PARAMESH / Data structures
• Test Suite
• Scaling / Performance
Current Status
• FLASH 2.1 is the current released version
• Written in a mixture of Fortran 90, C, and Python
• Support routines written in Perl, Java, shell, and IDL
• Modules can be written in either Fortran 90 or C; an API is provided for both languages
• Code is ~280,000 lines
• MPI is used for interprocessor communication
• HDF/HDF5 are the primary output formats (parallel I/O is supported with HDF5)
• Nightly test suite is run on 8 platform/compiler combinations
Simulations that FLASH must handle
• Wide range of compressibility
• Wide range of length and time scales
• Many interacting physical processes
• Large variety of problems
• Scaling to 1000s of processors
• Many people in collaboration
Adaptive Mesh Refinement
• Required because of the very large range of length scales
• Benefits:
• Resolution only where needed
• Memory/compute time decrease
• Can do bigger problems
• Disadvantages:
• Code is much more complex
• Loss of data locality
• Frequent redistribution/load balancing
• Irregular/unpredictable memory/message patterns
• Refinement/derefinement is an art
Current Status (cont.)
FLASH contains:
• Hydrodynamics:
  • PPM
  • Kurganov
• Magnetohydrodynamics (compressible TVD scheme)
• Adaptive Mesh Refinement
• Material Properties:
  • EOS (gamma-law, electron/positron degeneracy, ...)
  • Conductivities, Viscosity, Species Diffusion
  • Multifluid database provides access to material properties (atomic weights, proton number, ...)
Current Status (cont.)
• I/O:
  • HDF
  • HDF5 (parallel and serial)
• Source terms:
  • Thermonuclear burning
  • Stirring
• Gravity:
  • Constant
  • Plane parallel
  • Poisson (multigrid and multipole)
• Particles (active and passive)
• Automatic generation of C wrappers to dBase calls
Under development
• New mesh modules:
  • PARAMESH 3.0
    • Supports cell-, edge-, face-centered, and corner data
    • Allows for permanent or temporary guardcell storage
  • Uniform Grid
• Standardized interfaces to modules (CCA)
• New solvers (relativistic PPM, radiation transport, Hall MHD, anelastic)
• Automated FLASH debugger (based on AMS)
• FLASH user interface over Globus
FLASH Architecture
• Each FLASH module is divided into 3 components:
• Configuration layer describes:
  • Module dependencies
  • Default sub-modules
  • Runtime parameter definitions
• Interface layer:
  • Provides a public API to the underlying FLASH data structures
  • Accessor/mutator methods to manipulate the data
  • All modules (including the mesh) exchange data through the interface layer
• Algorithm:
  • Single-block / single-processor routine
  • Written in any language
FLASH Architecture
• Each module is in its own directory
• Sub-module directories contain code, Makefiles, and configuration files specific to a particular variant of a module.
• Each sub-module inherits all the code, Makefiles, and Config files of the parent.
• Source files at a given level in the directory hierarchy override files of the same name at higher levels.
• All modules communicate with each other only through the interface layer, NOT common blocks.
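The sub-module inheritance rule can be sketched as a simple directory walk, with deeper files of the same name overriding shallower ones. This is a hypothetical simplification (function and variable names are invented); FLASH's actual setup machinery also merges Makefile fragments and Config files along the way.

```python
import os

def collect_sources(module_root, submodule_path):
    """Walk from a module's root directory down to a chosen sub-module,
    letting files at deeper levels override same-named files above them."""
    sources = {}
    current = module_root
    # Start with the parent module's files, then descend level by level.
    for part in [""] + submodule_path.split("/"):
        if part:
            current = os.path.join(current, part)
        if not os.path.isdir(current):
            continue
        for name in sorted(os.listdir(current)):
            full = os.path.join(current, name)
            if os.path.isfile(full):
                sources[name] = full   # deeper level wins
    return sources
```

With a layout like hydro/hydro_sweep.F90 and hydro/ppm/hydro_sweep.F90, selecting the ppm sub-module picks up the ppm version of hydro_sweep.F90 while still inheriting every parent file it did not override.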
Interface layer
• The data structures from the mesh are not directly available to a module writer, but are instead accessed through database calls.
• Similar calls exist for the runtime parameters, physical constants, and scratch-space data.

Assuming a module provides a Config file with the line:

VARIABLE dens

the variable dens can be accessed via:

real, dimension(nxb, nyb, nzb) :: density

lnblocks = dBaseGetLocalBlockCount()
do block_no = 1, lnblocks
   call dBaseGetData("dens", block_no, density)
   call foo(density)
enddo
Setup script
Creating a New Problem
• New problems are added by creating a problem directory under setups/
• A Config file in setups/problem-name lists any module requirements, e.g.:

  REQUIRES materials/eos/helmholtz

• An init_block function provides a routine that initializes the unknowns for a single block.
• setup problem-name -auto [-1d, -3d] sets up the problem, linking all of the modules you requested into the build directory, and creates the Makefile.
• A list of the valid runtime parameters for the current setup is generated in paramFile.{txt,html}.
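Resolving REQUIRES lines amounts to a transitive closure over the Config files. A minimal sketch of that step (hypothetical; the real setup script also handles default sub-modules, parameter definitions, and conflicting requirements):

```python
def parse_requires(config_text):
    """Extract module requirements from a FLASH-style Config file."""
    requires = []
    for line in config_text.splitlines():
        line = line.strip()
        if line.startswith("REQUIRES"):
            requires.append(line.split(None, 1)[1])
    return requires

def resolve(problem, configs):
    """Transitively collect every module the problem pulls in.
    `configs` maps a module path to its Config file text."""
    needed, stack = set(), [problem]
    while stack:
        mod = stack.pop()
        if mod in needed:
            continue
        needed.add(mod)
        for dep in parse_requires(configs.get(mod, "")):
            stack.append(dep)
    return needed
```

A problem that REQUIRES the helmholtz EOS thereby also pulls in anything that module's own Config file requires, without the problem author listing it explicitly.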
PARAMESH Tree Structure
• The computational domain is broken into blocks
• Each block has the same logical structure
• Each block is surrounded by a perimeter of guardcells to exchange information with neighboring blocks
• The hierarchy of blocks is maintained by a 2^d-tree structure (a quadtree in 2D, an octree in 3D)
• A block is refined by bisection in each coordinate direction
• Each level of refinement is a factor of 2 different from adjacent levels
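Bisecting in every coordinate direction always produces 2^d children, each half the parent's size and one level deeper. A toy sketch of the bookkeeping (invented tuple layout, not PARAMESH's actual data structures):

```python
from itertools import product

def refine(block, dim):
    """Bisect a block in each of `dim` coordinate directions,
    producing 2**dim children at the next refinement level."""
    level, origin, size = block          # (level, origin tuple, edge length)
    half = size / 2.0
    children = []
    for offsets in product((0, 1), repeat=dim):
        child_origin = tuple(o + off * half for o, off in zip(origin, offsets))
        children.append((level + 1, child_origin, half))
    return children
```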
Load Balancing
• A Morton space-filling curve is passed through all the blocks in the tree to generate a 1-dimensional ordering
• Other space-filling curves were tried with no real improvement
• The space-filling curve preserves locality and minimizes the surface-to-volume ratio of the sub-domains
• Work weightings are assigned to each block
• Equal amounts of work are assigned to each processor
• Refinement occurs every 4 timesteps for PPM
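The Morton ordering interleaves the bits of a block's integer coordinates; the resulting 1-D list is then cut into contiguous, roughly equal-work pieces. A 2D sketch with made-up work weights (a simplification of what PARAMESH actually does):

```python
def morton_key(ix, iy, bits=16):
    """Interleave the bits of (ix, iy) into a single Morton index."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b)
        key |= ((iy >> b) & 1) << (2 * b + 1)
    return key

def partition(blocks, nprocs):
    """Order blocks along the Morton curve, then deal them out in
    contiguous runs of roughly equal total work per processor."""
    ordered = sorted(blocks, key=lambda b: morton_key(b["ix"], b["iy"]))
    target = sum(b["work"] for b in ordered) / nprocs
    assignment, acc, proc = [], 0.0, 0
    for b in ordered:
        if acc >= target and proc < nprocs - 1:
            proc, acc = proc + 1, 0.0
        assignment.append((b["ix"], b["iy"], proc))
        acc += b["work"]
    return assignment
```

Because each processor receives a contiguous run of the curve, neighboring blocks in space tend to land on the same processor, which is the locality property the slide refers to.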
Flux Conservation
• At jumps in refinement, a flux conservation step must be performed to ensure global conservation
• The fluxes from the finer mesh are taken as the more accurate fluxes
• The fine-cell fluxes are summed and replace the coarse-cell flux in the neighboring zone
• The summing is area-weighted to take the geometry into account
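The correction at a refinement jump can be sketched as an area-weighted sum over the fine faces that tile the coarse face (a toy example with assumed face areas, not FLASH's actual flux arrays):

```python
def conserved_coarse_flux(fine_fluxes, fine_areas):
    """Replace a coarse-cell face flux with the area-weighted sum of
    the fine-cell fluxes sharing that face, so the total flux through
    the face is identical on both sides of the refinement jump."""
    total_area = sum(fine_areas)
    total = sum(f * a for f, a in zip(fine_fluxes, fine_areas))
    return total / total_area   # flux per unit coarse-face area
```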
FLASH Test Suite
• Each night, the current development version of FLASH is checked out of CVS
• flash_test submits each job in the suite (currently 26) on several different platforms
• Currently there are comparison tests, compilation tests, and performance tests
• Results are posted online nightly
FLASH Test Suite (cont.)
• The output is compared against stored benchmarks to machine precision
• Any differences are flagged as failures
• New tests are added frequently
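A comparison "to machine precision" can be sketched as a tolerance tied to the floating-point epsilon (a simplified stand-in for flash_test's actual comparison utility; the function name and tolerance factor are invented):

```python
import sys

def compare_to_benchmark(output, benchmark, scale=4.0):
    """Flag any value that differs from the stored benchmark by more
    than a few units in the last place relative to its magnitude."""
    eps = sys.float_info.epsilon
    failures = []
    for i, (a, b) in enumerate(zip(output, benchmark)):
        tol = scale * eps * max(abs(a), abs(b), 1.0)
        if abs(a - b) > tol:
            failures.append(i)
    return failures        # non-empty list marks the test as failed
```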
Visualization
Extensive plot/analysis routines in IDL for analyzing FLASH data
Development Dynamics
• ~10 developers make ~500 changes each month (each change involves 1 or more files)
• Keeping the code in CVS is an absolute must
  • CVS basically forces you to comment on your changes, so every change to the code is documented
• Each night, the minor version number of FLASH is incremented
  • This version number is supplied by a function flash_release() that is automatically created by the Makefile at compile time
  • Every output file indicates the exact version number of the code used to make the executable, as well as the machine, directory, and setup call used
  • Combined with CVS, this makes it possible to go back to the exact source used for a particular simulation weeks/months/years after the calculation
• Maintaining accurate, up-to-date documentation is important for bringing new developers into the project
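The build-time version stamp can be sketched as a tiny generator a Makefile rule might invoke before compiling, emitting Fortran source for flash_release() (hypothetical output format; the real Makefile rule may differ):

```python
import os
import socket

def make_flash_release(version):
    """Emit Fortran source for a flash_release() function that bakes
    the version string, build machine, and build directory into the
    executable, so every output file can record its exact provenance."""
    stamp = "%s built on %s in %s" % (version, socket.gethostname(), os.getcwd())
    return (
        "      function flash_release() result(release)\n"
        "      character(len=256) :: release\n"
        "      release = '%s'\n"
        "      end function flash_release\n" % stamp
    )
```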
Development Dynamics (cont.)
• A nightly GNU-style ChangeLog is produced from the CVS logs, giving details of the recent changes.
• Nightly testing is probably the most important part of the development cycle.
• Code is documented using RoboDoc.
• Scripts are used to enforce our coding standards and report on violations.
• Code is periodically parsed by syntax checkers/lints like forcheck.
• Performance/scaling tests are periodically run to make sure that architectural changes have not come at the expense of performance.