
U.S. Department of Energy
National Nuclear Security Administration
Advanced Simulation and Computing
Academic Strategic Alliances Program

Center for Simulation of Advanced Rockets

Y6-10 Statement of Work

2262 Digital Computer Laboratory
1304 West Springfield Avenue
College of Engineering
University of Illinois at Urbana-Champaign
Urbana, Illinois 61801

May 2002

Cover image: Fully coupled, 3-D simulation of "slumping" propellant in Titan IV rocket motor showing velocity vectors in core fluid and total stress in solid propellant


    1 Table of Contents

1. Table of Contents

2. Executive Summary

3. Scientific and Engineering Advances

3.1 Simulation Development

3.2 Simulation Demonstrations

3.3 Structures and Materials

3.4 Fluid Dynamics

3.5 Combustion and Energetic Materials

3.6 Computer Science

Computational Mathematics and Geometry

Computational Environment

Visualization

3.7 Verification and Validation

4. Outreach and Interaction with NNSA/DP Laboratories

5. Organization, Management, and Academic Program

6. Legacy

6.1 Code Legacy

Software Engineering

6.2 Center Legacy

A. Addendum: Additional Research Projects


    2 Executive Summary

The goal of CSAR is the detailed, whole-system simulation of solid propellant rockets from first principles under both normal and abnormal operating conditions. The design of solid propellant rockets is a sophisticated technological problem requiring expertise in diverse subdisciplines, including the ignition and combustion of composite energetic materials; the solid mechanics of the propellant, case, insulation, and nozzle; the fluid dynamics of the interior flow and exhaust plume; the aging and damage of components; and the analysis of various potential failure modes. These problems are characterized by very high energy densities, extremely diverse length and time scales, complex interfaces, and reactive, turbulent, and multiphase flows.

The scientific and technological needs of the U.S. Department of Energy posed by the Accelerated Strategic Computing Initiative/Academic Strategic Alliances Program (ASCI/ASAP) encouraged the University of Illinois at Urbana-Champaign (UIUC) to establish the Center for Simulation of Advanced Rockets (CSAR) in September 1997. The outstanding quality of the faculty and staff, facilities, and research infrastructure offered by UIUC has enabled a unique partnership among university researchers and the DOE/NNSA Defense Program laboratories to advance the state of the art in computational simulation of complex systems. State, regional, and university resources are also supporting the program, and an experienced research team is fulfilling the mission of the Center.

CSAR is focusing on the reusable solid rocket motor (RSRM) of the NASA Space Transportation System, better known as the Space Shuttle, as its long-term simulation vehicle. The RSRM is a well-established commercial rocket, is globally recognized, and, most importantly, design data and propellant configurations are available. Various smaller scale rockets are also simulated to provide validation data for CSAR codes. Simulations that include full geometric and materials complexity require a sequence of incremental developments, in engineering science, computer science, and systems integration, over an extended period of time. From the outset, our emphasis has been on system integration rather than separate threads of development that eventually come together at some point in the future. Rapid exploration of critical system integration issues demanded the use of simplified, but fully integrated, models and interfaces initially, followed by successively refined models and interfaces as experience was gained. CSAR staff have designed and implemented a fully integrated code that includes characterization of various burn scenarios and the onset of potential component failures. Refined multiscale component models and advanced system integration concepts based on lessons learned from this effort constitute the key features in our proposed research. Use of the simulation code to explore scientific and engineering issues in complex fluid-structure interactions is a major focus for the new program.

More than 100 UIUC faculty, students, and researchers contribute to the success of the Center. An External Advisory Board provides critical guidance in rocket simulation and computational science. The DOE-supplied budget has been sufficient to maintain an aggressive research program. In addition, the University of Illinois has provided funds for ancillary research expenditures, computer workstations, and facility renovation. Center personnel have traveled widely to explore rocket science and technology, identify technical collaborators, describe the ASCI/ASAP program, and establish relationships among Center investigators, DOE/NNSA DP scientists, and industry leaders.


    3 Scientific and Engineering Advances

    3.1 Simulation Development

The central goal of CSAR is the detailed, whole-system simulation of solid propellant rockets from first principles under both normal and abnormal operating conditions. Full simulations (Figure 3.1) of such complexity require a sequence of incremental developments, in engineering science, computer science, and systems integration, over an extended period of time. From the outset, however, our emphasis has been on system integration rather than separate threads of development that eventually come together at some point in the future. Rapid exploration of critical system integration issues has demanded the use of simplified, but fully integrated, models and interfaces initially, followed by successively refined models and interfaces as experience is gained (Figure 3.2).

    Simulation Roadmap

The CSAR Simulation Roadmap (Figure 3.3) depicts the evolution of increasingly sophisticated computational models for the primary rocket components and their interactions. Our initial implementation of an integrated simulation code (GEN1), which was operational at the end of 2000, provided a simplified characterization of various burn scenarios. The GEN1 code employed macroscopic models for the separate components to enable a strong focus on the definition and resolution of system integration issues. Refined, multiscale component models and advanced system integration concepts, based on lessons learned from GEN1, constitute the key features in the second-generation (GEN2) code, developed during Years 4 and 5 and beyond. The refined models also reflect the synthesis of fundamental, subscale studies (bottom right side of Figure 3.3) that are critical for detailed simulations of accident scenarios and for reliable simulation of multiscale phenomena such as combustion and turbulence. The code numbers in the diagram indicate dependence of the refined and accident models on the subscale simulations.

The Roadmap indicates the close coupling among the components: physical quantities such as temperature (T), mass flow (ṁ), pressure (p), heat flux (q), concentrations (c_i), and geometry must be exchanged between the solid rocket booster (SRB) component models.

Fig. 3.1: Current 3-D fully coupled code includes structural dynamics, combustion, and fluid dynamics simulation modules and ignition model. Images show interaction of solid propellant, combustion layer, and fluid flow following ignition of RSRM (clockwise from upper left: 5, 10, 15, and 20 ms). Colors in solid propellant depict local stress, colored arrows in fluid represent flow direction and speed, and colored isosurfaces in fluid show temperature distribution.


The computer science integration efforts define the framework for these interconnections and, consequently, their eventual impact on overall code performance. The right-center box on the diagram shows computer science research and development activities that support the SRB simulation through the implementation and optimization of the component models and subscale simulations, the integration of component models, and the computational infrastructure required to do large-scale parallel computation.

Finally, the central placement of validation efforts in the diagram highlights the priority assigned to this activity. Each subscale, component, and integrated simulation must be validated against existing analytical, numerical, and experimental data available in the open literature or obtained from NASA, DOD research agencies, the U.S. rocket industry, or from experiments in laboratories on our own campus.

System integration involves two major tasks to ensure the physical, mathematical, geometric, numerical, and software compatibility of the component models and the codes implementing them.

Fig. 3.2: CSAR follows staged approach to system integration. (Diagram plots physical complexity, from weakly coupled to fully coupled to detailed, against geometrical complexity, from 1-D and 2-D to 3-D star grain and joints, locating GEN0, the GEN1 family, the GEN2 family, and accident simulations.)

Fig. 3.3: CSAR Roadmap showing completed tasks (dark boxes) and planned activities for Y6-10.


The first task is providing information transfer across component boundaries. Boundary conditions for the component models must be compatible mathematically (e.g., an outflow from one component becomes an inflow for a neighboring component). The discretizations of neighboring components must fit together geometrically. Different spatial resolutions and discretization methodologies must be reconciled via interpolation where necessary. These issues have been addressed by developing innovative algorithms for mesh association and common refinement of interface meshes, together with new data transfer methods that are both physically conservative and highly accurate numerically.
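
To make the conservation requirement concrete, the sketch below transfers cell-averaged interface data between two non-matching 1-D meshes by integrating over their common refinement, so the total integral of the transferred quantity is preserved exactly. This is a deliberately minimal 1-D analogue, not the Rocface algorithm itself, and all names in it are hypothetical.

```python
import numpy as np

def transfer_conservative(src_edges, src_vals, dst_edges):
    """Transfer cell-averaged data between non-matching 1-D meshes.

    The 'common refinement' is the set of intervals formed by overlaying
    both meshes; integrating source data over it and re-averaging on the
    target preserves the total integral exactly.
    """
    cuts = np.union1d(src_edges, dst_edges)          # common refinement
    dst_vals = np.zeros(len(dst_edges) - 1)
    for a, b in zip(cuts[:-1], cuts[1:]):
        mid = 0.5 * (a + b)
        i = np.searchsorted(src_edges, mid) - 1      # source cell containing piece
        j = np.searchsorted(dst_edges, mid) - 1      # target cell containing piece
        if 0 <= i < len(src_vals) and 0 <= j < len(dst_vals):
            dst_vals[j] += src_vals[i] * (b - a)     # accumulate integral
    return dst_vals / np.diff(dst_edges)             # back to cell averages

src_edges = np.array([0.0, 0.3, 0.7, 1.0])           # one side of the interface
dst_edges = np.linspace(0.0, 1.0, 6)                 # non-matching other side
src_vals = np.array([2.0, 5.0, 1.0])                 # e.g., pressure per cell
dst_vals = transfer_conservative(src_edges, src_vals, dst_edges)
# total integral is preserved to round-off:
assert np.isclose(np.dot(src_vals, np.diff(src_edges)),
                  np.dot(dst_vals, np.diff(dst_edges)))
```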

The other major task is temporal coupling of the components so that the whole system is evolved in a self-consistent manner. Different components require different time step sizes due to the choice(s) of algorithm(s) (e.g., explicit vs. implicit methods), spatial resolution, and/or the physics of the subproblem that the module solves. The computational cost of forcing each module to take a time step determined by the module requiring the shortest step is often prohibitive. In a broad, important research area that has come to be known as time zooming, we continue to investigate multiple strategies for coupling modules requiring different time step sizes while maintaining the accuracy of the overall simulation.
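
The simplest member of this family of strategies is conventional subcycling, in which the module with the smaller stable step takes many substeps per step of the slower module, with coupling data held frozen in between. The sketch below illustrates only that idea, using trivial scalar stand-ins for the modules; the time-zooming research described above goes well beyond this.

```python
class Module:
    """Trivial stand-in for a physics module: one scalar state, one ODE."""
    def __init__(self, state, rate):
        self.state, self.rate = state, rate
    def step(self, dt, coupling):
        self.state += dt * (self.rate * self.state + coupling)  # explicit Euler

def advance_coupled(fluid, solid, t_end, dt_solid, n_sub):
    """Subcycle the fast module n_sub times per slow-module step,
    freezing each module's coupling data over its partner's step."""
    t, dt_fluid = 0.0, dt_solid / n_sub
    while t < t_end - 1e-12:
        for _ in range(n_sub):              # fast module takes many small steps
            fluid.step(dt_fluid, solid.state)
        solid.step(dt_solid, fluid.state)   # slow module sees updated fluid state
        t += dt_solid

fluid = Module(1.0, -50.0)   # stiff component, needs small steps
solid = Module(1.0, -0.5)    # slow component, tolerates large steps
advance_coupled(fluid, solid, t_end=1.0, dt_solid=0.05, n_sub=100)
print(fluid.state, solid.state)
```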

Our approach to system integration has been to develop a single executable code containing modules for the various components and the interface code for tying them together. We are following an object-oriented design methodology that hides the data structures and other internal details of the individual component codes. This simplifies development and maintenance of the interface code and the component codes, and also makes it easier to swap different versions of the same component, a critical capability for determining the most efficient algorithms and implementations.
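
The sketch below illustrates this design idea in miniature: driver code programs against an abstract interface, so two solvers with entirely different internals can be swapped without touching the driver. The class and method names are hypothetical illustrations, not the actual CSAR interfaces.

```python
from abc import ABC, abstractmethod

class FluidSolver(ABC):
    """Common interface the driver programs against; each concrete
    solver's meshes and data structures stay hidden behind it."""
    @abstractmethod
    def step(self, dt: float) -> None: ...
    @abstractmethod
    def interface_pressure(self) -> float: ...

class StructuredSolver(FluidSolver):
    def __init__(self): self._p = 1.0
    def step(self, dt): self._p *= 1.0 + 0.1 * dt    # stand-in physics
    def interface_pressure(self): return self._p

class UnstructuredSolver(FluidSolver):
    def __init__(self): self._p = 1.0
    def step(self, dt): self._p += 0.1 * dt          # different internals, same interface
    def interface_pressure(self): return self._p

def run(solver: FluidSolver, steps=10, dt=1e-3):
    for _ in range(steps):
        solver.step(dt)
    return solver.interface_pressure()

# Either implementation can be swapped in without touching run():
print(run(StructuredSolver()), run(UnstructuredSolver()))
```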

    3.2 Simulation Demonstrations

    Team Coordinators: Mark D. Brandyberry and Robert A. Fiedler

    Research Programmer: Johnny C. Norris

    Overview

Simulating solid propellant rocket motors requires solving an extremely complex, fully coupled, multidisciplinary set of physics problems that span a wide range of length and time scales. As noted above, we have taken a staged approach in developing an integrated whole-system rocket simulation package, beginning with relatively simple physical models, geometries, and coupling in our first-generation code (GEN1), and progressing toward more detailed physics, geometries, and interactions in our second-generation code (GEN2).


We propose a variety of challenging fully-integrated, 3-D simulations to exercise our GENX code over the next few years. They are a collection of validation studies that include normal and abnormal operating conditions, geometries supplied from the open literature, NASA, DOD, and our U.S. rocket industry partners, and experimental- and production-model rocket motors. These target problems comprise the annual integrated simulation goals required in the Statement of Work.


Approximate completion dates are included, although implementation of improved physical models (turbulence, aluminum droplets, smoke, radiation, plume chemistry; constitutive relations for the propellant, case, and insulation; heat transfer, ignition, and burn rate, etc.) may warrant revisiting a given problem to demonstrate better agreement with experimental results. This preliminary list of problems includes Titan IV propellant slumping, flexible inhibitor, aerodynamic loads (rocket in a wind tunnel), detailed full burn of various small rockets supplied by the U.S. rocket industry, RSRM normal burn, flow past O-rings, and Titan case rupture.

    Integrated Rocket Simulations

Titan IV propellant slumping. This real-life solid propellant booster accident is described in the open literature. Published documentation includes detailed pressure sensor and hoop strain gauge measurements, as well as 2-D simulation results for comparison with our 3-D simulation results. The current GENX code will soon be able to reproduce the conditions that lead to rupture of the case. This will require a 10- to 15-day run on 512 processors for a coarse mesh study. The amount of slumping that can be followed is limited by fluids mesh motion. (December 2002)

Flexible bore inhibitor. The current GENX code should be able to simulate a rocket with a flexing inhibitor provided the deformation is not too great, perhaps a maximum deflection angle of ~45 degrees. The maximum deformation of an inhibitor in a full rocket model might be less than it is in our model of just a section, unless we carefully choose time-dependent boundary conditions. We have already generated a preliminary fluids mesh. (December 2002)

Aerodynamic loads on the case (for lab scale rockets). This topic is of particular interest to the rocket industry, and the current GENX code could easily be extended to include the air in a region outside the rocket. The fluids domain would include a virtual wind tunnel outside the case, with a time-dependent inflow speed determined by the thrust history of the rocket. The non-rigid case would have a fluid-structure boundary on the outside, as well as the usual loads from the internal gas pressure. Most of the work to be done would involve setting up appropriate boundary conditions and meshing the initial geometry. (June 2003)

Lab scale rockets from ignition to burn-out. We already can reproduce the measured pressure overshoot, but the pressure rise is much slower in the experimental data than in our simulations using the current ignition model (Rocburn v2.2). The CEM and Fluids teams will collaborate to develop an improved model of the heat flux from the gas to the propellant. To simulate propellant burn-out, we need at least a rudimentary remeshing capability to handle the gradually changing topology. We expect that the unstructured fluids and structures solvers (Rocflu and Rocfrac) along with the serial mesh repair code will be able to address this problem in a time frame of about a year. (June 2003)

ARC Rockets. Atlantic Research Corporation (ARC) has agreed to supply us with detailed design and experimental data for small guidance rockets, both as a validation test for our code and to assist them with improving their designs. The most difficult aspects of simulating these small motors are accurately modeling ignition transients and adapting the meshes automatically as the propellant burns away completely. There are flexible baffles in some of these motors that may divide and reunite the fluid domain as the system evolves, and cracks can form in the propellant and other components. An advanced 3-D remeshing capability is required for such topology changes. (December 2003)

RSRM Challenger accident (simulate flow past O-rings). This problem requires a structures solver that handles contact, so perhaps this is an item for the addendum on what could be done if we are given additional funding. The fluids domain topology changes when the gas begins to pass around the O-ring, which requires advanced remeshing capabilities. (June 2004)

RSRM complete normal burn. The ignition transients for the Space Shuttle booster are well characterized in the open literature, and we have access to extensive test data. We hope to reproduce the "waterfall" plot (amplitude of vibration as a function of frequency) at various stages during firing. The most difficult aspect of simulating the entire history of a large motor is reducing the run time.


For a fluids mesh that is fine enough to allow accurate turbulence modeling, for example, time zooming techniques under consideration by the Fluids team will be required to reach 120 seconds of physical problem time. (December 2004)

Titan IV case rupture accident. In this simulation, the pressure builds up until the case fails. The failure begins as a crack that propagates all the way through the case. The fluid domain would include the air outside the case, so hot gas would begin to leak from the crack in the case into the surrounding air. The case would tear open, relieving the pressure. In the test firing, the rocket broke up violently into many pieces, destroying the test stand, but there was no detonation. By the time we implement the required remeshing capability, we may also have an advanced material model for the propellant that includes the effect of voids and dewetting. (June 2005)

    3.3 Structures and Materials

    Group Leaders: Philippe H. Geubelle and Keith D. Hjelmstad

Faculty Investigators: Robert S. Averback, Armand J. Beaudoin, Jeff G. Erickson, Philippe H. Geubelle, Michael J. Garland, Robert B. Haber, Yonggang Huang, Keith D. Hjelmstad, Petros Sofronis, and Daniel A. Tortorelli

    Postdoctoral Research Associate: Yinon Ashkenazy

    Research Programmers: M. Scot Breitenfeld and Changyu Hwang

    Overview

The Structures and Materials group is responsible for the analysis of the solid parts of the rocket: the rocket case, unburned solid fuel, insulation and inhibitors, case segment joints, etc. The activities of this group divide into two broad areas: failure modeling and constitutive behavior of components and materials, and integrated system simulation.

Failure Modeling and Constitutive Behavior

Mesoscale Modeling of Constitutive and Failure Response of SP and Case (A. Beaudoin, E. de Sturler, P. Geubelle, and Y. Huang)

The continued development and implementation of physically-based codes needed to simulate 3-D constitutive and failure behavior of solid propellant (SP) is proposed. Three phenomena taking place in the vicinity of a crack front propagating in SP will be captured: dewetting of the larger AP particles, progressive growth of the voids, and coalescence of voids through tearing of the matrix. The simulation effort will be performed at the mesoscale and will account for the discrete microstructure of SP.

Fig. 3.4: Structures and Materials Technical Roadmap provides explicit and implicit simulation schemes and an array of materials modeling methods. (Diagram links structural simulations, Rocfrac (explicit) and Rocsolid (implicit), to constitutive and failure modeling at the macroscale and mesoscale continuum and atomic scale, spanning case modeling (MTS, multi-grain, dislocations at grain boundaries) and solid propellant modeling (Hill's potential, particulate composite, grain/binder interface).)


The analysis will involve the combination of a cohesive finite element scheme to capture the spontaneous dewetting of the particles; the virtual internal bond (VIB) method to simulate the tearing of the matrix; and packing algorithms to create realistic microstructures. The analysis will be performed under both quasi-static and dynamic conditions, and will rely on a proposed novel iterative solver that takes advantage of the slowly evolving nature of the secant or tangent stiffness matrix as cohesive and/or VIB elements progressively fail. This project involves close collaboration with research groups at Los Alamos and Sandia National Laboratories.

Micromechanics Approach to Study of Damage and Constitutive Response (P. Sofronis)

Finite element methodology and principles of mechanics of materials are used to study and understand issues governing the mechanical damage response of a solid propellant under both static and dynamic loading. We will address macroscopic propellant nonlinearity resulting from the interaction between the micromechanisms of damage at the microscale. The derived constitutive laws will be used to investigate the phenomena of shear localization and hot spot generation under dynamic loading. These phenomena take place spatially at both the nanoscale (interface cohesive laws) and the micron scale (porosity, localized-band width), and may evolve in time over picoseconds (initiation of dewetting) or a few seconds (void elongation). The importance of atomistic simulations in assessing specific material-parameter inputs to the micromechanical models will be explored. This project is part of an ongoing research collaboration on void growth with the Lawrence Livermore National Laboratory.

Atomistic Calculations of Debonding, Plastic Behavior, and Fracture (R. Averback)

This project provides an interface between atomistic simulations and continuum modeling. In achieving these goals, we will develop analytic potentials that describe interatomic interactions in and between different classes of materials: metals, ceramics, and polymers. Efforts will focus on developing appropriate reactive potentials for polymer/metal and polymer/metal-oxide interfaces and on developing a scheme to combine molecular dynamics (MD) with kinetic Monte Carlo (KMC) methods for alloy systems. Atomistic simulations are then employed to calculate the mechanical response of multiphase materials subjected to high strain conditions at various temperatures and to develop constitutive laws near interfaces. These results will provide input data for other CSAR continuum models.

    Structural System Simulation

Adaptive Version of Rocfrac (P. Geubelle and S. Breitenfeld)

Developed over the past three years, Rocfrac is a structural solver used in the GEN2 integrated code development effort. The code is based on an explicit time stepping scheme and relies on a 3-D Arbitrary Lagrangian-Eulerian formulation to capture the regression of the solid propellant. It includes a nonlinear kinematic description to account for possible large deformations and/or rotations, and contains a variety of volumetric responses, ranging from linear elasticity to the Arruda-Boyce nonlinearly elastic model used for the grain response and to a specially developed elasto-viscoplastic model for the case. The Rocfrac element library currently includes 4- and 10-node tetrahedral volumetric elements used to model the bulk response of the rocket's structural components, and 6- and 12-node cohesive elements used to capture the initiation and rapid propagation of one or more cracks in the grain and along the grain/case interface. We will continue the development and integration of this structural solver, and, in particular, address the following aspects:

  • Scientific and Engineering Advances 3.7

Develop and implement the next generation of Rocfrac, with special emphasis on the introduction of improved low-order tetrahedral elements and on dynamic mesh adaptivity

    Complete the full integration of Rocfrac within the Charm++/FE (SWIFT) Framework

Apply the integrated unstructured rocket code to various dynamic fracture problems, with special emphasis on the Titan IV grain slumping accident

Parallel Implicit Finite Element Computations (K. Hjelmstad and D. Tortorelli)

The unified, implicit, parallel, finite element program development effort will continue. Implicit time integration methods offer a significant advantage over explicit methods in that the size of the analysis time step is not limited by numerical stability. Future simulations able to model more than the first few seconds of a complex burn like the RSRM may demand implicit computations. Further, methods under development for time zooming include implicit computational methods. This work will be carried out in collaboration with the Computer Science Group's efforts on a finite element framework and numerical solvers.
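
The stability advantage can be seen on the standard linear test equation du/dt = -ku with k > 0: explicit Euler is stable only for dt < 2/k, while backward Euler damps the solution for any step size, so dt can be chosen for accuracy alone. A minimal numerical illustration:

```python
# Stability contrast on du/dt = -k*u, k > 0.
k, u0 = 1000.0, 1.0
dt = 0.01                      # 5x beyond the explicit limit 2/k = 0.002
u_exp = u_imp = u0
for _ in range(100):
    u_exp = u_exp * (1.0 - dt * k)   # explicit update: |1 - dt*k| = 9 > 1, blows up
    u_imp = u_imp / (1.0 + dt * k)   # implicit update: amplification 1/11 < 1, decays
print(f"explicit: {u_exp:.3e}   implicit: {u_imp:.3e}")
```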

Convective Burning of Cracks (P. Geubelle, C. Hwang, L. Massa, and B. Asay, LANL)

Over the past three years, a 2-D fully-integrated fluid/structure code (referred to hereafter as GEN2D) has been developed as a 2-D version of GEN2. GEN2D has played an important role in the success of its 3-D counterpart (GEN2), as it has allowed us to test the explicit fluid/structure coupling algorithm that is at the heart of the integrated code. The 2-D code has been used over the past two years to test the ability of the coupling algorithm to capture aeroelastic flutter and to conserve mass across a regressing boundary. The existence of this 2-D code is a key reason why the development and implementation of GEN2 was achieved so rapidly and successfully. GEN2D is expected to play a key role over the next five years, as new coupling algorithms, physical models, and meshing strategies are developed for the integrated code.

This project has two facets. First, we will continue an ongoing investigation of the multiphysics problem of convective burning of cracks in energetic materials in collaboration with the CEM group and the LANL Combustion group (led by B. Asay). This study will be performed both at the macroscale, where the discrete nature of the SP microstructure is neglected and homogenized constitutive and combustion properties are assumed, and at the mesoscale, where the structural solver will be combined with the 2-D version of Rocfire to simulate the process of crack propagation in a burning solid propellant. Second, GEN2D will be used to test new coupling strategies before these are used in the 3-D code. Special emphasis will be placed here on problems requiring adaptivity of the interface code (Rocface), i.e., those for which important geometrical changes take place along the interface due to substantial motion of the interface. Separation of a piece of solid propellant from the rest of the rocket core, where a new internal boundary is created spontaneously, is one class of problems to be investigated.

Spacetime Discontinuous Galerkin Methods for Coupled Simulations (R. Haber, J. Erickson, and M. Garland)

Spacetime discontinuous Galerkin (STDG) finite element methods have emerged as a powerful new class of numerical methods for solving hyperbolic problems and nonlinear conservation laws.


Research in CSAR and its sister center, the NSF Center for Process Simulation and Design (CPSD), has moved spacetime methods from blue-sky concepts to a viable simulation technique positioned to be uniquely responsive to CSAR's simulation requirements. Inherent O(N) computational complexity, rich parallel structure, support of fully unstructured spacetime grids, strong element-wise conservation properties, intrinsic high-order stability, and natural shock-capturing properties make STDG methods an ideal vehicle for high-resolution simulations. Parallel, hp-adaptive STDG implementations promise to significantly extend our capabilities for long-duration and multi-scale rocket simulations. We propose continuing research on STDG methods, leveraged by a parallel effort in CPSD, to realize and demonstrate these advantages. This program will involve new research on STDG formulations, spacetime mesh generation, visualization of spacetime data sets, and parallel implementations. The research involves mesh motion and adaptive grids, time zooming, and multiscale modeling of propellants.

    3.4 Fluid Dynamics

    Group Leaders: Robert D. Moser and S. Balachandar

Faculty: Ronald J. Adrian, Hassan Aref, S. Balachandar, Jonathan Freund, and Robert D. Moser

Research Scientists: Jiri Blazek, James P. Ferry, Andreas C. Haselbacher, Fady M. Najjar, and Bono Wasistho

    Overview

The Fluid Dynamics Group addresses system-scale SRM multiphase compressible core flow code development, as well as subscale model development relevant to the turbulent dynamics of the combustion interface. These include injection, dispersion, and combustion of aluminum droplets in the core flow; formation, dispersion, and slag accumulation of aluminum oxide particles; and flow within cracks and other defects in the propellant. Broadly viewed, Fluids Group research includes multiphysics code module development (Rocflo and Rocflu), individual physics modules (Rocturb, Rocpart, Rocsmoke, Rocrad, and Rocspecies), and fundamental research projects needed to support simulation code development (Figure 3.5).

    Multiphysics Fluid Dynamics Code Development

SRM boosters contain many geometrically complex features, such as the star grain region and the submerged nozzle. Moreover, additional geometrically complex regions may develop because of large time-dependent deformations during burning of the propellant. Examples include cracks in the solid propellant, slumping propellant segments, and even periodic motions of inhibitors left partially exposed by propellant burn back.

Fig. 3.5: Fluid Dynamics Group technical roadmap provides supporting fundamental research for integrated code components. (Diagram connects the fluids multi-physics framework to 3-D structured development (Rocflo) and 3-D unstructured development (Rocflu); to physics modules for Al droplet modeling (Rocpart), LES turbulence modeling (Rocturb), radiation modeling (Rocrad), Al oxide modeling (Rocsmoke), and chemistry modeling (Rocspecies); and to research on optimal LES, injection-driven flow, and multiphase flow.)


To simulate fluid-dynamic processes in complex geometries, CSAR is developing an unstructured-grid code called Rocflu. The simpler regions of the rocket geometry will be modeled with the block-structured Rocflo code. Simulation of one of our key validation studies, the Titan IV slumping propellant accident, may depend on the interaction among Rocflu, Rocflo, and Rocfrac (Figure 3.6).

Rocflu (A. Haselbacher)

Rocflu addresses fluid flow in complex and evolving geometric regions. It uses unstructured grids composed of arbitrary combinations of tetrahedra, hexahedra, prisms, and pyramids. The development of Rocflu is tightly coupled to that of the block-structured code Rocflo to maximize code reuse through a framework of common data structures, variable definitions, naming conventions, and coding standards. Rocflu development efforts will concentrate on issues relating to grid motion and repair, the incorporation of optimal LES stencils, and the integration of the physical modeling modules for multiphysics simulations. Research related to the code development effort will focus on the computation of stencil weights for optimal LES on unstructured grids, and the continued development of commuting filter operators for unstructured grids.

Rocflo-MP (J. Blazek)

A high fidelity simulation of a solid rocket motor based on first principles requires an accurate, fast, and numerically robust fluids code. We will continue the development of our structured, multi-block flow solver Rocflo-MP for multi-physics simulations of flows inside and outside of solid rocket motors. It is also proposed to conduct research in numerical methods in order to enable detailed and accurate simulations of the complete burn of a solid rocket motor and of possible failure scenarios. Our foci include: reducing significantly the computational time of a simulation while retaining temporal accuracy; developing a robust mesh movement/regeneration methodology; increasing the spatial resolution to allow for reliable Large Eddy Simulations (LES) inside the full motor; developing numerically robust schemes for the treatment of stiff source terms arising from chemical species, radiation, burning particles, and smoke; and including the effects of gravity, acceleration, and aerodynamic loads on the case.

    Individual Fluids Code Module Development

Rocturb (R. Moser, F. Najjar, and B. Wasistho)

A substantial fraction of the gaseous flow in a solid propellant rocket chamber is turbulent, and this turbulence affects many of the physical processes occurring in an SRM.

Fig. 3.6: Schematic interaction among Rocflu, Rocflo, and Rocfrac for the case of slumping propellant segment. Note that interface surface grids between Rocflu and Rocflo need not be matching.


It is thus of great importance to model the turbulence in the core flow. The primary turbulence modeling paradigm that has been selected for this application is Large Eddy Simulation (LES). LES has the advantage of resolving the large-scale turbulence fluctuations, which are needed in modeling the interactions of turbulence with other phenomena (e.g., particle dispersion, interaction with the combustion layer, shedding from inhibitors). Earlier simulations of simple rocket motors were performed with simple no-model LES, in which artificial dissipation is used to regularize the fluid flow equations. However, such a numerical treatment lacks a rigorous physical basis. It fails, for instance, to adequately predict transition without adjusting the level of artificial dissipation in an ad-hoc manner. The main purpose of the Rocturb module is therefore to provide physically based models for simulating turbulent flow inside a rocket motor.
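
For orientation, the classical Smagorinsky closure, the simplest physically based LES model and not the optimal LES formulation pursued here, models the deviatoric subgrid stress through an eddy viscosity computed from the filtered strain rate:

```latex
\tau_{ij}^{\mathrm{sgs}} - \tfrac{1}{3}\tau_{kk}^{\mathrm{sgs}}\delta_{ij}
  = -2\,\nu_t\,\bar{S}_{ij}, \qquad
\nu_t = (C_s \Delta)^2\,|\bar{S}|, \qquad
|\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},
```

where S̄_ij is the filtered strain-rate tensor, Δ the filter width, and C_s an empirical constant (typically 0.1 to 0.2).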

In its development and improvement, Rocturb relies on four areas of basic research. First is research on the application of turbulence models within the solid rocket simulation framework itself. This research is directly related to the turbulence phenomena that occur in the rocket flow and its simulation; results of these studies can thus be directly implemented in the Rocturb module. Second is research on a new class of turbulence models called optimal LES, in which LES models are developed by directly approximating the best possible deterministic LES model. The first use of optimal LES in an applications fluids code will be in Rocfluid through the Rocturb module. Third is research on the physics and modeling of the turbulence fluctuations that are generated by the combustion layer and its interaction with the core flow. This poorly understood process sets the boundary conditions for the turbulence in the core flow. Finally, there is research on the LES of multiphase flows. The particles in the rocket are subgrid in size; they therefore influence and are dispersed by the subgrid flow fields. This interaction must be modeled, and represents a coupling among the Rocturb, Rocpart, and Rocsmoke models.

Rocpart (F. Najjar, S. Balachandar, and J. Ferry)

Modern solid-propellant rocket motors have aluminum particles added to the mix, and their inclusion contributes up to 30% of the heat generation. Within the present multi-physics framework, these Al droplets and the large oxide caps are taken into account through Rocpart. Several enhancements to the present Rocpart are proposed: incorporation of improved force laws, heat and mass transfer correlations, and burn-rate models to better account for flow conditions within the rocket; an improved injection model; modeling of oxide smoke entrapment resulting in slag accumulation in the submerged nozzle; modeling of the effect of subgrid turbulence on droplet evolution; improved Lagrangian-Eulerian coupling; and thorough verification and validation. These enhancements require fundamental investigation, and in particular we propose continuing our very productive study of an isolated particle in complex ambient flow.

Rocsmoke and Rocspecies (J. Ferry, S. Balachandar, and F. Najjar)

Aluminum oxide smoke is an important by-product of combustion in modern aluminized propellants, and it impacts the flow physics inside the rocket motor in several critical ways. Smoke modulates the radiative transfer of energy, it gets trapped in the submerged nozzle as slag, and it can represent over 20% of the mass inside a motor. This mass exhibits variations of concentration that are far more complex than the variations in gas density. Therefore we propose to continue our fundamental multiphase flow research as a means to assess, develop, and improve continuum models of multiphase flow.


Specific areas of focus are the continued development of the Equilibrium Eulerian method; internal boundary conditions (allowing different methods to be used in adjacent regions); compressibility effects (interaction of smoke with acoustic and shock waves); interphase coupling; slag dynamics; collision and diffusion modeling; and LES modeling for smoke. This research encompasses the development of methods for handling species, especially taking into account real gas effects.

    Fundamental Supporting Fluid Dynamics Research

Experimental Studies of Rocket Fluid Mechanics (R. Adrian)

We propose continuing an experimental program to provide physical data on the fluid mechanics of turbulent core flow, combustion, and two-phase flow in solid fuel rockets. These data support the development of physical models needed for the computational fluid dynamics formulations and provide validation of computations of basic rocket configurations. We employ small-scale models and rocket flow simulations that allow phenomena important to the numerical simulations to be isolated and studied in well-defined conditions and simple geometries amenable to computational development. The studies include investigations of the sensitivity of the core flow to the detailed structure of the wall velocity boundary condition, detailed measurements of the velocity field of the hot exhaust plume, and construction and testing of a standard rocket model whose fuel can be modified to study characteristics of turbulence, two-phase flow, and combustion in the burning exhaust plume.

Coagulation and Fragmentation of Al Particles and Ash (H. Aref and S. Balachandar)

We are studying equations describing coagulation and fragmentation phenomena with a view to applying the results to the aluminum particles used in solid-rocket fuels and the ash particles produced once these particles have burned. Modeling of the size and mass distribution of fuel and ash particles is an important input to the global simulation codes being constructed. This research is one element of a broader thrust within the CSAR particle group.

Particle Dynamics in High-Speed Flows (H. Aref)

The dynamics of particles advected by a flow are only known approximately, in general, although they are known with considerable accuracy in the limits of irrotational flow and when the Reynolds number of the relative motion of particle and fluid is small. Much of the literature ignores effects from particles being non-spherical and from the center of mass and the centroid of the particle not coinciding. We propose to gain a thorough understanding of particle motions in the types of flow that prevail in the solid booster rocket by considering the Kirchhoff-Kelvin equations and modifications thereof. These equations should enable us to achieve insights into the motion of irregular objects, both homogeneous objects of irregular shape and objects of regular shape with an inhomogeneous mass distribution; achieve a parametrization of the effects of particle-flow interactions in an imposed, essentially unidirectional flow; and establish a parametrization of particle-particle interaction effects. These issues are relevant as a basis for modeling individual particle motions in the rocket, particle interactions and collisions, and processes leading to coagulation of particles and build-up of ash and slag.
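
For reference, in the small-Reynolds-number limit mentioned above, a rigid sphere of mass m_p and diameter d_p in a fluid of viscosity μ obeys the familiar Stokes drag law,

```latex
m_p \frac{d\mathbf{v}}{dt} = 3\pi\mu d_p\,(\mathbf{u}-\mathbf{v}) + m_p\mathbf{g},
\qquad \tau_p = \frac{\rho_p d_p^2}{18\mu},
```

where u is the local fluid velocity and τ_p the particle response time. Nonsphericity, an offset center of mass, and finite-Reynolds-number effects, the subject of this project, add torque, history, and lift contributions absent from this idealization.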

Turbulence Research (R. Moser, R. Adrian, J. Freund, B. Wasistho, and A. Haselbacher)


The flow inside an SRM is of very large Reynolds number and is therefore turbulent. The turbulence affects the transport of aluminum and aluminum oxide particles, the interaction of the gas flow with the combustion layer and with solid components such as inhibitors and the nozzle, heat transfer to solid surfaces (e.g., the nozzle), and many other fluid interactions. In the Rocfluid code, turbulence is modeled using Large Eddy Simulation (LES), and several LES models have been incorporated and tested in the Rocturb module. There are, however, a number of areas in which further development and research are required.

Continued Rocturb development: Continued development is planned to incorporate both more models and new capabilities (e.g., unstructured grid support, zonal modeling).

New model development and implementation: A new class of LES models, called optimal LES models, is being developed. The first implementation of an optimal LES model for a real application is planned for Rocturb within the year. Further development to support more complex flow physics and better statistical models is also planned.

Combustion-driven turbulence fluctuations: Among the driving forces for the development of turbulence are the fluctuations introduced at the combustion layer and the shear instabilities produced by long-wave axial acoustic modes. Using information from Rocburn and hydrodynamic simulations, the physics of this interaction will be further investigated, and physics-based models for the injected turbulence fluctuations will be developed and implemented in Rocturb or as a boundary condition module.

Multiphase LES: Subgrid turbulence interacts with aluminum and oxide particles in the flow. The particles act to produce subgrid turbulence, and the subgrid turbulence disperses the particles. Models are needed for both of these effects, and they must be integrated with Rocturb, Rocpart, and Rocsmoke.

Validation data: Validating the turbulence modeling and simulation in a solid rocket is difficult because the hostile flow environment makes detailed measurements difficult. Our LES models must thus be validated against simple laboratory experiments and detailed (DNS) simulations. The ongoing work to produce the data needed for validation and other purposes is planned to continue.

    3.5 Combustion and Energetic Materials

    Group Leader: M. Quinn Brewster

Faculty: M. Quinn Brewster, John D. Buckmaster, David M. Ceperley, Herman Krier, Richard M. Martin, Mark Short, and D. Scott Stewart

Research Scientists and Programmers: Thomas L. Jackson, Kung-Chyun Tang, and Xiaojian Wang

    Postdoctoral Research Associate: Luca Massa

    Overview

Combustion of solid propellant composed of energetic materials is the driving thermomechanical force in the operation of a solid rocket motor. Accurate modeling and simulation of the combustion and resulting regression of solid propellants entails research activities in several areas, including the description and propagation of the propellant combustion interface, modeling of solid propellant flames, combustion instability analysis, and constitutive modeling of energetic materials.



The objective of our work on combustion of ammonium perchlorate (AP) solid propellants is to develop a simulation capability that will allow reliable prediction of macroscopic combustion behavior, particularly burning rate, from fundamental material properties and formulation parameters, including AP particle size distribution. Our approach is based on characterizing the microscopic behavior of a burning propellant in the critical, burning rate-determining region near the surface of the propellant and developing microstructural combustion models that can simulate both micro- and macro-burning behavior.

Significant progress has been made in resolving unsteady 1-D issues, except for ignition. Present and near-term multidimensional efforts are and will continue to be on 2-D capability with laminate propellants, focusing on issues of non-premixedness of solid fuel and oxidizer, partially premixed diffusion flame structure, and coupling with the solid components. A key part of our approach is a strong emphasis on physical validation of modeling assumptions and on the identification and explication of new phenomena via simulation. The long-term objective is to develop a 3-D simulation tool that can successfully represent, quantitatively, the effects of AP particle size as well as environmental parameters on propellant burning response, both steady and unsteady, something that has eluded propellant combustion modelers for decades.

Ignition and Radiation in AP Propellants (M. Brewster and K. Tang)

Ignition of AP-composite propellants will be simulated computationally. The simulation will use a modified Zeldovich-Novozhilov (ZN) theoretical approach, compatible with the nonlinear dynamic burning model (Rocburn) that has already been developed and validated. The ignition and nonlinear dynamic burning package will form a unified tool for simulating the entire ignition and propellant combustion process. Radiative energy will be considered as the ignition source first, due to the strong role of radiation from burning metal in pyrotechnic igniters. Convective heating will be added after the radiant heating mode is developed and validated. The effects of radiant flux level, spectral energy distribution, and propellant optical properties on ignition delay will be investigated. The model will be used to predict ignition behavior of AP and AP-composite propellant and will be validated with experimental results for AP and space shuttle propellant. Upon validation, the model will be used to simulate ignition and motor pressurization in lab-scale motors, tactical motors, and large-scale space boosters. We also propose to simulate thermal radiation coupling with the emitting, absorbing, and scattering species (molecular gases and condensed phase particulates) in the core flow in order to include the effect of radiation on propellant ignition and burning rate in integrated motor simulations.
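
A minimal sketch of the solid-phase energy balance underlying such radiant ignition calculations, assuming in-depth absorption of a surface-normal radiant flux q_r with absorption coefficient a (the actual ZN-based formulation is considerably more elaborate), is

```latex
\rho_s c_s \frac{\partial T}{\partial t}
  = k_s \frac{\partial^2 T}{\partial x^2} + a\,q_r\,e^{-a x}, \qquad
T(x, 0) = T_0,
```

with ignition declared when the surface temperature T(0, t) reaches a critical value. The resulting ignition delay then depends on the flux level and the propellant optical properties, exactly the parameters to be studied.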

Fig. 3.7: Goal of CEM Group is integrated simulation of propellant combustion layer. (Diagram shows a roughly 400 μm slice of solid propellant, with AP grains, binder, and aluminum particles beneath a melt layer and combustion layer; burning Al particles and flame in the outflow region; and mass injection at the regression rate.)


Heterogeneous Propellant Combustion (J. Buckmaster, T. Jackson, M. Short, L. Massa, and X. Wang)

The development of propellant morphology modeling, homogenization strategies to account for the smallest AP particles, and a phase-coupled 3-D unsteady propellant burning code now permits an attack on the central problems of propellant combustion. A broad spectrum of problems will be addressed, including improved surface kinematics (presently we can only account for surfaces described by single-valued functions); the eruption, melting, agglomeration, and detachment of aluminum particles, and the effect of this on the propellant combustion; a combined study with the Fluids Group of the effect that the combustion code Rocfire has on the turbulent chamber flow, and the effect that the turbulent chamber flow has on combustion (erosive burning); and a numerical study of the response of the burning rate to acoustic disturbances for the purposes of validation.

Simulation of Condensed Phase and its Kinetics (D. S. Stewart)

A comprehensive modeling effort will be carried out to study and develop accurate descriptions of the condensed phase processes in energetic materials used in solid propellants and explosives. The continuum modeling reflects the simultaneous appearance of delineated solid, liquid, and gas phases, phase boundary regions, and exothermic (or endothermic) chemical reaction. Submodels that can describe large deformation for elastic and viscoelastic solid behavior will be used in collaboration/consultation with the Structures and Materials group. A simplified chemistry scheme for condensed phase kinetics will be employed. Constitutive information will be sought from molecular dynamics (MD), especially to model the kinetics of phase change. Specific and representative problems include multiphase simulation of monopropellant flames such as AP and HMX. Simulation tools include existing multi-material codes like GIBBS2D and recently built combustion solvers. We will use high order methods and level set representation of material interfaces to the extent possible.

Ab Initio Methods for Calculation of Energetic Material Properties (D. Ceperley and R. Martin)

We propose developing new ab initio methods to calculate properties of materials at the high temperatures and densities relevant to rocket engines. We will use a combination of density functional and quantum Monte Carlo methods, both of which are needed for realistic studies. The former are the most widely used methods in physics and chemistry, and the latter are the only methods capable of high accuracy and reliability for a wide range of conditions at high temperatures and densities. The methods will be benchmarked on relevant energetic materials. Our recent work supported by CSAR is making important steps toward these goals, including simulations of hydrogen compared to shock wave data from LLNL and phase transformations of nitrogen at high pressure. Active collaborations with scientists at LLNL on these topics are ongoing. Our future research will be enhanced by collaborating on energetic materials projects sponsored under a MURI grant at Oklahoma State University (D. L. Thompson, PI). The grant is "Accurate Theoretical Predictions of the Properties of Energetic Materials," and the fact that we were approached to join this work is recognition of the importance of ab initio simulations for understanding energetic materials. However, the funding is not sufficient by itself to make rapid progress on these difficult topics.


    3.6 Computer Science

    Group Leaders: Michael T. Heath and Laxmikant V. (Sanjay) Kale

Faculty: Eric de Sturler, Jeff G. Erickson, Michael J. Garland, Michael T. Heath, Laxmikant V. Kale, Daniel A. Reed, Paul Saylor, and Marianne Winslett

Research Scientists/Programmers: Michael T. Campbell, Robert A. Fiedler, Xiangmin (Jim) Jiao, Orion S. Lawlor, and Johnny C. Norris

    Postdoctoral Research Associates: Damrong Guoy and Joerg Liesen

    Overview

Two teams, Computational Mathematics and Geometry, and Computational Environment, address research in Computer Science. In addition, the tools developed by the Visualization subteam in Computational Environment are critical CSAR resources. Work in Computational Mathematics and Geometry focuses on the areas of linear solvers, mesh generation and adaptation, and data transfer between diverse discretizations. The target of our Computational Environment team is to provide the broad computational infrastructure necessary to support complex, large-scale simulations in general, and rocket simulation in particular. Areas of research include parallel programming environments, parallel performance analysis and tools, parallel input/output, and visualization.

    Computational Mathematics and Geometry

Krylov Subspace Methods and Preconditioners for Very Large Scale Problems (E. de Sturler and P. Saylor)

As we simulate increasingly complicated problems, we must solve ever-larger linear systems. ASC's intention is to solve systems of more than a billion unknowns on massively parallel computers. In addition, these systems become increasingly harder to solve: more ill-conditioned, non-Hermitian, indefinite, and highly non-normal. This calls for robust methods like GMRES, GCROT, or variations; however, these are expensive. We would like to use Lanczos-type methods, which are cheap per iteration and have low memory requirements, such as BiCG and methods derived from it (QMR, TFQMR, and BiCGSTAB). However, these methods often lack robustness and converge slowly or not at all.

In order to obtain methods with low cost per iteration (memory and CPU), fast convergence, and robustness, we propose to derive methods that combine the virtues of both classes. We have done some work on this in the past, and we are currently working on ways to make Lanczos-type methods (BiCG and relatives) more robust and faster converging.
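
The trade-off between the two classes is easy to demonstrate on a small nonsymmetric system using standard library implementations. The sketch below (SciPy's GMRES and BiCGSTAB on a hypothetical convection-diffusion-like tridiagonal matrix) illustrates only the cost/robustness contrast, not the new methods proposed here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres, bicgstab

n = 500
# Nonsymmetric tridiagonal system, loosely convection-diffusion-like.
A = diags([-1.3, 2.0, -0.7], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x_g, info_g = gmres(A, b)       # minimizes residual over a growing Krylov basis
x_b, info_b = bicgstab(A, b)    # short recurrences, low memory per iteration
print(info_g, np.linalg.norm(A @ x_g - b))   # info == 0 means converged
print(info_b, np.linalg.norm(A @ x_b - b))
```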

Another important way to get more robust methods and faster convergence is to develop better preconditioners, especially for large parallel computers.

Fig. 3.8: CS Group drives integrated code development. (Diagram shows Rocdriver and Roccom coordinating the Rocflo, Rocflu, Rocfrac, Rocsolid, and Rocburn modules, with Rocface and Rocnum handling interfaces, Rocpanda handling parallel I/O, and Gridgen, Makeflo, Tetmesh, and Metis providing meshing and partitioning.)


We propose to analyze and further develop several classes of parallel preconditioners based on our previous work and that of others.

Mesh Adaptation and Refinement (D. Guoy and M. Heath)

Integrated, multi-component simulations require a wide range of meshing capabilities. Perhaps the greatest challenge is dealing effectively with dynamically changing geometries and correspondingly changing meshes. Integrated rocket simulations require a variety of element types (hexahedral and tetrahedral), mesh types (volume meshes and surface meshes), and levels of adaptation (from minor mesh motion to substantial geometric change requiring major mesh repair or complete remeshing). We will develop techniques for addressing each of these cases. We have already developed effective techniques for smoothing volume meshes subject to gradual change, such as in propellant burning, and we plan to extend these to smoothing surface meshes, such as the interface between fluid and solid regions, where relocated nodes must be constrained to lie on the (generally curved) surface. For more substantial geometric changes, such as flexible inhibitors or propagating cracks, we will develop further our mesh repair capabilities based on multiple steps of node movement, mesh coarsening, and mesh enrichment. Finally, to deal with more drastic geometric change, such as propellant burnback over an extended period, we plan to develop new capabilities for generating an entirely new mesh based in part on our previously developed techniques for sink insertion and sliver removal.
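
As a minimal illustration of the node-movement primitive that such smoothing techniques build on, the sketch below applies simple Laplacian relaxation, moving each interior node toward the centroid of its neighbors. CSAR's production smoothing is more sophisticated than this, and all names here are hypothetical.

```python
import numpy as np

def laplacian_smooth(nodes, neighbors, interior, n_iter=10, relax=0.5):
    """Relax each interior node toward the centroid of its neighbors;
    boundary nodes stay fixed. Illustrates only the node-movement step."""
    for _ in range(n_iter):
        new = nodes.copy()
        for i in interior:
            centroid = nodes[neighbors[i]].mean(axis=0)
            new[i] = (1 - relax) * nodes[i] + relax * centroid
        nodes = new
    return nodes

# Tiny 1-D "mesh": 6 nodes on a line, each interior node linked to its
# two neighbors; smoothing equalizes the initially uneven spacing.
nodes = np.array([[0.0], [0.1], [0.15], [0.8], [0.9], [1.0]])
neighbors = {i: [i - 1, i + 1] for i in range(1, 5)}
print(laplacian_smooth(nodes, neighbors, interior=range(1, 5)).ravel())
```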

Mesh Correlation, Data Transfer, and Interface Propagation (M. Heath and X. Jiao)

We will continue our work on the issues of correlating multiple meshes, data transfer between meshes, and interface propagation. These problems are directly related to the coupled simulation code. Mesh correlation and data transfer are needed in many situations that involve multiple meshes, such as the interface between fluid and solid domains and a dynamic solver involving adaptive remeshing. We have achieved reasonable success on mesh correlation and data transfer for surface meshes. We propose further work on adaptivity, parallelization, robustness, and generalization to more types of meshes (for example, volume meshes). The problem of interface propagation is to track the motion of the interface. The current propagation scheme in GEN2 is fairly ad hoc and is problematic at ridges and corners. We have developed a new approach in two dimensions and propose extending this method to three dimensions. We propose tackling the problem in two steps. First, we will develop a limited algorithm assuming fixed connectivity for quick prototyping and easy integration. Second, we will develop a more general algorithm that allows full adaptivity of the interface for better accuracy.
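
The sketch below shows a naive fixed-connectivity propagation step for a closed 2-D polyline: each node moves along an averaged node normal. Such simple schemes behave well on smooth interfaces but misbehave at ridges and corners, which is precisely the deficiency motivating the new propagation algorithms; this illustrates the problem setting, not the proposed method.

```python
import numpy as np

def propagate_polyline(pts, speed, dt):
    """Move each node of a closed 2-D polyline along an averaged node
    normal, keeping connectivity fixed (naive front propagation)."""
    edge_prev = pts - np.roll(pts, 1, axis=0)
    edge_next = np.roll(pts, -1, axis=0) - pts
    def normal(e):                      # rotate edge tangent by 90 degrees
        n = np.stack([e[:, 1], -e[:, 0]], axis=1)
        return n / np.linalg.norm(n, axis=1, keepdims=True)
    node_n = normal(edge_prev) + normal(edge_next)   # average adjacent edge normals
    node_n /= np.linalg.norm(node_n, axis=1, keepdims=True)
    return pts + dt * speed * node_n

# Regressing circular "burning surface": radius shrinks by ~|speed|*dt.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
moved = propagate_polyline(circle, speed=-0.1, dt=0.1)
print(np.linalg.norm(moved, axis=1).mean())   # ~0.99
```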

    Computational Environment

Software Support for System Integration, Error Estimation, and Error Control (E. de Sturler and X. Jiao)

An important effort of computer science is to develop software tools for easing the development and integration of physical components. We propose to develop a set of new tools for error estimation, error control, and stability analysis. These tools will provide a means to monitor the accuracy and stability of both the individual applications and the integrated code, so that the accuracy and stability of a simulation can be established, or violations of such assumptions can be identified. This will make it easier for application developers and users to perform code verification and debugging. It can also serve as a basis for solution-based mesh adaptation and time zooming. Much of this effort will take advantage of our previous work and build on our existing software framework. We have developed a set of software tools for supporting intercomponent communication and orchestration, including Roccom, Rocnum, and Rocface. We also propose the continuing development and maintenance of these tools to meet the new challenges posed by the more dynamic behavior of the rocket simulation code in the coming years.
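As one example of the kind of runtime check such tools could provide, the sketch below flags an imbalance in a quantity reported by the two sides of a coupled interface; the function, field names, and tolerance are hypothetical.

    # Hypothetical runtime monitor: warn when fluxes integrated over the two
    # sides of a coupled interface disagree beyond a tolerance.
    import numpy as np

    def check_interface_balance(flux_a, flux_b, face_areas, tol=1e-6):
        total_a = np.dot(flux_a, face_areas)   # integrated flux, side A
        total_b = np.dot(flux_b, face_areas)   # integrated flux, side B
        imbalance = abs(total_a - total_b) / max(abs(total_a), 1e-30)
        if imbalance > tol:
            print(f"WARNING: interface imbalance {imbalance:.2e} exceeds {tol:.0e}")
        return imbalance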

AMPI and Component Interface Techniques (L. Kale and O. Lawlor)

The integrated codes that we propose to develop arise from multiple research groups and development teams. Each will be parallel, will have dynamic behavior, and will need to communicate data to other modules, in addition to the internal communication within each parallel module. This project proposes further developing AMPI, an adaptive implementation of MPI, and an orchestration and interface framework to help accomplish these objectives. The AMPI system currently provides automatic load balancing, automatic checkpointing, and the ability to communicate across independent modules for MPI programs. However, porting MPI codes to AMPI still requires some effort, notably in writing pack-unpack functions and in collecting global variables. We aim to fully automate this process, so that no extra work needs to be done by an MPI programmer. Further, AMPI itself will be enhanced by making the implementation more comprehensively standard-conforming (including some MPI2 features) and by improving its communication efficiency. An orchestration and interface framework will be developed that allows each component to publish its boundary data without being explicitly aware of which other modules will use it, and correspondingly to use complementary boundary data coming from other modules in an equally opaque manner.
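The publish/use pattern in the last sentence can be pictured with the toy registry below; the class and method names are invented for illustration and are not the actual framework API.

    # Toy illustration of opaque publish/use of boundary data between modules.
    class BoundaryRegistry:
        def __init__(self):
            self._data = {}

        def publish(self, name, data):
            # Producer exposes data without knowing who will consume it.
            self._data[name] = data

        def use(self, name):
            # Consumer obtains complementary data equally opaquely.
            return self._data[name]

    reg = BoundaryRegistry()
    reg.publish("fluid/interface_pressure", [101325.0, 101400.0])  # fluid module
    traction = reg.use("fluid/interface_pressure")                 # solids module
    print(traction)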

Component Frameworks (L. Kale and O. Lawlor)

Although the parallel application modules being developed for rocket simulation, as well as for other physical simulations, employ a large number of different algorithms, discretizations, and numerical methods, they all rely on a few basic data structures, such as particles, structured grids, and unstructured meshes. Parallel implementations of each type of code, when attempted from scratch, require a significant programming effort beyond that needed for sequential programming. Further, adaptive refinements present a significant challenge for each data structure: Adaptive Mesh Refinement (AMR) for structured grids; unstructured mesh refinements and changes, such as insertion of cohesive elements; and tree rebuilds for particles. We are developing component frameworks that dramatically reduce the effort needed to use these algorithms in a parallel program by providing abstractions that encapsulate or hide the parallelism in the runtime system. The frameworks include the MBlock framework for structured-grid-based applications, with automatic handling of communication for curvilinear geometries; an AMR extension to MBlock; an unstructured-mesh framework with applications in structures and finite-volume methods; and a particle framework. Work on the last two will leverage support from other grants, as well as CSAR funding. The proposed work includes both new capability development and help with integrating the frameworks into mainstream Roc* codes.

Load Balancing and Communication Optimizations (L. Kale and O. Lawlor)

Current and future CSAR simulations will have dynamic behavior and will run on large next-generation machines, where communication is expensive relative to computation. We plan to develop techniques, including dynamic load balancing and communication optimizations, that will automatically improve the performance of CSAR simulations. Our ideas are based on two principles: virtualization and the principle of persistence. Virtualization is a technique we have been exploring for the past decade in the context of the Charm++ system. With it, an application developer focuses on decomposing an application into many parallel chunks, but is freed from deciding which processor will run each chunk and how the communication between them is implemented. This gives the runtime system the flexibility (and the challenge) to change the mapping of chunks to processors at runtime; to do this, we have implemented several measurement-based load-balancing strategies. We plan to improve these strategies for next-generation parallel machines, where we must take the network topology into account to minimize communication bandwidth, and employ fully distributed strategies, in contrast to current centralized decision making, to retain scalability. Communication between chunks also needs to be optimized, especially when there are multiple communicating entities (chunks) per processor.
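A minimal sketch of a measurement-based strategy follows: chunks are greedily reassigned using loads measured in the previous phase, on the persistence assumption that past cost predicts future cost. The function is illustrative, not one of the actual Charm++ strategies.

    # Greedy measurement-based load balancing sketch.
    import heapq

    def greedy_rebalance(chunk_loads, nprocs):
        """Map chunks to processors so measured loads are roughly even."""
        heap = [(0.0, p) for p in range(nprocs)]    # (accumulated load, proc)
        heapq.heapify(heap)
        mapping = {}
        # Placing the heaviest chunks first yields a tighter balance.
        for chunk, load in sorted(chunk_loads.items(), key=lambda kv: -kv[1]):
            total, proc = heapq.heappop(heap)       # least-loaded processor
            mapping[chunk] = proc
            heapq.heappush(heap, (total + load, proc))
        return mapping

    print(greedy_rebalance({"a": 5.0, "b": 3.0, "c": 2.0, "d": 2.0}, 2))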

Intelligent Performance Tuning (D. Reed and M. Campbell)

During the past five years, the Pablo group and CSAR have had a mutually beneficial relationship. We have used the Pablo performance toolkit to identify and correct problematic CSAR code. Likewise, the CSAR code has helped identify the requirements for effective performance tools, and the Pablo toolkit has evolved, via funding from other sources, to fulfill those requirements. Our goal is to strengthen and build on this symbiotic relationship. The Pablo performance toolkit will continue to evolve in response to experience with performance analysis of CSAR applications. However, based on our experience to date, we propose focusing software development and enhancements on scalability analysis, offline performance tuning and guidance, and online dynamic adaptation and multiversion code selection.

Parallel I/O and Data Migration (M. Winslett)

Our research group supplies the Rocpanda parallel I/O library used by GENX, along with Rocpanda's active buffering facilities that keep GENX I/O costs low. We will continue to meet GENX I/O needs while enhancing the facilities currently available. In particular, we propose to add integrated support for compressed data migration in GENX, improve Rocpanda's I/O load balancing facilities, develop data declustering strategies for the parallel version of Rocketeer, and continue our work on self-tuning parallel I/O systems. As GENX adopts adaptive grids and time zooming, these features will raise new I/O load imbalance issues, which will need to be addressed in the coming years.
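The essence of active buffering, overlapping output with ongoing computation, can be sketched with a background writer thread, as below; this is a schematic of the concept only, not Rocpanda's implementation.

    # Schematic of active buffering: a background thread drains snapshots to
    # disk while the solver keeps computing.
    import queue, threading

    def writer(buf):
        while True:
            item = buf.get()
            if item is None:                  # sentinel: clean shutdown
                break
            fname, data = item
            with open(fname, "w") as f:
                f.write(data)                 # I/O overlaps computation

    buf = queue.Queue(maxsize=8)              # bounded buffer caps memory use
    t = threading.Thread(target=writer, args=(buf,))
    t.start()
    for step in range(3):                     # "compute" loop hands off output
        buf.put((f"snapshot_{step}.txt", f"state at step {step}\n"))
    buf.put(None)
    t.join()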

    Visualization

Rocketeer Visualization Suite (R. Fiedler and J. Norris)

We will continue to enhance the Rocketeer suite of visualization tools. The most useful tool in the Rocketeer suite for very large, remote data sets is the client/server version (Apollo/Houston), but the serial workstation version (Rocketeer) and the MPI parallel batch-mode version (Voyager) share the same code base. Support for large data sets will be improved by limiting the number of polygons in images sent to the desktop client. Numerous useful features will be added, including fly-through animations, streamlines, support for tensors, data manipulation capabilities, support for advanced displays, and X-Y plots along arbitrary lines. Additional reader plug-ins will be developed to enable input of new data formats, including HDF5 and streaming data, which would allow on-the-fly visualization of a running simulation. Scalability of the MPI parallel server Houston will be improved by including dummy cell values in each data block to reduce communication significantly.

Rocketeer surpasses many visualization tools in its ability to handle a wide variety of grids on which data may be defined. The grid may be non-uniform, structured or unstructured, and multiblock. Rocketeer can display multiple data sets for multiple materials from multiple files in a single image. It can perform the same set of graphics operations automatically on a series of data files to produce frames for animation. Voyager is a fully featured MPI parallel batch-mode version of Rocketeer that takes advantage of a parallel platform by processing a different snapshot concurrently on each CPU. Voyager takes as input a text file, saved during an interactive Rocketeer session, which specifies the camera position, a list of graphics operations to perform, and a list of all HDF files to process.

    3.7 Verification and Validation

Team Coordinator: Mark D. Brandyberry

Our code development effort requires the coding, testing, and integration of multiple physics codes, along with codes for mesh matching, parallel I/O, job management, etc. The effort is multi-language and involves many developers. It is the goal of the Software Engineering (SE) and Verification and Validation (V&V) efforts to ensure that the computer code products resulting from CSAR efforts are of professional quality, fully tested, and well documented. The proposed work on SE and V&V for the second five years of CSAR will be discussed as separate topics. Software engineering is discussed in the Code Legacy section, below.

Activities that fall under the SE or V&V areas include code and documentation configuration management, various forms of testing (unit, regression, verification, validation, etc.), code review, build management, and code design. These activities are often not standalone; they interact with one another. As an example, a test case designed to validate a portion of the integrated code can also be used as a regression test case if it has an appropriate runtime.

    Verification

Verification is making sure the code solves the equations correctly. The goal of the effort is to ensure that the results of simulations performed with CSAR codes are consistent and accurate as compared to the equations and models that are built into the code(s). Documentation review and code review are important parts of the verification of the codes in the CSAR integrated code suite and will be continued and enhanced. To characterize the accuracy of CSAR codes in comparison to their models, methods exist to estimate result (grid) errors both locally and globally. We propose that research be initiated on an error estimation framework: a set of tools for performing error analysis of grid-based computer codes. The framework would be callable from these codes and would give developers consistent and easily used tools to verify the computational accuracy of their codes. This framework may be part of a larger effort that would include analyses to support grid refinement. Further work on development of verification problems must be performed to generate analytic results with which to compare verification runs of individual physics codes. It is important to ensure that each component code be verified before attempting to validate the integrated code. Methods such as manufactured solutions will also be investigated and used as appropriate to generate these problems. For the integrated code, methods for computing values such as mass conservation across the solid/fluid interface will be extended.
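For readers unfamiliar with manufactured solutions, the sketch below shows the core of the method for a one-dimensional diffusion operator: pick an exact solution, derive the source term that makes it satisfy the equation, feed that source to the code, and compare. The solution, diffusivity, and operator here are arbitrary choices for illustration.

    # Method of manufactured solutions, 1-D diffusion: -k u'' = f.
    import sympy as sym

    x = sym.symbols("x")
    u_exact = sym.sin(sym.pi * x)              # chosen "manufactured" solution
    k = 0.1                                    # illustrative diffusivity
    f_expr = -k * sym.diff(u_exact, x, 2)      # source that makes u_exact exact

    f = sym.lambdify(x, f_expr)                # source term to feed the solver
    u = sym.lambdify(x, u_exact)               # exact values for error norms
    print(f(0.25), u(0.25))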

    Validation

Validation, on the other hand, is making sure the model solves the correct equations. Thus, it is important to be able to compare the results of the code(s) against known physical results. Several problem sets are currently available (Titan IV, lab-scale rocket), but more need to be developed (other lab-scale geometries, Space Shuttle RSRM, etc.). Collection and analysis of appropriate data is a large effort, as is constructing the runtime models for these problems.

Running any of these physical problems requires a large amount of computer time and generates many gigabytes of output data. Currently the Rocketeer visualization tool is available to view the resulting data sets, but no tool is available to analyze the raw data or to compare two data sets. We propose that a tool be constructed to enable the extraction of arbitrary data points, sets, etc. from CSAR code output, and that the new tool be able to compare two ostensibly similar data sets to assess their convergence. This tool would be available on demand and would also interface with the Roctest automated test suite to assess the results of the periodic regression tests.
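The comparison half of such a tool might reduce to something like the sketch below, which reports difference norms between two sampled fields; the function name, norms, and tolerance are assumptions for illustration.

    # Sketch of a data-set comparison utility: difference norms of two fields.
    import numpy as np

    def compare_fields(a, b, rtol=1e-3):
        a, b = np.asarray(a), np.asarray(b)
        diff = a - b
        l2 = np.linalg.norm(diff) / np.sqrt(diff.size)   # RMS difference
        linf = np.max(np.abs(diff))                      # worst-case difference
        rel = linf / max(np.max(np.abs(a)), 1e-30)       # scaled to field size
        print(f"L2 = {l2:.3e}, Linf = {linf:.3e}, relative = {rel:.3e}")
        return rel <= rtol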

A series of grid-convergence studies using a lab-scale rocket has begun and will continue. Grid-convergence studies on other problems and at other scales will follow. Grid-convergence studies for models based on analytical results should also be performed as verification activities. Code uncertainty and sensitivity to model parameters are also important. Because, in general, the CSAR codes take too long to run for a full statistical uncertainty analysis, we propose researching methods that can estimate the uncertainties in code results without years of run time.
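The arithmetic behind a grid-convergence study is standard Richardson extrapolation; the sketch below estimates the observed order of accuracy and a grid-converged value from results on three systematically refined grids (the sample numbers are invented).

    # Observed order of accuracy from three grids with refinement ratio r.
    import math

    def observed_order(f_coarse, f_medium, f_fine, r=2.0):
        p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
        # Richardson-extrapolated estimate of the grid-converged value.
        f_star = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        return p, f_star

    print(observed_order(0.9710, 0.9921, 0.9975))   # ~second-order convergence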

Many of the simulation demonstrations listed in Section 3.2 will also serve as validation tests. In particular, we either already have or expect to obtain detailed test data for a number of lab-scale rockets and small guidance motors for comparison with our simulation results. In addition, under our Space Act Agreement with NASA Marshall Space Flight Center in Huntsville, Alabama, we have received access to extensive design and test data for the Space Shuttle RSRM. Such test data will enable detailed validation of our integrated simulation results. Further, laboratory experiments on our own campus will be used in validating individual component models in propellant combustion and turbulence.


    4 Outreach and Interaction with NNSA/DP Laboratories

    4.1 Center-NNSA/DP Interactions

Center personnel have traveled extensively and have been involved in a large number of technical and informational meetings. These included meetings intended to explore rocket science and technology, identify technical collaborators, describe the ASCI/ASAP program, and establish relationships among Center investigators, DOE lab scientists, and industry leaders. Individual CSAR senior investigators and technical staff have traveled to DOE DP labs to serve on ASCI/ASAP panels, to participate in ASAP-wide workshops (e.g., materials and computational environment), to offer research seminars and technical interaction, to receive training on the ASCI computational resources, and to discuss ASCI resource issues with the CRT. We will continue to work closely with lab representatives on the TST to identify opportunities for detailed interaction. During the first five years of the program, we have documented hundreds of CSAR-lab interactions in the following areas:

CSAR researcher visits to DOE Labs
ASC PI Meetings
DOE Panel Reviews
ASC visits and seminars at CSAR
Faculty sabbaticals at DOE Labs
ASC Workshops
Code Sharing
Software/hardware visits from DOE Labs
Joint research
Detailed technical discussions

In addition to traditional individual and small-group visits to the DOE/NNSA labs, we will be taking large teams of CSAR investigators (~15 people) to make presentations and participate in technical discussions later this year. A visit to LLNL and SNL-CA has already been scheduled, and a similar trip in the near future will include SNL-NM and LANL.

One of the leading topics of discussion between lab staff and CSAR investigators will be the potential use within the labs of specific technologies developed at CSAR. Immediate technology candidates include our technology for data transfer at component interfaces and our framework for integration of separately developed codes with automated load balancing.

    4.2 Student-NNSA/DP Interactions

CSAR has been remarkably successful in encouraging student-lab interactions. Leading opportunities for UIUC graduate and undergraduate student interactions with the NNSA/DP laboratories include:

Summer student interns at DOE Labs
Joint research
Undergrads hired in CSAR labs performing collaborative research with NNSA/DP scientists


    Former CSAR/CSE Students at DOE/NNSA Labs

Ali Pinar, Lawrence Berkeley National Laboratory (PhD, Computer Science, 2002)
Thomas Hafenrichter, SNL (MS, Mechanical Engineering, 2002)
Michelle Duesterhaus, SNL (MS, Mechanical and Industrial Engineering, 2001)
Jason Hales, SNL (PhD, Civil and Environmental Engineering, 2001)
Jack Yoh, LLNL (PhD, Theoretical and Applied Mechanics, 2001)
Benjamin T. Chorpening, SNL-L (PhD, Mechanical Engineering, 2000)
Burkhard Militzer, LLNL (PhD, Physics, 2000)
Christopher D. Tomkins, LANL (PhD, Theoretical and Applied Mechanics, 2000)
Jeff J. Murphy, SNL-L (PhD, Mechanical Engineering, 1999)
Jin Yao, LLNL (PhD, Theoretical and Applied Mechanics, 1999)
Donald Siegel, SNL (PhD, Physics, 1999)
Steven F. Wojtkiewicz, SNL (PhD, Aero and Astro Engineering, 1999)
Boyana Norris, Argonne National Laboratory (PhD, Computer Science, 1999)
Giulia Galli, LLNL (PhD, Physics, 1998)
Arne Gullerud, SNL (PhD, Civil Engineering, 1998)
Michael Ham, LANL (MS, Computer Science, 1998)

    Former CSAR Employees at DP Labs

Jeffrey Vetter, LLNL
James Quirk, LANL
Dennis Parsons, LLNL

    4.3 Increasing the Numbers of U.S. Citizen Students and Postdocs

Although CSAR has been the ASAP leader in placing former students and staff in DOE/NNSA laboratory positions, we seek to improve our record in the coming years. An important facet of lab placement is citizenship: it is often difficult to place non-U.S. citizens in NNSA research positions. To increase the likelihood of CSAR students and staff accepting positions at the labs, we will work diligently to increase the number and quality of U.S. citizens in the program.

For several years we have encouraged faculty investigators to preferentially hire U.S. graduate research assistants. Under an unwritten policy, we have hired additional students for research projects when U.S. citizen students were identified, even if the initial project budget was insufficient to cover the additional student. We will continue to extend this support to encourage faculty to seek qualified domestic students and will attempt to broaden the extra support to postdoctoral research associates as well.

Undergraduate students at the University of Illinois are nearly all U.S. citizens (>98%) and are predominantly Illinois residents (~85%). We propose rapidly growing our program of using undergraduate students in our laboratories, in particular employing them to work on meshing and laboratory experiment projects. We will make funds available to hire as many undergraduate students as our faculty and research staff can identify.


    5 Organization, Management, and Academic Program

    No changes to the organization, management structure, or academic program are proposed.

    5.1 Organization and Management

Professor Michael T. Heath, CSAR Director, and the members of the Science Steering Committee provide world-class leadership and focus for the Center for Simulation of Advanced Rockets. The Center is administratively housed within the Computational Science and Engineering Program of the UIUC College of Engineering, reporting to the Dean of Engineering, David Daniel.

The Computational Science and Engineering Program is inherently interdisciplinary, requiring expertise in advanced computing technology as well as in one or more applied disciplines. The purpose of the CSE Degree Option is a perfect complement to the academic goals of ASC/ASAP: to foster interdisciplinary, computationally oriented research among all fields of science and engineering, and to prepare students to work effectively in such an environment (Figure 5.1).

The CSE Program does not independently admit students or confer graduate degrees; students wishing to elect the CSE Option must first be admitted to one of the participating departments before enrolling in the CSE Program. Similarly, all faculty members affiliated with CSE have regular faculty appointments in one of the participating departments. Students electing the CSE Option become proficient in computing technology, including numerical computation and the practical use of advanced computer architectures, and in one or more (traditional) applied disciplines. Such proficiency is gained, in part, through courses that are specially designed to reduce the usual barriers to interdisciplinary work. Thesis research by CSE students is computationally oriented and actively advised by faculty members from multiple departments.

    Management

The Director and Science Steering Committee members are responsible for nurturing the research program, administering the Center, and maintaining and expanding relationships with the DOE DP laboratories. This directorate provides the leadership necessary to ensure that the Center identifies the most important research areas, attracts the most qualified researchers, and pursues and completes the work effectively over the long term. A small administrative staff works to execute Center activities (Figure 5.2).

Fig. 5.1: CSAR is one of two research centers in the UIUC Computational Science and Engineering Program; the CSE education program is a graduate student academic degree option. [Chart showing the CSE education program (Computational Science & Engineering Option: 12 departments, 130 faculty associates, 10 graduate fellows, 80 graduate students enrolled) and the two research centers: CSAR (DOE funded, $20 million over 5 years, first contract ending in 2002 with a new 5-year contract; 38 faculty, 36 graduate students, 3 undergrads, 27 professional staff) and the Center for Process Simulation and Design (NSF and DARPA funded; 12 faculty, 13 students and postdocs).]

Each of the Research Groups has co-leaders who coordinate the technical program in that area. Nine technical teams are in place to address specific areas within the research effort. The Integrated Code Development Team (Incode) was formed to clearly identify the lead authors of the integrated code and to assure that resources are available.

The membership of the External Advisory Board (EAB) consists of individuals chosen from the DOE DP labs, industry, other governmental agencies, and other universities (Figure 5.3). The External Advisory Board reviews CSAR research studies, makes research recommendations, and provides expertise for translating research findings into practice. An active communications link has been established with the EAB. The Board annually assesses the progress of the Center in reports to the CSAR Director and the Dean of the College of Engineering.

    Administrative Staff

The Center has a small team of very high quality professional staff that provides experienced management for the program. William Dick serves as Managing Director of CSAR, and Sheryl Hembrey is the Assistant Director for Budget and Resource Planning. Mr. Dick's role in CSAR is to manage the day-to-day operations of the program, provide strategic direction, address facilities and equipment needs (including ASCI computing resources), and assure that the Center is responsive to the DOE and ASCI. Robert Fiedler is the CSAR Technical Program Manager. Dr. Fiedler manages the code development process and convenes the System Integration Team.

Fig. 5.2: CSAR management structure provides clear direction. [Organization chart showing the Chancellor/Provost, Dean, Director, and Science Steering Committee (technical program coordination, team collaboration, DOE DP liaison), with the Managing Director (daily operations, financial management, outreach), the Technical Program Manager (programmer management, HW/SW support, code coordination), the External Advisory Board (external advisors, TST chair), the Internal Advisory Committee (UIUC department heads), and the Research Groups: Computational Environment, Fluid Dynamics, Combustion and Energetic Materials, Computational Math and Geometry, and Structures and Materials.]

Fig. 5.3: Critical constituencies included on EAB. [Rocket industry: Aerojet, Alliant Techsystems, Atlantic Research, Geisler Enterprises, Lockheed-Martin Missiles & Space, Thiokol Propulsion. Universities: Caltech, University of Colorado, University of Tennessee Space Institute, Yale University. Government research agencies: Air Force Research Laboratory, Army Research Office, Lawrence Berkeley National Laboratory, NASA Headquarters, NASA Marshall Space Center, Naval Air Warfare Center China Lake, Sandia National Laboratory. Computer industry: Hewlett Packard Company, Intel Corporation, IBM.]


    Research Group Structure

The program is being carried out in a collaborative manner by a number of teams, each with specific responsibilities indicated below. To facilitate communication and cooperation among teams, there are appropriate overlaps in membership (Figure 5.4).

System Integration Team (SITeam): Responsible for overall system integration, including the mathematical model selection for the system components and the specification of compatible interfaces between component models. Includes both physical compatibility of component models and software and data interfaces between corresponding component codes.

Integrated Code Development Team (Incode): New in 2000, this team brings together each of the lead code authors from the four Research Groups. Responsible for developing the integrated simulation code.

Software Integration Framework Team (SWIFT): Responsible for crafting and executing a strategy for developing a general software architecture for component integration.

Validation, Accident, and Specification Team (VAST): Responsible for specifying detailed blueprints of devices to be simulated, including physical dimensions and materials. This team has worked closely with NASA and Thiokol in the past to collect detailed performance data for the Space Shuttle RSRM that will be used for validating CSAR simulations.

Combustion and Energetic Materials Team: Responsible for combustion-injection modeling and corresponding codes for simulating burning of composite propellant. Also responsible for continuum-mechanical and molecular-level modeling and corresponding codes for simulating the thermo-mechanical behavior of energetic materials.

Fluid Dynamics Team: Responsible for fluid-mechanical modeling and corresponding codes for simulating the interior cavity flow and exhaust plume.

Structures and Materials Team: Responsible for solid-mechanical and thermal modeling and corresponding codes for simulating the case, nozzle, insulation, and propellant.

Computational Mathematics and Geometry Team: Responsible for parallel numerical algorithms, such as sparse linear system solvers, as well as algorithms for mesh generation, partitioning, and adaptive refinement, needed for various component codes.

Computational Environments Team: Responsible for specifying compatible data structures and data formats for scientific data management, and also for parallel I/O and visualization. Also responsible for parallelization strategies, performance evaluation, and tuning of individual component codes as well as the integrated system code.

Fig. 5.4: Team efforts contribute to Research Groups; the Integrated Code Development Team was new in Y3 and responded to a suggestion of the DOE Review Team. [Matrix relating the Research Groups (Combustion and Energetic Materials; Fluid Dynamics; Structures and Materials; Computer Science, comprising Computational Environments and Computational Mathematics and Geometry) to the teams (Engineering Code Development; Validation, Accidents, and Specification; Software Integration Framework; System Integration).]


    6 Legacy

    6.1 Code Legacy

The integrated code legacy of CSAR will be a scalable, plug-and-play simulation code and framework for performing coupled simulations that automatically adapts to changing topologies, plus a collection of validated state-of-the-art physics modules that may be used to solve a wide variety of multiphysics problems. Further, the simulation code will enable exploration of scientific and engineering issues in a broad array of complex fluid-structure interactions.

    Software Engineering Principles

Critical to the code legacy is adherence to centerwide software engineering practices that enable ease of authorship, maintenance, and multi-author coding. We propose additions and extensions to the current state of software engineering as practiced at CSAR, to be developed during Years 6-10. These include development of automated multiplatform build and test suites, development of a grid-error framework for error estimation, construction of further verification and validation problems, and implementation of other software engineering practices. The proposed work on SE and V&V for the second five years of CSAR is discussed as separate topics in this Statement of Work, although it forms a coherent whole in terms of software engineering process. The activities discussed have been chosen to provide the highest return to the Center, at the least cost, and with the least time taken from researchers. They have not been chosen to address any specific level of software engineering practice, but are among the best practices, as evidenced by their use in other large organizations developing complex software.

As discussed in Section 3.7, the CSAR code development effort requires the coding, testing, and integration of multiple physics codes, along with codes for mesh matching, parallel I/O, job management, etc.; the effort is multi-language and involves many developers. The goal of the SE and V&V efforts is to ensure that the computer code products resulting from CSAR efforts are of professional quality, fully tested, and well documented. Activities that fall under the SE or V&V areas include code and documentation configuration management, various forms of testing (unit, regression, verification, validation, etc.), code review, build management, and code design. These activities are often not standalone; they interact with one another. As an example, a test case designed to validate a portion of the integrated code can also be used as a regression test case if it has an appropriate runtime.

    Code Dissemination

A year ago we negotiated a Space Act Agreement with NASA Marshall Space Flight Center to exchange the integrated code for test and flight data from the RSRM to be used in our validation studies. This agreement enables NASA to execute the code to simulate the performance of other launch vehicles on a nonexclusive, royalty-free basis. More recently we have worked with Atlantic Research Corporation, a major U.S. manufacturer of satellite steering rockets, to use the GEN2 integrated code on one of their SP motor designs. In addition, the CSAR External Advisory Board members continue to be excited about the potential for use of our integrated simulation tool to explore their proprietary rocket designs. The level of interest in industry, combined with that of NASA, indicates that long-term development, maintenance, and dissemination of the code is worthwhile.

Fig. 6.1: ASC Program advances national strengths in computational and simulation science.

    6.2 Center Legacy

    The research of the ASC/ASAP centers is expected to dr