
Page 1:

Implementing Hypre-AMG in NIMROD via PETSc

S. Vadlamani - Tech-X
S. Kruger - Tech-X
T. Manteuffel - CU APPM
S. McCormick - CU APPM

Funding: DE-FG02-07ER84730

Page 2:

Goals

• SBIR funding for
  – "improving existing multigrid linear solver libraries applied to the extended MHD system to work efficiently on petascale computers".
• HYPRE chosen because
  – multigrid has been shown to "scale"
  – CU development
    • callable from the PETSc interface
  – development of the library (i.e., the AMG method) will benefit all CEMM efforts
• Phase I:
  – explore HYPRE's solvers applied to the positive-definite matrices in NIMROD
  – start a validation process for petascale scalings
• Phase II:
  – push development for non-symmetric operators of the extended MHD system on high-order FE grids

Page 3:

Equations of interest

Page 4:

Major Difficulties for MHD system

• The MHD equations yield difficult-to-invert matrices for three reasons:
  – the velocity advance has a 3D matrix which stabilizes the MHD waves to high accuracy,
  – the magnetic field advance has a 3D matrix due to the temperature-dependent resistivity that varies by 5 orders of magnitude across the simulation domain,
  – the temperature advance has a three-dimensional, anisotropic operator with a parallel diffusion coefficient that is 5 to 10 orders of magnitude larger than the perpendicular coefficient (illustrative forms are given below).
• All of the matrices discussed, while ill-conditioned, are Hermitian.
• Inclusion of extended MHD terms not only increases the condition number of the matrices (by making the largest eigenvalue larger), but is fundamentally non-symmetric in nature.
• Very hard linear problems; must use solvers that scale.
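For orientation only (these are standard forms, not reproduced from the slide): the temperature advance involves an anisotropic conduction operator aligned with the magnetic unit vector $\hat b$, and the resistivity follows the Spitzer temperature scaling, which together produce the coefficient ranges quoted above:

\nabla \cdot \Big[ n \big( \chi_\parallel \,\hat b \hat b + \chi_\perp (\mathsf{I} - \hat b \hat b) \big) \cdot \nabla T \Big],
\qquad \chi_\parallel / \chi_\perp \sim 10^{5}\text{--}10^{10},
\qquad \eta_{\mathrm{Spitzer}}(T) \propto T^{-3/2}.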

Page 5:

Reminder: Algebraic Multigrid

• The smoothing process (also known as relaxation) is an application of a linear solver (usually iterative) that results in a smooth error.
• The coarse-grid correction is made up of three subprocesses:
  – (1) restriction: a particular transfer of information to a coarse grid,
  – (2) coarse-grid solve: solving the linear system on the chosen coarse grid,
  – (3) prolongation: a transfer or interpolation of information back to the finer grid.
• The effectiveness of this algorithm relies on carefully choosing restriction and the coarse-grid solve, which are dependent on attributes of the system of equations being solved. (The standard two-grid cycle is written out below.)
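For reference, one two-grid cycle for $A^h u^h = f^h$ in the usual notation ($S$ the smoother, $R$ restriction, $P$ prolongation, $A^{2h}$ the coarse-grid operator; the notation is mine, not the slide's):

\begin{aligned}
u^h &\leftarrow S(u^h, f^h) && \text{pre-smoothing (relaxation)}\\
r^{2h} &= R \big( f^h - A^h u^h \big) && \text{restriction of the residual}\\
A^{2h} e^{2h} &= r^{2h} && \text{coarse-grid solve}\\
u^h &\leftarrow u^h + P e^{2h} && \text{prolongation (correction)}\\
u^h &\leftarrow S(u^h, f^h) && \text{post-smoothing}
\end{aligned}

In the algebraic variant, $R$, $P$, and the coarse "grids" are constructed from the matrix entries themselves rather than from geometric grid information.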

Page 6:

Typical AMG process

Page 7:

Using PETSc

• Level of PETSc compliance
  – for developer support
  – PETSc programs usually initialize and kill their own MPI communicators; need to match patterns (see the initialization sketch below)
• Calling from Fortran (77 mentality):
  – #include "include/finclude/includefile.h" and use the *.F suffix so the preprocessor runs
  – careful to only "include" once in each encapsulated subroutine
  – must access arrays via an integer index name internal to PETSc
  – zero indexing *IMPORTANT*
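A minimal sketch of that initialization pattern (the program name and include path are illustrative and follow the finclude layout of PETSc releases from that era; PetscInitialize skips MPI_Init when the host code, e.g. NIMROD, has already started MPI):

      program petsc_init_sketch
#include "include/finclude/petsc.h"
      PetscErrorCode ierr

C     Start PETSc; MPI is initialized here only if it is not already running
      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)

C     ... create and use PETSc Vec/Mat/KSP objects here ...

C     Tear down PETSc (and MPI only if PETSc started it)
      call PetscFinalize(ierr)
      end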

Page 8:

Sample code to set elements of an array

#define xx_a(ib) xx_v(xx_i + (ib))

      double precision xx_v(1)
      PetscOffset      xx_i
      PetscErrorCode   ierr
      integer          i, n
      Vec              x

      call VecGetArray(x,xx_v,xx_i,ierr)
      call VecGetLocalSize(x,n,ierr)
      do i = 1, n
         xx_a(i) = 3*i + 1
      enddo
      call VecRestoreArray(x,xx_v,xx_i,ierr)

Page 9:

Hypre calls in PETSc

• No change to matrix and vector creation

• Will set the preconditioner type to the HYPRE types within a linear system solver
  – KSP package
  – needed for the smoothing process
• PCHYPRESetType()
• PetscOptionsSetValue() (a minimal calling sequence is sketched below)
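A minimal sketch of that calling sequence (variable names are mine; it assumes a matrix A and vectors b, x already assembled; the four-argument KSPSetOperators and the BoomerAMG option name reflect PETSc/HYPRE versions of that era and may differ in later releases):

#include "include/finclude/petsc.h"
#include "include/finclude/petscvec.h"
#include "include/finclude/petscmat.h"
#include "include/finclude/petscpc.h"
#include "include/finclude/petscksp.h"
      KSP            ksp
      PC             pc
      Mat            A
      Vec            b, x
      PetscErrorCode ierr

      call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
      call KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN,ierr)
      call KSPGetPC(ksp,pc,ierr)
C     Use HYPRE as the preconditioner and select its BoomerAMG method
      call PCSetType(pc,'hypre',ierr)
      call PCHYPRESetType(pc,'boomeramg',ierr)
C     Fine-grained BoomerAMG controls go through the options database
      call PetscOptionsSetValue('-pc_hypre_boomeramg_max_levels',
     &                          '25',ierr)
      call KSPSetFromOptions(ksp,ierr)
      call KSPSolve(ksp,b,x,ierr)
      call KSPDestroy(ksp,ierr)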

Page 10:

Work Plan for Phase 1 SBIR

• Implement HYPRE solvers in NIMROD via PETSc
  – understanding the full NIMROD data structure
  – backward compatibility with current solvers
  – comparative simulations with benchmark cases
• Establish metrics for solvers' efficiency
• Initial analysis of MG capability for extended MHD

Page 11:

This Week

• Revisit work done with the SuperLU interface
  – implementation of the distributed interface will give better insight into the NIMROD data structure and its communication patterns
• Obtain troublesome matrices in triplet format (a possible extraction sketch is below)
  – send to Sherry Li for further analysis and SuperLU development
  – possibility of visualization (matplotlib, etc.)
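One way to produce that (row, column, value) triplet file is to walk the assembled matrix row by row with MatGetRow; the sketch below is illustrative only (the output unit, the maxnz bound, and the assumption of a sequential matrix are mine, not from the slides):

      Mat            A
      PetscInt       row, j, m, n, ncols
      integer        maxnz
      parameter      (maxnz=1000)
      PetscInt       cols(maxnz)
      PetscScalar    vals(maxnz)
      PetscErrorCode ierr

C     Write one "i j value" line per stored nonzero; PETSc rows and
C     columns are zero-indexed, the file uses 1-based indices
      call MatGetSize(A,m,n,ierr)
      do row = 0, m-1
         call MatGetRow(A,row,ncols,cols,vals,ierr)
         do j = 1, ncols
            write(20,*) row+1, cols(j)+1, vals(j)
         enddo
         call MatRestoreRow(A,row,ncols,cols,vals,ierr)
      enddo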

Page 12:

Summary

• Beginning to implement PETSc in NIMROD

• Will explore HYPRE solvers with derived metrics to establish effectiveness

• Explore mathematical properties of the extended MHD system to understand whether AMG can still scale when solving these particular non-symmetric matrices

• [way in the future]: May need to use BoxMG (LANL) for anisotropic temperature advance