
  • CCFE is the fusion research arm of the United Kingdom Atomic Energy Authority. This work was funded by the RCUK Energy Programme [grant number EP/I501045].

    Processing; the end of an era?

    Jean-Christophe Sublet (1) and Arjan Koning (2)

    (1) UK Atomic Energy Authority, Culham Science Centre, Abingdon OX14 3DB, United Kingdom; (2) Nuclear Research and Consultancy Group, NL-1755 ZG, Petten, The Netherlands.

  • Simulation in space, energy and time

    Boltzmann equation: transport; time independent; energy and spatial simulation; primary response

    Bateman equation: inventory; time dependent; secondary response

    Application Program Interface: interfaces to connect Boltzmann and Bateman solvers for non-linear t- and T-dependent transport (the two equations are written out after this slide)

    [Diagram: Boltzmann solvers (MCNP6 US, TRIPOLI EU, SERPENT EU) coupled through an API to the Bateman solver EASY-II EU, both fed, through processing, by the TENDL EU nuclear data carousels. Exchanged quantities: number densities (tn), prompt spectra, reaction rates; number densities (tn+Δt), secondary and decay spectra, irradiation conditions.]

    2
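    For reference, the two equations named above in their standard textbook forms (the time-independent, integro-differential Boltzmann transport equation and the Bateman inventory equation); the notation is generic and omits, for instance, the explicit fission source term:

      \Omega \cdot \nabla \phi(\mathbf{r},E,\Omega) + \Sigma_t(\mathbf{r},E)\,\phi(\mathbf{r},E,\Omega)
        = \int_{4\pi}\int_0^{\infty} \Sigma_s(\mathbf{r},E'\to E,\Omega'\to\Omega)\,\phi(\mathbf{r},E',\Omega')\,dE'\,d\Omega' + S(\mathbf{r},E,\Omega)

      \frac{dN_i}{dt} = \sum_{j \ne i} \bigl( \lambda_{j\to i} + \sigma_{j\to i}\,\phi \bigr) N_j - \bigl( \lambda_i + \sigma_{a,i}\,\phi \bigr) N_i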

  • Processing: what is it?

    •  Raw nuclear data files are pretty useless on their own!

    •  Nuclear data processing is a necessity: it feeds the many nuclear codes that require processed nuclear data forms

    •  Nuclear data processing steps are not unique

    •  Nuclear data processing steps strongly influence the data forms

    •  Evaluations relate to nuclear physics, but processed nuclear data belong to nuclear engineering: processing bridges the gap

    3

  • Physics

    •  All problems are multi-physics, but they are usually decomposed into single-physics ones

    •  The legacy nuclear data processing steps answer the needs of single-physics codes, mainly reactor physics, with some exceptions

    •  Advanced simulation methods rely on multi-physics codes with complex feedback

    •  This requires integrated, multifaceted processing steps able to deliver rich, interconnected nuclear data forms and correlated uncertainties

    •  This in turn requires the basic nuclear data to be complete (cross sections, angular distributions, emitted spectra), robust and technology-based, with variance-covariance for every quantity across all energy ranges

    4

  • Nuclear data forms

    •  Nuclear data are neither constants nor standards, but uniquely defined physical quantities

    •  There could/should be only one basic/nominal n-incident nuclear data file per target, preferably NOT fitted to, or produced for, a particular application

    •  The basic/nominal nuclear data file should be flanked by a full set of variance-covariance information

    •  Adjustment, if any, should be done at the level of one of the derived files, not at the ENDF level; or, even better, during the model code processes

    5

  • GPL collaboration

    Partners: NRG, CCFE, CEA DIF, JUKO Research, Uppsala University, IAEA, Vattenfall, UPM, ULB, VTT, ATI, IRMM, NNL, ...

    Objectives: to create a set of modern baseline general-purpose files aimed at satisfying the radiation transport and activation-transmutation requirements for facilities in support of all nuclear technologies.

    •  α, γ, d, p, n General Purpose Library
    •  Multi-application, consistent libraries
    •  Complete variance-covariance information
    •  TALYS nuclear model
    •  Built from technology

    6

  • GPL philosophy

    •  From model parameters to code result quantities

    •  Allow physical parameters to impact the basic nuclear data, not an engineered, localized adjustment that is unable to account for compensation effects

    •  With variance-covariance data based on experimental data and nuclear models, allowing design optimization of nuclear technology

    •  Account for the (non-unique) processing steps

    •  Include and account for Verification and Validation (V&V) processes

    •  Feedback from extensive validation and benchmark activities is automatically and rapidly taken into account, within a year rather than ten

    7

  • Processing steps

    [Figure: Nuclear data processing steps and libraries]

    8

  • Particle transport needs

    •  Solution to the Boltzmann and Bateman equations
       §  Full triple integral form
       §  Integro-differential form
       §  Diffusion theory
          –  Method of characteristics, collision probability, nodal, etc.
       §  Ordinary differential form

    •  Data forms (a macroscopic example follows this slide):
       §  Full microscopic
       §  Group data; multi-parameter library
       §  Macroscopic data and coefficients: Σtr, Σd, Σa, νΣf, D1, D2

    •  Nuclear data are diverse and can take many forms: reactor physics, nuclear physics, shielding, inventories, high-energy physics, medical, astrophysics, etc.

    9
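    The "macroscopic data" entry above is, in practice, a number-density-weighted sum of the microscopic cross sections, Σ = Σ_i N_i σ_i. A minimal Python sketch; the two-nuclide mixture and its values are purely illustrative, not taken from any library:

      # macro_xs.py - illustrative only: collapse microscopic to macroscopic cross sections
      # sigma in barns (1 b = 1e-24 cm^2), number densities N in atoms/(barn*cm)
      def macroscopic_xs(composition):
          """composition: list of (number_density, microscopic_xs) pairs; returns Sigma in 1/cm."""
          return sum(n * sigma for n, sigma in composition)

      # hypothetical two-nuclide mixture (placeholder values)
      mixture = [(0.02, 5.0),   # nuclide A: N = 0.02 atom/(b*cm), sigma = 5 b
                 (0.01, 2.3)]   # nuclide B: N = 0.01 atom/(b*cm), sigma = 2.3 b
      print(macroscopic_xs(mixture))  # Sigma = 0.123 cm^-1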

  • Nuclear data forms: the ENDF-6 rules

    •  Legendre polynomial order: stability problems, and why is an order up to 64 permitted in ENDF??
    •  URR data forms: probability table, distribution function
    •  Emitted particle spectra not covering the entire channel energy range; it happens far too often
    •  Energy distributions NOT normalized to 1
    •  Sum rules not respected

    •  Nowadays (21st century) cross section processing has matured and stabilized to a high degree of confidence, 0.1%. Unfortunately this is far from being the case for the angular distributions and emitted spectra (two simple checks are sketched below).

    10
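    Two of the consistency failures listed above are trivial to express once the data have been extracted; this is only a sketch operating on hypothetical arrays, not an ENDF parser:

      # endf_checks.py - illustrative consistency checks on already-extracted data
      import numpy as np

      def check_spectrum_normalized(e_out, pdf, tol=1e-3):
          """Emitted-particle energy distribution f(E') should integrate to 1."""
          return abs(np.trapz(pdf, e_out) - 1.0) < tol

      def check_sum_rule(total, partials, tol=1e-3):
          """Pointwise total should equal the sum of the partial channels on a common grid."""
          return np.allclose(total, np.sum(partials, axis=0), rtol=tol)

      # hypothetical data: a flat spectrum between 0 and 2 MeV, correctly normalized
      e_out = np.linspace(0.0, 2.0, 101)            # MeV
      pdf = np.full_like(e_out, 0.5)                # 1/MeV, integrates to 1
      print(check_spectrum_normalized(e_out, pdf))  # True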

  • Processing steps

    •  All processed data libraries stem from a handful of ENDF-6 formatted libraries handled by an even smaller number of processing codes (with specialized modules)

       - NJOY – LANL (heatr, purr, thermr, acer, ..)
       - PREPRO – LLNL (sixpack, activate, ..)
       - CALENDF – CEA/UKAEA (condentp, mixisotp, sigtteum, ..)
       - ....

    •  Advantages
       -  Robustness, completeness
       -  Redundancy
       -  Portability, repeatability
       -  Legacy and maturity

    •  Disadvantages
       -  1970's technology
       -  Non-unique
       -  Years of silent cover-up, accumulation of ad hoc corrections: NJOY 99.396 & 12-032

    11

  • Processing steps: an example

    •  NJOY12-032: reconr, broadr, unresr, thermr, heatr, gaspr, purr, groupr, acer

    •  PREPRO-2014: linear, recent, sigma1, sixpack, activate, merger, dictin, groupie

    •  CALENDF-2010: calendf, regroutp, lecritp, ....

    [Diagram: the ENDF file is processed into an ACE file and a PT (probability table) file, with cross-checks between the outputs; a single script drives all three chains for every TENDL file (a driver sketch follows this slide).]

    12
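    A hedged sketch of what such a single driver script might look like; the executable names, input-deck templates, tape convention and directory layout here are assumptions made for illustration, not the actual TENDL production script:

      # process_tendl.py - illustrative driver, not the actual TENDL production script
      import glob, os, shutil, subprocess

      TOOLS = ["njoy", "prepro", "calendf"]        # assumed executable names on PATH

      def run_chain(tool, endf_file, workdir):
          """Run one processing chain from a pre-written input deck template."""
          os.makedirs(workdir, exist_ok=True)
          shutil.copy(endf_file, os.path.join(workdir, "tape20"))    # assumed tape convention
          deck = open(f"templates/{tool}.inp").read()                # hypothetical template
          subprocess.run([tool], input=deck, text=True, cwd=workdir, check=True)

      for endf_file in glob.glob("tendl2013/*.tendl"):               # hypothetical layout
          name = os.path.basename(endf_file)
          for tool in TOOLS:
              run_chain(tool, endf_file, workdir=f"work/{name}/{tool}")
          # cross-check the ACE and PT outputs here (verification plots, sum rules, ...)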

  • Processed forms

    •  Processed mf-3: temperature dependent pointwise and groupwise cross sections with self-shielding factors
       §  Why does the mf-2 vanish? Why do evaluators not include the room temperature data, the ones they used in their immaculate plots?

    •  Processed mf-4, mf-5 and mf-6
       §  Why do those mf need processing for Monte Carlo?
       §  Group-to-group matrices for deterministic codes, but then with a fixed Legendre order for all channels and targets!! Criticality and shielding requirements differ (a group-collapse sketch follows this slide)
       §  Matrix for the total channel ≠ sum of the partial channel matrices

    •  Processed mf-32, mf-33, mf-40
       §  Why do those mfs need processing?
       §  Group-to-group correlation, but what about channel-to-channel, isotope-to-isotope?

    •  Many (too) specialized output formats: groupr, matxsr, resxr, dtfr, ccccr, wimsr, acer, casmo®, eranos®, apollo®, ....

    13
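    The flux-weighted group collapse behind those "groupwise" forms, σ_g = ∫_g σ(E)φ(E)dE / ∫_g φ(E)dE, as a minimal sketch; the pointwise cross section, weighting flux and group boundaries below are placeholders, not library data:

      # group_collapse.py - illustrative flux-weighted collapse of a pointwise cross section
      import numpy as np

      def collapse(e_grid, sigma, flux, group_bounds):
          """Return multigroup values sigma_g = int(sigma*flux)dE / int(flux)dE per group."""
          sig_g = []
          for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
              mask = (e_grid >= lo) & (e_grid <= hi)
              num = np.trapz(sigma[mask] * flux[mask], e_grid[mask])
              den = np.trapz(flux[mask], e_grid[mask])
              sig_g.append(num / den)
          return np.array(sig_g)

      # placeholder pointwise data: a 1/v-like cross section and a flat weighting flux
      e = np.logspace(-5, 7, 2000)                 # eV
      sigma = 10.0 / np.sqrt(e)                    # barns, purely illustrative
      flux = np.ones_like(e)
      print(collapse(e, sigma, flux, group_bounds=[1e-5, 1.0, 1e3, 1e7]))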

  • W-184 TENDL-2013 example

    •  W-184 TENDL-2013, verification plots from the ACE file, the file directly used by MCNP or SERPENT

    14

    [Figure: W184 TENDL-2013, NJOY12.032, 293.6 K, principal cross sections (total, absorption, elastic, gamma production); cross section (barns) versus energy (MeV)]

    [Figure: W184 TENDL-2013, NJOY12.032, 293.6 K, unresolved-resonance capture cross section at infinite dilution and at 100 b and 1 b dilution; cross section (barns) versus energy (MeV)]

    •  Probability table self-shielding in the URR for MC, not Bondarenko
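    The smallest possible illustration of that distinction: with a probability table, the Monte Carlo code samples a cross-section band from the table rather than using a single pre-averaged (Bondarenko-style) value. The band probabilities and cross sections below are invented for illustration, not taken from the W-184 table:

      # ptable_sample.py - illustrative URR probability-table sampling (made-up numbers)
      import random

      # one incident-energy point of a hypothetical probability table:
      # cumulative band probabilities and the capture cross section (barns) in each band
      cum_prob = [0.2, 0.5, 0.8, 0.95, 1.0]
      capture  = [0.05, 0.15, 0.40, 1.20, 6.00]

      def sample_capture(rng=random.random):
          """Pick a band according to the table probabilities and return its cross section."""
          xi = rng()
          for p, sig in zip(cum_prob, capture):
              if xi <= p:
                  return sig
          return capture[-1]

      print(sample_capture())   # varies from sample to sample, unlike a Bondarenko average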

  • W-184 TENDL-2013 example

    15

    •  W-184 TENDL-2013: a complete, consistent evaluation
       –  Q = 7.344130 MeV, recoil on Hf-181

    •  But W-184 ENDF/B-VII.1: an incomplete, inconsistent mix of mf3 (600, 649; 800-849), mf4 (600; 800-807), mf5 ??, mf6 (649; 849)
       –  ***error in findf*** mat7437 mf 6 mt103, mt107 not on tape 21
       –  ***message from NJOY12.032
       –  The file satisfies the "criticality" requirement, but is useless for material science

    [Figure: W184 TENDL-2013, alpha emission from (n,α); secondary energy distribution versus incident energy (MeV)]

    [Figure: W184 TENDL-2013, recoil emission from (n,α); secondary energy distribution versus incident energy (MeV)]

  • The next steps

    •  With little but sturdy effort, the TALYS (through TEFAL) energy-angle distributions could be fed directly into Monte Carlo codes using cumulative distributions

    •  The mf-2 resonance parameters should always be accessible, available at any stage, to Monte Carlo codes, and self-shielding factors (SSF) to deterministic codes

    •  Probability tables for self-shielding in the RRR and URR

    •  With minimal but unique/established processing, mf-3 could be reconstructed on the fly from the mf-2 (a simplified reconstruction sketch follows this slide). The continuum mf-3 part should already be directly usable

    •  With minimal effort, physical quantities (n-spectra, nu, ..) should be uniquely and properly defined

    All of this partly exists already

    16
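    A deliberately simplified illustration of on-the-fly reconstruction from resonance parameters: a single, isolated single-level Breit-Wigner capture resonance with invented parameters, no Doppler broadening and no interference. Real mf-2 reconstruction (multi-level, Reich-Moore, temperature dependent) is considerably more involved:

      # slbw_sketch.py - single-level Breit-Wigner capture cross section (toy reconstruction)
      import numpy as np

      def slbw_capture(e, e_r, gamma_n, gamma_g, sigma0):
          """sigma(E) = sigma0*(Gamma_g/Gamma)*sqrt(E_r/E)*(Gamma^2/4)/((E-E_r)^2+Gamma^2/4)."""
          gamma = gamma_n + gamma_g                  # total width (eV)
          lorentz = (gamma**2 / 4.0) / ((e - e_r)**2 + gamma**2 / 4.0)
          return sigma0 * (gamma_g / gamma) * np.sqrt(e_r / e) * lorentz

      # invented resonance: E_r = 7.6 eV, Gamma_n = 1.5 meV, Gamma_g = 50 meV, sigma0 = 1000 b
      e = np.linspace(1.0, 20.0, 20001)              # eV
      sigma = slbw_capture(e, e_r=7.6, gamma_n=1.5e-3, gamma_g=50e-3, sigma0=1000.0)
      print(sigma.max())                             # ~970 b, i.e. sigma0*Gamma_g/Gamma at E_r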

  • Nuclear data forms

    •  Nuclear data file formats strongly influence the accessibility and handling of the data forms they contain

    17

    [Diagram: Evaluation → ENDF format → PENDF (pointwise + angular and energy distributions) → GENDF (groupwise + matrices)]

    •  The ENDF-6 format is certainly too tight, but it exists and is used; GND will take over to unleash many format and physics aspects

  • Nuclear Data form

    •  AGE: A Global Evaluated data form

    18

    [Diagram: Evaluation → Evaluated data containing: parameters; probability tables; unionized pointwise data; cumulative angular and energy distributions]

    •  Ready to be used by Monte Carlo codes, built on ACE (a hypothetical container sketch follows)
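    What such a global container could look like in code; this is a hypothetical sketch of the idea only, since AGE is a proposal here rather than a published format, and none of the field names below are prescribed by it:

      # age_form.py - hypothetical container for a "global evaluated data form" (illustration only)
      from dataclasses import dataclass, field
      from typing import Dict, List, Tuple

      @dataclass
      class AgeTarget:
          """One target nuclide: everything a Monte Carlo code needs, kept in one place."""
          resonance_parameters: Dict[str, List[float]] = field(default_factory=dict)
          probability_tables: Dict[float, List[Tuple[float, float]]] = field(default_factory=dict)
          unionized_energy_grid: List[float] = field(default_factory=list)       # eV
          pointwise_xs: Dict[str, List[float]] = field(default_factory=dict)     # per reaction channel
          cumulative_distributions: Dict[str, List[float]] = field(default_factory=dict)

      target = AgeTarget()
      target.unionized_energy_grid = [1e-5, 1.0, 2e7]        # placeholder grid
      target.pointwise_xs["n,gamma"] = [100.0, 10.0, 0.001]  # placeholder barns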

  • Nuclear Data form

    •  AGE: A Global Evaluated data form; a subset

    19

    [Diagram: Evaluation → Evaluated data containing: parameters; probability tables; groupwise data; cumulative energy distributions]

    •  Already used in the EASY-II multi-particle, multi-system code, applying an ENDF-6 dictionary framework

  • 20

  • Acknowledgement

    The authors would like to recognize the dedication, commitment and abnegation of the pioneers in the field:

    D. E. Cullen, R. E. MacFarlane and P. Ribon. Since the 70's they have conscientiously, silently and patiently bridged the abysses that exist between nuclear physicists and nuclear engineers, and between the many codes and data forms they use, without ever being properly recognized by either of those communities. It has been a privilege to work with such talented minds, and still is! (Bob's message)

    Time has come to close the gap

    21