
Page 1: Meeting Objectives

IAEA | International Atomic Energy Agency

Meeting Objectives

H.-K. Chung and B. J. Braams
Atomic and Molecular Data Unit, Nuclear Data Section

Joint IAEA-ITAMP Technical Meeting on Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data

Cambridge, Massachusetts, USA

7-9 July 2014

Page 2: Meeting Objectives

IAEA

IAEA Atomic and Molecular (A+M) Data Unit: supporting the fusion programme worldwide

"We say that we will put the sun into a box. The idea is pretty. The problem is, we don't know how to make the box." -- Nobel laureate Pierre-Gilles de Gennes

Fusion research requires huge amounts of materials data (A+M/PSI data)

• IAEA A+M Unit formed in 1977

• Review progress and achievements in atomic, molecular and plasma-surface interaction (A+M/PSI) data for the fusion programme worldwide

• Stimulate international cooperation in the measurement, compilation and evaluation of A+M/PSI data for fusion

157 Member States, ~2400 staff

Coordinated Research Projects

Publications, Knowledgebase

Databases, Meetings

Page 3: Meeting Objectives

IAEA

Network Collaboration for AM/PSI Data for Fusion

Data Centre Network

ADAS (H. Summers); CRAAMD (Y. Jun); FZJ (D. Reiter); IAEA (B. J. Braams); JAEA (T. Nakano); KAERI (D. Kwon); Kurchatov (A. B. Kukushkin); NIFS (I. Murakami); NIST (Y. Ralchenko); NFRI (J. Yoon)

Fusion Laboratories

ITER, EFDA

JET (UKAEA); ASDEX-Upgrade (IPP); TEXTOR (Jülich, FZJ)

KSTAR (NFRI); NIFS; JAEA; PPPL; ORNL

Code Centre Network

Curtin Univ. (I. Bray); Kitasato Univ. (F. Koike); Univ. Autonoma de Madrid (I. Rabadan); Univ. P. & M. Curie, Paris (A. Dubois); Univ. of Bari (M. Capitelli); Kurchatov Institute (A. Kukushkin); Lebedev Institute (L. Vainshtein); Ernst-Moritz-Arndt Univ. (R. Schneider); NIST (Y. Ralchenko); PPPL (D. Stotler); LANL (J. Abdallah Jr.); IAEA (B. J. Braams); HULLAC (M. Klapisch); CNEA (P. D. Fainstein); Weizmann (E. Stambulchik)

Data Users

Data Producers

IAEA Coordination

Coordinated Research Projects

Publications, Knowledgebase

Databases, Meetings

Data Centres & Evaluators

Page 4: Meeting Objectives

IAEA

ITER: the biggest tokamak ever built. €15 B; 7 Member States (EU, India, Russia, China, S. Korea, Japan and USA)

Construction to be completed in 2020

Power supplies:

120 MW during operations

Up to 620 MW for 30s periods

Vessel: 5400 tons steel

Superconducting Coils: 9500 tons steel

Divertor: 10 MW/m² on 8 m²

Cryostat: −270 °C

ITER is twice as big as the world’s largest currently operating tokamak (JET)

Huge Investment for the Sustainable Future of Mankind

Page 5: Meeting Objectives

IAEA

Evaluated and recommended data needed for ITER design and operation

Typical edge transport code runtime (for the same model, equations, and grid size):

TEXTOR (R = 1.75 m), Jülich, GER: 1 day

JET (R = 2.96 m), Oxford, UK: 1-2 weeks

ITER (R = 6.2 m), Cadarache, FRA: 3 months

The runtime grows because plasma chemistry becomes more important (increased non-linearity and non-locality in the sources).

Page 6: Meeting Objectives

IAEA

Need international collaboration for Evaluated Data
http://www-amdis.iaea.org/DCN/Evaluation/

Feb 12: CM on Procedures for Evaluation of A+M/PMI Data for Fusion; current status & future coordination (Japan)

Jun 12: CM on Data Evaluation & Establishment of a Standard Library of A+M/PMI Data for Fusion (IAEA); Drake, Schultz

Sep 12: TM on Data Evaluation for A+M/PSI Processes in Fusion (Korea); Reiter, Ballance, Krstic, Kramida

May 13: TM on International Code Centre Network: General Guidelines for Uncertainty Assessments of Theoretical Data; Bartschat, Tennyson, Kokoouline, Ralchenko

Dec 13: CM on Evaluation of Data for Collisions of Electrons with Nitrogen Molecule and Nitrogen Molecular Ion; Tennyson, Kokoouline

Jul 14: Joint IAEA-ITAMP TM on Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data

Page 7: Meeting Objectives

IAEA

TM on Data Evaluation 2012
http://www-amdis.iaea.org/meetings/NFRI2012/

• More than 20 participants from 11 countries

• Proceedings papers published in Fusion Science and Technology (2013)

• Community consensus needed to produce evaluated/recommended data:
  • Disseminate standard definitions of TERMINOLOGIES adopted internationally
  • Disseminate materials teaching CRITICAL ANALYSIS SKILLS (NRC report)
  • Involve the COMMUNITY in data evaluation (eMOL, group evaluation)

• Technical issues:
  • Assessment of THEORETICAL data: UNCERTAINTY ESTIMATES
  • Assessment of EXPERIMENTAL data: self-consistency checks
  • ERROR PROPAGATION and SENSITIVITY ANALYSIS: uncertainties in the “Data” & the “Data Processing Toolbox”

Page 8: Meeting Objectives

IAEA

Define Terminology: Uncertainty Approach

It's NOT AN ERROR but AN UNCERTAINTY

• Terminology in metrology adopted by international communities

(IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.):

VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007

GUM (Guide to the Expression of Uncertainty in Measurement), 2008

• Conceptual Changes of Values and Uncertainties

True Value (Error Approach, ~1984): a measure of the possible error in the estimated value of the measurand as provided by the result of a measurement

Measured Value (Uncertainty Approach): parameter that characterizes the dispersion of the quantity values that are being attributed to a measurand, based on the information used (VIM 3)

(Diagram: the error approach reports a value; the uncertainty approach reports a value together with its uncertainty.)

Page 9: Meeting Objectives

IAEA

Critical Assessment for Modeling of Physical Processes

• Verification. The process of determining how accurately a computer program ("code") solves the equations of the mathematical model (see the sketch at the end of this slide).

• Validation. The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.

• Uncertainty quantification (UQ). The process of quantifying uncertainties associated with model calculations of true, physical quantities of interest (QOIs), with the goals of accounting for all sources of uncertainty and quantifying the contributions of specific sources to the overall uncertainty.

NSF Division of Mathematics and Physical Sciences should encourage interdisciplinary interaction between domain scientists and mathematicians on the topic of uncertainty quantification, verification and validation, risk assessment, and decision making. (2010)
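One standard verification technique, used here purely as an illustration (the slides do not prescribe any particular method), is a convergence test against a known exact solution: a minimal sketch that integrates dy/dt = −y with forward Euler and confirms the observed first-order accuracy.

```python
# Minimal sketch of code verification by convergence testing (one common
# technique; an assumed example, not from the slides): integrate dy/dt = -y
# with forward Euler and confirm the observed order of accuracy is ~1.
import numpy as np

def euler(f, y0, t_end, n):
    """Forward Euler with n steps on [0, t_end]."""
    h = t_end / n
    y = y0
    for _ in range(n):
        y = y + h * f(y)
    return y

f = lambda y: -y
exact = np.exp(-1.0)                      # y(1) for y' = -y, y(0) = 1

steps = [10, 20, 40, 80, 160]
errors = [abs(euler(f, 1.0, 1.0, n) - exact) for n in steps]

# Observed order p from successive error ratios: e(h) ~ C * h^p
orders = np.log2(np.array(errors[:-1]) / np.array(errors[1:]))
print("observed orders:", np.round(orders, 3))   # should approach 1.0
```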

Page 10: Meeting Objectives

IAEA

Bayesian Update for Data Evaluation and Uncertainty Assignment

Example: observable f(x) = a + bx + cx²; simulated experimental data given by f(y) = (a + by + cy²)(1 + d·r) + Δt(y), where r is a uniform random variable, d is the width of the random interval, and Δt(y) is a systematic error.

Bayes' Theorem (1763):

p(x|σ,M) = p(σ|x,M) p(x|M) / p(σ|M)

posterior = likelihood × prior / evidence

x: set of calculation parameters; σ: set of experimental data

p(x|σ,M): posterior distribution; p(x|M): prior distribution

p(σ|x,M): likelihood function, giving the probability distribution of the data σ for a model M with parameters x

H. Leeb et al., "Consistent Procedure for Nuclear Data Evaluation Based on Modeling," Nuclear Data Sheets 109 (2008) 2762
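To make the update concrete, here is a minimal sketch (an illustration, not part of the slides or of Leeb et al.'s procedure): it recovers x = (a, b, c) of the quadratic toy observable from simulated data σ with a simple Metropolis random walk, assuming a Gaussian likelihood and flat priors, and omitting the systematic term Δt(y).

```python
# Minimal sketch of a Bayesian update p(x|sigma,M) ∝ p(sigma|x,M) p(x|M) for
# the quadratic toy model above. Assumptions (not from the slides): Gaussian
# likelihood, flat priors, no systematic term Delta_t(y).
import numpy as np

rng = np.random.default_rng(0)

# "True" parameters and simulated experimental data sigma
a_t, b_t, c_t = 1.0, 0.5, -0.2
d = 0.05                                   # width of the uniform random interval
y = np.linspace(0.0, 2.0, 20)
f_true = a_t + b_t * y + c_t * y**2
data = f_true * (1.0 + d * rng.uniform(-1.0, 1.0, y.size))

def log_likelihood(p):
    """Gaussian approximation to the ~5% relative scatter of the data."""
    a, b, c = p
    model = a + b * y + c * y**2
    sig = d * np.abs(model) + 1e-12
    return -0.5 * np.sum(((data - model) / sig) ** 2)

def log_prior(p):
    """Flat prior p(x|M) on a generous box."""
    return 0.0 if np.all(np.abs(p) < 10.0) else -np.inf

# Metropolis random walk, started from a least-squares fit
p = np.polyfit(y, data, 2)[::-1]           # coefficients ordered as (a, b, c)
lp = log_likelihood(p) + log_prior(p)
samples = []
for _ in range(20000):
    q = p + 0.02 * rng.standard_normal(3)
    lq = log_likelihood(q) + log_prior(q)
    if np.log(rng.uniform()) < lq - lp:    # accept with prob min(1, e^(lq-lp))
        p, lp = q, lq
    samples.append(p.copy())

post = np.array(samples[5000:])            # discard burn-in
for name, col in zip("abc", post.T):
    print(f"{name} = {col.mean():+.3f} ± {col.std():.3f}")
```

The posterior standard deviations are the assigned uncertainties; with real data, the evidence p(σ|M) would additionally allow comparison between competing models M.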

Page 11: Meeting Objectives

IAEA

Theoretical cross-sections without uncertainty estimates

What is the best way to assess the quality of theoretical data without physical measurements?

"The Low-Energy, Heavy-Particle Collisions: A Close-Coupling Treatment," Kimura and Lane, Adv. At. Mol. Opt. Phys. 26, 79 (1989)

Page 12: Meeting Objectives

IAEA

5 Steps in Uncertainty Evaluation

1. Model the measurement: y = f(x_1, x_2, …, x_N)

2. Identify the uncertainty components for each input quantity: u(x_{i,1}), u(x_{i,2}), u(x_{i,3}), …

3. Evaluate the standard uncertainty u(x_i) of each input, by Type A (statistical: u = s, or u = s/√n for a mean of n readings) or Type B evaluation

4. Combine the standard uncertainties of the input quantities, weighted by the sensitivity coefficients ∂f/∂x_i:

u_c²(y) = Σ_{i=1}^{N} (∂f/∂x_i)² u²(x_i)

5. Form the expanded uncertainty with a coverage factor k (typically k = 2):

U = k u_c(y)
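The five steps map directly onto a few lines of code. A minimal sketch (an assumed illustration with a made-up measurement model, not an official GUM implementation): sensitivity coefficients come from central finite differences, and the inputs are taken as uncorrelated, as in the combination formula above.

```python
# Minimal sketch of GUM steps 1-5: model y = f(x1,...,xN), standard
# uncertainties u(x_i), combined uncertainty u_c(y) via sensitivity
# coefficients, expanded uncertainty U = k*u_c(y). Inputs assumed uncorrelated.
import numpy as np

def combined_uncertainty(f, x, u, k=2.0, eps=1e-6):
    """Return (y, u_c, U) for y = f(x) with standard uncertainties u."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    y = f(x)
    uc2 = 0.0
    for i in range(x.size):
        h = eps * max(abs(x[i]), 1.0)
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        c_i = (f(xp) - f(xm)) / (2.0 * h)   # sensitivity coefficient df/dx_i
        uc2 += (c_i * u[i]) ** 2            # step 4: quadrature combination
    uc = np.sqrt(uc2)
    return y, uc, k * uc                    # step 5: coverage factor k

# Hypothetical measurement model (for illustration only):
# a rate coefficient y = x1 * exp(-x2 / x3)
f = lambda x: x[0] * np.exp(-x[1] / x[2])
y, uc, U = combined_uncertainty(f, x=[2.0e-8, 10.0, 5.0], u=[1.0e-9, 0.5, 0.2])
print(f"y = {y:.3e}, u_c(y) = {uc:.2e}, U (k=2) = {U:.2e}")
```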

Page 13: Meeting Objectives

IAEA

3rd Code Centre Network TM 2013

• Strategies to develop guidelines for uncertainty estimates of theoretical data

• Guidelines depend on the target, the resolution, and the observable of interest (the QOI of the NRC report)

• Atomic structure:
  • State descriptions, operators, basis sizes, basis parameters, sensitivity
  • Special volume in the "Atoms" journal: 5 papers on the topic

• Atomic collisions:
  • Highly accurate, computationally intensive codes vs production codes
  • Benchmark results, basis sets, different methods, consistency checks

• Molecular collisions:
  • Target, resonances, different methods, consistency checks

Page 14: Meeting Objectives

IAEA

IAEA-ITAMP TM : Uncertainty Assessment for Theoretical Atomic and Molecular Scattering Data

• Bring together a number of people who are working on electron collisions with atoms, ions, and molecules, heavy-particle collisions, and electronic structure of atoms and molecules (~ 25 Participants)

• Come up with reasonable uncertainty estimates for calculations using the various methods of collision physics: perturbative, nonperturbative, time-independent, time-dependent, semi-classical, etc.

• Desirable output: guidelines for estimating uncertainties of theoretical atomic and molecular data

Page 15: Meeting Objectives

IAEA

Discussion sessions are extremely important!!!

Gordon Drake, "Uncertainty Estimates: A new Editorial Standard"

David Schultz, "Heavy Particle Collisions"

Detlev Reiter, "Status of database and data analysis for fusion edge plasma transport studies"

Tom Kirchner, "Heavy-particle collisions involving many active electrons: how (in-)accurate are our calculated cross sections?"

Heavy particle Discussion, Discussion Leader: David Schultz

Yuri Ralchenko, "Data quality and needs for collisional-radiative modeling"

Predrag Krstic, "Sensitivity, Error and Uncertainty Quantification: Interfacing Models at Different Scales"

Attila Csaszar, "Uncertainties of Molecular Structural Parameters"

Alex Kramida, "Assessing Uncertainties of Theoretical Atomic Transition Probabilities with Monte Carlo Random Trials"

Luis Mendez, "Uncertainties of electron capture cross sections in Be4+ + H(1s) collisions"

Daniel Haxton, "Calculation of dissociative electron attachment cross sections"

e-Atom Discussion, Discussion Leader: Klaus Bartschat

Connor P. Ballance, "The propagation of correlated uncertainties from theoretical collisional calculations through to plasma spectral diagnostics"

Jimena D. Gorfinkiel, "Calculations of electron-molecule scattering cross sections using the R-matrix method"

e-Molecule Discussion, Discussion Leader: Jonathan Tennyson

James Colgan, "Time-Dependent Close-Coupling Methods for Electron-Atom/Molecule Scattering"

Karel Houfek, "Uncertainties in calculations of low-energy resonant electron collisions with diatomic molecules"

IAEA Discussion and Conclusions, Hyun-Kyung Chung

Al Stauffer, "An Evaluation of the Accuracy of the Relativistic Distorted Wave Approximation for the Calculation of Cross Sections for the Excitation of Atoms by Electron Impact"

Marco A. P. Lima, "Low energy electron scattering data for chemical plasma treatment of biomass"

Meeting Adjourn

Oleg Zatsarinny, "Benchmark calculations for electron collisions with complex atoms"

Robert Lucchese, "Electron Collisions with Molecular Ions"

Time for open or follow-on discussions

Dmitry Fursa, "Convergent Close-Coupling Calculations for Electron-Atom/Molecule Scattering"

Viatcheslav Kokoouline, "Development of a procedure for the uncertainty evaluation in theoretical calculations of cross sections and rate coefficients"

Page 16: Meeting Objectives

IAEA

SUPPLEMENTARY SLIDES

Page 17: Meeting Objectives

IAEA

Community Role: Consensus Building

• Change of notions: Databases → Data research

• Teach the younger generation (early-career researchers) that data evaluation is a critical part of scientific work

• Disseminate materials to train students and researchers in "critical analysis skills"

• Disseminate the standard definitions of terminologies adopted by international organizations (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.):
  • VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007
  • GUM (Guide to the Expression of Uncertainty in Measurement), 2008

• Agree on the evaluation procedure, working towards standard reference data

Page 18: Meeting Objectives

IAEA

Common Workflow Guidelines

Advantages: easier to expand the evaluators' network to include early-career researchers.

Introduces more rigorous evaluation procedures and increases the dependability of the evaluation.

Disadvantages: the quality of the evaluation critically depends on the experience of the evaluators.

Different people may reach different conclusions using the same guidelines, so the results may not be reproducible.

Solutions: collaboration can help reduce these disadvantages. Evaluation activities with scientific advisors and editorial panels would be a good mechanism for producing an evaluated data library.

Workflow of critical evaluation of data on wavelengths and energy levels (NIST)

Page 19: Meeting Objectives

IAEA

Evaluation by the Community

• Community consensus, with endorsement from the IAEA or other international authorities

• Group evaluation: 4-5 panelists, junior and senior, like a journal editorial board, with broad backgrounds (experimentalists, theoreticians, data producers and users)

• Self-evaluation: by data producers, who in some cases have the deepest knowledge; may work better for theoretical data sets

• Establishment of evaluation guidelines: these will evolve with time and experience, through broad collaboration with the community

• Merits of group evaluation:
  • Facilitates knowledge transfer to the younger generation
  • Review papers can be written about the evaluation work

Page 20: Meeting Objectives

IAEA

Experimental Data Evaluation

• Check list:
  • Uncertainty estimates or error assessment are critical
  • Self-consistency checked
  • Experimental techniques evaluated
  • Reputation of the data producer considered
  • Anomalies in some experimental processes (ro-vibrational / metastable states)

• Wish list:
  • Evaluation by a group of "established" experts with broad expertise
  • Engage the relative-measurement community for cross-section measurements
  • Provide recommended values where possible
  • Include a comparison with theory and an assessment of the overall status
  • Evaluation will lead to an understanding of the gaps in the field
  • Establish "benchmarks" where possible: a benchmark carries a stronger connotation than a recommendation

Page 21: Meeting Objectives

IAEA

Theoretical Data Evaluation

• No established assessment criteria exist for theoretical data

• Guidelines are needed for uncertainty estimates of theoretical data

• We should not try to give a single recipe for assessing uncertainties; still, there are several starting points:
  • There are prescriptions, such as energy grids for resonances and partial waves.
  • One may check convergence within a model and estimate uncertainties from assumptions made within the model (see the sketch below).
  • Comparison with experiments: this can be dangerous.
  • Comparison of different theories: if one theory is demonstrably better than the others, it may be given benchmark status.

• For scattering data, we should perhaps aim for "ideas" or "suggestions" rather than "guidelines".

• Theoreticians may already have an idea of the uncertainty estimates

• Journal policies can change the culture: PRA policies (G. Drake)
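As an example of the convergence-based estimate mentioned in the list above, here is a minimal sketch with hypothetical numbers: cross sections from successively larger basis sets are extrapolated geometrically, and the estimated remainder of the series is taken as the uncertainty.

```python
# Minimal sketch (hypothetical values) of estimating a theoretical uncertainty
# from convergence within a model: extrapolate the geometric tail of the
# differences between successive basis-set results.
import numpy as np

basis = np.array([10, 20, 40, 80])              # assumed basis sizes
sigma = np.array([1.520, 1.482, 1.467, 1.461])  # assumed cross sections (a.u.)

d = np.diff(sigma)            # successive differences
r = d[-1] / d[-2]             # empirical convergence ratio (should be |r| < 1)
tail = d[-1] * r / (1.0 - r)  # geometric-series estimate of the remainder
value = sigma[-1] + tail
uncertainty = abs(tail)       # take the estimated remainder as the uncertainty

print(f"extrapolated: {value:.4f} ± {uncertainty:.4f}")
```

Such an estimate is only as good as the assumption of geometric convergence, which is exactly the kind of model-internal assumption the bullet above refers to.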

Page 22: Meeting Objectives

IAEA

Atomic structure and collision codes → fundamental data → processed data (rate coefficients) → CR transition matrix A = A_excit + A_radiative + A_ionis + A_cx + A_recomb + … → effective rates, population coefficients, cooling rates, beam stopping rates, …

Error propagation and sensitivity analysis: uncertainties in the “Data” & in the “Data Processing Toolbox”?

Inputs and tools along the chain: experimental data; velocity distributions (Boltzmann solver, Maxwellian); linear algebra and ODE solvers; Monte Carlo. Sensitivity and error propagation to the final model results (PDEs, integro-differential equations) are left to modelers and spectroscopists.
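One way to attack the "Data" half of this question is brute-force Monte Carlo: perturb the rate coefficients within assumed uncertainties, re-solve the populations each time, and look at the spread. A minimal sketch with a hypothetical 3-level rate matrix and a flat 20% relative uncertainty (both illustrative assumptions):

```python
# Minimal sketch of Monte Carlo error propagation through a collisional-
# radiative step. The 3-level rate matrix and the 20% relative uncertainty
# are hypothetical; A[i, j] is the total rate (s^-1) moving population j -> i.
import numpy as np

rng = np.random.default_rng(1)

A0 = np.array([[0.0,   2.0e3, 1.0e2],
               [5.0e3, 0.0,   4.0e3],
               [1.0e2, 3.0e3, 0.0]])
rel_unc = 0.20                        # assumed relative uncertainty per rate

def steady_state(A):
    """Solve dn/dt = 0 with the normalization sum(n) = 1."""
    M = A - np.diag(A.sum(axis=0))    # master-equation matrix, dn/dt = M n
    M[0, :] = 1.0                     # replace one (redundant) row by sum(n)=1
    rhs = np.zeros(A.shape[0])
    rhs[0] = 1.0
    return np.linalg.solve(M, rhs)

# Perturb every rate independently (Gaussian; negative rates are essentially
# impossible at 20%), re-solve, and collect the population spread.
pops = np.array([
    steady_state(A0 * (1.0 + rel_unc * rng.standard_normal(A0.shape)))
    for _ in range(2000)
])
for i, (m, s) in enumerate(zip(pops.mean(axis=0), pops.std(axis=0))):
    print(f"level {i}: n = {m:.3f} ± {s:.3f}")
```

The same loop extends naturally to derived quantities (line ratios, effective rates), which is where correlations between perturbed inputs start to matter.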

Page 23: Meeting Objectives

IAEA

Conceptual Changes of Uncertainties

~1984: A measure of the possible error in the estimated value of the measurand as provided by the result of a measurement (error approach)

~1992: An estimate characterizing the range of values within which the true value of a measurand lies (VIM 1)

~2007: Parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand (VIM 2)

2008~: Parameter that characterizes the dispersion of the quantity values that are being attributed to a measurand, based on the information used (VIM 3)

Page 24: Meeting Objectives

IAEA

Uncertainty Evaluation

(Fishbone diagram: sources of uncertainty feeding into the Result, with the variation of each parameter along each branch)

Sampling: extraction, stability, matrix

Analyst: ability, experience

Laboratory: temperature, humidity, pressure

Reagent: reference material, purity, impurity

Instrument: resolution, automation, drift

Data treatment, Calibration, Assumption → Result

Guide to the Expression of Uncertainty in Measurement (GUM), 1993, BIPM, IEC, IFCC, ISO, IUPAC, IUPAP, OIML