Uncertainty Quantification-driven Model-Based Engineering for DoD System Design and Evaluation

Sponsor: DASD(SE)
By Mr. Douglas Ray
5th Annual SERC Doctoral Students Forum
November 7, 2017
FHI 360 Conference Center, 1825 Connecticut Avenue NW, 8th Floor, Washington, DC 20009
www.sercuarc.org

UNCLASSIFIED. DISTRIBUTION A. Approved for Public Release; Distribution is unlimited.
Problem Statement

Problem: There is currently significant emphasis on, and need for, the use of computational modeling and simulation (M&S) as a key component of the development, test, and evaluation of Warfighter systems within the Department of Defense [1]. This work develops a framework for integrating M&S and UQ-based probabilistic methods into the DoD systems engineering process, and for leveraging M&S data to augment empirical models built from 'live' testing and experimentation (especially when such testing is expensive or resource-intensive) using Uncertainty Quantification techniques [2], with an emphasis on visual data assimilation methods.

The intent is to provide decision-makers with richer information for design decisions prior to prototype build: a simplified, credible approach to determine the utility of the M&S model in augmenting live testing, the need for additional testing, and the range of applicability for data augmentation relative to inherent system variation. The purpose is to inform the SE process, particularly physical and functional decomposition, concept selection, and system design-build-test, with accurate M&S-based predictions and test results.
Definitions
Model-Based Engineering (MBE):
An approach to engineering that uses models as an integral part of the technical baseline that includes the requirements, analysis, design, implementation, and verification of a capability, system, and/or product throughout the acquisition lifecycle.
Uncertainty Quantification (UQ):
The process of identifying all relevant sources of uncertainty; characterizing them in all models, experiments, and comparisons of M&S results with experiments; and quantifying the uncertainties in all relevant inputs and outputs of the simulation or experiment.
Systems Engineering Process
• Stakeholder Analysis
• Requirements Definition
• Concept Generation, System Architecture, Functional Decomposition
• Model & Physical Decomposition, Tradespace Exploration, Concept Selection
• Design / Build / Test
• System Integration, Interface Management
• Verification & Validation
• Transition / Lifecycle Management
Overarching Probabilistic M&S Framework: Digital Engineering

[Figure: framework diagram, reconstructed as a list]
• Computational Modeling & Simulation (e.g., aeroballistics, structural modeling, thermodynamics, materials & additive manufacturing, cost modeling, throughput/yield forecasting, reliability/availability, POFA, computational chemistry)
• Emulation / Reduced-Order Modeling
• 'Live' Testing & DOE-based Empirical Modeling
• Data Assimilation / V&V (M&S and empirical)
• Sensitivity Analyses: probabilistic methods, numerical optimization, error propagation, capability analysis, tradespace studies, inverse propagation, reliability-based design optimization
• Results: reliable, robust, optimized products and systems; credible and defensible engineering M&S analytics; reduced design-cycle iterations and time to field; identified opportunities to reduce manufacturing costs; reduced performance variation
[Figure: DOE planning 'P-diagram' for live testing]
• Functional responses: y1, y2, y3, ...
• Design parameters (x1, x2, x3, ...): weapon design variation
• Noises (z1, z2, z3, ...): weapon design variation, ammo temperature, firing rate, ammo type
• Uncontrollable but measurable factors (covariates): test humidity, test temperature, day
• Nuisance factors (not measurable or controllable; handled as blocks): operator, clean & lubricate cycle
• Held-constant factors: cooling time, test mount, firing rate, ammo lot

[Figure: M&S vs. test sensitivity comparison; Main Effect / Total Effect indices by factor]
APRESS_M 0.681 / 0.711; AWT 0.046 / 0.068; APRESSBB 0.041 / 0.063; BBANG 0.027 / 0.044; CANTLEN 0.026 / 0.042; BBSHAPE 0.018 / 0.030; ALENGTH 0.008 / 0.016; AIZZ 0.007 / 0.015

[Figure: capability histogram against a lower spec limit (LSL); axis ticks 56000, 62000, 68000]
$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_{12} x_1 x_2 + \varepsilon$$
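A minimal sketch of fitting this first-order-with-interaction model by ordinary least squares; the data here are synthetic, since the slide shows only the model form:

```python
# Hedged sketch: fit y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + error by
# ordinary least squares on synthetic data (the slide gives only the form).
import numpy as np

rng = np.random.default_rng(0)
n = 32
x1 = rng.uniform(-1, 1, n)                    # coded factor 1
x2 = rng.uniform(-1, 1, n)                    # coded factor 2
y = 2.0 + 1.5 * x1 - 0.8 * x2 + 0.6 * x1 * x2 + rng.normal(0, 0.1, n)

# Model matrix: intercept, main effects, two-factor interaction
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)                                   # estimates of b0, b1, b2, b12
```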
Variation Risk Priority Number (VRPN) worksheet (flattened table, reconstructed):

Noise factors, with (NF variation magnitude, sensitivity of DF to NF):
z1 - Temperature (5, 6); z2 - Wind (6, 4); z3 - Precipitation (8, 6); z4 - Air density (3, 3); z5 - Debris (1, 5)

NF VRPN = factor weighting × NF variation magnitude × sensitivity; DF VRPN = sum of NF VRPNs over the five noise factors.

#  Design Factor (KPC)  Associated Subsystem  Subsystem's Main Function     y1 (Horizontal Dispersion) weighting / DF VRPN   y2 (Vertical Dispersion) weighting / DF VRPN
1  x1 - Ogive           Projectile            Transfer energy to target     4 / 464    5 / 580
2  x2                   Projectile            Transfer energy to target     3 / 348    6 / 696
3  x3                   Primer                Initiate propellant           6 / 696    3 / 348
4  x4                   Propellant            Propel projectile             1 / 116    4 / 464
5  x5                   Cartridge case        Contain munition components   2 / 232    2 / 232
6  x6                   Propellant            Propel projectile             5 / 580    1 / 116
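The worksheet's arithmetic is simple enough to check in code. A sketch, assuming the relationship implied by the numbers above (NF VRPN = weighting × variation magnitude × sensitivity):

```python
# Sketch of the VRPN arithmetic implied by the worksheet above:
# NF VRPN = factor weighting * NF variation magnitude * sensitivity,
# DF VRPN = sum of NF VRPNs over the noise factors.
noise_factors = {                 # (variation magnitude, sensitivity)
    "z1 - Temperature":   (5, 6),
    "z2 - Wind":          (6, 4),
    "z3 - Precipitation": (8, 6),
    "z4 - Air density":   (3, 3),
    "z5 - Debris":        (1, 5),
}
weights_y1 = {"x1": 4, "x2": 3, "x3": 6, "x4": 1, "x5": 2, "x6": 5}

for df, w in weights_y1.items():
    df_vrpn = sum(w * mag * sens for mag, sens in noise_factors.values())
    print(df, df_vrpn)            # x1 -> 464, x3 -> 696, ... matching the table
```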
Area of Focus
[Figure: focus-area flow] 'Live' Testing → Empirical Model; M&S → Surrogate Model → Stratified Simulation; both feed a Combined Model → Assimilation / Visualization → Decision & Action
• 'Data Assimilation': integration of M&S and 'live' data
• How realistic and credible are the model predictions throughout the design space, relative to estimated variation?
• The decision-maker can identify low-reliance (high-risk) regions of the design space relative to random variation and propagated error across simulated model data
• Approach (a sketch of one of these techniques follows below): ensemble modeling, cross-validation, dimension projection, Monte-Carlo filtering, multidimensional data visualization
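As a concrete illustration of one listed technique, a minimal cross-validation sketch for an emulator credibility check; the data, model choice, and error metric are illustrative assumptions, not the study's actual pipeline:

```python
# Hedged sketch: K-fold cross-validation of a surrogate model as a
# credibility check. Data and model choice are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.random((100, 4))                       # stand-in simulation inputs
y = X @ np.array([1.0, -2.0, 0.5, 0.1]) + 0.05 * rng.standard_normal(100)

fold_errors = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
    gp = GaussianProcessRegressor(normalize_y=True).fit(X[train], y[train])
    fold_errors.append(np.abs(gp.predict(X[test]) - y[test]).mean())

print(np.mean(fold_errors))                    # mean held-out absolute error
```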
Case Study
• Munition example (anonymized for operational security reasons):
  ― Key performance parameter: long-range target engagement capability
  ― Engineering team executes pre-prototype M&S of various subsystems:
    o Aero, structural, interior ballistics, lethality, MBSE/functional architecture, etc.
Aero Case Study - Background
• A 6-DoF aeroballistic model is developed to verify that the tentative airframe design performs as intended across the trajectory
• What happens to our ability to meet the KPP (long-range target engagement capability, in terms of impact errors in the x- and y-directions and velocity) when we vary initial velocity, launch disturbances in the x- and y-axes, and spin rate of the munition (Hz), given the tentative design (canard/fin geometry, projectile geometry, CG, etc.)?
  ― Resulting velocity (velocity decay)
  ― Other unintended consequences to the system (pressures required to achieve velocity/range, and the impact of those pressures on system reliability / parts fatigue)
Case Study - Approach
• Objective: Study the impact of varying aero inputs on the outputs, then explore the tradespace to determine the aero solution that minimizes x- and y-dispersion errors and maximizes downrange velocity retention
• How: Simulate the model across various scenarios to support a DOE-based model emulator / surrogate model
  ― The emulator can then be used to rapidly execute what-if analysis, sensitivity analysis, optimization, and robustness analysis
Case Study - Analysis
• Simulation DOE
• Emulation / Empirical Model Fitting
• Numerical Optimization & Propagation of Error
• Monte-Carlo Simulation, Sensitivity Analysis & KPP validation
Simulation DOE
• 400-run sequential MaxPro Latin hypercube space-filling DOE (a hedged sketch of generating such a design appears after the factor table below)
• Colored by 'Velocity Delta' response (red = greatest velocity decay)

Input Variables:
Factor  Name              Units    Low   High
x1      MV                --       0.8   1.0
x2      Vert Launch Dist  Rad/sec  -6    6
x3      Horz Launch Dist  Rad/sec  -6    6
x4      Spin Rate         Hz       20    60

Output Variables:
y1  Delta Vel
y2  Final Vel
y3  Deflection Error (X)
y4  Altitude Error (Y)
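A minimal sketch of generating a comparable run matrix. SciPy's plain Latin hypercube stands in for the sequential MaxPro design actually used (MaxPro construction needs specialized software, so the sampler choice here is an assumption):

```python
# Hedged sketch: a 400-run Latin hypercube over the factor ranges above.
# A plain LHS is used as a stand-in for the sequential MaxPro design.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=4, seed=7)
unit = sampler.random(n=400)                     # 400 points in [0, 1]^4
lows  = [0.8, -6.0, -6.0, 20.0]                  # MV, vert dist, horz dist, spin
highs = [1.0,  6.0,  6.0, 60.0]
design = qmc.scale(unit, lows, highs)            # 400 x 4 run matrix
print(design[:3])                                # first three candidate runs
```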
[Figure: scatterplot matrix of the DOE runs and responses — Block, MV, vertical plane launch disturbance, horizontal plane launch disturbance, Spin Rate, Defl Error, Alt Error, Total Miss, Terminal V]
[Figure: overall contribution vs. predictor (axis 0.0-0.6) — Spin Rate, horizontal plane launch disturbance, vertical plane launch disturbance, MV]
Response Emulator
• Fit a simulation-data emulator for each response using a Gaussian Process model (Kriging model with Gaussian correlation function); a sketch appears after the model reports below
• MV contributes the most variation to the responses, and Spin Rate contributes the least
• Strong interaction effect between MV and the launch disturbances
  ― 'Hypersensitivity' to launch disturbance as MV increases beyond ~0.95

Gaussian Process Model of Defl Error
[Figure: actual by jackknife-predicted plot, -150 to 150]

Model Report (fit using the Gaussian correlation function; nugget parameter set to avoid a singular variance matrix):
Column                               Theta      Total Sensitivity  Main Effect
MV                                   50.014006  0.2981349          0.0002779
Vertical plane launch disturbance    0.0386488  0.2384236          0.089576
Horizontal plane launch disturbance  0.0349819  0.6789286          0.468418
Spin Rate                            0.0000168  0.0091837          0.0000562

Interaction sensitivities: MV × Vertical 0.1171111; MV × Horizontal 0.1804789; MV × Spin Rate 0.0002671; Vertical × Horizontal 0.0264538; Vertical × Spin Rate 0.0052827; Horizontal × Spin Rate 0.0035778
μ = 52.206007; σ² = 58815.66; Nugget = 0.001; -2·LogLikelihood = 3319.9344
Gaussian Process Model of Alt Error
[Figure: actual by jackknife-predicted plot, -120 to 60]

Model Report (fit using the Gaussian correlation function; nugget parameter set to avoid a singular variance matrix):
Column                               Theta      Total Sensitivity  Main Effect
MV                                   40.412336  0.3639803          0.0948706
Vertical plane launch disturbance    0.0212357  0.6301243          0.4431235
Horizontal plane launch disturbance  0.0319107  0.2123734          0.0869223
Spin Rate                            1.2247e-5  0.0087911          0.0002149

Interaction sensitivities: MV × Vertical 0.166261; MV × Horizontal 0.1027922; MV × Spin Rate 5.6512e-5; Vertical × Horizontal 0.0174395; Vertical × Spin Rate 0.0033002; Horizontal × Spin Rate 0.0052194
μ = -68.65742; σ² = 48382.389; Nugget = 0.001; -2·LogLikelihood = 3127.8517
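A minimal sketch of fitting a comparable Kriging-style emulator. Here scikit-learn's Gaussian Process with an RBF (Gaussian) kernel plus a small nugget is an assumed stand-in for the reports above, with placeholder data in place of the 6-DoF runs:

```python
# Hedged sketch: Gaussian Process (Kriging) emulator with a Gaussian/RBF
# correlation function and a small nugget, mirroring the reports above.
# Inputs and outputs are placeholders for the actual 6-DoF DOE runs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
X = rng.random((400, 4))                          # DOE settings (stand-in)
y = np.sin(3 * X[:, 0]) + X[:, 2] ** 2 + 0.01 * rng.standard_normal(400)

kernel = RBF(length_scale=[1.0] * 4) + WhiteKernel(noise_level=1e-3)  # nugget
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mean, sd = gp.predict(rng.random((5, 4)), return_std=True)
print(mean, sd)            # predictions with uncertainty for what-if studies
```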
Robust Optimization

• Using Robust Parameter Design principles (related to 'Taguchi methods'), Propagation of Error (POE) analysis takes the partial derivatives of the response function with respect to the 'noise' variables in order to minimize the transmission of variability to the responses
• Numerical optimization: set the optimization goals and constraints such that:
  ― Launch disturbances are treated as 'noise' variables, with N~(0, 1.5)
  ― Target zero value for Alt and Defl errors
  ― Maximize final velocity
  ― Target zero value for all partial derivatives (zero slope = flat / insensitive regions)
• Key takeaway: the robust-optimal 'sweet-spot' setting is at MV = 0.85 when SR = 40 (giving a terminal velocity of 0.80), with some margin in MV
$$\operatorname{Var}_Z\big[y(\mathbf{x},\mathbf{z})\big] \;=\; \sum_{i=1}^{r}\left(\frac{\partial y(\mathbf{x},\mathbf{z})}{\partial z_i}\right)^{2}\sigma_{z_i}^{2} \;+\; \sigma^{2}$$

(Total output variation = input sensitivity, the squared partial derivative with respect to each noise z_i, times input variation σ²_{z_i}, summed over the r noise variables, plus residual variation σ².)
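A sketch of evaluating this formula numerically, with central-difference partial derivatives standing in for analytic ones; the response function f and the sigma values are illustrative assumptions:

```python
# Hedged sketch of the POE variance formula above, using central
# differences for the partial derivatives. f and the sigmas are assumptions.
import numpy as np

def f(x, z):                                      # stand-in response emulator
    return x[0] ** 2 + 3.0 * x[0] * z[0] - 2.0 * z[1] + x[1]

def poe_variance(f, x, z0, sigma_z, sigma_resid, h=1e-5):
    var = sigma_resid ** 2                        # residual variation
    for i in range(len(z0)):
        zp, zm = z0.copy(), z0.copy()
        zp[i] += h
        zm[i] -= h
        dydz = (f(x, zp) - f(x, zm)) / (2 * h)    # dy/dz_i, central difference
        var += dydz ** 2 * sigma_z[i] ** 2        # transmitted noise variation
    return var

x = np.array([0.85, 40.0])                        # candidate design setting
print(poe_variance(f, x, np.zeros(2), np.array([1.5, 1.5]), 0.1))
```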
[Figure: prediction profiler at the robust-optimal settings — MV = 0.8507, vertical and horizontal plane launch disturbances = 0, Spin Rate = 39.997; predicted Terminal V ≈ 0.806, overall desirability ≈ 0.987; profiles shown for Defl Error, Alt Error, Terminal V, Delta V, and their partial derivatives]
Dispersion Error MC Simulation

• 20,000 Monte-Carlo simulations treating the launch disturbances as random variables (a sketch follows the figures below)
• Downrange dispersion errors resulting from setting MV to 0.85, 0.90, 0.95, and 1.00
• Individual data points are colored by velocity decay (smallest to largest, from blue through grey to red)
[Figure: Alt Error vs. Defl Error (meters), paneled by MV = 0.85, 0.90, 0.95, 1.00; axes roughly -140 to 160]
[Figure: Delta V prediction formula (0.04-0.10) vs. MV for the same four MV settings]
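A minimal sketch of this Monte-Carlo study, assuming the 1.5 in N~(0, 1.5) is a standard deviation and substituting a toy closed-form emulator for the fitted GP models:

```python
# Hedged sketch: 20,000 Monte-Carlo draws of the launch disturbances
# pushed through an emulator at fixed MV settings. The emulator here is a
# toy stand-in that mimics the reported hypersensitivity above MV ~ 0.95.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

def emulate(mv, vert, horz):                      # stand-in for the GP emulators
    gain = 5.0 + 60.0 * max(mv - 0.95, 0.0)       # toy sensitivity to disturbance
    return gain * horz, gain * vert               # (defl, alt) errors

for mv in (0.85, 0.90, 0.95, 1.00):
    vert = rng.normal(0.0, 1.5, n)                # N(0, 1.5) assumed as std dev
    horz = rng.normal(0.0, 1.5, n)
    defl, alt = emulate(mv, vert, horz)
    print(mv, defl.std(), alt.std())              # dispersion grows with MV
```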
Case Study Outputs / Conclusions
• Results will inform:
  ― Tentative design and the probability of meeting the KPP
    o Decreasing MV creates some performance margin with respect to other performance parameters, and robustness/insensitivity to the presence of system 'noises'
  ― Decisions regarding other attributes at the system level (target effects, structural reliability, etc.)
  ― Integration with other subsystem-level emulators to support system-level digital evaluation using a hierarchical meta-model
    o To overcome hypersensitivity to launch disturbances at higher launch velocities, we can reconfigure the projectile design (e.g., adjust CG to achieve better stability at higher MV)
  ― Fusion of modeling data and predictions via the emulator with 'live' test data at the subsystem and/or system level upon prototype design / build / test
    o Visualization-based data assimilation (validation or calibration)
Literature

1. Gilmore, M. Director of OT&E Memorandum: Guidance on the Validation of Models and Simulation Used in Operational Test and Live Fire Assessments, 14 March 2016.
2. National Research Council, Committee on Mathematical Foundations of Verification, Validation, and Uncertainty Quantification (2012). Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification. National Academies Press.
3. Wu, C. F. J., and Hamada, M. Experiments: Planning, Analysis, and Optimization, 2nd ed., John Wiley & Sons.
4. Myers, R., Montgomery, D., and Anderson-Cook, C. (2009). Response Surface Methodology, 3rd ed., John Wiley & Sons.
5. Sacks, J., Welch, W. J., Mitchell, T. J., and Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 409-423.
6. Joseph, V. R. (2016). Space-filling designs for computer experiments: a review. Quality Engineering, 28:1, 28-35.
7. Ba, S., Myers, W. R., and Brenneman, W. A. (2015). Optimal Sliced Latin Hypercube Designs. Technometrics, 57:4, 479-487.
8. Kennedy, M. C., and O'Hagan, A. (2001). Bayesian calibration of computer models. Journal of the Royal Statistical Society B, 63:425-464.
9. Reinman, G., et al. (2012). Design for Variation. Quality Engineering, 24:317-345.