
Page 1

Validation Methodology for Agent-Based Simulations Workshop

DoD Validation Baseline

Ms. Lisa Jean Moya
WernerAnderson, Inc.
01 May 2007

Page 2

Outline

• Validation defined

• General approach

• Issues for ABS validation

Page 3

Outline

• Validation defined

• General approach

• Issues for ABS validation

Page 4

DoD Definitions (DODI 5000.61)

• Verification: The process of determining that a model implementation and its associated data accurately represents the developer’s conceptual description and specifications
• Validation: The process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model
• Accreditation: The official certification that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose

The workshop focus is Validation

Page 5

Utility of Validation

• Military analysis requires the capability to evaluate an environment dominated by non-physical effects
  - Cold War analysis is not sufficient
  - Fighting the last war is not good enough
• Subject matter expertise needs codification and expansion
• Make appropriate use of M&S
  - Avoid using bad M&S/analysis
  - Avoid throwing out good M&S/analysis

Page 6

DoD 5000 on M&S VV&A

• Much attention paid to “principals” but little to “principles”
  - Provides DoD authoritative definitions
  - Little emphasis on the “how’s”
• Policies and procedures for M&S applications at the DoD Component level
  - Allows the tailoring of VV&A policies and procedures to the needs of the user
  - Likely to result in inconsistencies – little to no standardization of TTPs

Page 7

Outline

• Validation defined

• General approach

• Issues for ABS validation

Page 8

Validation Steps
(DMSO, VV&A Recommended Practices Guide – Validation Special Topic)

[Figure: validation steps flow]
• Inputs: user objectives, subject system information, available referents
• Steps: characterize requirements, characterize system, select referent, compare subject & requirements, compute accuracies
• Output: validation results

V&V process steps:
• Verify M&S requirements
• Develop V&V plan
• Validate conceptual model
• Verify design
• Verify implementation
• Validate results
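To make the "compute accuracies" step concrete, the sketch below compares simulation output against referent data for a handful of measures and reports whether each falls within its required accuracy tolerance. This is a minimal illustration; the measure names, tolerances, and values are hypothetical and not part of the source deck.

# Minimal sketch: compare simulation output to a referent and compute accuracies.
# Measure names, tolerances, and values are hypothetical.

referent = {            # values taken from the selected referent (e.g., exercise data)
    "detection_range_km": 12.0,
    "mean_engagement_time_s": 95.0,
}
simulation = {          # corresponding measures produced by the model under test
    "detection_range_km": 11.1,
    "mean_engagement_time_s": 104.0,
}
tolerances = {          # acceptable relative error, drawn from the M&S requirements
    "detection_range_km": 0.10,
    "mean_engagement_time_s": 0.15,
}

for measure, required in tolerances.items():
    rel_error = abs(simulation[measure] - referent[measure]) / abs(referent[measure])
    verdict = "PASS" if rel_error <= required else "FAIL"
    print(f"{measure}: relative error {rel_error:.2%} (tolerance {required:.0%}) -> {verdict}")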

Page 9

General Process: Problem Solving Process
(Adapted from DMSO, VV&A Recommended Practices Guide)

[Figure: the accreditation and V&V processes overlaid on the general problem-solving, M&S use, and M&S development & preparation processes]
• Problem solving process: define problem, establish objectives, select approaches (M&S and non-M&S methods), execute & prepare results, analyze results, apply results, accept & record in a repository
• M&S use process: define M&S requirements, plan approach, prepare M&S for use, make accreditation decision (yes/no)
• Accreditation process: develop accreditation plan, collect and evaluate accreditation information, perform accreditation assessment
• V&V process: verify M&S requirements, develop V&V plan, validate conceptual model, verify design, verify implementation, validate results; perform V&V activities appropriate for the M&S category
• M&S development & preparation process
  - Develop new M&S: determine M&S requirements, plan M&S development, develop conceptual model, develop & test design, implement & test M&S
  - Modify legacy M&S: determine modification requirements, plan modifications, modify conceptual model, modify & test modification design, implement & test M&S modifications
  - Construct federation: determine federation requirements, plan federation construction, develop federation conceptual model, develop & test design, integrate & test federation
• Validation assessment notes
  - Basic representation: empirical assessment against another model (mathematical, simulation, formalism), a historical event, or a live experiment, via SME/Turing, statistical, or metric comparison
  - Effect of interactions: assessment of appropriate referents, the rule set (alone & in the composition), instantiation, interpretation, and trajectory

Page 10

Overlap Between Domain Areas & Requirements
(Adapted from DMSO, VV&A Recommended Practices Guide – Requirements Special Topic)

[Figure: M&S requirements arise where the user, problem, and simulation domains overlap]
• User domain
  - Use cases – e.g., scenario
  - Representation fidelity
  - Mission, enemy, terrain, troops, time available (METT-T)
  - Behaviors, tactics
• Problem domain
  - Application types – analysis, training, acquisition
  - Physics – laws, forces, systems
  - Representational requirements: performance & behaviors of real entities
  - Missions, doctrine, operations, rules of engagement/deployment
• Simulation domain
• M&S requirements: real-world based; implement functions & features

Page 11

Finding a Referent

• Experimental data
• Empirical data
• Experience, knowledge, and intuition of SMEs
• Validated mathematical models
• Qualitative descriptions
• Other simulations
• Combinations of the types described above

Conceptual model = Content and internal representations of the M&S; includes logic and algorithms; recognizes assumptions and limitations

DMSO, VV&A Recommended Practices Guide – Validation Special Topic

Page 12

Human Behavior Model Referents

• SMEs

• Empirical observations or experimental data from actual operations

• Models of human behavior

• Models of physiological processes

• Models of sociological phenomena

• Simulations of human behavior

Page 13

When a Referent Doesn’t Exist

• Assemble from known components of the system or procedure

• Assemble from known basic phenomena underlying the system’s behavior

• Build a scale model of the system or its components and perform experiments

• Use the referents for a similar existing system or similar situations

DMSO, VV&A Recommended Practices Guide – Validation Special Topic

Page 14

Conceptual Model Components
(DMSO, VV&A Recommended Practices Guide – Conceptual Model Special Topic)

The model should be as simple as possible, but not too simple

[Figure: requirements flow into the conceptual model of the simulation environment, which flows into specifications; the model's objects comprise:]
• Data/Nouns (inputs and outputs): attributes, resources, behavior states
• Actions/Activities/Verbs: functions & algorithms that create/change data and create additional actions
• Environment: constraints, relationships, geometry
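One way to make these components tangible is to hold them in explicit data structures, so the data (nouns), actions (verbs), environment, and recorded assumptions can each be reviewed against the requirements. The classes and field names below are a hypothetical sketch, not drawn from the DMSO guide.

from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class EntityData:
    # Data/Nouns: inputs and outputs of the model
    attributes: dict
    resources: dict
    behavior_state: str

@dataclass
class Action:
    # Actions/Activities/Verbs: functions & algorithms that create/change data
    name: str
    apply: Callable[[EntityData], EntityData]

@dataclass
class Environment:
    # Environment: constraints, relationships, geometry
    constraints: list
    relationships: dict
    geometry: dict

@dataclass
class ConceptualModel:
    objects: list = field(default_factory=list)       # EntityData instances
    actions: list = field(default_factory=list)       # Action instances
    environment: Optional[Environment] = None
    assumptions: list = field(default_factory=list)   # recorded so they can be reviewed during validation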

Page 15

Conceptual Model Analysis

• Test/analyze component algorithms of overall model to validate each individually
  - Mathematical analysis
  - Results of component algorithms should match available data
  - Increases confidence that interactions of the collected algorithms (i.e., the overall model) are valid
• Algorithm testing (sketched below)
  - 3rd party program (e.g., Excel)
  - Should examine a range of data
• Assumption testing (supplementary or alternative approach)
  - Determine assumptions (rarely stated) – structural, causal, and mathematical
  - Identify operational impacts of assumptions relative to intended application
  - Determine acceptability of operational impacts with Application Sponsor (Accreditation Authority)
• If they exist, unexpected/emergent interactions should appear in model output
• However, interactions between algorithms may not be addressed
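As a concrete illustration of component-level algorithm testing, the sketch below exercises a single model algorithm across a range of inputs and checks its results against independently computed reference values (as one might tabulate in a spreadsheet). The algorithm here, a simple exponential attrition formula, and the tolerance are hypothetical stand-ins, not taken from the deck.

import math

def attrition_fraction(rate_per_hour, hours):
    # Component algorithm under test: a simple exponential attrition model (hypothetical).
    return 1.0 - math.exp(-rate_per_hour * hours)

# Reference results computed independently (e.g., in a 3rd-party spreadsheet),
# spanning a range of inputs rather than a single nominal case.
reference = {
    (0.01, 1.0): 0.009950,
    (0.10, 8.0): 0.550671,
    (0.05, 24.0): 0.698806,
    (0.50, 24.0): 0.999994,
}

for (rate, hours), expected in reference.items():
    got = attrition_fraction(rate, hours)
    assert abs(got - expected) <= 1e-4, f"mismatch at rate={rate}, hours={hours}: {got} vs {expected}"
print("Component algorithm matches the reference data across the tested range.")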

Page 16

V&V Technique Taxonomy

• Informal
  - Determine “reasonableness”
  - Most commonly used, subjective
  - Audit, review, face validation, inspection, Turing test
• Static
  - Assess accuracy of design
  - Automated tools available
  - Analyses: semantic/structural, data/control, interface, traceability
• Dynamic (statistical comparison sketched below)
  - Assess model execution
  - Requires model instrumentation
  - Tests: acceptance, fault/failure, assertion, execution, regression, predictive validation, structure, sensitivity, statistical
• Formal
  - Complex, time consuming
  - Induction, inference, predicate calculus, proof of correctness

How much V&V depends on budgetary considerations, significance of supported decisions, and the risk of inaccuracy.

DMSO, VV&A Recommended Practices Guide – V&V Techniques Special Topic
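To illustrate one statistical test from the dynamic category, the sketch below compares a sample of simulated engagement times against referent observations with a two-sample Kolmogorov–Smirnov test from scipy. The data here are synthetic placeholders; in practice both samples would come from instrumented model runs and the selected referent.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(seed=1)

# Placeholder samples: referent observations vs. instrumented model output.
referent_times = rng.normal(loc=95.0, scale=12.0, size=200)    # e.g., exercise data
simulated_times = rng.normal(loc=99.0, scale=14.0, size=500)   # e.g., model output

statistic, p_value = ks_2samp(referent_times, simulated_times)
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")

# A small p-value suggests the model's output distribution differs from the referent
# by more than sampling variation alone would explain.
if p_value < 0.05:
    print("Distributions differ significantly: investigate before accepting the model.")
else:
    print("No significant difference detected at the 0.05 level.")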

Page 17

7 Recommended Techniques
(DA-PAM 5-11, Verification, Validation, and Accreditation of Army M&S)

• Face validation
  - SME review
• Comparison to other M&S
  - Legacy, non-Government, alternative formulation
• Functional decomposition
  - Validating the parts, assuming the whole
• Sensitivity analyses (sketched below)
  - Run boundary conditions
• Visualization
  - Output appears to match intent
• Turing tests
  - “If it walks like a duck, …”
• Modeling-test-model
  - Anticipate, experiment, refine

Each technique has its drawbacks
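A minimal sketch of a sensitivity analysis that runs a model at the boundary values of its input parameters and reports how strongly the output responds. The run_model function and the parameter ranges are hypothetical; the point is the sweep structure, not the particular model.

from itertools import product

def run_model(detection_prob, speed_kph):
    # Hypothetical stand-in for a simulation run; returns a single output measure.
    return detection_prob * 100.0 - 0.2 * speed_kph

# Parameter ranges from the M&S requirements; boundary conditions are the endpoints.
ranges = {
    "detection_prob": (0.1, 0.9),
    "speed_kph": (10.0, 60.0),
}

results = {}
for combo in product(*ranges.values()):
    params = dict(zip(ranges.keys(), combo))
    results[combo] = run_model(**params)
    print(params, "->", round(results[combo], 2))

# Crude sensitivity measure: output spread across the boundary combinations.
spread = max(results.values()) - min(results.values())
print(f"Output spread across boundary conditions: {spread:.2f}")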

Page 18

Intuition vs. Data

Intuition
• Results match intuitive expectations
• Dynamic technique
• Results
  - SMEs use intuition and estimates of expected behaviors and outputs
  - Model and system behaviors considered subjectively
• Best used in early stages of development
• Issues
  - Dependent on experience with the system being modeled to provide intuitive expectations
  - Subject to human error
  - Difficult to predict unexpected/emergent behaviors based on intuition/experience

Data
• Results match data from past experience
  - Historical, exercise, other models
• Dynamic technique
• Reasonable results
  - Predictive validation – results provide a reasonable prediction of subsequent real-world behavior/results (sketched below)
  - Historical/exercise/model data should generate outputs similar to associated results
• Models should be consistent
  - Multiple models for the same system should produce the “same” results from the “same” data
  - Systematic biases will not be detected
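As a sketch of predictive validation, the snippet below feeds recorded historical scenario inputs to the model and checks whether its predictions fall within a stated tolerance of the outcomes that were actually observed. The scenario records, the model_predict function, and the tolerance are hypothetical illustrations, not data from the deck.

# Each record pairs historical scenario inputs with the outcome actually observed.
historical_cases = [
    {"inputs": {"force_ratio": 1.5, "terrain": "open"},  "observed_loss_pct": 18.0},
    {"inputs": {"force_ratio": 0.8, "terrain": "urban"}, "observed_loss_pct": 41.0},
]

def model_predict(force_ratio, terrain):
    # Hypothetical stand-in for the model's prediction of percent losses.
    base = 30.0 if terrain == "urban" else 22.0
    return base / force_ratio

tolerance_pct_points = 8.0   # acceptable absolute error, set by the intended use

for case in historical_cases:
    predicted = model_predict(**case["inputs"])
    error = abs(predicted - case["observed_loss_pct"])
    verdict = "within tolerance" if error <= tolerance_pct_points else "OUT OF TOLERANCE"
    print(f"{case['inputs']}: predicted {predicted:.1f}%, "
          f"observed {case['observed_loss_pct']:.1f}% -> {verdict}")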

Page 19

Outline

• Validation defined

• General approach

• Issues for ABS validation

Page 20

Agent Validation
(DMSO, VV&A Recommended Practices Guide – Human Behavioral Representation (HBR) Special Topic; Moya & Tolk, Toward a Taxonomy of Agents & MAS)

• Evaluate
  - Conceptual model design
  - Knowledge Base
  - Engine and Knowledge Base implementation
  - Integration with simulation environment

[Figure: agent architecture. The agent's behavior engine (perception, reasoning/decision-making, reactivity, goals, beliefs, memory, communication, action) draws on an internal state representation, a knowledge base, and a representation of the simulated world; sensed input and state information flow from the simulated world into the agent, action output and state changes flow back, and dependency functions/results link the knowledge base to the world representation.]
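To ground the architecture in code, here is a minimal, hypothetical agent skeleton with the pieces named on the slide: a knowledge base, an internal state (beliefs, memory), and a behavior engine that turns sensed input into action output. It is a sketch of the structure a validator would instrument and evaluate, not an implementation from the source.

class Agent:
    def __init__(self, knowledge_base):
        self.knowledge_base = knowledge_base   # rules/facts the behavior engine consults
        self.internal_state = {"beliefs": {}, "memory": []}

    def perceive(self, sensed_input):
        # Update beliefs and memory from what the simulated world reports.
        self.internal_state["beliefs"].update(sensed_input)
        self.internal_state["memory"].append(sensed_input)

    def decide(self):
        # Behavior engine: evaluate knowledge-base rules against current beliefs.
        for condition, action in self.knowledge_base:
            if condition(self.internal_state["beliefs"]):
                return action
        return "idle"

    def step(self, sensed_input):
        # One sense-decide-act cycle; the returned action is applied to the simulated world.
        self.perceive(sensed_input)
        return self.decide()

# Hypothetical knowledge base: (condition, action) pairs evaluated in order.
kb = [
    (lambda b: b.get("threat_range_km", 99) < 5.0, "evade"),
    (lambda b: b.get("objective_visible", False), "advance"),
]

agent = Agent(kb)
print(agent.step({"threat_range_km": 3.2}))                              # -> evade
print(agent.step({"threat_range_km": 12.0, "objective_visible": True}))  # -> advance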

Page 21

Agent System Validation
(Moya & Tolk, Toward a Taxonomy of Agents & MAS)

[Figure: multiple communicating agents, each with a sphere of influence, operating in a shared environment]

• Effect of parameter settings and system/agent instantiations (ranges, settings, interpretations, rules)
  - Interactions
  - Overall results
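A sketch of how the effect of parameter settings on overall results might be examined: sweep a few agent-system parameters, run several stochastic replications per setting, and summarize the output measure per setting. The run_agent_system function, its parameters, and the replication count are hypothetical.

import random
from itertools import product
from statistics import mean, stdev

def run_agent_system(num_agents, comm_range, seed):
    # Hypothetical stand-in for one stochastic run of the multi-agent simulation;
    # returns a single overall result (e.g., fraction of objectives achieved).
    rng = random.Random(seed)
    base = min(1.0, 0.05 * num_agents + 0.01 * comm_range)
    return max(0.0, min(1.0, rng.gauss(base, 0.05)))

settings = {
    "num_agents": [5, 10, 20],
    "comm_range": [10, 30],
}
replications = 20

for num_agents, comm_range in product(settings["num_agents"], settings["comm_range"]):
    outcomes = [run_agent_system(num_agents, comm_range, seed) for seed in range(replications)]
    print(f"agents={num_agents:2d}, comm_range={comm_range:2d}: "
          f"mean={mean(outcomes):.3f}, sd={stdev(outcomes):.3f}")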

Page 22

Areas Affecting HBR Validity

• Interactions between multiple behaviors
  - Assumes that interacting nonlinear behaviors will create even more convoluted nonlinear behavior
• Dependencies between properties in the behavior space
• Sensitivities between behavior space property changes
• Nonlinear behavior
  - Errors can hide or be misinterpreted
• Nonlinear component behavior transitions
• Complex environmental interactions
• Stochastic behaviors (replication sketch below)
  - Probabilistic sensing

DMSO, VV&A Recommended Practices Guide – HBR Special Topic
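Because stochastic behaviors and probabilistic sensing mean any single run is only one sample, validators typically run enough replications to pin down the output statistics. The sketch below estimates how many replications of a hypothetical stochastic measure are needed for a desired confidence-interval half-width, using a pilot set of runs; the measure, target half-width, and confidence level are illustrative assumptions.

import math
import random
from statistics import mean, stdev

def stochastic_run(seed):
    # Hypothetical stochastic output measure from one replication of the model.
    return random.Random(seed).gauss(0.62, 0.08)

# Pilot replications to estimate output variability.
pilot = [stochastic_run(seed) for seed in range(30)]
s = stdev(pilot)

# Replications needed so the ~95% confidence-interval half-width is at most `target`.
target = 0.01
z = 1.96
needed = math.ceil((z * s / target) ** 2)

print(f"pilot mean = {mean(pilot):.3f}, pilot sd = {s:.3f}")
print(f"replications needed for ±{target} at ~95% confidence: {needed}")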