Executable UML – the silver bullet… or maybe not…
Dr. Joel Henry
October 22, 2008
Overview
• Background
– Model Driven Development (MDD)
– Unified Modeling Language (UML)
– Executable UML (xUML)
• Testing challenges
– When? Where? How?
• xUML testing
– Integration into the process
– Preliminary research results
Model Driven Development
• Model Driven Development aims to make:
– software development more domain-driven, as opposed to restricted by the software
– model development in a specific domain more efficient (in terms of development time)
– maintenance a model-centered rather than software-centered activity (a challenge!)
Model Driven Development
• Model Driven Development has advantages:
– Models are free of implementation artifacts – they directly represent domain knowledge
– Domain experts can play a direct role in development
– Implementations for various platforms can be generated (Web, standalone, mobile device)
Model Driven Development
• Domain expert develops model(s)
• Using code generation templates, the model is transformed into executable code
• The generated code is merged with manually written code
Maintenance done HERE!
Model Driven Development
[Figure: Simulink-style example model – Instruction Fetch → Decode → branch on Cache Instruction Hit in L1 (false: Instruction Miss Delay) → branch on Cache Instruction Hit in L2 (false: L2 Cache Delay on Instruction Miss)]
Model Driven Development
[Figure]
• Recap
– Specifying requirements is development
– Graphical and mathematical specification
– Model and then generate source code
• Examples
– MatrixX
– Matlab/Simulink
– xUML (Restricted UML that can be simulated)
UML – A brief primer
• Graphical representation
– Class diagrams
– Use cases
– Sequence diagrams
– State charts
• Textual descriptions
– Pseudocode in methods
– State transition “actions” are textual
• Many processes – design or reverse engineering
UML Example – class diagram
UML Example – use case
UML Example – sequence diagram
UML Example – state diagram
Challenges of UML
• Computationally incomplete
– UML describes a system by specifying the desired results (use cases, sequence diagrams)
– Specifies what the software produces but not how (…and the devil is in the details…)
• Missing key ingredients:
– Implementations of methods are specified in “language-dependent” pseudocode
– Actions associated with state machines are specified by a text string, the “action” part of UML
Executable UML
• Background
– Specifying requirements is development
– Graphical and mathematical specification
• xUML
– Restricted UML that can be executed (simulated)
xUML = UML 1.x - Semantically Weak Elements + Precisely Defined Action Semantics
Executable UML
• xUML is an executable version of UML
– clearly defined model structure
– precise semantics for actions
– an action specification language for methods
– an accompanying process
• xUML is based on a strict development process
– executable models
– large-scale reuse -> pattern-based design
Executable UML
• xUML contains
– Domains (divide problem into smaller problems)
– Use cases (how domains work together)
– Sequence diagrams (what happens when)
– Within domains:
• Class diagrams – each with a state transition diagram
• Methods – written in a precise, but limited, ASL (see the sketch after this list)
– Bridges between domains
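To make these pieces concrete, here is a minimal, hypothetical Python sketch of a class whose behavior is a state machine with executable transition actions; it is illustrative only, not real ASL or any xUML tool's API:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple

@dataclass
class StateMachine:
    state: str
    # (current state, event) -> (next state, executable action)
    transitions: Dict[Tuple[str, str], Tuple[str, Callable[[], None]]] = \
        field(default_factory=dict)

    def signal(self, event: str) -> None:
        next_state, action = self.transitions[(self.state, event)]
        action()                     # precise, executable semantics
        self.state = next_state

# A toy "Door" class from some application domain.
door = StateMachine(state="Closed")
door.transitions[("Closed", "open")] = ("Open", lambda: print("unlock motor"))
door.transitions[("Open", "close")] = ("Closed", lambda: print("lock motor"))
door.signal("open")                  # runs the action and enters "Open"
```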
Executable UML – domain model
Executable UML – use case diagram
Executable UML – sequence diagram
Executable UML
• Model is a precise specification
– Can be simulated from use cases (Platform Independent Model - PIM)
– Easier transition to target system (Platform Specific Model - PSM)
– No ambiguity in the models
– Model results immediately available (simulation)
– Managers gain confidence that progress is being made
– No software development needed…but wait…
Executable UML
• Model requires
– State transition diagrams (simple… but)
– Interfaces between classes and domains (requires some thought…)
– Action specification language (another programming language…)
– Testing (the BIGGEST CHALLENGE)
• How do we test these models?
• Without testing, how do we know any of the above is correct?
xUML System Development
[Figure: System Requirements → Specify Domains → Build PIM (Use Cases, Class Diagram, State Charts, Methods) → Specify PIM-to-PSM Translation → Generate PSM → Deployed System. Annotated: TEST?]
Requirements Driven Testing
[Figure: two overlapping regions, SOFTWARE REQUIREMENTS and SOFTWARE FUNCTIONALITY – required but unimplemented functionality; required and implemented functionality; implemented but acceptable functionality; implemented but unacceptable functionality. Where is this line?]
Requirements Driven Testing
• What does this mean?
– Model and system functionality drive testing
– Use knowledge of model design to generate tests
• Why go beyond black-box testing?
– Monitor model functionality while testing
– Detect defects and the locations/causes of defects
• What does this require? (see the sketch below)
– Knowledge of the model and system
– Tools to generate and configure test data
– Ability to identify defects in test results
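At its simplest, the compare step of requirements-driven testing is a loop over requirement-derived test cases. A minimal sketch, assuming a callable model and a dictionary of expected results keyed by test case (all names here are hypothetical):

```python
# Run requirement-derived test cases against the model (PIM simulation or
# generated PSM) and record pass/fail plus the actual output for analysis.
def run_requirements_driven_tests(model, test_cases, expected):
    results = {}
    for case_id, inputs in test_cases.items():
        actual = model(inputs)               # simulate or execute the model
        results[case_id] = {"passed": actual == expected[case_id],
                            "actual": actual}
    return results
```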
Requirements Driven Testing – Ball and Urn Analogy
[Figure: Requirements → Test case → Model-Based Software (PIM or PSM) → Test Results → Compare]
Requirements Driven Testing Analysis – Ball and Urn Analogy
[Figure: Test cases 1, 2, and 3 each produce Output Values and Events from their run]
Requirements Driven Testing – Ball and Urn Analogy
• What does this mean?
– Test cases target specific software functionality
• Likely to find defects
• Critical for safety, reliability, success, etc.
• What to test? (see the sketch after this list)
– Values around critical points
– Large numbers of input value combinations
– Sufficient coverage of each equivalence class
• What are the results?
– Test coverage for input ranges and combinations
– Output range coverage, reliability, MTTF, etc.
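Boundary values and equivalence-class combinations can be generated mechanically. A small illustrative sketch (the variable names and class representatives are invented for the example):

```python
import itertools

# Values just below, at, and just above a critical point.
def boundary_values(critical, eps=1e-3):
    return [critical - eps, critical, critical + eps]

# One representative per equivalence class for each input, then the full
# cross product for combination coverage (9 cases here).
speed_classes = [0.0, 55.0, 120.0]   # stopped / nominal / over-limit
temp_classes = [-40.0, 20.0, 85.0]   # cold / nominal / hot
test_cases = list(itertools.product(speed_classes, temp_classes))
```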
Testing Goals
• A testing solution requires:
– An innovative, reusable, long-term testing environment
– Requirements- and structure-driven testing
– Implementation without changes to the models
– Defect detection, test case re-execution, and testing measurement
– Testing the model and the translated model with the same tests
– Leveraging past success with Matlab/Simulink
Testing Requirements
• Requirements (a configuration sketch follows this list)
– Input file/matrix
– Output file/matrix
– Sample time – variable or set frequency
• Variable ranges
– Input variables – min, max, and accuracy
– Output variables – min, max, and accuracy
• Defects/Exceptions/Faults
– Identification
– Tracing
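These requirements map naturally onto a small configuration record. A sketch with assumed field names (not the actual tool's format):

```python
from dataclasses import dataclass

@dataclass
class VariableRange:
    name: str
    vmin: float
    vmax: float
    accuracy: float      # smallest difference that matters when comparing values

@dataclass
class TestConfig:
    input_file: str              # matrix of input samples
    output_file: str             # matrix of captured outputs
    sample_time: float           # fixed frequency, or 0.0 for a variable step
    inputs: list[VariableRange]
    outputs: list[VariableRange]
```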
Test Execution
• Create test data
– Functions, freehand, imported
• Execute tests (sketched below)
– Configure input data
– Wrap model, simulate, unwrap
– Capture output values
• Capture results
– Inputs, states, outputs
– Detect exceptions
• Analyze results across multiple tests
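The execute/capture steps reduce to a loop around the simulator. A sketch, where simulate stands in for whatever wrap/simulate/unwrap call the environment provides:

```python
# Run every input set, capturing states, outputs, and any exception raised
# mid-simulation, so one failing case does not stop the whole run.
def execute_tests(simulate, test_inputs):
    captured = []
    for inputs in test_inputs:
        try:
            states, outputs = simulate(inputs)
            captured.append({"inputs": inputs, "states": states,
                             "outputs": outputs, "exception": None})
        except Exception as exc:
            captured.append({"inputs": inputs, "states": None,
                             "outputs": None, "exception": str(exc)})
    return captured
```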
Defect Detection
• Simple value range detection
• Percent change
– Allows exception detection if the output value changes by more than a specified percent over a specified number of steps
• Absolute change
– Allows exception detection if the output value changes by more than a specified amount over a specified number of steps (both detectors are sketched below)
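Both detectors compare each output sample against the sample a fixed number of steps earlier. A minimal sketch of that logic:

```python
# Indices where the output moved more than pct percent over a window of steps.
def percent_change_exceptions(values, pct, steps):
    return [i for i in range(steps, len(values))
            if values[i - steps] != 0
            and abs(values[i] - values[i - steps]) / abs(values[i - steps]) * 100 > pct]

# Indices where the output moved more than a fixed amount over the window.
def absolute_change_exceptions(values, amount, steps):
    return [i for i in range(steps, len(values))
            if abs(values[i] - values[i - steps]) > amount]
```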
Defect Detection
• Advanced exceptions (sketched below)
– Combinations of exception definitions
– Disjoint ranges
• Exception definitions created by time range
• Combinatorial definitions based on multiple exception definitions
– Overall system reliability
• Scenario-based reliability per major function
• Overall reliability combining scenario reliabilities
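A sketch of the combinators and the reliability roll-up; how the actual tool weighs scenarios is not stated here, so the plain average below is an assumption:

```python
# Disjoint ranges: a value is acceptable only inside one of the given ranges.
def outside_disjoint_ranges(value, ranges):
    return not any(lo <= value <= hi for lo, hi in ranges)

# Combinatorial definition: fires only when every member definition fires.
def combined(definitions):
    return lambda step, value: all(d(step, value) for d in definitions)

# Per-scenario pass rate, then overall reliability (simple average assumed).
def scenario_reliability(results_by_scenario):
    per = {s: sum(r) / len(r) for s, r in results_by_scenario.items()}
    return per, sum(per.values()) / len(per)
```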
Constraint Determination
• Search-based method to find min or max values for a Simulink outport
• Two methods
– Genetic algorithm
– Combination of Simplex and Simulated Annealing (the annealing idea is sketched below)
• Research tool (needs an interface, help, etc.)
• International acceptance (paper invited to a conference in Oxford, UK, in 2007)
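To show the flavor of the search, here is a simulated-annealing sketch over a single scalar input; the real tool searches multi-dimensional input spaces and also offers a genetic algorithm and a Simplex stage:

```python
import math, random

# Search [lo, hi] for an input that maximizes the model's outport value.
def anneal_for_max(simulate, lo, hi, iters=1000):
    x = random.uniform(lo, hi)
    y = simulate(x)
    best_x, best_y = x, y
    for k in range(1, iters + 1):
        temp = 1.0 / k                                   # cooling schedule
        cand = min(hi, max(lo, x + random.gauss(0, (hi - lo) * temp)))
        cy = simulate(cand)
        # Always accept improvements; sometimes accept worse points to escape
        # local maxima, with odds shrinking as the temperature drops.
        if cy > y or random.random() < math.exp((cy - y) / max(temp, 1e-9)):
            x, y = cand, cy
        if y > best_y:
            best_x, best_y = x, y
    return best_x, best_y
```

Finding a minimum is the same search on the negated output.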
Constraint Determination
[Figure: Tests drive the Matlab/Simulink Model; the Constraint Determination Tool processes the Test Results and reports the global minimum, the global maximum, and the input values associated with each]
Integration of Tools and Methods
• Where to place the tools?
• How to use the tools effectively?
• What to do with results?
• How to gain acceptance?
MDA Testing & Tool Usage
[Figure: spiral development model – Determine Objectives, Alternatives, and Constraints; Evaluate Alternatives and Resolve Risks; Develop and Verify Software and System; Plan Next Iteration – annotated with Initial Requirements, Risk Analysis, Select Solution, Build and Verify, Algorithm Development, Translate, Link to 3GL code, Simulate/Test, and Execute/Test activities]
xUML Testing Placement
[Figure: System Requirements → Specify Domains → Build PIM (Use Cases, Class Diagram, State Charts, Operations) → Validate PIM (Use Cases, Class Diagram, State Charts, Operations) with Testing, fed by Test Data & Drivers → Specify PIM-to-PSM Translation → Generate PSM → Deployed System, with Testing again at deployment]
xUML Testing Approach
• Build a set of testing domains independent of application domains
• Implement reusable testing methods within testing domains
• Encapsulate application/testing-domain coupling in bridges (sketched after this list)
• Include the ability to automate testing
• New applications require only new bridges
• Suitable for PIM and PSM testing
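The bridge idea is an adapter: the reusable test-execution domain is written once against a small interface, and each new application supplies only its own implementation of that interface. A sketch with assumed names:

```python
from abc import ABC, abstractmethod

# Per-application adapter: the only code rewritten for a new application.
class ApplicationBridge(ABC):
    @abstractmethod
    def initialize(self, config): ...

    @abstractmethod
    def call(self, operation, data): ...   # forward one call-data pair

# Reusable test-execution domain, unchanged across applications.
class TestExecutor:
    def __init__(self, bridge: ApplicationBridge):
        self.bridge = bridge

    def run(self, config, call_data_pairs):
        self.bridge.initialize(config)
        return [self.bridge.call(op, data) for op, data in call_data_pairs]
```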
xUML Testing Approach
[Figure: reusable testing domains – Test Data, Expected Results, File Access Functions, Data Creation Functions, Test Execution Functions, Data Storage Functions, and Test Measurement – connected through bridges to the Application Domains]
xUML Testing Domains
• Data Creation
– User-configurable functions generate test data
• File Access Functions
– Functions to retrieve data from XML files or a DB
• Test Data
– Data format conversion functions (see the sketch below)
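A sketch of what user-configurable data creation can look like; the generator names are invented, and JSON stands in for the XML/DB access the File Access domain actually provides:

```python
import json, math

# Configurable generators: ramps and sinusoids are typical test stimuli.
def ramp(start, step, n):
    return [start + i * step for i in range(n)]

def sine(amp, freq, dt, n):
    return [amp * math.sin(freq * i * dt) for i in range(n)]

# File-access stand-in (the real domain reads XML files or a database).
def from_file(path):
    with open(path) as f:
        return json.load(f)
```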
xUML Testing Domains
• Expected Results
– Organize output data into a consistent format
• Test Execution Functions
– A set of test types that can be executed on any application domain (through bridges)
xUML Testing Domains
• Data Storage Functions
– Output the test data, actual and expected results, and exceptions in a consistent format
• Test Measurements
– Functions that perform a variety of test measurements (MTTF, range coverage, etc.; two are sketched below)
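Two of the named measurements, using the usual textbook definitions (not necessarily the tool's exact formulas):

```python
# Mean time to failure over the observed failure times.
def mttf(failure_times):
    return sum(failure_times) / len(failure_times)

# Fraction of equal-width bins of [vmin, vmax] hit by at least one value:
# a simple notion of output (or input) range coverage.
def range_coverage(values, vmin, vmax, bins=10):
    width = (vmax - vmin) / bins
    hit = {min(int((v - vmin) / width), bins - 1)
           for v in values if vmin <= v <= vmax}
    return len(hit) / bins
```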
xUML Development Testing
1. Read input data
2. Read expected results
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests
5. Capture test results
6. Calculate test measurements
7. Store the test results
(a sketch of this pipeline follows the figure)
[Figure: the testing domains and bridges from the architecture slide above, connected to the Application Domains]
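The seven steps chain into one driver. A self-contained sketch in which simulate and the JSON file layout are assumptions standing in for the testing-domain operations:

```python
import json

def development_test_run(simulate, input_path, expected_path):
    with open(input_path) as f:            # 1. read input data
        data = json.load(f)
    with open(expected_path) as f:         # 2. read expected results
        expected = json.load(f)
    pairs = list(zip(data["calls"], data["values"]))    # 3. call-data pairs
    results = [simulate(op, v) for op, v in pairs]      # 4-5. execute, capture
    passed = sum(r == e for r, e in zip(results, expected))
    measurements = {"pass_rate": passed / len(results)}  # 6. measure
    with open("results.json", "w") as f:   # 7. store the test results
        json.dump({"results": results, **measurements}, f)
```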
xUML Maintenance Testing
1. Read input data from data gathered during use
2. Read expected results:
a) actual results from the deployed system, OR
b) expected results for new functionality
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests
5. Capture test results
6. Calculate test measurements
7. Store the test results
[Figure: the same testing domains and bridges, now connected to the Application Domains Version 2.0]
New Application Domains
Reuse xUML Testing Domains
1. Read input data
2. Read expected results
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests (using new test bridges)
5. Capture test results
6. Calculate test measurements
7. Store the test results
[Figure: the reused testing domains connected through new bridges to the new application domains]
xUML Testing Process
xUML Testing Comparison
• How did it work?
– Unit testing: 31% less time than testing C++
– Integration testing: 9% less time than C++
– Requirements-driven testing: still working… but…
• More time to build the testing domains than the application!
• 90% less time for the 2nd application (test bridges and test files)
• Caveats?
– Unit testing has NO reuse across applications
– Integration testing has some reuse (test harness)
– Requirements-driven testing is largely reusable (testing domains)
Questions…?
This research was funded by:
Lockheed Martin (Denver)
This research was done with:
MRI Technologies
With strong support from:
The University of Montana
Acronyms
• MDA – model driven architecture
• PIM – platform independent model
• PSM – platform specific model
• xUML – executable unified modeling language
• MTTF – mean time to failure
• MATT – Matlab automated testing tool
• RATT – reliability automated testing tool
• GIST – graphical input specification tool