Measuring Tests Using COSMIC
2nd International Conference on
IT Data Collection, Analysis and Benchmarking
Tokyo (Japan) – October 22, 2014
Thomas M. Fehlmann, Zürich
Eberhard Kranich, Duisburg
Testing ICT Services in the Cloud
IT Confidence 2014 – October 22, 2014 http://itconfidence2014.wordpress.com
Measuring Tests Using COSMIC
Goals of the presentation
G1. Understand COSMIC measurements in testing
G2. Free software testing from lines of code (LoC)
G3. Measure and benchmark software testing
Dr. Thomas Fehlmann
• 1981: Dr. Math. ETHZ
• 1991: Six Sigma for Software Black Belt
• 1999: Euro Project Office AG, Zürich
• 2001: Akao Prize 2001 for original contributions to QFD
• 2003: SwissICT Expert for Software Metrics, ICTscope.ch
• 2004: Member of the Board, QFD Institute Deutschland – QFD Architect
• 2007: CMMI for Software – Level 4 & 5
• 2011: Net Promoter® Certified Associate
• 2013: Vice-President ISBSG
Eberhard Kranich
• Mathematics and Computer Science
• Emphasis on Mathematical Statistics
• Mathematical Optimization
• Theory of Polynomial Complexity of Algorithms
• Working at T-Systems International GmbH in Bonn, Germany
• Six Sigma Black Belt for Software Development
• Software Quality Assurance Manager
What is a Defect?
• Defect = behavior impacting expected or required functionality of software
→ How many bugs?
→ By counting the size of defect repositories?
→ By the number of entries?
Software Testing as a Game
• The tester sees selected sequences in the UML sequence diagram
• The tester can “walk” the data movements when planning or executing tests
→ Functionality becomes visible to the agile team
→ Defects impacting functionality become visible to testers
[UML sequence diagram: data movements 8–11 (“Move some data”) between Other Application, Some Device, and Other Device]
Functionality, Defect Size, and Defect Density
• What happens if data movements have defects?
• Testers mark the data movement where a defect has been detected
• Same metric:
→ ISO/IEC 19761 COSMIC
[UML sequence diagram as before, with one defective data movement marked]
• Functional Size → number of data movements needed to implement all FUR
• Test Size → number of data movements executed in tests
• Test Story → collection of test cases aiming at certain FURs
• Defect Count → number of data movements affected by some defect detected in a test story
Defect Density Prediction?
Now he counts the defects!
And counts and adjusts the test size
per ISO/IEC 19761 COSMIC
[UML sequence diagram as before, data movements 8–11]
How does he know that he found all the defects?
ISO/IEC Standard 29119 on Software Testing
• Published as ISO/IEC 29119 (2013-07) International Standard
• Defines the Test Process
• Calls for suitable Test Measures
ISO/IEC Standard 29119 on Software Testing
[Process diagram: the Organisational Test Process exchanges Organisational Test Documentation (and feedback on it) with the Test Management Processes – Test Planning, Test Monitoring & Control, and Test Completion. The Test Management Processes steer the Dynamic Test Processes and Static Test Processes via Test Plans and Control Directives, and receive Test Measures, Test Plan Updates, and Test Completion Reports in return]
© Tafline Murnane and Stuart Reid from ISO/IEC JTC1/SC7 WG26 Software Testing
The SW Testing Qualifications Board
• ISTQB
→ 295,000 certificates
→ Iqnite conferences (Sydney 2013)
• Third after ITIL and PMI
• The importance of testing grows
But it’s Even Worse…
• What is Defect Density?
→ Defects per KDLOC?
• What is Test Coverage?
→ Code lines executed by some test case?
SW Testing and SW Metrics
• Counting practices for defect counting are undocumented
→ “Number of Defects Found” per stage / with tests / etc.
→ How do you count the “Number of Defects”?
• Is it simply the number of entries in a defect repository?
→ How can you avoid double reporting?
→ Or make sure two defects are reported as two, not in a single report?
• A successor to the “Defect Measurement Manual” published by UKSMA in October 2000 is under review: the “Defect Measurement and Analysis Handbook”
→ A European cooperation
→ An important enhancement for ISBSG’s data collection!
A Simple Data Search Example
• Functional User Requirements (FUR) describe a very simple data search
• They meet the Customer’s Needs
• And have a Priority Profile

Functional User Requirements and Goal Profile:
| FUR | Goal Profile |
| 1) R001 Search Data | 0.62 |
| 2) R002 Answer Questions | 0.69 |
| 3) R003 Keep Data Safe | 0.37 |

Data movements (1 Entry (E) + 2 eXit (X) + 2 Read (R) + 1 Write (W) = 6 CFP):
1.// Search Criteria (Trigger)
2.// Write Search
3.// Get Result
4.// Show Result
5.// Nothing Found
6.// Show Error Message

Customer’s Needs:
| Needs | Topics | Attributes |
| Y.a Data Access | y1 Access Data Always | Reliable, Frequently |
| | y2 Repeatable Responses | Responses identical, Always |
| Y.b Data Integrity | y3 Cannot impact data | No Write allowed |

Goal Profile derived from the Voice of the Customer
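The slide’s size count (1 E + 2 X + 2 R + 1 W = 6 CFP) can be reproduced mechanically. An illustrative sketch, not tooling from the presentation:

```python
# Each COSMIC data movement of the search example, classified by type.
# Every data movement counts as 1 CFP under ISO/IEC 19761.
from collections import Counter

movements = {
    "1.// Search Criteria":    "E",  # Entry (the trigger)
    "2.// Write Search":       "W",  # Write
    "3.// Get Result":         "R",  # Read
    "4.// Show Result":        "X",  # eXit
    "5.// Nothing Found":      "R",  # Read
    "6.// Show Error Message": "X",  # eXit
}
by_type = Counter(movements.values())
cfp = sum(by_type.values())
print(dict(by_type))  # → {'E': 1, 'W': 1, 'R': 2, 'X': 2}
print(cfp)            # → 6
```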
The following SW tests look appropriate:
• Test Stories (scenarios) have
→ many Test Cases
→ each Test Case has
   • Test Data
   • a known Expected Response
• Test Size and Test Profiles can be measured
→ by functionality covered

Test Story CT-A Prepare:
| Test Story | Measured Defect Profile | Test Size |
| CT-A.1 Retrieve Previous Responses | 0.43 | 11 |
| CT-A.2 Detect Missing Data | 0.74 | 18 |
| CT-A.3 Data Stays Untouched | 0.51 | 12 |
| Total | | 41 |
Execute Test Case CT-A.1.1
• Entering a valid search string
→ returns the expected response
→ Test Size is 4 CFP

Test Case Measurements for Test Story CT-A.1 (Test Story No. 1):
| Test Case | R001: Search Data | R002: Answer Questions | R003: Keep Data Safe | Expected Response | CFP |
| CT-A.1.1 Enter valid Search String | X001, R001, W001, E001 | X001, E001 | R001, W001 | Return (known) Answer | 8 |
| CT-A.1.2 Enter invalid Search String | E001 | | R002, W001 | Invalid Search String | 3 |
| Test Story Contribution (CFP) | 5 | 2 | 4 | Test Size | 11 |

[Sequence diagram as before: 1 Entry (E) + 2 eXit (X) + 2 Read (R) + 1 Write (W) = 6 CFP]
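The table’s arithmetic can be sketched as follows. Illustrative Python: the nested-dict layout is an assumption; the movement labels come from the table.

```python
# Test story CT-A.1: data movements each test case exercises, per FUR.
ct_a1 = {
    "CT-A.1.1": {"R001": ["X001", "R001", "W001", "E001"],
                 "R002": ["X001", "E001"],
                 "R003": ["R001", "W001"]},
    "CT-A.1.2": {"R001": ["E001"],
                 "R003": ["R002", "W001"]},
}

# CFP of a test case = number of data movements it executes.
case_cfp = {tc: sum(len(mv) for mv in furs.values()) for tc, furs in ct_a1.items()}

# Test story contribution per FUR = column sum over all test cases.
contribution = {}
for furs in ct_a1.values():
    for fur, mv in furs.items():
        contribution[fur] = contribution.get(fur, 0) + len(mv)

print(case_cfp)                # → {'CT-A.1.1': 8, 'CT-A.1.2': 3}
print(contribution)            # → {'R001': 5, 'R002': 2, 'R003': 4}
print(sum(case_cfp.values()))  # → 11  (test size of story CT-A.1)
```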
Total Test Size
• Total Test Size is 11 + 18 + 12 = 41 CFP
→ compares to a Functional Size of 6 CFP
→ yields a Test Intensity of 41/6 ≈ 6.8
→ on average, < 7 tests per data movement

Test Case Measurements for Test Story CT-A.1 (Test Story No. 1):
| Test Case | R001: Search Data | R002: Answer Questions | R003: Keep Data Safe | Expected Response | CFP |
| CT-A.1.1 Enter valid Search String | X001, R001, W001, E001 | X001, E001 | R001, W001 | Return (known) Answer | 8 |
| CT-A.1.2 Enter invalid Search String | E001 | | R002, W001 | Invalid Search String | 3 |
| Test Story Contribution (CFP) | 5 | 2 | 4 | Test Size | 11 |

Test Case Measurements for Test Story CT-A.2:
| CT-A.2.1 Enter valid Search String for No Data | X002, R002, W001, E001 | X001, R001, W001, E001 | R002, W001 | No Data Available | 10 |
| CT-A.2.2 Enter invalid Search String | R001, W001, X002, E001 | X002, E001 | R002, W001 | Invalid Search String | 8 |
| Test Story Contribution (CFP) | 8 | 6 | 4 | Test Size | 18 |

Test Case Measurements for Test Story CT-A.3:
| CT-A.3.1 Enter valid Search String | W001, E001 | X001, R001 | R001, W001 | Return identical Answer | 6 |
| CT-A.3.2 Enter invalid Search String | | X002, E001 | | Invalid Search String | 2 |
| CT-A.3.3 Enter Same String Again | | R001, W001, X001, E001 | | Return identical Answer | 4 |
| Test Story Contribution (CFP) | 2 | 8 | 2 | Test Size | 12 |

Test Size in CFP: 41
Test Intensity in CFP: 6.8
Test Coverage: 100%
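Test intensity is just the ratio of test size to functional size; a minimal sketch:

```python
# Test size aggregates the three test stories; functional size is the
# 6 CFP of the search example itself.
story_sizes = {"CT-A.1": 11, "CT-A.2": 18, "CT-A.3": 12}
functional_size = 6

test_size = sum(story_sizes.values())
test_intensity = test_size / functional_size
print(test_size)                 # → 41
print(round(test_intensity, 1))  # → 6.8
```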
Recording a Defect
• “Bug” in 6.// Show Error Message
→ detected with a database containing no data
→ Test Size is 4 CFP
→ 1 defect found!

Test Case Measurements for Test Story CT-A.2 (Test Story No. 2):
| Test Case | R001: Search Data | R002: Answer Questions | R003: Keep Data Safe | Expected Response | CFP |
| CT-A.2.1 Enter valid Search String for No Data | X002, R002, W001, E001 | X001, R001, W001, E001 | R002, W001 | No Data Available | 10 |
| CT-A.2.2 Enter invalid Search String | R001, W001, X002, E001 | X002, E001 | R002, W001 | Invalid Search String | 8 |
| Test Story Contribution (CFP) | 8 | 6 | 4 | Test Size | 18 |

[Sequence diagram as before: 1 Entry (E) + 2 eXit (X) + 2 Read (R) + 1 Write (W) = 6 CFP]
Defect Observed
• One defect found
→ possibly observable in all tests touching the data movement 6.// Show Error Message – named X002
→ counted once

Defects Observed:
| Name | Label | Description | Data Movement Affected |
| #001 | Escape Chars | Some characters such as 'ä' are wrongly interpreted as escape characters in strings | R001 Get Result |
Defect Count: 1

(Test Case Measurements for Test Story CT-A.2: table as on the previous slide)
Test Status Reporting
• Test status characterization
→ Test Size is the total number of data movements executed in all test cases of all test stories
→ One data movement can have only one defect identified per test story
• However, one misbehavior found might affect more than one data movement and thus count for more than one defect

Test Status Summary:
| Total CFP | 6 | Test Size in CFP | 41 |
| Defects Pending for Removal | 2 | Test Intensity in CFP | 6.8 |
| Defects Found in Total | 2 | Test Coverage | 100% |
| Defect Density | 33% | | |
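The summary’s defect density relates defects found to the functional size; an illustrative sketch (the 33% on the slide is 2 defects over 6 CFP):

```python
functional_size_cfp = 6   # functional size of the search example
defects_found = 2         # data movements affected by detected defects

defect_density = defects_found / functional_size_cfp
print(f"{defect_density:.0%}")  # → 33%
```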
Caveat
• Test size increases when meaningless test cases are added to the test stories – small variations of test data with almost identical expected responses might spoil measurements!
→ We need a metric indicating that our test strategy is appropriate
What is Six Sigma Testing?
What is Lean Six Sigma Testing?
Replace many Test Stories…
… by those needed for the Eigensolution
… go for the Eigenvector!
Calculating the Eigenvector with Jacobi Iteration
[Diagram: matrix A, the product A·Aᵀ, and the iterated vector x]
Calculating the Eigenvector with Jacobi Iteration

        | 9  0  3  0 |           | 90   0  24 |
    A = | 0  9  0  8 |    A·Aᵀ = |  0 145  24 |
        | 1  0  5  3 |           | 24  24  35 |
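The principal eigenvector of A·Aᵀ can be approximated iteratively. The sketch below uses plain power iteration as a generic stand-in for the slide’s Jacobi-style iterative scheme; it is not the authors’ implementation.

```python
import math

A = [[9, 0, 3, 0],
     [0, 9, 0, 8],
     [1, 0, 5, 3]]

def a_at(A):
    """Compute the symmetric product A·Aᵀ."""
    return [[sum(a * b for a, b in zip(r1, r2)) for r2 in A] for r1 in A]

def principal_eigenvector(M, iters=100):
    """Power iteration: repeatedly apply M and renormalize to unit length."""
    v = [1.0] * len(M)
    for _ in range(iters):
        w = [sum(m * x for m, x in zip(row, v)) for row in M]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

AAT = a_at(A)
print(AAT)  # → [[90, 0, 24], [0, 145, 24], [24, 24, 35]]
v = principal_eigenvector(AAT)  # unit eigenvector of the dominant eigenvalue
```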
Measuring Test Coverage with Eigensolution

Deployment Combinator – cell values are the number of data movements executed:
| Functional User Requirements | Goal Test Coverage | CT-A.1 Retrieve Previous Responses | CT-A.2 Detect Missing Data | CT-A.3 Data Stays Untouched | Achieved Coverage |
| R001 Search Data | 0.62 | 5 | 8 | 2 | 0.64 |
| R002 Answer Questions | 0.69 | 2 | 6 | 8 | 0.66 |
| R003 Keep Data Safe | 0.37 | 4 | 4 | 2 | 0.40 |
| Ideal Profile for Test Stories | | 0.43 | 0.74 | 0.51 | |
| Achieved Profile | | 0.42 | 0.70 | 0.50 | Convergence Gap 0.04 |

Convergence Range: 0.10
Convergence Limit: 0.20
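Under the (assumed) reading that the cell values form a matrix A of FURs × test stories, the table can be reproduced: the ideal test story profile is the principal eigenvector of AᵀA, and the achieved coverage is the normalized product A·x. Illustrative sketch; the slide’s exact gap formula is not given, so the printed gap is only approximately the reported 0.04.

```python
import math

A = [[5, 8, 2],   # R001 Search Data
     [2, 6, 8],   # R002 Answer Questions
     [4, 4, 2]]   # R003 Keep Data Safe  (data movements executed)
goal = [0.62, 0.69, 0.37]   # goal test coverage profile

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

AT = [list(col) for col in zip(*A)]
ATA = [[sum(a * b for a, b in zip(r1, r2)) for r2 in AT] for r1 in AT]

x = [1.0, 1.0, 1.0]
for _ in range(100):              # power iteration for the eigenvector
    x = normalize(mat_vec(ATA, x))

achieved = normalize(mat_vec(A, x))
gap = math.sqrt(sum((g - a) ** 2 for g, a in zip(goal, achieved)))

print([round(t, 2) for t in x])         # → [0.43, 0.74, 0.51] ideal profile
print([round(t, 2) for t in achieved])  # → [0.64, 0.66, 0.4] achieved coverage
print(round(gap, 2))  # small convergence gap (the slide reports 0.04)
```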
Lean & Six Sigma for Software Testing
• Six Sigma
→ Design of Experiments
→ Multi-linear regression for Root Cause Analysis and process control
• Lean
→ Detect waste (muda, 無駄)
→ Test-Driven Development
→ Defect-free delivery
• Lean Six Sigma
→ Predict waste (muda, 無駄)
→ Eigenvector solutions for Root Cause Analysis and process control
→ Predict Defect Density
→ Q Control Charts for SW Testing
Questions?