Copyright © 2005 Korson-Consulting
Testing Quasi Agile Software Projects
by
Timothy D. Korson

Presented at:

Korson-Consulting
P.O. Box 2201
Collegedale, TN 37315
Phone: 423.432.9836
Fax: 614.583.9836
www.korson-consulting.com
Restricted Use

This copyrighted material is provided to attendees of Korson-Consulting courses under a restricted licensing agreement exclusively for the personal use of the attendees. Korson-Consulting retains ownership rights to the material contained in these course notes. Please respect these rights.

Any presentation or reuse of any part of this material in any form must be approved in writing by tim@korson-consulting.com
Copyright © 2005 Korson-Consulting. All rights reserved.
Testing Iterative/Incremental Projects Abstract

• Most corporations are still fairly traditionally structured
• Many software development teams are heading full steam into modern iterative/incremental development techniques and exploring other new technologies such as MDA.
• This leaves corporate QA stuck coping with an organizational and technical paradigm shift that traditional QA practices are inadequate to handle.
• In the highly iterative, fast-paced environment characteristic of these agile development projects, traditional approaches to budgeting, testing, quality assurance, requirements gathering, scheduling and estimating, etc. break down.
• QA managers trying to encourage best practices as recommended by CMMI and SPICE find themselves at odds with developers trying to adopt best practices as recommended by the Agile Manifesto.
Testing Iterative/Incremental Projects Abstract
• In the end no one wins.
• Because of the constraints of corporate policies and management edicts, developers can’t fully adopt modern software engineering practices.
• Because the developers do adopt as much of the agile process as they can get away with, the QA team finds that traditional approaches to quality management don’t work.
• Such projects must succeed in what I call a quasi agile development environment.
• In my experience these quasi agile development environments characterize a large percentage of today’s significant software projects. Lack of explicit understanding of this reality, and failure to actively adapt to it, is causing significant problems in many software development organizations.
Table Of Contents
Unit 1 – Modern Software Development Processes
Unit 2 – How testing and quality assurance is affected by a modern iterative, incremental software development process
Unit 3 – Creating System Test Cases
Unit 4 – Developers Test Too
Unit 5 – Test-First Development
Unit 6 – Summary
For Additional Details on Testing
Practical Guide to Testing Object-Oriented Software
by David A. Sykes, John D. McGregor
For Additional Details on UML Vocabulary
http://www.uml.org/
Unit 1
Modern Software Development Processes
Who Cares? What Difference Does it Make to Testers?

• Process
• Technology

[Diagram: a traditional pipeline — Analysis → Design → Code → Test — contrasted with a modern pipeline — Domain Analysis → Application Analysis → Application Design → Class Design and Development → Application Assembly → Testing]
Over the Wall
[Diagram: the system handed "over the wall" between process and technology]
System Testing
[Diagram: a tester exercising the system]
Levels of QA and Testing
• Model Tests
  – Requirements model
  – Domain model
  – Design model
• Code Tests
  – Component tests
  – Integration tests
  – Increment tests
  – System tests
System Testing
Traditional Software Engineering
The process makes it easier to specialize and handoff
Analysis → Design → Code → Test
Modern Software Engineering
Domain Analysis → Application Analysis → Application Design → Class Design and Development → Application Assembly → Testing
The process encourages integrated teams
Iterative
• Why plan to do certain activities over again?
– Add more detail
– Correct mistakes
– Iterate within an increment
– Improve quality
– Account for cyclical dependencies
An iterative development process means there is more regression testing
Incremental Model
• An increment is some subset of the system that is “completely” coded and tested.
Analysis → High-level design → Detailed design → Implementation → Testing → Production
Example: multi-currency, fund accounting system

• Business logic to post a simple transaction
• GUI to enter transaction data
• DBMS to persist transaction
• Log on and user authentication code
• Additional business logic to all layers
• …
Component Based
• In component-based development, the increments often coincide with components that need to be developed
• Purchased components often need in-house testing
• Example: a 3rd-party spreadsheet Java module to do the budget worksheet
It’s Incremental Everything
Other important documents, such as the user manual and design documents, are also developed and verified incrementally.

[Diagram: repeated Analyze → Design → Implement cycles, each updating the Requirements Specification, Domain Model, and Design Document]
Basic Development Process
[Diagram: activity levels (light vs. heavy) for Domain Analysis, Application Analysis, Application Design, Class Development, and Application Assembly across increments In, In+1, In+2, In+3]

Cradle to grave teams work best
Stepping through the process: Getting Started

[Diagram: phases (Domain Analysis, Application Analysis, Application Design, Code, Integrate and Test) plotted against project scope, with the callout "Testing?"]
Stepping through the process: Increment 1

[Diagram: increment I1 progressing through Domain Analysis, Application Analysis, Application Design, Code, and Integrate and Test against project scope, with the callout "Testing?"]
Stepping through the process: Increment 2

When performing the Ith increment, bring the entire analysis and design up to date.

[Diagram: increments I1 and I2 progressing through Domain Analysis, Application Analysis, Application Design, Code, and Integrate and Test against project scope, with the callout "Testing?"]
Process Spectrum
Waterfall ←→ XP
Agile Manifesto
We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
• Individuals and interactions over processes and tools
• Working software over comprehensive documentation
• Customer collaboration over contract negotiation
• Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.
Kent Beck et al - February 13, 2001 – Snowbird, Utah
Quasi Agile Development
• A QAD environment is one where the manager must integrate:
  – the values and practices of agile development
  – the constraints of corporate policies
  – the reality of large-scale development
  – the complexity of dealing with multiple, scarce stakeholders
• The problems encountered by the manager of a QAD environment will primarily arise from a fundamental difference in core beliefs and values.
Individuals and interactions over processes and tools
Agile principles - 1
Working software over comprehensive documentation
Agile principles - 2
Customer collaboration over contract negotiation
Agile principles - 3
Responding to change over following a plan
Agile principles - 4
Agile Practices
• the planning game
• small releases
• metaphor
• simple design
• travel light
• no functionality added early
• refactoring
• test-first development
• pair programming
• collective ownership
• continuous integration
• 40-hour week
• on-site customer
• coding standards

Disclaimer: XP and Agile are not synonymous
Frequent, Small Releases
• Based on iterative development
• Very good for the development team
• Time consuming, albeit necessary, for the client

[Diagram: activity levels (light vs. heavy) for Domain Analysis, Application Analysis, Application Design, Class Development, and Integration and regression testing across increments In, In+1, In+2, In+3]
Refactoring
• Refactor mercilessly to keep the design simple as you go and to avoid needless clutter and complexity.
• Keep your code clean and concise so it is easier to understand, modify, and extend.
• Make sure everything is expressed once and only once.
• In the end it takes less time to produce a system that is well groomed.
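The "once and only once" rule can be sketched as a small refactoring. The invoice names and the tax rule below are invented for illustration; the point is that a rule repeated at several call sites is collapsed into one helper.

```java
// Illustration of "expressed once and only once": a tax computation that
// used to be duplicated at each call site is refactored into one helper.
public class InvoiceRefactoring {

    static final double TAX_RATE = 0.07; // illustrative rate

    // The single place where the tax rule now lives.
    static double withTax(double amount) {
        return amount + amount * TAX_RATE;
    }

    // Call sites reuse the helper instead of repeating the rule.
    static double lineItemTotal(double unitPrice, int quantity) {
        return withTax(unitPrice * quantity);
    }

    static double shippingTotal(double shippingCost) {
        return withTax(shippingCost);
    }

    public static void main(String[] args) {
        System.out.println(lineItemTotal(10.0, 3));
        System.out.println(shippingTotal(100.0));
    }
}
```

When the tax rule changes, only `withTax` changes, and the existing regression tests confirm nothing else broke.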
Pair Programming
Collective Ownership

• Collective code ownership encourages everyone to contribute new ideas to all segments of the project. Any developer can change any line of code to add functionality, fix bugs, or refactor. No one person becomes a bottleneck for changes.
• This is hard to understand at first. It's almost inconceivable that an entire team can be responsible for the system's architecture. Not having a single chief architect who keeps some visionary flame alive seems like it couldn't possibly work.
• An extensive regression test suite and near continuous integration makes this feasible for small teams.
Almost Continuous Integration
• Almost continuous integration avoids or detects compatibility problems early. Integration is a "pay me now or pay me more later" kind of activity.
• Developers should be integrating and releasing code into the code repository every few hours, whenever possible. In any case, never hold onto changes for more than a day.
• Each development pair is responsible for integrating their own code whenever a reasonable break presents itself.
On-Site Customer
• One of the few requirements of extreme programming (XP) is to have the customer available as a vital part of the development team.
• All phases of an XP project require communication with the customer, preferably face to face, on site.
• Beware of the customer's department trying to pass off a trainee as an expert. You need the expert.
Why are QAD Environments so Common?

• Agile principles and practices are intuitively compelling to developers
• Agile principles and practices are counterintuitive to a large number of managers, analysts, testers, and stakeholders
• Many developers have experienced success using agile principles and practices
• Agile principles and practices conflict with the commonly accepted management processes and corporate policies
Bottom Up
• “The interest in XP generally comes from the bottom up, from developers and testers tired of burdensome processes, documentation, metrics, and formality. These individuals are not abandoning discipline, but excessive formality that is often mistaken for discipline. They are finding new ways to deliver high-quality software faster and more flexibly.”
RUP
Balancing Agility and Discipline
• Personnel– Less skilled – Highly Skilled
• Criticality– Many lives, Single life, Essential Funds, Discretionary funds, Comfort
• Project Size– 300, 100, 30, 10, 3
• Culture– Requires Order, Thrives on Chaos
• Dynamism– Very stable requirements, Highly volatile requirements
Plan-driven ←→ Agile
Balancing Agility and Discipline
• Industry oversight– Heavily regulated, somewhat regulated, unregulated
• Hardware dependencies– Embedded, custom attached devices, desktop application
• Organizational culture– Suit and tie, business casual, jeans
• Industry type– Commercial businesses; Hi tech: aerospace, software
• Funding capability– Deep pockets, standard funding, shoestring budget
• Organization size– Huge: government; Small: start up
Plan-driven ←→ Agile
Unit 2
How Testing And Quality Assurance is Affected by a
Modern Iterative, Incremental Software Development Process
Testing Process Should Match the Development Process
• Basic development process is incremental
• Iterate within increments, with prototype support as necessary. Often this will involve reworking one piece of the system several times before an increment is finished. Previous increments are only revisited to fix errors or serious flaws.
• Testing process should be incremental
• Early increments are gray boxes
• The testing perspective is black box, but an early increment may have lots of incomplete functionality and may not be executable as a "stand-alone" system.
Every Increment
• Model testing• Component testing• Integration testing• Increment testing• Regression testing
RUP Workflow
[Diagram: Test Engineer — Plan Test, Design Test, Evaluate Test; Component Engineer — Implement Test; Integration Tester — Perform Integration Test; System Tester — Perform System Test]

*Figure 11.8, page 304, "The Unified Software Development Process," Jacobson, Booch, Rumbaugh: the workflow during testing, including the participating workers and their activities.
Who Should Play Each Role?
[Diagram: the roles from Figure 11.8 — Test Engineer, System/Increment Tester, Component/Class Tester, Integration/Cluster Tester — positioned along a spectrum from Test Professional to Developer]

*Adapted from Figure 11.8, page 304, "The Unified Software Development Process," Jacobson, Booch, Rumbaugh.
Importance of Mentoring
• Testing is becoming a well-defined specialty in software development. Unfortunately, most developers do not have the level of testing expertise we would like them to have.
• Each developer and tester should have access to a mentor who can provide guidance in developing, selecting, organizing, and executing test cases.
  – Testers mentor the developers
  – Expert testers mentor novice testers
Organization Test Strategy
• A comprehensive testing strategy for an organization will accomplish the following:
  – Define an organizational structure that supports and facilitates testing at all levels
  – Establish responsibilities for all testing activities
  – Establish policies and procedures for each organizational group
  – Establish a close relationship between testers, developers, and clients
Organizational Approach #1
• Advantages: ___
• Disadvantages: ___

[Diagram: a Define group feeding Teams 1, 2, and 3, each with its own Development and Testing staff]
Organizational Approach #2
• Advantages: ___
• Disadvantages: ___

[Diagram: a Define group feeding Team 1 (Development) and Team 2 (Testing)]
Organizational Approach #3
• Advantages: ___
• Disadvantages: ___

[Diagram: a Define group feeding development Teams 1, 2, and 3, with a single separate Testing group]
Recommended Organizational Approach
[Diagram: a Define group feeding Teams 1, 2, and 3, each with its own Development and Testing staff, plus a separate overall Testing group]
Choosing An Approach
• No matter which organizational approach you have, the important thing is to achieve good communication, cooperation, and collaboration.
Traditional Effort Distribution
• Waterfall Model

[Graph: testing focus over time — developer-led testing early, the test group's effort concentrated in the final Test phase]
RUP Effort Distribution

[Diagram: RUP phases (Inception, Elaboration, Construction, Transition) and iterations (preliminary iteration(s), Itr. #1, #2, … #n, #n+1, #n+2, … #m, #m+1) plotted against the core workflows (Requirements, Analysis, Design, Implementation, Test)]

*Figure 11.2, page 296, "The Unified Software Development Process," Jacobson, Booch, Rumbaugh
RUP Effort Distribution

• Developers use an iterative/incremental model
  – Testers adapt but don't convert

[Graph: testing focus over time for developer-led testing and the test group]
Ideal Effort Distribution

• Testers and developers use an iterative/incremental model

[Graph: testing focus over time for developer-led testing and the test group, spread across iterations]
Changing Role of Test Professionals

• Mentor
• Auditor
• Test engineer
• Model tester
• Increment tester
• Integration tester
• Framework tester
• Component certifier
• System tester
Test Philosophy
• Test early
• Test often
• Test enough
Test Process
• Analyze a little
• Design a little
• Code a little
• Test what you can
Why Do We Test?
What is the difference between the tester's perspective and the developer's perspective?
Test Allocation Strategies
• Directed tests
  – Test where the errors are
• Representative tests
  – Risk-based
  – Operational profile
Refactored Code
• The agile development philosophy results in frequently refactoring code
• Frequent restructuring of code that is already tested calls for lots of regression testing
  – Requires automated test suites
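A minimal sketch of the kind of automated regression suite such refactoring depends on. The `PriceCalculatorRegressionTest` class and the unit under test are hypothetical; the suite is rerun after every refactoring to confirm behavior is preserved.

```java
// Hypothetical automated regression suite: each test pins down behavior
// that a refactoring must preserve. Names are illustrative, not from the slides.
public class PriceCalculatorRegressionTest {

    // The unit under test: computes a discounted price.
    static double discountedPrice(double price, double discountPercent) {
        return price - price * discountPercent / 100.0;
    }

    static void testNoDiscount() {
        assert discountedPrice(100.0, 0.0) == 100.0;
    }

    static void testTenPercent() {
        assert discountedPrice(200.0, 10.0) == 180.0;
    }

    public static void main(String[] args) {
        // Run with `java -ea` so assertions are enabled.
        testNoDiscount();
        testTenPercent();
        System.out.println("regression suite passed");
    }
}
```

Because the suite runs unattended, it can be re-executed after every refactoring at essentially no cost, which is what makes "refactor mercilessly" safe.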
Testing refactored code and regression testing
• Automated tools become essential
Rationing Test Cases
Use Case # | Frequency | Criticality | Risk   | Combined Frequency/Criticality/Risk | Relative allocation of testing effort
1          | Low       | Low         | Low    | Low                                 | 1
2          | Low       | Medium      | High   | Medium                              | 3
3          | High      | High        | Medium | High                                | 9
4          | Low       | High        | Medium | Medium                              | 3
5          | Low       | Medium      | Medium | Medium                              | 3
6          | Low       | Medium      | Medium | Medium                              | 3
7          | Low       | Medium      | Medium | Medium                              | 3
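One plausible reading of the table's combination rule, sketched in code. The slides do not spell out how the three ratings are combined, so the rounded-average rule and the 1/3/9 effort scale below are assumptions that happen to reproduce every row of the table.

```java
// Assumed rationing rule: rate frequency, criticality, and risk as
// Low=1 / Medium=2 / High=3, combine by rounding the average, then
// allocate relative effort as 1 (Low), 3 (Medium), or 9 (High).
public class TestEffortAllocator {

    static int rating(String level) {
        switch (level) {
            case "Low":    return 1;
            case "Medium": return 2;
            case "High":   return 3;
            default: throw new IllegalArgumentException(level);
        }
    }

    // Combined rating: rounded average of the three ratings.
    static int combined(String freq, String crit, String risk) {
        return Math.round((rating(freq) + rating(crit) + rating(risk)) / 3.0f);
    }

    // Relative testing effort on the 1-3-9 scale.
    static int effort(String freq, String crit, String risk) {
        return (int) Math.pow(3, combined(freq, crit, risk) - 1);
    }

    public static void main(String[] args) {
        System.out.println(effort("Low", "Low", "Low"));      // use case 1
        System.out.println(effort("Low", "Medium", "High"));  // use case 2
        System.out.println(effort("High", "High", "Medium")); // use case 3
    }
}
```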
New Tester Skill Sets
• Need to be able to test "partially finished" artifacts
  – Development environment skills
• Team skills
  – Good at working with developers
• Familiarity with UML
  – Use case to test case
• Familiarity with the software development process

Start here!!
Testing “partially finished” artifacts
• Often requires mock objects
• This is a real mindset change for system testers
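A sketch of what a mock object looks like in practice. The `PaymentGateway`/`CheckoutService` names are invented: the idea is that the real component behind the interface does not exist yet, so a hand-rolled stand-in lets the partially finished artifact be tested anyway.

```java
// Testing a "partially finished" artifact with a mock object. The real
// PaymentGateway is not built yet; a hand-rolled mock stands in behind
// the interface. All names here are illustrative.
public class MockObjectExample {

    interface PaymentGateway {
        boolean charge(String account, long cents);
    }

    // The class under test depends only on the interface.
    static class CheckoutService {
        private final PaymentGateway gateway;
        CheckoutService(PaymentGateway gateway) { this.gateway = gateway; }
        String checkout(String account, long cents) {
            return gateway.charge(account, cents) ? "confirmed" : "declined";
        }
    }

    // The mock: returns a canned answer and records the interaction.
    static class MockGateway implements PaymentGateway {
        boolean answer;
        String lastAccount;
        public boolean charge(String account, long cents) {
            lastAccount = account;  // recorded so the test can verify the interaction
            return answer;
        }
    }

    public static void main(String[] args) {
        MockGateway mock = new MockGateway();
        mock.answer = true;
        CheckoutService service = new CheckoutService(mock);
        assert service.checkout("acct-1", 500).equals("confirmed");
        assert mock.lastAccount.equals("acct-1");
        System.out.println("mock-based test passed");
    }
}
```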
New Developer Skill Sets
• Testing skills
  – Test planning, test case selection, unit test execution
• Team skills
  – Good at working with testers and clients
• Willingness
• Familiarity with the testing process
• Testing tools
  – JUnit
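A minimal unit test in the JUnit style, written here in plain Java so the sketch runs without the JUnit jar. The fixture/test structure (a fresh `setUp` per test, one assertion-focused method per behavior) mirrors JUnit's conventions; the `Stack` class under test is illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// JUnit-style developer test, sketched without the JUnit dependency.
public class StackTest {

    // Minimal unit under test (illustrative).
    static class Stack {
        private final List<Integer> items = new ArrayList<>();
        void push(int x) { items.add(x); }
        int pop() { return items.remove(items.size() - 1); }
        boolean isEmpty() { return items.isEmpty(); }
    }

    private Stack stack;

    // Corresponds to JUnit's setUp(): a fresh fixture per test.
    void setUp() { stack = new Stack(); }

    void testPushThenPopReturnsLastItem() {
        setUp();
        stack.push(1);
        stack.push(2);
        assert stack.pop() == 2;
    }

    void testNewStackIsEmpty() {
        setUp();
        assert stack.isEmpty();
    }

    public static void main(String[] args) {
        StackTest t = new StackTest();
        t.testPushThenPopReturnsLastItem();
        t.testNewStackIsEmpty();
        System.out.println("2 tests passed");
    }
}
```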
Unit 3
Creating System Test Cases
Use Cases

• A use case describes a related set of end-to-end scenarios (sequences of actions) in which a user (actor) uses the system to accomplish a specific goal.
• Use cases:
  – Describe a system's functional requirements
  – Should include additional information such as risk assessment
Use Cases
• Use cases provide the following benefits:
  – Capture requirements from the users' perspective
  – Involve users in the requirements-gathering process
  – Provide a basis for the identification of classes and relationships
  – Provide traceability from requirements through analysis and into design
  – Serve as the foundation for system test cases
Use Case TemplateUse Case ID: {This should be coded to identify the owning team and the level of the use case}
Use Case Type: {Essential, Concrete, Abstract}
Use Case Name: {Short descriptive phrase}
Basic Course: {This is a complete description of the use. Each subsection is explained below.}
Actor(s): {Which actor(s) initiate and participate in this course of the use case?}
Pre-conditions: {Requirements on the state of the system prior to this use being valid.}
Description: {Numbered flow of events: 1 The user … 2 The system responds by...}
{In this section reference is made to abstract use cases that this use case uses.}
Relevant requirements: {Reference to other relevant requirements documents.}
Post-conditions: {This describes the state of the system following the completion of this use. Effects on other systems and actors may also be described.}
Alternative Courses and Exceptions: {Structured like the basic course}
Rationale: {Explanation of why this requirement is present in the system. This field is typically only used for essential use cases}
Extensions: {This section briefly describes extensions to this use case. It references those use cases that have an extends relation with the current use case.}
Concurrent Uses: {This use can occur concurrently with the uses listed in this section.}
Related Use Cases: {use cases that are either usually performed just before or after the current use.}
Use Case Template(Continued)
Decision Support
Frequency: {How often will this use of the system occur? This field is combined with the other decision support fields to determine the number of tests that will be used to test the functionality. It may also be used in certain design decisions.}
Criticality: {How necessary is this use to the successful functioning of the program from a user’s perspective. How bad would it be if this use case failed? This field is used to assist in determining the extent to which the use will be tested.}
Risk: {The project risk associated with providing this use. How likely are we to encounter problems in correctly implementing this use case? This is used in project planning. Often riskier uses will be implemented early to provide sufficient recovery time.}
---------------------------------------------------------------------------------------------------------------------
Modification History -- {Follow the standard corporate document versioning template}
Owner:
Initiation date:
Date last modified:
System Test Cases
• Modern processes usually employ a use case model to represent the majority of system requirements
• This model represents both the users of the system and their requirements for the system
[Diagram: use cases drive both the design of the system and its system test cases]
Use Case to Test Case
[Diagram: each use case instance* (instance 1 … n, n+1 … n+m, … n+m+j) maps to a corresponding test case (test case 1 … n, n+1 … n+m, … n+m+j)]

* A use case instance is often called a scenario
Hierarchical Use Cases
• Use cases can be layered to make the use case model more understandable for large systems
• Each lower layer provides “specializations” of the use case above it
• This approach directly supports a hierarchical set of test cases and can promote reuse of design and code
[Diagram: a General Use Case specialized by a Specific Use Case]
Developing Test Cases
• Scenarios serve as a guide to the creation of test cases
• Use risk analysis, equivalence classes, and boundary conditions to determine the number of test cases per use case.
[Diagram: each use case has a basic course and alternate courses; each course yields scenarios, and each scenario yields test cases]
Each Level is Complete
1. Define course policies
  1.1 Define late policy
  1.2 Define category weights

• Use case 1.1 is a specific, more detailed, complete use case within the category of use cases defined by use case 1.
Use cases are best developed iteratively and incrementally

• The only way to get quality is to iterate
• Requirements change while the system is being developed
• As the development team better understands the domain, they are better able to review the use cases
Use Case 1
• Customer buys soda at vending machine
  – customer inserts enough coins for purchase
  – machine displays total deposited
  – customer pushes button to indicate selection
  – machine dispenses soda can to bottom tray and change to change tray
  – customer retrieves soda and change

Many OO teams incorrectly think the first level of use cases should jump directly to interface specifications.
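The buy-soda use case can be turned into a system test case roughly as follows. The `VendingMachine` API here is hypothetical and in-process; a real system test would drive the actual machine interface. The amounts follow the $.85 Coke scenario used later in this unit.

```java
// Sketch of a system test case derived from the "buy soda" use case.
// The VendingMachine class is an illustrative stand-in for the real system.
public class BuySodaSystemTest {

    static class VendingMachine {
        private int deposited;                   // cents
        private static final int PRICE = 85;     // $.85 soda
        void insertCoin(int cents) { deposited += cents; }
        int totalDeposited() { return deposited; }
        // Returns change in cents; assumes enough money was deposited.
        int select(String soda) {
            int change = deposited - PRICE;
            deposited = 0;
            return change;
        }
    }

    public static void main(String[] args) {
        VendingMachine m = new VendingMachine();
        // 1. customer inserts enough coins for purchase
        m.insertCoin(25); m.insertCoin(25); m.insertCoin(25); m.insertCoin(25);
        // 2. machine displays total deposited
        assert m.totalDeposited() == 100;
        // 3. customer pushes button to indicate selection
        int change = m.select("coke");
        // 4. machine dispenses soda and change ($.15 change for a $.85 soda)
        assert change == 15;
        System.out.println("buy-soda scenario passed");
    }
}
```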
Business requirements should be kept separate from interface specifications

[Diagram: an abstract ACCEPT PAYMENT requirement, realized either electronically or by cash]
How?
• Keep first n levels of the use case hierarchy interface neutral
Use Case to Test Case

Essential Use Cases: Buy Soda, Update Prices, Stock Machine, Collect Money
Concrete Use Cases: Cash Purchase, Credit Card Purchase, Auto Debit, Bluetooth, …
Courses (alternates) within a Use Case: Exact payment (no change needed), Change, Cancel mid-purchase, …
Scenarios (use case instances): $.15 change for a $.85 Coke; $.25 change for a $.75 Sprite; $.10 change for a $.65 water
→ each scenario yields test cases

An increment might involve developing as little as a single course within a use case
Use Case Profiles

• The number of test cases will vary based on the priority of the use case.

Use Case # | Frequency | Criticality | Risk   | Combined Frequency/Criticality/Risk | Relative allocation of testing effort
1          | Low       | Low         | Low    | Low                                 | 1
2          | Low       | Medium      | High   | Medium                              | 3
3          | High      | High        | Medium | High                                | 9
4          | Low       | High        | Medium | Medium                              | 3
5          | Low       | Medium      | Medium | Medium                              | 3
6          | Low       | Medium      | Medium | Medium                              | 3
7          | Low       | Medium      | Medium | Medium                              | 3
Summary
Unit 4
Developers Test Too
Unit Testing
• Developers will test their own components
• Testers may help plan, oversee, and evaluate the unit testing process
  – So testers need to be familiar with unit testing
• Testing early increments will feel like unit testing
Testing components
• System test group may be called upon to certify components destined for the corporate software library
Testing Classes
• A message is sent to an object to invoke a method.
• The only way to interact with an object is to send it a message. Each message must correspond to a method in the receiving object’s protocol.
• Each class is a package of methods and attributes that must be tested as a unit.
Implications For Testing
• The separation of specification from implementation allows functional test cases (written from the specification) to be created separately from and prior to structural test cases (written from the implementation).
• Messaging between objects is similar to subroutine or function calls in procedural systems, but messages occur more frequently and may use dynamic binding.
• How does this affect testing?
Specifications
• Systems are specified by use cases
  – Stories are simplified use cases
• Components are specified by the syntax and semantics of the methods in the public interface of the component
  – The syntax is unique to the programming language or component environment
  – The semantics are specified by the pre-conditions, post-conditions, and class invariants
Definitions
• Pre-Condition – The pre-condition for a method is a statement of the assumptions made in designing the method. For example, the Add method for a list could have a pre-condition that no Add will be attempted if the list is full.
• Post-Condition – The post-condition for a method is a statement of the results produced by a method, provided the pre-condition was true initially. For example, the Add method for a list would have a post-condition that the item has been added to the list, assuming the list was not full initially.
Design By Contract: Strong Preconditions

class SavingsAccount

Open(acc_name: name, amount: money, kind: instrument)
  Pre:  not open AND amount >= required_opening_deposit
  Post: open AND balance = amount

Deposit(amount: money, kind: instrument)
  Pre:  open
  Post: balance_post = balance_pre + amount AND
        if amount >= notify.amount AND kind = cash
        then the IRS has been notified
...

Invariant: balance >= 0
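A sketch of the strong-precondition contract in Java. Under strong preconditions the caller guarantees the precondition, so the class checks it only with assertions; tests must never exercise precondition violations. The cent amounts are assumptions, and the IRS-notification clause is omitted to keep the sketch short.

```java
// Strong-precondition style: preconditions and the invariant are
// asserted, not handled. Run with `java -ea` to enable the checks.
public class SavingsAccount {
    private boolean open;
    private long balance;                                  // in cents
    private static final long REQUIRED_OPENING_DEPOSIT = 1000; // assumed value

    void open(long amount) {
        assert !open && amount >= REQUIRED_OPENING_DEPOSIT; // pre
        open = true;
        balance = amount;
        assert balance >= 0;                                // invariant
    }

    void deposit(long amount) {
        assert open;                                        // pre
        balance += amount;                                  // post: balance_post = balance_pre + amount
        assert balance >= 0;                                // invariant
    }

    long balance() { return balance; }

    public static void main(String[] args) {
        SavingsAccount a = new SavingsAccount();
        a.open(1000);
        a.deposit(250);
        System.out.println(a.balance());
    }
}
```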
Design By Contract: Weak Preconditions

class SavingsAccount

Open(acc_name: name, amount: money, kind: instrument)
  Pre:  none
  Post: if already open then throw already_open exception
        else if amount < required_opening_amount then throw amount_exception
        else open AND balance = amount

Deposit(amount: money, kind: instrument)
  Pre:  none
  Post: if not open then throw not_open exception
        else balance_post = balance_pre + amount AND
             if amount >= notify.amount AND kind = cash
             then the IRS has been notified
...

Invariant: balance >= 0
Testing Perspective

Deposit(amount: money, kind: instrument)
  Pre:  none
  Post: if not open then throw not_open exception
        else balance_post = balance_pre + amount AND
             if amount >= notify.amount AND kind = cash
             then the IRS has been notified

Deposit(amount: money, kind: instrument)
  Pre:  open
  Post: balance_post = balance_pre + amount AND
        if amount >= notify.amount AND kind = cash
        then the IRS has been notified

To write the test script, the tester needs to know which style of programming the developer has used!
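For contrast, a Java sketch of the weak-precondition (defensive) style: the exceptional courses are now part of the specification, so the tester writes test cases for them too. The exception names mirror the contract above; the class is otherwise simplified and the IRS clause omitted.

```java
// Weak-precondition style: the method itself checks and throws, so the
// tester now tests the exceptional courses as specified behavior.
public class DefensiveSavingsAccount {
    private boolean open;
    private long balance;                                  // in cents
    private static final long REQUIRED_OPENING_DEPOSIT = 1000; // assumed value

    void open(long amount) {
        if (open) throw new IllegalStateException("already_open");
        if (amount < REQUIRED_OPENING_DEPOSIT)
            throw new IllegalArgumentException("amount_exception");
        open = true;
        balance = amount;
    }

    void deposit(long amount) {
        if (!open) throw new IllegalStateException("not_open");
        balance += amount;                                 // balance_post = balance_pre + amount
    }

    long balance() { return balance; }

    public static void main(String[] args) {
        DefensiveSavingsAccount a = new DefensiveSavingsAccount();
        // The exceptional course gets an explicit test case.
        boolean threw = false;
        try { a.deposit(100); } catch (IllegalStateException e) { threw = true; }
        assert threw;
        // Then the normal course.
        a.open(1000);
        a.deposit(100);
        System.out.println("weak-precondition tests passed");
    }
}
```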
Levels Of Testing

• Class testing is the object-oriented equivalent of unit testing in the procedural paradigm. Class testing combines traditional black-box/white-box testing approaches with some aspects of integration testing.
• Cluster testing is used to test a set of closely interacting classes. The focus at this level is the interaction between objects in the cluster.
• System testing is used to test a complete system. The focus at this level is demonstrating the required functionality and performance.
Types Of Test Cases

• Three types of test cases allow the tester to know that various aspects of the component or system under test are correct:
  – Functional test cases are created to verify the product against the specification
  – Structural test cases are created to fully exercise the code
  – Interaction test cases are created to explicitly determine the correctness of the interactions
Functional Test Cases

• Constructed by analyzing the specification of the class — the aggregation of the specifications of all methods
• Coverage may be expressed in terms of the percentage of post-conditions, or the percentage of transitions in the state representation, covered by the selected test cases
• Remember: it is impossible to guarantee any level of coverage of the underlying implementation
• Synonyms: (1) specification-based testing; (2) black-box testing
Functional Test Case Construction
• Each pre-condition is used to establish the appropriate testing environment for the object
• Each post-condition is a logical statement constructed as a sequence of “if-then” clauses. These may be linked together with disjunctive (“or”) clauses. Each disjunctive clause may have several conjunctive (“and”) clauses
Functional Test Case Construction
Example:
(If the UserID is valid, then the system will …)
or (if the UserID is invalid, then the system will ….)
104/136
Functional Test Case Construction
• Create a test case for each “or” clause.
– For each “or” clause check the resulting environment to determine that every “and” clause within the “or” clause has been satisfied.
• Create a test case for each exception.
• Create test cases for obvious boundary conditions such as empty stacks, full stacks, a stack with only one element.
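The "one test case per or-clause" rule can be sketched against the UserID post-condition on the earlier example slide. Everything here is illustrative: the slides give only the outline of the post-condition, so `LoginValidator`, `validate`, the non-empty validity rule, and the ACCEPTED/REJECTED results are all invented stand-ins.

```java
class LoginValidator {
    // Hypothetical validity rule: the slides only distinguish "valid"
    // from "invalid", so a non-empty UserID stands in for "valid" here.
    public String validate(String userId) {
        if (userId != null && !userId.isEmpty()) {
            return "ACCEPTED"; // first "or" clause of the post-condition
        }
        return "REJECTED";     // second "or" clause of the post-condition
    }

    public static void main(String[] args) {
        LoginValidator v = new LoginValidator();
        // One functional test case per "or" clause:
        System.out.println(v.validate("jsmith")); // valid-UserID clause
        System.out.println(v.validate(""));       // invalid-UserID clause
    }
}
```

A complete suite would also add a test case for each exception the specification names, plus the obvious boundary conditions.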
105/136
Equivalence Classes
• An equivalence class consists of all data values that should be processed identically based on the requirements.
[Diagram: a number line for Age marked at 0, 16, 18, and 55 – the boundaries that partition ages into equivalence classes]
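The Age partitions above can be turned into a small sketch. The boundaries (0, 16, 18, 55) come from the slide; the classifier function and the category names are invented for the example.

```java
class AgeClassifier {
    // Hypothetical partitioning over the Age boundaries from the slide.
    public static String category(int age) {
        if (age < 0)  return "invalid";
        if (age < 16) return "child";
        if (age < 18) return "youth";
        if (age < 55) return "adult";
        return "senior";
    }

    public static void main(String[] args) {
        // One representative value per equivalence class is enough,
        // because every value in a class should be processed identically.
        int[] representatives = { -5, 8, 17, 30, 70 };
        for (int age : representatives) {
            System.out.println(age + " -> " + category(age));
        }
    }
}
```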
106/136
Boundary Value Testing
• Boundary value testing chooses test conditions at the extremes of the legitimate range of values.
• Should boundary testing include values that are not legitimate, i.e. that do not meet the pre-conditions?
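A sketch of boundary value selection, reusing the Age diagram's 18 and 55 as the extremes of an assumed legitimate range. `isEligible` and the [18, 55] range are invented for illustration; note the probes include values just outside the range, which is exactly the question the bullet above raises.

```java
class BoundaryDemo {
    // Hypothetical range check: the valid range [18, 55] is assumed
    // from the Age diagram on the equivalence-class slide.
    public static boolean isEligible(int age) {
        return age >= 18 && age <= 55;
    }

    public static void main(String[] args) {
        // Probe each boundary and its neighbors, including values
        // just outside the legitimate range.
        int[] probes = { 17, 18, 19, 54, 55, 56 };
        for (int age : probes) {
            System.out.println(age + " -> " + isEligible(age));
        }
    }
}
```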
107/136
Structural Test Cases

• Constructed by analyzing the internal implementation of the class

• Coverage is expressed in terms of the percentage of the code that is executed. This can be defined in terms of statements, branches, or paths

• Unfortunately, we can't usually cover all paths, so we can't really say how much of the specification has been covered

• Synonyms: (1) Program-based; (2) White-box; (3) Clear-box
108/136
Structural Test Cases
• Use techniques similar to those for procedurally-oriented analysis to identify test cases that exercise the paths through the code.
109/136
Exercise 4.3 Background
• Review the following code for the computeActualCost method:
Dollars computeActualCost(Rate r, Hour h) {
  int days, hours, weeks;
  days = h / 24;
  hours = h % 24;
  weeks = days / 7;
  days = days % 7;
  if (r == "standard") return (240*weeks + 40*days + 8*hours);
  if (r == "A") return (140*weeks + 28*days);
  if (r == "B") return (150*weeks + 30*days);
  if (r == "C") return (160*weeks + 32*days);
  if (r == "D") return (230*weeks + 36*days + 3*hours);
}
• Determine the number of test cases necessary to test all of the unique paths in this method. Define each of these test cases.
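One possible working of the exercise is sketched below: a runnable transcription of `computeActualCost`, with `String` and `int` standing in for the `Rate`, `Hour`, and `Dollars` types (which the slides do not define) and a default return added for the path where no rate matches – a path the original code falls off of. The driver exercises each of the six unique paths.

```java
class ActualCost {
    public static int computeActualCost(String r, int h) {
        int days = h / 24;
        int hours = h % 24;
        int weeks = days / 7;
        days = days % 7;
        if (r.equals("standard")) return 240 * weeks + 40 * days + 8 * hours;
        if (r.equals("A")) return 140 * weeks + 28 * days;
        if (r.equals("B")) return 150 * weeks + 30 * days;
        if (r.equals("C")) return 160 * weeks + 32 * days;
        if (r.equals("D")) return 230 * weeks + 36 * days + 3 * hours;
        return 0; // sixth path: no rate matched (added for the sketch)
    }

    public static void main(String[] args) {
        // Six test cases, one per unique path. h = 200 gives
        // weeks = 1, days = 1, hours = 8.
        String[] rates = { "standard", "A", "B", "C", "D", "unknown" };
        for (String r : rates) {
            System.out.println(r + " -> " + computeActualCost(r, 200));
        }
    }
}
```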
110/136
Interaction Test Cases - Intraclass
• Test cases are identified by considering methods that access a common attribute or send messages to other methods within the object.
• Coverage would be expressed as a percentage of interactions tested.
111/136
Interaction Test Cases - Intraclass
• Consider these two methods in the same class:
• Do these methods interact?
[Diagram: set_width() and get_width(), both accessing the attribute int width]
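They do interact: both methods touch the shared attribute, so an interaction test drives one and observes the other. A minimal sketch (the enclosing `Rectangle` class is assumed; the slides show only the two methods and the attribute):

```java
class Rectangle {
    private int width;

    public void set_width(int w) { width = w; }
    public int get_width() { return width; }

    public static void main(String[] args) {
        Rectangle r = new Rectangle();
        // Interaction test: the value written by set_width must be
        // the value observed by get_width through the shared attribute.
        r.set_width(42);
        System.out.println(r.get_width());
    }
}
```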
112/136
Interaction Test Planning Matrix
• Intra-class interactions are identified as either one method invoking another (M), two methods messaging the same object (O), or two methods sharing data (D).

[Matrix: rows and columns Start, Stop, Reset, Read; each cell records the type of interaction (M, O, or D) between the pair of methods]
112/136
Test Case Execution Sequence
• Develop a functional test suite that covers the complete class specification
– Develop state-based test cases for all of the transitions in the dynamic model
• Develop test cases that test the interactions between methods within a class and between classes
• Develop structural test cases to cover every line of code and every conditional
114/136
Parallel Architecture For Component Testing
[Diagram: PACT – a Generic Test Harness from which TesterOfClass1 through TesterOfClass6 derive; each tester class tests its corresponding class in the production architecture of Class1 through Class6]
115/136
Generic Test Harness Class
• Methods in the Generic Test Harness include:
– Test scripts that sequence test execution
– Logging and reporting mechanisms
– Methods that “catch” exceptions
– Methods that watch for memory leakage
– Methods that access test hardware
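A minimal sketch of such a harness, covering only the sequencing, logging, and exception-catching responsibilities (the memory-leak and hardware hooks are omitted). All names here are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

class GenericTestHarness {
    // One step of a test script.
    interface TestStep { void run() throws Exception; }

    private final List<String> log = new ArrayList<>();

    // Runs a step, catching exceptions so one failure cannot abort the run.
    public void execute(String name, TestStep step) {
        try {
            step.run();
            log.add("PASS " + name);
        } catch (Exception e) {
            log.add("FAIL " + name + ": " + e.getMessage());
        }
    }

    public List<String> report() { return log; }

    public static void main(String[] args) {
        GenericTestHarness harness = new GenericTestHarness();
        harness.execute("passes", () -> { /* nothing thrown */ });
        harness.execute("fails", () -> { throw new Exception("boom"); });
        harness.report().forEach(System.out::println);
    }
}
```

Concrete TesterOfClassN classes would inherit or reuse this machinery and add their own test scripts, which is the shape JUnit gives to the same idea.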
116/136
JUnit
• JUnit is a Java-based implementation of the PACT testing framework created by Kent Beck and Erich Gamma
• You can find a JUnit tutorial and download the JUnit code from:
http://members.pingnet.ch/gamma/junit.htm
http://www.extreme-java.de/junitx/ (for accessing private variables)
117/136
JUnit
• JUnit provides a class TestCase from which your individual test case classes inherit
• TestCase provides methods such as assertEquals that are used to verify that the derived result equals the expected result. If the assertEquals fails, JUnit writes the information to a log file
• JUnit also provides a TestSuite class that can store and then execute a collection of tests (individual test cases and other test suites)
118/136
JUnit
• JUnit provides a graphical interface used to run the tests. Type the name of the test class or suite and press the Run button
119/136
JUnit
• While the test is run, JUnit displays a progress bar. The bar remains green as long as the tests are executing successfully but turns red if an error is encountered
• Details of a test case failure can also be displayed
120/136
Summary
121/136
Unit 5
Test First Development
122/136
Test First
• Many programmers have adopted this approach with near religious zeal
“Test-first design is infectious! Developers swear by it. We have yet to meet a developer who abandons test-first design after giving it an honest trial.” Robert C. Martin
123/136
The Process
• Write a test that specifies a tiny bit of functionality
• Compile the test and watch it fail (you haven't built the functionality yet!)
• Write only the code necessary to make the test pass
• Refactor the code, ensuring that it has the simplest design possible for the functionality built to date
• Repeat
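One turn of this loop, sketched without any framework. `TestFirstDemo` and `add` are invented for the example; the point is only the sequence: the test existed (and failed) before the functionality did.

```java
class TestFirstDemo {
    // Step 3: the minimal code needed to make the test pass.
    // (Before this was written, a stub returning 0 made the test fail.)
    static int add(int a, int b) { return a + b; }

    public static void main(String[] args) {
        // Step 1: a test specifying a tiny bit of functionality.
        System.out.println(add(2, 3) == 5 ? "PASS" : "FAIL");
    }
}
```

Step 4 would then refactor while keeping this test green, before the next test is written.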
124/136
The Philosophy
• Tests come first
• Test everything
• All tests run at 100% all the time
125/136
Implications

• Code is written so that modules are testable in isolation.
– Code written without tests in mind is often highly coupled, a big hint that you have a poor object-oriented design. If you have to write tests first, you'll devise ways of minimizing dependencies in your system in order to write your tests.

• The tests act as system-level documentation.
– They are the first client of your classes; they show how the developer intended for the class to be used.

• The system has automated tests by definition.
– As your system grows, running full regression tests manually will ultimately take outrageous amounts of time.

• Measurable progress is paced.
– Tiny units of functionality are specified in the tests. Tiny means seconds to a few minutes. The code base progresses forward at a relatively constant rate in terms of the functionality supported.
126/136
Problems
• Specifications are hidden in the tests
• Tests should not be the only documentation
– Pre and post conditions should be explicit
• Test first development should be combined with programming by contract
127/136
The Revised Process
• Write the specification for the component
– Method signatures and pre and post conditions
• Write a test that exercises a tiny bit of functionality
• Compile the test and watch it fail (you haven't built the functionality yet!)
• Write only the code necessary to make the test pass
• Refactor the code, ensuring that it has the simplest design possible for the functionality built to date
• Repeat
128/136
Extensions
• Test driven development focuses on unit testing
– Increments could be defined by test cases
• System level test cases, for an increment, should be created by the test team before code is written for that increment
– Ideally these test cases would be approved by the stakeholders
129/136
Unit 6
Summary
How the fundamentals of testing are affected by a modern iterative, incremental software development process
130/136
Test Planning
• Planning is incremental, just like everything else
• Planning tends to be lighter weight
• Planning tends to be more about process than actual test cases
131/136
Test Budgeting
• Budget is spread throughout the project, not saved till the end
132/136
Test Execution
• Starts with increment 1
• Automated
• Run almost continuously
133/136
Test Evaluation and Process Improvement
• Process improvement is primarily focused on improvements within a given project
There's a fundamental assumption in the CMMI that processes can be repeatable, and that they are predictive processes, basically not empirical processes. That is the fundamental flaw in the CMMI.

Michael Spayd, former CMM process assessor
134/136
Top 10 Potential Pitfalls
1. Over the wall
2. Rebelling at testing immature increments
3. Manual tests
4. Organizational barriers
5. Vague requirements
6. Waterfall requirements
7. Record/playback tools
8. Rigid CMMI implementation
9. Insufficient regression testing
10. Ad hoc development of unit tests
135/136
Top 10 Factors for Success
1. Automated tests
2. Respect each other’s culture
3. Integrated organizational structure
4. Developers learn about testing
5. Testers learn the development environment
6. Good requirements, obtained iteratively
7. Stakeholder access
8. Educated management
9. Tester involvement in requirements
10. Tester involvement in models
136/136
Thanks for coming
• On behalf of Korson Consulting, thanks for attending this course.
• Let us know about your testing work. We’d like to hear about your successes and your difficulties.
• My e-mail address is:[email protected]
137/136
Sample Use Case
138/136
Use Case ID: 32.1.3
Use Case Level: Concrete
Actor: Controller
Pre-conditions:
Exchange rates defined for all currencies of accounts to be included in revaluation with the exchange type as used by the journal for revaluation. Journal for revaluation must be a General Journal Type.
Description:
The purpose of this feature is to determine what adjustments need to be made in order for the account currency and base currency balances of given accounts to match the latest exchange rates for a given dated cycle. Note: For first release, there is no requirement that the revaluation and its calculated adjustments be persisted other than as a transaction with reval entries posted into the system.
Calculate Revaluation
Sample use case from a real project
139/136
Trigger:
The user initiates an action by...
1. Selecting the revaluation option from the "Tasks" menu.
2. When the revaluation input parameter window comes up, the user enters the following parameters:
• A set of accounts to revalue - The system will revalue those accounts in this set whose isRevalued() property is true.
• A set of currencies to revalue - The system will exclude from revaluation any given accounts whose currency is not in this set.
• Apply date - Date on which to apply the revaluation posting
• Exchange Cycle - Cycle used to determine which exchange rates the accounts are revalued to. (This must be a dated cycle. The system will choose the exchange rates effective as of the ending date of this cycle).
140/136
• Fluctuation Account - The gain/loss account where the balancing entries to the adjustments will be posted. This account's currency must be the base currency
• Journal - The journal in which the revaluation transaction will be created and posted from. This must be a General Journal type (i.e. not any of the special journals like checks, receipts, etc). The system will choose the exchange rates of the Exchange type defined in this journal.
• Document number (Optional) - Document number for the revaluation transaction. This is only valid if a non-autonumbered journal is selected. This field should be grayed out otherwise.
• Description (Optional) - Description for the revaluation transaction. This should default to "Revaluation" or the equivalent client locale translation
141/136
The input screen should look similar to the following:
Note: The left pane should contain a view of the classifier tree so that the user can select which accounts to revalue. Selecting a classifier indicates that all accounts under this node (including those under all levels of sub-classifiers) should be selected. InternalFrame title should be something like "Revaluation Input Parameters".
142/136
The system responds by...
Calculating the appropriate adjustments to be made to the given accounts and displaying a screen that shows the adjustments.
The screen should look similar to the following:
143/136
Relevant requirements:
1. See revaluation section of SunAccount IAM manual.
2. Revaluation procedure document.
3. Revaluation report spec (See Carl Friday)

Post-conditions:
1. Adjustments are (re-)created on the server according to the revaluation parameters
2. System brings up a window that shows detail on adjustments that need to be made to bring the accounts' currency balances up to date with the effective exchange rates.

Misc Sanity Checks:
There should be no more adjustments than accounts originally selected to be revalued.
There should be no more Fluctuation Items than currencies originally selected to be revalued.
144/136
Alternative Courses of Action:
There are ways to accomplish the same result by overriding exchange rates, but discussions with accountants prove that it is too horrendously tedious and is never acceptable to any accountant in their right mind.
Exceptions: Inconsistent account currency between account(s) to revalue and its/their RAP accounts
Extensions: From the revaluation adjustments screen, the user may select the option to post or print this revaluation, or both.
Concurrent Uses:
Related Use Cases:
PostRevaluation, PrintRevaluationReport
145/136
-------------------------------------------------------------------------------------------------------
Decision Support
Frequency:
Low: Typically once per month. Possibly (though rarely) once per day.
Criticality:
High: This feature (together with PostRevaluation) must be present for multi-currency installations. If it doesn’t calculate the correct revaluation amounts, it will lead to incorrect information in the accounts.
Risk:
High usability risk for multi-currency installations.
---------------------------------------------------------------------------------------------------------
Modification History --
Owner: Kharl Friday Initiation date: Feb 23, 2000 Date last modified: Mar 1, 2000