
Page 1

Juha Itkonen
Helsinki University of Technology
SoberIT
[email protected]

Page 2

Contents

Introduction to experience-based and exploratory testing

Intelligent manual testing

Session-Based Test Management

Highlights of our Research on Exploratory Testing


Page 3

The idea of testing – test case design?

Infinite number of values to test in any realistic software system
Only a few of those values will reveal the defects

Systematic test case design aims at selecting good inputs systematically
that reveal the defects
that cover the software
that minimize testing effort
that build on a model of the software system
that are based on some theory
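To make "selecting good inputs systematically" concrete, here is a minimal sketch of equivalence partitioning combined with boundary values in Python. The validator function and its valid range 0..120 are hypothetical illustrations, not taken from the slides.

```python
# Minimal sketch of systematic input selection (equivalence partitioning plus
# boundary values). The validator and its valid range 0..120 are hypothetical
# examples, not part of the original slides.

def is_valid_age(age: int) -> bool:
    """Hypothetical system under test: accepts integer ages from 0 to 120."""
    return 0 <= age <= 120

def boundary_inputs(low: int, high: int) -> list[int]:
    """One representative per partition plus values around each boundary."""
    return [low - 1, low, low + 1, (low + high) // 2, high - 1, high, high + 1]

for value in boundary_inputs(0, 120):
    expected = 0 <= value <= 120   # expected result follows from the model of the valid range
    actual = is_valid_age(value)
    status = "ok" if actual == expected else "DEFECT?"
    print(f"age={value:4d} expected={expected} actual={actual} {status}")
```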


Page 4

Manual Testing

Testing that is performed by human testers. Research has shown:
1. Individual differences in testing are high
2. Test case design techniques do not explain the results

Stereotype of manual testing:
Executing detailed pre-designed test cases
Mechanical step-by-step following of the instructions
Treated as work that anybody can do

In practice, it's clear that some testers are better than others in manual testing and more effective at revealing defects...

Page 5

My viewpoint: Experience-Based – Intelligent – Manual Testing

Manual testing that builds on the tester's knowledge and skills
Some aspects of testing rely on the tester's skills during testing
e.g., exact input values, expected results, or interactions
Testers are assumed to know what they are doing
don't program people

Focus on the actual testing work in practice
What happens during testing activities?
How are defects actually found?
Experience-based and exploratory aspects of software testing

Page 6

Exploratory Testing is manual testing without predefined test cases, based on the experience, knowledge, and skills of the tester.

1. Tests are not defined in advance: exploring with a general mission, without specific step-by-step instructions on how to accomplish the mission
2. Testing is guided by the results of previously performed tests and the knowledge gained from them
3. The focus is on finding defects by exploration, instead of systematically covering the software or producing comprehensive test cases for later use
4. Simultaneous learning of the system under test, test design, and test execution
5. Experience and skills of an individual tester strongly affect effectiveness

Page 7

Scripted vs. Exploratory Tests – the minefield analogy

(Figure: minefield diagram; legend: bugs, fixes)

Page 8

Exploratory and scripted testing are the ends of a continuum

ET is an approach; many testing techniques can be used in an exploratory way

(Figure: a continuum from freestyle exploratory "bug hunting", through chartered exploratory testing, high-level test cases, and manual scripts, to pure scripted (automated) testing)

Page 9

Scripted Model vs. Mental Model

In scripted testing, tests are first designed and recorded. Then they may be executed at some later time or by a different tester.

In exploratory testing, tests are designed and executed at the same time, and they often are not recorded.

A mental model of the product: a model of what the product is, how it behaves, and how it's supposed to behave

James Bach, Rapid Software Testing, 2002

Page 10

Lateral thinking

Allowed to be distracted
Find side paths and explore interesting areas
Periodically check your status against your mission

Page 11

Exploring is asking questions and questioning

In exploratory testing, tests are like questions
Questioning skills are at the heart of ET
An exploratory tester proceeds by asking questions of the software under test
and uses the answers to guide testing and find the next questions

Let's play "20 questions": I think of some "thing" and you try to find out that "thing" by asking 20 questions that I can answer with either 'yes' or 'no'

Imagine having to write down those questions beforehand


Page 12

ET and Dimensions of testing

Testing combines techniques that focus on
Testers: who does the testing
Coverage: what gets tested
Potential problems: what risks are you testing for
Evaluation: how to tell whether the test passed or failed
Activities: how do you test

Exploratory Testing is an experience-based approach

ET explicitly does not script how to test

All testing involves all five dimensions. However, each testing technique typically covers only one or two.

Black Box Software Testing (Course Notes, Academic Version, 2004) www.testingeducation.org

Page 13

Intelligent Manual Testing (IMT) is more than ET

IMT, in most cases, is largely exploratory testing
ET is always IMT

IMT trusts the skills of the tester and gives freedom
However, the tester can be supported by systematic practices that make testing less exploratory
But more manageable

IMT can be pre-planned and systematic
The role of exploring can be small

Page 14

Strengths of IMT – Testers’ skills

Utilizing the skills and experience of the tester
Testers know how the software is used and for what purpose
Testers know what functionality and features are critical
Testers know what problems are relevant
Testers know how the software was built
Risks, tacit knowledge

Enables creative exploring
Enables fast learning and improving testing

Page 15

Strengths of IMT – Process

Agility and flexibility
Easy and fast to focus on critical areas
Fast reaction to changes
Ability to work with missing or weak documentation

Effectiveness
Reveals a large number of relevant defects

Efficiency
Low documentation overhead
Fast feedback

Page 16

Challenges of IMT

Planning and tracking
How much testing is needed, how long does it take?
What is the status of testing?
How to share testing work between testers?

Managing test coverage
What has been tested?
When are we done?

Logging and reporting
Visibility outside the testing team
or outside individual testing sessions

Quality of testing
How to assure the quality of the tester's work
Detailed test cases can be reviewed, at least

Weaker ability to prevent defects through early test design

Page 17

Exploratory testing vs. ad-hoc testing?

Exploratory test design and execution easily seem like ad hoc activities

ET is ad hoc at the level of test case design
Ad hoc actually means fit for the situation at hand [1]
NOT unmanaged, sloppy, vague, or random

Exploratory testing can be a planned, goal-driven and managed process – not an ad hoc process
Goals, focus, and coverage can be well planned
A detailed test log is created during the testing and defects are reported
Session-based testing is an approach to manage this creative and flexible process

[1] "for the particular end or case at hand without consideration of wider application" – Merriam-Webster Online Dictionary

Page 18

Session-Based Test Management (SBTM)
Bach, J. "Session-Based Test Management", STQE, vol. 2, no. 6, 2000.

Charter
Time Box
Reviewable Result
Debriefing

Page 19

Charter

Designing the charters is test planning and design
Brief information / guidelines on:

What should be tested?
Areas, components, features, …

Why do we test this?
Goals, focus

How to test (approach)?
Specific techniques or tactics to be used
Test data to be used

What problems to look for?

Might include guidelines on:
Tools to use
What risks are involved
Documents to examine

Desired output (information objectives) from the testing
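To make the charter contents above concrete, here is a minimal sketch of a charter as a data structure, as one might keep it in a simple in-house SBTM tool. The field names and the example values are illustrative assumptions, not a standard format from the slides or from Bach's article.

```python
# Minimal sketch of a session charter as a data structure; field names and
# example content are illustrative assumptions, not a standard SBTM format.
from dataclasses import dataclass, field

@dataclass
class Charter:
    area: str                                        # what should be tested (areas, components, features)
    goal: str                                        # why we test this (goals, focus)
    approach: str                                    # how to test (techniques, tactics, test data)
    problems_to_look_for: list[str] = field(default_factory=list)
    tools: list[str] = field(default_factory=list)   # optional guidelines
    risks: list[str] = field(default_factory=list)
    documents: list[str] = field(default_factory=list)
    desired_output: str = ""                         # information objectives of the session

# Example charter for a hypothetical product area.
charter = Charter(
    area="Invoice import: CSV and XML feeds",
    goal="Assess robustness of the new import parser before the beta release",
    approach="Exploratory testing with malformed and boundary-value input files",
    problems_to_look_for=["crashes", "silent data loss", "misleading error messages"],
    desired_output="List of open risks in the import feature",
)
print(charter.goal)
```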

Page 20

Time Box
e.g. Short: 60 minutes (±15)
Normal: 90 minutes (±15)
Long: 120 minutes (±15)

Focused test effort of fixed duration
No interruptions or other simultaneous tasks

Brief enough for accurate reporting
Brief enough to allow flexible scheduling
Brief enough to allow course correction
Long enough to get solid testing done
Long enough for efficient debriefings
Beware of overly precise timing

Page 21

Reviewable results

Charter
Test notes
  What was tested, how
  Reproducibility
  What needs more testing, new test ideas, etc.
Bugs and issues
Test data
Effort breakdown
  Duration (hours:minutes)
  Test design and execution (percent)
  Bug investigation and reporting (percent)
  Session setup (percent)
  Charter / opportunity (percent/percent)
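As a small illustration of the effort-breakdown metrics listed above, the sketch below computes them from logged minutes. The helper name and the example numbers are assumptions made up for illustration, not part of the slides or of the SBTM article.

```python
# Minimal sketch of the effort-breakdown metrics listed above, computed from
# logged minutes; the helper name and example numbers are illustrative assumptions.
def effort_breakdown(setup_min: int, test_min: int, bug_min: int) -> dict:
    """Split a session's duration into setup, test design & execution, and bug investigation."""
    total = setup_min + test_min + bug_min
    return {
        "duration": f"{total // 60}:{total % 60:02d}",                        # hours:minutes
        "test_design_and_execution_pct": round(100 * test_min / total, 1),
        "bug_investigation_and_reporting_pct": round(100 * bug_min / total, 1),
        "session_setup_pct": round(100 * setup_min / total, 1),
    }

print(effort_breakdown(setup_min=10, test_min=65, bug_min=15))
# -> {'duration': '1:30', 'test_design_and_execution_pct': 72.2,
#     'bug_investigation_and_reporting_pct': 16.7, 'session_setup_pct': 11.1}
```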

Page 22

Debriefing

The test lead reviews the session notes to assure that he understands them and that they follow the process

Results of the session
What was tested and how
Other quality information

The tester answers any questions
Session metrics are checked
Charter may be adjusted
Session may be extended
New sessions may be chartered
Coaching / mentoring happens

Page 23

Session-Based Testing – Way to Manage ET

Enables planning and tracking exploratory testing
Without detailed test (case) designs

Planning and tracking testing work in small chunks
Dividing testing work into smaller tasks
Tracking testing work in time-boxed sessions

Can help getting testing done when resources are scarce
Efficient – no unnecessary documentation
Utilizes the tester's skills and experience during test execution
Agile – it's easy to focus testing on the most important areas based on the test results and other information
Changes in requirements, increasing understanding, revealed problems, …

Page 24

Highlights of our ET Research

1 – "Exploratory Testing: A Multiple Case Study“

2 – "Defect Detection Efficiency: Test Case Based vs. Exploratory Testing“

3 – “How Do Testers Do It? An Exploratory Study on Manual Testing Practices"g

HELSINKI UNIVERSITY OF TECHNOLOGY24Juha Itkonen, 2009

SoberIT

Page 25

Motivation

ET is a widely used approach in the industry
Based on our earlier research
Practitioner reports in the literature and seminars

Anecdotal results of high defect detection efficiency of ET exist
Reported by consultants and practitioners

Existing research ignores testing not based on pre-designed test cases
Testing in the research literature is mainly visible as test case design techniques

Besides test cases, many other factors have a strong effect on the results of testing
E.g. different testers find different defects even if using the same technique

Page 26

Results of a multiple case study
Itkonen and Rautiainen. "Exploratory Testing: A Multiple Case Study", ISESE, 2005.

Case study in three software product companies
How do they use ET
Why do they use ET
What benefits they feel it provides
What challenges they faced when using ET

Page 27

Ways of utilizing ET

1. Session-based exploratory testing
2. Functional testing of individual features
3. Exploratory smoke testing
4. Freestyle exploratory testing
   Unmanaged ET as part of other duties
   Extending test cases
5. Outsourced exploratory testing
   Advanced users, strong domain knowledge
   Real use scenarios
6. Exploratory regression testing
   by verifying fixes or changes

Page 28

Reasons for using ET

Writing detailed test cases for everything is difficult and laborious
The software can be used in so many ways, or there are so many combinations between different features

With ET it is easy to test from the user's viewpoint
ET is a natural way of testing, which enables utilizing the testers' experience and creativity during testing
ET enables quick feedback on new features
from testers to developers
ET adapts well to changing situations
Where the requirements and the tested features change often, and the specifications are vague or incomplete
ET enables learning about the system
which can be utilized in other tasks, such as customer support and training

Page 29

Perceived benefits of ET

ET is more versatile and goes deeper into the tested features
Testing things that would not be documented as test cases
Dependencies with existing features, based on experience and knowledge
Test cases are typically based on new requirements
Regression testing

ET is an effective way of finding defects
Requires an experienced tester
Even if test cases are used, the defects are found by exploring
Easier to achieve a destructive attitude

ET gives a good picture of the overall quality of the system in a short time
ET enables testing the intangible properties
E.g., the "look and feel" of the system

Page 30

Perceived shortcomings of ET

Coverage is the biggest shortcoming
Planning and selecting what to test
Tracking the coverage of performed tests
Are all new features covered
Are all the ways the user could use the system covered

Reliance on the expertise and skills of an individual tester
Human errors
Different ways of testing (also a strength)
Difficulty of getting (the time of) good domain experts

Repeatability of found defects
Not a significant issue

Page 31

Results from a controlled student experiment

What is the benefit of designing the test cases beforehand?

Juha Itkonen, Mika V. Mäntylä and Casper Lassenius

"Defect Detection Efficiency: Test Case Based vs. Exploratory Testing, ESEM 2007 ESEM, 2007.

HELSINKI UNIVERSITY OF TECHNOLOGY

Page 32

Research problem

Do testers performing manual functional testing with pre-designed test cases find more or different defects compared to testers working without pre-designed test cases?

Page 33

Research questions

How does using pre-designed test cases affect

1. the number of detected defects?

2. the type of detected defects?

3. the number of false defect reports?

False reports are:
• incomprehensible
• duplicate
• reporting a non-existent defect

Page 34

No difference in the total number of detected defects

Found defects per subject

Testing approach | Feature set | Number of defects | x̄    | σ
ET               | FS A        | 44                | 6.28 | 2.172
ET               | FS B        | 41                | 7.82 | 2.522
ET               | Total       | 85                | 7.04 | 2.462
TCT              | FS A        | 43                | 5.36 | 2.288
TCT              | FS B        | 39                | 7.35 | 2.225
TCT              | Total       | 82                | 6.37 | 2.456

x̄ = mean and σ = standard deviation

The difference shows 0.7 defects more in the ET approach
T-test significance value of 0.088

Not statistically significant
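The slides report an independent-samples t-test for this comparison. As a rough illustration of that kind of analysis, the sketch below runs SciPy's t-test on invented per-subject defect counts; the numbers are placeholders, not the study's raw data.

```python
# Sketch of the two-sample comparison described above, using SciPy's
# independent-samples t-test. The defect counts are invented placeholders;
# the study's raw per-subject data is not included in the slides.
from scipy import stats

et_defects  = [6, 8, 7, 9, 5, 7, 8, 6, 10, 4]   # defects per subject, ET group (illustrative)
tct_defects = [5, 7, 6, 8, 4, 6, 7, 5, 9, 4]    # defects per subject, TCT group (illustrative)

t_stat, p_value = stats.ttest_ind(et_defects, tct_defects)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# A p-value above the chosen significance level (e.g. 0.05) would be read,
# as in the slides, as "no statistically significant difference".
```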


Page 35

Differences in defect types

Compared to TCT, ET detected
More defects that were obvious to detect
More defects that were difficult to detect
More user interface and usability issues
More low-severity defects

These are descriptive rather than conclusive findings
Not statistically significant

Page 36

TCT produces more false defect reports

False defects per subject

Testing approach | Feature set | x̄    | σ
ET               | FS A        | 1.00 | 1.396
ET               | FS B        | 1.05 | 1.191
ET               | Total       | 1.03 | 1.291
TCT              | FS A        | 1.64 | 1.564
TCT              | FS B        | 2.50 | 1.867
TCT              | Total       | 2.08 | 1.767

False reports are:
• incomprehensible
• duplicate
• reporting a non-existent defect

TCT produced on average 1.05 more false reports than ET per subject and testing session
The difference is statistically significant with a significance value of 0.000 (Mann-Whitney U test)
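The slides name the Mann-Whitney U test for this comparison. As a rough illustration only, the sketch below applies SciPy's version of the test to invented per-subject false-report counts; the data are placeholders, not the study's actual measurements.

```python
# Sketch of the nonparametric comparison mentioned above, using SciPy's
# Mann-Whitney U test on invented per-subject false-report counts.
from scipy import stats

et_false  = [0, 1, 2, 1, 0, 1, 3, 0, 1, 2]   # false reports per subject, ET group (illustrative)
tct_false = [1, 2, 3, 2, 4, 1, 2, 3, 2, 1]   # false reports per subject, TCT group (illustrative)

u_stat, p_value = stats.mannwhitneyu(et_false, tct_false, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
# A small p-value would indicate, as in the slides, a statistically significant
# difference in the number of false defect reports between the groups.
```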

Page 37

Conclusions

The data showed no benefits from using pre-designed test cases
in comparison to a freestyle exploratory testing approach

Defect type distributions indicate certain defect types might be better detected by ET
But no significant differences

Test-case based testing produced more false defect reports
Perhaps test cases made testing more straightforward and mechanical, leading, e.g., to a higher number of duplicate reports

Page 38

How Do Testers Do It?

An Exploratory Study on Manual Testing Practices

Itkonen, J., M. V. Mäntylä and C. Lassenius. "How Do Testers Do It? An Exploratory Study on Manual Testing Practices", ESEM, 2009.

Page 39

Research Goal and Methods

How do software engineering professionals perform manual testing activities?
Identifying and classifying the actual testing practices that professionals use in manual testing
Experience-based testing
Functional system-level testing

Qualitative field observations in an industrial environment
Studying real testing work by observing professionals
Four companies, 11 observation sessions
Subjects are software development professionals in varying roles
who do testing as part of their normal duties

Page 40

Results

22 manual testing practices identified
Classification of the practices

Page 41

Practice Classification

Test session strategies
  Exploratory
    Exploring weak areas
    Aspect oriented testing
    User interface exploring
    Top-down functional exploring
    Simulating a real usage scenario
    Smoke testing by intuition and experience
  Documentation based
    Data as test cases
    Exploring high-level test cases
    Checking new and changed features

Test execution techniques
  Exploratory
    Testing alternative ways
    Exploring against old functionality
    Simulating abnormal and extreme situations
    Persistence testing
    Feature interaction testing
    Defect based exploring
  Comparison
    Comparing with another application or version
    Comparing within the software
    Checking all the effects
    End-to-end data check
  Input
    Testing input alternatives
    Testing boundaries and restrictions
    Covering input combinations

Page 42

Practice Classification – annotated examples (the classification table is repeated with callouts for these practices)

User interface exploring
• Structuring exploratory testing based on the UI components and features
• Covering UI features
• Identifying missing, extra, and defective features

Data as test cases
• A pre-defined set of test data is used
• Data describes the important cases
• Cover the defined cases in the test data
• Applied in situations where data was complex, but functions were relatively simple to perform

Comparing within the software
• Comparing similar features in different places of the same system
• Testing for consistency

Testing boundaries and restrictions
• Testing the boundary values and other restrictions of input data
• Covering explicit and implicit restrictions of input data
• Finding out if there are ways to exceed the stated or implicit limits and restrictions

Page 43

Practice Classification – category descriptions

Test session strategies
• Giving an overall structure to test execution work
• General guideline of how to proceed in testing and what aspects to cover
• Used in combination with free exploratory testing or some of the test execution techniques

Test execution techniques
• Testing individual or tightly related features, or such small details as input values

Page 44

Practice Classification – subcategory descriptions

Documentation based (session strategies)
• Guide and give structure to exploratory testing
• Guide experience-based testing using documentation
• Test cases on varying detail level, release notes, or defect reports
• Documentation is used as a checklist or as high-level test cases

Exploratory (test execution techniques)
• Exploring one isolated functionality or a single function
• Based on a hypothesis of a certain type of defects, or a certain situation where defects are often revealed

Comparison
• Evaluating the test outcomes
• Distinguishing between correct, expected behavior and incorrect, erroneous behavior during testing

Input
• Detailed testing of individual input values or sets of related inputs
• Covering the details of a feature and selecting relevant values for testing

Page 45

Conclusions

Exploratory approaches to testing are used in industry
In practice, testers apply numerous execution-time practices

Testers do not mechanically rely on test-case documentation
Testers need and use testing techniques even if applying experience-based and exploratory testing approaches

Execution-time techniques are partly similar to test-case design techniques
but were strongly experience-based and applied in a non-systematic fashion during test execution
The role of evaluation and comparison practices was emphasized

Page 46

Questions and more discussion

Contact information

Juha Itkonen

[email protected]

+358 50 577 1688


http://www.soberit.hut.fi/jitkonen
