© 1997, BAK. The DESMET Methodology
Keele University

Evaluating software methods and tools using the DESMET Methodology

Barbara Kitchenham
Steve Linkman
Susan Linkman





Agenda

Evaluation methods

• Methods

• Selecting an appropriate method


Evaluation methods

Two aspects:

• Nature of evaluation outcome
  – assessment of suitability (qualitative/subjective)
  – measurable benefits (quantitative/objective)

• Organisation of evaluation
  – formal experiment
  – case study
  – survey


Qualitative methods: Feature analysis

User requirements mapped to method/tool features

Subjective assessment

• how well is feature supported?

• how usable is functionality?

Problems:

• Selection of features

• Subjectivity of rating

• Collation of results

• Too many features
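The collation problem above is commonly handled with a weighted score sheet. The sketch below is purely illustrative: the feature names, weights, tool names, and the 0-5 rating scale are invented for the example, not prescribed by DESMET.

```python
# Illustrative feature-analysis score sheet (all names and numbers are
# hypothetical). Each candidate tool gets a subjective 0-5 rating per
# feature; ratings are combined using importance weights.
features = {                      # feature -> importance weight
    "version control integration": 3,
    "ease of use": 2,
    "report generation": 1,
}

ratings = {                       # tool -> {feature -> 0-5 rating}
    "Tool A": {"version control integration": 4, "ease of use": 3, "report generation": 5},
    "Tool B": {"version control integration": 5, "ease of use": 2, "report generation": 2},
}

def weighted_score(tool_ratings):
    """Sum of weight * rating over all features."""
    return sum(w * tool_ratings[f] for f, w in features.items())

# Rank the candidates by total weighted score, highest first.
for tool in sorted(ratings, key=lambda t: weighted_score(ratings[t]), reverse=True):
    print(tool, weighted_score(ratings[tool]))
```

Note that collapsing ratings into one weighted sum does not remove the subjectivity of the individual scores; it only makes the collation systematic.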


Quantitative methods

Measured benefits of method/tool

Objective assessment

• measure quality and/or productivity

• compare results using different method/tool

Problems:

• Not all benefits are quantitative

• Some quantitative benefits are hard to measure
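As a concrete (invented) example of such a measured comparison, the sketch below compares mean defect density for modules built with the current method and with a candidate method; the numbers are made up, and a real quantitative evaluation would also need a defined measurement protocol and a significance test.

```python
# Hypothetical defect densities (defects per KLOC) for modules developed
# with the current method and with the candidate method.
current_method = [4.2, 5.1, 3.8, 6.0, 4.9]
candidate_method = [3.1, 2.8, 4.0, 3.5, 2.9]

def mean(xs):
    """Arithmetic mean of a non-empty list."""
    return sum(xs) / len(xs)

# A positive difference means the candidate method produced fewer defects.
difference = mean(current_method) - mean(candidate_method)
print(f"current: {mean(current_method):.2f}, candidate: {mean(candidate_method):.2f}, "
      f"difference: {difference:.2f} defects/KLOC")
```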


Hybrid methods

Specific techniques

Benchmarking

• objective performance measures

• subjective selection of “tests”

Qualitative effects analysis

• subjective expert opinion

• about quantitative benefits
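A benchmark combines the two ingredients named above: an objective measure (here, wall-clock time) applied to a subjectively chosen test suite. Everything named in this sketch, including the stand-in "tool" and the test inputs, is hypothetical.

```python
import time

def run_benchmark(tool_fn, test_cases):
    """Time tool_fn on each test case; return seconds per case."""
    timings = {}
    for name, data in test_cases.items():
        start = time.perf_counter()
        tool_fn(data)                 # run the tool on this test input
        timings[name] = time.perf_counter() - start
    return timings

# The choice of test inputs is the subjective part of benchmarking.
test_cases = {"small input": list(range(1_000)), "large input": list(range(100_000))}
timings = run_benchmark(sorted, test_cases)   # 'sorted' stands in for the tool
for name, secs in timings.items():
    print(f"{name}: {secs:.6f} s")
```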


Formal experiment

Scientific paradigm

Many subjects (engineers)

Perform specified task(s)

Subjects assigned at random to method

Randomisation and replication

• minimise bias

• ensure results are trustworthy

Best for precise answers to limited questions
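The randomisation step itself is mechanical. A minimal sketch of assigning subjects evenly to two treatments (all subject and method names are placeholders):

```python
import random

subjects = ["eng1", "eng2", "eng3", "eng4", "eng5", "eng6", "eng7", "eng8"]
treatments = ["current method", "new method"]

def assign(subjects, treatments, seed=42):
    """Shuffle the subject pool and split it evenly across treatments."""
    rng = random.Random(seed)      # fixed seed: the assignment is reproducible
    pool = list(subjects)
    rng.shuffle(pool)
    size = len(pool) // len(treatments)
    return {t: pool[i * size:(i + 1) * size] for i, t in enumerate(treatments)}

groups = assign(subjects, treatments)
for treatment, members in groups.items():
    print(treatment, members)
```

Replication then means having enough subjects per group that chance differences between individual engineers do not swamp the effect of the method being studied.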


Case studies

Method/tool tried out on “real” project

• Results scale to real world

Limited replication, so comparisons are difficult


Surveys

For “mature” methods/tools

People/groups that use the method or tool are polled

Database of results analysed


Nine evaluation methods

Feature analysis

– Formal Experiment

– Case Study

– Survey

– Screening-mode

Quantitative evaluation

– Formal Experiment

– Case Study

– Survey

Qualitative effects analysis

Benchmarking


Problem

9 Evaluation methods

• Embarrassment of riches

Which method should you use?

It depends on what you want to do


7 Selection Criteria

Evaluation project goals

Evaluation capability of organisation

Nature of evaluation object

Nature of impact

Scope of impact

Maturity of evaluation object

Learning curve


Evaluation goals

Choice of methods for individual project

Selection of methods & tools for an organisation

Monitoring changes as part of process improvement program

• evaluation of proposed change

• effect of adoption of change

Selection of method/tool for resale


Evaluation capability

Characteristics of an organisation affect its ability to perform evaluations

Four types of organisation capability:

1. Severely limited – each project is different

2. Qualitative evaluation capability – projects follow the same standards

3. Quantitative & qualitative – all projects keep project metrics

4. Full evaluation capability – the organisation maintains a store of project data


Nature of evaluation object

Method (or method/tool combination)

– likely to have major impact

– quantitative assessment advisable

Tool

– comparing alternatives suggests feature analysis

– tool v. no tool suggests quantitative assessment

Generic method

– e.g. object-oriented v. structured methods

– can only try out specific methods/tools

– generic assessment needs expert opinion


Scope of impact

Product granularity:

• whole product

• modules

Extent of impact

• seen immediately

• seen over several phases or whole lifecycle

• seen on subsequent projects


Impact on selection of method

Formal experiments more viable for impacts with small scope

• easier to impose necessary control

• easier to provide replication

Case studies appropriate for larger scope

For impacts affecting later projects

• e.g. effect of re-usability

• need to consider surveys


Maturity of item

If currently in widespread use:

• surveys are possible

If new method/tool:

• case study or formal experiment


Learning time


• time to understand principles

• time to become proficient

Long learning reduces feasibility of formal experiment


Feasibility of selection

Other non-technical factors affect method selection:

• Timescales for evaluation

• Level of confidence required in result

• Cost of evaluation


Timescales for evaluation

Long (3 months plus):
– Case study (quantitative or qualitative)

Medium (several months):
– Feature analysis - survey

Short (several weeks):
– Experiments (quantitative or qualitative)
– Benchmarking
– Feature analysis - screening mode

Very short (a few days):
– Quantitative survey
– Qualitative Effects Analysis


Risk of “wrong” result

Very High:
– Qualitative Effects Analysis
– Feature analysis - screening mode

High:
– Quantitative case study (“sister project”)
– Feature analysis case study

Medium:
– Quantitative case study (“organisation baseline”)
– Feature analysis survey


Risk of “wrong” result (continued)

Low:
– Quantitative case study (“within project baseline”)
– Formal feature analysis experiment
– Quantitative survey

Very Low:
– Formal quantitative experiment


Cost of an evaluation

High:
– Formal experiment

Medium:
– Case study
– Feature Analysis Survey or Screening-mode
– Benchmarking

Low (assuming infrastructure exists):
– Quantitative Survey
– Qualitative Effects Analysis


Summary

There is no best evaluation method

An appropriate evaluation method is context dependent

“Appropriate” technical choice can be infeasible if it:

• takes too long

• costs too much

• doesn’t provide sufficiently trustworthy results