
University of Toronto Department of Computer Science

© 2001, Steve Easterbrook

Lecture 4: Software Lifecycles

➜ The Software Process
  • Waterfall model
  • Rapid Prototyping Cycle
  • Phased Models:
    – Incremental Development
    – Evolutionary Development
    – Spiral Model
  • V-model and Systems Engineering
  • The ‘essential’ software process
➜ Verification and Validation


Waterfall Model

[Diagram: the waterfall model: requirements → design → code → integrate → test → maintain, each phase flowing into the next]

Source: Adapted from Dorfman, 1997, p7; see also: van Vliet 1999, p50


Why not a waterfall?

➜ Waterfall model describes a process of stepwise refinement
  • Based on hardware engineering models
  • Widely used in the defense and aerospace industries
➜ But software is different:
  • No fabrication step
    – Program code is just another design level
    – Hence, no ‘commit’ step: software can always be changed…!
  • No body of experience for design analysis (yet)
    – Most analysis (testing) is done on program code
    – Hence, problems not detected until late in the process
  • Waterfall model takes a static view of requirements
    – ignores changing needs
    – lack of user involvement once the specification is written
  • Unrealistic separation of specification from design
  • Doesn’t accommodate prototyping, reuse, etc.

Source: Adapted from Blum 1992, pp28-31; see also: van Vliet 1999, pp50-1


Prototyping lifecycle

➜ Prototyping is used for:
  • understanding the requirements for the user interface
  • examining the feasibility of a proposed design approach
  • exploring system performance issues
➜ Problems:
  • users treat the prototype as the solution
  • a prototype is only a partial specification

[Diagram: the prototyping lifecycle: a loop of requirements → design prototype → build prototype → test prototype, repeated until the requirements are understood, followed by document requirements → design → code → test → integrate]

Source: Adapted from van Vliet 1999, p53


Phased Lifecycle Models

[Diagram: two phased lifecycles, each strand a sequence of reqts → design → code → test → integrate → O&M.
  • Incremental development (each release adds more functionality): a single requirements phase feeds a series of builds, delivered as release 1, release 2, release 3, release 4, …
  • Evolutionary development (each version incorporates new requirements): version 1, version 2, and version 3 each have their own requirements phase, with lessons learnt from one version feeding the next.]

Source: Adapted from Dorfman, 1997, p10; see also: van Vliet 1999, p56


The Spiral Model

[Diagram: Boehm's spiral, divided into four quadrants: "Determine goals, alternatives, constraints", "Evaluate alternatives and risks", "Develop and test", and "Plan". Each cycle of the spiral passes through all four quadrants with a growing budget (budget 1-4), a fresh set of alternatives and constraints, a risk analysis (risk analysis 1-4), and a prototype (prototype 1-4). Successive cycles produce the concept of operation, the software requirements and validated requirements, the software design and validated, verified design, the detailed design, code, unit test, system test, and acceptance test, along with the requirements/lifecycle plan, development plan, integration and test plan, and implementation plan.]

Source: Adapted from Pfleeger, 1998, p57; see also: van Vliet 1999, p63


Comments on phased models

➜ Incremental development
  • avoids ‘big bang’ implementation
  • but…
    – assumes all requirements are known up-front
➜ Evolutionary development
  • allows lessons from each version to be incorporated into the next
  • but…
    – hard to plan for versions beyond the first
    – lessons may be learnt too late
➜ Spiral model
  • incorporates prototyping and risk analysis
  • but…
    – cannot cope with unforeseen changes (e.g. new business objectives)
    – not clear how to analyze risk


V-Model

[Diagram: the V-model. The descending arm ("analyse and design") runs from system requirements through software requirements, preliminary design, and detailed design down to code and debug; the ascending arm ("test and integrate") runs back up through unit test, component test, software integration, system integration, and acceptance test. The horizontal axis is time; the vertical axis is level of abstraction, so each test activity sits opposite the design level it checks.]

Source: Adapted from Forsberg & Mooz 1997


The “essential” software process

[Diagram: the Real World gives rise to a Problem Statement, which is refined into an Implementation Statement, which is realized as the System. Correspondence and Validation relate these artifacts back to the real world; Correctness and Verification relate each artifact to the statement it was derived from.]

Source: Adapted from Blum, 1992, p32; see also: van Vliet p11


Verification and Validation

➜ For V&V, we need to worry about:
  • The properties of the computer hardware (C)
  • The properties of the program (P)
  • The properties of the machine in the application domain (the specification, S)
  • The properties of the domain, independent of the machine (D)
  • The requirements for the machine (R)

➜ Demonstrating that P satisfies R is then a two-step process:
  • Do C and P imply S? (Verification)
  • Do S and D imply R? (Validation)

[Diagram: D and R belong to the application domain, C and P to the machine domain, with the specification S at the interface between them.]

Source: Adapted from Jackson, 1995, pp170-171
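Stated as proof obligations (a restatement of the two questions above; the entailment notation is mine, not on the slide):

\[ C, P \vdash S \quad \text{(verification)} \qquad\qquad S, D \vdash R \quad \text{(validation)} \]

That is, the program running on the given hardware must satisfy the specification, and a machine that meets the specification, placed in a domain with properties D, must achieve the requirements R.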


Validation Example

➜ Requirement R:
  • “Reverse thrust shall only be enabled when the aircraft is moving on the runway”
➜ Domain Properties D:
  • Wheel pulses on if and only if wheels turning
  • Wheels turning if and only if moving on runway
➜ Specification S:
  • Reverse thrust enabled if and only if wheel pulses on
➜ S + D imply R
  • But what if the domain model is wrong? (See the sketch below.)

Source: Adapted from Jackson, 1995, p172
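To make the adequacy argument concrete, here is a minimal Python sketch (mine, not from the lecture) that enumerates every combination of the basic facts and checks that whenever D and S both hold, R holds too:

from itertools import product

def domain(pulses, turning, runway):
    # D: wheel pulses on iff wheels turning; wheels turning iff moving on runway
    return pulses == turning and turning == runway

def spec(thrust, pulses):
    # S: reverse thrust enabled iff wheel pulses on
    return thrust == pulses

def requirement(thrust, runway):
    # R: reverse thrust enabled only when the aircraft is moving on the runway
    return (not thrust) or runway

# Validation: in every world where both S and D hold, R must hold as well.
valid = all(
    requirement(thrust, runway)
    for thrust, pulses, turning, runway in product([False, True], repeat=4)
    if domain(pulses, turning, runway) and spec(thrust, pulses)
)
print("S and D imply R:", valid)  # prints True for the domain model as stated

Weaken either domain property (for example, wheels that aquaplane on a wet runway, so "wheels turning iff moving on runway" no longer holds) and the check fails, which is exactly the "what if the domain model is wrong?" worry above.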


Summary

➜ Software is different
  • many assumptions from other engineering models don’t apply
  • there is no fabrication step
  • the underlying science of software behaviour is not well developed
  • (software engineering is still an immature discipline)
➜ Many different views of the software process
  • waterfall model is too rigid (doesn’t allow for change)
  • other models incorporate prototyping, evolution, risk, etc.
  • no lifecycle model is perfect
➜ Essential process:
  • describe the problem
  • describe the solution
  • verify (does the solution solve the stated problem?)
  • validate (did we solve the right problem?)


References

van Vliet, H. “Software Engineering: Principles and Practice” (2nd edition). Wiley, 1999. Chapter 3 provides a very good overview of lifecycle models. (Recommended)

Blum, B. “Software Engineering: A Holistic View”. Oxford University Press, 1992.

Dorfman, M. “Requirements Engineering”. In Thayer, R. H. and Dorfman, M. (eds.), “Software Requirements Engineering” (2nd edition). IEEE Computer Society Press, 1997, pp7-22.

Forsberg, K. and Mooz, H. “System Engineering Overview”. In Thayer, R. H. and Dorfman, M. (eds.), “Software Requirements Engineering” (2nd edition). IEEE Computer Society Press, 1997, pp44-72.

Jackson, M. “Software Requirements & Specifications: A Lexicon of Practice, Principles and Prejudices”. Addison-Wesley, 1995.

Pfleeger, S. “Software Engineering: Theory and Practice”. Prentice Hall, 1998.