SOFTWARE PROCESS—Improvement and Practice, Vol. 3, 201–212 (1997)

Managing Software Quality by Standardization

Peter Middleton*
The Queen's University of Belfast, 103 Botanic Avenue, Belfast BT7 1JP, U.K.

Research Section

This paper examines the efficiency and effectiveness of a prescriptive systems development methodology in practice. The UK government's mandatory Structured Systems Analysis and Design Method (SSADM) was examined to determine its value to software projects. The evidence was collected from interviews with 17 project managers, discussions with participants on three large software projects and from observing 90 end users in training. The conclusions are that prescriptive information systems methodologies are unlikely to cope well with strategic uncertainty, user communication or staff development. The recommendations are to focus more on soft organizational issues and to use approaches tailored to each project. © 1997 John Wiley & Sons, Ltd.

Softw. Process Improve. Pract., 3, 201–212 (1997)

KEY WORDS: information systems management; methodology; public sector; SSADM; project management

1. INTRODUCTION

This paper examines the use of prescriptive methodology as a way of improving software quality. It does this by evaluating how effective the UK government's Structured Systems Analysis and Design Method (SSADM) is in raising the performance of software developers.

The Structured Systems Analysis and Design Method (SSADM) is important because it is mandatory for UK central government software development projects. Its sponsorship by the Central Computer and Telecommunications Agency (CCTA) and the National Computer Centre (NCC) means that it strongly influences the UK IT industry.

SSADM uses a waterfall life cycle. It prescribes in detail how analysis and design are to be carried out. It takes a rational, engineering view of the world. It is driven by the analysis of the data within a proposed information system.

* Correspondence to: Peter Middleton, School of Management, Queen's University of Belfast, 103 Botanic Avenue, Belfast BT7 1JP, U.K.

The idea of defining a way to develop software successfully and then training people to follow it is an attractive one. The arguments for a mandatory, prescriptive methodology are at first sight persuasive and include:

1. Financial and technical resources could be harnessed to create the 'best' method possible.
2. It would facilitate the mobility of labour and the formation of project teams.
3. It would aid industrial collaboration by providing a common framework for complex software projects.
4. Systems written using a standard methodology could be maintained by a wider range of suppliers.
5. A commonly used standard would increase the speed of software development because support tools would become available.


To examine the validity of these arguments, the experience of using SSADM within the UK will be analysed.

Since 1981 the UK software industry has gone into rapid decline and it now has virtually no international standing (Holway 1992). It was during this period that SSADM was introduced. SSADM is claimed to be the most popular third party development methodology within the UK, holding 25% (Ashworth 1992) or 41% (Springett 1993) of the market.

The government is the largest purchaser of software in the country, spending about £1 billion a year (Ashworth 1992). However, the House of Commons Public Accounts Committee highlighted software as the major culprit in an examination of project cost over-runs. The National Audit Office has also produced reports detailing the serious waste caused by the poor management of government software projects (National Audit Office 1987, 1989, 1990).

This research was undertaken to determine how SSADM has performed in practice by interviewing users of SSADM.

1.1. SSADM in the context of other methodologies

According to Jones (1990) a methodology is 'a body of knowledge and techniques… methodologies should be algorithmic, in the sense that they furnish rules for achieving their desired result'. DeMarco and Lister (1987) would agree, defining a methodology as 'a proven method for undertaking a repeated task'.

Song and Osterweil (1994) carried out research '…aimed at identifying the differences between Software Design Methodologies (SDM) rather than evaluating SDMs' (p. 383). They wished to make objective comparative assessments of SDMs using a clearly defined comparison process. Their work '…does not address how easily and effectively SDMs can be applied in practice' (p. 383).

Song and Osterweil (1994) concluded that 'a well defined SDM can still be muddled in various ways… it is unrealistic to suppose that all rules of a method can all be perfectly well defined' and '…comparisons are affected largely by differences in application domains and project personnel'.

The above problems with the detailed analysis of methodologies are confirmed by Avgerou and Cornford (1993). For example, methodologies can have varying degrees of prescription: they can be more or less contingent on the context in which they are to be used; they can be complete or only provide rules for parts of the task, and they can have an engineering or a sociological bias.

There are many views on how to develop information systems and these different perspectives have been captured in a variety of ways. The methodologies, life cycles and process improvement models include, for example: ad hoc (Jones 1990); waterfall (Royce 1970); participative (Mumford and Weir 1979); soft systems (Checkland 1981); prototyping (Naumann and Jenkins 1982); incremental (Gilb 1988); spiral (Boehm et al. 1984); reuse (Matsumoto and Ohno 1989); formal (Andrews and Ince 1991); rapid application development (Martin 1991); object-oriented (Coad and Yourdon 1991) and software capability (Humphrey 1990).

Each of these different approaches is available in many different variants. It is also common for these various perspectives to be 'mixed and matched' to produce new hybrid models.

SSADM would be classified as a heavily prescriptive methodology, although some streamlining is allowed. It takes a rational, engineering view of the world. It is driven by analysis of the data within a proposed information system and it requires a 'waterfall' life cycle to be followed.

2. BACKGROUND

The UK government is not alone in trying to use methodology to raise the effectiveness and efficiency of software developers. Capers Jones (1986) examined the impact of standards and formal development methods in over 100 large enterprises in the United States and Europe. He concluded that people felt a certain comfort from their existence but the evidence on their benefits was ambiguous. Boehm (1981) provides comprehensive data which indicates that methodology is far less important than the ability of the developers and the complexity of the project.

DeMarco (1982) summarizes his experience of methodology in the following terms: 'The idea that a single methodology should govern even two different projects is highly suspect: The differences between projects are much more important than the similarities.' (p. 131).


DeMarco goes on to point out that the need to tailor a methodology is always recognized, but that the senior and lower levels of the hierarchy interpret this differently. Data reported by DeMarco and Lister (1987) indicates that working conditions are critical for raising productivity. They also observed that detailed prescriptive methodologies reduce rather than increase productivity. The reasons they identified for this were: a morass of paperwork; a paucity of methods; an absence of responsibility, and a general loss of motivation (DeMarco and Lister 1987, p. 116).

SSADM is commonly perceived to be 'prescriptive, burdensome and difficult to apply' (Thomson 1990). Other criticisms are: that staff do not really understand SSADM and are just 'going through the motions' (Crinnion 1991) or 'learn it by rote, then use it as an excuse not to think' (Holloway 1993); that the top down structured approach of SSADM is too rigid and does not reflect the way people work in practice (Whitefield and Sutcliffe 1992, Trull 1990); that it attempts to substitute methodology for management (Simpson 1990), and that it puts too much emphasis on functionality, analysis and design at the expense of people and organizational issues (Cockcroft 1990).

On the positive side, Hares (1990, p. 39) asserts: 'Failure to produce high quality deliverables is due to poor application of the method, not the method itself'. Young (1993) comments: 'I strongly believe that any method is better than none and SSADM is certainly worth using if the alternative is ad hoc software development'.

The SSADM Version 3 manuals (Longworth and Nicholls 1986a, b), the handbook (Longworth et al. 1988) and the SSADM Version 4 manuals (CCTA 1990) offer no empirical data to underpin the methodology. The official training materials (AIMS Systems 1990) from the creators of SSADM v.4 also offer no supporting data or references to explain the construction of SSADM. The large number of books which have emerged for the SSADM training market (for example, Cutts 1991, Downs et al. 1992, Eva 1990, Skidmore et al. 1992) tend to focus on presenting the methodology rather than evaluating or criticizing it.

3. RESEARCH METHODOLOGY

The initial idea for collecting data on how methodologies functioned in practice was to approach the Civil Service at a senior level and carry out a project with their formal co-operation. This method did not provide the levels of insight required and therefore this approach to collecting raw data was abandoned. A more informal method was then adopted for this research study, which collected information from three different types of source.

First, to establish how SSADM performs in practice, three multi-million pound SSADM projects were followed over a 3 year period. They were all public sector although outside central government – housing, education and local government. The progress of the projects was followed by meetings with users, developers and sponsors. These were carried out at 3 monthly intervals using semi-structured interviews. The people were approached informally and assured of complete confidentiality.

Secondly, the author was awarded a contract to provide SSADM training to 90 end users, to enable them to understand their SSADM documents. These training courses were in seminar form in groups of 10 people, each lasting for 2 days. The course training materials were live SSADM documents produced by the end users' project teams. The comments of the participants were written down, as were any observations about how they coped with the SSADM material.

Thirdly, to provide a broader sample, 17 semi-structured, 2 hour interviews with other project managers were carried out. These were from 12 public sector and 5 private sector organizations. The public sector organizations were the information systems units of a range of government departments. They all used SSADM except for their small projects. Of the private sector organizations only one used SSADM, and then only if requested by a public sector client.

The reason for including non-SSADM projects in this sample was to try to establish what difference SSADM was making to the complete software development process. The private sector representatives were the software development organization of a major multinational company, two systems software companies that export their products world wide and two national bespoke software developers. These last two companies were accredited with the ISO9000 quality standard.

The questions asked in the 17 interviews included:

What was the methodology used for analysis and design?
How much was the methodology tailored?
Which life cycle was used?
What was the background of the project leader?
What was the role of the user?
Which methodology was used for project management?
How much was this methodology tailored?
How does this project compare to others?
How is service delivery measured?

Table 1 shows a summary of the 17 interviews. All respondents were project managers and came from an IT background. In all cases SSADM was reported to be modified. Respondents were asked to give a response on a 5 point scale if appropriate and then encouraged to talk as they wished about the topics raised.

This approach to collecting empirical data is not perfect. It is open to interviewer bias distorting the answers given by respondents. The people interviewed were not chosen at random; they will tend to self-select. There is a limit to how much hard data can be obtained; for example, respondents sometimes did not know the budget for the system or the timetable. The main justification for this more informal approach is that it was seen as the only way to produce reasonably comprehensive data.

Table 1. Summary of the 17 interviews

Reference  Public/private  Duration (months)  Computer language  Project IT staff  Method   Project management
1          Private         30                 COBOL              12                MAP      none
2          Private         25                 COBOL              5                 Yourdon  Prince
3          Private         22                 'Client'           18                SSADM4   Prince
4          Private         10                 C                  9                 Own      Task
5          Private         Variety of small projects
6          Public          30                 Oracle             10                SSADM3   Prince
7          Public          7                  Oracle             7                 SSADM3   Prince
8          Public          12                 Oracle             5                 SSADM4   Prince
9          Public          10                 Oracle             6                 SSADM3   Prompt
10         Public          24                 Quickbuild         5                 SSADM3   Prompt
11         Public          4                  Network            5                 ad hoc   Prince
12         Public          24                 Oracle             3                 SSADM3   Prince
13         Public          28                 Proprietary        10                Own      Prince
14         Public          24                 Dataflex           10                SSADM4   Prince
15         Public          27                 Dbase              2                 ad hoc   none
16         Public          Variety of small projects
17         Public          12                 Package            3                 SSADM4   Prince


4. KEY FINDINGS

4.1. Modifying SSADM

SSADM sees the software development process as a cascade flowing from a clear strategic direction and firm requirements (CCTA 1990, F-OVE-6):

Strategic planning
↓
Feasibility study
↓
Requirements analysis
↓
Requirements specification
↓
Logical system specification
↓
Physical design
↓
Construct and test
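
The linearity of this cascade is not just diagrammatic: the manual requires that the input to each module be complete before the next begins (see the quotations in Section 4.2 below). A minimal Python sketch, purely illustrative and not part of SSADM itself, shows what that stage-gated rule amounts to:

```python
# Illustrative sketch only: SSADM defines no code. This models the
# stage-gated, linear flow the manual prescribes, in which work may not
# advance past a module whose deliverables are incomplete.

STAGES = [
    "Strategic planning",
    "Feasibility study",
    "Requirements analysis",
    "Requirements specification",
    "Logical system specification",
    "Physical design",
    "Construct and test",
]

def run_waterfall(signed_off: dict) -> None:
    """Walk the stages in order, stopping at the first module whose
    deliverables have not been signed off as complete."""
    for stage in STAGES:
        if not signed_off.get(stage, False):
            raise RuntimeError(
                f"cannot proceed: '{stage}' incomplete, so later modules "
                "have no firm input"
            )
        print(f"{stage}: complete, passing deliverables downstream")

# The interviews found strategy was rarely settled, so in practice the
# very first gate fails:
try:
    run_waterfall({"Strategic planning": False})
except RuntimeError as err:
    print(err)
```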

This investigation found that in the central and local government sites visited there were roughly 100 small PC-based projects for one large mainframe-based project. For these small projects SSADM was either disregarded or tailored beyond recognition. The approaches used were either ad hoc, prototyping, or incremental development. This goes far beyond the 'streamlining' recommended for using SSADM.

Even when the core 'waterfall' life cycle was retained, considerable changes to SSADM were being made. There were no reports of 'pure' SSADM being used and all respondents stated that significant modifications to the methodology were made. Samples of quotes were:

'Entity Life History diagrams not done – no time – wouldn't add that much value.'
'Used prototyping with no particular process. Couldn't fit SSADM to Oracle easily.'
'Because a modified package solution – no Entity Life History diagrams, no Logical Design, no analysis done, no technical options needed.'

The need for all respondents to modify SSADM, in most cases significantly, would indicate that its prescriptive approach is hard to apply.

4.2. Iteration

SSADM recommends that the work for each of SSADM's five modules is completed in sequence:

'It should be noted that the applications projects are essentially linear, albeit with some opportunities for integrated tasks.' (CCTA 1990, F-OVE-6).
'The resulting input to the next module must contain all the required information.' (CCTA 1990, F-OVE-11).

In practice this linear approach was not happening, for the following reasons. First, in every case there was a preferred computer language – either because of their existing skills or due to an organizational standard. Secondly, the needs of the lengthy budgeting and procurement procedures required decisions to be made on hardware early in the process. Often, as will be discussed in the next section, the 'upstream' work could not be completed.

4.3. Strategy

SSADM states that:

'It is assumed that business planning, IS strategy and tactical planning, will have been carried out before an SSADM project is initiated. Whether formally or informally, the types of analysis implied by these tasks must be undertaken before an SSADM project can be initiated.' (CCTA 1990, F-OVE-6).

The interviews showed that this was rarely the case.

'When a change in strategy – users don't know how to interpret – can't see what to do.'
'Often strategy not defined; yet have to act.'

Developers mentioned that strategies were often contradictory and vague at key points which were vital for implementation. The strategies were also liable to confuse the line managers, who were not really sure what they meant. They were also non-existent on occasions when action was unavoidable. There were several mentions of 'planners' blight', when action was halted while the strategy was completed, which was disruptive. Finally, in four cases, there were fundamental changes of strategy in the middle of projects due to changes in the senior management.

4.4. Requirements analysis

SSADM advocates the use of a prioritized list of requirements. Requirements can be added to and deleted from this list as the project progresses. Requirements are to be quantifiable and measurable; for example, Step 370 (CCTA 1990, F-RD-7) requires:

'…ensure that all requirements, particularly non-functional requirements, have been identified, are described correctly, and are fully detailed.'
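
Read concretely, the catalogue SSADM describes behaves like the following minimal sketch; the class and method names are hypothetical, invented here for illustration, and do not come from the SSADM manual:

```python
# Hypothetical illustration of the prioritized requirements catalogue
# described above: a mutable list of requirements, each carrying a
# priority, open to addition and deletion as the project progresses.

class RequirementsCatalogue:
    def __init__(self):
        self._priorities = {}  # requirement text -> numeric priority

    def add(self, text, priority):
        """Record (or re-prioritize) a requirement."""
        self._priorities[text] = priority

    def delete(self, text):
        """Drop a requirement that is no longer wanted."""
        self._priorities.pop(text, None)

    def listing(self):
        """Requirements in descending priority order."""
        return sorted(self._priorities, key=self._priorities.get, reverse=True)

catalogue = RequirementsCatalogue()
catalogue.add("Record housing applications", priority=3)
catalogue.add("Produce monthly statistics", priority=1)
catalogue.add("Produce monthly statistics", priority=2)  # perceptions change
print(catalogue.listing())
```

The difficulty the interviews expose is not with such a structure but with the assumption that its contents can ever be 'fully detailed' up front.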

This need to start with firm requirements was in every case a stumbling block. The interviews found:

'Users sign off documents – this doesn't mean they understand or agree.'
'Users often don't understand their own systems.'
'Internal politics override administrative and technical sense.'

In summary, the reasons this part of the methodology caused so many problems are the following:


1. The users did not know what they wanted.
2. The users did not know the possibilities of the technology.
3. Users' perceptions changed as the system was being developed.
4. The developers did not understand the intricacies of the users' work.
5. There were constant changes in the external environment that were not anticipated.

This need in SSADM for developers to establish firm requirements was for all practical purposes impossible to achieve. As a consequence, one of two things inevitably happened on SSADM projects: either the users were obliged to take the system they had asked for even though it did not meet their requirements, or the projects degenerated into a 'code and test' cycle to try to create a useful product. This caused particular problems when there was a legal contract with an outside developer based on the firm set of requirements. The requirements, although firm, were of poor quality and frequent alterations were needed.

4.5. Approach to users

In SSADM users are asked their requirements, which are then catalogued and prioritized. This process is described in CCTA (1990, F-CON-4):

'Users' needs are given high priority in SSADM and user involvement is defined and highly visible. They have a major involvement in the expression of their business needs, in the decision-making process at all levels, and throughout all phases of the method.'

This approach tended to lead to communication being ritualized, with long formal meetings and documents signed off unread. The lack of partial deliveries of software, the long time scales and the large quantities of documentation sapped morale and were the reasons given for the low commitment to the projects.

The better projects, defined as delivering working systems within reasonable cost and time parameters, did much more than ask or involve the users. To establish the depth of communication necessary, the successful projects ensured that developers and users shared the same office space, went on joint site visits and that the developers spent time doing the users' job. They also went to talk to the users' customers to learn their perspectives. Joint informal meetings with the project sponsors helped the team to gain a depth of insight and consensus into what their objectives were.

It would appear that, in practice, the need was to develop trust and a shared vision of what the project was trying to achieve; users are part of the system and therefore it is necessary that their capabilities are explicitly grown with the system. The SSADM recommendation of asking and involving users was observed to be too superficial an approach. The perceived need was to foster commitment, develop the users' skills and to consciously handle the politics of the project. The evidence from these projects is that SSADM tends to limit the contribution of the user to 'involvement' and 'expression' rather than the needed participation and commitment.

4.6. The use of diagrams

SSADM states that:

‘SSADM’s graphical notations can be readilyunderstood by users, and greatly contributeto effective communication between them andthe analysis team.’ (CCTA 1990, F-CON-4).

Diagrams and conventions for their use are central to SSADM, but practical evidence shows that they may not contribute as expected, for the following reasons that emerged from the observations and the interviews:

1. End users easily confuse the different diagramming techniques and are rarely clear if they are describing an existing system or designing a new one. Users who are asked to communicate through an unfamiliar method with apparently arcane conventions often just retreat from the project and virtually all communication is lost.

2. The lack of effective communication is confirmed by routinely finding several errors per page on diagrams which have been quality assured. This high level of errors for these diagramming techniques has also been reported by Gilb (1988) and Parkinson (1991).

The most successful projects, in terms of practical results delivered, focused on creating a supportive atmosphere in which communication could flourish. There seems to be no reason why users cannot scribble and express themselves as they wish. These ideas can then be refined as necessary. The important point is to ensure that the users enjoy the process and may become curious about how to develop their system skills. Boddie (1987) observes that formal communication is often ineffective, compared to using 'social' mechanisms of leadership and peer group example.

SSADM focuses the developer on the technical drawing techniques and not on the soft skills of facilitating the human process of communication. By encouraging this orientation, real communication is lost, which is partly why SSADM diagrams tend to be inaccurate in practice.

4.7. SSADM with large systems

These are systems with over three years' development time and budgets over £1 million. Following three of these projects for three years showed that they all failed to deliver the system to users on time and to meet users' needs. The large systems adhered to SSADM very closely. The reasons for this were:

1. because of the amount of money involved, staff at all levels become very cautious about deviating from the recommended way;
2. using SSADM would provide political protection should the project run into trouble;
3. large projects are of necessity more formalized and it was easier to agree to use the complete method;
4. in all three cases this was the biggest project the staff had ever attempted and they were willing to trust SSADM even if its recommendations did not seem sensible.

In all three projects technical problems emerged very late in the development process. For example, on one project the database response times were hopelessly slow. This was in spite of the extensive use of experienced SSADM consultants. The problem was only solved by an expensive upgrade of the hardware. In another, a serious design flaw was not picked up until just before implementation. Because the problems were discovered late in the project they were costly and disruptive.

All three projects experienced severe problems in their relationships with users. The users for two of the large systems who were 'involved' with their SSADM project were also trying to undermine the project, by actively lobbying to obtain their own separate system. After two years the SSADM projects had produced nothing tangible and the users had lost confidence in the process. In the third, the users had to wait four years for any tangible results that they could work with. This had damaged their morale and that of the developers.

4.8. Staff turnover and capability

SSADM is intended to compensate for staff turnover and make inexperienced staff more productive (Longworth 1989, 1992).

In this sample of projects, staff turnover among the developers was not observed to be the reason that the projects were running into problems. A main reason was that the developers generally had only two or three years of project experience, and junior staff often less than that. The development of their skills did not seem to have been very methodical. Handling IT projects, with or without SSADM, takes a lot of skill which the inexperienced people did not have. The two most successful projects both had project managers with over 10 years' good experience.

4.9. Learning SSADM

The introduction to the four volumes of the SSADM manual, which weigh 10 pounds, states:

'This document has been produced as a Reference manual for analysts trained in SSADM. It does not, by itself, constitute an adequate training guide and is not therefore suitable for use by trainees.' (CCTA 1990, 1-INT-vii).

To understand SSADM v.4 it is necessary to attend an authorized three week training course and work through three files (AIMS Systems 1990). The training, although expertly carried out, lacked conviction because there was no empirical validation of the methodology. There were no references showing SSADM's development or comparisons with other approaches. There was no data on developer productivity, product quality, timeliness of delivery or user satisfaction. SSADM is such a large and sophisticated methodology that students tended to be overwhelmed with the detail and complexity. As Skidmore and Wroe (1990) observe, SSADM models often become more elaborate than the system they are attempting to explain.

5. DISCUSSION

SSADM is a data driven, 'waterfall' methodology which takes a rational and technical view of the world. It is prescriptive, although streamlining is allowable, but the results of this present study raise questions about key parts of the methodology.

5.1. The ‘waterfall’ approach?

The drawbacks of this 'waterfall' approach have been well documented by Jones (1990), Parnas and Clements (1986), Spence and Carey (1991) and others. The difficulties confirmed by this research were those of managing ever shifting requirements, poor relationships with users and the emergence of serious problems late in a project. This indicates that the waterfall method may not be the best way to develop the majority of public sector IS projects.

5.2. A recommended and efficient way?

There are many different models of the software process, for example: ad hoc; waterfall; participative; soft systems; prototyping; incremental; spiral; reuse; formal; rapid application development; object-oriented, and software capability. All of these approaches have different cost profiles, strengths and weaknesses. Each model has many variants and to some extent they can be combined to produce hybrid approaches.

Given the wide range in the size, complexity, risk, context and urgency of the IS projects in the public sector, it would seem sensible for the developers and users to agree the optimum approach for their particular needs. It is suggested that, rather than streamlining a prescriptive approach, a specific development model for each project should be created.

5.3. Strategic stability?

The assumption of a stable and coherent strategic context within which an SSADM project would take place was shown to be invalid in practice. While it may be possible to improve the strategies within organizations, it seems unrealistic to assume a clear strategy before starting an IS project. Therefore a software development model which is more comfortable with strategic ambiguity and uncertainty is required. The models most at home in this environment would be rapid application development (Martin 1991) and incremental development (Gilb 1988, Humphrey 1990).

5.4. Firm requirements?

This research shows that good quality, firm, detailed requirements are very hard to obtain. This has been observed by others; for example, Humphrey (1990) states:

'For large-scale programs, the task of stating a complete requirement is not just difficult; it is practically impossible. Unless the job has previously been done and only modest changes are needed, it is best to assume the requirements are wrong' (p. 25).

As this seems to be the general case, the best software development approach would seem to be the incremental one. This entails defining a small core of major requirements which are to be delivered within a few months, rather than years. The system is then grown rather than built.
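
A minimal sketch, again with hypothetical names, of what 'growing' a system means in contrast to the stage-gated waterfall sketch earlier: a small core is delivered first, and the remaining backlog is re-examined after every delivery rather than frozen at the start.

```python
# Illustrative only: incremental delivery as recommended above -- a
# small core delivered within months, then the system grown increment
# by increment, with the backlog open to revision between deliveries.

def grow_system(core, backlog):
    delivered = []
    for increment in [core] + [[item] for item in backlog]:
        delivered.extend(increment)
        print(f"delivered: {increment}; system so far: {delivered}")
        # In practice, user feedback at this point would reorder or
        # rewrite what remains of the backlog before the next increment.

grow_system(
    core=["record applications", "basic search"],
    backlog=["monthly statistics", "audit trail"],
)
```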

5.5. Project size?

Examining the range of projects being worked on showed that there were roughly 100 small PC projects to one large mainframe project. SSADM, which has its roots in MODUS from the mainframes of the early 1970s (Hares 1990), did not adapt easily to these microcomputer-based projects. It would therefore seem sensible to revise SSADM to focus on the majority of projects, which are small. This would help the junior staff learn good practice early on. Tailoring SSADM takes considerable skill (Ince 1991) which these staff are unlikely to possess.

There seemed no reason why the larger projects could not each be broken down into several smaller ones and handled with an incremental development process. The data from Putnam and Meyers (1992) show that larger projects require exponentially more effort than smaller ones. Many of the objectives of SSADM could be achieved by restricting developers and users to small projects until their skills develop. This indicates that SSADM should have small projects as its primary rather than its secondary target – an indication that can be utilized at a later stage of this investigation.
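
The size–effort relationship cited from Putnam and Meyers (1992) can be made concrete with a power law of the commonly quoted form below; the exponent b > 1 is an illustrative assumption, not a figure taken from this paper:

```latex
% Illustrative form only; the exponent b is an assumption, not a value
% reported by Putnam and Meyers (1992) or by this paper.
E = a\,S^{b}, \qquad b > 1,
\qquad\text{so that}\qquad
\frac{E(2S)}{E(S)} = 2^{b} > 2 .
```

Under any such relation, one project of size 2S costs more than twice a project of size S, so splitting it into two smaller projects reduces total effort – the economic argument behind the small-project focus suggested here.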

5.6. One track approach?

SSADM implies a one track approach and does not encourage the creation of alternative scenarios for the project as it develops. In contrast, experienced analysts were always planning what action to take if problems arose with staff turnover, hardware performance, procurement delays and so on. The successful analyst is careful to focus his attention on high risk areas, not on the whole project as indicated by SSADM.

5.7. Staff development?

Cheap literature and authorized training is claimed to be one of the main advantages of SSADM as an open methodology. The problem is that if the standard approach is flawed, as this evidence indicates, then the books, training and consultancy are obliged to propagate the errors.

6. RECOMMENDATIONS

This present work confirms the findings of Boehm (1981), Jones (1986) and DeMarco and Lister (1987) that people rather than methodology are the key factors in raising productivity. SSADM and methodologies in general should be grounded in the real world of unruly users and uncooperative customers. This led to the following recommendations:

(i) While SSADM and indeed any methodology will be flawed or inappropriate for many projects, simply abandoning methodologies may not be the best answer. There is value in providing a framework or series of templates within which to discuss and manage information systems projects.

(ii) Tailored life cycles should be used for each project. The proposed approach – evolutionary, participative, soft systems or others as appropriate – would be explicitly stated and related to the context and risk factors of the project.

(iii) SSADM should be significantly reduced in size so that it can be assimilated and understood. This could be achieved by reducing the amount of prescription, which practitioners find hard to apply.

(iv) The evidence suggests that making SSADM or any information systems methodology mandatory is counter-productive, because it reduces 'ownership' of the projects by the developers and hinders organizational learning.

(v) Project register: for funds to be released for a software project, large quantities of documentation have to be created and submitted. Instead, a few parameters of budget, size and time scales should be held centrally and updated as the project progresses. This would quickly provide a profile of actual performance and a basis for benchmarking (a minimal sketch of such a register record follows this list).

(vi) Process maturity: assessment of the software development capability of the various parts of the public sector would provide a firmer base for training and development plans. If this sample is typical, virtually all of the public sector would rank at the bottom of the Software Engineering Institute's Software Process Maturity model.

(vii) User communication: the emphasis needs to be shifted from diagramming techniques and CASE tools to the sociology of projects. The evidence of the poor relationships with users indicates that much larger and cheaper gains could be made from tackling the 'soft' organizational rather than the 'hard' technical issues within IS development.

(viii) Project managers: the need for a 'cadre' of well trained and experienced project managers was mentioned by several practitioners. A fast track career development scheme to improve the skills of staff is required.
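
To make recommendation (v) concrete, here is a minimal sketch of such a register record; the field names are assumptions introduced for illustration, not part of the paper's proposal:

```python
# Hypothetical sketch of the centrally held project register of
# recommendation (v): a few parameters per project, updated as the
# project progresses, giving a profile of actual performance.

from dataclasses import dataclass

@dataclass
class ProjectRecord:
    name: str
    budget_pounds: int       # approved budget
    it_staff: int            # project IT staff
    planned_months: int
    elapsed_months: int = 0  # updated as the project progresses

    def schedule_used(self):
        """Elapsed time as a fraction of plan -- a simple benchmark."""
        return self.elapsed_months / self.planned_months

register = [
    ProjectRecord("Housing system", budget_pounds=1_000_000,
                  it_staff=10, planned_months=30, elapsed_months=36),
]
for record in register:
    print(f"{record.name}: {record.schedule_used():.0%} of planned duration used")
```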

7. CONCLUSION

SSADM has done a service in promoting an ordered approach to software development and spreading the use of valuable techniques. SSADM offers political protection to civil servants should projects go wrong. It also helps to give the appearance of administrative control over the complex process of software development. It is therefore useful to the public sector.

The evidence presented in this study would confirm the observations of others that SSADM is flawed (Thomson 1990, Crinnion 1991, Holloway 1993, Trull 1990, Simpson 1990, Cockcroft 1990).

SSADM has three main weaknesses.

First, it is based on a waterfall model of software development which is appropriate for only a minority of projects. This finding is confirmed by data from Verner and Cerpa (1997) that shows that only 27% of organizations use the waterfall approach for all projects.

Secondly, as Avgerou and Cornford (1993) point out, there are limitations in trying to standardize professional practices for activities which are so little understood. The fact that SSADM has no empirical base and has not been systematically monitored in use by the CCTA would indicate that it is unlikely to be an effective way to direct the efforts of software developers.

Thirdly, the high level of prescription which accounts for much of the size and complexity of the methodology is not useful for practitioners. This is because the prescriptions are based on the assumption that the waterfall life cycle is appropriate for many projects when clearly it is not. Also the techniques prescribed in such detail, without any empirical justification, are not found to produce benefits and are therefore ignored.

The idea of a ‘best’ method is misleading becauseof the diverse range of projects and developers.The generic lesson from this research is that anorganization is probably unwise to use a heavilyprescriptive methodology to improve its softwaredevelopment performance. If an organization feelsthat methodology is the appropriate route it shouldensure there is some empirical data to underpinthe proposed approach and that there is someobjective monitoring of the methodology whenin use.

REFERENCES

AIMS Systems. 1990. SSADM Version 4 Training Manuals (Vols. 1, 2 & 3). AIMS Systems, England.

Andrews, D. and D. Ince. 1991. Practical Formal Methods with VDM. McGraw-Hill, London.

Ashworth, A. 1992. The current position of SSADM. In NCC Software Seminar, ICL Belfast.

Avgerou, C. and T. Cornford. 1993. Developing Information Systems: Methodologies, Techniques and Tools. Macmillan, London.

Boddie, J. 1987. Crunch Mode: Building Effective Systems on a Tight Schedule. Yourdon Press, Prentice-Hall, Englewood Cliffs, NJ.

Boehm, B. 1981. Software Engineering Economics. Prentice-Hall, Englewood Cliffs, NJ.

Boehm, B.W. et al. 1984. A software development environment for improving productivity. Computer, 17, 6, 30–42.

CCTA. 1990. SSADM Version 4 Reference Manuals (Vols. 1, 2, 3 & 4). NCC Blackwell, Oxford.

Checkland, P. 1981. Soft Systems Methodology. Wiley, Chichester.

Coad, P. and E. Yourdon. 1991. Object-Oriented Analysis. Prentice-Hall, Englewood Cliffs, NJ.

Cockcroft, M. 1990. Structured methods – not seeing the wood for the trees. In Southcourt Conference Making Structured Methods Work, London.

Crinnion, J. 1991. Evolutionary Systems Development. Pitman, London.

Cutts, G. 1991. SSADM, 2nd edn. Blackwell Scientific Publications, Oxford.

DeMarco, T. 1982. Controlling Software Projects: Management, Measurement and Estimation. Prentice-Hall, Englewood Cliffs, NJ.

DeMarco, T. and T. Lister. 1987. Peopleware: Productive Projects and Teams. Dorset House, New York.

Downs, E., P. Clare and I. Coe. 1992. SSADM: Application and Context, 2nd edn. Prentice-Hall, Hemel Hempstead.

Eva, M. 1990. SSADM: A Practical Approach. McGraw-Hill, Maidenhead.

Gilb, T. 1988. Principles of Software Engineering Management. Addison-Wesley, Wokingham, United Kingdom.

Hares, J.S. 1990. SSADM for the Advanced Practitioner. Wiley, Chichester.

Holloway, S. 1993. The method and the madness. Government Computing, 7, 2, 10.


Holway, R. 1992. A review of the financial performance of United Kingdom computing services companies. In The Holway Report, Richard Holway Ltd., Farnham, Surrey.

Humphrey, W.S. 1990. Managing the Software Process. Addison-Wesley, Reading, MA.

Ince, D. 1991. The making of a modern methodology. Informatics, 11(9), 24–26.

Jones, G.W. 1990. Software Engineering. Wiley, New York.

Jones, T.C. 1986. Programming Productivity. McGraw-Hill, New York.

Longworth, G. 1989. Getting the System You Want: A User's Guide to SSADM. NCC Publications, Manchester.

Longworth, G. 1992. Introducing SSADM (Version 4). NCC Blackwell, Manchester.

Longworth, G. and D. Nicholls. 1986a. SSADM Manual (Version 3). NCC Publications, Manchester.

Longworth, G. and D. Nicholls. 1986b. SSADM Manual (Version 3). Vol. 2: Techniques and Documentation. NCC Publications, Manchester.

Longworth, G., D. Nicholls and J. Abbott. 1988. SSADM Developer's Handbook. NCC Publications, Manchester.

Martin, J. 1991. Rapid Application Development. Macmillan, New York, NY.

Matsumoto, Y. and Y. Ohno. 1989. Japanese Perspectives in Software Engineering. Addison-Wesley, Singapore.

Mumford, E. and M. Weir. 1979. Computer Systems in Work Design – The ETHICS Method. Associated Business Press, London.

National Audit Office. 1987. Inland Revenue: Control of Major Development in the Use of Information Technology. HMSO, London.

National Audit Office. 1989. Department of Social Security: Operational Strategy. HMSO, Session 1988–89, HC111, London.

National Audit Office. 1990. Managing Computer Projects in the National Health Service. HMSO, London.

Naumann, J.D. and A.M. Jenkins. 1982. Prototyping: the new paradigm for systems development. MIS Quarterly, 6(3) (September), 277–291.

Parkinson, J. 1991. Making CASE Work. Blackwell, Oxford, United Kingdom.

Parnas, D.L. and P.C. Clements. 1986. A rational design process: how and why to fake it. IEEE Transactions on Software Engineering, 12, 2, 251–257.

Putnam, L.H. and W. Meyers. 1992. Measures for Excellence: Reliable Software on Time and on Budget. Yourdon Press, Englewood Cliffs, NJ.

Royce, W.W. 1970. Managing the development of large software systems: concepts and techniques. In WESCON.

Simpson, I. 1990. Quoted in G. Black. Promises to be fulfilled. Financial Times, October 19, 7.

Skidmore, S., R. Farmer and G. Mills. 1992. SSADM Version 4 Models and Methods. NCC, Manchester.

Skidmore, S. and B. Wroe. 1990. Introducing Systems Design. NCC Blackwell, Manchester.

Song, X. and L.J. Osterweil. 1994. Experience with an approach to comparing software design methodologies. IEEE Transactions on Software Engineering, 20, 5, 364–384.

Spence, I.T.A. and B.N. Carey. 1991. Customers do not want frozen specifications. Software Engineering Journal, 6, 6, 175–180.

Springett, P. 1993. The method and the madness. Government Computing, 7, 6, 10.

Thomson, I. 1990. SSADM: the last word. Government Computing, 4, 5, 28–29.

Trull, H. 1990. Quoted in G. Black. Promises to be fulfilled. Financial Times, October 19, 7.

Verner, J.M. and N. Cerpa. 1997. Prototyping: does your view of its advantages depend on your job? Journal of Systems Software, 36, 1, 3–16.

Whitefield, A. and A. Sutcliffe. 1992. Case study in human factors evaluation. Information and Software Technology, 34, 7, 443–453.

Young, G. 1993. Quoted in P. Springett. The method and the madness. Government Computing, 7, 6, 10.

EDITOR'S COMMENT

SSADM (Structured Systems Analysis and Design Method) is a software development methodology widely used throughout the European Community and elsewhere, and is even mandatory for all UK central government software development projects. Based on an impressively broad background in using and training SSADM and the collection of data and experience from a variety of large and small SSADM projects, this paper identifies a number of difficulties and flaws when using SSADM. Although some of the conclusions seem fairly obvious, they still contradict a number of the official SSADM guidelines, as exemplified here. In addition, the conclusions are based on broad and systematically investigated practical experience and not (!) based just on anecdotal evidence as is often the case in these kinds of studies.

—WS