2012-05-22
ETSF01: Software Engineering Process –
Economy and Quality
Dietmar Pfahl, Lund University
ETSF01 / Lecture 5, 2011
Software Project Management
Chapter 13 Software Quality
The magic triangle
Project
Quality and Scope
Time Effort (Cost)
Trade-offs!
Operational Definition of Software Quality
• Recommended elements of a quality attribute specification:
– Definition/description of the quality attribute
– Scale: What is the unit of measurement?
– Now: What is the current value (i.e., the value achieved without the new software)?
– Test: How will the actual value of the quality attribute be assessed (measured)?
– Target range: What is the expected (planned) value range?
– Minimally acceptable: What is the lowest acceptable value? Is there a margin for trade-off if another quality attribute is higher than desired?
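The recommended elements above can be captured as a small record type. This is only an illustrative sketch – the class and field names are not prescribed by any standard, and the example values follow the QR-03 accuracy example used in this lecture:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class QualityAttributeSpec:
    name: str                          # e.g. "Accuracy"
    definition: str                    # definition/description of the attribute
    scale: str                         # unit of measurement
    now: float                         # current value, without the new software
    test: str                          # how the actual value will be assessed
    target_range: Tuple[float, float]  # expected (planned) value range
    minimally_acceptable: float        # lowest acceptable value

qr03 = QualityAttributeSpec(
    name="Accuracy",
    definition="PRED(0.3) of effort estimates produced by the AESP tool",
    scale="dimensionless (proportion of estimates within +/-30%)",
    now=0.4,
    test="leave-one-out cross-validation against a project repository",
    target_range=(0.8, 1.0),
    minimally_acceptable=0.8,
)
print(qr03.name, qr03.minimally_acceptable)
```

Keeping the specification in a structured form like this makes the "Now" versus "Target range" gap explicit and checkable.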
Defining Software Quality – Example
Project Mission QR-03 [Accuracy]
• Definition/description:
– The prediction quality of the estimates generated by the AESP software tool shall be PRED(0.3) = 0.8,
– i.e., the accuracy of an effort estimate generated by the tool shall be within +/-30% of the actual effort in at least 80% of the tool applications (→ in different software projects)
• Scale:
– Unit of measurement for effort: Person-Months (PM)
– Unit of measurement for accuracy (= abs(actual – estimated effort)): PM
– Unit of measurement for relative accuracy (= abs(act. – est. eff.) / act. eff.): none (dimensionless – or: percentage)
Defining Software Quality – Example
Project Mission QR-03 [Accuracy]
• Now: Here we would write down the accuracy of effort estimates made in the past without the AESP tool (e.g., our subjective estimates). Say we had these results in the past 5 projects:
– Proj1: Subjective (expert) estimate = 20 PM / Actual = 25 PM → rel. accuracy = abs(25-20)/25 = 0.2 (20%)
– Proj2: Subjective (expert) estimate = 30 PM / Actual = 50 PM → rel. accuracy = abs(50-30)/50 = 0.4 (40%)
– Proj3: Subjective (expert) estimate = 12 PM / Actual = 10 PM → rel. accuracy = abs(10-12)/10 = 0.2 (20%)
– Proj4: Subjective (expert) estimate = 20 PM / Actual = 30 PM → rel. accuracy = abs(30-20)/30 = 0.33 (33%)
– Proj5: Subjective (expert) estimate = 24 PM / Actual = 40 PM → rel. accuracy = abs(40-24)/40 = 0.4 (40%)
...
• Based on the above 5 cases, without the AESP tool: PRED(0.3) = 0.4
– i.e., only in 2 out of 5 cases was the accuracy of the subjective estimates within +/-30% of the actual effort
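The PRED(0.3) figure can be recomputed directly from the five (estimate, actual) pairs above; a minimal sketch:

```python
# (estimated, actual) effort in person-months for Proj1-Proj5 above
projects = [(20, 25), (30, 50), (12, 10), (20, 30), (24, 40)]

def relative_error(estimated, actual):
    # relative accuracy as defined above: abs(actual - estimated) / actual
    return abs(actual - estimated) / actual

def pred(level, data):
    # PRED(l): fraction of estimates within +/-(l*100)% of the actual effort
    within = sum(1 for est, act in data if relative_error(est, act) <= level)
    return within / len(data)

print(pred(0.3, projects))  # 0.4 -- only Proj1 and Proj3 are within 30%
```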
Defining Software Quality – Example
Project Mission QR-03 [Accuracy]
• Test:
– The actual value of the quality attribute will be assessed by randomly picking 10 of the 60 cases in the PROMISE data repository and, for each picked case, calculating the accuracy of the estimate produced by AESP based on the remaining 59 cases in the data repository (→ leave-one-out cross-validation)
• Target range:
– Expected value range for effort estimates → [0, 1000] (in person-months)
– Expected value range for rel. accuracy → [0, 0.5] (dimensionless)
• Minimally acceptable:
– The lowest acceptable value is PRED(0.3) = 0.8
– No trade-offs will be accommodated
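The leave-one-out test procedure can be sketched as follows. The repository contents and the estimator (a simple mean over the remaining cases) are hypothetical stand-ins for the PROMISE data and the AESP tool:

```python
import random

random.seed(1)
# Hypothetical repository of 60 actual efforts (person-months),
# standing in for the PROMISE cases mentioned above.
repository = [random.uniform(5, 50) for _ in range(60)]

def estimate(remaining):
    # Placeholder estimator: the real AESP tool would use the 59
    # remaining cases in a more sophisticated way.
    return sum(remaining) / len(remaining)

def pred_leave_one_out(level, data, sample_size):
    picked = random.sample(range(len(data)), sample_size)  # e.g. 10 of 60
    hits = 0
    for i in picked:
        remaining = data[:i] + data[i + 1:]  # all cases except the picked one
        est, act = estimate(remaining), data[i]
        if abs(act - est) / act <= level:
            hits += 1
    return hits / sample_size

print(pred_leave_one_out(0.3, repository, 10))
```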
Software Project Management
Chapter 13.5 ISO 9126 (Software Product Quality)
Definition of “(Software) Quality”
• ISO 8402-1986: The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs
• ISO 9126-1991: The totality of features and characteristics of a software product that bear on its ability to satisfy stated or implied needs
• ISO 9000-2005: Degree to which a set of inherent characteristics fulfills requirements
Modes of product quality assessment
• During software development – to assist developers to build software with the required qualities
• During software acquisition – to allow a customer to compare and select the best quality product
• Independent evaluation – by assessors rating a software product for a particular community of users
ISO 9126: Software Quality Characteristics
Functionality Does the software provide the functions to satisfy user needs?
Reliability Can the software maintain its level of performance?
Usability How easy is the software to use?
Efficiency Relates to the physical resources used during software execution
Maintainability Relates to the effort needed to make changes to the software
Portability How easy can the software be transferred to a new environment?
ISO 9126 – Characteristics and Attributes
• Functionality: Suitability, Accuracy, Interoperability, Security, Compliance
• Reliability: Maturity, Fault Tolerance, Recoverability, Compliance
• Usability: Understandability, Learnability, Operability, Attractiveness, Compliance
• Efficiency: Time Behaviour, Resource Behaviour, Compliance
• Maintainability: Analyzability, Changeability, Stability, Testability, Compliance
• Portability: Adaptability, Installability, Co-existence, Replaceability, Compliance

1:n relation between Characteristics and Attributes (Sub-Characteristics)
(cf. Textbook pp. 299-301)
ISO 9126 – Context
[Figure: quality chain. Process quality influences the internal quality attributes of the software product, which influence its external quality attributes, which in turn influence the quality-in-use attributes; each later kind of quality depends on the earlier ones. The corresponding measures are:
• Process: process measures (ISO 15504)
• Software product: internal measures (e.g., size, complexity)
• Software product: external measures (e.g., response time, memory utilisation)
• Effect of software product: quality-in-use measures, taken in the context of use]
Quality in use
• Effectiveness – ability to achieve user goals with accuracy and completeness
• Productivity – avoids excessive use of resources in achieving user goals
• Safety – within reasonable levels of risk of harm to people, business, software, property, environment, ...
• Satisfaction – happy users!
‘Users’ include those who maintain software as well as those who operate it.
Using ISO 9126 – 5 Steps
1. Judge the importance of each quality characteristic for the application, e.g.:
– Safety-critical systems – ‘reliability’ very important
– Real-time systems – ‘efficiency’ important
2. Select relevant external measures for the important quality characteristics, e.g.:
– Mean time between failures for ‘reliability’
– Response time for ‘efficiency’ (→ ‘time behaviour’)
Using ISO 9126 – 5 Steps (cont’d)
3. Map measures onto ratings that show the degree of user satisfaction. Example for the measure ‘response-time’:

Response (secs)   Quality score
< 2               5
2-3               4
4-5               3
6-7               2
8-9               1
> 9               0

4. Identify the relevant internal measures, and the intermediate products in which they would appear, to get early indicators of external quality attributes (only relevant in development mode – not when evaluating existing software)
– E.g., for estimating ‘time behaviour’, the execution time for each transaction could be calculated at the software design stage

5. Overall assessment of product quality:
– Map each measure onto a quality score (as in the table above)
– Rate the importance of each quality attribute on a Likert scale (1-5)
– Multiply quality and importance scores – see next slide
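The response-time mapping of step 3 can be written as a small function. The table leaves gaps between bands (e.g. 3-4 secs); the sketch below closes each gap by extending the band upwards, which is one possible reading, not the only one:

```python
def quality_score(response_secs):
    # Bands derived from the response-time table above:
    # <2 -> 5, 2-3 -> 4, 4-5 -> 3, 6-7 -> 2, 8-9 -> 1, >9 -> 0
    bands = [(2, 5), (4, 4), (6, 3), (8, 2), (10, 1)]
    for upper_bound, score in bands:
        if response_secs < upper_bound:
            return score
    return 0

print(quality_score(1.5))   # 5
print(quality_score(4.5))   # 3
print(quality_score(12.0))  # 0
```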
Rated and weighted quality scores
Product quality   Importance   Product A                   Product B
                  rating (a)   Quality (b)  Weighted (a×b) Quality (c)  Weighted (a×c)
Usability         3            1            3              3            9
Efficiency        4            2            8              2            8
Maintainability   2            3            6              1            2
Overall totals                              17                          19
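The weighted totals in the table can be reproduced in a few lines; the attribute and product names follow the table above:

```python
importance = {"Usability": 3, "Efficiency": 4, "Maintainability": 2}
quality = {
    "Product A": {"Usability": 1, "Efficiency": 2, "Maintainability": 3},
    "Product B": {"Usability": 3, "Efficiency": 2, "Maintainability": 1},
}

def overall_total(scores):
    # weighted score = importance rating x quality score, summed over attributes
    return sum(importance[attr] * scores[attr] for attr in importance)

for product, scores in quality.items():
    print(product, overall_total(scores))  # Product A 17, Product B 19
```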
Software Project Management
Chapter 13.7 Quality Management Systems
ISO 9000:2000 Standard Family
• Since 2000, the ISO 9000 family consists of a core of three International Standards plus many associated quality standards, technical reports and guides (two of which are mentioned below).
• The family consists of:
– ISO 9000:2000 Quality management system – Fundamentals and vocabulary
– ISO 9001:2000 Quality management system – Requirements*
– ISO 9004:2000 Quality management system – Guidelines for performance improvement
• ISO 90003:2004 explains how ISO 9001:2000 can be applied to software and related services
• Associated with the above are:
– ISO 10012 Quality assurance requirements for measuring equipment – Metrological confirmation system for measuring equipment
– ISO 19011 Auditing quality and environmental management systems

* An update was issued in 2008 with clarifications but no new requirements.
ISO 9001:2000 principles
• Understanding the requirements of the customer
• Leadership to provide unity of purpose and direction to achieve quality
• Involvement of staff at all levels
• Focus on the processes which create intermediate and deliverable products and services
• Focus on interrelation of processes that deliver products and services
• Continuous process improvement
• Decision-making based on factual evidence
• Mutually beneficial relationships with suppliers
ISO 90003: 2004 – Elements
• Systemic Requirements and Guidelines • Management Requirements and Guidelines • Resource Requirements and Guidelines • Realization Requirements and Guidelines • Remedial Requirements and Guidelines
Details can be found at URL: http://www.praxiom.com/iso-90003.htm
ISO 9000:2000 Standard Family – Summary
• ISO 9001 is one of a family of standards that specify the characteristics of a good quality management system (QMS)
• Can be applied to the creation of any type of product or service, not just IT and software
• Does NOT set universal product/service standards
• DOES specify the way in which standards are established and monitored
Software Project Management
Chapter 13.8 Process capability models
Software Process Assessment with CMMI
• Staged representation: defines 5 maturity levels (MLs), ML 1 to ML 5. To achieve a maturity level, all process areas associated with that level, plus all process areas associated with the levels below it, must have a certain minimal capability.
• Continuous representation: a maturity profile is established based on the capabilities of the individual process areas.
[Figure: the staged representation shown as a ladder of levels ML 1 – ML 5; the continuous representation shown as a bar chart of Process Area Capability (0-5) for each process area (PA)]
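The staged rule above ("all process areas at and below the level must reach a certain minimal capability") can be sketched as follows. The profile data is hypothetical, and a real CMMI appraisal is considerably more involved:

```python
# process area -> (maturity level it belongs to, capability requirement met?)
profile = {
    "Requirements Management":           (2, True),
    "Project Planning":                  (2, True),
    "Configuration Management":          (2, True),
    "Organizational Process Definition": (3, True),
    "Risk Management":                   (3, False),
    "Quantitative Project Management":   (4, False),
}

def maturity_level(profile):
    level = 1  # every organization starts at ML 1
    for ml in (2, 3, 4, 5):
        areas = [met for lvl, met in profile.values() if lvl == ml]
        if areas and all(areas):
            level = ml     # all process areas at this level qualify
        else:
            break          # a gap at this level blocks all higher levels
    return level

print(maturity_level(profile))  # 2 -- ML 3 is blocked by Risk Management
```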
CMMI Family
• 4 different models – for different application scopes:
– CMMI-SE/SW/IPPD/SS
– CMMI-SE/SW/IPPD
– CMMI-SE/SW
– CMMI-SW
– All models have a continuous and a staged representation
• Definitions:
– SS: Supplier Sourcing
– IPPD: Integrated Product and Process Development
– SE: Systems Engineering
– SW: Software Engineering
• Assessment is done via document inspection, questionnaires, and interviews
CMMI Levels and Process Areas (staged)
• Level 5 – Optimizing: Causal Analysis and Resolution; Organizational Innovation and Deployment
• Level 4 – Quantitatively Managed: Quantitative Project Management; Organizational Process Performance
• Level 3 – Defined: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Risk Management; Integrated Project Management (for IPPD*); Integrated Teaming*; Integrated Supplier Management**; Decision Analysis and Resolution; Organizational Environment for Integration*
• Level 2 – Managed (Repeatable): Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
• Level 1 – Performed: (no associated process areas)

* Integrated Product/Process Development (IPPD) – add-on to the Engineering processes
** Acquisition – add-on to the Engineering processes
Progression of Process Maturity Levels
CMM & CMMI: the start is always at ML 1.
1. Initial
→ (basic project management)
2. Managed
→ (process definition)
3. Defined
→ (quantitative process management)
4. Quantitatively managed
→ (process control)
5. Optimizing
Stage “Managed”
[Figure: a single ‘construct the system’ activity with input (requirements), control (budget, schedule, standards), resources (staff, tools), and output (code, documentation)]
Stage “Defined”
[Figure: construction is now defined as a chain of sub-processes – design & define → code & unit test → integrate/system test. Inputs and controls: requirements, design methods, inspection criteria, test plans; resources: tools, staff etc.; intermediate products: system design, tested modules; final output: software system]
Stage “Quantitatively Managed”
[Figure: the same defined process chain (design & define → code & unit test → integrate/system test), now with a ‘manage’ activity that receives measurement data from the sub-processes – design faults, code faults, system failures – and feeds directives back into them]
Stage “Optimizing”
[Figure: the old development system is continuously optimized into a new development system]
CMMI – What we know about the projects
[Figure: for each maturity level ML 1 to ML 5, a project is depicted as a process with inputs and outputs together with the probability distribution of hitting the project target; as the maturity level increases, the distribution narrows around the target]
CMM Assessment Results (continuous)
CMMI Assessment Follow-Up Activities
• Action plan – generated by the SEPG, the assessment team, and key personnel from the organization/projects:
– Addresses the findings (how to address weaknesses)
– Strategy for addressing additional KPAs
– Detailed actions, responsibilities, budget, and schedule
– Reviewed/approved by management

[Figure: KPAs x1-x6 plotted by importance (low/medium/high) against lack of compliance, i.e. maturity gap (low/medium/high), to prioritize improvement actions]
Other Process Assessment Approaches
• BOOTSTRAP (originating from an EU research project)
• SPICE (ISO 15504)
– Different structure of processes than in CMMI (roughly following ISO 12207)
– 6 maturity levels (beginning at Level 0)
• Company-specific assessment methods: Nortel, Siemens, Trillium, …
• “Light-weight” assessment methods (incl. self-assessment), developed by the SEI or by consulting companies
Assessment Method SPICE Software Process Improvement and Capability Determination
(ISO 15504)
Process Categories ISO 15504
• Customer-supplier (CUS)
• Engineering (ENG)
• Project (PRO)
• Support (SUP)
• Organizing (ORG)

Customer-supplier process category:
• CUS.1 Acquire software product and/or service
• CUS.2 Establish contract
• CUS.3 Identify customer needs
• CUS.4 Perform joint audits and reviews
• CUS.5 Package, deliver, and install the software
• CUS.6 Support operation of software
• CUS.7 Provide customer service
• CUS.8 Assess customer satisfaction
http://www.rad.fr/spice1.htm
SPICE Levels and Performance Attributes

CMMI Level            SPICE Level      Attribute                      Comment
–                     0. Incomplete    –                              The process is not implemented or is unsuccessful
1. Performed          1. Performed     1.1 Process performance        The process produces its defined outcomes
2. Managed            2. Managed       2.1 Performance management     The process is properly planned and monitored
                                       2.2 Work product management    Work products are properly defined and reviewed to ensure they meet requirements
3. Defined            3. Established   3.1 Process definition         The processes to be carried out are carefully defined
                                       3.2 Process deployment         The processes defined above are properly executed by properly trained staff
4. Quantit. Managed   4. Predictable   4.1 Process measurement        Quantitatively measurable targets are set for each sub-process and data collected to monitor performance
                                       4.2 Process control            On the basis of the data collected by 4.1, corrective action is taken if there is unacceptable variation from the targets
5. Optimizing         5. Optimizing    5.1 Process innovation         As a result of the data collected by 4.1, opportunities for improving processes are identified
                                       5.2 Process optimization       The opportunities for process improvement are properly evaluated and, where appropriate, are effectively implemented
ISO 15504 Process Assessment
• For each process in the relevant Process Reference Model (e.g., ISO 12207), assess for each set of attribute-level criteria whether it is:
– N: not achieved (0-15%)
– P: partially achieved (>15%-50%)
– L: largely achieved (>50%-85%)
– F: fully achieved (>85%)
• Example assessment result: – http://www.sqi.gu.edu.au/AppraisalAssistant/ss/i15504result.jpg
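The N/P/L/F rating bands translate directly into code; a minimal sketch:

```python
def achievement_rating(percent_achieved):
    # Rating bands from the list above (ISO 15504 attribute achievement)
    if percent_achieved <= 15:
        return "N"  # not achieved
    if percent_achieved <= 50:
        return "P"  # partially achieved
    if percent_achieved <= 85:
        return "L"  # largely achieved
    return "F"      # fully achieved

print(achievement_rating(10))  # N
print(achievement_rating(60))  # L
print(achievement_rating(90))  # F
```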
ISO 15504 Process Assessment Example
Software Project Management
Chapter 13.9 Techniques to help enhance software quality
Relative Cost of Faults

[Figure: relative cost of fixing a fault, by the phase in which it is found – about 1-2 in requirements, 5 in design, 10 in code, rising through the test phases (unit, integration, system, acceptance; about 20-50) to 200 in maintenance]

Source: Davis, A.M., “Software Requirements: analysis and specification” (1990)
Reviews & Inspections complement testing
Inspection Process: Fagan Inspection (1976)
Software Project Management
Chapter 13.10 Testing
ETS200 – Software Testing
Software Project Management
Chapter 11 Managing people in software environments
Main topics of chapter 11
• People selection (→ skills and personality)
• People development (→ skill development)
• Models of motivation (→ focus on the individual)
• Organisational issues
• Issues of health and safety
• Ethical and professional concerns
Impact of Skills & Experience on Cost
4.14
3.37
2.21
1.85
1.72
1.67
1.64
1.60
1.57
1.49
1.48
1.29
1.28
1.27
1 1.5 2 2.5 3 3.5 4 4.5
Personnel Capability
Personnel Experience
Product Complexity
Required Reliability
Use of Software Tools
Execution Time Constraint
Required Reuse
Multisite Development
Main Storage Constraint
Platform Volatility
Personnel Continuity
Required Development Schedule
Database Size
Documentation
Cost
Fac
tor
Productivity Range
COCOMO Cost Drivers
SWEBOK Knowledge Categories www.swebok.org
IEEE, together with industry partners (Rational, SAP, Boeing, etc.), has specified the Software Engineering Body of Knowledge (SWEBOK), which is meant to be a standard for the training and certification of software developers.
Hawthorne effect
• 1920s – series of experiments at the Hawthorne Plant of Western Electric, Chicago
• Found that simply showing an interest in workers increased their productivity
• This illustrated how the state of mind of workers influenced their productivity
Oldham-Hackman job characteristics
• Identified the following characteristics of a job which make it more ‘meaningful’
– Skill variety – Task identity – Task significance
• Two other factors contribute to job satisfaction: – Autonomy – Feedback
Motivation!
Ethical and professional concerns
• Ethics relates to the moral obligation to respect the rights and interests of others – goes beyond strictly legal responsibilities
• Three groups of responsibilities: – Responsibilities that everyone has – Responsibilities that people in organizations have – Responsibilities relating to your profession
Professional ethics
• Professionals have knowledge about the technical domain that the general public does not
• Ethical duty of the expert to warn lay people of the risks involved in a particular course of action
• Many professions, or would-be professions, have codes of conduct for their members, e.g.:
– http://www.acm.org/about/code-of-ethics – www.computer.org/computer/code-of-ethics.pdf
Software Project Management
Chapter 12 Working in teams
Evolution of a Group
Five basic stages:
• Forming – members of the group get to know one another and try to set up some ground rules about behaviour
• Storming – conflicts arise as various members of the group try to exert leadership, and the group’s methods of working are established
• Norming – conflicts are largely settled and a feeling of group identity emerges
• Performing – the group has settled its relationships and expectations and works efficiently and effectively; the emphasis is now on the tasks at hand
• Adjourning – the group dissolves and shares their experience with others

TUCKMAN, B. (1965) “Developmental Sequence in Small Groups”, Psychological Bulletin 63, pp. 384-399
Remarks on Group Evolution
• ALL stages are important and should be performed
• The storming stage is particularly critical
– The “level of conflict” is an indicator of future success
– A low conflict level in the group is often a signal of little involvement
– Conflicts are often useful for building solidarity (if conflicts are managed reasonably and not intensified)
– “The threshold theory of conflict” (pp. 80-82 in Group Dynamics, Forsyth)
– Management plays a crucial role during storming
Balanced Teams
• Meredith Belbin studied the performance of top executives carrying out group work at the Henley Management Centre
– Tried putting the ‘best’ people together in ‘Apollo’ teams – these almost invariably did badly
– Identified the need for a balance of skills and management roles in a successful team
[Figure: the successful team combines the roles Company Worker, Resource Investigator, Co-ordinator (Chairperson), Monitor-Evaluator, Shaper, Team Worker, Plant, and Completer-Finisher]
www.belbin.com
Management team roles (Belbin)
• The co-ordinator – good at chairing meetings
• The ‘plant’ – an idea generator
• The monitor-evaluator – good at evaluating ideas
• The shaper – helps direct the team’s efforts
• The team worker – skilled at creating a good working environment
• The resource investigator – adept at finding resources, including information
• The completer-finisher – concerned with getting tasks completed
• The implementer – a good team player who is willing to undertake less attractive tasks if they are needed for team success
• The specialist (added in 1996) – the ‘techie’ who likes to acquire knowledge for its own sake
Group Performance vs. Task Types
• Additive tasks – the effort of each participant is summed
• Compensatory tasks – the judgements of individual group members are averaged; errors of some are compensated for by the judgements of others
• Disjunctive tasks – there is only one correct answer, and someone must come up with the right answer and persuade/convince the others that it is correct
• Conjunctive tasks – the task is only finished when all components have been completed

Some tasks are better carried out collectively, while other tasks are better delegated to individuals.
Obstacles to good team decisions
• Inter-personal conflicts – see the earlier slides on team formation
– Conflicts tend to be dampened by the emergence of group norms – shared group opinions and attitudes
• Risky shift – people in groups are more likely to make risky decisions than they would as individuals
Delphi approach
• To avoid dominant personalities, the following approach is adopted:
1. Enlist the co-operation of experts
2. The moderator presents the experts with the problem
3. The experts send their recommendations to the moderator
4. The recommendations are collated and circulated to all experts
5. The experts comment on the ideas of others and modify their own recommendations if so moved
6. If the moderator detects a consensus, stop; otherwise go back to step 4
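The Delphi loop can be sketched with numeric recommendations. The revision rule (each expert moves halfway toward the circulated group mean) and the consensus criterion (spread below a threshold) are illustrative assumptions, not part of the method's definition:

```python
def delphi(recommendations, spread_limit=1.0, max_rounds=10):
    # Steps 4-6 above: circulate the group view, let experts revise,
    # and stop when the moderator detects a consensus.
    for round_no in range(max_rounds):
        if max(recommendations) - min(recommendations) <= spread_limit:
            return recommendations, round_no   # consensus detected
        group_view = sum(recommendations) / len(recommendations)
        # illustrative revision rule: move halfway toward the group view
        recommendations = [(r + group_view) / 2 for r in recommendations]
    return recommendations, max_rounds

final, rounds = delphi([10.0, 20.0, 40.0])
print(rounds, final)
```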
Team Spirit
• Group members are aware of and appreciate the actions of other members contributing to group success
– Creation of a ‘collective mind’
• Approaches to promote team spirit:
– Egoless programming
– Chief programmer teams
– XP – Scrum
Rest of this week
• Read textbook chapters 11-13
• Suggested exercises: no specific exercises recommended
• Finish and submit these deliverables:
– Tool Version 2 (TV2)
– Burndown Chart (BC2)
– User Manual (UM)
• Meet project supervisors
• Plan/start acceptance testing of your Supplier Group’s TV2
Next week
• Read textbook chapters 11-13
• Perform acceptance testing of your Supplier Group’s TV2
• Submit these deliverables:
– Acceptance Test Report (ATR)
– Retrospective Report (RR)
• Lecture 6: Course Review and Exam Outlook