
Page 1:

UKSMA 2005

Lessons Learnt from introducing IT Measurement

Peter Thomas – thomapf@uk.ibm.com

Page 2:

Topics

• The IT organisation

• Measurement – in particular Function Points

• Lessons Learnt

• The future

• Summary

Page 3:

The IT organisation

• As is
– Dedicated local staff committed to customer departments
– Little formal Project Management discipline; need to react to customers' needs and wants

• To be
– Delivery split between own and IT Services companies
– Move work to more formally controlled projects with a measured project delivery rate

Page 4:

FPs enable Productivity Measurement, but other measurements are required to ensure overall delivered quality.

• Function Points may be used to measure the productivity of the software delivery process
– Productivity = FP / 100 hours of project effort

• Note there is no measure against people

• A basket of measurements should be put in place to indicate when action is required to keep the software delivery process in balance.
– A minimum set of measurements in the basket is:
• Schedule, e.g. Schedule Days / 1000 FP
• Defect Rates, e.g. Defects per delivered FP
• Customer Satisfaction

– Focusing on one element in the basket can lead to aberrant behaviour (a worked sketch of the basket follows below)
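
As an illustration of how such a basket might be computed from a project's actuals, here is a minimal Python sketch; the field names and the example figures are assumptions for illustration, not taken from the presentation.

# Minimal sketch of the measurement basket; all names and figures are
# illustrative assumptions, not part of the original presentation.
from dataclasses import dataclass

@dataclass
class ProjectActuals:
    function_points: int    # delivered size, e.g. an FP Light count
    effort_hours: float     # effort for the agreed set of project activities
    schedule_days: float    # elapsed days between agreed start and end events
    defects: int            # defects found in the delivered software

def measurement_basket(p: ProjectActuals) -> dict:
    return {
        # Productivity = FP per 100 hours of project effort
        "fp_per_100_hours": 100.0 * p.function_points / p.effort_hours,
        # Schedule rate, normalised to days per 1000 FP
        "schedule_days_per_1000_fp": 1000.0 * p.schedule_days / p.function_points,
        # Quality: defects per delivered FP
        "defects_per_fp": p.defects / p.function_points,
    }

print(measurement_basket(ProjectActuals(function_points=450, effort_hours=3200,
                                        schedule_days=120, defects=27)))

Customer Satisfaction, the third element of the minimum set, would come from survey data rather than from a calculation like this.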

Page 5:

Function Point Analysis has more than one flavour

• A fully auditable value – expensive and not really required for tracking trend line behaviour

• A reasonable approximation – this is much easier and cheaper if all data and transactions are treated as Average (FP Light)

• A rough guess – this is of limited value other than as an indicator of the effort needed to produce a better value

Page 6:

FP Light was chosen as the output product Sizing Measure

• Advantages
– Less effort to count
• The saving is around 50% compared to a full IFPUG count

• Disadvantages
– Less accurate as a sizing metric
• Accuracy is within 15-20%

• Comment
– The accuracy of the FP Light number is sufficient, since the accuracy of many of the other measures that make up the required metrics set is often even lower.

Page 7:

Each Data Flow and Data Store has a set number of FPs.

[Diagram: an example application model with actors (Administrator, Agent), processes (Content Management, Update Product, Maintain Policy, Product Inquiry, Select Sales), data (Product, Product History, Customer Products, Policy) and data flows (Product Information, Policy Information, Policy Details) crossing the internal/external application boundary.]

Function Point Light definitions:
• ILF = 10 FP
• EIF = 7 FP
• EI = 4 FP
• EO = 5 FP
• EQ = 4 FP

(A small sizing sketch using these weights follows.)
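
As a minimal illustration, assumed rather than taken from the original deck, of how an FP Light size could be derived from these average weights; the component tallies in the example are hypothetical.

# FP Light sizing sketch: multiply each component count by its average
# weight and sum. Weights are the FP Light definitions above; the example
# component tallies are hypothetical.
FP_LIGHT_WEIGHTS = {
    "ILF": 10,  # Internal Logical File
    "EIF": 7,   # External Interface File
    "EI": 4,    # External Input
    "EO": 5,    # External Output
    "EQ": 4,    # External Inquiry
}

def fp_light_size(component_counts: dict) -> int:
    return sum(FP_LIGHT_WEIGHTS[kind] * count
               for kind, count in component_counts.items())

# e.g. 2 internal files, 1 interface file, 3 inputs, 1 output, 1 inquiry
print(fp_light_size({"ILF": 2, "EIF": 1, "EI": 3, "EO": 1, "EQ": 1}))  # 48 FP

Treating every data group and transaction as Average in this way is what makes the FP Light count so much cheaper than a full IFPUG count, at the cost of the 15-20% accuracy noted earlier.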

Page 8:

Lessons Learnt - Training

• FP training for IT professionals is available from a number of suppliers, but staff churn means that training is an ongoing requirement

Page 9:

Culture & Comparison

• Culture
– The management team using the metrics needs to provide sponsorship, promotion and direction to enable the metrics to be successfully embedded in the culture
– (This presentation is a variant of the one shared with them)

• Comparison
– When using productivity to assess the software process (and/or setting productivity targets), areas of the organisation with similar project attributes should be chosen.
• There are hard and easy function points – compare with care
– Many factors affect the productivity achieved on any given project.
• Project Type, Platform, Architecture, Software framework, Non-coding effort, Skill and experience levels, Process, Tools, etc.

Page 10:

Exceptions & Effort

• Exceptions
– FP Light is not a useful metric for every component or every project
• Must decide how to handle these exceptions

• Project effort
– It is important for consistency that the set of activities for which effort is measured against FP size is the same on a project-by-project basis within component areas.
– For an accurate assessment of productivity, the resource must correspond to the delivered FP counted.
• Effort should be recorded at a suitable level of detail to allow analysis by role and stage
• Where appropriate, "non-countable" activities and their resource should be excluded from the measurement
• The event/milestone which determines the start and end of the project and/or phase also needs agreeing. These should fit with the hand-off and hand-in of work given to the external teams.

Page 11:

Process

• To count FPs, the analysis workshops need to be built into the standard development lifecycles.

• Metrics group must have a method of recognising when a project is due an FP count.

• An FP number should be required before project closure is accepted.

• The metrics group and FP Analysts provide but do not own the data.

• They are not in a position to make decisions based on the measurements, ie to change the software development lifecycles.

• Techniques to utilise the data must be developed
• What are the questions? Is the data sufficient?
• What can be safely compared and summed?

Page 12:

Human Resources

• Human Resources
– If project staff fear that they are not producing enough Function Points, they will be tempted to inflate the value whenever possible to make the figures reflect on them in a positive light.
– If FP counting is applied inappropriately it will lose credibility

• The Metrics group will need management support to ensure
– counts are performed according to the rules
– exceptions are recognised
– inappropriate comparisons are avoided
– analysis is appropriate

» not simply a global average

Page 13:

Boundaries & Documentation

• Boundaries
– FPs need to count flows across, and data within, application boundaries
• The setting of a standard set of boundaries within NUL is therefore vital for consistency of FP counts

• Process Documentation
– FP counting is made easier with FP-friendly documentation.
• Every effort should be made to make the standard documentation set FP-friendly.

Page 14:

The future use of Function Points can be expanded to provide project support

• Requirements validation
– Function Point analysis tests the requirements (and high-level design) documentation and models for usability and completeness. The Analyst can raise queries and issues which can prevent the project becoming troubled.
• Need an early FP count and a repeat count for each major change

• Help in the validation of project estimates
– Carrying out an early Function Point estimate will allow project estimates to be validated against historical productivity figures.
• Need a database of historical project data and an early FP count.

• Help in the tracking of projects using a modified Earned Value process (see the sketch after this list)
– Function Points delivered during the project lifecycle may be reported against expected delivery.
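
A minimal sketch of what such FP-based, Earned Value style tracking could look like; the function and parameter names and the example figures are assumptions for illustration, not the process described in the deck.

# Compare FPs delivered so far against FPs planned to this point in the
# schedule; names and figures are illustrative assumptions only.
def fp_tracking(planned_fp_to_date: float,
                delivered_fp_to_date: float,
                total_planned_fp: float) -> dict:
    return {
        # Schedule performance: > 1.0 means delivery is ahead of plan
        "schedule_performance_index": delivered_fp_to_date / planned_fp_to_date,
        # Proportion of the total planned scope delivered so far
        "percent_complete": 100.0 * delivered_fp_to_date / total_planned_fp,
    }

# e.g. 180 FP delivered against 220 FP planned at this milestone, 450 FP in total
print(fp_tracking(planned_fp_to_date=220, delivered_fp_to_date=180,
                  total_planned_fp=450))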

Page 15:

Summary

• The FP rollout is going well
– The organisation now has the capability to size projects and software
– It can therefore implement the correct measures to decide whether initiatives are improving or worsening the IT capability

• Lessons Learnt
– Executive and management support can resolve these

• The future holds additional benefits
– Better estimating
– Better project management discipline, particularly requirements & test management