Together. Free your energies. Implementing a level 5 metrics programme @Capgemini Netherlands: Selecting The Right Set... Niteen Kumar - 26/11/2013

Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


DESCRIPTION

As part of its CMMI ML5 programme, Capgemini further enhances systematic measurement programmes to monitor, control and improve process and product quality, while keeping in mind the right trade-off between the benefit and the cost of doing so. The presentation emphasizes the selection of the right set of X (leading) and Y (lagging) factors distributed over cost, time and quality, and the trade-offs between them. Every engagement has an end goal that needs to be achieved within a certain budget, with the available resources and within a given timeline. In a perfect world, an engagement manager would have as much time, money and resources as needed to achieve the highest-quality result; in reality, this is seldom the case. Think of the three factors – cost, quality and time – as legs of a stool drifting away from each other. Once the X and Y factors are identified, the engagement manager can make an informed decision in selecting the right delivery strategy.


Page 1: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013

Together. Free your energies

Implementing a level 5 metrics programme @Capgemini Netherlands Selecting The Right Set...Niteen Kumar - 26/11/2013

Page 2: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013

2 Capgemini Leading and Lagging Indicators – NESMA Presentation Niteen Kumar

MEASUREMENT OBJECTIVE

SCOPE LEADING & LAGGING

INDICATORS

LAGGING INDICATORS

SPECIFIC MEASURABLE ATTAINABLE

RELEVANT TIME BOUND

Is the measurement target-oriented?

Can it be measured? Does it cost too much?

What story will it tell? By When?

Page 3: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE LEADING & LAGGING

INDICATORS

REGRESSION EQUATION

COST QUALITY SCHEDULE

APPLICATION DEVELOPMENT KPI PORTFOLIO

APPLICATION MAINTENANCE KPI PORTFOLIO

Page 4: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE LEADING & LAGGING

INDICATORS

REGRESSION EQUATION

LIST OF X FACTORS

code complexity, encapsulation, programming language & tools, code review checklist, coding skills and experience with the programming languages and tools used, code review skills and experience, % of tickets having an existing solution in the KEDB, quality of reused source code, requirements volatility, integration test methods and tools, integration test skills and experience with the methods and tools used, quality of reused test cases, domain, requirements volatility, quality attributes, readability of documents, architecture measures, code complexity, encapsulation, requirements methods and tools, Rightshore ratio, requirements inspection checklist, high-level design methods and tools, high-level design inspection checklist, detailed design methods and tools, detailed design review/inspection checklist, programming language & tools, code review checklist usage, domain experience, requirements skills and experience with the methods and tools used, requirements inspection skills and experience, high-level design skills and experience with the methods and tools used, high-level design inspection skills and experience, detailed design skills and experience with the methods and tools used, detailed design review/inspection skills and experience, # of CRs rolled back, coding skills, domain, architecture measures, high-level design methods and tools, high-level design skills and experience with the methods and tools used, quality of reused high-level design documents, rework effort

Page 5: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE DEVELOPMENT PROJECT –

INDICATORS

REGRESSION EQUATION

CONTRIBUTION MARGIN

DELIVERED DEFECT DENSITY

COST QUALITY SCHEDULE

COST OF QUALITY

X - FACTORS

X - FACTORS

X - FACTORS

Requirements Volatility, Skill Index, Reusability, Effort by SDLC Phase, Review & Rework Effort, Resource Cost, Code Complexity / Quality, Overrun / Underrun, # of times resource changed during build

Rework Effort, Test Coverage, Testing Rate, Review Effort, Skill Level, Code Complexity / Quality, Test Preparation Effort

Resource Availability, Requirements Volatility, Skill Index, Reusability, Rework Effort, # of times resource changed during build

Y - FACTORS

% EFFORT VARIANCE

Y - FACTORS

DEFECT REMOVAL EFFICIENCY

Y - FACTORS

% SCHEDULE VARIANCE

The "X" factors influencing the outcome of "Y" were identified during the workshops.

The identified "X" factors are logical in nature and may change during statistical validation.

APPLICATION DEVELOPMENT KPI PORTFOLIO

Page 6: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE DEVELOPMENT PROJECT –

INDICATORS

REGRESSION EQUATION

APPLICATION DEVELOPMENT KPI PORTFOLIO

SPRINT COST (example data; decimal commas as in the source)

Sprint | Features / Use Cases Planned | Estimated Size (SP) | Actual Size (SP) | Planned Productivity Factor | % Completion | Total Planned Effort (P.Hrs) | Total Actual Effort (P.Hrs) | # of User Stories Modified During Iteration | Features / Use Cases Completed | Features / Use Cases Accepted | Overall Effort Variance (%)
1 | 8 | 139 | 152 | 7 | 60 | 245,00 | 223,00 | 4 | 7 | 6 | -
2 | 12 | 200 | 175 | 8 | 100 | 67,00 | 243,00 | 0 | 12 | 8 | 263%
3 | 9 | 150 | 130 | 7,5 | 100 | 120,00 | 204,00 | 5 | 9 | 10 | 70%

Actual effort per activity (Person Hours), in the order SCOPE, DSGN, MODL, COD, TST-P, TST-E, REFTR, SCRUM MASTER, REV, REW:

Sprint 1: 7,00 | 56,00 | 70,00 | 4,00 | 2,00 | 19,00 | 25,00 | 10,00 | 25,00 | 5,00
Sprint 2: 9,00 | 27,00 | 48,00 | 45,00 | 22,00 | 27,00 | 23,00 | 18,00 | 22,00 | 2,00
Sprint 3: 11,00 | 12,00 | 19,00 | 27,00 | 19,00 | 16,00 | 32,00 | 26,00 | 21,00 | 21,00

Totals at engagement level: 902,00 | 1985,00 | 2177,00 | 1561,00 | 1153,00 | 1412,00 | 1805,00 | 1089,00 | 1478,00 | 1498,00 | 7935,00 | 174,00 | 628,00 | 534,00
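The Overall Effort Variance column above can be reproduced with a simple calculation. A minimal sketch (not Capgemini's tooling), assuming variance is measured as (actual - planned) / planned effort; this reproduces the 263% and 70% figures for sprints 2 and 3:

```python
# Minimal sketch: % effort variance relative to planned effort,
# as reported in the sprint-cost table above.

def effort_variance_pct(planned_hrs: float, actual_hrs: float) -> int:
    """Overall effort variance in %, rounded to a whole percent."""
    return round((actual_hrs - planned_hrs) / planned_hrs * 100)

# Sprint 2: 67,00 planned vs 243,00 actual person-hours.
print(effort_variance_pct(67.0, 243.0))   # 263
# Sprint 3: 120,00 planned vs 204,00 actual.
print(effort_variance_pct(120.0, 204.0))  # 70
```

A large positive value such as sprint 2's 263% flags an overrun early, which is exactly what makes effort variance useful as a lagging Y factor.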

QUALITY DETAILS

Planned Test Cases | Test Cases Executed | INTERNAL Defects | EXTERNAL Defects | DOD Performed | % DOD Steps Performed | Impediments Reported | Impediments Removed | Defect Removal Efficiency (%)
28 | 20 | 17 | 9 | YES | 70 | 4 | 3 | 65%
34 | 30 | 21 | 16 | NO | 100 | 7 | 7 | 50%
12 | 12 | 25 | 12 | YES | 100 | 3 | 3 | 50%

Totals at engagement level: 1289 | 1084 | 1443 | 472 | 46 | - | 284 | 163
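The Defect Removal Efficiency column is consistent, at least in the first row, with a common definition: defects found internally as a share of all defects found. A minimal sketch assuming that definition (the slide does not spell out the formula, and the other rows may use additional adjustments):

```python
# Minimal sketch: Defect Removal Efficiency (DRE) as internal defects
# divided by all defects (internal + external), in percent. This common
# definition is an assumption; the slide does not state the formula.

def dre_pct(internal_defects: int, external_defects: int) -> int:
    return round(internal_defects / (internal_defects + external_defects) * 100)

# First quality row above: 17 internal and 9 external defects.
print(dre_pct(17, 9))  # 65
```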

Page 7: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE MAINTENANCE ENGAGEMENT

INDICATORS

REGRESSION EQUATION

APPLICATION MAINTENANCE KPI PORTFOLIO

CONTRIBUTION MARGIN

DEFECT REMOVAL EFFICIENCY FOR RELEASE

DELIVERED DEFECT DENSITY FOR RELEASE

COST QUALITY SCHEDULE

COST OF QUALITY

X - FACTORS

X - FACTORS

X - FACTORS

Idle Time (under discussion), Resource Cost, Rightshore Ratio, Skill Index, Effort Spent on KT, % of tickets having an existing solution in KEDB, # of recurring modules impacted, System Downtime, % Additional Work

Rework Effort, Test Coverage, Testing Rate, Test Preparation Effort, System Downtime, # of CRs Rolled Back, % RCA Compliance, # of recurring modules impacted

Resource Availability, Skill Index, Reusability, Rework Effort, % of tickets having an existing solution in KEDB, Elapsed Time to Assign / Investigate / Test / Implement per Ticket, # of times an incident/service request is assigned within and between teams

Y - FACTORS

% EFFORT VARIANCE FOR KT AND RELEASE

PRODUCTIVITY (AET)

% BACKLOG OF TICKETS

Y - FACTORS

% INCIDENT REDUCTION

% FIRST TIME PASS

% OF SYSTEMS SUCCESSFULLY TRANSITIONED DURING KT STAGE

Y - FACTORS

% SCHEDULE VARIANCE FOR KT PHASE

% RESPONSE & RESOLUTION COMPLIANCE

The "X" factors influencing the outcome of "Y" were identified during the workshops.

The identified "X" factors are logical in nature and may change during statistical validation.

% SCHEDULE VARIANCE FOR RELEASE

Page 8: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE INDICATORS

MAINTENANCE PROJECT

REGRESSION EQUATION

APPLICATION MAINTENANCE KPI PORTFOLIO

INCIDENT / PROBLEM MANAGEMENT - LEADING & LAGGING INDICATORS

                                                              

Reporting Month | Priority | Tickets Received (Current Month) | Backlog from Previous Month | Tickets Resolved | Backlog Tickets | Effort Spent Closing (P.Hrs) | Avg Effort per Ticket | Response Breaches | % SLA Compliance | Resolution Breaches | % SLA Compliance | FTR % (First Time Right) | Avg Elapsed Time to Assign (H:MM:SS) | Avg Elapsed Time to Closure (H:MM:SS)
Jun-13 | P1 | 11 | 1 | 12 | 0 | 29,75 | 2,48 | 2 | 83,33 | 0 | 100,00 | - | 1:30:22 | 4:54:00
Jun-13 | P2 | 13 | 4 | 17 | 0 | 40,5 | 2,38 | 3 | 82,35 | 0 | 100,00 | - | 0:33:00 | 11:33:00
Jun-13 | P3 | 807 | 132 | 845 | 94 | 2552 | 3,02 | 5 | 99,41 | 45 | 94,67 | - | 0:02:00 | 30:10:00
Jun-13 | P4 | 44 | 35 | 47 | 32 | 216 | 4,60 | 1 | 97,87 | 0 | 100,00 | - | 0:02:00 | 149:38:00
Jun-13 | S1 | 2 | 1 | 3 | 0 | - | 0,00 | 0 | 100,00 | 0 | 100,00 | - | 0:05:00 | 10:56:00
Jun-13 | S2 | 10 | 2 | 10 | 2 | 4 | 0,40 | 6 | 40,00 | 2 | 80,00 | - | 43:34:00 | 52:14:00
Jun-13 | S3 | 437 | 82 | 475 | 44 | 1134 | 2,39 | 0 | 100,00 | 55 | 88,42 | - | 0:02:00 | 24:25:00
Jun-13 | S4 | 53 | 46 | 52 | 47 | 259 | 4,98 | 0 | 100,00 | 0 | 100,00 | - | 0:16:00 | 169:45:00
Jun-13 | P0, P5, S0, S5: no tickets in the period

Totals | | 5811 | 926 | 5895 | 842 | 20545 | 3 | 64 | 98,91 | 263 | 95,54 | 0 | 0 |
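Two of the derived columns in the incident table can be sketched as follows, assuming "Average Effort per Ticket" is effort spent divided by tickets resolved and "% SLA Compliance" is the share of resolved tickets without a breach; both assumptions match the Jun-13 P1 row:

```python
# Minimal sketch of derived incident-management indicators. The formulas
# below are assumptions (the slide does not state them):
#   average effort  = effort spent / tickets resolved
#   SLA compliance  = (resolved - breaches) / resolved

def avg_effort_per_ticket(effort_person_hours: float, tickets_resolved: int) -> float:
    return round(effort_person_hours / tickets_resolved, 2)

def sla_compliance_pct(tickets_resolved: int, breaches: int) -> float:
    return round((tickets_resolved - breaches) / tickets_resolved * 100, 2)

# Jun-13 / P1: 12 tickets resolved, 29.75 person-hours spent, 2 response breaches.
print(avg_effort_per_ticket(29.75, 12))  # 2.48
print(sla_compliance_pct(12, 2))         # 83.33
```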

Page 9: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013


MEASUREMENT OBJECTIVE

SCOPE INDICATORS

MAINTENANCE PROJECT

REGRESSION EQUATION

Example: Delivered Defect Density

RAE = Requirement Analysis Effort
TPE = Test Preparation Effort
CRE = Code Review Effort

Example:

DDD = 3.0 - 0.05*RAE - 0.06*TPE - 0.025*CRE
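The example regression predicts Delivered Defect Density from three leading X factors. A minimal sketch with the slide's illustrative coefficients; the effort inputs below are hypothetical, and in practice the coefficients come from statistically validating the regression:

```python
# Minimal sketch of the slide's example regression for Delivered Defect
# Density (DDD). Coefficients are the slide's illustration, not fitted
# from data; the effort values used below are made up.

def predicted_ddd(rae: float, tpe: float, cre: float) -> float:
    """DDD predicted from Requirement Analysis, Test Preparation and
    Code Review Effort (the leading X factors)."""
    return 3.0 - 0.05 * rae - 0.06 * tpe - 0.025 * cre

# Hypothetical effort values: RAE = 10, TPE = 20, CRE = 8.
print(round(predicted_ddd(10.0, 20.0, 8.0), 2))  # 1.1
```

The negative coefficients encode the deck's message: more effort spent on the leading activities drives the lagging defect density down.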

Page 10: Implementing a level 5 metrics programme - Niteen Kumar - NESMA 2013

The information contained in this presentation is proprietary. © 2012 Capgemini. All rights reserved.

www.capgemini.com

About Capgemini

With more than 120,000 people in 40 countries, Capgemini is one of the world's foremost providers of consulting, technology and outsourcing services. The Group reported 2011 global revenues of EUR 9.7 billion. Together with its clients, Capgemini creates and delivers business and technology solutions that fit their needs and drive the results they want. A deeply multicultural organization, Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore®, its worldwide delivery model.

Rightshore® is a trademark belonging to Capgemini