Presentation given at SEPG Europe 2013 in Amsterdam, organised by the CMMI Institute. This presentation gives valuable lessons that can be applied by any organisation that wants to improve its processes.
Recommendations to Avoid Problems and Difficulties in Implementing CMMI® High Maturity Levels
Isabel Lopes Margarido, Faculty of Engineering, University of Porto
Raul Moreira Vidal FEUP
Marco Vieira FCTUC/CISUC
SEPG Europe 2013: 15th of November | Amsterdam, Netherlands
João Pascoal Faria FEUP/INESC TEC
agenda
introduction
methodology
problems and recommendations
conclusions
Campo, SEPG EU 2011
objectives
understand problems and difficulties
learn solutions to avoid them
gain additional knowledge about CMMI
motivation
organisations using CMMI typically improve their performance
yet many programs failed in CMMI level 5 organisations
the SEI was concerned about high maturity
high maturity levels dependencies
dependent on lower maturity levels
Maturity Level (ML) 2: M&A
building blocks for ML4: QPM, OPP (PPM, PPB)
knowledge base for quantitative continuous improvement at ML5 (CAR, OPM)
case studies
3 case studies appraised at CMMI-DEV ML 5:
2 organisations and 1 business unit
real problems and difficulties from industry
documents, tools, interviews
how CMMI implementation was conducted and the processes were defined
what processes, metrics and tools were developed
how people were actually using them
analysis
analysed an SEI report (McCurley and Goldenson 2010)
verified which problems were common to the case studies, literature and SEI survey
verified which of the recommendations were supported by literature and the SEI survey
problems (P) and recommendations (R): entry conditions
P1: avoid underestimation
R1: plan: maturing levels, analysing and understanding HML, building and maturing PPB and PPM
P3: understand the quantitative nature of level 4
R3: involve a statistician
R4: use Six Sigma
R5: review goals
P2: don’t start building the house from the roof
R2: ML 2 and 3 need to mature
process definition and implementation
P4: copied processes
R6: reflect organisation culture
R7: involve experts and process users
P5: multicultural environments
R8: share processes, lessons learnt
P6: imposed processes
R8 and R9: goals specific to business units, related to the organisation’s business goals
R10: indicators at different report levels
process definition and implementation
P7: dissemination problems
R11: commitment from the entire organisation
R12: training contents, specialised training
R13: coaching and monitoring
P8: lack of institutionalisation
R14: top management set goals for gradual institutionalisation, monitor and reward
R13 and R15: give time for metrics and processes to mature
P9: meaningless uncorrelated metrics
P10: metrics definition (collect and analyse data)
R16: use Goal Question Metric (GQM) or equivalent
R17: unambiguous, repeatable, understandable
R18: size metrics according to the work product
R19: interpret in context
R20: variables normalisation (sketched below)
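To make R17-R20 concrete, here is a minimal sketch of a size-normalised derived measure: two base measures (defects found, work-product size) are combined into a defect density that is comparable across work products. The field names and the KLOC size unit are illustrative assumptions, not the studied organisations' actual metric definitions.

```python
# Illustrative sketch only (assumed field names and KLOC size unit):
# an unambiguous, repeatable derived measure obtained by normalising a base
# measure with the size of the work product (R17, R18, R20).

def defect_density(defects_found: int, size_kloc: float) -> float:
    """Derived measure: defects per thousand lines of code."""
    if size_kloc <= 0:
        raise ValueError("work-product size must be positive")
    return defects_found / size_kloc

# Interpret in context (R19): compare only similar work products.
inspections = [
    {"component": "billing", "defects": 12, "size_kloc": 3.0},
    {"component": "reports", "defects": 4, "size_kloc": 1.6},
]
for record in inspections:
    print(record["component"], round(defect_density(record["defects"], record["size_kloc"]), 2))
```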
metrics definition
P11: first data uncorrelated
P12: metrics categorisation
P13: baselines not applicable
R21: several cycles
R22: let PPM and PPB become stable
R23: categorise data
R24: aggregate normalised data (see the sketch below)
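A hedged sketch of the idea behind R21-R24: once data are normalised, categorise the points (for example by project type) and aggregate them into a simple per-category baseline, letting the PPB stabilise over several cycles before relying on it. The categories and values below are invented for illustration.

```python
# Illustrative sketch (invented categories and values): categorise normalised
# data points (R23) and aggregate them into a per-category baseline (R24).
# A real PPB needs several measurement cycles to become stable (R21, R22).
from collections import defaultdict
from statistics import mean, stdev

# (category, normalised measure) pairs, e.g. defect density by project type
data = [
    ("maintenance", 2.1), ("maintenance", 1.8), ("maintenance", 2.4),
    ("new_development", 3.9), ("new_development", 4.4), ("new_development", 3.5),
]

by_category = defaultdict(list)
for category, value in data:
    by_category[category].append(value)

baseline = {
    category: {"mean": mean(values), "stdev": stdev(values), "n": len(values)}
    for category, values in by_category.items()
}
print(baseline)
```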
P14: improper elimination of outliers
R25: quarantine outliers instead of deleting them (see the sketch below)
R26: maintain data points that are unique but recurrent
P15: not all projects are measurable
R27: normalise specific base measures to obtain derived measures; see R14
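A minimal sketch (my illustration, not the authors' tooling) of R25 and R26: screen new points against an established baseline and move suspect ones to a quarantine list for investigation rather than erasing them; points that look unusual but keep recurring may be real process behaviour. The data and the 3-sigma limits are assumptions.

```python
# Illustrative sketch (assumed data and 3-sigma limits): screen new points
# against an existing baseline and quarantine the suspect ones (R25) instead
# of erasing them; recurring "outliers" may be real behaviour (R26).
from statistics import mean, stdev

baseline_values = [2.0, 2.3, 1.9, 2.2, 2.1, 2.0, 2.4, 2.2]
centre = mean(baseline_values)
sigma = stdev(baseline_values)
lower, upper = centre - 3 * sigma, centre + 3 * sigma

new_points = [2.1, 6.8, 2.3]
accepted = [v for v in new_points if lower <= v <= upper]
quarantined = [v for v in new_points if not (lower <= v <= upper)]  # kept for investigation

print("accepted:", accepted)
print("quarantined:", quarantined)
```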
metrics usage
P16: effort estimates
R28: use expert judgment when needed
R29: use any related historical data (sketched below)
R30: iterative planning, real-time sampling
P17: people behaviour
R12, R13 and R31: personal data not used to evaluate people
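As a hedged illustration of R28-R30 above: when a project has little data of its own, an analogy-based estimate can be drawn from whatever related history exists and refined iteratively as actuals arrive, with expert judgment adjusting the result. The productivity figures and the function-point size unit are invented assumptions.

```python
# Illustrative sketch (invented numbers): analogy-based effort estimate from
# related historical data (R29), refined iteratively as actuals arrive (R30);
# expert judgment (R28) adjusts the result when history is thin.
from statistics import median

# productivity of similar past projects, in hours per function point
history_hours_per_fp = [8.2, 7.5, 9.1, 8.8]

def estimate_effort(function_points: float, history: list) -> float:
    """Size multiplied by the median historical productivity."""
    return function_points * median(history)

print(round(estimate_effort(120, history_hours_per_fp), 1), "hours")
```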
P18: tools setup
R32: give time for tool setup
R33: do not use data collected while tool defects affect the metrics
P19: overhead
P20: tools requirements
R34: only collect necessary data
R35: automated and unobtrusive data collection
R36: discipline people, change mentality
problems analysis
59% found in the case studies, literature and survey
several problems shared between the studied organisations
37.5% found in literature were found in the case studies
53.3% found in the survey were found in the case studies
16.7% also found in literature
“the percentage of problems shared in more than one organisation/source indicates that they can occur when implementing HML, so organisations should be aware of them.” (Lopes Margarido et al., December 2013, SQP)
summary and future research
wide variety of implementation methods
variance of performance results
SCAMPI:
evaluates a sample
does not evaluate performance
future research
framework to evaluate quality of implementation of CMMI practices
definition published in PROFES 2012
think about...
questions
http://paginas.fe.up.pt/~pro09003/
partially sponsored by:
references
C. Hollenbach and D. Smith, “A portrait of a CMMI level 4 effort,” Systems Engineering, 2002, 52-61.
J. McCurley and D. R. Goldenson, “Performance Effects of Measurement and Analysis: Perspectives from CMMI High Maturity Organizations and Appraisers,” CMU/SEI, 2010.
Lopes Margarido et al., “Lessons Learnt in the Implementation of CMMI® Maturity Level 5,” presented at QUATIC, Lisbon, Portugal, 2013.
Lopes Margarido et al., “Towards a Framework to Evaluate and Improve the Quality of Implementation of CMMI® Practices,” presented at PROFES, Madrid, Spain, 2012.
M. Campo, “Why Maturity Level 5?,” Crosstalk, January/February 2012, 15-18.
M. Schaeffer, “DoD Systems Engineering and CMMI,” presented at the CMMI Technology Conference and User Group, 2004.
P. Leeson, “Why the CMMI® does not work,” presented at SEPG Europe, Prague, Czech Republic, 2009.
R. Radice, “Statistical Process Control in Level 4 and Level 5 Software Organizations Worldwide,” presented at the Software Technology Conference, 2000.
acronyms
CAR – Causal Analysis and Resolution
CMMI – Capability Maturity Model Integration
FEUP – Faculty of Engineering, University of Porto
HML – High Maturity Levels
M&A – Measurement and Analysis
ML – Maturity Level
OPM – Organisational Performance Management
OPP – Organisational Process Performance
P – Problem
PPB – Process Performance Baselines
PPM – Process Performance Models
QPM – Quantitative Project Management
R - Recommendation
SCAMPI – Standard CMMI Appraisal Method for Process Improvement
SEI – Software Engineering Institute
SQP – Software Quality Professional