

Sponsored by the U.S. Army Evaluation Center and G6 Spring 2002 http://www.armysoftwaremetrics.org Volume 6, Number 1

Army Metrics

Tailoring and Implementing an Organizational Measurement Process

In February 2002, the Software Enterprise of TACOM-ARDEC,* Picatinny Arsenal, New Jersey, became the first U.S. government organization to achieve a formal Level 3 assessment under Capability Maturity Model Integration (CMMI). This was the culmination of several years of organizational process improvement activities. As part of the process improvement project, the organization began collecting measurement data in July 2001. Data collection was in fact the capstone of an extensive effort to define and implement an organizational measurement process. The major challenge in establishing this process was to implement a new way of doing business that used objective, quantitative data to support management decisions at all levels of the organization. The new business procedures demanded cultural change, requiring all members of the organization to collect, report, and assess data for defined organizational measures. The underlying, critical task was to define a set of measures that not only addressed the management goals of TACOM-ARDEC, but also supported the information needs of fourteen diverse projects within the Software Enterprise.

* Tank, Automotive, and Armament Command – Armament Research and Development Engineering Center

Background

The environment of the TACOM-ARDEC Software Enterprise is dynamic, continually adapting to the changing business opportunities in the Army research and acquisition communities. Projects are created and terminated to meet the evolving technical needs of Army customers. To perform effectively in a dynamic environment, the Software Enterprise must continually collect, analyze, and use measurement data to support various management decisions.

The Software Enterprise consists of fourteen diverse projects. Customer needs determine which engineering processes these projects employ, such as systems engineering, software engineering, acquisition support services, or a mix of all three. Each project is assigned to one of three categories:

• Mission Support - Development or maintenance of software products

• Acquisition Services - Independent Verification & Validation, testing, or project management support for Army program offices

• Internal Support - Quality assurance, configuration management, and various services for other Software Enterprise projects

The size of the TACOM-ARDEC projects ranges from a single employee to 40 employees, including contractor support.

In the interview that follows, Cheryl Jones, head of the Software Enterprise performance management project, describes how the Software Enterprise structured the overall organizational measurement process and defined a standard set of organizational measures. She also explains some of the challenges, the lessons learned, and how the organization continues to monitor and respond to its measurement needs.

Where did you start in planning such a challenging measurement program?

The objective of the measurement program is to improve the overall performance of the Software Enterprise, so we started at the top level of the organization. The first step was to define the information needs of the organization.


About Insight

Insight is the newsletter of the U.S. Army Software Metrics Office. The newsletter is intended to provide information on efforts to measure software. Readers are encouraged to submit articles for the newsletter that address any facts, opinions, or points of interest.

Letters and articles for publication should be addressed to: Insight Editor, U.S. Army Software Metrics Office, ATTN: CSTE-AEC-MA, Park Center IV, 4501 Ford Avenue, Alexandria, VA 22302-1458.

Insight is mailed upon request. Past issues are also available. Please indicate the volume/number and date of the issue(s) you would like to receive. Call (703) 681-3823 or visit http://www.ArmySoftwareMetrics.org.

Trademarks and Endorsements

The Army Software Metrics Office is sponsored by the U.S. Army Evaluation Center and G6.

All product names referenced in this issue are trademarks of their respective companies. The mention of a product or a business in Insight does not constitute an endorsement by the Army Software Metrics Office or the Department of Defense. The opinions expressed represent the viewpoints of the authors and are not necessarily those of the Department of Defense.


We then needed to identify a common set of measures that would be reported by all fourteen projects within the organization.

However, like most technology-based organizations, the Software Enterprise is a composite of many types of projects. It was difficult to define a set of common organizational measures without violating the first principle of the Practical Software and Systems Measurement (PSM) process: “The project information needs and objectives shall drive the measurement requirements.” Therefore, all of our decisions in defining a top-down organizational measurement process were evaluated for their impact at the project level.

We wanted to ensure that the organizational measurement process provided each project with the necessary data to make informed project management decisions. In an optimum measurement process, the same data can be used to support the information needs of higher management levels. Based on these concepts, we continued our planning process with these two steps:

• Define the information needs of the projects.

• Identify project measures that address both the organizational and project-specific information needs.

How did you structure the organizational measurement process?

The purpose of our measurement process is to define and improve the collective performance of all activities in the Software Enterprise. To accomplish this, we designed measures to monitor two levels of performance: the Software Enterprise as a whole and the individual projects. Organizational measures monitor the overall performance of the Software Enterprise, and project-level measures address the fourteen individual projects.

Software Enterprise managers defined the organizational information needs and the six common measures that were documented in the Organizational Measurement Plan (see Table 1). These common measures were selected to monitor the combined performance of all projects within the Software Enterprise. The difficulty in collecting and reporting these measures varied with each project. The common organizational measures are closely related to the information needs and measures of a software development process, so the projects in the Mission Support category had the least difficulty in reporting the measures because these projects develop or maintain software products. However, many of the projects in the Acquisition Services and Internal Support categories had different information needs.

We structured the organizational measurement process to collect its input, or base measures, from two sources:

• Software Enterprise activities that were common to all projects, including training, customer satisfaction, and overall financial performance

• Project management activities that define the individual information needs

The Software Enterprise uses the information gained from the organizational measurement efforts to support Software Enterprise and project management activities, including:

• Process Performance - Provide feedback on the organization’s ability to meet defined objectives.

• Process Improvement - Identify those process areas that need improvement, define requirements for training, and identify resources required to support process improvement.

• Customer Satisfaction - Evaluate customer satisfaction with the services and products provided by the Software Enterprise.

• Business Cases - Develop additional business cases to acquire future work.


• Project Estimation and Planning - Organizational and other historical data will be used as a basis for estimation during future project planning.

• Project Monitoring and Control - Ensure that projects are completing defined tasks within budget and on schedule, while meeting performance and quality objectives.

Project leaders define their own project-level measurements and document them in their Project Measurement Plan. The plan must address the common organizational measures, as well as any additional measures based on project-specific information needs. Project leaders are responsible for collecting their project data and delivering the six organizational measures to the organizational measurement database for aggregation and analysis. The key to success is to define organizational measures that are useful at the project level, as well as the organizational level.
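
As a rough illustration of what that aggregation step involves, the sketch below rolls a single common measure up from per-project submissions into an organizational total. The measure name, field names, and values are invented for illustration; they are not the actual TACOM-ARDEC measure definitions or repository logic.

```python
# Hypothetical rollup of one common measure across project submissions.
# Names and numbers are invented; this is a sketch, not the actual
# Software Enterprise repository.

def aggregate_measure(submissions, measure):
    """Sum one base measure across all project submissions,
    skipping projects that have not yet reported it."""
    return sum(s[measure] for s in submissions if measure in s)

submissions = [
    {"project": "A", "effort_hours": 320},
    {"project": "B", "effort_hours": 180},
    {"project": "C"},  # no data reported yet
]

total = aggregate_measure(submissions, "effort_hours")
print(total)  # 500
```

A real repository would also validate units and reporting periods before summing; the point here is only that each project delivers the same named base measures, which the organization can then combine.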

Managers evaluate data on the performance of individual projects at Senior Management Reviews (SMRs) that are held every quarter. The SMRs also support two secondary activities:

• Process Improvement - The measures are used to report the process maturity of individual projects. This data is used to identify key process areas that need improvement, define requirements for training, and identify project resource and staffing requirements. The measures also provide an opportunity to disseminate lessons learned among other projects.

• Project Planning - The measures are entered into the organizational measurement data repository. The Software Enterprise will use this data and any other historical data as a basis for future project planning.

Software Enterprise managers review the organizational measurement needs and processes at least every six months. Periodic reviews of individual measurement processes are also scheduled with the individual projects. These reviews ensure that the measurement processes conform to plans and that project-level performance measurements are integrated with business-area performance requirements.

What guidance did you use to plan the organizational measurement process?

We followed three process guidelines: ISO/IEC 15939, CMMI, and PSM.

ISO/IEC 15939, Software Measurement Process, is the international standard that defines the required tasks and outcomes of a measurement program.

The CMMI Measurement and Analysis process was used to plan the process improvement effort and provided the baseline for evaluating the measurement process. The CMMI was selected because it provides a single model that integrates the various technical activities of systems engineering and software engineering. The CMMI concepts were extrapolated to provide a process model that could be used to evaluate the Acquisition Support projects in the Software Enterprise. (For more information, see the CMMI article on page 11 of this issue of Insight or visit www.sei.cmu.edu/cmmi/.)

The PSM Guidebook provided the detailed “how-to” instructions for implementing an effective measurement process. The PSM process allows managers to tailor their measurement process to the appropriate information needs and characteristics of their activity. The nine principles defined in the PSM Guidebook were the foundation for all of the Software Enterprise measurement efforts (see Table 2).

How did you select the organizational measures?

We used the PSM process. PSM ensures that an iterative measurement program is tailored to the characteristics and information needs of both the individual projects and the Software Enterprise as a whole. The PSM process was originally developed to describe the measurement process for a single project, so of course it was directly useful in defining project-specific measures. The challenge was to apply the PSM process to an organizational measurement program that spans fourteen projects.

The primary activities of the PSM measurement process (shown in Figure 1) are:

1. Obtain and Sustain Commitment by gaining organizational support for measurement and analysis and by continually addressing and updating the information needs.

2. Plan Measurement by identifying and prioritizing the organizational and project information needs, selecting measures that address these information needs, and integrating measurement into existing management and technical processes.

Table 1. All projects report data for a set of six common organizational measures.

3. Perform Measurement by collecting and processing the measurement data, analyzing the data, and converting the data into management information for appropriate courses of action at both the project and Software Enterprise levels.

4. Evaluate Measurement by continually assessing and improving the selected measures, the indicators, and the collection and analysis process.

5. Integrate the Measurement Process by establishing measurement activities as standard steps in the organization’s processes. This activity was essential to the success of the Software Enterprise organizational measurement process.

How did you tailor your organizational measures to the needs of the Software Enterprise?

In the PSM approach, all measures are driven by the organization’s information needs. We followed this approach by first identifying the information needs that our measurement process would address. Software Enterprise executives participated in a March 2001 workshop to identify and prioritize the top-level information needs for our organization. Table 3 outlines the information needs they identified.

Based on these information needs, they identified a set of common organizational measures that every project should collect. Participants worked with an Executive Measurement Planning Template to identify and rank candidates for organizational measures, completing the following categories for each nominated measure:

• Business Goals

• Software Process (Strategy) Goals

• Relative Importance

• Indicator to Support Analysis

• Base Measures

• Availability of the Base Measures in the Existing Process

The end product was an Organizational Measurement Plan that describes twenty-three selected organizational measures, the organizational measurement repository, and the procedure for incorporating project data into that repository. Although twenty-three organizational measures were identified, only six were considered common to the information needs of all fourteen projects. These six common organizational measures were slated for collection (to the extent applicable) by all fourteen projects. The other seventeen measures were identified for reporting only if they provide information that could improve either the project’s or the organization’s defined processes.

Table 2. The nine PSM measurement principles ensure that the information needs of the specific projects and the organization are addressed.

Figure 1. The Software Enterprise used the PSM process to design an iterative measurement program that is tailored to the characteristics and information needs of both the Software Enterprise and the individual projects within it.

The workshop participants also established process implementation goals, such as providing tools and software to assist with establishing and formalizing the measurement process.

How did you tailor your organizational measures to the needs of the projects?

TACOM-ARDEC requires project managers to define and tailor a set of issue-driven measures to manage their process. However, identifying information needs has been one of the most difficult tasks for the project measurement programs. Managers of the smaller projects usually have not implemented a formal measurement and analysis process, so, for them, PSM is a new way of doing business. A major difficulty for these projects was explaining their information needs in enough detail to select appropriate measures.

To strengthen this process, Software Enterprise Measurement and Analysis leads also attended a series of measurement planning workshops to identify project-specific information needs and to select appropriate measures based on both project-specific needs and the organizational measures. During the workshops, project leaders completed a spreadsheet that identified each information need as it related to key processes and products. They also reviewed sample rationales for measurement, including:

• To provide feedback on high-priority, project-specific information needs

• To meet organizational requirements, satisfy customer goals, or provide feedback on a process element or product

• To provide feedback on a process improvement area

Project leaders were then responsible for identifying their project-specific measures, the organizational measures applicable to their project, and their use of the organizational measures. Projects are allowed to tailor the organizational measures to meet their specific requirements, but each project should develop one list of measures, rather than separate lists of project and organizational measures.

Table 3. Software Enterprise managers identified these organizational information needs during planning workshops, and then selected measures to support them.


What other kinds of guidance have you provided to the projects?

We work closely with the projects to help guide measurement planning as well as the data collection process. We often focus on the following areas:

• Measurement should be a daily activity of all team members, not the sole responsibility of a single “measurement analyst.”

• Measurement specifications and measurement plans serve different functions. Measurement specifications represent the technical description of the data, or “what” needs to be collected; the measurement plan specifies details on “how.”

• The objectives and tasks of each project must be clearly defined. We created a common Work Breakdown Structure for the Software Enterprise to help projects identify their tasks, especially the Acquisition Services projects.

• Projects must understand the difference between “process” and “products” and define appropriate measures.

• Specific measures may be used for more than one information need, but the projects must differentiate between the information that is obtained to address each information need.

It’s also important to address how measurement is integrated into each project’s overall process, especially regarding risk management and estimation procedures. Many projects were missing a link to the risk measures. Any measures that are identified as part of the risk profile (measures and thresholds) must be specified as part of the measurement plan.

Launching a measurement program often changes existing procedures, causing cultural impact and opposition. How did you address this?

Implementing any new process changes the way people conduct their day-to-day activities. The challenge is to overcome the personal resistance that is usually inherent in change. The Software Enterprise used two guidelines to minimize personal resistance: start small and gain the support of the project personnel.

PSM guidance clearly states that all measurement processes must “start small.” A small initial effort has the least impact on the resources and workload of project personnel. So we’ve used an incremental approach to implement our organizational measurement process. Right now, in the first phase, all projects are required to collect a small set of six measures. Projects are also encouraged to start small with their own list of additional project measures. If a project has an extensive list of project measures, it is encouraged to prioritize the measures for incremental implementation.

During the planning phase, we evaluated how the measurement process would impact cost, schedule, and personnel. The potential resource impact was monitored at each step of the planning process, from specification of the organizational measures, through the early draft plans, to the first implementation of data collection. The impact was minimized by a step-by-step approach in which all participants learned from early mistakes and improved their practices. For example, the initial data collection effort allowed the project representatives to deliver any data that was currently available. In subsequent data deliveries, Software Enterprise representatives worked with the projects to refine and standardize the data to conform to the six common base measures.

A major difficulty in minimizing the size of the organizational measurement program was to define a single, concise set of measures that could address the information needs of all fourteen projects and higher management levels in the organization. The underlying problem is that the objectives of different groups in the same organization are not the same and may often be contradictory.

For example, every organization measures schedule. However, the underlying data and the importance of schedule measures will vary within the organization. Most project managers are primarily concerned with developing a product that will meet the stated requirements for functional capability and reliability. The importance of schedule and cost objectives is determined by other sources, such as the customer and senior managers. Senior managers in the organization may be primarily concerned with estimating the length of time until delivery of the next major software product. The marketing or business manager may be concerned with the time it takes to market a new capability and the impact of a possible delay on market share. A process manager may be concerned with an overall increase or decrease in the average software development time and its impacts on other processes in the organization. The organizational measurement process must select a common schedule measure that will address all of these information needs.
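
To make the schedule example concrete, the sketch below shows how one base measure, planned versus actual milestone dates, can feed different indicators for different stakeholders. The milestone names and dates are invented for illustration; they are not Software Enterprise data.

```python
from datetime import date

# One schedule base measure -- planned vs. actual milestone dates --
# serving several stakeholder views. All names and dates are invented.
milestones = [
    {"name": "design", "planned": date(2001, 9, 1), "actual": date(2001, 9, 15)},
    {"name": "code", "planned": date(2001, 11, 1), "actual": date(2001, 11, 8)},
]

# Project-manager view: slip per milestone, in days.
slips = {m["name"]: (m["actual"] - m["planned"]).days for m in milestones}

# Senior-manager view: cumulative slip across all milestones to date.
total_slip = sum(slips.values())

print(slips)       # {'design': 14, 'code': 7}
print(total_slip)  # 21
```

A process manager could compute yet another indicator, such as average slip per milestone, from the same base measure, which is the point of collecting one common schedule measure rather than stakeholder-specific ones.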

Efforts to gain the support of the project personnel focused on giving participants an understanding of the measurement process and the potential benefits to their projects. During the months of planning, formal training programs were conducted for both the PSM and CMMI processes, as well as measurement planning workshops that helped project representatives identify effective information needs and measures. These workshops ensured that project personnel were part of the measurement planning process. Then, during the early stages of implementation, Software Enterprise representatives provided one-on-one coaching to project personnel.

The measurement process will become increasingly successful as project managers experience the benefits firsthand, when the initial set of measures is used more extensively to support the project decision-making process.


What tools do you use to support the actual collection of measurement data?

The PSM Insight tool was selected to support the centralized data management effort of the Software Enterprise. PSM Insight is a desktop tool that the Army Software Metrics Office developed to support the PSM process. It is a Windows-based program written in Borland’s Delphi, Version 2.0, combining a familiar, easy-to-use Windows interface with Delphi’s data management capabilities. This combination provides a novice user with a high level of flexibility in data management, including data modification, data browsing, and sophisticated graphing capabilities. It can be used at each individual’s desktop, and training is available in a program that can be downloaded from the web at www.psmsc.com.

PSM Insight supports many of the objectives of the Software Enterprise measurement process, including:

• Allowing project personnel to define and collect their own data for the six organizational measures and their project measures. PSM Insight was tailored to the defined organizational and project-specific measures. Templates were defined, along with valid entries for data validation.

• Minimizing the size and cost of the measurement effort by allowing the projects to work with any data that may be available in existing project management processes. Many projects already had cost, schedule, quality, and other records that had been collected in Excel, Access, Microsoft Project, or other formats. PSM Insight provides the capability to automatically access these databases with dissimilar data formats, import the data, and normalize it to support the six common organizational measures. The PSM Insight electronic import module handles ASCII-delimited files precisely and imports “comment” data when needed.

• Allowing project personnel to make mistakes while working with their data and learning the new process. PSM Insight responds to missing or potentially corrupt data with a warning to the user. Also, because missing values can lead to misinterpreting a graph, PSM Insight handles blank or missing values and reports them in indicator legends.

• Providing project personnel the ability to calculate derived measures from their existing base measures. This feature allows the collected data elements to be processed with a user-defined formula and displayed as derived measures in report formats.
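
The derived-measure idea can be pictured roughly as follows: a user-supplied formula is applied to collected base measures to produce a new value for reporting. The field names, formula, and values here are illustrative assumptions, not PSM Insight’s actual interface.

```python
# Sketch of computing a derived measure from base measures with a
# user-defined formula. Names and numbers are invented for illustration.

def derive(record, formula):
    """Apply a user-supplied formula to one record of base measures."""
    return formula(record)

record = {"defects_found": 12, "ksloc": 4.0}

# Example user-defined formula: defect density (defects per KSLOC).
defect_density = derive(record, lambda r: r["defects_found"] / r["ksloc"])
print(defect_density)  # 3.0
```

The value of this design is that the tool only needs to store base measures; any number of indicators can be defined later without changing what the projects collect.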

What analyses are you doing on the data?

PSM Insight supports data analysis by project personnel by providing a series of pre-defined information needs, measures, and indicators that can be tailored to support analysis of the six organizational measures and unique project measures. The PSM Insight tool can query data for specific information and generate tailored graphics and reports.

The Software Enterprise currently uses three types of regular reports:

• Level 1 reports provide a combination of all data that has been reported by the projects. This report addresses organizational information needs.

• Level 2 reports provide the first-level managers in the Software Enterprise with the data that has been reported by each project. These reports allow the organizational managers to identify potential problems with a project if the data exceeds the quantitative decision criteria that are established for each of the six common measures in the Organizational Measurement Plan.

• Level 3 reports provide only the measurement indicators to the Measurement and Analysis process lead within each project. The Measurement and Analysis lead uses these indicators for project analysis and for reporting at the SMR and other project briefings.
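
The Level 2 decision-criteria check can be sketched as a simple threshold comparison. The measure names, thresholds, and reported values below are invented, not the actual criteria from the Organizational Measurement Plan.

```python
# Sketch of a Level 2 report check: flag any project whose reported
# value exceeds the decision criterion set for a common measure.
# All thresholds and data are invented for illustration.

def flag_projects(reports, thresholds):
    """Return (project, measure) pairs that exceed their thresholds."""
    flags = []
    for project, values in reports.items():
        for measure, value in values.items():
            limit = thresholds.get(measure)
            if limit is not None and value > limit:
                flags.append((project, measure))
    return flags

thresholds = {"cost_variance_pct": 10, "open_defects": 25}
reports = {
    "Project A": {"cost_variance_pct": 4, "open_defects": 30},
    "Project B": {"cost_variance_pct": 12, "open_defects": 8},
}

print(flag_projects(reports, thresholds))
# [('Project A', 'open_defects'), ('Project B', 'cost_variance_pct')]
```

In practice such criteria would also distinguish direction (over budget versus under budget) and severity bands, but the flag-on-exceedance pattern is the core of the check described above.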

What benefits have you received so far?

The Software Enterprise has already received benefits from the analysis of organizational data. They include:

• Better definition of the tasks performed in each project’s processes. This benefit was first realized in the measurement planning process and verified as data was reported.

• Early and improved visibility into the performance of each project.

• Improved communication between Software Enterprise managers and project personnel. The data results help to focus attention on the important information needs of the projects and the overall organization.

• A baseline of actual data to improve the accuracy of estimates for the cost, schedule, and performance of future projects.

Cheryl Jones is a software engineer in the Software-Intensive Systems Evaluation business area at TACOM-ARDEC in Picatinny Arsenal, New Jersey. She is the lead for the performance management project and is responsible for measurement and analysis, risk management, and estimation within the Software Enterprise. Ms. Jones is also the project manager of Practical Software and Systems Measurement (PSM) and one of the authors of Practical Software Measurement: Objective Information for Decision Makers. She is a technical expert to the U.S. Technical Advisory Group to International Standards Organization SC7, Software Engineering, and serves as the U.S. Head of Delegation to Working Group 13, Software Measurement Progress, which is developing ISO/IEC 15939. Ms. Jones holds a B.S. degree in Computer Science/Mathematics from the University of Georgia and an MBA in Management Information Systems from the University of Rhode Island.

The mention of a product or a business in Insight does not constitute an endorsement by the Army Software Metrics Office.


OPM Defined

Organizational performance measurement (OPM) can be defined in general terms as the quantitative characterization of an organization's accomplishment of some aspect of its goals. This definition warrants a closer look. Note that the focus is on the organization or enterprise, rather than on a specific project or program. "Quantitative" indicates that the characterization is more discriminating and informative than the flat alternatives of success/failure or yes/no answers. Established goals are an indispensable part of the definition; measurement is meaningful only if the organization has a reference point for comparison and judgment. Also, the organization must realize that, although measurement addresses some aspect of the organization's goals, performance is multi-dimensional and the aspect to measure is not always obvious.

The ultimate objective of OPM is to improve organizational performance. Organizational performance is determined by the wise selection of projects that the organization will undertake, as well as the level of quality that is achieved in executing them. Successful OPM supports this process by evaluating prospective projects against the organization's goals and abilities, then setting and tracking success criteria for project execution. Richard Lynch and Kelvin Cross, authors of Measure Up: Yardsticks for Continuous Improvement, summarize it this way: "The purpose of performance measurement … is to motivate behavior leading to continuous improvement in customer satisfaction, flexibility, and productivity."1 In other words, OPM must lead to management action; otherwise, the measurement process provides no return on the investment.

Government Mandates for Performance Measurement

Both commercial and government organizations can implement the same OPM processes. In fact, the Government Performance and Results Act (GPRA) of 1993 mandated OPM for government organizations. Measurement guidelines from the General Accounting Office (GAO) and directives from the Chief Information Officers of many agencies have been established to comply with the GPRA.

The GPRA was enacted in response to several Congressional findings on poor performance in federal organizations, including:

• Waste and inefficiency in federal programs that undermine public confidence and reduce the government's ability to address vital public needs

• Insufficient articulation of program goals by federal managers, and inadequate information on program performance

• Lack of information on program performance and results to support Congressional policymaking, spending decisions, and program oversight

As a result, the GPRA aims to improve federal program effectiveness and public accountability by promoting a new focus on results, quality of service, and customer satisfaction. The GPRA requires most federal agencies to submit a strategic plan, including performance goals. They must "express such goals in an objective, quantifiable, and measurable form" and "establish performance indicators to be used in measuring or assessing the relevant outputs, service levels, and outcomes of each program activity." The agencies must "provide a basis for comparing actual program results with the established performance goals; and … describe the means to be used to verify and validate measured values." Beginning in 2000, the agencies must submit the results of these efforts in an annual performance report to Congress.2

A Seven-Step Method for Organizational Performance Measurement

The U.S. General Services Administration has outlined eight steps for developing and using OPM effectively.3 Adapting these steps to software organizations results in the following seven steps:

1. Link engineering processes to organizational goals and objectives.

2. Develop performance measures.

3. Develop data collection, analysis, and reporting procedures.

4. Integrate use of performance measures with management processes.

5. Establish a baseline against which to compare future performance.

6. Collect, analyze, and report results.

7. Evaluate the utility of performance measures and improve.

This article focuses on steps 1, 2, and 4. (These three steps were selected because they were the major tasks of the TACOM-ARDEC Software Enterprise organizational measurement effort, featured on page 1 of this issue of Insight.)

An Introduction to Organizational Performance Measurement
By Dave Zubrow, Software Engineering Institute (SEI), Carnegie Mellon University

Organizations in increasingly competitive environments must seek ways to understand and improve their performance. To do this, they must address the challenging tasks of defining organizational performance goals, selecting appropriate measures, and implementing measurement at the organizational level. These tasks are challenging because little guidance is available for implementing organizational-level measurement. Most attention has been given to measurement below the organizational level, at the project, process, and program levels. In this article, we'll take a quick look at a seven-step method for defining organizational performance measures and provide some tips for implementing OPM.

Step 1. Link Engineering Processes to Organizational Goals and Objectives.

Figure 1 diagrams the process of linking engineering processes to organizational goals. Note that the "Business Strategy Goals" in this figure are the equivalent of the "Information Needs" in the overall ISO/IEC 15939 measurement process, shown in Figure 1 of the Software Enterprise article, on page 5 of this issue. As noted earlier, organizational goals should be characterized in terms of outcomes and results. In this step, we want to understand how the software engineering processes contribute to product or system development efforts that, in turn, are used to perform a mission. For managers at the organizational or enterprise level, such an understanding is vital.

Managers and measurement analysts need to work together to identify the critical development processes that impact organizational goals. Some goals will be more influenced by software development than others. These critical few should receive priority for measurement. Measurement analysts cannot do this alone. They cannot be responsible for setting the goals or the priorities for measurement. It is management's responsibility to define the information they want and need, as illustrated in the ISO 15939 measurement process.

Organizational performance is characterized by the outcomes or results that the organization achieves. This suggests that performance measures should be derived from operational definitions of outcomes. When selecting and defining performance measures, preference should be given to outcome measures. Outcome measures can characterize how a system affects a user's capability to perform a mission, the ability of a program to meet its performance targets, or the extent to which a service unit has improved customer satisfaction. A key element from a measurement perspective is that actually collecting the desired measures usually means getting the data from outside of the development or service organization. Data must be collected from the field during operational use or from customers. This contrasts with other types of measures, such as output, process, and input measures, that have been used to characterize performance. None of these capture the outcome or results as well as an outcome measure.

Understanding processes requires knowledge of their efficiency, effectiveness, and the degree to which they are used and followed (compliance). Characterizing process efficiency, effectiveness, and compliance requires measures of the inputs consumed, the outputs produced, the quality of the outputs, and whether or not the process was followed. Recognizing each of these elements is important for OPM, because too often organizations want to measure only their outputs and inputs and then infer the outcomes. These organizations assume that whatever and however products and systems are made, they will create the desired outcomes and results. This is a dangerous assumption. Through analysis, we want to measure both the development processes and the outcomes to better understand how the organization delivers capabilities and value to its customers and users.

The above discussion is silent on what types of goals to select. The "balanced scorecard" concept is a useful tool for selecting goals and linking them to measures.4 The balanced scorecard directs organizations to consider four perspectives that contribute to the organization's outcomes:

• The financial perspective: How do we look to our shareholders or sponsors?

• The customer perspective: How do customers see us?

• The internal business perspective: What must we excel at?

• The innovation and learning perspective: Can we continue to improve and create value through research, development, and infrastructure improvement?

Managers should define each quadrant in the balanced scorecard and address the associated questions. This provides a means for checking the coverage of existing organizational goals.
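As a rough sketch, this coverage check can be automated by tagging each organizational goal with the perspective it serves and looking for empty quadrants. The goals listed below are invented examples, not goals from any actual organization:

```python
# Sketch: checking balanced-scorecard coverage of organizational goals.
# The goals are invented examples for illustration.

PERSPECTIVES = ("financial", "customer", "internal business",
                "innovation and learning")

goals = {
    "Reduce cost per delivered SLOC": "financial",
    "Improve customer satisfaction survey scores": "customer",
    "Achieve CMMI maturity level 3": "internal business",
}

covered = set(goals.values())
for perspective in PERSPECTIVES:
    status = "covered" if perspective in covered else "NO GOALS DEFINED"
    print(f"{perspective}: {status}")
# An empty "innovation and learning" quadrant signals a gap in the goal set.
```

The empty quadrant is the payoff: it shows managers where the existing goal set leaves a perspective unaddressed.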

The balanced scorecard shows that performance is multi-dimensional, and that profit alone does not adequately represent the accomplishments and vitality of the organization. It means characterizing aspects of the organization's internal development processes and correlating these with the outcomes that customers realize by using the products and services. In other words, the organization needs to understand how it delivers value to its customers; understanding this chain of causality is the foundation for selecting meaningful performance measures.

Figure 1. Measurement planners must decompose organizational goals to the associated processes and performance attributes.

Step 2. Develop Performance Measures.

Performance measures need to be developed for each goal. Developing measures requires establishing their operational definitions and providing a mechanism for their collection. When considering alternatives, managers and analysts may quickly realize that it is more difficult to measure outcomes than it is to measure outputs, processes, or inputs. One reason is that the latter are within the scope and under the control of the organization. For that reason, they must be used as organizational performance measures with caution. Consider the following examples.

Example 1

Goal: Improve the nation’s health status.

• Outcomes - trends in health status, infant mortality, trends in prevalence and incidence of diseases

• Outputs - patients treated, population vaccinated, women receiving prenatal care

• Inputs - health care expenditures, per capita ratios

• Process - doctor visits

Example 2

Goal: Improve customer satisfaction.

• Outcomes - trends in customer satisfaction survey data, number of defects reported after release

• Outputs - number of new features released, resolution time for customer service calls

• Inputs - dollars spent on customer service training, dollars spent on quality assurance

• Process - number of work product inspections performed, number of tests performed

Example 3

Goal: Increase market share.

• Outcomes - percentage of target customer base captured, number of competing products displaced

• Outputs - volume of product shipped, number of new features and products released to the market

• Inputs - advertising expenses, research and development expenses

• Process - number of focus groups held, number of sales visits

In each example, notice how the measures become less directly connected to the goal as they move from outcome to output to input and process.
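That ordering can be captured in a small data model that ranks candidate measures by how directly their type connects to the goal. The entries below repeat Example 2; only the ranking machinery is new:

```python
# Sketch: classifying candidate measures by type, ordered from most
# to least directly connected to the goal. Entries follow Example 2
# ("Improve customer satisfaction") above.

TYPE_ORDER = ["outcome", "output", "input", "process"]  # most direct first

measures = [
    ("trends in customer satisfaction survey data", "outcome"),
    ("number of defects reported after release", "outcome"),
    ("number of new features released", "output"),
    ("dollars spent on customer service training", "input"),
    ("number of work product inspections performed", "process"),
]

# Prefer outcome measures when selecting performance measures.
ranked = sorted(measures, key=lambda m: TYPE_ORDER.index(m[1]))
for name, mtype in ranked:
    print(f"{mtype:>8}: {name}")
```

A selection tool built this way would surface outcome measures first, which is exactly the preference Step 2 argues for.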

Step 4. Integrate with Management Processes.

The use of performance measures as a basis for management action must be institutionalized. Managers should perform a regular review of performance measures at meetings and report on the measures to the rest of the organization. Most importantly, they should act on what they find. OPM yields value to the organization only when managers use the measures to decide on continuing a current strategy or altering its course. Many times, measurement activities do not generate any value and are simply a rote activity with no economic justification. This is a waste of the time and effort of those providing the measures, including engineers, project managers, analysts, and others. The steps in the OPM process ensure that the measures can yield value. It is up to the managers to use the measures and realize their value.

In addition to their use with respect to current programs and operations, organizational performance measures should provide input to decision criteria for project selection, as well as input to future operational and strategic planning. For instance, by understanding how current projects and processes have influenced organizational performance, managers are better informed to evaluate and select new projects and initiatives. If the performance measures fail to provide useful information, they should be changed. The right course of action is to make the performance measures useful and valuable. The wrong course is to abandon or ignore them. The bottom line is that OPM must lead to management action or its value is lost, while the costs are still incurred.

Summary

If information is power, then OPM surely is a strategic weapon. Unfortunately, it may be viewed simply as a bureaucratic exercise. This is a shame and a waste. Perhaps this happens because OPM is inextricably entwined with our ability to understand how our organizations work and our ability to align the development processes with our goals. Neither task is trivial. The insight and understanding that can come from well-designed performance measures can greatly aid management in both its current operational responsibilities and its long-term strategic planning.

The author gratefully acknowledges the contributions of Lara Lutz, of Independent Engineering, Inc., and Lauren Heinz, of SEI, to this article. Comments and criticisms, however, should be directed to the author.


1 Lynch, Richard L. and Cross, Kelvin F., Measure Up: Yardsticks for Continuous Improvement. Cambridge, MA: Blackwell Business, 1995.
2 Government Performance and Results Act of 1993, at http://www.whitehouse.gov/omb/mgmt-gpra/gplaw2m.html#h1.
3 General Services Administration, "Performance-Based Management: Eight Steps to Develop and Use IT Performance Measures Effectively," n.d. For more extensive discussion of these steps, see http://www.itpolicy.gsa.gov/mkm/pathways/pp03link.htm.
4 Kaplan, Robert S. and Norton, David P., "The Balanced Scorecard - Measures that Drive Performance," Harvard Business Review, January/February 1992.



The Capability Maturity Model Integration (CMMI): An Increased Focus on Measurement

Background

In August 2000, the Capability Maturity Model® Integration (CMMISM) became the new baseline for process improvement for many industry and government organizations. The CMMI brought order to the confusion caused by the many independent, process-improvement models that had been inspired by the original Capability Maturity Model (CMM), which was developed by the Software Engineering Institute (SEI) of Carnegie Mellon University and released to the public in 1991.

Although the CMM was originally designed for software engineering, its success led to the rapid proliferation of more than 30 CMM-based models for systems engineering, software acquisition, human resource management, and other related disciplines. Most of these models shared common goals, but their content, structure, and terminology varied widely. Often, several of these models were applied within the same organization. This was confusing and expensive. In response, government, industry, and academia teamed up to launch the CMMI Project. The goal was to create a set of integrated base models that could be tailored to various disciplines while maintaining compatible content, terminology, structure, and assessment techniques.

Features & Structure

The CMMI is based on three source models: the CMM for Software V2, draft C (SW-CMM V2C); the EIA Interim Standard 731 (EIA IS 731) System Engineering Capability Model (SECM); and the Integrated Product Development Capability Maturity Model, draft V0.98 (IPD-CMM). The set of integrated models in the current version of the CMMI (V1.1) addresses software engineering, systems engineering, and integrated product and process development. SEI has recently released the CMMI-SE/SW/IPPD/SS, which incorporates supplier sourcing. Details on the CMMI model process areas, goals, and associated practices are available at the SEI web site (www.sei.cmu.edu).

An organization can implement the CMMI using one of two approaches: organizational maturity or process capability. Each approach is supported by a different representation. An organization can work with either representation because the coverage of both is virtually identical.

In the first approach, the organization as a whole is assigned a CMMI maturity level, on a scale of 1 to 5. Each maturity level is characterized by a set of process areas, with corresponding goals. The organization's maturity level is determined by its ability to achieve the goals of the specified process areas. It must then maintain these achievements as it moves on to the next cluster of process areas at the next highest maturity level. Each process area resides at only one maturity level. This is the staged representation of the CMMI (see Figure 1 above and Figure 2 on page 12).

In the second approach, the CMMI rating is based on the organization's various, individual process capabilities, rather than its performance as a whole. The organization works to show increasing levels of sophistication in all process areas throughout the improvement effort. This is the continuous representation (see Figure 3 on page 12).1
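The contrast between the two representations can be sketched as data: in the staged view, a fixed cluster of process areas belongs to each maturity level and the whole organization earns a single rating, while in the continuous view each process area carries its own capability level. The process-area placements below are illustrative only; consult the CMMI models themselves for the actual assignments:

```python
# Sketch of the two CMMI representations. Process-area placements
# are illustrative, not the actual model assignments.

# Staged: each process area resides at exactly one maturity level,
# and the organization as a whole is rated on a 1-5 scale.
STAGED = {
    2: ["Measurement and Analysis", "Requirements Management"],
    3: ["Organizational Process Focus", "Risk Management"],
}

def org_maturity_level(satisfied_areas):
    """Highest maturity level whose process areas (and all below) are satisfied."""
    level = 1
    for ml in sorted(STAGED):
        if all(pa in satisfied_areas for pa in STAGED[ml]):
            level = ml
        else:
            break
    return level

# Continuous: each process area is rated separately on its own
# capability level; there is no single organizational number.
continuous_profile = {
    "Measurement and Analysis": 3,
    "Requirements Management": 2,
    "Risk Management": 1,
}

print(org_maturity_level({"Measurement and Analysis",
                          "Requirements Management"}))  # 2
```

The sketch makes the key structural difference concrete: the staged function collapses achievement into one organizational rating, while the continuous profile is simply a per-process-area table.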

Figure 1. The staged representation of the CMMI rates the overall organization by assigning the appropriate maturity level.

The process areas of both representations contain the same generic goals and practices. For example, some generic goals are to establish an organizational policy, establish a plan to enact the policy, assign responsibility, provide training, perform the process, measure the process, etc. These types of goals and practices apply to all process areas. However, the staged representation also includes numerous goals and practices that are specific to a particular process area.2

The Expanding Role of Measurement

One notable difference between the CMM base models and the CMMI is the introduction of Measurement and Analysis as a new process area. Several factors led to the creation of this new process area.

First, measurement is integral to process improvement. It monitors the organization's progress and helps to direct and justify the improvement program. Measurement is pervasive at all levels of the source CMMs, and it supports at least one generic practice from level 2 to level 5 in the CMMI. In the staged representation, Measurement and Analysis resides at maturity level 2, which is relatively early in the improvement process. The International Standards Organization (ISO) has also emphasized the importance of a measurement process by initiating the standardization effort for measurement, ISO 15939.

Second, the importance and practice of measurement clearly needs to be elevated. Despite its benefits, measurement is one of the most neglected and misused management tools. Historic measurement data is especially vital to efficient process management; organizations that do not implement an early measurement program will face additional struggles later. As one author of the CMMI explained, "Organizations that have achieved the highest CMM ratings have reported … that a clear focus on measurement at lower levels would have saved them significant efforts later on."3 Organizations that turned to the original CMM for a description of good measurement practices could not find the information in a single, obvious location.

Figure 2. In the staged representation, a cluster of process areas is assigned to each of the maturity levels. Each process area resides at only one maturity level.

Figure 3. In the continuous representation, each process area is assigned the appropriate capability level. Each process area is rated separately.

Figure 4. The Measurement and Analysis process area is organized around three goals, each with a corresponding set of practices.

The CMMI introduced the Measurement and Analysis process area to address these needs. Assigning measurement its own process area raises its stature and helps to promote management support. By placing Measurement and Analysis at maturity level 2, organizations will institutionalize their measurement program early in the improvement process and create a useful foundation for future advances. The Measurement and Analysis process area provides clear access to detailed measurement guidance, based on ISO 15939 and the Practical Software and Systems Measurement (PSM) approach.

The Measurement and Analysis process area is structured around three goals, each with a cluster of associated practices, depicted in Figure 4.

The first goal is to Align Measurement and Analysis Activities. As Dave Zubrow of the SEI explains, the aim is to establish the game plan. He writes, "[These practices] address: Why are we measuring? What are we going to measure? How are we going to measure? And, what will be done with the data once we have it? … Planning is crucial if we want to achieve our goals. The goal and associated practices … explicitly recognize this need and its importance."4

The second goal is to Provide Measurement Results. The emphasis, says Zubrow, is to follow through with the plan and deliver results into the hands of those who will take action. He writes, "The results must be communicated to those needing the information. It does no good to the organization to populate a 'write-only' database."

The third goal is to Institutionalize a Managed Process. Previously, measurement was only a component of other processes that needed to be institutionalized. With CMMI, Measurement and Analysis is "a process in its own right and, therefore, must be institutionalized along with the other work processes."5

The PSM project is also developing a template intended as a "jump start" for implementing a measurement program consistent with the CMMI.

1 CMMI: An Update (ppt), Jeffrey L. Dutton, Sverdrup Technology, Inc., CMMI Product Development Team, 20 February 2001.

2 CMMI Overview (ppt), CMMI Project and Practical Software Measurement (PSM), 4 June 1998.

3 "The Capability Maturity Model Integration (CMMI): An Interview with Bruce Allgood and LTC Jarzombek," Army Software Metrics Office, Insight newsletter, Vol. 3, No. 4, Spring 1999.
4 Zubrow, Dave, "The Measurement and Analysis Process Area in CMMI," Newsletter of the American Society for Quality-Software Division, Spring 2001.
5 Zubrow, Dave, "The Measurement and Analysis Process Area in CMMI," Newsletter of the American Society for Quality-Software Division, Spring 2001.

SM CMMI and CMM Integration are service marks of Carnegie Mellon University.
® CMM and Capability Maturity Model are registered in the U.S. Patent and Trademark Office.



Army Software Metrics (ASM) Newsletter
U.S. Army Software Metrics Office
ATTN: CSTE-AEC-MA, Park Center IV
4501 Ford Avenue
Alexandria, VA 22302-1458


INSIDE

• Tailoring and Implementing an Organizational Measurement Process ... 1

• An Introduction to Organizational Performance Measurement ... 8

• The CMMI: An Increased Focus on Measurement ... 11

Army Software Metrics Training

The next Army Software Metrics course will take place August 5-7, 2002, at the AEC headquarters in Alexandria, VA. For more information, visit http://www.armysoftwaremetrics.org or call (703) 681-3823 or DSN 761-3823.

PSM Insight training is coming soon to the DC area, including three days of hands-on work with the tool's advanced features. For dates and details, visit www.psmsc.com.
