
Performance measurement was once a path-breaking management technique, undertaken only by the most progressive local governments. Today it is accepted as a professional norm for demonstrating operational accountability for service delivery and for creating an environment for productivity improvement. Although adoption of performance measurement systems is common, full implementation remains rare.1 “Adoption” refers to the creation and collection of measures for tracking service performance. “Implementation” is the actual use of these measures for improving the efficiency and the effectiveness of service delivery. The distinction is critical. Given the expense of adoption, an adequate return on a local government’s investment hinges on effective implementation.

When an organization engages in “benchmarking” — the comparison of its performance with relevant performance standards or the performance of other organizations — the investment is greater, and so is the desire for an adequate return.2 Benchmarking consumes more organizational resources than internal performance measurement, given the difficulty of ensuring data accuracy, reliability, and comparability across multiple organizations. As the return on its investment, an organization hopes to gain ideas for operational improvement.

The North Carolina Benchmarking Project is a collaborative effort among participating municipalities that compares performance and cost data across ten service areas: residential refuse collection, household recycling, yard waste and leaf collection, police services, emergency communications, asphalt maintenance and repair, fire services, building inspections, fleet services, and human resources.3 The School of Government manages the benchmarking project under the guidance of a steering committee consisting of representatives from each participating municipality.

POPULAR GOVERNMENT, Spring/Summer 2007

Benefiting from Comparative Performance Statistics in Local Government
William C. Rivenbark, David N. Ammons, and Dale J. Roenigk

The authors are School of Government faculty members specializing in local government administration. Contact them at [email protected], [email protected], and [email protected].

This article describes how the municipalities are using performance and cost data from the benchmarking project to improve the efficiency and the effectiveness of service delivery, including how one municipality used the data to prompt an analysis of fleet maintenance. It also discusses the importance of focusing on the higher-order measures of efficiency and effectiveness, and the benchmarking project’s contribution to improving the quality of performance measures.

Overview of the North Carolina Benchmarking Project

The impetus for the benchmarking project came from two groups: city managers and budget officials. In 1994 the North Carolina League of Municipalities hosted a meeting of city managers from the state’s larger municipalities, focusing on privatization. Discussions at this meeting turned to the topics of competition, performance measurement, and, eventually, cooperative benchmarking.4 In 1995, local officials who were affiliated with the North Carolina Local Government Budget Association met to discuss the possibility of creating a benchmarking project. They wanted the capability of examining the performance of their own organizations in the context of performance statistics from other local governments, thinking that even good performers could learn from the practices of others. The pilot phase of the benchmarking project started in fall 1995 after the Institute of Government hired a project coordinator.

The following three goals guide the benchmarking project: (1) develop and expand the use of performance measurement in local government, (2) produce reliable performance and cost data for comparison, and (3) facilitate the use of performance and cost data for service improvement. By February 2007, the project had produced eleven reports containing data on the performance and the costs of service delivery in participating municipalities. The larger story, however, is participating municipalities’ use of statistics on comparative performance to enhance their performance measurement systems and to improve service delivery.

Methodology

The findings reported in this article were derived from a review of the experiences of the fifteen municipalities that participated in the benchmarking project during fiscal year 2004–5: Asheville, Cary, Charlotte, Concord, Durham, Gastonia, Greensboro, Hickory, High Point, Matthews, Raleigh, Salisbury, Wilmington, Wilson, and Winston-Salem. Municipal representatives were queried in an e-mail survey in spring 2005. The survey was followed by in-person interviews and subsequent telephone and e-mail contacts in summer 2005.

Improvement of Service Efficiency and Effectiveness

The survey asked local officials whether the benchmarking project’s performance and cost data had supported operational change in the service areas under study. When changes had been made, the survey asked for specific examples. Some of these examples are noted in the following sections and are substantiated by clearly documented outcomes. Others are more recent initiatives with promising but unconfirmed results. Operational changes tied to the benchmarking project data were documented in eight of the ten service areas: residential refuse collection, household recycling, yard waste and leaf collection, police services, emergency communications, asphalt maintenance and repair, fire services, and fleet maintenance.

Residential Refuse Collection
The participating municipalities have used benchmarking data most frequently in the service area of residential refuse collection. Hickory, for example, used the comparative statistics to justify automated collection with one-person crews. The city reduced its cost per ton collected from $98 in 1995–96 to $69 in 2003–4, a savings of $29 per ton.5

Concord used the benchmarking data to negotiate more favorable terms with its private hauler. The city was paying $7.07 per collection point when its refuse collection contract expired. The private hauler’s proposal for a new contract called for payment of $7.76 per collection point. The city countered using data from the benchmarking project that showed Concord’s service costs to be relatively high and the contractor’s service quality to be relatively low in comparison with costs and quality in other municipalities. The parties agreed to continue the service at a rate of $7.07 per collection point, subject to adjustments tied to changes in the Consumer Price Index and fuel prices.

One of the major success stories during the decade-long history of the benchmarking project was in this service area. Winston-Salem used a private hauler to provide residential refuse service to about 6,500 households. After the benchmarking data revealed underused capacity within the city’s own operations, it discontinued its contract with the private hauler and extended service by city crews into the affected neighborhoods without adding staff or equipment. This move improved efficiency and produced annual savings of approximately $395,000.6

Household Recycling
Comparative statistics for household recycling helped Asheville municipal officials monitor the effects of service expansion. Program changes yielded an increase in the rate of waste diversion from 14 percent in 1998–99 to 24 percent in 2003–4. The principal impact of program success has been the extended life of the Buncombe County landfill.

Benchmarking data helped Wilmington officials decide to privatize the household recycling program, producing an annual savings of about $75,000.7 This change in service delivery also decreased the cost per ton collected from $308 in 1994–95 to $234 in 2000–1 (see Figure 1). Further expansion of the program decreased the cost per ton collected to $128 by 2003–4.

Concord has used benchmarking data to assess the possibility of altering truck and crew configurations. Hickory has used the data to evaluate the cost per collection point for contract negotiations.

Yard Waste and Leaf Collection
Comparative statistics for yard waste and leaf collection supported the use of seasonal labor in Hickory and justified a recommendation for a leaf machine in High Point. The program change in Hickory helped reduce the cost per collection point from $51 in 2001–2 to $30 in 2003–4. Analysis in High Point showed that the new equipment would reduce the cost per ton collected.

Police Services
Although most of the implementation examples focus on service efficiency, some are aimed at improving service quality. Greensboro, for example, used the benchmarking results in a management study of police patrol staffing.8 The study found that Greensboro was below average in number of sworn officers per 1,000 residents and had a slower-than-average response time for high-priority calls when compared with Durham, Raleigh, and Winston-Salem. A workload analysis indicated a “patrol-availability factor” of only 6.6 percent, signaling little ability to engage in proactive patrol. In response, city officials presented staffing options to the city council (see Table 1). The city council eventually approved an additional thirty-two sworn officers for its police department to increase proactive patrol and to decrease crime in specified neighborhoods.
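The arithmetic behind a staffing table like Greensboro’s can be sketched as follows. This is an illustrative model, not the city’s actual analysis: it assumes a hypothetical baseline of 235 patrol officers and a fixed total calls-for-service workload, so that adding officers dilutes the calls-for-service share of each officer’s time and frees time for proactive patrol. Under those assumptions it approximately reproduces the percentages in Table 1.

```python
# Hypothetical patrol-availability model (baseline of 235 officers assumed;
# the article does not report Greensboro's actual baseline).

ADMIN_SHARE = 0.205          # administrative time, held constant across options
BASELINE_OFFICERS = 235      # assumption, not from the article
BASELINE_CFS_SHARE = 0.729   # calls-for-service share of time at baseline

# Total CFS workload, expressed in "officer-shares" of time, is fixed;
# adding officers spreads it thinner.
cfs_workload = BASELINE_CFS_SHARE * BASELINE_OFFICERS

def patrol_availability(additional_officers: int) -> float:
    """Proactive-patrol share of time with the given number of extra officers."""
    officers = BASELINE_OFFICERS + additional_officers
    cfs_share = cfs_workload / officers
    return 1.0 - ADMIN_SHARE - cfs_share

for extra in (0, 17, 37, 60, 99):
    print(f"{extra:3d} extra officers -> "
          f"{patrol_availability(extra):.1%} proactive patrol time")
```

With zero extra officers the model returns the 6.6 percent availability the study reported; the staffing options then raise it roughly in line with the table.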

Other examples of data use in police services included analyzing a proposal to add a patrol beat in Cary, gauging the efforts of community policing in Concord, and investing in a telephone-response unit to reduce calls per officer in Wilmington.

Emergency Communications
Asheville eliminated three dispatcher positions in emergency communications following an analysis of the benchmarking results. This action allowed the city to reallocate more than $100,000 to other programs.

[Figure 1. Household Recycling in Wilmington: cost per ton collected, Wilmington versus the benchmarking group average, fiscal years 1994–2004. Wilmington’s cost per ton fell from $308 in 1994–95 to $234 in 2000–1 and to $128 by 2003–4. Note: Data were not collected for 1996 and 1997. The service in Wilmington was privatized in the middle of 1998.]

Cary officials used the benchmarking project’s comparative statistics to identify the need for an additional supervisory position in emergency communications. Concord employed the statistics to make changes that led to an Insurance Services Office (ISO) rating improvement. An ISO rating indicates the “fire readiness” of individual communities as an information service to potential insurers.9

Asphalt Maintenance and Repair
Deciding on the amount of resources to appropriate for asphalt maintenance and repair is an annual challenge faced by municipal officials. Typically, administrators urge adherence to a policy that calls for the municipality to resurface a specified number of lane miles every year. Depending on revenue projections, however, municipalities sometimes defer this capital investment in favor of other programs. With the support of the benchmarking results, several jurisdictions have solidified their ongoing commitment to a systematic street-resurfacing program.

Two municipalities have used the comparative statistics to analyze the cost-effectiveness of using in-house crews versus contract crews for resurfacing projects. Asheville decided to use contract crews for additional projects. Concord opted to increase in-house capacity.

Table 1. Staffing Options for Police Patrol in Greensboro

           Additional   Administrative   CFS      Proactive/
           Officers     Time             Time*    Patrol Time   Increase
Current    —            20.5%            72.9%     6.6%         —
Option 1   17           20.5             68.0     11.5           5%
Option 2   37           20.5             63.0     16.5          10
Option 3   60           20.5             58.0     21.5          15
Option 4   99           20.5             51.0     28.5          22

*CFS = calls for service, the percentage of time that patrol officers spend responding to service calls.

Hickory used the comparative statistics to justify a new automated patch truck for pothole repair. The city reported 85 percent of potholes repaired within twenty-four hours in 1997–98, which was well below the benchmarking group average of 96 percent. After the capital investment, the city reported 97 percent of potholes repaired within twenty-four hours in 2001–2, which was slightly above the group average of 95 percent for that fiscal year.

Fire Services
Some municipalities have used the comparative statistics to analyze the need for fire personnel. As a result of its analysis of fire inspectors’ workloads, Cary established a staffing plan for determining when to add new inspectors. High Point used the comparative statistics to analyze and approve a request for twelve new firefighters in response to a merger with two volunteer stations.

The most notable use of comparative statistics on fire services occurred in Hickory. The city’s high cost per response suggested the underutilization of personnel and equipment and prompted a decision to begin responding to emergency medical calls as well as fire incidents. This increase in workload allowed the fire department to spread its fixed costs across more calls for service. That substantially lowered the department’s cost per response, from $3,246 in 1998–99 to $1,832 in 2003–4. The workload change apparently had some impact on average response time to high-priority calls, which increased from 4.0 minutes to 4.4 minutes during the same time period.

Fleet Maintenance
Asheville and Hickory used the benchmarking results to establish productivity goals for billable hours, turnover of parts, and percentage of rolling stock available per day. Also, the benchmarking data prompted an analysis of fleet maintenance in Concord, which is described in the following section.10


Analysis of Fleet Maintenance in Concord

Comparative performance and cost data for fleet maintenance were collected and reported for the first time in the Final Report on City Services for FY 2001–2002.11 The data for Concord, when compared with the data for other participants in the benchmarking project, revealed several potential problems, including low shop productivity (see Figure 2) and excessive repeat repairs within thirty days (see Figure 3). Concord also had experienced a relatively high number of breakdowns of equipment while it was in use, which affected the productivity of the departments needing to use the equipment.

Scheduled maintenance is the most cost-effective, productive form of vehicle maintenance. By emphasizing preventive maintenance, it decreases the likelihood of costly breakdowns. In Concord the work of fleet maintenance increasingly consisted of unscheduled maintenance (repair of breakdowns) and decreasingly of scheduled maintenance (preventive maintenance).

Benefits of Benchmarking
The municipalities were asked to identify the overall benefit of participating in the benchmarking project. Following are selected responses:

• Reporting on the performance of service delivery within the context of comparable performance statistics enhances program accountability.

• Benchmarking has helped change the organizational culture by increasing the emphasis on performance measurement.

• Benchmarking has given program managers a broader perspective on how services are provided. They have become more open to the idea that reviewing processes in other organizations can help them improve their own service performance.

• Program managers are more concerned with data accuracy and reliability and are more open to data analysis.

• Budget staff members have become more knowledgeable about the programs under study. That helps reduce the communication barriers between staff members and program managers.

• Reporting on comparative statistics has spawned other management initiatives. For example, citizen surveys have been conducted to supplement the performance and cost data. The surveys have resulted in allocation of more resources to priority service areas.

• Benchmarking has assisted organizations in progressing toward performance budgeting. They have used the performance and cost data in reorganization of selected programs, in allocation of additional or fewer resources based on needs assessments, and in contract negotiations with external vendors.

One of the best anecdotal observations regarding the value of project participation came from a budget director who said that she crossed her fingers every time she received an information request from the city manager regarding a program. Her hope was that the program would be one of the ten currently under study in the benchmarking project, making it easier to give a timely and informative response.

[Figure 2. Fleet Maintenance in Concord: Hours Billed as a Percentage of Available Hours, Fiscal Year 2001–2. Concord: 53 percent; group average: 68 percent.]

[Figure 3. Fleet Maintenance in Concord: Percentage of Work Orders Requiring Repeat Repair within 30 Days, Fiscal Year 2001–2. Concord: 1.1 percent; group average: 0.9 percent.]
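The screening logic behind comparative figures like Figures 2 and 3 can be sketched in a few lines: compare a city’s value with the group average, with the direction of “better” depending on the measure, and flag gaps for closer review. The values below are the Concord figures reported in the article; the flagging rule itself is an illustration, not the project’s method.

```python
# Flag measures where a city trails the benchmarking group average.
# Each entry: (city_value, group_average, higher_is_better).
measures = {
    "hours billed / available hours": (0.53, 0.68, True),
    "repeat repairs within 30 days":  (0.011, 0.009, False),
}

flags = {}
for name, (city, average, higher_is_better) in measures.items():
    # Positive gap means the city is at or better than the group average.
    gap = (city - average) if higher_is_better else (average - city)
    flags[name] = "OK" if gap >= 0 else "review"
    print(f"{name}: city {city:.1%} vs group avg {average:.1%} -> {flags[name]}")
```

Both of Concord’s fleet measures come out flagged for review, which is exactly what prompted the closer examination described next.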

Analysis
A review of comparative benchmarking data alerted Concord officials to the presence of underlying problems. Concord’s marks did not compare favorably with those of other municipalities. Closer examination revealed several causes. Before introducing the measures of the benchmarking project, fleet management collected few measures and was largely unaware of performance shortcomings. Mechanics’ workdays were not carefully scheduled, and mechanics were given no performance targets or guidelines for job efficiency or shop productivity. Procedures for state inspections, preventive maintenance, and scheduling and prioritizing work were inefficient. Also, the method for buying parts increased vehicle downtime. Warranty issues were not weighed against downtime. Lack of communication between fleet maintenance and other city functions increased repair costs. Fleet maintenance software was cumbersome and consumed excessive amounts of management’s time. Furthermore, vital scheduled maintenance was not being performed.

Actions
Concord officials acted to increase accountability. They set performance standards for mechanics and provided them with monthly reports on individual and team productivity accomplishments. Also, they created check sheets on preventive maintenance to ensure quality and promote accountability.

Further, they brought federal and state inspections in-house, saving the cost of outsourcing and travel time. They changed purchasing practices to promote competition. They made arrangements with multiple vendors for the quick purchase of parts to increase the percentage of repairs completed within twenty-four hours. They focused greater attention on systematic replacement of worn equipment, carefully flagging vehicles near retirement to reduce unnecessary maintenance.

Additionally, they reorganized the second shift of mechanics and focused its efforts on preventive maintenance rather than repairs. They created special forms (called “trouble forms”) to improve communication between first- and second-shift supervisors. Finally, they directed special attention to preventive-operation checks in an effort to reduce unscheduled maintenance work orders.

Concord officials found that they could implement all these changes and still eliminate one management position in fleet maintenance.

Outcomes
The operational changes helped drop maintenance costs per mile traveled from 18 cents in 2002 to 15 cents in 2005. The decrease represents a three-year savings of about $120,000 for fleet maintenance. Also, the elimination of a management position created an annual savings of approximately $45,000. Hours billed as a percentage of hours paid to mechanics increased from 53 percent in 2001–2 to 70 percent in 2003–4. The percentage of work orders completed within twenty-four hours increased from 81 percent to 86 percent, and the percentage of work orders requiring repeat repairs within thirty days decreased from 1.1 percent to 0.4 percent during the same period.

Another area of improvement was replacement of transmissions. After a preventive maintenance program for transmissions was implemented, the number of transmissions replaced decreased from twenty-four in 2002 to five in 2005.

Concord officials had collected a host of basic workload measures for fleet maintenance, but they found that higher-order measures of efficiency and effectiveness were more useful than workload measures in diagnosing problems and improving operations. Fleet managers now track these measures monthly rather than annually.

Concord officials also found that sharing performance information and getting those actually doing the work to buy in to proposed changes — as they did with fleet mechanics — are crucial to successful performance management.

Utility of Efficiency and Effectiveness Measures

The benchmarking project compiles three types of performance measures for each service area under study: workload, efficiency, and effectiveness. Workload measures are important for providing information on service demand (for example, the number of applications processed, arrests made, meters read, and so forth), but they simply report how much work has been done. Efficiency and effectiveness measures are considered higher-order measures. They report on the relationship between inputs and outputs (efficiency), on the one hand, and the quality or impact of service (effectiveness), on the other hand. Municipalities that were more active in using performance measures to improve operations tended to rely more on measures of efficiency and effectiveness, rather than simply on raw workload measures.12

Several municipalities participating in the benchmarking project were prompted to make changes in their operations when they compared their efficiency with that of their counterparts. In fact, a majority of respondents indicated heavy reliance on efficiency measures. This should not be surprising, given the benchmarking project’s emphasis on cost accounting from the outset. Participating officials have gained confidence in these measures over the years and have come to rely on them.

Several of the officials have even credited the benchmarking project with providing them with the ability to calculate accurate and reliable efficiency measures for the first time.

The total cost in each service area, including direct costs (personal services and operating expenditures), indirect costs (overhead for staff support), and capital costs (depreciation), is determined to ensure comparability across multiple service providers. The total cost is then used to calculate the resources consumed per service output.13
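The cost-accounting arithmetic described above reduces to a simple calculation: sum the three cost components, then divide by the units of service output. The sketch below illustrates it for a refuse-collection example; the dollar figures and tonnage are hypothetical, and the project’s actual model is documented in the guide cited in note 13.

```python
# Illustrative efficiency-measure calculation: total cost per unit of output.

def cost_per_unit(direct: float, indirect: float, capital: float,
                  output_units: float) -> float:
    """Total cost (direct + indirect + capital) per unit of service output."""
    return (direct + indirect + capital) / output_units

# Hypothetical residential refuse collection figures: cost per ton collected.
direct = 1_200_000    # personal services and operating expenditures
indirect = 150_000    # overhead for staff support
capital = 250_000     # depreciation on trucks and equipment
tons_collected = 23_000

print(f"${cost_per_unit(direct, indirect, capital, tons_collected):.2f} per ton")
```

Including indirect and capital costs is what makes the resulting cost-per-ton figures comparable across municipalities that organize overhead and equipment ownership differently.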

Another reason for the heavy reliance on efficiency measures, according to the respondents, is that program managers respond more readily to changes in efficiency than to changes reflected by quality indicators. Several of the respondents reported that elected officials also tend to focus more on service efficiency than on service quality. One possible reason for this focus is that elected officials are keenly aware of the importance of cost control in the public sector. Understandably, they prefer to avoid an increase in the property tax rate.

Refinement of Measures

Although respondents reported a host of benefits from participating in the benchmarking project (see the sidebar “Benefits of Benchmarking”), one of its fundamental benefits has come in helping participating municipalities improve the quality of their performance measures, not only in service areas included in the project but, indirectly, in other areas as well.

Within service areas included in the benchmarking project, participants have tackled some thorny measurement problems and resolved them to the group’s satisfaction. A review of household recycling, for example, revealed problems of inaccuracy and inconsistency in the calculation of household recycling participation rates. To remedy this problem, project participants established a uniform effectiveness measure, reporting the community set-out rate (percentage of households setting out recycling bins).

Another example comes from the service area of fleet maintenance. After the service area became part of the benchmarking project, participants thought that the number of rolling stock units per full-time-equivalent technician would provide useful feedback on workload and efficiency. Subsequent review revealed that a more robust measure was needed to track this service dimension. The measure was changed to number of vehicle-equivalent units per full-time-equivalent technician, a weighted statistic of the maintenance effort associated with different classes of vehicles. This measure is more aligned with industry standards.
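The vehicle-equivalent-unit idea can be illustrated with a short calculation: each vehicle class carries a weight reflecting its relative maintenance effort, so a fleet heavy in fire apparatus counts as more work than an equal-sized fleet of sedans. The class weights and fleet counts below are hypothetical, chosen only to show the mechanics of the measure.

```python
# Sketch of a vehicle-equivalent-unit (VEU) workload measure.
# Weights and fleet composition are hypothetical illustrations.

veu_weights = {
    "sedan": 1.0,            # baseline maintenance effort
    "pickup": 1.5,
    "refuse truck": 4.0,
    "fire apparatus": 6.0,
}

fleet = {"sedan": 120, "pickup": 60, "refuse truck": 18, "fire apparatus": 10}
technicians = 9.5  # full-time-equivalent technicians

total_veu = sum(veu_weights[cls] * count for cls, count in fleet.items())
print(f"{total_veu:.0f} VEUs, "
      f"{total_veu / technicians:.1f} VEUs per FTE technician")
```

A raw count would rate this fleet at about 22 vehicles per technician; weighting by class shows the actual maintenance burden per technician is considerably higher, which is why the weighted measure proved more robust.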

Several municipalities credited their participation in the benchmarking project as a catalyst for improving their performance measurement systems organization-wide. Local officials reported that the project’s focus on meaningful performance statistics has improved the quality of measures being tracked and reported even in service areas outside the project’s scope. This finding provides evidence that the benchmarking project is making progress toward its first goal: to develop and expand the use of performance measurement in local government.

Conclusion

This review of the benchmarking experiences of the fifteen municipalities that participated in the benchmarking project in 2005 reveals that the municipalities have used comparative statistics at the program level to support a variety of decisions about service delivery. Prior research has suggested that time is a factor in moving from collection of measures to actual use of them in management decisions.14 Indeed, some of the municipalities that have the most experience in performance measurement and the longest participation in the benchmarking project were among the leaders of this group in the use of performance data. Time is no guarantee, however. Even some municipalities with shorter experience in performance measurement have moved beyond reliance on raw workload measures and now are using measures of efficiency and effectiveness as they convert information into action.

Notes

1. Patricia de Lancer Julnes and Marc Holzer, “Promoting the Utilization of Performance Measures in Public Organizations: An Empirical Study of Factors Affecting Adoption and Implementation,” Public Administration Review 61 (2001): 693–708.

2. Comparison of performance statistics is one of three approaches to benchmarking in the public sector. See David N. Ammons, “Benchmarking as a Performance Management Tool: Experiences among Municipalities in North Carolina,” Journal of Public Budgeting, Accounting and Financial Management 12 (2000): 106–24.

3. For the definition of each service area, see William C. Rivenbark, Final Report on City Services for Fiscal Year 2003–2004 (Chapel Hill, N.C.: School of Government, University of North Carolina at Chapel Hill, 2005).

4. Paula K. Few and A. John Vogt, “Measuring the Performance of Local Governments,” Popular Government, Winter 1997, 41–54.

5. The savings reported in this article are stated in current dollars rather than constant dollars. If adjusted for inflation, the reported savings would be greater.

6. Ann Jones, “Winston-Salem’s Participation in the North Carolina Performance Measurement Project,” Government Finance Review 13, no. 4 (1997): 35–36.

7. David N. Ammons, “Benchmarking as a Performance Management Tool: Experiences among Municipalities in North Carolina,” Journal of Public Budgeting, Accounting and Financial Management 12 (2000): 106–24.

8. City of Greensboro, Budget and Evaluation and Police Departments, Patrol Staffing Study (Greensboro, N.C.: 2004).

9. David N. Ammons, Municipal Benchmarks, 2d ed. (Thousand Oaks, Calif.: Sage Publications, 2001).

10. For information on additional case studies, see William C. Rivenbark, David N. Ammons, and Dale J. Roenigk, Benchmarking for Results (Chapel Hill, N.C.: School of Government, University of North Carolina at Chapel Hill, 2005).

11. William C. Rivenbark and Matthew H. Dutton, Final Report on City Services for Fiscal Year 2001–2002 (Chapel Hill, N.C.: School of Government, University of North Carolina at Chapel Hill, 2003).

12. For more information on factors that increase the likelihood of local officials using performance and cost data to influence management decisions, see David N. Ammons and William C. Rivenbark, “Factors Influencing the Use of Performance and Cost Data to Improve Municipal Services: Evidence from the North Carolina Benchmarking Project,” Public Administration Review, forthcoming.

13. For more information on the benchmarking project’s cost accounting model, see William C. Rivenbark, ed., A Guide to the North Carolina Local Government Performance Measurement Project (Chapel Hill, N.C.: School of Government, University of North Carolina at Chapel Hill, 2001).

14. See note 1.