This article was downloaded by: [Brunel University London]
On: 04 November 2014, At: 15:22
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

International Journal of Public Administration
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/lpad20

Assessing Operational and Collaborative Performance Management: A Case Study of PhillyStat
Taewoo Nam, Myongji University, Seoul, Republic of Korea
Published online: 01 Jul 2014.

To cite this article: Taewoo Nam (2014) Assessing Operational and Collaborative Performance Management: A Case Study of PhillyStat, International Journal of Public Administration, 37:8, 514-527, DOI: 10.1080/01900692.2013.865648

To link to this article: http://dx.doi.org/10.1080/01900692.2013.865648

PLEASE SCROLL DOWN FOR ARTICLE

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions

Assessing Operational and Collaborative Performance Management: A Case Study of PhillyStat




International Journal of Public Administration, 37: 514–527, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 0190-0692 print / 1532-4265 online
DOI: 10.1080/01900692.2013.865648

Assessing Operational and Collaborative Performance Management: A Case Study of PhillyStat

Taewoo Nam
Myongji University, Seoul, Republic of Korea

Many cities and agencies in the U.S. have adopted the "-Stat" approach, generally called "PerformanceStat." Philadelphia has created PhillyStat as its performance management and tracking tool, whose unique feature is a twofold review process involving both the operational and the outcome level. This study assesses the capacities of PhillyStat. The assessment suggests four implications for jurisdictional governments and agencies employing the "-Stat" approach. First, the "-Stat" approach should evolve toward strategic review beyond day-to-day operational review. Second, it should close the gap among diverse views on government performance. Third, it should be used as an effective tool for public management and leadership. Last, it should develop capabilities for cross-organizational collaboration.

Keywords: performance management, performance review, CitiStat, PerformanceStat, case study

INTRODUCTION

To better manage and track performance, a growing number of public agencies and government jurisdictions in the United States have devised their own performance strategies based on the "-Stat" model. Stat-type performance measurement systems are gaining popularity as public administrators continue their quest to improve government efficiency and effectiveness. The New York City Police Department created the original CompStat in 1994 to reduce the city's crime rate. The City of Baltimore, Maryland, then launched the strategy's first application to an entire governmental jurisdiction in 2000. Various versions in other cities and agencies replicate the two original models and adapt them to fit each specific government.

To encompass all the various adaptations, Robert D. Behn (2008a) dubbed the class of performance programs called ∗∗∗∗Stat (e.g., the name of a city plus "-Stat," such as ColumbusStat and PhillyStat) as PerformanceStat. Based on his observation of various practices, Behn (2010) found that "Each PerformanceStat is different. Each has adapted the key features to achieve its public purpose and to mesh with its organizational culture and political constraints" (p. 431). The adaptations incorporate key common features, which are developed and varied by jurisdiction and agency. Behn (2010) highlighted the evolution of PerformanceStat toward a strategy to manage collaboration across agencies and departments within a city government, coining the new term "CollaborationStat." As such, the "-Stat" approach as a performance strategy extends beyond reviewing internal operational performance to reviewing the performance of cross-agency working.

Correspondence should be addressed to Taewoo Nam, Myongji University, Department of Public Administration, Geobookgolro 34, Seodaemoongu, Seoul, 120-728, Republic of Korea. E-mail: [email protected]

The City of Philadelphia offers a perfect example of the ongoing evolution from a typical PerformanceStat to PerformanceStat plus CollaborationStat. Philadelphia created its own PerformanceStat, PhillyStat, in 2008. PhillyStat possesses its own unique properties, as well as common characteristics comparable with other PerformanceStats. It has two separate processes: an operation-level performance review and an outcome-level performance review. While the former parallels PerformanceStat, the latter resembles CollaborationStat.

This study aims to offer practical implications for governments through assessing the capacities of PhillyStat both as a PerformanceStat and as a CollaborationStat. The criteria for assessment are derived from three articles by Behn (2006, 2008a, 2010), published in academic journals. This paper unfolds as follows. The next section discusses the theoretical (performance evaluation and collaboration) background of the "-Stat" approach, and draws criteria from previous literature to assess PerformanceStat and CollaborationStat. The subsequent section describes the details of PhillyStat as a research case, followed by two sections of assessment: PhillyStat is assessed first as a PerformanceStat, and then as a CollaborationStat. The final section offers concluding remarks.

THEORETICAL BACKGROUND OF THE "-STAT" APPROACH

The "-Stat" approach rests on two theoretical foundations: performance evaluation (PerformanceStat) and collaboration (CollaborationStat). This section discusses the "-Stat" approach both as performance evaluation and as cross-organizational collaboration.

The “-Stat” Approach as Performance Evaluation

Performance evaluation needs to begin by determining what kinds of improvements are defined and achieved, and to what extent. PerformanceStat as a method of performance evaluation should be "a leadership and management strategy designed to improve the performance of an individual public agency or the set of agencies within a governmental jurisdiction" (Behn, 2008a, p. 208). The strategy cannot improve performance until the leadership team first defines and specifies the nature of the performance to be achieved. In other words, leaders involved in PerformanceStat need to think carefully about the "performance deficit" they are trying to eliminate, reduce, or mitigate (Behn, 2004, pp. 10–11). Those involved in PerformanceStat need to identify and solve performance-deficit problems, and then set and achieve the next performance targets.

Performance data are key to performance evaluation. Since PerformanceStat consists of data-driven performance reviews (Hatry & Davies, 2011; Perez & Rushing, 2007), data are central to the quality of what PerformanceStat produces. Behn (2008a) claimed, "Collecting data is not enough. Someone needs to analyze these data" (p. 212). The question "What kind of data?" leads to further inquiries: "What data will be most revealing?" "How should these data be examined to determine whether the purpose is being achieved?" "What analytic techniques are most useful?" And "What should be compared with what?"
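The kind of simple analysis these questions point to (comparing an indicator's current value against its target, and against the previous period) can be sketched as follows. This is only an illustrative sketch; the indicator names and figures are hypothetical and are not drawn from PhillyStat or from Behn's work.

```python
# Hypothetical quarterly indicator data for a 311-style call center.
quarterly = {
    "Q1": {"call_answer_rate": 0.78, "avg_wait_minutes": 4.2},
    "Q2": {"call_answer_rate": 0.83, "avg_wait_minutes": 3.6},
}
targets = {"call_answer_rate": 0.80, "avg_wait_minutes": 4.0}
# For some indicators higher values are better; for others, lower.
higher_is_better = {"call_answer_rate": True, "avg_wait_minutes": False}

def review(current, previous, targets, higher_is_better):
    """Flag each indicator as meeting/missing its target and as improving/slipping."""
    report = {}
    for name, value in current.items():
        if higher_is_better[name]:
            met = value >= targets[name]
            improved = value > previous[name]
        else:
            met = value <= targets[name]
            improved = value < previous[name]
        report[name] = {"value": value, "met_target": met, "improved": improved}
    return report

print(review(quarterly["Q2"], quarterly["Q1"], targets, higher_is_better))
```

A staff analyst preparing a review meeting would run this kind of comparison for every indicator and surface only the misses and reversals for discussion, which is the spirit of "What should be compared with what?"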

The most basic infrastructure of performance evaluation meetings is the session room, where periodic meetings are held. The most obvious feature of Baltimore's CitiStat room is the podium (Behn, 2007, p. 25), but a podium is not essential; a discussion or presentation session could be conducted with settings different from Baltimore's CitiStat. Another component of PerformanceStat infrastructure is technology to help collect, analyze, and display the data. A centralized nonemergency contact service, 311, is considered crucial technology for PerformanceStat (Behn, 2005, 2006, 2008a; Perez & Rushing, 2007). Gullino (2009) considered Baltimore's CitiStat to be an e-government tool or an extension of information access, especially when armed with a Geographic Information System. In this way, PerformanceStat can employ various technologies.

The meeting itself needs to be assessed chiefly in terms of "who" and "how." This is not to assess how the meeting achieves its purpose, but rather to assess the dynamics of human interactions. The attitudes of the leadership team and unit managers deserve observation, and careful attention should be paid to how participants conduct the meeting. Important components of the meeting to consider include the chairperson, the frequency of meetings, the attendees (departmental heads and managers), the topics, and the mood (Behn, 2008a, pp. 218–222).

A professional staff is key to operational capacity, as emphasized by Hatry and Davies (2011), who note that "just as a dedicated and persistent leader is needed to energize and legitimize an initiative, knowledgeable and energetic staff will also be needed to ensure that the analysis and review process runs smoothly and that data and findings are correctly interpreted" (p. 13). They list the following core tasks of staff supporting data-driven performance reviews: (1) obtaining the desired performance information; (2) transforming data into clear, readable charts, tables, or diagrams; (3) analyzing the information to identify candidate highlights, issues, and questions that the leader might want to address in the forthcoming meeting; (4) assisting the leader in selecting the issues and questions to be addressed at the meeting; (5) communicating with expected participants in advance on the content of the forthcoming meeting; (6) aiding the leader during the meeting; (7) keeping track of the findings, issues, requests, and actions called for during the meeting; (8) preparing and communicating the results of the meeting to all participants; (9) following up requests for information on the status of actions requested during the meeting; and (10) getting ready for the next meeting (p. 14).

The results of PerformanceStat cannot be positive without follow-up. Not following up is one of the greatest errors in implementing PerformanceStat (Behn, 2008b). One of the key CompStat principles identified by Maple and Mitchell (1999, p. 32) was "relentless follow-up." Active, relentless follow-up on actions or requests made during the meetings is critical to the review process.


The "-Stat" Approach as Cross-Organizational Collaboration

Behn (2010) explored the possibility that a PerformanceStat strategy can be adapted to situations in which improving performance requires cross-agency or even cross-jurisdictional collaboration. PerformanceStat is designed to motivate the employees of an entire public agency to focus their individual and collective energies and creativity on accomplishing specific public purposes and producing specific results. PerformanceStat mostly involves reviewing performance at an operational level within an agency or a jurisdiction. For example, the Australian version of CompStat is called "Operational Performance Reviews" (Mazerolle, Rombouts, & McBroom, 2007, p. 238). Operational improvements, however, are not the sole responsibility of a single agency or department. Following through with intended purposes may require the energy and creativity of people from a variety of public agencies. PerformanceStat needs to be modified to produce results when collaboration is required to do so. Considering that, Behn (2010) created a new version of PerformanceStat, called CollaborationStat, which employs the same basic principles of leadership and management as PerformanceStat but applies them to a collaborative structure rather than a hierarchy.

Those who seek to employ a CollaborationStat strategy to improve the collective performance of several agencies cannot rely on formal authority alone. Whereas PerformanceStat is often described as "a hierarchical, hold-someone-else-accountable exercise" (Behn, 2010, p. 433), CollaborationStat simply cannot fit this description. Within any collaborative group, accountability for performance is mutual and collective.

Viewing the "-Stat" approach as cross-organizational collaboration, Behn (2010) proposed nine requirements for successful CollaborationStat: (1) common purpose; (2) data; (3) cause-and-effect theory; (4) interpersonal trust; (5) experience with PerformanceStat; (6) convenient forum for meetings; (7) operational impact; (8) champions and sponsors; and (9) continuity of leadership. Literature on interagency collaboration, cooperation, coordination, and partnerships underscores the clarification of a common purpose. For example, Thomas (2003) wrote: "Individuals work together because they want to achieve a common goal, not because they are told to work together by their bosses or elected officials" (p. 24). Collaborating agencies do not need to have identical missions; however, for each agency, the overall mission needs to include "a sub-purpose that has a lot in common with the sub-purposes of the others" (Behn, 2010, p. 449).

Without current data, a public agency cannot determine whether its performance is improving. Current data are important to both PerformanceStat and CollaborationStat, but what distinguishes CollaborationStat from PerformanceStat is the use of outcome data capturing the ultimate, total impact of interagency collaboration (Behn, 2010, p. 449). The members of a collaborative group may have plenty of current data but no outcome data, only output data, or process and activity data. Output data can reveal what an individual collaborator produced, and activity data can reveal what various members of the group have done.

Collaborators need a cause-and-effect theory that explains why progress, as measured by their output, process, or activity data, has been made. There may be several ideas about what mechanisms connect the outputs or activities to the desired outcomes (Bunge, 2004). Testing and proving some of those ideas (cause-and-effect theories) depend on the availability of outcome data (Behn, 2010, p. 450).

Public agencies are more likely to collaborate and adopt a more formally structured CollaborationStat strategy if key managers in these agencies have already developed some interpersonal trust and professional respect (Behn, 2010, p. 452). Trust is fundamental to interorganizational collaboration (Agranoff & McGuire, 2003, pp. 182–184; Bardach, 1998, pp. 252–268; Bryson, Crosby, & Stone, 2006, pp. 47–48; McGuire, 2006; Thompson & Perry, 2006; Vangen & Huxham, 2003). Mutual professional respect for each other's competence enables multiple agencies to undertake more ambitious cross-agency projects and activities that require substantial collaboration.

Behn (2010) proposes that "public agencies that are seeking to collaborate are more likely to create a more formally structured CollaborationStat strategy if some of them have developed some experience in using a PerformanceStat strategy to improve performance with their own stove-piped hierarchies" (p. 450). If one or more of the participating agencies has already acquired the expertise of implementing a PerformanceStat strategy, they will be able to teach the others, as well as to figure out how to adapt their knowledge to the needs of the group.

A convenient forum for meetings enables repeated interactions, which are important for maintaining cooperative relationships (Thomas, 2003, p. 272). CollaborationStat requires "a venue that all accept as a useful and convenient place to come together to examine data and their trends, to discuss problems and obstacles, and to develop new strategies and collaborative interactions" (Behn, 2010, p. 451). The venue should not grant any member of the collaborative group dominant status; rather, it needs to be perceived as sufficiently neutral to be acceptable to all.

Visible operational impact strengthens agencies' motivation to participate in CollaborationStat. If each participant has an observable impact on some of the data used to measure results, performance, and progress, the members of the collaborative group are able to hold each other mutually and collectively responsible for progress (Behn, 2010, pp. 451–452). On the other hand, if organizations see themselves as having no measurable activity or output that contributes to the desired outcome, they have no reason to participate.


Having internal sponsors and enthusiastic external champions is important to ensuring that the group's purpose is carried out. Behn (2010, p. 453) cited the supporting example of Baltimore's Mayor Dixon, who never actively engaged in any CollaborationStat discussions but did provide the necessary political stimulus and strong support as an advocate of collaboration.

In relation to the role of sponsors and champions, CollaborationStat requires continuity of leadership. Without some continuity in leadership, it will be difficult for collaboration champions to emerge and develop their effectiveness. In this regard, there is a difference between PerformanceStat and CollaborationStat: PerformanceStats have survived changes in both elected and appointed chief executives, but CollaborationStat has not yet demonstrated that it can survive such change. If several of the core collaborators leave their leadership positions simultaneously, it might be necessary to restart CollaborationStat almost from the beginning (Behn, 2010, p. 453).

THE CASE OF PHILLYSTAT

The City of Philadelphia (hereafter, Philadelphia) has a short history with its own CitiStat. When Mayor Michael Nutter first took office in 2008, Managing Director Camille Barnett launched PhillyStat as a way to hold department heads accountable for tracking problems and complaints and dealing with them. The program seeks to set goals and to measure how well the city government is serving Philadelphia residents.

To retool the idea, the succeeding Managing Director, Rich Negrin, temporarily put PhillyStat on hold when he took over in 2010. Suspension and fine-tuning were necessary because, in its earlier stage, PhillyStat was falling short of the Nutter Administration's key principles (transparency, accountability, and customer orientation). After a year's hiatus, the Managing Director revived the dormant series of meetings known as PhillyStat in July 2011. PhillyStat returned with notable tweaks. Unlike the original incarnation of CitiStat, there are now two types of meetings. Some meetings, chaired by the Managing Director, focus on improving the performance of day-to-day operations in specific departments. Other meetings, chaired by the mayor, focus on measuring the city's progress toward his substantial goals. The former is called PhillyStat Ops, and the latter is called PhillyStat Outcomes.

The relaunch of PhillyStat entailed a redesign in format and organization. To ensure each department's overall peak performance, PhillyStat Ops expanded the previous PhillyStat meeting format by including five key performance areas within the individual department: operations, human resources, finance and budget, technology, and customer service. To facilitate problem-solving, the new PhillyStat meetings are also staffed with leaders from those five areas who can identify and share the most effective practices, while also helping to resolve issues identified during the meetings. As the city's performance management tool, PhillyStat Ops collects data on how well city departments do their jobs, reports information, and comes up with innovations to improve efficiency. City leaders review departmental performance metrics. PhillyStat Ops involves top officials appearing in front of the Managing Director and answering questions.

With the relaunch, PhillyStat Outcomes added a new dimension to the original PhillyStat model by bringing city leaders together to set and track whole-of-city performance indicators directly related to the mayor's broad, strategic goals for the city. This ensures that departmental initiatives are aligned to those high-level goals for clear oversight and decision-making. In PhillyStat Outcomes meetings, city leaders collectively conduct a careful review of progress toward the mayor's strategic goals: public safety, education, and employment in a greener and more sustainable Philadelphia with a healthy business climate.

In sum, PhillyStat assesses the operational performance of city departments, promotes a culture of continuous improvement, and strives to accomplish the larger goals by encouraging coordination and collaboration across departments. The two types of PhillyStat (Ops and Outcomes) align, respectively, with PerformanceStat and CollaborationStat as reviewed in the previous section. Table 1 compares their different natures.

As a new enhancement in transparency, PhillyStat meetings are transparent to the public as well as across departments within city hall. All sessions are open, and the public is invited to watch. Philadelphians can see the meetings live on television and thereby monitor and track the progress of new key innovations developing inside departments. The PhillyStat homepage (www.phila.gov/performance/Philly_Stat.html) is available for checking the schedule of upcoming meetings, and the PhillyStat Calendar on the webpage provides presentations and videos for past sessions. The meeting materials are also available upon request to PhillyStat staff.

ASSESSING THE OPERATIONAL PERFORMANCE REVIEW

This section analyzes the review of operational performance in PhillyStat Ops meetings. For the analysis, performance review meetings of two departments that differ in their main service areas and domains are selected: Philly311 for Quarter 3 on July 6, 2011, and the Office of Fleet Management for Quarter 4 on September 14, 2011. Each selected meeting was organized and structured with the same agenda and format as its preceding and subsequent events, so the selected meetings can be considered typical of Ops meetings. Each department's Ops meetings are similar because the common purpose of such meetings is to continuously track operational performance by using identical indicators and metrics.

TABLE 1
Two Natures of PhillyStat

Stat approach
  PhillyStat Ops: PerformanceStat
  PhillyStat Outcomes: CollaborationStat

Keyword
  PhillyStat Ops: Operational performance
  PhillyStat Outcomes: High-level strategic goals

Main activity
  PhillyStat Ops: Assess and monitor operational performance within a single department
  PhillyStat Outcomes: Track the achievement of the Mayor's high-level strategic goals and encourage collaboration across departments

Foci
  PhillyStat Ops: Five areas (operations; human resources; finance and budget; technology; and customer service)
  PhillyStat Outcomes: Five major goals (safety; education and health; livable and business-friendly place; green and sustainable place; and efficient and effective government with integrity and responsiveness)

Chair of meetings
  PhillyStat Ops: The Managing Director
  PhillyStat Outcomes: The Mayor

Frequency of meetings
  PhillyStat Ops: Quarterly with respect to each department
  PhillyStat Outcomes: Quarterly with respect to each strategic goal

These two particular meetings have been chosen as the focus for this paper because of the availability of abundant data and information about them (presentation materials and video files of the Ops sessions). Although the selected meetings cannot be representative of all other Ops meetings in the sense of random case selection in quantitative studies, the case selection is grounded on the information-oriented selection that qualitative case studies usually employ. The selection of the two meetings considers maximum variance in comparable cases "to obtain information about the significance of various circumstances for case process and outcome" (Flyvbjerg, 2006, p. 230). Corresponding to that logic of case selection, selecting these two meetings among others does not necessarily bias the analysis of PhillyStat Ops.

Purpose

Behn (2008a) wrote: "[PerformanceStat] is not managerial magic. A PerformanceStat strategy cannot improve performance until the leadership team of the agency or jurisdiction first defines the nature of the performance that they seek to improve" (p. 208). In this sense, an initial question for assessing PerformanceStat should be "What is the performance deficit that needs to be eliminated, reduced, or mitigated?" (Behn, 2004, pp. 10–11). Currently, six departments or offices (i.e., Philly311, Human Resources, Fleet Management, Procurement, Records, and Public Property) participate in PhillyStat Ops. Each department sets performance goals in the five key areas (i.e., general day-to-day operations, human resources, finance and budget, technology, and customer service). The performance purpose that each department is trying to achieve is specified, and each department focuses on strategic directions: major initiatives to accomplish core missions over the next couple of years. Table 2 presents the clarified purposes of Philly311 and Fleet Management. Other departments also have specific purposes that can be assessed, monitored, tracked, and reviewed.

Data

The nature of performance data is central to determining the quality and result of performance evaluation. Behn (2005) suggested several core features of performance data for PerformanceStat, namely the breadth, depth, disaggregation, and newness of the data, and emphasized the thoroughness of the analyses of these data. As PerformanceStat relies heavily on quantified and measurable data, PhillyStat uses a variety of performance indicators that cover the core areas of departmental operation. Table 3 lists examples of the main indicators with respect to each core area of departmental functions.

The two departmental cases above, Philly311 and Fleet Management, reveal some common weaknesses in the items of performance data. Among the indicators in Table 3, budget, technology, and human resources deserve careful consideration. The departmental presentations in each Ops meeting consist of simply delivering the obvious fact that the budget is insufficient; the status of budget, expenditures, and obligations was not explained in conjunction with the level of operational performance. Performance review in terms of technology merely confirms the timeline by which the department can finalize a project to launch new technologies. In terms of human resources, the review is not linked with what the department actually achieves. Currently, it only gives departments an opportunity to report the number of vacancies in their office to top management rather than an opportunity to explain the rationale for immediate recruitment and to discuss why and how the vacancies affect their operational performance.

The results of analyzing data are presented in the meetings. Each department bases its own self-reported performance evaluation upon concrete data. The data analysis uses simple techniques like graphs, charts, and frequency tables. The "-Stat" suffix does not necessarily stand for statistics; instead, the results discussed in PhillyStat Ops are reported in plain language and with easily understandable methods to illustrate a trend, pattern, and status. Most data are ultimately analyzed in terms of whether or not target goals are met. A performance deficiency or a gap between the current


TABLE 2Examples of Specifying Performance Purposes

Philly311 Fleet management

Core missions • Provide the public with quick, easy access to all City of Philadelphiagovernment services and information while maintaining the highestpossible level of customer service.

• Support City departments and agencies in the delivery ofmunicipal services by ensuring that City vehicles andother automotive related equipment and services areavailable, dependable and safe to operate.• Assist agencies and department in improving service delivery by

allowing them to focus on their core missions and manage theirworkloads efficiently.

• Provide recommendation into ways to improve City governmentthrough accurate, consistent measurement and analysis of servicedelivery citywide.

Strategic directions • Collect data to enhance knowledge about the local communitiesthrough increased engagement and citizen feedback.

• Replace older, often repaired, equipment for new, morereliable technologically efficient vehicles.

• Implement new technology to improve the quality of informationdelivery by making it timely, accurate, transparent, and easy toaccess.

• Continue to sustain the current workforce throughrigorous training, while maintaining and improvingproductivity.

• Increase accountability for delivering customer servicecitywide.

• Create a high-performance organization.

• Develop thoughtful succession planning by evaluatingindividuals who would be successful in the department’smost important positions.

TABLE 3Examples of Performance Indicators

Philly311 Fleet management

Main data source • Philly311 database • Fleet Accounting Computer Tracking System (FACTS)• Survey

Operations • Accuracy rate of the knowledgemanagement database content (target: 80%)

• The status of projects (e.g., thebenchmarking project of industry database,the consolidation plan)

• The number of vehicles managed• Total number of vehicles grouped by major user

categories• Total parts and labor costs grouped by major user

categories• Median age of vehicles• Average percentage of availability by department• Vehicle repair turnaround time• Percentage of work orders completed citywide

Customer service • The number and average percentage of callsanswered

• The number of trained employees (administrative staff,team leaders, and supervisors)

• Telephone answer rate• The number and percentage of first call

resolution• The percentage that call takers accurately

enter in service requests• Average waiting time for a call

Finance and budget • The difference from target budget • The difference from target budget• Budget compared to the previous years • Budget compared to the previous years

Technology • The status of technology projects until thetarget kickoff date (pending, in progress, orcompleted)

• The status of technology projects until the target kickoffdate (pending, in progress, or completed)

Human resources • The number of fulltime employees • The number of fulltime employees• The difference between the current status

and target in terms of the number ofemployees

• The difference between the current status and target interms of the number of employees

Dow

nloa

ded

by [

Bru

nel U

nive

rsity

Lon

don]

at 1

5:22

04

Nov

embe

r 20

14

Page 8: Assessing Operational and Collaborative Performance Management: A Case Study of PhillyStat

520 NAM

status and target goal is identified and highlighted. The dataanalysis is a main part of each presentation at the meetings.Performance evaluation is reported with an explanation ofhow much current performance has been improved since thelast period (quarter) or over a longer period of time. To con-clude the data analysis, key challenges and key wins arereported.
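The quarterly review logic described above (compare each indicator against its target, note the change since the last quarter, and flag key wins and gaps) can be sketched as follows. The indicator names, targets, and figures here are illustrative placeholders, not the city's actual data.

```python
# Sketch of a quarterly target-gap check in the spirit of PhillyStat Ops reporting.
# All names and numbers below are hypothetical, for illustration only.

quarterly_report = {
    "knowledge base accuracy rate (%)": {"target": 80.0, "previous": 72.0, "current": 78.0},
    "first call resolution (%)": {"target": 70.0, "previous": 66.0, "current": 71.0},
}

def review(indicators):
    """Flag each indicator as a 'key win' (target met) or a gap to highlight."""
    results = {}
    for name, v in indicators.items():
        gap = v["target"] - v["current"]       # deficiency relative to the target
        trend = v["current"] - v["previous"]   # change since the last quarter
        results[name] = {
            "gap": round(gap, 1),
            "trend": round(trend, 1),
            "status": "key win" if gap <= 0 else "gap to address",
        }
    return results

summary = review(quarterly_report)
for name, r in summary.items():
    print(f"{name}: {r['status']} (gap {r['gap']}, trend {r['trend']:+})")
```

This is only a minimal sketch of the gap-and-trend framing; the actual meetings present the same comparison through graphs, charts, and frequency tables rather than code.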

Infrastructure

Behn's (2006, 2007, 2008a) observations of various PerformanceStat practices in U.S. governments and agencies found similar characteristics in terms of the meeting room and the technology. This technical aspect of PerformanceStat may affect the efficiency and effectiveness of reporting, presentation, and discussion. Additionally, the infrastructure is important to the implementation of the leadership strategy.

All PhillyStat meetings are held at the Municipal Service Building, but the actual session room may change with advance notice. The room format of PhillyStat Ops meetings is very different from that of Baltimore's original CitiStat. Whereas Behn (2006, 2007) identified the podium as the most obvious feature of the CitiStat room, the room for PhillyStat Ops does not have a podium. Instead, every presenter delivers the departmental quarterly report while seated. The CitiStat room is more like a typical presentation room, but the PhillyStat Ops room is equipped with a large table surrounded by about 15 chairs, with about 20 chairs for the audience at the opposite end of the table. The chairperson of the PhillyStat Ops meeting, the managing director, sits in the middle of one side of the rectangular table. PhillyStat staff and some key members of the Managing Director Office sit on the chairperson's side. The department head and the department's senior team sit on the other side of the table. A screen for the presentation is placed above the table. This table setting is better for free communication than the original CitiStat layout, which is designed for effective presentation.

Technologies other than a projector for presentation are not used in the meetings. PhillyStat staff is considering more active outreach through social media such as Facebook and Twitter, because PhillyStat itself can be a transparency initiative that shows and reveals what government is doing, even to citizens who do not typically attend meetings.

Meetings

This criterion is for assessing the meetings themselves. Both Ops and Outcomes sessions meet quarterly: Ops meetings are held each quarter for each department, and Outcomes meetings are held each quarter for each strategic goal. The meetings for both Ops and Outcomes are scheduled at least one or two months in advance. The PhillyStat staff coordinates and arranges the meeting schedule. Departments are notified of the schedule, and it is also posted on the PhillyStat homepage for public awareness and convenience. As previously stated, the city's managing director chairs the meetings, and a department head and department senior team members join them. The managing director, appointed by the mayor, serves as a professional manager in the manner of a chief executive officer, while the mayor as an elected administrator still holds power and authority. The mayor's office is not directly involved in this operational performance review; instead, the managing director communicates with the mayor about the results of the Ops meetings.

The number of participants varies by meeting but typically ranges between 10 and 20 people. Those other than the usual table members (other staff from the department and sometimes a small number of ordinary citizens) may sit in the chairs set out for the audience. The agenda discussed at a series of meetings is clearly set before each meeting begins. Monitoring and tracking performance focuses on the five core areas to be reviewed.

Each meeting shows the dynamics of human interaction. Behn's (2008a) observation of how the meeting is conducted in reality merits attention:

For PerformanceStat, the meeting can evolve into a game. The leadership plays gotcha, while the subunit participants respond by attempting to score debating points, devoting their time to anticipating and rebutting the gotcha questions. In the gotcha game, no one is attempting to actually improve performance. (p. 217)

In PhillyStat Ops, what Behn (2008a) observed at other CitiStats is only partially true. In Ops meetings, identifying inadequate performance did not necessarily lead to suggestions of feasible ways to actually improve performance. However, those from the Managing Director Office try to understand the operational difficulties in under-performing areas rather than simply pointing out where performance is lacking. Ops meetings allow each department to expose internal challenges to the leadership team, and thereby a common, shared understanding of key challenges is developed through the meetings.

Operational Capacity

A full-time director and full-time staff commitment to PerformanceStat is important to its basic operational capacity (Behn, 2006). PhillyStat staff works for the Managing Director Office. There are multiple deputy managing director positions in the city government, and one deputy managing director takes charge of PhillyStat; the actual title of the position is Director of PhillyStat. The whole PhillyStat team consists of full-time employees: a director and four analysts. They possess high-level skills and techniques for statistical data analysis.

The PhillyStat Director's role is basically to lead the PhillyStat team. The team's main responsibilities include arranging and organizing regular PhillyStat meetings, analyzing the departmental reports, reporting the analysis back to the departments, and reporting the analysis to the managing director and the mayor's office. Significantly, the PhillyStat Director and staff help the departments advance their performance indicator metrics in the five core functional areas by changing a diffused model into a centralized one that the executives can manage better. The PhillyStat staff advises each department to develop its metrics in a way that is more consistent with other departments, from the perspective of the entire city government.

Follow-Up

PerformanceStat should not discontinue after the meeting. Rather, post-meeting tasks are crucial for real performance improvement. Questioning, feedback, and follow-up should continue persistently and sometimes relentlessly (Behn, 2005). Behn's (2006) framework for a case study of CitiStat suggested the importance of assignments, by which the performance review team, as a consequence of the meeting, requires the reviewed department to make up the performance deficit with a new plan. The departmental performance is then reexamined in the following meeting.

PhillyStat Ops relies on a different follow-up mechanism. Ops meetings do not impose strictly specific formal assignments on the departments. At a series of quarterly meetings, each department reports any changes from previous outputs in terms of performance indicators. The iteration of reviews of each common functional area tends to weaken the need to push immediate assignments specific to each department. The Managing Director Office (PhillyStat staff) and each department agree on number-based target goals for each quarter. In this sense, following up entails a periodic, thorough check of whether the department is reaching its target from one quarterly meeting to the next.

A more meaningful characteristic of PhillyStat Ops is that, beyond monitoring and tracking number-based performance, Ops meetings help the city's executive manager (the managing director) better understand management within each department, develop the department's capability for performance management, and even change its organizational culture. Ops meetings facilitate direct communication and discussion between the executive and departments. That does not mean an easier workload or lighter responsibility for the departments. Though departments do not have their own internal PerformanceStat, they do have frequent internal meetings to prepare for the quarterly Ops meetings. In these internal meetings, each department makes an effort to better visualize and show the trend of change in operational performance over time. Frequent internal meetings within a department help develop a more effective way for the Managing Director Office to understand its operational performance.

ASSESSING THE COLLABORATIVE PERFORMANCE REVIEW

This section analyzes PhillyStat Outcomes on the basis of the nine components required for CollaborationStat. PhillyStat Outcomes meetings conduct the leadership team-driven performance review of cross-departmental collaborative activities. The meetings are a high-level review of the mayor's strategic goals and outcomes. The mayor or his chief of staff chairs while the city's core team members (the city solicitor, the communications director, the finance director, the managing director, and the deputy mayors) present. Outcomes meetings had been held twice for each strategic goal. This assessment is based on the presentation materials and video clips of all meetings held so far.

Common Purpose

Table 4 describes the mayor's strategic goals, outcomes, and the city government officials (by position title) who have primary accountability for the outcomes. The goals are shared by related agencies, departments, and even nongovernmental entities.

For instance, Table 5 shows the multiple collaborating organizations related to Goal 2. As in that example, other strategic goals are also shared by related organizations or departments. In the city's government, a common purpose is strengthened by partnerships among collaborating agencies and departments. Importantly, key internal departments of City Hall have established a service-level agreement. The agreement for joint service delivery and maintenance is based on commitment to a common purpose and shared outcomes.

Current Data

PhillyStat Outcomes meetings use outcome data to capture the ultimate, total impact of interdepartmental and interagency collaboration. A unique aspect of PhillyStat Outcomes is the use of service-level data aligned with the service-level agreement. Most services are not related to one single department, though a department may claim main responsibility for producing and delivering a service, and often functions as a primary channel for service requests.

TABLE 4
Strategic Goals for the City of Philadelphia

Goal 1: Philadelphia becomes one of the safest cities in America.
• Outcome 1: Adults and children are safer.
• Outcome 2: People feel safer.
• Outcome 3: Residents feel a greater responsibility to keep their neighborhoods safe.
Primary owner: Deputy Mayor for Public Safety

Goal 2: The education and health of Philadelphians improves.
• Outcome 1: Philadelphians are better educated.
• Outcome 2: Philadelphians are healthier.
Primary owners: Deputy Mayor for Health and Opportunity; Chief Education Officer

Goal 3: Philadelphia is a place of choice.
• Outcome 1: Philadelphia becomes more business friendly.
• Outcome 2: People choose to live and stay in Philadelphia.
Primary owner: Deputy Mayor for Planning and Economic Development

Goal 4: Philadelphia becomes the greenest and most sustainable city in America.
• Outcome 1: Our water and air are cleaner.
• Outcome 2: We use less energy.
Primary owners: Deputy Mayor for Transportation and Utilities; Deputy Mayor for Environment and Community Resources

Goal 5: Philadelphia government works efficiently and effectively, with integrity and responsiveness.
• Outcome 1: Our government has integrity.
• Outcome 2: Our government works efficiently and effectively.
• Outcome 3: Our government is responsive.
Primary owners: Deputy Mayor for Administration and Coordination; Director of Finance; City Solicitor

TABLE 5
The Organizations Involved in Strategic Goal 2

Outcome 1 (main department/program: Mayor's Office of Education)
• The School District of Philadelphia
• The Department of Human Services
• Philadelphia Youth Network
• College-Ready Committee of the Philadelphia Council for College and Career Success
• PhillyGoes2College Office
• Campus Philly (nonprofit)

Outcome 2 (main department/program: the Health and Opportunity Cluster)
• The Department of Public Health
• The Office of Supportive Housing
• The Department of Behavioral Health and Intellectual disAbility Services
• The Department of Human Services

The data source for an analysis varies with the strategic goals and expected outcomes. Among the strategic goals, Philly311 is one of the most important data sources for Goal 5 (efficient and effective government). The city government taps into the highly effective marriage of the 311 nonemergency service contact system to PerformanceStat. Philly311's original mission is to provide the public with quick, easy access to all City of Philadelphia government services and information. Now, Philly311 goes beyond its primary function as a centralized channel for service requests by allowing agencies and departments to focus on their core missions and efficiently manage their workloads. Philly311 now offers outcome-level data in addition to operational-level data.

The most significant concern for PhillyStat is each department's willingness to share data. Without that willingness, CollaborationStat-type meetings are no better than symbolic, nominal political rhetoric. Developing the metrics of outcome-level performance indicators is also central to increasing the impact of PhillyStat Outcomes; the outcome-level metrics need further sophistication.

Cause-and-Effect Theory

PhillyStat Outcomes meetings have used a highly hierarchical model to take causes and effects into consideration. The meetings base their discussion on public program evaluation, which examines the impact of a policy trigger. Table 6 presents an example of the cause-and-effect scenario related to Goal 2 (Philadelphians are better educated). Target goals to be achieved by 2015 are set in terms of two simple measures (the high school graduation rate and the percentage of 4-year college graduates). Various programs and initiatives are the components expected to make an impact on the education level of Philadelphians. The citywide programs to improve educational attainment have to be a collaborative effort among government agencies and also with nonprofits and schools. Those programs are supposed to help the city achieve its target goals and ultimately contribute to increasing the number of better-educated Philadelphians. This is only an example; other strategic goals are also specified into cause-and-effect theories (the connections among programs, strategies, and target goals) in Outcomes meetings.

TABLE 6
The Cause-and-Effect Scenario for Making Philadelphians Better Educated

Measure 1: Six-year high school graduation rate
Target 1: Increase the high school graduation rate to 65% by 2012 and 80% by 2015.
Strategy 1: Provide support to students who are not on track to graduate.
• Re-engagement Centers will help students get back on track to graduate.
• Accelerated high schools will provide students who are at a high risk of failing to graduate with an expedited path to graduation.
Strategy 2: Identify early indicators and intervene to prevent students from dropping out.
• Project U-Turn, a citywide collaborative campaign (including representatives of the School District, City agencies, foundations, youth-serving organizations, parents, and young people), focuses public attention on Philadelphia's dropout crisis and develops strategies against it.
• The Cross-Sector Truancy Reduction Plan (a partnership among the Courts, the Department of Human Services, and the School District) will increase prevention and early intervention in the schools.

Measure 2: Percent of residents with a four-year college degree or higher
Target 2: Increase the percentage of residents with a 4-year college degree to 25% by 2012 and 36% by 2015.
Strategy 1: Collaborative efforts to increase the college attainment level.
• The College-Ready Committee of the Philadelphia Council for College and Career Success will support youth attainment of postsecondary education.
• The Mayor's Returning to Learning Partnership is to help more City employees reach their postsecondary goals.
• The PhillyGoes2College initiative connects Philadelphians of all ages to the resources they need to attain a college degree through the dissemination of information, workshops for students and parents, and college-related events.
• The Graduation Coach Campaign ensures that all youth in Philadelphia have access to a caring, supportive, knowledgeable adult who can coach them through high school, college, and their careers.
Strategy 2: Increase the number of college graduates who stay in Philadelphia after graduation.
• Campus Philly fuels economic growth by encouraging college students to study, explore, live, and work in the Greater Philadelphia region.
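The measure-target-strategy structure just described can be represented compactly as data, which is how one might check milestone attainment for any goal. The following sketch mirrors the published Goal 2 targets; the current value passed in is hypothetical.

```python
# Illustrative encoding of the cause-and-effect structure in Table 6:
# each measure carries milestone targets and the strategies expected to move it.
# The targets mirror the published figures; any "current" value is hypothetical.

goal2 = {
    "six-year high school graduation rate (%)": {
        "targets": {2012: 65, 2015: 80},
        "strategies": ["support off-track students", "early-indicator intervention"],
    },
    "residents with a 4-year college degree (%)": {
        "targets": {2012: 25, 2015: 36},
        "strategies": ["raise college attainment", "retain graduates in the city"],
    },
}

def on_track(measure, year, current_value):
    """Compare a current value against the milestone target for a given year."""
    target = goal2[measure]["targets"][year]
    return current_value >= target

# A hypothetical 2012 graduation rate of 61% would fall short of the 65% milestone.
print(on_track("six-year high school graduation rate (%)", 2012, 61))
```

This is only a sketch of the hierarchy (goal, measure, target, strategy); the actual Outcomes discussion, as noted below, is more about missions and visions than mechanical target checks.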

Interpersonal Trust

Since PhillyStat Outcomes necessarily entails cross-departmental efforts, interpersonal trust is an important factor affecting the process and results of Outcomes meetings. In office since 2008, the mayor, in his second term, has emphasized the importance of teamwork, partnership, and collaboration across departments and agencies. Cross-departmental collaboration occurs in the process of preparing for Outcomes meetings. The open meetings are a result of working together on how citywide outcomes should be presented in terms of the city government's achievement of strategic goals, rather than the achievement of one particular department. In PhillyStat Outcomes, individual-level trust contributes to department-level coordination and collaboration. The mayor's core team, appointed at the beginning of his second term, has a good teamwork spirit that can be translated into trust and respect among the members. Interpersonal trust is fundamental to the organic partnership among government agencies, departments, and nongovernmental organizations working to achieve the mayor's strategic goals for the whole city. PhillyStat Outcomes can be seen as a collective management tool driven by the mayor's leadership, thereby facilitating interdepartmental collaboration and boosting interpersonal trust and respect among the key members of the meetings.


Experience with PerformanceStat

During the temporary suspension of PhillyStat for reformation, the key leadership team learned from practices in other PerformanceStats. The new version of PhillyStat (Ops plus Outcomes) was created through reconsideration of prevalent CitiStats across the country and adaptation of current exemplars to the city government. In preparation for the retooling of PhillyStat in 2010, the mayor, the managing director, and other key decision-makers listened to advice from those within city government who had experience with CitiStats in other cities. The PhillyStat Director and staff members also actively shared their experiences with AgencyStat within the city (e.g., SchoolStat in Philadelphia and the Police Department's performance management tool) and with CitiStats in other metropolises.

A unique feature of PhillyStat Outcomes is that experience with PhillyStat Ops can be shared with PhillyStat Outcomes. Some executive managers (the managing director and his deputy managing directors) have simultaneous experience with both PhillyStat Ops as PerformanceStat and PhillyStat Outcomes as CollaborationStat. In addition, PhillyStat staff who take part in both meetings can offer suggestions to the leadership team, based on lessons derived from their experiences in the two different levels of performance review.

Convenient Forum for Meetings

Behn (2010, p. 451) claimed that regular CollaborationStat sessions create repeated interaction. Outcomes meetings are held at the city's Municipal Service Building, and in this respect there is little difference between Ops and Outcomes meetings. The people involved (the mayor's office and core team members) meet quarterly with respect to each strategic goal. PhillyStat staff arranges the meeting schedule and posts the meeting dates one to two months in advance. The schedule rarely changes, but some meetings have moved to later dates with additional notice. The number of participants in Outcomes meetings, usually between 20 and 25, is greater than in Ops meetings; 10 to 15 chairs are available for the audience.

In general, the mood of the meetings is communicative. The meetings start with the mayor's emphasis on his broad goals for the progressive city. After the mayor's short speech, the meetings do not typically follow a mechanical presentation-then-comment, question-and-answer format. Presentation handouts are distributed to people at the table and in the audience. Everyone is seated around the long table, and participant discussion is moderated by the mayor. The Outcomes meetings are more like a forum for in-depth discussion of the given agendas. The conversation covers things that have happened and are happening in the city, and the mood of discussion is usually enthusiastic.

The high-level performance review session often continues regardless of what the presentation (prepared by the primary owner of each goal) technically delivers in number-based data, because the discussion is more about missions and visions than about checking whether or not the outcome targets have been met, though the latter is certainly addressed.

Operational Impact

PhillyStat Outcomes may bring about two possible operational impacts. The first possibility is that the Outcomes meetings could directly influence operations in each department and agency. The other possibility is that Outcomes meetings could influence Ops meetings. However, the city has not yet confirmed any direct connection between outcome-level performance and operational performance. A possible reason is that in most departments, a nexus between citywide outcomes and daily departmental operations is not clear. Philly311, however, makes a strong link between outcome-level performance review (Goal 5, in terms of more efficient, effective, and responsive customer services) and the agency's core operational functions (resolution of service and information requests).

Champions and Sponsors

PhillyStat Outcomes has gained much support from external champions and internal sponsors. This support is a result of the way in which PhillyStat began: with the mayor's strong leadership. The mayor and his suite of managers are champions of PhillyStat. The mayor and the former managing director introduced PhillyStat, based on their ambition to ensure that the City of Philadelphia runs at optimal performance and on the leadership team's need for a powerful performance tracking tool. The current managing director renovated PhillyStat by augmenting it with the Outcomes review session. The executives themselves have been involved in designing and structuring PhillyStat. For PhillyStat Outcomes, the mayor himself is dedicated to all Outcomes meetings that he is assigned to chair. The mayor, the managing director, and other core members try to understand citywide problems and challenges in order to achieve their strategic goals. In addition to internal support, PhillyStat Outcomes continues to gain growing attention from the public and local media.

Continuity of Leadership

A continuity of strong leadership is important to maintaining CollaborationStat. The mayor is the most powerful champion and sponsor of PhillyStat Outcomes. However, there is a concern about the system's dependence on the current mayor, whose position will eventually be taken by another individual. Changing leadership means that the continuation of PhillyStat Outcomes may not be guaranteed in the successor's administration. Behn's (2010, p. 453) speculation may be true that CollaborationStat risks losing support from the top in the absence of continuously strong leadership and support for the program.

Outcomes review meetings are not used for political bargaining; rather, they are a series of public managers' in-depth discussions of the city's development and revitalization. For continuous efforts to lead to the progress of a city, Behn (2005) underscored experimentation, learning, and the institutional memory of a city's top executives. The institutional memory of PhillyStat Outcomes needs to be transmitted to the next term, no matter who takes over the administration. In turn, transmitting the memory and the system to the succeeding administration is the incumbent mayor's responsibility.

FURTHER DISCUSSION AND CONCLUSIONS

This paper assessed the capacities of PhillyStat both as PerformanceStat and as CollaborationStat. In general, PhillyStat has more than the basic capacities for reviewing, discussing, and monitoring operational and outcome-level performance in terms of the assessment criteria. The assessment revealed unique features of PhillyStat that make it look different from other CitiStats. Based on those unique features, this paper suggests the following four implications for cities and agencies that may consider kicking off PerformanceStat, or that may implement improvements to an existing model.

First, the "-Stat" approach, one of the trendy strategies for performance review, reflects the way a jurisdictional government defines the focus and characteristics of performance review. PerformanceStat programs have recently continued to expand and evolve (Thornburgh, Kingsley, & Rando, 2010, p. 6). PhillyStat notably shows this new expansion and evolution of PerformanceStat: beyond tracking department-level operational performance toward reviewing the outcomes of efforts to achieve broad strategic goals. PhillyStat Outcomes provides a new opportunity to analyze and monitor how the city government as a whole contributes to the progress of the city. The new process does not shrink the value of operational performance review. The Outcomes meetings are not simply an addition to the extant CitiStat-type performance management tool; rather, they create a new dimension of citywide performance management. PhillyStat Outcomes is aligned with CollaborationStat, which, Behn (2010) suggested, is a tool to review the performance of cross-departmental collaborative programs and initiatives. Cities that are considering CollaborationStat need to pay attention to the various enablers of PhillyStat Outcomes. The first and foremost enabler is the mayor's strong leadership and enthusiastic commitment. In addition to his championing and sponsoring, the whole leadership team (the mayor's office and the Managing Director Office) has communicated frequently with all related departments and agencies. The service-level agreement is an important factor enabling effective outcome-level performance review. As a result, the whole city government's performance can be evaluated at the service level, not only at each department's operational level.

Second, there are various views on government performance, and in particular there can be considerable incongruence between administrative perspectives and the observations of nongovernmental parties (academics, journalists, civic groups, and the general citizenry). The gap between operational performance and citizen perception, or between indicator-based outputs and perceived practical outcomes, is hard to close: see, for example, the article titled "Crime experts question NOPD stats that paint New Orleans as a safe city with a murder problem" (http://www.nola.com/crime/index.ssf/2013/05/new_orleans_crime_stats_analys.html). Conspicuous improvements in operational performance indicators do not always guarantee a level of city government performance with which citizens are satisfied. This may be one of the usual weaknesses inherent in existing PerformanceStats (in terms of "whose performance?" and "performance for whom and what?"). Aligning a city's operational goals with its strategic ones is the way to bridge operational performance and strategic performance, as PhillyStat does by connecting its PerformanceStat (performance review of day-to-day operation) with CollaborationStat (performance review of ultimate outcomes). In regard to this perceived gap, another issue is the decrease in what the mature stage of the "-Stat" approach (for some agencies or jurisdictions) may offer. In the earlier days of the approach, CompStat, which successfully reduced serious crime in New York City and cut its crime rate in half, earned compliments from the public (Eterno, 2010; Maple, 1999; Timoney, 2010; Weisburd, Mastrofski, Greenspan, & Willis, 2004; Willis, Mastrofski, & Weisburd, 2003; Willis, Mastrofski, Weisburd, & Greenspan, 2003), and Baltimore's CitiStat produced dramatic improvements in city services and saved $350 million (Bens, 2005; Perez & Rushing, 2007; Schachtel, 2003; Thornburgh et al., 2010).
What about in the second decade of the 21st century? What the “-Stat” approach brings to some governments may now lie in the plateau stage of the S-shaped learning curve (slow beginning, steep acceleration, then plateau), while beginners (agencies or jurisdictions new to the approach) may get as much as, or more than, they expected. Even for the former, the “-Stat” approach still deserves attention. The approach was recognized as a powerful solution to problems inherited from traditional bureaucracy, which did not prioritize its core mission (service delivery), did not set sufficiently high performance standards, and did not record the information that public managers would need to improve it meaningfully (Bens, 2005; Schachtel, 2003; Thornburgh et al., 2010, p. 1).

Third, the “-Stat” approach can be considered a collective management tool as well as a performance management tool and a leadership strategy. Data-driven performance management is not the most important characteristic of PhillyStat. More importantly, PhillyStat leads to organizational culture change through collective management. The original CitiStat is an efficient and effective tool that monitors and tracks operational performance based on quantifiable data. PhillyStat goes beyond data-driven performance review. It enables and empowers key people in each department to communicate with the executive-level leadership team about the department’s current performance, future directions, and even its challenges. The PhillyStat meetings involve conversations about departmental achievements and difficulties. Hence, PhillyStat becomes a communication tool that helps the leadership team understand both operational management and the process of achieving strategic goals.

Last, the “-Stat” approach can ultimately contribute to building and developing the capabilities of departments and agencies. The leadership team of PhillyStat informs city departments of how current performance metrics can be adapted to the whole city’s perspective. PhillyStat allows each department and agency to understand and consider the views of its partners and collaborators. PhillyStat thus becomes a venue for organizational learning.

Accordingly, PhillyStat triggers significant change in the city’s performance management. This change brings more collaborative, communicative, and enthusiastic management to city government. PhillyStat is a tool to change managers’ attitudes toward performance management rather than a tool to directly improve performance, even though it tracks the achievement of target goals and can sometimes suggest a better way to reach a data-based target. The Philadelphian version of CitiStat is an innovative way to enhance the whole city’s performance by changing its management style. PhillyStat offers other city governments the possibility of creating an unprecedented “-Stat” approach: PerformanceStat plus CollaborationStat. Yet its success in other cities will depend on various factors and conditions.

REFERENCES

Agranoff, R., & McGuire, M. (2003). Collaborative public management. Washington, DC: Georgetown University Press.

Bardach, E. (1998). Getting agencies to work together: The practice and theory of managerial craftsmanship. Washington, DC: The Brookings Institution.

Behn, R. D. (2004). Performance leadership: 11 better practices that can ratchet up performance. Washington, DC: IBM Center for the Business of Government.

Behn, R. D. (2005). The core drivers of CitiStat. International Public Management Journal, 8(3), 295–319.

Behn, R. D. (2006). The varieties of CitiStat. Public Administration Review, 66(3), 332–340.

Behn, R. D. (2007). What all mayors would like to know about Baltimore’s CitiStat performance strategy. IBM Center for the Business of Government. Retrieved from www.irep-irff.org/doc/anticr/BehnReportCiti.pdf

Behn, R. D. (2008a). Designing PerformanceStat: Or what are the key strategic choices that a jurisdiction or agency must make when adapting the CompStat/CitiStat class of performance strategies? Public Performance & Management Review, 32(2), 206–235.

Behn, R. D. (2008b). The seven big errors of PerformanceStat. Policy Briefs. Cambridge, MA: Rappaport Institute for Greater Boston, Taubman Center for State and Local Government, John F. Kennedy School of Government, Harvard University. Retrieved from http://www.hks.harvard.edu/thebehnreport/Behn,%207PerformanceStatErrors.pdf

Behn, R. D. (2010). Collaborating for performance: Or can there exist such a thing as CollaborationStat? International Public Management Journal, 13(4), 429–470.

Bens, C. (2005). CitiStat: Performance measurement with attitude. National Civic Review, 94(2), 78–80.

Bryson, J. M., Crosby, B. C., & Stone, M. M. (2006). The design and implementation of cross-sector collaborations: Propositions from the literature. Public Administration Review, 66(S6), 44–55.

Bunge, M. (2004). How does it work? The search for explanatory mechanisms. Philosophy of the Social Sciences, 34(2), 182–210.

Eterno, J. A. (2010). The NYPD’s CompStat: Compare statistics or compose statistics. International Journal of Police Science & Management, 12(3), 426–449.

Flyvbjerg, B. (2006). Five misunderstandings about case study research. Qualitative Inquiry, 12(2), 219–245.

Gullino, S. (2009). Urban regeneration and democratization of information access: CitiStat experience in Baltimore. Journal of Environmental Management, 90(6), 2012–2019.

Hatry, H., & Davies, E. (2011). A guide to data-driven performance reviews. Washington, DC: IBM Center for the Business of Government. Retrieved from www.businessofgovernment.org/sites/default/files/A%20Guide%20to%20Data-Driven%20Performance%20Reviews.pdf

Maple, J., & Mitchell, C. (1999). The crime fighter. New York, NY: Doubleday.

Mazerolle, L., Rombouts, S., & McBroom, J. (2007). The impact of COMPSTAT on reported crime in Queensland. Policing: An International Journal of Police Strategies & Management, 30(2), 237–256.

McGuire, M. (2006). Collaborative public management: Assessing what we know and how we know it. Public Administration Review, 66(S6), 33–43.

Perez, T., & Rushing, R. (2007). The CitiStat model: How data-driven government can increase efficiency and effectiveness. Center for American Progress. Retrieved from www.americanprogress.org/issues/2007/04/citistat.html

Schachtel, M. R. B. (2003). CitiStat and the Baltimore neighborhood indicators alliance: Using information to improve communication and community. National Civic Review, 90(3), 253–266.

Thomas, C. W. (2003). Bureaucratic landscapes: Interagency cooperation and the preservation of biodiversity. Cambridge, MA: The MIT Press.

Thompson, A. M., & Perry, J. L. (2006). Collaboration processes: Inside the black box. Public Administration Review, 66(S6), 20–32.

Thornburgh, D. B., Kingsley, C., & Rando, M. (2010). Smart cities: PerformanceStat at 15. Philadelphia, PA: Fels Institute of Government.


Timoney, J. F. (2010). Beat cop to top cop. Philadelphia, PA: University of Pennsylvania Press.

Vangen, S., & Huxham, C. (2003). Nurturing collaborative relations: Building trust in interorganizational collaboration. Journal of Applied Behavioral Science, 39(1), 5–31.

Weisburd, D., Mastrofski, S. D., Greenspan, R., & Willis, J. J. (2004). The growth of Compstat in American policing. Washington, DC: Police Foundation. Retrieved from www.policefoundation.org/pdf/growthofcompstat.pdf

Willis, J. J., Mastrofski, S. D., & Weisburd, D. (2003). CompStat in practice: An in-depth analysis of three cities. Washington, DC: Police Foundation. Retrieved from www.policefoundation.org/pdf/compstatinpractice.pdf

Willis, J. J., Mastrofski, S. D., Weisburd, D., & Greenspan, R. (2003). CompStat and organizational change in the Lowell police department: Challenges and opportunities. Washington, DC: Police Foundation. Retrieved from www.policefoundation.org/pdf/compstat.pdf
