

Draft version

Please do not cite or circulate without the author’s permission! Comments welcome!

The Paradigmatic “Midlife Crisis”: Organizational Learning and

Legitimacy at the International Monetary Fund since 2001

Paper presented at the 6th ECPR General Conference

Reykjavik, 25–27 August 2011

Matthias Kranke

Otto Suhr Institute of Political Science

Free University of Berlin, Germany

14195 Berlin, Germany

[email protected]

Abstract

The International Monetary Fund (IMF) finally decided to put an evaluation office in charge of organizational learning. Since the Independent Evaluation Office (IEO) became operative in 2001, it has completed 19 evaluations of IMF policies. This paper studies IMF staff responses to IEO recommendations in all of these key reports. Assuming that the Fund’s policy is staff-driven, I employ a coding technique based on focused document analysis in order to determine the level of “compatibility” in organizational learning. The empirical evidence obtained from this analysis reveals a pattern of low conflict that has persisted from the inaugural IEO evaluation in 2002 up until the latest one completed in mid-2011. This finding leads to two successive arguments from a procedural perspective that is grounded in constructivist thinking: first, the IEO’s main contribution to learning lies in generating, accumulating, and storing “encoded” knowledge; and second, the IEO’s activities can lend considerable output legitimacy to IMF operations, thereby at least partly addressing its oft-decried “legitimacy deficit.” These two arguments are illustrated with anecdotal evidence from the IEO evaluation of structural conditionality. I conclude that IEO-inspired learning is organizational learning for the sake of learning and legitimacy.

Keywords

International Monetary Fund (IMF); Independent Evaluation Office (IEO); organizational learning; legitimacy; evaluation reports; IMF staff


Introduction

We hope to have, before long, a board of fact,

composed of commissioners of fact, who will

force the people to be a people of fact, and of

nothing but fact. You must discard the word

Fancy altogether.

—Gentleman, in classroom, in Hard Times by

Charles Dickens

“Commissioners of fact” are hard to come by. In many ways, we as human beings regularly

seek out those among our peers “certified” to have acquired relevant general knowledge or

special expertise—be they doctors, mechanics, or teachers. But with new experiences in our

lives, we come to realize that there is no such thing as an infallible commissioner of fact: a doctor

might fail to choose the proper treatment for a patient; a mechanic to repair a car; and a

teacher to answer a student’s question. Even more worryingly perhaps, we find that

facts are (almost) invariably subject to interpretation. Specialized training does not prevent

experts in their respective fields from disagreeing with one another: doctors on the best

treatment; mechanics on the best repair; and teachers on the best answer.

Understanding the relativity of “truth,” we direct much of our energy to weighing

conflicting evidence. Learning, then, centers less on knowing it all than on knowing how to

evaluate the past in order to draw lessons for the present and future. The evaluation of

individual and collective performances is a precondition for learning. Learning takes place in

institutionalized settings for nursery, primary and secondary, and tertiary education, ranging

from kindergarten to various forms of schools to vocational training and university

education. There has been an evident trend in many societies not only toward an expansion

of institutionalized learning for children and young adults but also toward lifelong learning.

This trend is increasingly reflected in the efforts of international organizations to evaluate

past policies. Learning on the basis of evaluations is certainly more complicated in collective

entities such as the World Bank or the International Monetary Fund (IMF) than it is for an

individual. Formalized performance evaluation—be it undertaken by the Independent

Evaluation Group (IEG)1 at the World Bank or by the Independent Evaluation Office (IEO) at

the IMF—represents a very obvious, though by no means the sole, platform for organizational

learning. But what do we see when unpacking learning in international organizations? How

do we even “see” organizational learning in the first place? Even though questions about

This preliminary paper has benefited from personal interviews—four in particular—during a brief stint at the IMF and the World Bank in Washington, DC, in March 2011. I am truly grateful to all the interviewees for their time and effort, as well as for follow-up e-mail conversations in one case. Discussions with Daniel F. Schulz and Christof Mauersberger during the drafting phase helped me greatly to clarify my thinking on the topic.

1 The World Bank launched its internal evaluation unit as early as 1973 under the name “Operations Evaluation Department” (OED). It was renamed IEG in 2004, shortly after the IMF had set up the IEO.


policy reform abound after the recent global financial crisis, it is still instructive to look at

the time before the crisis. The establishment of the IEO, which commenced its work in 2001,

is a case in point. Its role as a performance evaluator within the IMF offers us clues as to how

the entire organization has tackled policy and governance challenges in the years prior to, as

well as during, the crisis. Taking the IEO as the unit of analysis, I seek to develop a contextual

understanding of organizational learning at the IMF.

The large literatures on organizational learning and international organizations still

burgeon in separate, barely overlapping academic biotopes. This is unfortunate given the

huge potential for cross-fertilization. While organization studies have long been dominated

by disciplines such as management, sociology, or psychology (see, for example, Dodgson

1993), few studies in International Relations or International Political Economy have focused

on how international organizations learn from the past. With publications on modes of

“learning” and “adaptation” in the 1990s, Ernst B. Haas and Peter M. Haas rank among those

who have contributed to bridging the gap (Haas 1990; Haas and Haas 1995). Their works, however,

have not served to spark a “learning turn” in either IR or IPE. Despite growing general

scholarly preoccupation with the politics of performance of international organizations (see

Gutner and Thompson 2010), the literature on the IEO itself remains exceptionally sparse

even a decade after its inception. A notable exception is Catherine Weaver’s (2010) recently

published study that process-traces the IEO’s creation on the basis of archival documents

from the IMF, complemented by personal interviews.

This paper explores the role of the IEO in organizational learning at the IMF. In particular,

it speaks to the thorny issue of assessing the impact of IEO reports on learning processes at

the IMF that might induce policy reforms, as well as changes to internal and external work

routines (Weaver 2010: 379–380). To this end, I cull evidence from the 19 completed

evaluations since 2001 through a coding technique based on focused document analysis.

Adopting the plausible proposition that the Fund’s policy is staff-driven, I code IMF staff

responses to the recommendations put forth in the IEO evaluation reports in order to

determine the level of “compatibility” in organizational learning. The empirical evidence

reveals a pattern of low conflict that has persisted from the inaugural IEO evaluation in 2002

up until the latest one completed in mid-2011.

This finding leads to two successive arguments from a procedural perspective that is

grounded in constructivist thinking: first, the IEO’s main contribution to learning lies in

generating, accumulating, and storing “encoded” knowledge; and second, the IEO’s activities

can lend considerable output legitimacy to IMF operations, thereby at least partly addressing

its oft-decried “legitimacy deficit.” The former argument pertains more to the IMF-internal

role of the IEO, the latter more to the Fund’s “external” relations with member states. I

illustrate the two arguments with anecdotal evidence from the IEO evaluation of structural

conditionality. Even though a few broader insights may be derived from my paper, I advance


a contextual understanding that is not readily transferable to other organizations but specific

to the setting in which learning takes place at the IMF.

In the next section, I offer a short—and indeed selective—organizational biography of the

IMF on the verge of its 70th anniversary, focusing on performance evaluation in the IMF to

put the creation of the IEO into perspective. The subsequent section discusses the coding

results to substantiate the first argument about the IEO as a semi-internal unit for “encoded”

knowledge before I elaborate on the IEO’s legitimacy function for the IMF. The concluding

section reflects on the findings and considers the implications for the IMF in particular and

for learning processes in international organizations in general.

A Short Biography of the IMF and the IEO’s Mandate for Evaluation

Performance Evaluation at the IMF before and since 2001

The IMF will celebrate its 70th anniversary in 2014. At the time of the foundation of the IMF

and the World Bank at the Bretton Woods Conference in 1944, performance evaluation did

not figure on the political agenda of the two organizations specially designed to guarantee a

more stable international economic order. Yet after initial calls for the creation of a

performance evaluator fell on deaf ears within the IMF in the early 1970s, while other

organizations installed such bodies,2 the Fund continued to rely mostly on the Policy

Development and Review Department (PDR; now Strategy, Policy, and Review Department,

SPR), but also on other departments, to review country-specific programs and other issues on

an ad hoc basis (Weaver 2010: 368).

It was only in the early 1990s that the debate over an independent evaluation office

gathered steam. The twists and turns of the debate, which might be divided into three “waves,”

are extremely well documented in Weaver’s (2010: 368–374) study, and the Report of the

External Evaluation of the Independent Evaluation Office, commissioned by the IMF, also

touches upon some of the developments toward the establishment of the IEO (Lissakers,

Husain, and Woods 2006: 6). Thus, it is sufficient to note here that the IMF opted for the

creation of an independent evaluator and against the reliance on self-evaluation (typically at

a departmental level) and/or on external ad hoc evaluations by experts. The IEO finally

became operative in 2001 following a decision in April 2000 to create an independent body

for the evaluation of IMF policies (Weaver 2010: 366–367, 373).

The launch of the IEO marked a hitherto unprecedented degree of organizational

“maturity” of the IMF as a whole. Put in biographical terms, its “puberty” lasted much longer

than the World Bank’s, which “matured”—that is, institutionalized learning—already in 1973

2 Apart from the World Bank’s IEG, Weaver (2010: 366, fn. 362) cites the example of the Asian Development

Bank, which launched its OED in 1978.


with the establishment of what is now the IEG.3 Unlike humans, however, “young” international

organizations cannot draw on benign parenting or compulsory schooling during their

“childhood.” Instead, they have to process demands—both external and internal—instantly to

assert themselves in a competitive political environment. An organization as critical to the

monetary and fiscal health of countries as the IMF can ill afford not to learn at all; even in the

absence of a body officially in charge of learning, the IMF had to draw lessons from past

failures and successes, albeit in a less formalized and more spontaneous manner. The IMF’s

coming to terms with its paradigmatic “midlife crisis” experience of the 1997–98 Asian

financial crisis coincided with, and lent additional force to, the ongoing debate over the

necessity and format of performance evaluation (Weaver 2010: 371–372). The next

subsection presents the IEO’s mandate and the primary instrument through which it fulfills

the specific tasks.

The Core of the IEO’s Mandate: Evaluation Reports on Selected Topics

Since 2001, organizational learning at the IMF has proceeded both in a formalized manner

and on a permanent basis. Under its mandate, or “mission” in official parlance, the IEO’s

“Terms of Reference” list four specific tasks:

The Independent Evaluation Office (IEO) has been established to systematically conduct

objective and independent evaluations on issues, and on the basis of criteria, of relevance

to the mandate of the Fund. It is intended to serve as a means to enhance the learning

culture within the Fund, strengthen the Fund’s external credibility, promote greater

understanding of the work of the Fund throughout the membership, and support the

Executive Board’s institutional governance and oversight responsibilities (IMF 2001,

emphases added).

The publication of IEO evaluations serves as the primary means for fulfilling these tasks.4

The IEO’s evaluation practice is systematic in three senses: (1) coverage, (2) periodicity, and (3)

procedure. First, the reports address controversial issues that are central to the Fund’s

mandate. They cover broad areas ranging from program design to technical assistance, from

surveillance to governance, as Table 1 (below, next section) shows. On paper, the IEO enjoys

full discretion in the selection of topics for evaluation because while the Board may provide

comments, it cannot ultimately prevent the evaluation of a selected topic; in reality, however,

there exist channels through which the Executive Board can indirectly influence the choice of topics.

Given its budgeting power and its right to appoint and dismiss the IEO Director, the Board

can send “subtle” signals to the IEO (Weaver 2010: 376–377). Topics can be analyzed from a

3 That is not to say that the World Bank’s policies were more effective than the IMF’s just because it

conducted evaluations. Still, the difference in timing of institutionalized learning is obvious. For more detail on this, see another (draft) paper by Weaver (n.d.).

4 Other less fundamental publications include annual reports and so-called issues papers in preparation for the final evaluation reports.


comparative (cross-country) perspective,5 whereas country-specific evaluations are no longer

seen as falling under the IEO’s remit after objections by concerned (borrowing) member

states to the potential revelation of market-sensitive information, especially when programs

are still ongoing (Weaver 2010: 378).

Second, the IEO completes evaluations on a regular basis. Since it began its operations in

2001, it has completed 19 evaluations, the first one in 2002. The annual output thus averages

nearly two evaluations, though the intervals between the completed evaluations are variable.

At first glance, this average might look disappointingly low compared with, for instance, the

IEG’s voluminous output, but it reflects both the limited number of IEO staff (currently 12

including the Director, supplemented with external consultants on an ad hoc basis) and the

belief in increased impact of evaluations through a concentration of resources on a few key

publications (Interview 2011a,b,c). Currently (as of August 2011), the IEO is working on

three further evaluations: Learning From Experience At The IMF: An

IEO Assessment of Self-Evaluation Systems, The Role of the IMF as Trusted Advisor, and

International Reserves: IMF Advice and Country Perspectives. The IEO invites suggestions

for future evaluations; its work plan from August 2010 indicates surveillance, technical and

design, governance, and interactions with external stakeholders as possible topics (IEO

2010b).

Third, IEO evaluations follow an established institutional procedure, from which

deviations are rare. The typical IEO evaluation report is processed in the following sequence:

(1) The IEO produces an evaluation report on a selected topic and makes some

recommendations for future IMF policies and work routines; (2) the IMF Managing

Director issues a very brief statement, commending the IEO for its report and soliciting the

views of staff and the Executive Board; (3) IMF staff responds to the overall report and the

specific recommendations laid out therein; (4) the IEO comments on the statement from

management and the response from staff; and (5) the Acting Chair recaps the Executive

Board discussion in a so-called summing up.6 High-ranking IMF officials or country

representatives sometimes offer additional statements when appropriate, as did Timothy

Geithner, then head of the IMF’s Policy Development and Review Department, in a memorandum to the IEO Director on the

issue of capital account crises (IEO 2003b: 155) or the Argentinean Governor Roberto

Lavagna in a statement to the Board on the evaluation of the Fund’s role in Argentina (IEO

2004b: 115–119). In one instance, the evaluation of technical assistance (IEO 2005a), IMF

staff responded to none of the recommendations made by the IEO. It remains unclear why

there was no response from IMF staff on this particular evaluation. As one of my interviewees

5 More information can be found at <http://www.imf.org/external/np/ieo/gai.htm>.

6 This pattern is normally reflected in the published final document that contains not only the IEO report but also all of the other above-listed reactions to it. For whatever reason, a few reports on the IEO’s website are disaggregated into their components so that, for instance, the staff response is not attached to the main document of the evaluation.


emphasized, this report has to date been “the most poorly responded to of all the evaluations”

(Interview 2011a).

In January 2007, the Executive Board introduced measures to counteract a lack of

response. With the requirement of Management Implementation Plans (MIPs) and Periodic

Monitoring Reports (PMRs), it instituted a two-layer “system for tracking follow-up on and

implementation of Board-endorsed IEO recommendations” (IEO 2010a: 7). Now IEO

recommendations that have received the Board’s endorsement7 must be followed up by staff

and management. For a completed evaluation, the initial step consists in drafting and

presenting to the Board a MIP, which specifies how each of the endorsed recommendations

will be implemented; a PMR then surveys the implementation status of all MIPs in one year

(IEO 2010a: 7). While far from perfect, this tracking system ensures that recommendations

proposed in IEO reports cannot simply vanish into thin air. Figure 1 provides a stylized flow

chart of the processing of a typical IEO evaluation.

7 Identifying Board “endorsement” is in itself a tricky undertaking. Chelsky (2008) discusses the difficulty in

determining the level of agreement among Executive Directors on individual recommendations.


FIGURE 1. The Processing of an IEO Evaluation in the IMF

[Flow chart, rendered as text:] IEO evaluation report on selected topic with recommendations → statement by the Managing Director → staff response to IEO recommendations → IEO comments on management and staff responses → Acting Chair’s summing up of the Executive Board discussion → Board endorsement? If yes: Management Implementation Plan to follow up on individual recommendations, then Periodic Monitoring Report to track progress on all MIPs in one year; if no: no further action.

Source: Author.

IMF Staff and the IEO: Learning through “Encoded” Knowledge

Methodological Considerations

Organizational learning is inherently difficult to gauge. Even when actors themselves claim to

have drawn the right lessons from the past, it is still tempting to overstate the actual extent of

cognitive and behavioral advances of individuals and collective entities. Scholars are

frequently skeptical of organizational learning as an explanatory variable, not least because

learning represents such an all-encompassing activity that it can hardly be isolated from

other factors contributing to a certain outcome. In other words, we cannot establish a strong

causal link between the learning activities and policy change of an organization. The aim of

this paper is thus explicitly not to examine the aggregate impact of IEO evaluation reports on

IMF policy in any systematic manner, nor do I conduct an in-depth single case study of the



“career” of a particular IEO report. With these limitations in mind, I pursue a more modest

goal: to assess the balance between conflict and agreement in organizational learning at the

IMF, as evident in all completed IEO evaluations.

The “compatibility” of IEO recommendations and IMF staff responses serves as a proxy for

the balance between conflict and agreement in organizational learning.8 It is sensible to

define recommendations in the evaluation reports as the primary “trigger” for learning at the

IMF because the extensive IEO analyses, from which these future-oriented recommendations

emanate, confront the organization more directly with its past than does any other document.

It is through this express consideration of controversial issues that internal groups and,

indeed, individuals at the Fund are encouraged to deal with their collective future. For each

evaluation report, the recommendations made by the IEO constitute the “checklist” items

against which compatibility is assessed.

Checking IMF staff responses against this list is not as obvious a choice. Besides staff,

management and the Executive Board form the key functional groups at the Fund. All of the

procedures within the organization involve this “triangle” at some point. But there are several

empirical and methodological reasons for examining staff responses rather than the

Managing Director’s statements or the Acting Chair’s summings up. A number of studies

contend that the well-trained and experienced staff has used its leeway vis-à-vis the Board to

initiate policy change (Babb 2003; Chwieroth 2007, 2008; Momani 2005).9 In general,

interviewees confirm that Fund policy is predominantly staff-driven, citing, among other

things, staff’s greater technical expertise and long-standing experience from interactions with

country authorities; as a result, the Board relies on staff assessments in many cases to just

“sign off” on proposed policies.10 This effect of the Board serving as “a de facto rubber stamp

for staff proposals” (Babb 2003: 16) would be intensified if more decisions were to be made

on a lapse-of-time basis, as has been proposed to reduce the Executive Directors’ workload.

Moreover, a typical Acting Chair’s summing up employs language too vague to

accurately reflect differences in views among Directors as equals (Chelsky 2008). By contrast,

IMF staff is organized in a hierarchical fashion, which produces “unified”11 responses to IEO

recommendations. A staff response is, in turn, less difficult to code than a consensus-oriented

summary of the “spirit” of a Board meeting about a contentious issue, on which Directors

may be compelled to disagree with each other by virtue of their role as representatives of

their constituencies. Finally, a statement by the Managing Director amounts to little more

8 See Gabler (2010) for an application of “normative frame compatibility” to social learning shortcomings in the

World Trade Organization’s (WTO) Committee on Trade and Environment (CTE). Combining high and low normative frame compatibility with learning within and across communities, Gabler distinguishes between four forms of social learning: “simple” and “complex” learning within communities, as well as “reciprocal” and “conflictual” learning across communities.

9 But cf. Momani (2004).

10 This is a recurring theme in interviews at the IMF. I wish to thank Susanne Lütz, together with whom I carried out several interviews in March 2011, for sharing with me the insights from previous interviews that she had conducted in September 2009.

11 Different opinions are likely to persist between staff members but do not appear in the publications.


than praise for the IEO’s work and a call for further discussion. It therefore cannot be

checked against IEO recommendations.

The compatibility analysis concerns itself only with initial staff responses to IEO

recommendations as expressed in the evaluation reports. The Managing Director’s

statements, IEO comments on staff responses, and summings up by the Acting Chair, as well

as all follow-up activities (MIPs, PMRs), are deliberately excluded from the coding procedure.

I code the staff response to the recommendations in each of the 19 completed evaluation

reports,12 starting from a simple three-point ordinal scale for negative (“–”), neutral or

balanced (“0”), and positive (“+”) responses. Each “clear” response carries 1 point. However,

I have had to expand the initial scale by two intermediate categories (“–/0” and “0/+”),

which translate into 0.5 points for each of the two adjacent categories, to code the many ambiguous

IMF responses. For example, “–/0” is assigned when IMF staff is in agreement with the

overall goal contained in a recommendation but rejects the recommended course of action as

ineffective or unwarranted. By contrast, “0/+” is assigned when the IEO recommends a

particular course of action that the IMF staff, without expressly supporting the

recommendation, regards as having already been taken. The points assigned for individual

recommendations are then added up to create absolute numbers for each completed

evaluation.
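The coding and aggregation procedure just described can be sketched in a few lines. The following is a minimal illustration, not the actual research instrument; the example codes are hypothetical, and the convention of splitting an intermediate code’s single point between the two adjacent categories reflects the half-point totals reported in Table 1:

```python
from collections import Counter

# Five-category coding scheme for IMF staff responses to individual IEO
# recommendations. Intermediate codes ("-/0", "0/+") split their single
# point between the two adjacent categories (0.5 each); unaddressed
# recommendations are tracked separately as "n.a.".
WEIGHTS = {
    "-":    {"-": 1.0},
    "-/0":  {"-": 0.5, "0": 0.5},
    "0":    {"0": 1.0},
    "0/+":  {"0": 0.5, "+": 0.5},
    "+":    {"+": 1.0},
    "n.a.": {"n.a.": 1.0},
}

def tally_responses(codes):
    """Aggregate per-recommendation codes into absolute column totals
    for one evaluation report, as in Table 1."""
    totals = Counter()
    for code in codes:
        totals.update(WEIGHTS[code])
    return {cat: totals.get(cat, 0.0) for cat in ("-", "0", "+", "n.a.")}

# Hypothetical coding of an evaluation with six recommendations:
print(tally_responses(["+", "+", "0/+", "0", "-/0", "n.a."]))
# {'-': 0.5, '0': 2.0, '+': 2.5, 'n.a.': 1.0}
```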

Results and Discussion

The application of the presented coding technique to the staff responses in evaluation reports

from 2002 to 2011 yields results that point to a low level of conflict in organizational learning

between the IEO and IMF staff. In short, IEO recommendations and IMF staff responses are

mostly compatible. This can be seen both from the overview in absolute numbers (Table 1)

and, perhaps yet more clearly, from the development over time of the relative shares of

negative, neutral, and positive responses (Figure 2). The compiled data set the stage for the

constructivist argument that the IEO’s main contribution to learning lies in generating,

accumulating, and storing “encoded” knowledge.
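The relative shares underlying Figure 2 can be derived from the absolute numbers in Table 1 along the following lines. This is a sketch under the assumption, not stated in the text, that unaddressed (“n.a.”) recommendations are excluded from the shares:

```python
def relative_shares(totals):
    """Convert one evaluation's absolute column totals into relative shares
    of negative, neutral, and positive responses. Unaddressed ("n.a.")
    recommendations are excluded here, which is an assumption about how
    Figure 2 is constructed."""
    addressed = totals["-"] + totals["0"] + totals["+"]
    if addressed == 0:
        return None  # e.g. the 2005 technical assistance evaluation (row 06)
    return {cat: totals[cat] / addressed for cat in ("-", "0", "+")}

# Row 14 of Table 1 (structural conditionality, 2007): six recommendations,
# coded 2.5 negative, 1 neutral, and 2.5 positive.
shares = relative_shares({"-": 2.5, "0": 1.0, "+": 2.5, "n.a.": 0.0})
# shares["-"] and shares["+"] both equal 2.5/6, i.e. roughly 0.42
```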

12 The IEO has up to now completed 19 evaluations, one of which did not produce any recommendations (see

below Table 1, note c on the evaluation that the IEO undertook jointly with the World Bank’s OED). The actual number of evaluations that “demanded” a staff response to recommendations is thus 18. I continue to speak of “19 completed evaluations” throughout this paper to avoid unnecessary confusion for the reader.


TABLE 1. Coded IMF Staff Responses to IEO Recommendations, 2002–2011

No | Year | Title of Evaluation Report | Area(s) | Number of Recommendations (a) | – | 0 | + | n.a.

01 | 2002 | Evaluation of Prolonged Use of IMF Resources | Program design | 14 (b) | 1.5 | 2 | 1.5 | 9

02 | 2003 | The IMF and Recent Capital Account Crises: Indonesia, Korea, Brazil | Bilateral surveillance and program design | 6 | 0 | 1 | 5 | 0

03 | 2003 | Fiscal Adjustment in IMF-Supported Programs | Program design | 5 | 0 | 1 | 4 | 0

04 | 2004 | Evaluation of the IMF’s Role in Poverty Reduction Strategy Papers and the Poverty Reduction and Growth Facility | Poverty reduction | 6 | 0.5 | 1.5 | 4 | 0

05 | 2004 | The IMF and Argentina, 1991–2001 | Bilateral surveillance and program design | 6 | 0 | 3.5 | 2.5 | 0

06 | 2005 | Evaluation of the Technical Assistance Provided by the International Monetary Fund | Technical assistance | 6 | 0 | 0 | 0 | 6

07 | 2005 | The Poverty Reduction Strategy Initiative: Findings from 10 Country Case Studies of World Bank and IMF Support (c) | Poverty reduction | 0 | — | — | — | —

08 | 2005 | The IMF’s Approach to Capital Account Liberalization | Bilateral and multilateral surveillance | 2 | 0 | 1 | 1 | 0

09 | 2005 | IMF Support to Jordan, 1989–2004 | Program design | 9 | 0 | 1 | 8 | 0

10 | 2006 | Report on the Evaluation of the Financial Sector Assessment Program | Bilateral surveillance | 7 | 1 | 1 | 5 | 0

11 | 2006 | Multilateral Surveillance | Multilateral surveillance | 4 | 1.5 | 1.5 | 1 | 0

12 | 2007 | The IMF and Aid to Sub-Saharan Africa | Program design | 3 | 0 | 2 | 1 | 0

Page 12: The Paradigmatic “Midlife Crisis”: Organizational Learning and ...€¦ · But what do we see when unpacking learning in international organizations? How do we even “see”

12

TABLE 1. Continued

No Year Title of Evaluation Report Area(s) Number of

Recommendationsa

Staff Responses

– 0 + n.a.

13 2007 IMF Exchange Rate Policy Advice Multilateral surveillance 11 2 6.5 2.5 0

14 2007 Structural Conditionality in IMF-Supported

Programs Program design 6 2.5 1 2.5 0

15 2008 Governance of the IMF: An Evaluation Governance 4 0.5 1 0.5 2

16 2009 IMF Involvement in International Trade

Policy Issues Trade policy 6 0 1 5 0

17 2009 IMF Interactions with Member Countries Governance 9 0.5 4 2.5 2

18 2011 IMF Performance in the Run-Up to the Financial and Economic Crisis: IMF

Surveillance in 2004–07

Bilateral and multilateral surveillance

5 0.5 2.5 2 0

19 2011 Research at the IMF: Relevance and

Utilization Research 4 0.5 1.5 1 1

Notes:
“–” = negative; “0” = neutral or balanced; “+” = positive; “n.a.” = not addressed.
a It is by no means uncommon for “a recommendation” to consist of a set of several related specific recommendations. I do not count subitems separately but as falling under the respective “meta” recommendation as long as this attribution can be made. Moreover, I see no need to distinguish between “recommendations” and “lessons” whenever the latter substitute for the former. Conclusions or findings, sometimes framed as “lessons,” might precede—and thus inform—the actual recommendations; yet in the case of IMF Support to Jordan, 1989–2004, in which the lessons drawn are of a similar character to the recommendations in previous and subsequent IEO evaluation reports, this distinction is difficult to uphold.
b IMF staff counted 22 recommendations (perhaps including some of the listed subitems) in the inaugural IEO evaluation report. For the purposes of this paper, however, I assume that there were 14 recommendations presented by the IEO (according to the Executive Summary), to only 5 of which IMF staff responded in a freely chosen order; thus, 9 remained unanswered. Since the second IEO evaluation, there has been little to no ambiguity left about this: the IEO lists or even numbers the recommendations (usually contained in the “Executive Summary,” sometimes built into the “Conclusion” or “Findings and Recommendations”), and IMF staff almost always either responds explicitly to them in the given order or at least relates clearly back to each of them. Unfortunately, with a return to a more erratic style of documentation, ambiguity about the number of recommendations and the “response rate” has somewhat crept back into some of the latest evaluations.
c Joint report by the World Bank’s OED and the IMF’s IEO, which contains no recommendations.
Sources: all completed IEO evaluations (2002–2011) and staff responses to IEO recommendations.


FIGURE 2. Relative Shares of IMF Staff Responses, 2002–2011

[Chart omitted. X-axis: “Evaluation number” (1–19); y-axis: “Share of responses (in %)” (0–100); one series each for negative (“–”), neutral (“0”), and positive (“+”) responses.]

Note: Graphical representation based on the absolute numbers provided in Table 1 (see corresponding Table A1 for the relative numbers). Source: Author.
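The arithmetic behind Figure 2 is elementary and can be sketched in code. The following snippet (Python) recomputes relative shares from a few illustrative rows of Table 1; note that excluding “n.a.” responses from the denominator is an assumption made here for illustration, since the exact convention used for Table A1 is not restated in the text.

```python
# Recompute the relative response shares underlying Figure 2 from the
# absolute counts in Table 1. Fractional counts stem from the coding rule
# for ambiguous responses ("-/0" and "0/+" are split half and half).
# Only three rows of Table 1 are reproduced here for illustration.
counts = {
    # evaluation no.: (negative, neutral, positive, not_addressed)
    2:  (0.0, 1.0, 5.0, 0.0),   # Recent Capital Account Crises (2003)
    11: (1.5, 1.5, 1.0, 0.0),   # Multilateral Surveillance (2006)
    14: (2.5, 1.0, 2.5, 0.0),   # Structural Conditionality (2007)
}

def shares(row):
    """Convert absolute response counts into percentage shares.

    Responses coded "n.a." are excluded (an assumption), so the shares
    describe only the recommendations that staff actually addressed.
    """
    neg, neu, pos, _na = row
    total = neg + neu + pos
    return tuple(round(100 * x / total, 1) for x in (neg, neu, pos))

for no, row in sorted(counts.items()):
    neg, neu, pos = shares(row)
    print(f"Evaluation {no:2d}:  - {neg}%   0 {neu}%   + {pos}%")
```

On this convention, evaluation No. 14 splits into 41.7 percent negative, 16.7 percent neutral, and 41.7 percent positive responses, which matches the characterization of that report below as unusually contested.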


Three aspects of the results are noteworthy. First, the level of affirmation (sum of responses

assigned a “+” or a “0/+”) is at least as high as the level of objection (“–” or “–/0”) in all but

one report (No. 11). Although the frequent occurrence of the category “0” is in part a result of

the coding rule for ambiguous responses, it is striking that the number of neutral responses

exceeds that of affirmative ones a number of times. These impressions are corroborated by

the average shares (see Table A1): responses have been mostly affirmative, notwithstanding the
minor qualifications that are regularly raised. Second, the level of disagreement between IEO

and staff has been slightly more pronounced since the tenth evaluation. The subsequent

report is then the sole evaluation on which the IEO received more negative than positive

responses from staff. Third, there is no clear trend for any of the three major categories over

time or across the areas of IMF operations specified in the fourth column of Table 1. For

example, contrary to what one might have expected, the involved groups did not engage in

role-searching activities. Neither IEO staff nor IMF staff is significantly more

“accommodating” or “defiant” in the early than in the later evaluation reports.

What is to be made of the empirical evidence? In fact, we do not see a “Rashomon” effect13

of many sides to the same story. Rather, we observe a significant level of compatibility in

organizational learning between the IEO and IMF staff. The statements in the reports

demonstrate that both sides not only often agree on the measures to be taken but also,
more critically, subscribe to the same or very similar goals for the IMF to accomplish as an

organization beyond the more specific recommendations. Intense struggles over the nature of

the IMF’s mandate either do not matter much or are effectively sidelined before the release of

reports. A passage from the staff’s response even on the highly controversial topic of IMF

involvement in Argentina from 1991 to 2001 (see Weaver 2010: 378, fn. 314) is telling in this

regard. Notwithstanding disagreements over the degree of consistency in the IEO report, the

common denominator lies in the insistence on an organizational mandate that requires the

Fund to advocate political reforms (IEO 2004b: 110, para. 113):

The [IEO] report concludes—also in line with our analysis—that the Fund erred by not

pushing strongly enough for needed reforms and policy adjustments … and by providing

financial support for too long and when policies were increasingly weak and inconsistent.

In particular, staff agrees with the IEO recommendation that the Fund should not enter or

maintain a program if there is no immediate need for balance of payments (BoP) support and

if “political obstacles” hinder “policy adjustment or structural reform” (IEO 2004b: 7, 113,

para. 114). In the remainder of this section, I offer tentative explanations for the high degree

of compatibility, while also reflecting on methodological issues with respect to the coding

technique, and classify the institutionalized interactions as producing a particular kind of

13 The effect is named after a Japanese movie from 1950 in which four people report the same crime in four

distinctly different ways.


knowledge that contributes to organizational learning at the IMF. The observable degree of

compatibility can be understood as the interplay of five factors of IEO evaluations:

(1) Angle: IEO reports are written almost exclusively by trained economists (Interview

2011c) and, in fact, constitute analyses of the activities of other economists (staff and

also Executive Directors, who are often former central bankers). The common

educational and professional background not only provides a fundament for shared

understandings but also facilitates the resolution of internal conflicts once they arise.

However, it exposes the Fund to criticism for insulation from more challenging

feedback (see Barnett and Finnemore 1999: 722–724).

(2) Process: During the execution of an evaluation, IEO staff draws heavily on data to be

provided by the IMF, and consults with or interviews staff. In this process, myriad

interactions between IEO staff and IMF staff take place, which helps considerably to
take the “sting” out of the official publication if actors manage to mitigate or even

settle conflicts in an early phase of completing an evaluation.

(3) Format: These first two factors might explain why most recommendations are couched

in very broad terms. The corollary of their format is that they are difficult to disagree

with openly. Unless recommendations focus on very specific actions instead of

suggesting alternative ways to reach a commonly agreed goal, IMF staff is likely to
agree with the ideas in IEO reports.

(4) Thrust: This is a related point. Common sense suggests that it is less conflictive to

argue over the future than over the past: mistakes that have been committed stoke
conflict, whereas the danger of mistakes that might yet be committed tends to prompt joint
efforts to avoid making—or repeating—them. Put differently, the future is, by

definition, always uncertain. IMF staff may feel a stronger impulse to defend its past

actions than to insist on acting in the same fashion yet again.

(5) Independence: The actual degree of IEO independence in conducting its evaluations of

IMF policies has repeatedly been questioned on material and normative grounds (see

discussion in Lissakers, Husain, and Woods 2006: 10–14). In addition to recalling

concerns with limited de facto independence from interference (Weaver 2010: 376–

377), one has to bear in mind that IEO staff are remunerated via the IMF’s
payroll and that “revolving door” effects exist, with staff moving between the IEO

and the IMF. Also, arrangements are in place for IEO staff members to be able to

return to their previous IMF positions after the end of the contract with the IEO

(Interview 2011c). As many IEO staffers have previously worked for and/or plan to

return to the “core” IMF, the IEO is a semi-internal evaluation unit rather than a fully

independent external evaluator.


The high degree of compatibility in organizational learning might also be a reflection of

problems inherent in the coding procedure. Three methodological caveats deserve mention.

First, considerable difficulty lies in assigning the appropriate label to an IMF response,

particularly when multi-item IEO recommendations are concerned. Not only because of the

design of the recommendations but also because of the expectation that staff respond in a
detailed manner, coding ultimately requires numerous “judgment calls” on how to interpret a
response to a broadly worded recommendation, or to a multi-item one that would require weighing the single
items against the importance the IEO attaches to each of them. This problem is not much ameliorated
by the introduction of the two intermediate categories.14 Second, my analysis overlooks

disagreements over IEO findings not contained in recommendations, as well as tensions

between the IEO and IMF-internal actors other than staff.15 Third, my coding technique is

“blind” to the origin of compatibility. That is, it does not assess whether the low level of

conflict is due to restraint exercised by both IEO and IMF staff, by only one of them, or by

neither. If exercised, such restraint can, of course, cut both ways: either it ensures that the

most basic lessons are learned with the bigger issues brushed aside, or it amounts to an

overcautious approach to managing internal and external relations, which, in turn, might

result in blockage precisely because the bigger issues are not addressed.
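The intercoder-reliability check envisaged in note 14 can be made concrete with a standard chance-corrected agreement statistic such as Cohen’s kappa. A minimal sketch (Python) is given below; the two coder vectors are invented for illustration and are not drawn from the actual data set.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e the agreement expected from each coder's marginal label
    shares. (Undefined when p_e == 1, i.e. both coders use one label.)
    """
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_exp = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical codings of ten staff responses ("-", "0", "+", "n.a."):
a = ["+", "+", "0", "-", "+", "0", "+", "n.a.", "0", "+"]
b = ["+", "0", "0", "-", "+", "0", "+", "n.a.", "+", "+"]
print(round(cohens_kappa(a, b), 2))  # 0.69
```

With eight of ten items coded identically, the chance-corrected agreement here comes out below the raw 80 percent, which is precisely why a statistic like kappa is preferable to simple percentage agreement for the planned check.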

The obtained evidence implies that processes of organizational learning at the IMF are

more consensual than they are (openly) conflictive. In the analysis of IMF staff responses to

IEO recommendations, we repeatedly find high “normative frame compatibility” (Gabler

2010: 83–84). For Ernst B. Haas (1990: 23), who makes a similar point, genuine learning can

be conceived only as the application of consensual knowledge. Organizational learning entails

that actors question and potentially redefine their beliefs, as opposed to “adaptation,”

through which work routines may be improved and problems solved but underlying beliefs

remain unchallenged (Haas 1990: 3, 33–35; see also Haas and Haas 1995: 259–260, 262).16

The crucial question here is not whether the evidence is indicative of learning or merely

adaptation at the IMF, which is impossible to measure with the chosen quantitative

approach. Rather, these contributions to the literature on organizational learning remind us

that permanent, open conflict is not necessary for international organizations to learn and
may even be detrimental to the cause of learning itself. In this sense, the compatibility of

IEO recommendations and IMF staff responses seems reasonably conducive to

organizational learning. It is essential that staff, identified above as the chief policy driver at

the Fund, is put in an excellent position to learn through its involvement in the evaluation

14 I plan to have another person code the responses independently in order to achieve a minimum degree of intercoder reliability.
15 Weaver (2010: 378, fn. 314) mentions the evaluation on Argentina (IEO 2004) as having been “fiercely debated when it was presented to the Board.” The IEO recommendations were received by staff in a rather positive manner (see Table 1 above).
16 Haas and Haas’s understanding of “adaptation” vs. “learning” bears a striking similarity to Argyris and Schön’s (1978: 18–26) concept of “single-” and “double-loop learning.”


process. Knowing that the Executive Board is rather unlikely to vote down policy proposals

altogether, staff might reintroduce an issue even if the Board has chosen not to endorse an

IEO recommendation to that effect. Though such endorsement is vital to putting into effect

any proposals, functional departments, particularly SPR, can use the input from the IEO to

present more refined proposals to the Executive Directors for approval at a later point.

Thus, the IEO’s main contribution to learning lies not so much in exposing IMF policy

“blunders” as in generating, accumulating, and storing relevant knowledge. The kind of

knowledge that originates from the organization’s operations has a collective character.

Through its focused evaluations of key topics, published as publicly accessible reports, the

IEO renders explicit and codifies that knowledge, turning it into what Alice Lam (2000: 492–

493) refers to as “encoded” knowledge.17 This conforms to a constructivist understanding of

organizational learning as “the output of a series of communications [via the evaluation

reports, M.K.], not its input” (Freeman 2006: 379).18 The preparation, execution, and follow-

up activities during one evaluation process create a dense internal network that can again be

utilized for the next evaluation. In these processes, the IEO serves as the venue facilitating

the interactions between staff, Board, and management, as well as within and across

functional and area departments. The recently introduced follow-up mechanisms, involving

MIPs and PMRs, aim at better implementing lessons drawn not only from the final IEO
reports themselves but also from the intra-organizational activities before and after their
completion.

Explicit collective knowledge means that no one has a good excuse not to be informed

about the current state of affairs. This kind of abstracted knowledge gives rise to another type

of knowledge that is not explicit but tacit: “embedded” knowledge. Developed through
procedural routines, as an expression of codified organizational rules and beyond them, embedded knowledge
supports intra-organizational interactions (Lam 2000: 493). Metaphorically speaking, it is

the “grease” to the IMF “machine.” For example, published IEO evaluation reports

contribute to the growth of encoded knowledge that can also be found in the written

requirements for MIPs and PMRs to be presented to the Board as follow-up instruments. Yet

how staff and management seek to implement endorsed recommendations depends in large

measure on embedded knowledge, namely on being familiar with general role expectations,

with context-specific particularities, and with strategic considerations, none of which can be

gleaned from a written rulebook on how the IMF works.

17 Lam (2000: 490–493) combines the epistemological dimension (explicit vs. tacit) with the ontological dimension (individual vs. collective) of knowledge. Four types of knowledge ensue: “embrained” (individual and explicit), “embodied” (individual and tacit), “encoded” (collective and explicit), and “embedded” (collective and tacit) knowledge.
18 Freeman (2006) prefers to use the term “constructionist.” Representative of the general entry of constructivism into the social sciences, constructionism has emerged as a distinct strand of (constructivist) learning theory based on the works of Jean Piaget and, later, Seymour Papert.


It is debatable to what extent IEO-induced learning of the type described above has policy

implications. As we have seen, collective knowledge, in its explicit and tacit forms, is

instrumental to learning processes, but it is always uncertain whether this body of collectively

shared knowledge translates into policy reforms. The IEO’s (2007) evaluation of Structural

Conditionality in IMF-Supported Programs provides anecdotal evidence for an indirect

impact of IEO analyses on IMF policies. Just prior to the latest global financial crisis, the IEO

released a critical account of the use of structural conditionality, which it claimed remained

problematic regardless of earlier attempts to revise guidelines on conditionality under the

“streamlining initiative” of 2002. The response from IMF staff to the recommendations was

nearly as negative as it gets (2.5/1/2.5; see Table 1 above) and particularly critical of the

recommended program and conditionality design review whose most controversial point was

a call by the IEO to discontinue all structural benchmarks (IEO 2007c: 20, para. 52, and 31,
para. 14). The suggestion that “a notional cap on the number of structural conditions per
program-year … [of] about one half of the current average” (IEO 2007c: 20, para. 51) be set
was received critically by staff and also not endorsed by the Board (IEO 2007c: 30–31, para.
13, and 36; see Weaver 2010: 366–367, 373). Indeed, the IMF has recently turned away from

the imposition of conditions and generally become more accommodating of borrowing

countries’ needs (Broome 2010; Lütz and Kranke 2010), leading many a country

representative to worry about the implications of such a “soft” approach (Interview 2011d).

Such evidence can, however, at best establish a tenuous link between IEO evaluations and

IMF policy reform, since it involves counterfactual reasoning about how specific policies

would have changed, or not, if it had not been for the IEO or, at the very least, the evaluation

of that one particular topic.

The IMF at Large: How Much Legitimacy Derived from Learning?

The IEO is responsible for more than just learning at the Fund. Though this is its foremost

purpose, its mandate includes two goals that relate to organizational legitimacy: both

strengthening external credibility and promoting a greater understanding of the IMF’s work

across its members are matters of the legitimate, or rightful, exercise of power in the realm of

global economic policy. Time and again, the IMF has been diagnosed with a legitimacy deficit

(Best 2007; Seabrooke 2007). With every new global economic crisis, criticisms flare up

regarding one or several of the following core areas: program design (too many conditions

with too little focus, limited involvement of country authorities); surveillance activities

(unreliable predictions, no “teeth”); governance (biased voting shares, closed-shop

recruitment on many levels).


This brief section focuses on the legitimacy dimensions of the IEO’s evaluation work. The

argument developed herein builds on—and amends—that from the previous section: the

IEO’s activities can lend considerable output legitimacy to IMF operations, which is derived

from the learning processes surrounding IEO evaluations. The IEO’s activities are pivotal to a

process of legitimation from which the IMF derives its power in global economic policy-

making by justifying its rules of operations vis-à-vis an increasingly demanding membership

(Beetham 1991: 17–18, 69–70).19 For this argument, I draw on an established distinction in

the policy process literature between three legitimacy components or phases. The initial

distinction between “input” and “output” legitimacy, which was popularized by Fritz W.

Scharpf20, has been extended to include a “throughput” component. Input legitimacy features

are conditions of access to participation; throughput legitimacy features concern the

deliberative process itself; and output legitimacy features are measures of efficiency and

effectiveness (Papadopoulos and Warin 2007; see also Hoppe 2011).

Input and Throughput Legitimacy

Input and throughput legitimacy are elusive in a process that is initiated by the IEO and

remains an IMF-internal affair unless the IEO seeks external input. According to its mandate,

the IEO (2001) is “free to consult with whomever and whichever groups it deems necessary,

both within and outside the Fund.” Thus, access conditions are defined on a case-by-case

basis by the semi-internal unit that is the IEO. The rules underlying the international

monetary and financial system, however, would be perceived as more legitimate if they were

more closely aligned with the preferences of those residing outside the Fund.

This continuing problem of limited openness to external voices is characteristic of the

Fund’s “top-down” approach to its own internal organization. Over the past decade, a hierarchical

organizational culture has become entrenched in which the free flow of fresh ideas is

impeded (Lissakers, Husain, and Woods 2006: 23; see also Weaver 2010: 380). Compared

with the IMF, the World Bank has built a “flat” hierarchy despite its much bigger size in

terms of staff numbers (Interview 2011a). External feedback is integrated only at the will of

19 For Beetham (1991: 69), rules must be “justifiable to the subordinate.” The IMF represents a somewhat peculiar case for his concept, which seems to have in mind traditional domestic politics, for two reasons. First, the entire membership is, at the same time, both rule-giver (“the powerful”) via the Executive Board and rule-taker (“the subordinate”) in surveillance activities and country programs. Admittedly, a member that is under an IMF program more than occasionally will perceive itself as more of a rule-taker than a rule-giver. Second, the ultimate rule-givers merely approve or reject policy initiatives from a group (staff) that is not directly democratically legitimated.
20 Scharpf has published extensively on input and output legitimacy (see, for example, Scharpf 1997, 1999, 2003) and inspired many scholars to apply these two dimensions to studies in various fields (Glenn 2008). Like Beetham, Scharpf entertains an understanding of legitimacy that is tailored to the domestic political context, defining input legitimacy as “government by the people” and output legitimacy as “government for the people” (Scharpf 2003).


the IEO, yet not when external actors demand to be heard. Critical voices abound,21 and only

a minority of them has an explicitly anti-globalist agenda. Furthermore, it seems that input

would have to meet the criteria of high-order economic thinking to be incorporated in any

evaluation.22 The invitation to suggest topics for future evaluations is a first step, albeit one of

no substantive impact on the analytical processes of compiling necessary data and

completing an evaluation.

Adverse access conditions affect the amount of throughput legitimacy that IEO

evaluations can ideally create. The institutionalized procedure for processing an IEO evaluation (see
again the flow chart depicted in Figure 1), or its “deliberation,” excludes external voices

altogether. From start to finish, IEO staff, management, IMF staff, and the Executive

Directors handle the reports among themselves. Input on an ad hoc basis comes only, as

mentioned above, from people occupying official IMF positions. If not consulted by the IEO,

even member states interested in taking part in the deliberation process have to rely on

indirect channels of influence.

As a result of these limitations, organizational learning does relatively little to enhance the

Fund’s input and throughput processes. On these two counts, the IEO contributes negligibly

to the perception of the IMF as a legitimate organization. The potential reward for change is

substantial, but it would arguably require an increase in the number of IEO staff to process

external input that would go beyond providing advice for the selection of future evaluation

topics. The IMF has not addressed this deficit in any systematic manner, as IEO evaluations

continue to be conducted by staff with the same background in training and profession as

IMF staff. This has led to frequent complaints that the evaluations focus more on technical

and operational details than on broader strategic issues (Lissakers, Husain, and Woods

2006: 11–12). The degree of justifiability of IMF rules suffers from the shortcomings in the

input and throughput dimensions of legitimacy.

Output Legitimacy

IMF operations, specifically its loan programs, have probably always been greeted with a

great deal of criticism. This is understandable given “natural” limits to how welcome any
program of “tough choices” can be to the general public of an affected country. The

IMF’s relations with countries that face no immediate BoP financing needs are never as

strained as the relations with countries that require IMF funds from whatever facility.

However, in the wake of each of the many economic crises of the 1980s and 1990s,

21 To name but one of many: the Bretton Woods Project (available at <http://www.brettonwoodsproject.org/>) provides critical but constructive analyses of IMF and World Bank policies.
22 In his detailed study of software patent regulation in the U.S. and the EU, Thomas R. Eimer (2011), drawing on Theodore J. Lowi’s concept of the “arena,” advances the argument that actors need to invoke the specific terminological conventions of the respective arena if they wish their requests to be processed.


particularly the Asian crisis in the late 1990s, the IMF’s critics became both more numerous

and more vocal for a reason: if the IMF demands the return to “sound” economic policies via

fiscal retrenchment and monetary tightening, the result should not be a deepening of an

existing crisis. Perhaps most contentious has been the issue of imposing structural

conditionality onto borrowing countries. It is on contentious issues such as this that the IEO’s

potential contribution to greater output legitimacy comes into play.

Increased output legitimacy in the eyes of stakeholders hinges on actual policy changes.

The IEO’s activities have the potential to lend output legitimacy to IMF operations. On the

condition that economic policies are revisited and adapted to new circumstances or in light
of new evidence, the IEO furthers a reputation of the IMF as a “thinking organization” not

utterly impervious to the influence of fresh ideas. Though this does not tackle the input

problem that these ideas first need to find their way into the organization or evolve within it,

the generation and storage of knowledge in the evaluation reports makes concerns with IMF

policies explicit. Also, if endorsed by the Board, specific recommendations come to be

implemented and encoded in new operational rules. The introduction of MIPs and PMRs in

2007 was therefore a boost for the IEO’s potential to derive output legitimacy from the

learning processes that its evaluations trigger.

Again, I refer to the evaluation of structural conditionality for illustrative purposes. It is

not clear to what extent the greater focus of IMF conditionality on core areas and its greater

restraint in the number of conditions included in country programs can be attributed to the

work of the IEO on this issue. Despite failing to convince the Board of one of its central

recommendations (see above), the IEO has managed to raise awareness of the problems

associated with conditionality. This awareness is, for example, reflected in the third PMR of

October 2009, in which SPR highlights policy changes to structural conditionality that go

beyond the IEO’s recommendations on the issue (IMF 2009: 4; see also IMF 2008). It is

telling as well that two annual reports on structural conditionality (ARSC) have been

produced in response to the 2007 evaluation (for the latest ARSC, see IMF 2010). Indeed,

recent crisis lending to Eastern and Central European countries (Hungary, Latvia, and

Romania) indicates a retreat from such measures (Lütz and Kranke 2010).

IEO evaluations can pave the way to, or accelerate, policy reform. In the output dimension

of legitimacy, this basically involves sufficient sensitivity to “the chances of policy acceptance”

(Papadopoulos and Warin 2007: 457) by the “takers” of IMF policies. Organizational rules

become more intelligible and thus justified if they yield acceptable results. The IEO can make
and analytically embed reform proposals—and has already done so—so that they gain more
traction within the Fund. To increase the acceptance of IMF policies, messages need to be

communicated. A relevant concern about the IEO’s ability to establish rapport with key

stakeholders is raised in the external evaluation of the IEO, which identifies a lack of


outreach activities to communicate findings and “turgid writing” (Lissakers, Husain, and

Woods 2006: 27) as profound problems that reduce the (external) effectiveness of the

evaluations. These problems affect effectiveness to the extent that those who are not well

versed in the economic language used at the IMF take less from an evaluation than they could

otherwise. For IMF staff and well-trained country authorities, however, this does not pose

too high an obstacle to drawing the right lessons.

Conclusion

The “commissioners of fact” employed by the IEO are not teachers in the classic sense. IEO

staff members can be more appropriately characterized as guardians of learning and envoys

of output legitimacy at the same time. Taken together, the two arguments developed in this

paper can be reduced to the notion that IEO-inspired learning serves a double purpose, as is

enshrined in the office’s mandate: fostering learning and improving the IMF’s legitimacy. In

short, the IEO engages in organizational learning for the sake of learning and legitimacy. Its

inception in the midst of the Fund’s paradigmatic “midlife crisis” (at nearly 47 years of

organizational age) marked an express shift toward a greater concern for organizational

learning.

While critics of the Fund might not be impressed with the numerical evidence that I have

presented, one major finding of this paper is that the learning processes are not replete with

conflict. Quite the contrary, IMF staff responses and IEO recommendations often hold the

same position on the substance even where they differ on the form. It is difficult for any

international organization to strike the right balance between too little consensus and too

much conformity for learning. It is the latter rather than the former that plagues the

IMF: If the IEO can receive external input on its evaluations only when solicited, the IMF will

continue to suffer from the exclusion of noneconomic views. The alternative would be to

finally tackle the scarcity, if not outright absence, of noneconomists among IMF staff. Today, staff

“diversity” in terms of education and economic conviction is severely limited in two respects.

When a newly hired economist enters the Washington headquarters, the only questions asked are from

which elite U.S. university that person graduated and whether he or she belongs to the

“freshwater” (roughly monetarist) or the “saltwater” (roughly neo-Keynesian) camp of

economic thought. Despite such conformist recruitment patterns, the IEO has nonetheless created a

more favorable environment for systematic organizational learning. As the evidence from the

evaluation of structural conditionality has illustrated, this can support policy reform. The

Fund moves—slowly but surely.

References

Argyris, Chris, and Donald A. Schön. 1978. Organizational Learning: A Theory of Action Perspective. Reading, MA: Addison-Wesley.

Babb, Sarah. 2003. “The IMF in Sociological Perspective: A Tale of Organizational Slippage.” Studies in Comparative International Development 38(2): 3–27.

Barnett, Michael N., and Martha Finnemore. 1999. “The Politics, Power, and Pathologies of International Organizations.” International Organization 53(4): 699–732.

Beetham, David. 1991. The Legitimation of Power. Houndmills, Basingstoke: Macmillan Education.

Best, Jacqueline. 2007. “Legitimacy Dilemmas: the IMF’s pursuit of country ownership.” Third World Quarterly 28(3): 469–488.

Broome, André. 2010. “The International Monetary Fund, crisis management and the credit crunch.” Australian Journal of International Affairs 64(1): 37–54.

Chelsky, Jeff. 2008. “Summarizing the Views of the IMF Executive Board.” IEO Background Paper BP/08/05 (March). Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/05212008CG_background6.pdf>, accessed 2 August 2011.

Chwieroth, Jeffrey M. 2007. “Testing and Measuring the Role of Ideas: The Case of Neoliberalism in the International Monetary Fund.” International Studies Quarterly 51(1): 5–30.

Chwieroth, Jeffrey M. 2008. “Normative Change from Within: The International Monetary Fund’s Approach to Capital Account Liberalization.” International Studies Quarterly 52(1): 129–158.

Dodgson, Mark. 1993. “Organizational Learning: A Review of Some Literatures.” Organization Studies 14(3): 375–394.

Eimer, Thomas R. 2011 (forthcoming). Arenen und Monopole: Softwarepatente in den USA und in Europa. Wiesbaden: VS Verlag für Sozialwissenschaften.

Freeman, Richard. 2006. “Learning in Public Policy.” In The Oxford Handbook of Public Policy, edited by Michael Moran, Martin Rein and Robert E. Goodin. 367–388. Oxford: Oxford University Press.

Gabler, Melissa. 2010. “Norms, Institutions and Social Learning: An Explanation for Weak Policy Integration in the WTO’s Committee on Trade and Environment.” Global Environmental Politics 10(2): 80–117.

Glenn, John. 2008. “Global Governance and the Democratic Deficit: stifling the voice of the South.” Third World Quarterly 29(2): 217–238.

Gutner, Tamar, and Alexander Thompson. 2010. “The politics of IO performance: A framework.” Review of International Organizations 5(3): 227–248.

Haas, Ernst B. 1990. When Knowledge Is Power: Three Models of Change in International Organizations. Berkeley, CA: University of California Press.

Haas, Peter M., and Ernst B. Haas. 1995. “Learning to Learn: Improving International Governance.” Global Governance 1(3): 255–285.

Hoppe, Robert. 2011. “Institutional constraints and practical problems in deliberative and participatory policy making.” Policy & Politics 39(2): 163–186.

IEO. 2002. Evaluation of Prolonged Use of IMF Resources. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/092502Report.pdf>, accessed 29 July 2011.

IEO. 2003a. Fiscal Adjustment in IMF-Supported Programs. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/09092003all.pdf>, accessed 30 July 2011.

IEO. 2003b. The IMF and Recent Capital Account Crises: Indonesia, Korea, Brazil. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/07282003all.pdf>, accessed 30 July 2011.

IEO. 2004a. Evaluation of the IMF’s Role in Poverty Reduction Strategy Papers and the Poverty Reduction and Growth Facility. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/07062004report.pdf>, accessed 30 July 2011.

IEO. 2004b. The IMF and Argentina, 1991–2001. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/07292004report.pdf>, accessed 30 July 2011.

IEO. 2005a. Evaluation of the Technical Assistance Provided by the International Monetary Fund. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/01312005Vol%201_main_report.pdf>, accessed 1 August 2011.

IEO. 2005b. IMF Support to Jordan, 1989–2004. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/12062005report.pdf>, accessed 4 August 2011; <http://www.ieo-imf.org/ieo/files/completedevaluations/120620057-ieores.pdf>, accessed 7 August 2011.

IEO. 2005c. The IMF’s Approach to Capital Account Liberalization. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/04202005report.pdf>, accessed 4 August 2011.

IEO. 2006a. Multilateral Surveillance. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/09012006report.pdf>, accessed 7 August 2011.

IEO. 2006b. Report on the Evaluation of the Financial Sector Assessment Program. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/01052006report.pdf> and <http://www.ieo-imf.org/ieo/files/completedevaluations/010520067-ieores.pdf>, accessed 7 August 2011.

IEO. 2007a. The IMF and Aid to Sub-Saharan Africa. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/03122007report.pdf>, accessed 7 August 2011.

IEO. 2007b. IMF Exchange Rate Policy Advice. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/05172007exrate_full.pdf>, accessed 7 August 2011.

IEO. 2007c. Structural Conditionality in IMF-Supported Programs. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/01032008SC_main_report.pdf>, <http://www.ieo-imf.org/ieo/files/completedevaluations/01032008SC_statffstatement.pdf>, and <http://www.ieo-imf.org/ieo/files/completedevaluations/01032008SC_summing_up.pdf>, accessed 8 August 2011.

IEO. 2008. Governance of the IMF: An Evaluation. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/05212008CG_main.pdf>, accessed 8 August 2011.

IEO. 2009a. IMF Interactions with Member Countries. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/01202010IMC_Full_Text_Main_Report.pdf> and <http://www.ieo-imf.org/ieo/files/completedevaluations/01202010IMC_Staff_Response.pdf>, accessed 8 August 2011.

IEO. 2009b. IMF Involvement in International Trade Policy Issues. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/06162009Trade_Full_Report.pdf>, accessed 8 August 2011.

IEO. 2010a. Annual Report 2010. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/annualreports/IEO%20Annual%20Report%202010%20%28without%20Moises%20Signature%29.pdf>, accessed 1 August 2011.

IEO. 2010b. Possible Topics for Evaluation Over the Medium Term. 27 July. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/futureevaluations/Possible_Topics_Med_Term_Aug_2010.pdf>, accessed 15 August 2011.

IEO. 2011a. IMF Performance in the Run-Up to the Financial and Economic Crisis: IMF Surveillance in 2004–07. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/Crisis-%20Main%20Report%20%28without%20Moises%20Signature%29.pdf>, accessed 8 August 2011.

IEO. 2011b. Research at the IMF: Relevance and Utilization. Evaluation Report. Washington, DC: International Monetary Fund. <http://www.ieo-imf.org/ieo/files/completedevaluations/Main_Report.pdf> and <http://www.ieo-imf.org/ieo/files/completedevaluations/Staff_Response.pdf>, accessed 8 August 2011.

IMF. 2001. “Terms of Reference for the Independent Evaluation Office of the International Monetary Fund.” 21 October (revised November 16, 2004). Washington, DC: International Monetary Fund. <http://www.imf.org/external/np/ieo/tor.pdf>, accessed 15 August 2011.

IMF. 2008. Implementation Plan in Response to Board-Endorsed Recommendations Arising from the IEO Evaluation of Structural Conditionality in IMF-Supported Programs. 8 April. Washington, DC: International Monetary Fund. <http://www.imf.org/external/np/pp/eng/2008/040808.pdf>, accessed 17 August 2011.

IMF. 2009. Third Periodic Monitoring Report on the Status of Implementation Plans in Response to Board-Endorsed IEO Recommendations. 7 October. Washington, DC: International Monetary Fund. <http://www.imf.org/external/np/pp/eng/2009/100709.pdf>, accessed 17 August 2011.

IMF. 2010. Application of Structural Conditionality—2009 Annual Report. 9 March. Washington, DC: International Monetary Fund. <http://www.imf.org/external/np/pp/eng/2010/030910.pdf>, accessed 17 August 2011.

Interview. 2011a. Former IEO staff member. Washington, DC, 23 March (including follow-up e-mail conversation).

Interview. 2011b. IEG staff member. Washington, DC, 24 March.

Interview. 2011c. IMF staff member. Washington, DC, 24 March.

Interview. 2011d. IMF country representative. Washington, DC, 25 March.

Lam, Alice. 2000. “Tacit Knowledge, Organizational Learning and Societal Institutions: An Integrated Framework.” Organization Studies 21(3): 487–513.

Lissakers, Karin, Ishrat Husain, and Ngaire Woods. 2006. Report of the External Evaluation of the Independent Evaluation Office. 29 March. Washington, DC: International Monetary Fund. <http://www.imf.org/external/np/pp/eng/2006/032906.pdf>, accessed 1 August 2011.

Lütz, Susanne, and Matthias Kranke. 2010. “The European Rescue of the Washington Consensus? EU and IMF Lending to Central and Eastern European Countries.” LEQS Paper No. 22/2010 (May 2010). London: The London School of Economics and Political Science. <www2.lse.ac.uk/europeanInstitute/LEQS/LEQSPaper22.pdf>, accessed 15 July 2010.

Momani, Bessma. 2004. “American politicization of the International Monetary Fund.” Review of International Political Economy 11(5): 880–904.

Momani, Bessma. 2005. “Limits on streamlining Fund conditionality: the International Monetary Fund’s organizational culture.” Journal of International Relations and Development 8(2): 142–163.

Papadopoulos, Yannis, and Philippe Warin. 2007. “Are innovative, participatory and deliberative procedures in policy making democratic and effective?” European Journal of Political Research 46(4): 445–472.

Scharpf, Fritz W. 1997. “Economic integration, democracy and the welfare state.” Journal of European Public Policy 4(1): 18–36.

Scharpf, Fritz W. 1999. Regieren in Europa: Effektiv und demokratisch? Frankfurt a.M.: Campus.

Scharpf, Fritz W. 2003. “Problem-Solving Effectiveness and Democratic Accountability in the EU.” MPIfG Working Paper 03/1 (February). Köln: Max Planck Institute for the Study of Societies. <http://www.mpifg.de/pu/workpap/wp03-1/wp03-1.html>, accessed 8 August 2011.

Seabrooke, Leonard. 2007. “Legitimacy Gaps in the World Economy: Explaining the Sources of the IMF’s Legitimacy Crisis.” International Politics 44(2): 250–268.

Weaver, Catherine. 2010. “The politics of performance evaluation: Independent evaluation at the International Monetary Fund.” Review of International Organizations 5(3): 365–385.

Weaver, Kate. n.d. “The Evaluation Paradox: The Politics of Independent Evaluation in the World Bank and International Monetary Fund.” Unpublished paper. <http://polisci.osu.edu/faculty/athompson/io_performance/Weaver.pdf>, accessed 3 August 2011.

TABLE A1. Relative Shares of IMF Staff Responses, 2002–2011

                 Staff Responses in %
No.        –      0      +    n.a.
01       10.7   14.3   10.7   64.3
02        0.0   16.7   83.3    0.0
03        0.0   20.0   80.0    0.0
04        8.3   25.0   66.7    0.0
05        0.0   58.3   41.7    0.0
06        0.0    0.0    0.0  100.0
07         —      —      —      —
08        0.0   50.0   50.0    0.0
09        0.0   11.1   88.9    0.0
10       14.3   14.3   71.4    0.0
11       37.5   37.5   25.0    0.0
12        0.0   66.7   33.3    0.0
13       18.2   59.1   22.7    0.0
14       41.7   16.7   41.7    0.0
15       12.5   25.0   12.5   50.0
16        0.0   16.7   83.3    0.0
17        5.6   44.4   27.8   22.2
18       10.0   50.0   40.0    0.0
19       12.5   37.5   25.0   25.0
Average   9.5   31.3   44.7   14.5

Note: All numbers are rounded to one decimal place, which may cause row sums to deviate slightly from 100 percent. Source: Author.