THE DYNAMICS OF ACTION-ORIENTED PROBLEM SOLVING: LINKING INTERPRETATION AND CHOICE

JENNY W. RUDOLPH
Harvard Medical School and Massachusetts General Hospital

J. BRADLEY MORRISON
Brandeis University

JOHN S. CARROLL
Massachusetts Institute of Technology

We offer a theory of action-oriented problem solving that links interpretation and choice, processes usually separated in the sensemaking literature and decision-making literature. Through an iterative, simulation-based process we developed a formal model. Three insights emerged: (1) action-oriented problem solving includes acting, interpreting, and cultivating diagnoses; (2) feedback among these processes opens and closes windows of adaptive problem solving; and (3) reinforcing feedback and confirmation bias, usually considered dysfunctional, are helpful for adaptive problem solving.

We gratefully acknowledge the help of Marlys Christianson, Jason Davis, Mark Healey, Zur Shapira, Karl Weick, the AMR reviewers and editors, and especially Olav Sorenson and Ron Adner. Thanks also to Fulton 214 (Erica Foldy, Pacey Foster, Danna Greenberg, Tammy MacLean, Peter Rivard, and Steve Taylor) for multiple trenchant reviews of this paper.

Academy of Management Review, 2009, Vol. 34, No. 4, 733–756. Copyright of the Academy of Management, all rights reserved. Contents may not be copied, emailed, posted to a listserv, or otherwise transmitted without the copyright holder's express written permission. Users may print, download, or email articles for individual use only.

Understanding and improving action-oriented problem solving under time pressure is a crucial aspect of organizing for high performance (Eisenhardt, 1989; Rudolph & Repenning, 2002; Starbuck, Greve, & Hedberg, 1978; Sutcliffe & Weber, 2003; Weick, Sutcliffe, & Obstfeld, 1999). Professionals in many settings must formulate interpretations and make choices quickly with imperfect understanding as problems and opportunities evolve rapidly and as their actions both generate information and change the situation (Hodgkinson & Healey, 2008). To take effective action, executives in "high-velocity" or "high-hazard" environments, entrepreneurs seeking first-mover advantage, team leaders in military and clinical settings, firefighters confronting smoke and flames, and professionals troubleshooting manufacturing or computer programming challenges are each wondering, "What is going on here?" (Arthur, 1990; Carroll, Rudolph, & Hatakenaka, 2002; Eisenhardt, 1989; Klein, Pliske, Crandall, & Woods, 2005; Repenning & Sterman, 2002; Weick, 1993). As they pursue solutions, their diagnoses and actions coevolve; feedback links together their sensemaking and decision-making processes.

In this paper we clarify and articulate a model of action-oriented problem solving that integrates processes of interpretation and choice. We developed a system dynamics simulation model to represent theoretical concepts and relationships and to test emergent properties of our theory. The model was motivated by the empirical example of doctors coping with an operating room emergency. Patterns of problem-solving behavior produced by the model provide implications for both theory and practice.

INTERPRETATION AND CHOICE IN ACTION-ORIENTED PROBLEM SOLVING

Mapping and understanding the processes that link meaning making and choice in action-oriented problem solving have been difficult because the sensemaking literature and decision-making literature have partitioned problem solving by focusing on different prototypical situations. The sensemaking literature assumes

the environment is challenging if not hostile to problem solving: information abounds but meaning is ambiguous; the best information may emerge only after taking action. Indeed, the purpose of interpretation is not so much to be right (often a post hoc rationalization) as to guide action toward a more effective response. Some of the most compelling sensemaking studies analyze problem solvers in extreme and harrowing situations; they could be killed when sensemaking "collapses" (Weick, 1993). In contrast, classical decision-making research assumes that meaning is given or readily calculated, as in a laboratory gamble. Although decision makers with limited rationality take error-prone shortcuts (Simon, 1957; Tversky & Kahneman, 1974), a decision is properly approached as a cool, rational, consistent, and comprehensive search for the right answer (Fischhoff & Beyth-Marom, 1983; Langley, Mintzberg, Pitcher, Posada, & Saint-Macary, 1995).

While in most important real-world settings there is an overlap of processes of evaluation and choice associated with classic decision-making models and processes of interpretation and construction of meaning associated with sensemaking, the two bodies of literature have tended to make advances by addressing these problems separately. The lack of cross-fertilization between these two fields means that the interactions among processes of interpretation and choice are relatively uncharted. Whether problem solving is primarily a choice-based or rule-based process, whether it is primarily an instrumental and positivist or interpretive activity, or whether inputs and outputs to problem solving are characterized by clarity or ambiguity are all contested issues (March, 1994: viii–ix). We know little about how pacing, punctuation, and relative effort given to interpretation and meaning making versus evaluation and choice influence problem solving. Echoing March, we agree that "the largest problem is not to choose among the [theoretical] alternatives but to weave them together in a way that allows each to illuminate the others" (March, 1997: 10).

Insights about how interpretation and choice are woven together in action-oriented problem solving emerged from our attempt to understand variations in problem solving in a clinical crisis. We developed a formal model that includes feedback loops that the separate sensemaking and decision-making literature rely on implicitly but have not fully articulated. The model includes three basic processes—acting, interpreting, and cultivating new diagnoses—derived from the example of diagnostic problem solving in an operating room crisis and from theoretical constructs in both bodies of literature. Interactions among the three processes produce adaptive problem solving and various problem-solving failures. The speed and strength of acting, interpreting, and cultivating new diagnoses facilitate and constrain opportunities for choice between competing diagnoses and open and close windows of opportunity for adaptive problem solving. For example, either acting slowly when one is unsure or generating alternatives too quickly can make it difficult to collect enough supporting information to confirm a diagnosis and resolve the crisis. The model reveals leverage points for converting dysfunctional problem-solving modes into adaptive ones, including the counterintuitive result that reinforcing feedback and confirmation bias can be beneficial.

Although the initial focus of the model is on medical diagnoses in a crisis setting, the model is intended to apply more generally to problem solving in a broad range of situations characterized by (1) action-based inquiry—acting is the only way to generate new information to update explanations and action strategies (Sutcliffe & Weber, 2003; Torbert & Taylor, 2008); (2) temporal dynamism—the situation changes on its own if no action is taken (Rudolph & Repenning, 2002; Snook, 2000; Weick, 1993), and, indeed, failure to act promptly will result in adverse consequences; and (3) action endogeneity—actions change the problem-solving environment (Perlow, Okhuysen, & Repenning, 2002; Sutcliffe & Weber, 2003). Settings that exhibit these characteristics include high-velocity entrepreneurial environments where useful information becomes available over time as a strategy is executed, business competitors are moving on their own, and the market changes partly in response to an individual's own actions (Eisenhardt, 1988; Perlow et al., 2002; Sutcliffe & Weber, 2003). They also include firefighting emergencies in which firefighters may have to gather information about the fire by placing themselves in harm's way or trying various countermeasures (Klein, 1998). These examples involve knowledgeable professionals facing problems that are neither completely novel nor completely routine, in which they must act and diagnose iteratively in a dynamic, unfolding situation.

We begin our theory development process in the next section by presenting findings from a motivating example: an in-depth observational study of action-oriented problem solving by thirty-nine anesthesiology residents in a high-fidelity simulated¹ operating room facing an acute care crisis with the patient's life in the balance (Rudolph, 2003; Rudolph & Raemer, 2004). We then describe our iterative method of theory development and review and synthesize the relevant research literature in the exposition of our model. This exposition explains the key theoretical constructs of acting, interpreting, and cultivating new diagnoses and the dynamic interactions among these constructs. Next, we discuss simulation results that provide insights on the origins of adaptive and dysfunctional problem solving. We end with a discussion of the simulation results, highlighting the mechanisms that produce variation in action-oriented problem solving and their implications for problem-solving theory and practice.

ACTION-ORIENTED PROBLEM SOLVING: AN EXAMPLE

The starting point for our theorizing was Rudolph's study of action-oriented problem solving by thirty-nine advanced anesthesia residents facing the same simulated acute care crisis with a diagnostic challenge (Rudolph, 2003; see also Rudolph & Raemer, 2004). Rudolph observed problem solving by tracking doctors' concurrent verbal statements regarding diagnoses, observing their treatments and diagnostic tests, and conducting post hoc video reviews with participants. An operating room crisis is a particularly good context for elaborating the features of action-oriented problem solving because it requires handling both novel and routine demands. The novel demands require skillful exploration and interpretation of ambiguous cues to diagnose a problem, giving researchers a chance to examine cognitive processes. The routine demands require rapid and efficient action using exploitation of known procedures (Rudolph & Repenning, 2002), providing an opportunity to study action.

In Rudolph's study an anesthesia resident is called to take over anesthesia during an urgent appendectomy for a 29-year-old woman. This scenario presents a common but serious problem in anesthesia: difficulty with the process of ventilating—that is, breathing for the patient using a mechanical bellows. A variety of diagnoses for the ventilation problem are plausible, such as an asthma attack, a collapsed lung, or insufficient paralyzing agent, but contradictory evidence is present for each, except one: the patient has exhaled some mucus into the breathing tube, partially blocking it. This is not uncommon, and trainees have been acquainted with this problem in their training, but presentations of the problem can vary. Treatments addressing diagnoses other than the mucus plug in the breathing tube will not result in any sustained improvement in the patient's status. With a slowly dwindling level of oxygen in her blood, the patient can have an uneven heartbeat and even go into cardiac arrest if the problem is not rectified. The cues doctors have available include vital signs displayed on a monitor, the breath sounds of the patient, and new cues generated by pursuing standard operating procedures for treating and diagnosing the patient.

Rudolph found that the doctors fell into four modes of problem solving as they attempted to address the ventilation problem: stalled, fixated, vagabonding, and adaptive (see Table 1). The doctors who were stalled problem solvers had difficulty generating any diagnoses around which to organize action (on average, just 1.5 diagnoses) and pursued few or no treatments and tests (one treatment on average). In contrast, those in the fixated mode quickly established a plausible but erroneous diagnosis, to which they stuck despite countervailing cues (Table 1 shows that they returned to their favorite diagnosis on average ten times—double any other mode). Rather than advancing through multiple steps of a treatment algorithm to rule out diagnoses, they repeated the same step or got stuck.

¹ We use the term simulation in two ways in this paper. The first use refers to the source data for Rudolph's (2003) study of clinical problem solving. These data were provided by a full-field, high-fidelity simulation—that is, the research participant was in a fully equipped and staffed operating room [OR] with a computer-controlled mannequin patient. This is similar to war games or aircraft flight simulators. The second use of the term refers to the computer-based simulation we conducted to analyze the behavior of our formal mathematical model.


Although previous studies of fixation error (also known as premature closure or tunnel vision) generally concluded that broadening the range of alternatives is the needed antidote to fixation (Gaba, 1989; Johnson, Hassenbrock, Duran, & Moller, 1982), Rudolph found that broadening could also be a problem; her data indicated a third mode, diagnostic vagabonding.² "Diagnostic vagabonds" generated a wide range of plausible diagnoses and jumped from one to another without utilizing multiple action steps (such as giving a medication) of the treatment algorithms for addressing and ruling out these diagnoses (they pursued just 1.5 steps on average).

Finally, the adaptive sensemaking mode, which looks very much like canonical models of effective clinical reasoning (Elstein, Shulman, & Sprafka, 1978), was characterized by generation of one or more plausible diagnoses and exploitation of multiple steps of known treatment algorithms. This process allowed those in the adaptive mode to rule out some diagnoses, take effective action, and, unlike those in any other problem-solving mode, resolve the breathing problem.

This context has several features that are particularly promising for launching a modeling effort: there is a diagnostic challenge that requires sensemaking, but a limited set of plausible diagnoses (about seven); diagnoses, once generated, can only be assessed through active treatment and testing; there is only one correct diagnosis; the actions necessary for testing and treatment are relatively clear; and the consequences for misdiagnosis are dire. Further, although Rudolph's study provides rich behavioral data on a variety of problem-solving modes, there are very limited data on the dynamic processes underlying these modes and only a single scenario from which to generalize. Hence, the motivating data are a "theory fragment" ripe for elaboration with formal modeling (Davis, Eisenhardt, & Bingham, 2007).

METHODS: SIMULATION FOR THEORY DEVELOPMENT

We explored the mechanisms that produce variation in diagnostic problem solving by developing a formal model. Rather than deducing a formal model from general principles, we used a theory development process now well articulated in the system dynamics community to induce a model from empirical findings in one or several studies (Black, Carlile, & Repenning, 2004; Perlow et al., 2002; Repenning & Sterman, 2002; Rudolph & Repenning, 2002; Sastry, 1997). While commonly used to build theory from raw data using qualitative analysis, the grounded theory approach we employed is also useful to develop theory from theory (Glaser & Strauss, 1967; Strauss & Corbin, 1994). There is a growing appreciation in the organization studies community that inducing formal models from empirical data and stylized facts facilitates the identification of structures common to the different narratives and enforces the internal consistency of the emerging theory (Davis et al., 2007).

With the motivating example of Rudolph's (Rudolph, 2003; see also Rudolph & Raemer, 2004) taxonomy of four diagnostic problem-solving modes in a medical emergency as a starting point, we pursued the following five steps to carry out our theory development. First, using the steps of grounded theory building to build theory from theory (Strauss & Corbin, 1994; Suddaby, 2006), we started by translating the constructs and relationships described in Rudolph's narratives of each problem-solving mode into the system dynamics language of stocks, flows, and feedback loops (Forrester, 1961; Sterman, 2000). We also gathered fragments of other relevant theory, proposing constructs and relationships among them and sketching bits of feedback structure (Davis et al., 2007). For example, central processes in the problem-solving literature are described as involving adaptive or reinforcing processes, but the exact feedback relationships that drive these cycles are not clear. The sensemaking literature describes the interplay of acting and interpreting as recursive and adaptive (Weick, Sutcliffe, & Obstfeld, 2005: 409), and fixation is described as self-reinforcing (De Keyser & Woods, 1990), but neither has been explicitly mapped.

² Dorner (1997) identified a similar phenomenon among public officials attempting to identify effective strategies for public policy.

TABLE 1
Summary of Source Data (adapted from Rudolph, 2003)

                                            Problem-Solving Mode
Variable                                    Stalled    Fixated     Vagabonding  Adaptive   Test of Difference
N                                           2          11          17           9          —
Subjects who resolved the airway problem    0          0           0            7          χ²(3) = 28.4***
Different treatment steps for a diagnosis   1.0 (0.0)  2.0 (1.1)   1.5 (0.5)    3.6 (0.7)  F(3, 35) = 17.0***
Considerations of favorite diagnosis        3.0 (0.0)  10.0 (5.7)  5.4 (2.3)    5.9 (2.2)  F(3, 35) = 5.0**
Number of different diagnoses considered    1.5 (0.7)  3.8 (1.7)   6.1 (1.3)    5.0 (1.4)  F(3, 35) = 9.1***

Note: Means are given with standard deviations in parentheses. ** p < .01. *** p < .001.

Second, using the constructs drawn from Rudolph's study and from the sensemaking literature and decision-making literature, we experimented with different causal loop diagrams and compared our diagrams with constructs and relationships from this literature. It was at this point that we recognized that problem solvers had to make a choice between competing diagnoses. Third, we revised and streamlined our causal loop diagram, converging on just three central processes. Fourth, we translated the links and loops of this diagram into a formal model. For example, each stock is represented by an equation that captures the rate of its inflows and outflows and each connection, such as one between the perceived plausibility of a diagnosis and openness to new cues, as a mathematical function (see Appendixes A and B). Theorizing about dynamic processes with the benefit of a formal model allowed us to identify important logical gaps and inconsistencies in existing theory (Sastry, 1997; Sterman, 1994). The model's robustness and its ease of interpretation were enhanced by using standard system dynamics formulations (familiar fragments of model structure that occur frequently) wherever possible (Sterman, 2000).

Finally, we ran computer simulations of the model, varying the parameters to explore its behavior over a range of conditions. These scenarios or experiments allowed us to examine whether the parts of the model were operating in a psychologically reasonable way and whether the model was reproducing the empirical results. The model and its output provide a theory explicitly mapping the role of both balancing and reinforcing processes in effective and ineffective problem solving. While grounded in previous work, the model also provides new insights.
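This final step, exploring a formal model's behavior over a range of parameter values, can be illustrated with a toy sketch. The single-stock model, the parameter values, and the sweep below are illustrative assumptions only, not the authors' actual model or calibration:

```python
# Toy sketch of theory-development step five: simulate a formal model
# repeatedly while sweeping a parameter and compare the resulting behavior.
# The single-stock model and all values here are illustrative assumptions.

DT = 0.25  # integration step

def run(time_constant, goal=1.0, t_end=10.0):
    """Euler-integrate one stock adjusting toward `goal`; return its final value."""
    stock = 0.0
    for _ in range(int(t_end / DT)):
        stock += (goal - stock) / time_constant * DT  # first-order adjustment
    return stock

# Sweep the time constant: the slower the adjustment process,
# the farther the stock remains from its goal at the end of the run.
results = {tc: run(tc) for tc in (1.0, 2.0, 4.0, 8.0)}
```

Comparing such runs against qualitative patterns in the empirical data (here, Rudolph's four modes) is what lets a modeler judge whether the model behaves in a psychologically reasonable way.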

CONCEPTUAL MODEL

Overview

In Rudolph's findings we observed that doctors need to make sense of a rich and sometimes confusing stream of information, and they need to choose among the diagnoses they generate in order to make sense of this information. Iterating between Rudolph's problem-solving modes and theories of decision making and sensemaking, our modeling process converged on the idea that action-oriented problem solving involves three central tasks linked by both reinforcing and adaptive (or balancing) feedback: (1) problem solvers take actions and, thus, make information available for their interpretation; (2) they interpret the flow of information around them to continually reassess the plausibility of their diagnoses; and (3) they cultivate alternative diagnoses even as they pursue the leading diagnoses.

The trigger for action-oriented problem solving, whether for doctors or executives, is usually a surprise or divergence from what the problem solver expects or desires, such as a disappointing corporate initiative or an anomalous building fire (Klein et al., 2005; Louis & Sutton, 1991; Mandler, 1984; Weick et al., 2005). In our motivating example the anesthesiology resident expects the patient to breathe normally but instead observes and seeks to address a serious problem with the patient's ventilation. To succeed, any problem solver (e.g., an executive, a fire commander) must construct an organizing story about what is wrong (Weick et al., 2005). In our clinical example the resident scans the clinical signs and symptoms, the patient history, and the timing of the problem, and then develops a plausible story; this then takes the form of a diagnosis (Elstein, 2001; Elstein et al., 1978; Klein, Phillips, Rall, & Peluso, 2006; Rudolph & Raemer, 2004).

We pick up the story from here, presenting the key constructs in our model of action-oriented problem solving—acting, interpreting, and cultivating new diagnoses—and show how they are linked. While these three processes unfold and interact recursively, we start our model description with the acting process because it is easiest to grasp.

Action Generates New Information

Catalyzed by a deviation from expectations and guided by an initial organizing diagnosis, the problem solver launches into action. In the domain of time-pressured acute medical care, conducting tests and providing treatments often involve following a standardized algorithm—a set of steps combining therapeutic treatment with diagnostic tests (Cook & Woods, 1994; Elstein et al., 1978). This is a rule-based behavior that requires executing a known procedure (Rasmussen, Pejtersen, & Goodstein, 1994). Many other professional domains, from nuclear power plant operations to computer chip manufacturing quality control, have standard diagnostic or operating procedures to address problems (Carroll, Rudolph, Hatakenaka, Wiederhold, & Boldrini, 2001; Cyert & March, 1963; Edmondson, 2002; Gersick & Hackman, 1990; Levitt & March, 1988; Repenning & Sterman, 2002; Winter, 1971). By moving through the steps of a standard operating procedure, problem solvers generate cues that become available for them to notice, bracket off from the stream of cues, and interpret. Having advanced the steps further, the problem solver has access to a larger pool of cues for making meaning in an ambiguous situation.

Since the consequences of action accumulate, we model this progress as a stock variable, "action steps completed," in Figure 1. Stocks, like water in a bathtub, are the accumulations of flows into and out of them and are represented by rectangles (Forrester, 1961; Sterman, 2000). Stocks have a key role in creating dynamics: they create delays, give systems inertia, and provide systems with memory. Accomplishing action steps takes time (delays), sets the problem solver on a particular course of action (inertia), and yields results that remain available for interpretation (memory). The stock is increased by "taking action," a flow variable, represented by a pipe and valve icon. The rate of taking action is determined by "time needed to take steps," which represents the time needed for mentally organizing to execute a step, physically preparing for the step, executing the step, awaiting a response from the system, and noticing the results as cues in the stream of ongoing experience. The rate of taking action also depends on how many of the action steps remain to be executed. Causal arrows are used to depict dependencies, such as the arrow in Figure 1 from "action steps completed" to "taking action."

Another feature of action is that taking action steps makes "cues available" (see Figure 1) for interpretation. For example, as a doctor accomplishes steps in a clinical treatment algorithm, he or she generates new diagnostic information that can then be considered in the context of his or her diagnosis. Analogously, as an organization carries out a new strategic initiative, executives can observe changes in the competitive environment that become inputs to their assessment of the situation and the merits of their strategies.

FIGURE 1
Action Steps Make Cues Available

[Stock-and-flow diagram: the flow "taking action" fills the stock "action steps completed"; "time needed to take steps" governs the rate of the flow; "action steps completed" feeds "cues available."]

Note: A stock is a reservoir or accumulation (like water in a bathtub) and is represented by a rectangle; flows, like the spigot and drain on a bathtub, fill or drain the stock and are depicted as "pipes" with "valves." The "action steps completed" (represented by the rectangular stock) increases (or stops increasing) as people "take action" (or stop taking action) to complete the standard operating procedure or other algorithm (represented by the inflow pipe). The "time needed to take steps" reflects delays inherent in preparing to take action, taking action, and waiting for a response. "Cues available" is a variable that gives the problem solver confirming or disconfirming data about his or her current leading diagnosis.
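The stock-and-flow fragment in Figure 1 can be sketched as a simple Euler integration. This is our reading of the described structure, not the authors' published equations, and the parameter values (number of algorithm steps, time per step, integration step) are assumptions for illustration:

```python
# Sketch of Figure 1: the flow "taking action" fills the stock
# "action steps completed," and completed steps make "cues available."
# Parameter values are illustrative assumptions, not the authors' calibration.

TOTAL_STEPS = 7.0         # action steps in the treatment algorithm (assumed)
TIME_TO_TAKE_STEPS = 2.0  # time constant for completing a step, minutes (assumed)
DT = 0.25                 # integration step (minutes)

def simulate(t_end=20.0):
    action_steps_completed = 0.0  # the stock
    history = []
    for i in range(int(t_end / DT) + 1):
        steps_remaining = TOTAL_STEPS - action_steps_completed
        # The flow: the rate of taking action depends on how many steps
        # remain and on the time needed to take steps.
        taking_action = steps_remaining / TIME_TO_TAKE_STEPS
        # Each completed step yields cues for interpretation.
        cues_available = action_steps_completed
        history.append((i * DT, action_steps_completed, cues_available))
        action_steps_completed += taking_action * DT  # Euler integration
    return history

hist = simulate()
```

The stock rises quickly at first and then more slowly as fewer steps remain, which is the delay-and-inertia behavior the text attributes to stocks.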


Action-Oriented Problem Solving Requires Interpreting Cues

Facing a complex and ambiguous situation where quick action is needed, a problem solver has to create meaning in order to act. Unlike decision-making experiments in a laboratory, where "meaning already exists and is waiting to be found," in complex and ambiguous settings, meaning "awaits construction that might not happen or might go awry" (Weick, 1995: 15). Generating a plausible story or explanation about ambiguous cues helps organize and launch action (Neisser, 1976; Snook, 2000; Weick et al., 2005).

Studies of strategic action (Sutcliffe & Weber, 2003), enactment of organizational structures (Weick, 1995; Weick et al., 2005), naturalistic medical problem solving (Elstein, 2001; Elstein et al., 1978; Johnson et al., 1982), tactical decision making under real-world stress (Cannon-Bowers & Salas, 1998), problem detection (Klein et al., 2005; Mandler, 1982), and problem solving in other naturalistic environments (Carroll et al., 2001; Klein, Orasanu, Calderwood, & Zsambok, 1993; Zsambok & Klein, 1997) all indicate that a plausible explanation, diagnosis, or "story" is the engine of problem solving. Problem solvers such as corporate executives, chess players, or firefighters, for example, use an initial diagnosis or assessment of the situation to develop a plan of action and decide what further information is needed (Dreyfus, 1997; Klein, 1998; Sutcliffe & Weber, 2003). Once an initial diagnosis is established, how plausible people consider their own explanations or diagnoses waxes or wanes as they make sense of new or existing cues (Koehler, 1991; Smith & Blankenship, 1991). It takes time for people to notice, bracket as important, and label new cues and then to change their mental models (e.g., diagnoses) accordingly (Bartunek, 1984; Kleinmuntz, 1985; Marcus & Nichols, 1999; Roberts, 1990).

To capture this process of perceived plausibility increasing or decreasing with the interpretation of cues, we show “plausibility of leading diagnosis” as a stock variable in Figure 2. This stock depicts the problem solver’s current subjective assessment of plausibility. By “leading” we mean the problem solver’s favorite or current most plausible explanation. “Updating” is the process or flow variable by which the current view of the plausibility of the leading diagnosis (the stock) is adjusted to equal “plausibility from new cues.” Updating is an ongoing process of incorporating interpretations based on new information with current beliefs. Since changes in perceived plausibility require time, the rate of updating is influenced by a parameter we call “time needed to update.” “Plausibility from new cues” describes interpretations of information generated by acting, which, in turn, depends in part on cues available.
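Expressed as an equation, the updating process just described is first-order exponential smoothing. In our own shorthand (the symbols below are ours; the paper’s full equations appear in its Appendix B):

```latex
\frac{dP}{dt} \;=\; \text{updating} \;=\; \frac{P_{\text{cues}} - P}{\tau_{\text{update}}}
```

where P is “plausibility of leading diagnosis,” P_cues is “plausibility from new cues,” and τ_update is “time needed to update.” The stock closes most of the gap to P_cues within a few multiples of τ_update.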

Making meaning from a stream of cues available through noticing, bracketing, filtering, and labeling occurs while the problem solver holds a belief in the leading diagnosis, which may influence these interpretive processes. New assessments of plausibility (“plausibility from new cues”) are dependent not only on cues available but also on how open the problem solver is to these external cues, which we call “weight on cues” and introduce in Figure 3. Studies of confirmation bias show that once an explanation is set, people prefer supporting to disconfirming information, and this effect is stronger still when cues are presented serially, as in our model (Jonas, Schulz-Hardt, Frey, & Thelen, 2001). Studies of fixation show that as the plausibility of the current diagnosis rises, openness to external cues, especially ones that defy the current view, decreases (De Keyser & Woods, 1990; Johnson, Moen, & Thompson, 1988; Staw, 1976; Xiao & MacKenzie, 1995). In other words, “weight on cues” is a downward-sloping function of “plausibility of leading diagnosis.”

FIGURE 2
Updating Subjective Assessments of Plausibility

[Stock-and-flow diagram: “cues available” feeds “plausibility from new cues,” which drives “updating,” the inflow to the stock “plausibility of leading diagnosis”; “time needed to update” moderates the inflow.]

Note: “Plausibility of leading diagnosis” (represented by the rectangular stock) increases or decreases as people incorporate interpretations of “plausibility from new cues” through “updating” (represented by the inflow pipe). “Time needed to update” reflects delays inherent in noticing and processing information.


However, prior research is surprisingly silent regarding the exact form of the relationship between plausibility and weight given to external cues. We postulate that for bold problem solvers a small increase in plausibility will lead to a disproportionately large decrease in weight on cues. For cautious problem solvers a small increase in plausibility will lead either to no or to only a small decrease in the weight on external cues. We use a parameter, “effect of plausibility on cue interpretation,” to model the variation, from boldness to caution, in how plausibility influences the weight placed on cues. Appendix A depicts our representations of this relationship for different values of the effect of plausibility on cue interpretation.
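Because the exact curves are left to the paper’s Appendix A, any concrete functional form is an assumption. A minimal sketch of one downward-sloping family, with an effect parameter spanning caution to boldness (function and parameter names are ours, purely illustrative):

```python
def weight_on_cues(plausibility, effect_k):
    """Illustrative downward-sloping weight-on-cues curve.

    effect_k = 0 models a maximally cautious problem solver (full weight on
    external cues regardless of plausibility); larger effect_k models bolder
    solvers, whose weight on cues drops quickly as plausibility rises.
    """
    return max(0.0, 1.0 - effect_k * plausibility)
```

With a large effect_k, a small rise in plausibility produces a disproportionate drop in the weight on cues; with effect_k near zero, the weight stays close to 1 no matter how plausible the leading diagnosis seems.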

To facilitate our mathematical description of the stream of action-generated cues, we assume the information made available for interpretation is either confirming or disconfirming, depending on which leading diagnosis is under consideration. We define the variable “accuracy of leading diagnosis” such that when the problem solver’s leading diagnosis is correct, action-generated cues confirm the diagnosis, and when it is incorrect, they disconfirm the diagnosis. This feedback regarding environmental cues mimics the problem-solving situation in a range of domains, such as manufacturing defect elimination or computer debugging, while the simplifying assumption of correct or incorrect diagnosis allows us to place the mechanisms of interpretation in stark relief to clarify how they operate.

The recursive interactions between the interpreting and updating processes form a feedback loop. If new cues arrive that increase the plausibility of the leading diagnosis, then the weight on cues decreases slightly, leading to a small increase in the plausibility from new cues, which, in turn, causes updating to further increase the plausibility of the leading diagnosis, and the cycle continues. This interpretation process amplifies a change through a reinforcing feedback process (the loop identifier “R,” for reinforcing). We name this feedback process the “self-fulfilling interpretation loop.” In the absence of any offsetting influences, this loop pushes the plausibility of an early-generated diagnosis toward ever-greater plausibility. If the loop is driving toward greater plausibility of an erroneous diagnosis, it will generate the well-known self-confirming pattern of fixation, in which an initially plausible diagnosis and the filtering of external cues recursively influence each other so that the problem solver sticks to the diagnosis, despite discrepant cues. If the loop is driving toward greater plausibility of a correct diagnosis, this is salutary. As we demonstrate later, the interplay between this interpretation process and the processes of acting, interpreting cues, and cultivating alternative diagnoses gives rise to the distinctive patterns of problem solving in Rudolph’s (2003; see also Rudolph & Raemer, 2004) study.
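The loop’s amplification can be sketched in a toy simulation. Every functional form, parameter value, and name below is our illustrative assumption rather than the model documented in Appendix B; the sketch only shows how a strong effect of plausibility on cue interpretation lets the reinforcing loop lock in an incorrect diagnosis, while a weak effect lets accumulating disconfirming cues win out:

```python
def run_loop(effect_k, minutes=30.0, dt=0.1, t_update=1.0,
             time_per_step=2.0, steps_total=10, diagnosis_correct=False):
    """Euler-integrate a single leading diagnosis (no alternatives)."""
    p = 0.5        # plausibility of leading diagnosis (stock), initial 0.5
    steps = 0.0    # action steps completed
    t = 0.0
    while t < minutes:
        cues = min(1.0, steps / steps_total)           # cues require action
        truth = 1.0 if diagnosis_correct else 0.0
        evidence = cues * truth + (1.0 - cues) * 0.5   # neutral until cues arrive
        weight = max(0.0, 1.0 - effect_k * p)          # weight on cues falls with p
        p_from_cues = weight * evidence + (1.0 - weight) * 1.0  # self-confirming residue
        p += (p_from_cues - p) / t_update * dt         # updating
        steps += dt / time_per_step                    # acting generates cues
        t += dt
    return p

# With an incorrect diagnosis, a strong effect (e.g., effect_k = 1.0) drives
# plausibility toward 1 despite objectively disconfirming cues; a weak effect
# (e.g., effect_k = 0.2) lets the accumulated cues pull plausibility back down
# after an initial confirmation-biased rise.
```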

FIGURE 3
Feedback Structure of Self-Fulfilling Interpretation

[Diagram: “taking action,” paced by “time needed to take steps,” accumulates “action steps completed,” which, together with “accuracy of leading diagnosis,” determines “cues available”; cues and “weight on cues” shape “plausibility from new cues,” which feeds “updating” of the stock “plausibility of leading diagnosis”; plausibility in turn lowers “weight on cues” through the “effect of plausibility on cue interpretation,” closing the reinforcing loop R.]

Note: As “plausibility of leading diagnosis” increases, the “weight on cues” decreases, so, all else being equal, the “plausibility from new cues” continues to grow, leading to “updating” that increases “plausibility of leading diagnosis” still further. The process forms a reinforcing feedback loop, “R,” the “self-fulfilling interpretation loop.”


Action-Oriented Problem Solving Requires Cultivating New Diagnoses

Problem solvers not only assess the plausibility of their leading diagnosis but also consider alternative diagnoses that they identify through search (Gupta, Smith, & Shalley, 2006; March, 1991), conversations (Weick et al., 2005), explanations (Hirt & Markman, 1995), or imagination (Amabile, 1982). This is a knowledge-based activity relying on expertise (Gonzalez, Lerch, & Lebiere, 2003; Rasmussen et al., 1994). The process of inferring the most likely explanation for the available information is known as abduction (Josephson & Josephson, 1994; Peirce, 1958) and is complementary to deduction—forecasting data that would be the consequences of a presumed hypothesis—and induction—drawing conclusions from data. Together, abduction, deduction, and induction form a problem-solving cycle of learning- or sensemaking-through-action.

The problem-solving literature suggests a variety of ways for problem solvers to identify the most plausible hypothesis. At one extreme all possible hypotheses are held simultaneously and updated through Bayesian inference as each new piece of information is received (Fischhoff & Beyth-Marom, 1983). In well-structured clinical situations, for example, there are formal protocols that prioritize lists of diagnoses and clinical tests and treatments. Such decision rules are particularly useful for atypical problems or less experienced problem solvers (Gonzalez et al., 2003). The other extreme is represented by Klein’s (1998) naturalistic decision-making model, which proposes that for ill-structured problems representative of many real-life situations, problem solvers consider only one hypothesis at a time, mentally simulate the implications of the hypothesis given the available information, and take action if the mental simulation confirms the plausibility of the hypothesis. Only if the mental simulation fails to confirm the hypothesis do the problem solvers imagine a new diagnosis and check it for validity. In the middle, behavioral decision theory recognizes the constructive processes at work in decision behavior (Payne, Bettman, & Johnson, 1992). The problem-solving literature includes models of exemplar-based memory retrieval (Gonzalez et al., 2003; Logan, 1988) and the concept of a race between exemplars and general heuristics to produce a solution or response (Logan, 1988).

For our simulation model we rejected both the extreme Bayesian rationality model and the serial diagnosis model. The behavioral decision-making literature strongly challenges the realism of Bayesian updating, arguing that limited cognitive capacity makes such omniscience humanly impossible, even for well-structured problems (Fischhoff & Beyth-Marom, 1983). Klein’s model seems well suited to situations in which the costs of information and the costs of erroneous diagnosis are very high—for example, his fire commanders “test” their diagnosis by going into a building that may collapse around them. On the other hand, if tests are easily run, erroneous diagnoses are easily replaced, and decision makers are not sufficiently expert to see a pattern (Elstein & Schwarz, 2002; Gonzalez et al., 2003), it makes sense that problem solvers will hold multiple diagnoses in mind and seek further evidence to distinguish among them. Indeed, rather than being trained not to have hypotheses or to have all possible hypotheses, clinicians are taught to develop a “differential diagnosis,” which encourages them to hold more than one (but not “all”) diagnoses in mind and perform diagnostic tests that distinguish among the active diagnoses (Barondess & Carpenter, 1994). In Rudolph’s study almost all the subjects at some point compared simultaneous diagnoses. We therefore modeled a process that was psychologically reasonable and computationally simple, involving comparison of two potential diagnoses.

We assume that at any one point in time the problem solver has a preferred or leading diagnosis in mind and is seeking to validate or discredit that diagnosis (Elstein et al., 1978; Klein et al., 2006). Drawing on this research and Rudolph’s data, we noted that doctors run tests (albeit with high variance in the quality) that will confirm a correct diagnosis or disconfirm an incorrect one. This process of running tests is not Bayesian updating in a mathematical sense but, rather, some combination of logical rule following (“If I can pass something through the breathing tube, then it isn’t blocked”) and an intuitive totaling up of the supportive and countervailing evidence (Elstein et al., 1978; March, 1994).

We further assume that there is always at least one other diagnosis that the problem solver is imagining (or taking from a well-learned list of candidates) and that the next best alternative among these others is gathering plausibility as the patient’s status worsens despite the problem solver’s efforts. We call this process “cultivating” and combine it with the processes of acting and interpreting in Figure 4. The “plausibility of alternative diagnosis” is a stock that is increased by cultivating, the pace of which is determined by the parameter “time needed to cultivate.” We also show that the pace of cultivating is reduced when the plausibility of the leading diagnosis is high by including a variable labeled “effect of current plausibility on cultivating.”

If the alternative diagnosis catches up to the leading diagnosis, we assume that the leading diagnosis is rejected. Three things happen when the leading diagnosis is rejected: (1) the alternative diagnosis becomes the new leading diagnosis, (2) the problem solver switches to action steps appropriate for the new leading diagnosis (so “action steps completed” starts over), and (3) yet another diagnosis becomes the second-place alternative diagnosis (so “plausibility of alternative diagnosis” starts over). The model variable “change trigger” signals the rejection of the leading diagnosis, and we use dotted lines in Figure 5 to signal these changes.
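As a sketch, the three-part reset can be written as a single guard clause (the state layout and names are ours, hypothetical rather than taken from Appendix B):

```python
def maybe_switch(leading_p, alt_p, steps_done, diag_index):
    """Apply the change trigger: reject the leading diagnosis as soon as
    the alternative's plausibility catches up to it."""
    if alt_p >= leading_p:
        return (alt_p,           # (1) alternative becomes the new leading diagnosis
                0.0,             # (3) plausibility of alternative diagnosis starts over
                0.0,             # (2) action steps completed starts over
                diag_index + 1)  # the problem solver moves to the next candidate
    return leading_p, alt_p, steps_done, diag_index
```

The guard fires at the crossover moment, carrying the alternative’s accumulated plausibility over as the new leading diagnosis’s starting point.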

As we translated this causal structure into a formal mathematical model, we initially set parameters to reasonable values in the context of our motivating clinical example and refined them as we compared simulation output to our source data. We also conducted extensive sensitivity analysis to test for model robustness under extreme conditions. Complete documentation of the model equations appears in Appendix B.

SIMULATING THE DYNAMICS OF PROBLEM SOLVING

We begin with a set of experiments3 showing how the interplay of acting, interpreting, and cultivating new diagnoses produces the four modes of diagnostic problem solving observed in Rudolph’s study (2003). For clarity of exposition, we have chosen a scenario that controls for the effects of random search by assuming that all problem solvers generate alternative diagnoses in

3 “Experiment” is commonly used in the modeling community to refer to manipulations of the model parameters (Carley, 2001).

FIGURE 4
Core Model Structure Showing the Interaction of Acting, Interpreting, and Cultivating Alternatives

[Diagram: the self-fulfilling interpretation loop (R) of Figure 3—taking action, action steps completed, accuracy of leading diagnosis, cues available, weight on cues, plausibility from new cues, updating, and plausibility of leading diagnosis—augmented with a second stock, “plausibility of alternative diagnosis,” filled by “cultivating,” which is paced by “time needed to cultivate” and slowed by the “effect of current plausibility on cultivating.”]


the same modal sequence in Rudolph’s data, in which only the fourth diagnosis is correct.4

Four Modes of Action-Oriented Problem Solving

To highlight differences among the four problem-solving modes, Figure 6 displays the behavior over time of the plausibility of the leading diagnosis. The top panel is an illustration of the adaptive mode. The problem solver’s sense of the plausibility of the first diagnosis begins at its initial value of 0.5 (out of 1.0), and the dynamics of the three component problem-solving processes begin to unfold simultaneously.

First, the problem solver begins taking action steps associated with the first diagnosis, increasing the stock of action steps completed, which results in more cues available. Second, armed with some confidence in his or her diagnosis, the problem solver’s interpretations begin to increase plausibility, and the weight on cues falls slowly as the self-fulfilling interpretation loop acts to reinforce the leading diagnosis. In the first few moments the action steps of the diagnostic algorithm have not progressed much, so the limited cues have little effect on plausibility. After a short time the accumulated cues (which are “objectively” disconfirming information because the first diagnosis is incorrect) begin to show their effect on plausibility, and we see a slow decline in the plausibility of the leading diagnosis. Third, the plausibility of the alternative diagnosis builds as the cultivating process unfolds in the face of cues unfavorable to the leading diagnosis. Eventually, the plausibility of the alternative diagnosis overtakes the plausibility of the leading diagnosis. At this moment the first diagnosis is rejected and the second diagnosis becomes the leading diagnosis.

The problem solver begins pursuing the algorithm associated with the new leading diagnosis. The pattern repeats for the second and third diagnosis. When the problem solver begins to consider diagnosis number four, the correct one, plausibility begins to grow as before. However, the new cues available now offer confirmation and are interpreted to build even more plausibility of the leading diagnosis. Moreover, the

4 Simulation analyses not shown here replicate the main results for scenarios in which the correct diagnosis enters earlier or later than fourth, in which there are two correct diagnoses, and even in which the poor diagnoses have a modest degree of correctness. A summary of several hundred additional simulations demonstrating these results is available from the second author on request.

FIGURE 5
Changes to Reset the Model When the Alternative Diagnosis Becomes the Leading Diagnosis

[Diagram: the core structure of Figure 4 with an added “change trigger”; dotted lines show the resets it sends to “plausibility of leading diagnosis,” “plausibility of alternative diagnosis,” and “action steps completed” when the leading diagnosis is rejected.]


FIGURE 6
Modes of Action-Oriented Problem Solving

Note: Simulation conditions are identical except for the strength of the self-fulfilling interpretation feedback loop, as determined by the relationship between “plausibility of leading diagnosis” and “weight on cues,” shown in the insets.


self-fulfilling interpretation loop reinforces the increases in plausibility, reducing the weight on cues and thus boosting plausibility still further. The problem solver pursues the action steps to completion and converges on a steady-state choice of the correct diagnosis.

The top panel in Figure 6 shows two important features of problem-solving dynamics. First, the consideration of each diagnosis enjoys a honeymoon period during the time it takes for an alternative diagnosis to emerge as a viable contender as the basis for action—a temporal interplay between the leading and alternative diagnoses. In the adaptive mode this temporal interplay is “well balanced” in that the honeymoon period is long enough for the problem solver both to take action and to interpret the results stirred up by that action. Second, there is a dynamic interchange in the roles of acting and interpreting because the cues available accumulate slowly relative to the ongoing process of interpreting experience. So we see in Figure 6 that plausibility increases at first (even for incorrect diagnoses), but it later (for each incorrect diagnosis) decreases. Plausibility increases at first because updating driven by the confirmation-biased interpretation process occurs quickly relative to the accumulation of available cues. Meanwhile, the problem solver continues to interact with the physical world by taking action steps that generate more cues. As the disconfirming evidence mounts, it eventually overcomes the effects of the self-fulfilling interpretation: plausibility reaches a peak and then begins to decline.

The middle panel of Figure 6 shows simulation results that replicate the fixating mode. The difference between this experiment and the one in the top panel is only that the effect of plausibility on cue interpretation, depicted in the inset panel, is stronger.5 The simulation begins as before with an initial plausibility that starts to rise at first, but the lower weight on cues in this scenario allows the self-fulfilling process to gain momentum. The problem solver acts, interpreting cues and creating meaning that supports the current diagnosis, and the weight on cues falls even more. The self-fulfilling interpretation loop reinforces the current diagnosis, and because the loop is so strong, the first diagnosis is always preferred. The diagnostician does not move on to any other diagnosis. The strong reinforcing effects of the self-fulfilling interpretation loop result in a pattern of problem solving in which the problem solver is completely confident in the incorrect diagnosis. Self-fulfilling interpretations discount some disconfirming evidence, so the current diagnosis locks in prematurely, squeezing out the cultivation of alternatives, and the problem solver never has a chance to find the correct diagnosis.

The bottom panel in Figure 6 shows simulation results that replicate the vagabonding mode. In this experiment the effect of plausibility on cue interpretation is weaker.6 The first three diagnoses are rejected, but more quickly than in the adaptive case, implying that the problem solver does not take as many action steps, consistent with field data showing that diagnostic vagabonds generated diagnoses but performed few or no steps of the treatment/test algorithm (1.5 steps on average). When the fourth diagnosis, the correct one, enters as the leading diagnosis, plausibility increases, but not as rapidly as in the adaptive case. The problem solver places a higher weight on cues (owing to a weaker effect of plausibility on cue interpretation), but the cues to confirm the diagnosis accumulate somewhat slowly because they must be made available by advancing through action steps. Meanwhile, an alternative diagnosis gains favor and eventually overtakes the correct diagnosis, and the problem solver also rejects the correct diagnosis. Once this diagnosis is rejected, the problem solver continues identifying alternatives, choosing them as the leading diagnosis, and rejecting them in favor of the next emerging alternative.

The stylized problem solver in the vagabonding experiment is quite capable of cultivating alternatives and attending to cues but, lacking more confident beliefs about the plausibility of a diagnosis, does not hold onto a diagnosis long enough to adequately advance forward with action steps. The result is vagabonding, a pattern of diagnostic problem solving in which the problem solver jumps from one plausible diagnosis to the next without treating the patient. The dynamic interplay among acting, interpreting, and cultivating alternatives is out of balance: the

5 Specifically, Effect of Plausibility on Cue Interpretation = 1.

6 Specifically, Effect of Plausibility on Cue Interpretation = 0.15.


pace of generating new cues associated with the leading diagnosis (acting) is too slow relative to the pace of cultivating alternatives. The problem solver gets stuck in a mode of generating new alternatives but not discovering enough about them to reach an effective conclusion. This mode fails because the effect of plausibility on interpretation is so weak that even the correct diagnosis is rejected.

The model can also generate a mode in which the problem solver is stalled—unable to move forward to take any action steps (e.g., when both taking action steps and cultivating diagnoses are extremely slow). Rudolph’s analysis classified only two out of thirty-nine doctors as stalled. Both exhibited behaviors of advancing treatment steps little or not at all and establishing working diagnoses very slowly, consistent with this example. However, with so few examples and so little action to learn from, we omit this mode from subsequent analysis.

The Interplay of Acting, Interpreting, and Cultivating Diagnoses: Sensitivity Analysis

To examine the critical dynamic interactions among acting, interpreting, and cultivating diagnoses more closely, we conducted experiments in which we varied the pace of these processes. The first set of simulations held all parameters the same as in the vagabonding case of Figure 6, except that we varied the pace of acting.7 We discovered that the eleven simulation runs generated, shown in Figure 7, separate into two distinct patterns. The set corresponding to faster acting is adaptive: the plausibility of diagnosis number four climbs smoothly toward one. The other set, based on slower acting, displays vagabonding: diagnosis number four is rejected and new alternatives continue overtaking the lead.

This experiment points to two important results. First, different rates of taking action generate qualitatively different dynamics. A bias for action (taking action steps faster) offsets the effects of less self-fulfilling interpretation and protects the problem solver from incorrectly rejecting the correct diagnosis. Second, small differences in the rate of acting can mean the difference between adaptive sensemaking and vagabonding. This result raises the question of just what pace of taking action is needed to escape from the perils of vagabonding.
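The flavor of this sensitivity experiment can be reproduced in a reduced-form sketch. The functional forms, parameter values, and weak effect-of-plausibility setting below are our own illustrative assumptions rather than the authors’ model (documented in Appendix B); the classification rule follows the one the paper reports later (footnote 8): vagabonding if diagnosis four is rejected, fixating if diagnosis one survives past minute 30, adaptive otherwise:

```python
def simulate(time_per_step, effect_k=0.15, t_update=1.0, t_cultivate=5.0,
             steps_total=10, horizon=30.0, dt=0.05, correct_diag=4):
    """Reduced-form run: leading vs. alternative diagnosis with resets."""
    p, alt, steps, diag = 0.5, 0.0, 0.0, 1
    rejected = set()
    t = 0.0
    while t < horizon:
        cues = min(1.0, steps / steps_total)             # cues require action
        truth = 1.0 if diag == correct_diag else 0.0
        evidence = cues * truth + (1.0 - cues) * 0.5
        weight = max(0.0, 1.0 - effect_k * p)
        p_from_cues = weight * evidence + (1.0 - weight) * 1.0
        p += (p_from_cues - p) / t_update * dt           # updating
        alt = min(1.0, alt + (1.0 - p) / t_cultivate * dt)  # cultivating
        steps += dt / time_per_step                      # taking action
        if alt >= p:                                     # change trigger
            rejected.add(diag)
            diag, p, alt, steps = diag + 1, alt, 0.0, 0.0
        t += dt
    return rejected

def classify(time_per_step):
    rejected = simulate(time_per_step)
    if 4 in rejected:
        return "vagabonding"
    if 1 not in rejected:
        return "fixating"
    return "adaptive"
```

Sweeping time_per_step from fast to slow in this sketch reproduces the qualitative split: fast action lets confirming cues arrive in time to protect the correct diagnosis, while slow action lets the cultivated alternative overtake it.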

To shed light on this question, we conducted an extensive set of experiments to test the relationship among the pace of acting, the pace of

7 We did this by setting the time needed to take steps to values ranging from very fast (1 minute) to very slow (16 minutes).

FIGURE 7
Sensitivity Analysis Showing System Behavior for Various Rates of Taking Action Steps

[Plot: plausibility of diagnoses (0 to 1) against time (0 to 25 minutes) for eleven simulation runs; runs with faster acting climb smoothly toward 1, while runs with slower acting cycle through rejected diagnoses.]


cultivating alternatives, and the strength of the effect of plausibility on interpretation. We performed many sets of simulations like those in Figure 7 for various values of the effect of plausibility on cue interpretation and the time needed to cultivate. For each combination of these two parameters, we found the threshold pace of time needed to take steps that separates vagabonding from adaptive sensemaking and the threshold that separates adaptive sensemaking from fixating.8 Each of the four panels in Figure 8 shows the results for one value of the time needed to cultivate. Looking first from left to right in any panel, as the strength of the interpretation effect increases from very weak to moderate, the threshold pace of acting needed to avoid vagabonding gets slower. For interpretation effects of moderate strength, even a very slow pace of acting will lead to adaptive problem solving. But, as the interpretation effect gets very strong, the threshold pace of acting needed to avoid fixation gets faster.

The panel views in Figure 8 highlight the interactions between the pace of action and the strength of the interpretation effect. At the left of each panel, the weaker effect of the plausibility on cue interpretation requires faster action to avoid vagabonding. A weak interpretation effect describes a problem solver who wants more cues, so the pace of acting must be faster to lead to adaptive sensemaking. When the appetite for cues is high (weak effect of plausibility on cue interpretation), slow action induces vagabonding. But faster action may not always be possible or desirable. In such cases a modest degree of confidence in the leading diagnosis (i.e., a stronger effect of plausibility on interpretation) thwarts the lurking threat of vagabonding. At the right of each panel, where there is a stronger

8 To find the thresholds, we classified a simulation as vagabonding if diagnosis number four was rejected, as fixated if diagnosis number one was not rejected by minute 30, and as adaptive otherwise.

FIGURE 8
Threshold Values for the Pace of Taking Action


effect of plausibility on cue interpretation, faster action helps deter fixation. A strong interpretation effect describes a problem solver who pays little attention to cues, so the pace of acting must be faster to generate the overwhelming amount of disconfirming evidence needed to pierce the high sense of plausibility. When the attention to cues is low (a strong effect of plausibility on cue interpretation), slow action allows premature convergence on an incorrect diagnosis. Alternatively, a modest degree of skepticism in the leading diagnosis (i.e., weaker effect of plausibility) prompts an openness to cues that can prevent fixation.

Looking across the family of curves in the four panels in Figure 8, we see how the thresholds for the pace of taking action depend on both the effect of plausibility on cue interpretation and the time needed to cultivate alternate diagnoses. When the pace of cultivating alternatives is very fast, the risk of vagabonding is quite high and not mitigated much by stronger interpretation effects. Very rapid action is still needed. For a slower pace of cultivating alternatives, small increases in the strength of the interpretation effect yield stronger buffers against the threat of vagabonding, so slower paces of action are still adequate to avoid vagabonding. However, when the pace of cultivating alternatives is very slow, the risk of fixating is high and not mitigated much by weaker interpretation effects. Moreover, in contrast to the interaction around the vagabonding threshold, for a faster pace of cultivating alternatives, small decreases in the strength of the interpretation effect yield larger improvements in avoiding fixation.

DISCUSSION

Our theory of action-oriented problem solving is intended to be broadly applicable yet not universal. Three boundary conditions are reflected in the structure of the model and present important caveats for interpreting our findings and the propositions we develop next. They are (1) action-based inquiry—information cues become available only by taking action; (2) temporal dynamism—doing nothing or holding an erroneous diagnosis means the situation is deteriorating over time, so there is pressure on the problem solver to act on the leading diagnosis or to cultivate a new one; and (3) action endogeneity—moving through the steps of a course of action changes the characteristics of the environment and the stream of cues that become available.

The modeling exercise and theory development process contribute three new insights to understanding the mechanisms that link interpretation and choice in action-oriented problem solving. First, while current theories of sensemaking and decision making specialize in examining interpretation and choice, respectively, we find that they are inextricably linked through the interplay of acting, interpreting, and cultivating diagnoses. Second, we show how the interactions and pacing among acting, interpreting, and cultivating diagnoses tip people between adaptive problem solving on the one hand and fixation or vagabonding on the other. Third, we highlight the surprising result that reinforcing feedback, usually seen as the driver of maladaptive patterns such as fixation, can actually play a beneficial role in adaptive problem solving.

Linking Sensemaking and Decision-Making Perspectives

Our motivating example of anesthesiology residents, like many real-world managerial problem-solving situations, includes processes familiar from both sensemaking and decision-making theories. Consistent with the sensemaking literature, interpretation drives action, and the cues generated by such action then serve as inputs to further sensemaking. Such cues are not immediately meaningful inputs but, rather, have to be interpreted or made sense of, analogous to “editing” in prospect theory (Kahneman & Tversky, 1979). New diagnoses are generated through a process that mimics “insight” (Langley et al., 1995), and they are then updated as a function of incoming information. Consistent with the decision-making literature, choices are being made between the leading and the second diagnoses, based on their relative plausibility. While “choosing” is not a named variable in our model,9 choosing between the leading and the second diagnoses is a continual option,

9 Our model does not include a choice among actions; instead, it makes the simplifying assumption that problem solvers are following a protocol that specifies the proper test or treatment given the leading diagnosis; in situations without such a protocol, an expanded model would include a choice among actions as well as a choice among diagnoses.


and the closeness of the contest between the two choices varies as the plausibility of each diagnosis waxes and wanes.

In action-oriented problem solving, choice follows sensemaking. Indeed, there can be no choice without a previous sensemaking process; action and interpretation are needed to make available the information used for choice. And sensemaking follows choice as the selection of a new guiding hypothesis sets the problem solver on a new course of action. In contrast, classic rational choice approaches posit that preconstituted information is given and perfectly understood, and the decision process is decoupled from the sensemaking process that created the preconstituted information.

This leads to our first proposition.

Proposition 1: In the context of action-oriented problem solving, the outputs of sensemaking and decision making are inputs to each other.

Failure Modes and Success: The Dynamics of Acting, Interpreting, and Cultivating Diagnoses

The pace of acting, interpreting, and cultivating new diagnoses and the strength of the self-fulfilling interpretation loop interact to produce both the successful adaptive problem-solving mode and the three failure modes of fixation, vagabonding, and stalling. For example, although an open mind (in our model, a high weight on cues) coupled with deliberation is often crucial for effective action, in action-oriented problem solving, acting quickly, even on an uncertain diagnosis, can produce information needed to improve the diagnosis. In many situations characterized by our boundary conditions, rapid action is essential to generate diagnostic information and avert impending deterioration of a strategic position, a patient, or a wildfire. But how much confirmation bias or self-fulfilling interpretation is enough or too much? When is acting quickly an asset? When is it a liability? Our model clearly shows that there is no one answer to such questions; each problem-solving mode is a kind of "syndrome" produced by a unique combination of the pace and power of different processes. Our results, however, suggest some patterns.

Adaptive problem solving calibrates and balances self-confirmation, speed of acting, and speed of cultivating alternatives. The window for adaptive problem solving gradually closes as the competition between the leading and the alternative diagnosis gets tighter. Just at the moment when the problem solver could disconfirm the leading diagnosis and switch to another, the momentum of a strong effect of plausibility on cue interpretation sweeps him or her into fixation—or just at the moment when sticking with the leading diagnosis would help advance the appropriate action steps, rapid generation of alternative diagnoses combined with slow accumulation of cues sweeps him or her into vagabonding. Analogously, organizations can fall into dysfunctional modes by over- or underemphasizing the speed of decision making (Eisenhardt, 1989; Perlow et al., 2002). Hence, rather than emphasizing beneficial processes over dysfunctional biases, our model shows that the same processes that drive adaptive problem solving, when out of balance, lead to dysfunctional problem solving in the form of fixation or vagabonding.

The simulation results suggest that faster action is beneficial in creating a larger region of adaptive problem solving (Figure 8). Our results show no detrimental effects of rapid action, but in many cases rapid action may be problematic, such as when not acting on short-term trends is beneficial (cf. March, 1991; Sastry, 1997), or when fast action itself transforms the expected pace of adaptation in an entrepreneurial environment in a way that is harmful to all players (Perlow et al., 2002). However, when these "side effects" are of little concern, a faster test or faster information feedback is better, all else being equal. Indeed, in the extreme case where action and information feedback are instantaneous, the problem approximates a static decision-making choice in which information is preconstituted and given.

Proposition 2: In action-oriented problem solving, faster action decreases the risk of failure from fixating or vagabonding.

The results also show how problem solvers should adapt to different paces of change in the environment. Consider the challenge of how quickly to cultivate alternative explanations. A slower pace of cultivating alternative hypotheses is useful to reduce the threat of vagabonding, but too slow a pace may increase the risk of fixation. Cultivating new diagnoses exerts a constant pressure on the problem solver to improve the situation, either by advancing a current course of action or switching to a different course. The extent to which this pressure is needed depends on the pace at which the problem situation is worsening—that is, the degree of temporal dynamism. Slower cultivation keeps open the window of opportunity to more fully explore a leading diagnosis. An environment that is changing rather slowly offers the luxury of slow cultivation of alternatives. But when the status of the environment is rapidly deteriorating, even short stints of holding onto an erroneous leading diagnosis can lead to disaster.

Proposition 3: In action-oriented problem solving, the pace of cultivating alternatives should correspond to the speed of undesirable change in the environment.

Other features of the environment—particularly, how quickly or slowly cues become available and how time consuming it is to generate alternatives—can exacerbate or inhibit problem-solving failures. Our analysis shows that a rapid pace of acting, even when interpretation effects are weak, prevents vagabonding, and a rapid pace of cultivating alternatives fosters vagabonding.

Proposition 4: Vagabonding will be more likely when action is constrained or information feedback is delayed.

Proposition 5: Vagabonding will be more likely when it is relatively easy to generate plausible alternatives.

Similarly, our analysis shows that either a rapid pace of acting when interpretation effects are strong or a rapid pace of cultivating alternatives reduces the risk of fixation. People are more susceptible to fixation when the rate at which cues emerge and the rate at which new diagnoses are cultivated are both relatively slow in relation to the pace at which plausibility builds. Working through action steps more quickly can provide the necessary cues to weaken the self-fulfilling interpretation loop and also allow new data to speed up the pace of cultivating alternatives.

Proposition 6: Fixation will be more likely when action is constrained or information feedback is delayed.

Proposition 7: Fixation will be more likely when it is relatively difficult to generate plausible alternatives.

Overreliance on a well-learned combination of action pacing, self-fulfilling interpretation, and rate of cultivating new diagnoses can derail problem solving when environmental demands change. The strength of the effect of plausibility on cue interpretation moderates the balance between acting and cultivating. Problem solvers prone to either vagabonding or fixation have three possible solutions. To avoid vagabonding, they may speed up action, hold diagnoses more confidently, or slow down cultivating. To avoid fixation, they may speed up action, hold diagnoses less confidently, or speed up cultivating. Some problem solvers will use a narrow range of combinations of these tactics in their action-oriented problem solving, whereas others, perhaps more expert, will develop a wider range of solution sets and the ability to better calibrate them to the situation at hand.

Proposition 8: Effectiveness in action-oriented problem solving is a function of people's ability to vary the pace of action, intensity of self-fulfilling interpretation, and pace of cultivating new diagnoses in relation to environmental demands.

Reinforcing Loops and Confirmation Bias Can Be Beneficial

Our modeling results also highlight the counterintuitive benefits of reinforcing loops and confirmation bias, which both tend to be associated with vicious cycles, as in theories of fixation and escalation of commitment (Arkes & Blumer, 1985; Staw, 1976). Confirmation bias is generally seen as either present or not and as undesirable when present, but our model shows that this bias varies in degree and sometimes helps. Although an overly strong reinforcing process may lead to fixation, diagnostic vagabonds can benefit from a moderately strong self-fulfilling process that allows them to sustain attention to the correct diagnosis long enough to find the right corrective action. Appendix A provides four hypothetical examples of confirmation bias that can produce sensemaking modes ranging from adaptive problem solving to fixation to vagabonding.

The unexpected benefit of a self-fulfilling interpretation loop emerges in situations in which it is faster to generate an alternative idea than to take the action steps needed to evaluate the merits of a particular course of action, especially in the face of incomplete, delayed, and ambiguous information. Some inertia is necessary to launch and sustain action. Self-fulfilling interpretation allows the problem solver to "preserve the frame" (Klein et al., 2006) while the leading diagnosis gathers support, shielded from the pressure to jump to a new diagnosis. Like hill climbing in a rugged solution terrain, some self-fulfilling interpretation allows problem solvers to stay the course beyond a couple of short-term failures to ascertain if they are indeed real failures (Fleming & Sorenson, 2004). We therefore suggest the following proposition.

Proposition 9: When accessing cues from the environment is a lengthy process and generating alternatives is quick and easy, a moderate degree of self-fulfilling interpretation is beneficial.

Limitations

Given the interacting constructs and relationships in the model, there is more than one way to generate the problem-solving behaviors observed by Rudolph (2003). Our argument is not that our model is an exact or complete representation of psychological processes in action-oriented problem solving but, rather, that the variations in model behavior have been sufficiently realistic and informative to provide insights about theory. The motivating study that sparked this theorizing examined doctors at an intermediate level of training working on one clinical problem in one specialty. It is likely that results might be different in different clinical specialties, different professions, and at different levels of expertise. However, by articulating mechanisms in a stylized way, we have suggested constructs and relationships that could guide further empirical exploration as well as further theory development.

Improving Problem Solving in Practice

Whereas researchers and practitioners have previously identified several ways to avoid fixation, some of these may increase vagabonding. We suggest three general strategies for individuals and organizations to improve problem solving in practice: awareness, training, and task design. Managers, trainers, and individual professionals need to enhance their awareness of problem-solving patterns so that they can assess their tendencies toward fixation, vagabonding, stalling, and adaptive problem solving. Even in the heat of the moment, professionals can be more self-aware, recognizing repeat treatments as a signal of fixation and multiple disconnected actions as a signal of vagabonding.

Training is a classic and effective approach to knowledge-based work. By recognizing the problem-solving patterns, their symptoms, and causes, training can be enhanced. For example, professionals such as managers, executives, disaster first responders, and clinicians could use full-field or computer simulations to experience the "feel" of acting too slowly or too quickly while "keeping the bubble" of understanding the situation and its possibilities. Problem solvers could learn to recognize the moment at which two diagnoses have similar plausibility and when it is critical for them to observe their own problem-solving process to calibrate the strength of the self-fulfilling interpretation and the balance between rates of generating new diagnoses and accumulating information.

Problem-solving environments are often presumed to be given, but task design and coordination processes strongly affect outcomes. In Rudolph's study residents were expected to use protocols that simplified their diagnostic challenges; the protocols were based on accumulated clinical experience and possibly controlled experimentation. In hospitals, manufacturing plants, and computer programming labs, the flow of information from action depends at least as much on the quality of organizational processes, coordination, and support (the speed and quality of lab tests, prototypes, and beta tests) as on the mental activity of the manager or clinician (Gittell, 2009). Managers can rethink their policies and investment priorities to relieve constraints that limit the pace of action or the pace of generating alternatives.


APPENDIX A

Figure A-1 describes different relationships between the plausibility ascribed to the leading diagnosis and how heavily the problem solver weights cues. Looking first at the axes of the figure, when the plausibility of the leading diagnosis is at its extreme value of 0 (the problem solver perceives the leading diagnosis as completely implausible), the weight on cues is at its extreme value of 1 (the problem solver pays full attention to cues). Conversely, when plausibility equals 1, the weight on cues is equal to 0 (except when the effect of plausibility on cue interpretation is equal to 0, in which case no matter how plausible or implausible the problem solver deems his or her current diagnosis, he or she always gives full weight to cues).

When the effect of plausibility on cue interpretation is equal to 0.15, which characterizes a cautious problem solver, even large increases in the plausibility of the leading diagnosis do not much diminish the weight the problem solver places on cues. It is not until he or she is almost completely certain the diagnosis is plausible that the weight on cues diminishes. When the effect of plausibility on cue interpretation is 0.5, describing a moderately bold problem solver, decreases in weight on cues are greater for a given increase in the plausibility of the leading diagnosis. When the effect is equal to 1, depicting a slightly bolder problem solver, an increase in plausibility brings about a proportional decrease in weight on cues.

Weight on Cues (t) = (1 − Plausibility of Leading Diagnosis (t)) ^ Effect of Plausibility on Cue Interpretation,

where the exponent "effect of plausibility on cue interpretation" is a parameter chosen to represent possible individual and/or situational differences.

FIGURE A-1
Weight on Cues As a Function of Plausibility of Leading Diagnosis for Various Settings of Effect of Plausibility on Cue Interpretation

[Figure: weight on cues (vertical axis, 0 to 1) plotted against plausibility of leading diagnosis (horizontal axis, 0 to 1) for effect parameter values of 0, 0.15, 0.5, and 1.]
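The curve family in Figure A-1 follows directly from the weight-on-cues equation. As a numerical check, here is a minimal Python sketch (the function name is ours) that reproduces sample values along the four plotted curves:

```python
def weight_on_cues(plausibility, effect):
    """Weight on Cues = (1 - Plausibility of Leading Diagnosis) ** Effect.

    effect = 0 keeps full weight on cues at any plausibility; larger
    values let a plausible leading diagnosis crowd out new cues.
    """
    return (1.0 - plausibility) ** effect

# Sample points along the four curves in Figure A-1.
for effect in (0, 0.15, 0.5, 1):
    curve = [round(weight_on_cues(p / 10, effect), 2) for p in range(11)]
    print(f"effect parameter = {effect}: {curve}")
```

With effect = 1 the relationship is the straight line weight = 1 − plausibility; with effect = 0.15 the weight erodes only slowly until plausibility nears 1, matching the cautious problem solver described above.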


APPENDIX B

Integral equations are written in this appendix using the following notation: Stock = INTEG (Inflow − Outflow, Initial Value of Stock), where the INTEG function means the integral from time 0 to time t (the current time) of the inflow less the outflow plus the initial value of the stock. The model is simulated using Vensim DSS software (available from www.vensim.com).

Equations for the Acting Subsection (Figure 1)

Action Steps Completed = INTEG (Taking Action − Resetting Action Steps, 0)

Units: Dimensionless

Taking Action = (1 − Action Steps Completed)/Time Needed to Take Steps

Units: Dimensionless/Minute

Time Needed to Take Steps = 8
Units: Minute

Cues Available = (Starting Plausibility of Leading Diagnosis + Action Steps Completed × (Accuracy of Leading Diagnosis − Starting Plausibility of Leading Diagnosis))

Units: Dimensionless

Accuracy of Leading Diagnosis = IF THEN ELSE (Current Diagnosis = True Diagnosis, 1, 0)

Units: Dimensionless

True Diagnosis = 4
Units: Dimensionless

Equations for the Interpreting Subsection (Figures 2 and 3)

Plausibility of Leading Diagnosis = INTEG (Updating + Carry Over to Leading − Resetting Leading, Initial Plausibility)

Units: Dimensionless

Updating = (Plausibility from New Cues − Plausibility of Leading Diagnosis)/Time Needed to Update

Units: Dimensionless/Minute

Plausibility from New Cues = Cues Available × Weight on Cues + (1 − Weight on Cues)
Units: Dimensionless

Time Needed to Update = 2
Units: Minute

Weight on Cues = (1 − Plausibility of Leading Diagnosis) ^ Effect of Plausibility on Cue Interpretation

Units: Dimensionless

Effect of Plausibility on Cue Interpretation = 0.5
Units: Dimensionless

Equations for the Cultivating Alternatives Subsection (Figure 4)

Plausibility of Alternative Diagnosis = INTEG (Cultivating − Resetting Alternative, 0)
Units: Dimensionless

Cultivating = Effect of Plausibility on Alternative × (1 − Plausibility of Alternative Diagnosis)/Time Needed to Cultivate
Units: Dimensionless/Minute

Effect of Plausibility on Alternative = min (1, 2 − 2 × Plausibility of Leading Diagnosis)
Units: Dimensionless

Time Needed to Cultivate = 4
Units: Minute

Equations for Switching Diagnoses (Figure 5)

Change Trigger = IF THEN ELSE (Plausibility of Leading Diagnosis < Plausibility of Alternative Diagnosis, 1, 0)/TIME STEP

Units: Dimensionless/Minute

Resetting Action Steps = Action Steps Completed × Change Trigger
Units: Dimensionless/Minute

Resetting Leading = Plausibility of Leading Diagnosis × Change Trigger

Units: Dimensionless/Minute

Carry Over to Leading = Resetting Alternative
Units: Dimensionless/Minute

Resetting Alternative = Plausibility of Alternative Diagnosis × Change Trigger

Units: Dimensionless/Minute

Starting Plausibility of Leading Diagnosis = INTEG (New Plausibility − Resetting Starting Plausibility, Initial Plausibility)

Units: Dimensionless


New Plausibility = Resetting Alternative
Units: Dimensionless/Minute

Resetting Starting Plausibility = Change Trigger × Starting Plausibility of Leading Diagnosis
Units: Dimensionless/Minute

Initial Plausibility = 0.5
Units: Dimensionless

Current Diagnosis = INTEG (Diagnosis Counter, 1)
Units: Dimensionless

Diagnosis Counter = Change Trigger
Units: Dimensionless/Minute
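Taken together, these equations define a small stock-and-flow system that can be reproduced outside Vensim. Below is a minimal Euler-integration sketch in Python; it is our translation, not the authors' code: the variable names, the fixed 0.125-minute time step, and the 240-minute horizon are our choices, and the switching logic assumes the change trigger fires when the alternative's plausibility exceeds the leading diagnosis's.

```python
def simulate(minutes=240.0, dt=0.125, effect_on_cues=0.5, true_diagnosis=4):
    """Euler integration of the Appendix B stocks and flows (a sketch)."""
    # Stocks, with initial values from the INTEG equations.
    steps = 0.0      # Action Steps Completed
    leading = 0.5    # Plausibility of Leading Diagnosis (Initial Plausibility)
    alt = 0.0        # Plausibility of Alternative Diagnosis
    starting = 0.5   # Starting Plausibility of Leading Diagnosis
    diagnosis = 1.0  # Current Diagnosis

    t = 0.0
    while t < minutes:
        # Acting subsection.
        accuracy = 1.0 if round(diagnosis) == true_diagnosis else 0.0
        cues = starting + steps * (accuracy - starting)   # Cues Available
        taking_action = (1.0 - steps) / 8.0               # Time Needed to Take Steps = 8

        # Interpreting subsection.
        weight = (1.0 - leading) ** effect_on_cues        # Weight on Cues
        plaus_from_cues = cues * weight + (1.0 - weight)  # Plausibility from New Cues
        updating = (plaus_from_cues - leading) / 2.0      # Time Needed to Update = 2

        # Cultivating subsection (effect of plausibility folded in directly).
        cultivating = min(1.0, 2.0 - 2.0 * leading) * (1.0 - alt) / 4.0

        # Switching: the trigger empties the resetting stocks in one time step.
        change = (1.0 / dt) if leading < alt else 0.0     # Change Trigger
        resetting_alt = alt * change                      # also Carry Over / New Plausibility

        # Euler updates of the stocks (all flows computed from prior values).
        steps += dt * (taking_action - steps * change)
        leading += dt * (updating + resetting_alt - leading * change)
        alt += dt * (cultivating - resetting_alt)
        starting += dt * (resetting_alt - starting * change)
        diagnosis += dt * change                          # Diagnosis Counter
        t += dt

    return {"diagnosis": diagnosis, "leading": leading, "alt": alt}

result = simulate()
print(result)
```

In this sketch, varying effect_on_cues or the three time constants shifts the run among the adaptive, fixation, and vagabonding regimes discussed in the text.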

REFERENCES

Amabile, T. M. 1982. Social psychology of creativity: A consensual assessment technique. Journal of Personality and Social Psychology, 43: 997–1013.

Arkes, H. R., & Blumer, C. 1985. The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35: 124–140.

Arthur, W. B. 1990. Positive feedbacks in the economy. Scientific American, 268: 92–99.

Barondess, J., & Carpenter, C. (Eds.). 1994. Differential diagnosis. Philadelphia: Lea & Febiger.

Bartunek, J. M. 1984. Changing interpretive schemes and organizational restructuring: The example of a religious order. Administrative Science Quarterly, 29: 355–372.

Black, L., Carlile, P., & Repenning, N. P. 2004. A dynamic theory of expertise and occupational boundaries in new technology implementation: Building on Barley's study of CT scanning. Administrative Science Quarterly, 49: 572–607.

Cannon-Bowers, J. A., & Salas, E. 1998. Making decisions under stress: Implications for individual and team training. Washington, DC: American Psychological Association.

Carley, K. 2001. Computational approaches to sociological theorizing. In J. Turner (Ed.), Handbook of sociological theory: 69–84. New York: Kluwer Academic/Plenum.

Carroll, J. S., Rudolph, J. W., & Hatakenaka, S. 2002. Learning from experience in high-hazard industries. Research in Organizational Behavior, 24: 87–137.

Carroll, J. S., Rudolph, J. W., Hatakenaka, S., Wiederhold, T. L., & Boldrini, M. 2001. Learning in the context of incident investigation: Team diagnoses and organizational decisions at four nuclear power plants. In E. Salas & G. Klein (Eds.), Linking expertise and naturalistic decision making: 349–365. Mahwah, NJ: Lawrence Erlbaum Associates.

Cook, R. I., & Woods, D. 1994. Operating at the sharp end: The complexity of human error. In B. S. Bogner (Ed.), Human error in medicine: 255–310. Hillsdale, NJ: Lawrence Erlbaum Associates.

Cyert, R. M., & March, J. G. 1963. A behavioral theory of the firm. Englewood Cliffs, NJ: Prentice-Hall.

Davis, J. P., Eisenhardt, K. M., & Bingham, C. B. 2007. Developing theory through simulation methods. Academy of Management Review, 32: 480–499.

De Keyser, V., & Woods, D. D. 1990. Fixation errors: Failures to revise situation assessment in dynamic and risky systems. In A. G. Colombo & A. Saiz de Bustamante (Eds.), Systems reliability assessment: 231–251. Amsterdam: Kluwer.

Dörner, D. 1997. The logic of failure: Recognizing and avoiding error in complex situations. New York: Perseus.

Dreyfus, H. L. 1997. Intuitive, deliberative, and calculative models of expert performance. In C. E. Zsambok & G. Klein (Eds.), Naturalistic decision making: 17–28. Mahwah, NJ: Lawrence Erlbaum Associates.

Edmondson, A. 2002. Disrupted routines: Team learning and new technology implementation in hospitals. Administrative Science Quarterly, 46: 685–716.

Eisenhardt, K. M. 1988. Agency theory: An assessment and review. Academy of Management Review, 14: 57–74.

Eisenhardt, K. M. 1989. Making fast strategic decisions in high-velocity environments. Academy of Management Journal, 32: 543–576.

Elstein, A. S. 2001. Naturalistic decision making and clinical judgment. Journal of Behavioral Decision Making, 14: 363–365.

Elstein, A. S., & Schwarz, A. 2002. Evidence base of clinical diagnosis: Clinical problem solving and diagnostic decision making: Selective review of the cognitive literature. British Medical Journal, 324: 729–732.

Elstein, A. S., Shulman, L. S., & Sprafka, S. A. 1978. Medical problem solving: An analysis of clinical reasoning. Cambridge, MA: Harvard University Press.

Fischhoff, B., & Beyth-Marom, R. 1983. Hypothesis evaluation from a Bayesian perspective. Psychological Review, 90: 239–260.

Fleming, L., & Sorenson, O. 2004. Science as a map in technological search. Strategic Management Journal, 25: 909–928.

Forrester, J. W. 1961. Industrial dynamics. Portland, OR: Productivity.

Gaba, D. M. 1989. Human error in anesthetic mishaps. International Anesthesiology Clinics, 27(3): 137–147.

Gersick, C. J. G., & Hackman, J. R. 1990. Habitual routines in task-performing groups. Organizational Behavior and Human Decision Processes, 47: 65–97.

Gittell, J. H. 2009. High performance healthcare: Using the power of relationships to achieve quality, efficiency and resilience. New York: McGraw-Hill.

Glaser, B. G., & Strauss, A. L. 1967. The discovery of grounded theory: Strategies for qualitative research. New York: Aldine.


Gonzales, C., Lerch, J. F., & Lebiere, C. 2003. Instance-based learning in dynamic decision making. Cognitive Science, 27: 591–635.

Gupta, A. K., Smith, K. G., & Shalley, C. E. 2006. The interplay between exploration and exploitation. Academy of Management Journal, 49: 683–706.

Hirt, E. R., & Markman, K. 1995. Multiple explanation: A consider-an-alternative method for debiasing judgments. Journal of Personality and Social Psychology, 69: 1069–1086.

Hodgkinson, G. P., & Healey, M. P. 2008. Cognition in organizations. Annual Review of Psychology, 59: 387–417.

Johnson, P. E., Hassenbrock, F., Duran, A. S., & Moller, J. H. 1982. Multimethod study of clinical judgment. Organizational Behavior and Human Performance, 30: 201–230.

Johnson, P. E., Moen, J. B., & Thompson, W. B. 1988. Garden path errors in diagnostic reasoning. In L. Bolc & M. J. Coombs (Eds.), Expert system applications: 395–427. Berlin: Springer-Verlag.

Jonas, E., Schulz-Hardt, S., Frey, D., & Thelen, N. 2001. Confirmation bias in sequential information search after preliminary decisions: An expansion of dissonance theoretical research on selective exposure to information. Journal of Personality and Social Psychology, 80: 557–571.

Josephson, J. R., & Josephson, S. G. (Eds.). 1994. Abductive inference: Computation, philosophy, technology. New York: Cambridge University Press.

Kahneman, D., & Tversky, A. 1979. Prospect theory: An analysis of decision under risk. Econometrica, 47: 263–291.

Klein, G. 1998. Sources of power. Cambridge, MA: MIT Press.

Klein, G., Phillips, J. K., Rall, E., & Peluso, D. A. 2006. A data/frame theory of sensemaking. In R. Hoffman (Ed.), Expertise out of context. Mahwah, NJ: Lawrence Erlbaum Associates.

Klein, G., Pliske, R., Crandall, B., & Woods, D. 2005. Problem detection. Cognition, Technology and Work, 7: 14–28.

Klein, G. A., Orasanu, J., Calderwood, R., & Zsambok, C. E. 1993. Decision making in action. Norwood, NJ: Ablex.

Kleinmuntz, D. N. 1985. Cognitive heuristics and feedback in a dynamic decision environment. Management Science, 31: 680–702.

Koehler, D. J. 1991. Explanation, imagination, and confidence in judgment. Psychological Bulletin, 110: 499–519.

Langley, A., Mintzberg, H., Pitcher, P., Posada, E., & Saint-Macary, J. 1995. Opening up decision making: The view from the black stool. Organization Science, 6: 260–279.

Levitt, B., & March, J. G. 1988. Organizational learning. Annual Review of Sociology, 14: 319–340.

Logan, G. D. 1988. Toward an instance theory of automatization. Psychological Review, 95: 492–527.

Louis, M. R., & Sutton, R. I. 1991. Switching cognitive gears: From habits of mind to active thinking. Human Relations, 44: 55–76.

Mandler, G. 1982. Stress and thought processes. In S. Goldberger & S. Breznitz (Eds.), Handbook of stress: 88–164. New York: Free Press.

Mandler, G. 1984. Mind and body. New York: Norton.

March, J. G. 1991. Exploration and exploitation in organizational learning. Organization Science, 2: 71–87.

March, J. G. 1994. A primer on decision making: How decisions happen. New York: Free Press.

March, J. G. 1997. Understanding how decisions happen in organizations. In Z. Shapira (Ed.), Organizational decision making: 9–32. Cambridge: Cambridge University Press.

Marcus, A. A., & Nichols, M. 1999. On the edge: Heeding the warning of unusual events. Organization Science, 10: 482–499.

Neisser, U. 1976. Cognition and reality: Principles and implications of cognitive psychology. San Francisco: Freeman.

Payne, J. W., Bettman, J. R., & Johnson, E. J. 1992. Behavioral decision research: A constructive processing perspective. Annual Review of Psychology, 43: 87–131.

Peirce, C. S. 1958. Collected papers. Cambridge, MA: Harvard University Press.

Perlow, L., Okhuysen, G., & Repenning, N. P. 2002. The speed trap: Exploring the relationship between decision making and temporal context. Academy of Management Journal, 45: 931–955.

Rasmussen, J., Pejtersen, A. M., & Goodstein, L. P. 1994. Cognitive systems engineering. New York: Wiley.

Repenning, N. P., & Sterman, J. D. 2002. Capability traps and self-confirming attribution errors in the dynamics of process improvement. Administrative Science Quarterly, 47: 265–295.

Roberts, K. H. 1990. Some characteristics of one type of high reliability organization. Organization Science, 1: 160–176.

Rudolph, J. W. 2003. Into the big muddy and out again: Error persistence and crisis management in the operating room. Chestnut Hill, MA: Boston College.

Rudolph, J. W., & Raemer, D. B. 2004. Diagnostic problem solving during simulated crises in the OR. Anesthesia and Analgesia, 98(5S): S34.

Rudolph, J. W., & Repenning, N. P. 2002. Disaster dynamics: Understanding the role of quantity in organizational collapse. Administrative Science Quarterly, 47: 1–30.

Sastry, M. A. 1997. Problems and paradoxes in a model of punctuated organizational change. Administrative Science Quarterly, 42: 237–275.

Simon, H. 1957. Models of man: Social and rational. New York: Wiley.


Smith, S. M., & Blankenship, S. E. 1991. Incubation and the persistence of fixation in problem solving. American Journal of Psychology, 104: 61–87.

Snook, S. A. 2000. Friendly fire: The accidental shootdown of U.S. Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press.

Starbuck, W. H., Greve, A., & Hedberg, B. L. T. 1978. Responding to crises. Journal of Business Administration, 9: 111–137.

Staw, B. M. 1976. Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action. Organizational Behavior and Human Performance, 16: 27–44.

Sterman, J. 2000. Business dynamics. Chicago: Irwin-McGraw Hill.

Sterman, J. D. 1994. Learning in and about complex systems. System Dynamics Review, 10: 291–330.

Strauss, A., & Corbin, J. 1994. Grounded theory methodology: An overview. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research: 273–285. Thousand Oaks, CA: Sage.

Suddaby, R. 2006. From the editors: What grounded theory is not. Academy of Management Journal, 49: 633–642.

Sutcliffe, K., & Weber, K. 2003. The high cost of accurate knowledge. Harvard Business Review, 81(5): 74–87.

Torbert, W. R., & Taylor, S. S. 2008. Action inquiry: Interweaving multiple qualities of attention for timely action. In P. Reason & H. Bradbury (Eds.), The Sage handbook of action research: 239–251. London: Sage.

Tversky, A., & Kahneman, D. 1974. Judgment under uncertainty: Heuristics and biases. Science, 185: 1124–1131.

Weick, K. E. 1993. The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38: 628–652.

Weick, K. E. 1995. Sensemaking in organizations. Thousand Oaks, CA: Sage.

Weick, K. E., Sutcliffe, K., & Obstfeld, D. 2005. Organizing and the process of sensemaking. Organization Science, 16: 409–421.

Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. 1999. Organizing for high reliability: Processes of collective mindfulness. Research in Organizational Behavior, 21: 81–123.

Winter, S. G. 1971. Satisficing, selection, and the innovating remnant. Quarterly Journal of Economics, 85: 237–261.

Xiao, Y., & MacKenzie, C. F. 1995. Decision making in dynamic environments: Fixation errors and their causes. Paper presented at the Human Factors and Ergonomics Society 39th annual meeting, Santa Monica, CA.

Zsambok, C. E., & Klein, G. 1997. Naturalistic decision making. Mahwah, NJ: Lawrence Erlbaum Associates.

Jenny W. Rudolph ([email protected]) is assistant clinical professor of anesthesia at Harvard Medical School and assistant in research in the Department of Anesthesia and Critical Care at Massachusetts General Hospital. She is associate director of the Institute for Medical Simulation at the Center for Medical Simulation. She received her Ph.D. in management from Boston College. Her research focuses on the role of communication and cognition in dynamic problem solving and crisis management, primarily in health care.

J. Bradley Morrison ([email protected]) is assistant professor of management at Brandeis International Business School, Brandeis University. He received his Ph.D. from the MIT Sloan School of Management. His research focuses on how people and organizations cope with dynamic complexity. Current interests include problems in business process implementation, learning, and organizational change.

John S. Carroll ([email protected]) is the Morris A. Adelman Professor of Management at the MIT Sloan School of Management and the MIT Engineering Systems Division. He received his Ph.D. from Harvard University. His research focuses on decision making and learning in organizational contexts, especially in high-hazard industries such as nuclear power and health care.
