7/27/2019 Corner - Structuring Decision Problems
A DYNAMIC MODEL
FOR STRUCTURING DECISION PROBLEMS
James Corner
John Buchanan
Waikato Management School
University of Waikato
New Zealand

Mordecai Henig
Faculty of Management
Tel Aviv University
Israel
March, 2000
Keywords: Problem Structuring; Descriptive Decision Making; Multi-Criteria
INTRODUCTION
The structuring of decision problems is arguably the most important activity in the decision
making process (Mintzberg et al., 1976; Abualsamh et al., 1990; Korhonen, 1997; Perry and Moffat, 1997). According to Nutt (1998a), the failure rate for strategic decisions in general lies at about 50%, and the implementation rate for multicriteria decision-making efforts is even worse (Kasanen et al., 2000). This suggests to us that existing decision problem structuring approaches are not entirely adequate to the needs of decision makers or, if they are, then decision makers are not using them. Practitioners and academics are calling for better decision problem structuring (Korhonen, 1997; Taket and White, 1997; Wright and Goodwin, 1999, to name a few), recognizing that, in general, improved decision structuring will improve the quality of the decision outcome. It is this call that motivates the work presented here.
In this paper we develop a conceptual model for multi-criteria decision problem structuring. Our starting position is consistent with that of Simon (1955), where decision makers are assumed to be intendedly rational in their decision-making. This perspective of bounded rationality accommodates the persistent departure of decision makers' choice behaviour from strict economic rationality as a result of behavioural limitations, and is well documented (see, for example, Kleindorfer et al., 1993 or Bazerman, 1990). Also, this approach brings together our earlier work on process concepts in multicriteria decision making (Henig and Buchanan, 1996; Buchanan et al., 1998; Buchanan et al., 1999) and that of other authors, notably Wright and Goodwin (1999).
Such rational approaches to problem structuring typically involve the determination of alternatives, criteria, and attributes to measure the criteria. The goal is then to develop ways in which these components can be generated and considered relative to each other. Previous efforts at decision problem structuring (which include von Winterfeldt, 1980; Keller and Ho, 1988; Henig and Buchanan, 1996 and Wright and Goodwin, 1999) address the creation of criteria and alternatives, and present arguments about their inter-relationships in a static way. The model we present here involves the dynamic interaction of criteria and alternatives, which helps explain where these components come from and how they can influence each other in the structuring process. We offer this model as a descriptive model of problem structuring behaviour, in that it accommodates a number of empirically validated models of decision making. We further offer this model as a prescriptive model, in that it follows from the earlier prescription of Henig and Buchanan (1996), and others cited above.
The distinction, and the gap, between descriptive and prescriptive models is important and reflects the different perspectives adopted by the Psychology and Management Science research communities, respectively. The descriptive perspective focuses on how we actually make decisions and the underlying reasons for such behaviour. The prescriptive perspective considers how we should make decisions, with a view to improving the quality of decision-making, both in terms of process and outcome.
This paper, then, bridges the gap between prescriptive and descriptive models of decision
making. Prescriptive models lack empirical validity and often require a standard of rationalitythat is theoretically satisfying but practically unobtainable. Descriptive models, while rich in
their description of real behaviour, are often context-dependent (in respect of both decision
making context and decision maker cognition), thereby limiting their general applicability for prescription. One principle underlying our approach here is to draw heavily from both perspectives and synthesise them into an approach that has a well-reasoned foundation, clear empirical support, and the potential to be used in practice.
The paper is organized as follows. We first discuss several definitions of problem structuring
found in the literature, then present and explain our proposed dynamic model. We next perform several validation tests of our model, which demonstrate the suitability of the model as both a
descriptive and prescriptive model of decision problem structuring. Finally, we discuss severalimplications for the use of the model in multicriteria decision-making, and end the paper with a
summary.
DEFINING PROBLEM STRUCTURING
Simon (1960) proposed a three-phase decision process that continues to be the foundation for many decision making methodologies. The three phases of this well-known process model are
intelligence, design and choice. Intelligence involves determining if a problem that has occurredactually requires a decision, which in turn involves some information gathering activities. Once
the decision has been identified, the design phase begins; we refer to this phase as decision problem structuring, in which alternatives, criteria and attributes are identified and considered. The final phase is choice, which, based on identified criteria, describes the activity of selecting the most appropriate course of action (alternative). Since our emphasis is on multiple criteria, the identification and role of criteria in the design stage is of immediate and particular importance. Simon later added, but rarely discussed, a fourth stage, review activity, which involved reviewing past choices.
Despite the development of many models that describe and prescribe the structuring of decision problems, it remains as much art as science. There is no single agreed-upon definition of this important aspect of the decision process, but examples include:
- a search for underlying structure (Rivett, 1968), or facts (Thierauf, 1978),
- an intellectual process during which a problem situation is made specific (Majone, 1980),
- a process for dealing with issues (Taket and White, 1997),
- the process by which a problem situation is translated into a form enabling choice (Dillon, 2000), and
- the specification of states of nature, options, and attributes for evaluating options (Keller and Ho, 1988).

These examples provide general definitions of problem structuring, except for that of Keller and Ho, who state the definition most often applied in multicriteria decision making and used by us in this paper.
Woolley and Pidd (1981) propose a scheme for categorising definitions and approaches to
problem structuring. Although their review of problem structuring may appear dated, their categorisation captures the philosophical differences of these approaches. Their four streams
are:
- A Checklist Stream, where problems are seen as failures or breakdowns, so a step-by-step procedure is sufficient to correct the problem.
- A Definition Stream, where problems are seen as a collection of elements such as criteria and alternatives, and the goal is to put these elements, or variables, into some sort of (decision) model.
- A Science Research Stream, where quantitative data are gathered in an effort to uncover the real problem and its underlying structure.
- A People Stream, where problems are individual and/or social constructions of differing realities, so problems do not exist as concrete entities but arise from different perceptions.
We give particular attention to the definition and science research streams as most approaches
for structuring decision problems fall into one of these. The checklist stream addresses simple,routine decision problems, such as diagnostic checking. Soft Systems Methodology (Checkland,
1981), where multiple perspectives are considered, is one example of the last stream. In terms of streams two and three, an important distinction should be made. Problem structuring approaches that assume a particular structure a priori belong to the definition stream. Most decision problem structuring (including our proposed approach) falls into this stream. For example, Decision
Analysis (Keeney and Raiffa, 1976) and AHP (Saaty, 1980) assume a particular structure ofcriteria (that is, an objectives hierarchy) and alternatives and seek to formulate the decision
problem according to that structure. Stream three, however, is more consistent with traditional Operations Research (OR) modelling. Here, data are uncovered about the problem and, based on these data, an appropriate structure is used, typically some type of OR model. While a range of models potentially exists, the particular choice of model is determined from scientific research of the problem data; that is, it comes from the data rather than being assumed a priori. In addition to OR modelling, stream three is well exemplified by Gordon et al.'s (1975) nine generic problem structures, which include new product, distribution channel, acquisition, divestment, capital expenditure, lease-buy, make-buy, pricing, and manpower planning problem structures. Our proposed approach is firmly grounded in the Definition stream.
We note that while Woolley and Pidd's People stream seems conceptually different from the others, one can argue that the other three streams might be occurring within the differing realities
of the many individuals involved in a particular decision problem. That is, the first three streamsmight capture the different structuring approaches available, while the fourth stream recognises
that not all decision makers structure a given decision problem in the same way. As we point outlater, this latter statement seems to be true.
Most decision modelling/structuring, and especially multicriteria decision making, is based on a
conceptualisation in terms of criteria and alternatives (Keeney and Raiffa, 1976 and Saaty,1980). Criteria and alternatives are mutually defined, and are the fundamental components of
any multicriteria decision problem. Criteria reflect the values of a decision maker and are themeans by which alternatives can be discriminated. Alternatives are courses of action which can
be pursued and which will have outcomes measured in terms of the criteria. Henig and
Buchanan (1996) and Buchanan et al. (1998) present a conceptualisation of the multicriteriadecision problem structure in terms of criteria and alternatives, with attributes as the bridge
between them. More precisely, attributes are the objectively measurable features of the alternatives. Therefore, the decision problem was structured so as to separate the subjective
components (criteria, values, preferences) from the objective components (alternatives andattributes), with a view to improving the decision process. This model is presented in Figure 1.
Figure 1 - Mapping Between Criteria and Attributes, Attributes and Alternatives (Buchanan et al., 1998)
Figure 1 is a static conceptualisation, or model, of a decision problem consistent with Woolley and Pidd's (1981) definition stream. It presents a framework comprising the components of
criteria, alternatives, attributes and their interrelationships, and proposes a partition intosubjective and objective aspects. In their application paper, Henig and Katz (1996) use aspects
of the Figure 1 model in the context of evaluating research and development projects.
Recent discussions have occurred in the literature regarding the creation of and inter-
relationships between all these structuring elements. Keeney (1992) provides a way of thinking whereby criteria are identified first in the decision making process, then alternatives are creatively determined in light of such criteria, and choice is then made. Known as value-focussed thinking (VFT), this has been advocated as a proactive approach wherein explicit consideration of values leads to the creation of opportunities, rather than the need to solve problems. However,
Wright and Goodwin (1999) suggest that values are not sufficiently well formed in the earlystages of decision making to be able to do this. They agree with March (1988) that values and
criteria are formed out of experience with alternatives and suggest decision simulations as a way
to discover hidden values and, subsequently, promising alternatives. Subsequent comments by
noted researchers in the same issue as their paper appear to agree with them, but argue that such simulations might have difficulties in practice.
We call the process of first determining alternatives, then applying value and preference
information to them in order to make a choice, alternative-focussed thinking (AFT), as do many others. It is clear from the descriptive decision making literature that AFT is easily the more
dominant procedure (for example, Nutt, 1993). This is also the case when one considers theprescriptive multicriteria decision making literature. The majority of these prescriptive models
explore decision maker values in the context of an already fixed and well-defined set ofalternatives. This imbalance is not surprising. In Figure 1, alternatives are objective, and in the
experience of most decision makers, they are usually concrete and explicit; they are whatdecision makers most often confront. In contrast, criteria are subjective, abstract and implicit,
and their consideration requires hard thought. Therefore, it makes some sense descriptively tostart a decision process with what is obvious and measurable, such as alternatives.
This discussion gives rise to the conceptual model of problem structuring presented in the nextsection.
A DYNAMIC MODEL
The model presented in Figure 1 merely specifies problem structuring elements and does not give rise to a process of problem structuring. It does not specifically address the process of how
preferences can be better understood or how the set of alternatives can be expanded. How are the elements of such a structure (criteria, attributes and alternatives) generated to begin with?
That is, where do alternatives come from? And where do criteria come from?
In order to address these questions, we build upon the model in Figure 1. It is difficult to determine which should or does come first, criteria or alternatives, since both are vitally important to the decision problem structuring process. However, our own casual observations of decision-making activity suggest that both generate the other interactively. This is supported by the literature. For example, March (1988), as cited by Wright and Goodwin (1999), argues that experience with and choice among alternatives reveals values and, by implication, criteria. This is the thrust of Wright and Goodwin: decision makers need to "test drive" the alternatives (undertake a decision simulation) in order to find out what they really want.
Henig and Buchanan (1996) proposed that attributes form the bridge between criteria and
alternatives and act as a translation mechanism. That is, as we soon illustrate by example, a decision maker moves from considering criteria to attributes to alternatives. This idea, together with the above discussion, gives rise to a new model of decision problem structuring involving these components. Our proposed model is presented simply in Figure 2, without attributes. The
model shows criteria and alternatives joined in a sort of causal loop, with entry into the loopeither using AFT (dotted line) or VFT (solid line). This interactive model states that thinking
about alternatives helps generate criteria, and vice-versa. That is, neither of these twostructuring elements can be thought of alone, independent of the other.
Figure 2 - A Dynamic Model of Problem Structuring, Relating Alternatives and Criteria
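The loop of Figure 2 can be sketched, purely as an illustration, as an iterative procedure. The sketch below is ours, not the paper's: the function names, the set-based representation, the seeding scheme and the stopping rule are all assumptions made for concreteness.

```python
# Illustrative sketch only: the criteria-alternatives loop of Figure 2 as an
# iterative procedure. Names, data layout and stopping rule are assumptions.

def structure_problem(seed, generate_alternatives, generate_criteria,
                      max_rounds=10):
    """Iterate between criteria and alternatives until neither set grows.

    `seed` is ("criterion", name) for a VFT entry into the loop, or
    ("alternative", name) for an AFT entry. Each generator proposes new
    elements of one type given the current elements of the other type.
    """
    criteria, alternatives = set(), set()
    kind, name = seed
    (criteria if kind == "criterion" else alternatives).add(name)

    for _ in range(max_rounds):
        # thinking about criteria surfaces alternatives...
        new_alts = generate_alternatives(criteria) - alternatives
        # ...and experience with alternatives surfaces criteria
        new_crit = generate_criteria(alternatives | new_alts) - criteria
        if not new_alts and not new_crit:
            break  # the problem structure has stabilised
        alternatives |= new_alts
        criteria |= new_crit
    return criteria, alternatives
```

Seeding with an initial criterion corresponds to entering the loop via VFT, and seeding with an initial alternative to entering via AFT; the loop body is identical either way, which mirrors the model's point that the entry point matters less than the iteration itself.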
As an illustration of the dynamic model of Figure 2, consider a car purchase decision. While
driving down the road in your car, you decide it is time to replace it. Perhaps you think that the car is old, costs a lot to repair, and does not have the acceleration it used to have. Thus, you have begun to identify a gap in performance, between what you have and what you want. In Simon's terminology, identification of a gap belongs in the intelligence phase; Nutt (1993), in his description of decision-making cases, suggests that all decision structuring begins with a gap. The gap in this illustration is a gap in values or criteria, which suggests that the process began with VFT, since you are loosely thinking about criteria (or, at least, discrepancies
in criteria). But where did these thoughts about performance, about criteria, come from? Theymost likely originated from some consideration of alternatives; perhaps comparing your present
car with a recollection of your car when it was new. It is therefore not entirely clear whether you necessarily started with VFT or AFT, and continuing the backward regression could justify starting with either. Perhaps where you started does not matter. However, you do start on the decision journey and you initially give your attention to either criteria or alternatives. Let us
just say that in this illustration, your initial focus was on criteria.
However, as you drive your car, you begin to notice that certain other cars also look appealing,say Brand X, since they appear quick to accelerate, reasonable in price, and are about the size of
your current car. In terms of our model, then, you now have moved in your thinking from thecriteria oval to the alternatives oval. So, you go to the closest Brand X dealership to investigate
further the Brand X cars you have observed while driving. You quickly notice that their prices generally are very reasonable and you test-drive a few of them; that is, you experience first
hand some of the alternatives and learn more about their attributes. You discover that while they
are quick and about the same size as your current car, they also seem more "tinny" than your sturdy old car. You thus start thinking about safety issues and how the modern highway is full of sport utility vehicles (which have high profiles). You deem that your small compact is no longer
appropriate on the same road as such vehicles, so safety becomes a salient issue in this decisionproblem. By considering these particular attributes of the alternatives, you now have surfaced a
new and different objective (that is, you have moved back over to the criteria oval in Figure 2).
Next you start to investigate other car brands (alternatives), and realise that there are a range of
other manufacturers that address your previously stated criteria, as well as this new criterion of safety. Test-driving these new models then leads to further new criteria (such as a "roominess" or "salvage value" criterion), and so the iteration between criteria and alternatives continues. How choice ultimately is made is not our concern in this paper, but as outlined here, structuring is shown as a dynamic interaction between criteria and alternatives. It demonstrates one problem with just VFT, as implied in a quote by Nutt (1993, page 247), who paraphrased Wildavsky (1979): "managers do not know what they want until they can see what they can get." It also highlights the main problem with just AFT. AFT would require a decision maker to generate all
alternatives a priori, perhaps only Brand X and a few others, then apply preferences to choosefrom among this set. No opportunity would have been available in such a procedure for
discovering what is truly wanted in the decision problem, since values and criteria would havebeen considered too late in the decision problem structuring process.
As discussed earlier, how a decision maker enters into this loop may not matter. What does
matter is that the previously static, unidirectional models of decision problem structuring (AFT
and VFT) have been made dynamic. Consideration of alternatives helps identify and definecriteria, while criteria are shown to help identify and define alternatives. If such interaction isallowed to occur, divergent thinking and creativity is explicitly incorporated in the decision
making process, which is considered desirable for the process (Volkema, 1983; Abualsamh,1990; and Taket and White, 1997). While convergent thinking might be more efficient,
divergent thinking can lead to more effective decision making.
This dynamic model has, thus far, been motivated both prescriptively (as a recommendation todecision makers) and descriptively (through the car purchase illustration). The model was first
argued from a theoretical position (belonging to the Definition stream of Woolley and Pidd (1981), where a structure has been assumed a priori). Further, it was implicitly presumed that
structuring the decision problem is, in fact, desirable and the use of this decision structuringmodel by a decision maker will aid solution. However, arguing from a theoretical position has
shortcomings; although the logic may be impeccable, real decision makers generally find prescriptive models to be unworkable, usually because the foundational assumptions or axioms
have little empirical validity. In other words, it is a nice idea that does not work in practice. Asothers and we have noted, the multicriteria landscape is littered with many such models,
consigned to obscurity because they are not relevant to real world decision making. They lackempirical validity and do not explain decision making behaviour.
The car illustration leads us into the issue of descriptive or empirical validation. There are at
least two issues to be considered. First, how do descriptive decision making theory and data fit with the dynamic model we have proposed? And second, is any of this behaviour actually
successful when it comes to decision outcomes? We endeavour to demonstrate that the dynamicmodel has empirical validity in that it accommodates recognised models and data in the
descriptive literature. We also show that the descriptive decision making models closest in spiritto the dynamic model are successful in practice.
VALIDATING THE MODEL DESCRIPTIVELY
We now consider the empirical or descriptive validity of the dynamic model. This section
presents these validation tests in turn.
Known Theories and Models

The first test determines whether or not our model accommodates current descriptive theories
and models of decision making. There are many descriptive theories of decision making (for taxonomies of descriptive models see Schoemaker, 1980; Lipshitz, 1994; Perry and Moffat,
1997; Dillon, 1998). Many descriptive models deal with the entirety of the decision making process (Mintzberg et al., 1976; Nutt, 1984, 1993) or just with the choice phase (Einhorn, 1970;
Tversky, 1972). However, we pay particular attention here to those decision making models that, on the surface, concentrate mainly on decision problem structuring. Consequently, we consider
three of the more enduring and valid models in the descriptive literature, relative to our dynamicmodel: Recognition-Primed Decisions (Klein, 1989), Image Theory (Beach and Mitchell, 1990)
and the Garbage Can Model (Cohen et al., 1972).
The first step in the Recognition Primed Decisions model (Klein, 1989) is to recognize that the
current decision situation has some similarity with past decision situations. Thus, priorexperience is drawn upon to guide the current decision making procedure. The model then states
that the decision maker examines goals, cues, expectations and actions in order to gauge thecurrent situation relative to past situations. This is done through a mental simulation process,
which attempts to project past experience with alternatives into the future to see how thisexperience might be used in the current situation. This involves a sequential, dynamic, mentally
simulated generation and evaluation of alternatives. During this simulation exercise, alternativescan be modified, and goals and expectations can become revised, in order to make the current
situation match with some useful experience, thus enabling a direction in which to proceed with
the decision problem.
Image Theory (Beach and Mitchell, 1990) is similar to Recognition Primed Decisions in that it
draws on the decision maker's experience with decision making when trying to decide what to do in the present situation. In this model, the decision maker's view is represented by three different images (or criteria) used to evaluate alternatives. That is, an alternative-focused process is begun and an alternative is assessed to determine its fit with the decision maker's value, trajectory, and strategic images. The fit is determined by comparing the attributes of the alternatives with the decision maker's images. If the alternative fits (that is, it exceeds certain threshold values for the attributes), then it is adopted; otherwise it is rejected. Finally, any appropriate choice procedure is then used to choose among the alternatives that fit. Without
going into the complexities of this theory, initially the images are considered as fixed. If theoutcome is that all alternatives are rejected, then the thresholds may be revised or, perhaps,
revised images are developed. However, this further consideration of images, or criteria, onlyoccurs when all alternatives are initially rejected. If acceptable alternatives are found that fit
with the images, then little or no effort is directed toward considering new criteria.
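Image Theory's screening step can be caricatured as a threshold test. The sketch below is our own hedged illustration, not Beach and Mitchell's formalism: the dictionary layout, the relaxation factor, and the single revision round are assumptions made for the sake of the example.

```python
# A hedged caricature (our illustration, not Beach and Mitchell's formalism):
# Image Theory's compatibility test as attribute thresholds, with one round
# of threshold revision when every alternative is rejected.

def screen(alternatives, thresholds, relax=0.9):
    """Return the alternatives whose every attribute meets its threshold.

    `alternatives` maps name -> {attribute: value}; `thresholds` maps
    attribute -> minimum acceptable value.
    """
    def fits(attrs, th):
        return all(attrs.get(a, 0) >= t for a, t in th.items())

    survivors = [n for n, attrs in alternatives.items()
                 if fits(attrs, thresholds)]
    if not survivors:
        # all alternatives rejected: revise (relax) the thresholds once --
        # the only point at which the images themselves are reconsidered
        relaxed = {a: t * relax for a, t in thresholds.items()}
        survivors = [n for n, attrs in alternatives.items()
                     if fits(attrs, relaxed)]
    return survivors
```

Note that the criteria (here, the thresholds) are revisited only in the all-rejected branch, which reflects the limited, largely single-iteration interaction between alternatives and criteria that the text observes in this theory.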
Both Recognition Primed Decisions and Image Theory contain aspects of the dynamic model ofFigure 2, with some interaction of alternatives and criteria (and goals, expectations, and actions).
Clear attention is given to criteria and alternatives, and one is shown to influence the
development of the other. In the case of Recognition Primed Decisions, alternatives are seen toinfluence goals, and perhaps new alternatives are surfaced as a result. In Image Theory, if all
alternatives are rejected, new images/criteria may surface, which in turn, might lead to a new setof alternatives. Both of these models appear to be alternative focused, and the interaction
between alternatives and criteria seems to be limited to a single iteration between the two.
The Garbage Can Model (Cohen et al., 1972) is a model of organizational decision-making, although it also applies at the level of the individual. This model says that a rational search for
alternatives does not occur, since organizations are really full of anarchy and chaos. Rather,solutions are randomly identified, and then one is matched to an appropriate and compatible
decision problem in need of a solution. No explicit mention is made of where criteria andattributes might fit into this scheme, and it remains classified by us as being an alternative-
focused approach to decision making. This well-known model of decision-making has littlecorrespondence with our model in Figure 2.
To summarize this sub-section, we feel that at least two of the currently popular descriptivemodels of decision-making (Recognition Primed Decisions and Image Theory) provide empiricalsupport for our model. That is, there are elements of these two models that indicate that decision
makers develop some sort of interaction between criteria and alternatives, which forms the basicconcept underpinning our model. This interaction, however, is not necessarily valid for all
descriptive models of the decision making process. One might argue that the Garbage CanModel has no real structuring phase at all, so no current model of problem structuring would
seem to fit this model.
Empirical Evidence

Our second validation test determines whether the empirical evidence is explained by our model.
We use two previously published large sample studies of decision cases involving real managers,both by Nutt (1993, 1999).
The first study considers problem formulation processes involving real decision cases and
managers (Nutt, 1993). The author defines formulation as "a process that begins when signals are recognized by key people and ends when one or more options have been targeted for development" (page 227). Based on previously stated definitions, we consider this to be nearly equivalent to problem structuring. By examining traces of 163 decisions, four types of
formulation process were generated: Issue, Idea, Objective-Directed, and Reframing. Thesefour processes are briefly summarized below.
The least successful process, as measured by a multidimensional metric measuring each
decision's adoption, merit and duration, is the Issue process. This process is present when an issue is identified and considered by the decision maker(s). Decision makers become problem solvers as they suggest solutions to concerns and difficulties. Used in 26% of the 163 cases, this is similar to Woolley and Pidd's (1981) Checklist stream of problem structuring.
Third most successful, and used in 33% of cases, is the Idea process. Here, an initial idea, usually a solution, is presented by a decision maker, which provides direction for the decision making
process. Formulation proceeds as the idea's virtues are discussed and refined. Second most successful, and used in 29% of cases, is the Objective-Directed process. In this formulation process, direction is obtained by setting objectives, goals, or aims early in the process, along with the freedom to subsequently search for new alternatives.
Finally, most successful and used in only 12% of all cases is the Reframing process. This process focuses on solutions, problems, or new norms or practices in order to justify the need to act, thus avoiding Raiffa's (1968) errors of the third kind. That is, the decision problem is re-formulated into a problem that the decision makers are confident they can handle, based on some
new direction.
In comparing our model of Figure 2 to Nutt's four types of formulation process, some observations can be made. The Issue and Idea processes, which comprise 60% of the decision cases examined by Nutt, appear to be alternative-focused, with no dynamic interaction and little explicit consideration of criteria. Together they are the most used but least successful. The Objective-Directed process appears to be value-focused, including explicit consideration of goals and criteria and a search for alternatives. It also performs better than the previous two processes. However, a better descriptive process is available.
The Reframing process, least used but most successful, incorporates more aspects of our model than the other three, when viewed closely. The details of the Reframing process show that while solutions to apparent problems are presented early in the process, new norms and criteria grow out of thinking about these solutions and their underlying problems. Based on this revised set of
norms and criteria, other alternatives are then sought. This approach to problem formulation is thought to "open the decision process to possibilities before the narrowing that occurs during problem solving" (Nutt, 1993, p. 245). This divergent and creative approach to problem formulation seems close in principle to our conceptual model of dynamic interaction between
criteria and alternatives.
Reframing can be further illustrated as we extend our car example. Assume that thinking about the large size of the sport utility vehicles led you to reflect upon even larger vehicles, say, city buses. You then start to realise that what you really are after is just a way to commute to work. Thus, your thinking process has led to a reframing of the problem, not as a car selection problem, but as a transportation problem. The real issue was the best choice of transport to and from work; this reframing greatly enlarged the set of alternatives (to include mass transportation) and introduced other criteria (comfort, time and cost saved by commuting, etc.).
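This reframing step can be pictured with a simple data structure. The sketch below is purely illustrative; the paper proposes no formal representation, and the DecisionFrame type and its fields are our own invention for this example:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionFrame:
    """Hypothetical minimal representation of a structured decision problem."""
    problem: str
    alternatives: set = field(default_factory=set)
    criteria: set = field(default_factory=set)

# The original frame: a car selection problem.
car_frame = DecisionFrame(
    problem="car selection",
    alternatives={"sedan", "sport utility vehicle"},
    criteria={"price", "size"},
)

# Reframing widens the problem statement; as in the commuting example
# above, both sets grow rather than being replaced.
commute_frame = DecisionFrame(
    problem="transport to and from work",
    alternatives=car_frame.alternatives | {"city bus", "train", "car pool"},
    criteria=car_frame.criteria | {"comfort", "commuting time", "cost saved"},
)
```

The point of the sketch is only that reframing is monotone: the original alternatives and criteria remain available inside the enlarged frame.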
As discussed earlier in this car purchase/transportation illustration, it remains unclear whether one begins with criteria or alternatives. However, once started, thinking only about alternatives early in the process, as found in the Issue and Idea processes, appears to lead to relatively unsuccessful decisions, while thinking about objectives and criteria early (the Objective-Directed process) is more successful. However, the most successful process (Reframing) also considers alternatives early, and then uses them to identify new norms and criteria. It appears that the starting point matters less than the need to ensure some iterating between criteria and alternatives.
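This iteration can be written down as a toy loop. This is our own illustrative formalisation, not a method proposed in the paper; the callables revise_criteria and generate_alternatives stand in for whatever judgmental process the decision maker actually uses:

```python
def structure_problem(criteria, alternatives,
                      revise_criteria, generate_alternatives,
                      min_iterations=2):
    """Iterate between criteria and alternatives until both sets stop
    growing, performing at least `min_iterations` passes so the process
    is not ended by premature satisficing."""
    criteria, alternatives = set(criteria), set(alternatives)
    iteration = 0
    while True:
        iteration += 1
        # Alternatives reveal criteria ...
        new_criteria = criteria | revise_criteria(alternatives)
        # ... and criteria generate alternatives.
        new_alternatives = alternatives | generate_alternatives(new_criteria)
        stable = (new_criteria == criteria and new_alternatives == alternatives)
        criteria, alternatives = new_criteria, new_alternatives
        if stable and iteration >= min_iterations:
            return criteria, alternatives
```

Whichever set the decision maker enters the loop with, each set is revised in light of the other on every pass, which is the dynamic interaction the model prescribes.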
Our second validation test in this sub-section involves Nutt's (1999) study of 340
strategic level decisions from as many organizations (a sample of public, private, and third sectorU.S. and Canadian organizations). In this study, tactics used to uncover alternatives were
determined, as was the success of the decision-making effort, given such tactics. Also, the typeof decision was considered (that is, technology, marketing, personnel, financial, etc., decisions,
to name four of the eight found in the study) in light of the success of the different tactics. Thus,this second study of Nutt concentrates on aspects of decision problem structuring that are related
to our proposed model. The empirical evidence presented in this study can be used to determine the extent to which our model fits decision makers' behaviour descriptively.
Nutt's study found six distinctly different tactics used to uncover alternatives, namely: Cyclical Search (used in 3% of the 340 decisions studied), Integrated Benchmarking (6%), Benchmarking (19%), Search (20%), Design (24%), and Idea (28%). Cyclical Search involves multiple searches and redefined needs, dependent on what is found during the search. The two Benchmarking tactics involve adaptation of existing outside practices, and Search involves a single request-for-proposal type search based on a pre-set statement of needs. Design involves generating a custom-made solution, and Idea involves the development of pre-existing ideas within the organization.
In comparing these six tactics with our model, we note that the Integrated Benchmarking, Benchmarking, and Idea tactics all seem to be alternative-focused processes, while Search and Design seem to be value-focused. Furthermore, the Idea tactic resembles the Garbage Can Model presented earlier, as Nutt himself points out (Nutt, 1999, p. 15). Interestingly, Cyclical Search seems to fit surprisingly well with our concept that a dynamic relationship should (or does) exist between criteria and alternatives. As Nutt states (p. 17), "Decision makers who applied a cyclical search tactic set out to learn about possibilities and to apply this knowledge to fashion more sophisticated searches. Each new search incorporated what has been learned in past searches." Upon close inspection, this tactic also displayed elements similar to Image Theory in its application.
What is more interesting is the success of each tactic. Using the same multidimensional success metric as in his 1993 formulation study reported above, Nutt found Cyclical Search and Integrated Benchmarking to be the most successful of the six tactics, with the former more successful than the latter. His explanation for the success of these two is that they generated more alternatives than the other four tactics, they were not subject to hidden motives as often, and time was not wasted trying to adapt sub-optimal solutions. Furthermore, these two successful tactics encouraged learning and did not lead to a premature selection of a course of action. The same might be said for a decision process utilizing our problem structuring model. Learning about what a decision maker values in a decision problem should stem from knowledge and experience with alternatives, and learning about creative alternatives should occur as one considers one's values and criteria. Taking the time to let each of these structuring elements inform the other also should keep the decision process from settling prematurely.
In summary, this section provides empirical support for Figure 2 as a descriptive model of the problem structuring process for some decision makers. That is, Nutt's (1993) Reframing formulation process and Nutt's (1999) Cyclical Search alternative-uncovering process each seem
to be similar to our model. However, we note that each of these processes was the least used in Nutt's two studies. This is not surprising. It is well known that when using iterative processes, a decision maker's tendency toward satisficing often results in premature termination; this is what satisficing is. In practice, decision makers seem to stop early and consequently do not gain the benefits of that extra iteration.
Also, since these two models of Reframing and Cyclical Search were the most successful in Nutt's studies, this suggests that Figure 2 has potential as a good prescriptive model for the decision problem structuring process.
DISCUSSION
Our discussion is motivated by the following quote:
A manager should work on integrating his formal models of rational decision making with his intuitive, judgmental, common sense manner of solving choice problems and seek to adapt the former to fit the latter, rather than submit to a bastardization of his intuition in the name of some modern mathematical technique. (Soelberg, 1967, page 28)
Watson (1992) and Dando and Bennett (1981), among others, reject the supremacy of the
rational model of decision making as being impractical and at odds with actual decision makingbehavior. The tendency for decision makers to discard values and even attributes is well
documented. French (1984) and Kasanen et al. (2000) argue that the assumptions made by many methods may not agree with the practical findings of behavioral science. It should not be
surprising, then, that some non-quantitative methods, apparently without any rational basis,
attract the attention of decision makers. This suggests that there is a demand for decision making tools, provided they are tailored to actual decision maker behavior. Furthermore, Henig and Buchanan (1996) conclude that the decision making process should not focus on selection,
which is at the heart of the rational decision making paradigm, but rather involve understandingand expanding. They advocate a shift in the rational model of decision making and not a total
rejection of it. The goal of finding 'the right answer' is not what decision making should be allabout.
The reader may now justifiably ask, "So which method do you recommend for the decision maker who needs advice?" Our answer is: clearly, any method that endeavors to understand and expand, in this case as applied to alternatives and criteria. Figure 2 shows how this is done; criteria not only evaluate the alternatives but also generate them. Alternatives are not only ranked and selected by criteria; they also reveal criteria. This paper does not suggest a detailed method for doing this, but we have outlined here the theoretical, empirically grounded foundation for decision problem structuring. This shifts the focus of methods away from techniques of selection, which are too technical, and should attract at least a few decision makers.
We reserve the development of a method for doing this for future research. Certainly, the
decision simulation ideas mentioned by Wright and Goodwin (1999) could be part of thisdevelopment, and the literature on creativity should contribute as well (for example, Altshuller,
1985). However, we anticipate that decision makers should iterate between alternatives andcriteria, at least twice, regardless of where they enter the loop. Certainly, more experienced
decision makers might have an established history of such iterations, and they may not need asmany new iterations as those less experienced. As Simon (1956) has pointed out, the observed
behavior of decision makers is to satisfice. In terms of our model, this means they perform onlyone iteration and prematurely stop the decision making process. The entire point of our model is
that they can do better in their decision making if they perform two or more iterations.
We have restricted our discussion to the field of multi-criteria decision making, since most real-life problems involve more than a single criterion. We note that our model, which attempts to allow decision
makers to understand and expand their thoughts regarding criteria and alternatives, is similar toideas mentioned by others in this field. Vanderpooten (1992) argues that in any multi-criteria
procedure, iteration should, or perhaps does, occur between two phases of the decision making
process: learning and search. He states that decision makers do not have well-formed preferences when facing a decision problem. Preference structures become better formed as decision makers relate past experiences and present explorations of alternatives to the decision
problem at hand. He further states that the decision process is an iterative sequence of such activity, which appears conceptually similar to what we propose in this paper. Also, Kasanen et al. (2000) discuss how decision alternatives are not well developed during most decision processes and therefore need to be developed further using creative methods. They further
discuss how not all decision-relevant criteria are known by the decision maker at the start andneed to be discovered during the decision process. We think that our model helps address these
problems in a practical way.
SUMMARY
This paper offers a new model of decision problem structuring which highlights the interactive
and dynamic nature of criteria and alternatives. It is offered as both a prescriptive and descriptive model. It is validated as a descriptive model of structuring behaviour based on well-accepted theories of decision making behaviour, as well as on large-sample empirical studies of the problem structuring process. The model is also prescriptive, based on the success of similar structuring behaviours found in those empirical studies, as well as on extending Henig and Buchanan's (1996) prescription involving understanding and expanding in the decision process. New multi-criteria methods should consider building in mechanisms that allow criteria to generate new alternatives and, in turn, new alternatives to help re-define the set of criteria.
We caution, however, as does Watson (1992), that there probably is no single best problem
structuring method for all decisions. The influence of contextual factors on the decision process is significant (Dillon, 1998; Nutt, 1999), as is the initial frame taken by the decision maker (Nutt, 1998b). These and other influences should be studied further as they impact the problem
structuring process. In a world where the environment is chaotic, wants and desires are
constantly changing, and alternatives appear before decision makers seemingly at random, problem structuring methodologies need to be made more dynamic.
Acknowledgements

This project was financed by grants from The Leon Recanati Graduate School of Business Administration and the Waikato Management School. We also acknowledge Arizona State University, where the first author was on sabbatical leave during much of the writing of this paper.
REFERENCES
Abualsamh, R. A., B. Carlin, and R.R. McDaniel, Jr. (1990). Problem Structuring Heuristics in
Strategic Decision Making. Organizational Behavior and Human Decision Processes, 45. 159-174.
Altshuller, G. S. (1985). Creativity as an Exact Science. Gordon and Breach, NY.
Bazerman, M. (1990). Judgement in Managerial Decision Making. John Wiley, New York.
Beach, L. R. and T. R. Mitchell (1990). Image Theory: A Behavioral Theory of Decisions in Organizations. In Research in Organization Behavior (Vol. 12). B. M. Staw and L. L. Cummings, Eds. JAI Press, Greenwich, CT.
Buchanan, J., M. Henig and J. Corner (1999). Comment on "Rethinking value elicitation for personal consequential decisions", by G. Wright and P. Goodwin. Journal of Multi-criteria Decision Analysis, 8, 1. 15-17.
Buchanan, J. T., E. J. Henig, and M. I. Henig (1998). Objectivity and Subjectivity in the
Decision Making Process. Annals of Operations Research, 80. 333-345.
Checkland, P.B. (1981). Systems Thinking, Systems Practice. Wiley, Chichester.
Cohen, M. D., J. G. March, and J. P. Olsen (1972). A Garbage Can Model of Organizational Choice. Administrative Science Quarterly, 17. 1-25.
Dando, M. R. and P. G. Bennett (1981). A Kuhnian Crisis in Management Science? Journal of the Operational Research Society, 32. 91-103.
Dillon, S. M. (1998). Descriptive decision making: Comparing theory with practice. Proceedings of the 33rd Annual Conference of the New Zealand Operational Research Society, August 31 - September 1. 99-108.
Dillon, S. M. (2000). Defining Problem Structuring. Unpublished Working Paper, Departmentof Management Systems, University of Waikato, New Zealand.
Einhorn, H. J. (1970). The Use of Nonlinear, Noncompensatory Models in Decision Making. Psychological Bulletin, 73, 3. 221-230.
Gordon, L. A., D. M. Miller, and H. Mintzberg (1975). Normative Models in ManagerialDecision-Making. National Association of Accountants, New York.
Henig, M. and J. Buchanan (1996). Solving MCDM Problems: Process Concepts. Journal of
Multi-Criteria Decision Analysis, 5. 3-21.
Henig, M. and Katz (1996). R&D Project Selection: A Decision Process Approach. Journal of Multi-Criteria Decision Analysis, 5, 3. 169-177.
Kasanen, E., H. Wallenius, J. Wallenius, and S. Zionts (2000). A Study of High-Level Managerial Decision Processes, With Implications for MCDM Research. European Journal of Operational Research, 120, 3. 496-510.
Keeney, R. (1992). Value Focused Thinking. Harvard University Press, MA.
Keeney, R. and H. Raiffa (1976). Decisions with Multiple Objectives. Wiley, New York.
Keller, L. R. and J. L. Ho (1988). Decision Problem Structuring: Generating Options. IEEE Transactions on Systems, Man, and Cybernetics, 18, 5. 715-728.
Klein, G. A. (1989). Recognition Primed Decisions. In Advances in Man-Machine Systems Research, Vol. 5. W. B. Rouse, Ed. JAI Press, Greenwich, CT. 47-92.
Kleindorfer, P., H. Kunreuther and P. Schoemaker (1993). Decision Sciences: An IntegrativePerspective, Cambridge University Press.
Korhonen, P. (1997). Some Thoughts on the Future of MCDM and Our Schools. Newsletter,
European Working Group Multicriteria Aid for Decisions, 2, 10. 1-2.
Lipshitz, R. (1994). Decision Making in Three Modes. Journal for the Theory of SocialBehavior, 24, 1. 47-65.
Majone, G. (1980). An Anatomy of Pitfalls. In Pitfalls of Analysis, G. Majone and E. Quade, Eds. IIASA International Series on Applied Systems Analysis, Wiley, Chichester. 7-22.
Mintzberg, H., D. Raisinghani, and A. Theoret (1976). The Structure of Unstructured Decision Processes. Administrative Science Quarterly, 21, 2. 246-275.
Nutt, P. (1984). Types of Organizational Decision Processes. Administrative Science Quarterly,
29, 3. 414-450.
Nutt, P. (1993). The Formulation Processes and Tactics Used in Organizational Decision Making. Organization Science, 4, 2. 226-251.
Nutt, P. (1998a). Surprising But True: Fifty Percent of Strategic Decisions Fail. Academy of Management Executive, in press.
Nutt, P. (1998b). Framing Strategic Decisions. Organization Science, 9, 2. 195-216.
Nutt, P. (1999). A Taxonomy of Strategic Decisions and Tactics for Uncovering Alternatives.Unpublished Working Paper, Ohio State University, Columbus.
Perry, W. and J. Moffat (1997). Developing Models of Decision Making. Journal of the
Operational Research Society, 48. 457-470.
Raiffa, H. (1968). Decision Analysis. Addison-Wesley, Reading, MA.
Rivett, B. H. P. (1968). Concepts of Operational Research. New Thinkers Library.
Saaty, T.L. (1980). The Analytic Hierarchy Process. McGraw-Hill, New York.
Schoemaker, P. J. H. (1980). Experiments on Decisions Under Risk: The Expected UtilityTheorem. Martinus Nijhoff Publishing, Boston.
Simon, H.A. (1955). A Behavioral Model of Rational Choice. Quarterly Journal of Economics,
69. 99-118.
Simon, H.A. (1956). Rational Choice and the Structure of the Environment. Psychological Review, 63. 129-152.
Simon, H. A. (1960). The New Science of Management Decision. Prentice Hall, NJ.
Soelberg, P. (1967). Unprogrammed Decision Making. Industrial Management Review, 8, 2.
19-29.
Taket, A. and L. White (1997). Wanted: Dead or Alive - Ways of Using Problem-Structuring Methods in Community OR. International Transactions in Operational Research, 4, 2. 99-108.
Thierauf, R. J. (1978).An Introductory Approach to Operations Research. Wiley, New York.
Tversky, A. (1972). Elimination By Aspects: A Theory of Choice. Psychological Review, 79, 4. 281-299.
Vanderpooten, D. (1992). Three Basic Conceptions Underlying Multiple Criteria Interactive
Procedures. In Multiple Criteria Decision Making, Proceedings of the Ninth InternationalConference: Theory and Applications in Business, Industry, and Government. A.
Goicoechea, L. Duckstein, and S. Zionts, Eds. Springer-Verlag. 441-448.
Volkema, R. (1983). Problem Formulation in Planning and Design. Management Science, 29. 639-652.
Watson, S. R. (1992). The Presumptions of Prescription. Acta Psychologica, 80. 7-31.
Wildavsky, A. (1979). Speaking Truth to Power, Chapter 5, Between Planning and Politics:
Intellect vs. Interaction as Analysis, Little Brown, Boston, MA.
Wright, G. and P. Goodwin (1999). Rethinking Value Elicitation for Personal Consequential Decisions. Journal of Multi-criteria Decision Analysis, 8, 1. 3-10.
Woolley, R. N. and M. Pidd (1981). Problem Structuring: A Literature Review. Journal of the Operational Research Society, 32. 197-206.