
Ancient Notes on Decision Analysis

1.1 Introduction to Decision Analysis

Almost everyone makes a great many decisions every day, usually without any form of detailed analysis. We decide what to eat for breakfast, how much time to spend on studying, which friends to visit, at what time to go to bed, and -- less frequently -- some weightier matters also. This course is directed to students of management and is concerned with the kinds of decisions that are made within business enterprises and similar organizations. These are like the familiar personal decisions of daily life in one very significant respect: the vast majority of business decisions, like the bulk of purely personal decisions, must be and are made on the basis of very little data, calculation, and thought. The busy executive cannot afford the time for prolonged deliberation as to what letters he will write Monday morning (or even as to what he will say in most of them) any more than in his nonbusiness life can he afford to ponder over his choice of breakfast cereal.

However, many decisions are of sufficient importance and complexity to merit more extensive analysis. In such cases the decision maker is apt to obtain some data, perform some calculations or analysis, and put together a report, recommendation, or other document intended to justify the selection of one course of action in preference to others. The amount of time and effort needed to prepare and present this analysis and its conclusions will depend mainly on how much is at stake in the decision (as the executive sees it) and on how difficult it is to see and prove which course of action is the “right” one.

Let us consider the following example, which is simplified for illustrative purposes.

Example: “War Stars”

Mr. Fox, manager of Mount Para Productions, has recently received a script for a movie called “War Stars”. Responsible for all movie productions, Mr. Fox has to decide whether or not to produce “War Stars”. Advisors told him it is typically an “all or nothing” script (i.e., it will be either a “success” or a “failure”) with a fifty-fifty chance of success. Mr. Fox’s best estimate is that the movie will result in a $6 million net profit if it is a success. However, Mount Para faces an estimated net loss of $4 million if the movie is not well received.

Question: Should Mr. Fox produce “War Stars”?

Decision Tree

In “solving” the problem we will make use of a decision tree (also called decision diagram) which provides a symbolic representation of a sequential decision process. A decision tree shows, at one glance, when decisions can be made, what the possible consequences are, and what the resultant pay-offs will be. Another advantage of a decision tree is that the results of the computations are depicted directly on the tree, thus simplifying the analysis. Question: What does the “War Stars” decision tree look like?

First a decision has to be made whether or not to produce the movie:

Produce movie

Do not produce movie


The above is called a fork, in this case an act fork. It is always represented by a small box with several branches emanating from it. The individual branches of the fork represent all the options which the decision maker wishes to consider in making his choice. It does not necessarily represent all possible options. (For example, it may be possible to use the script for a play, but Mr. Fox does not wish to consider this option as he is only concerned with movies.)

Thus an act fork represents a decision point at which the decision maker has a choice.

If it is decided to produce the movie, two outcomes may occur, each with equal chance. (Recall: fifty-fifty chance of “success”):

“success” (50%)

“failure” (50%)

The above fork is called an event fork, for which we will always use a circle followed by several branches. The branches represent the relevant consequences of the preceding decision (act). An event fork indicates a chance event whose outcome is not known to the decision maker at the time the decision is to be made. For instance, in the War Stars example we “only” know there is a fifty-fifty chance of success or failure. The decision maker has no choice; the outcome of the event is out of his or her control.

If it is decided not to produce “War Stars”, no relevant event or act (decision) is anticipated. We are now ready to create the first decision tree using the previous forks:

[Decision tree: an act fork with branches “Produce movie” and “Do not produce movie”; the “Produce movie” branch leads to an event fork with branches “success” (50%) and “failure” (50%).]

Even though the above diagram clarifies Mr. Fox’s decision process, it is possible to include more information. In the above tree there are clearly three end positions and each position actually refers to a sequence of branches from the beginning to this end position. And to each end position we can assign a net pay-off value, which we will call an end point. The three sequences and their end points are:

1) “produce movie” and “success”; Endpoint: $6 million.
2) “produce movie” and “failure”; Endpoint: -$4 million.
3) “do not produce movie”; Endpoint: $0.

The tree will be redrawn to include the above information:

[Decision tree: as above, but with probabilities written as fractions and end points attached: “Produce movie” leads to “success” (.5) with end point $6 million and “failure” (.5) with end point -$4 million; “Do not produce movie” has end point $0.]

Notice that the fifty-fifty chance is changed into the equivalent fractions of one. Also, instead of chance, we will often refer to such a fraction as the probability or likelihood. For example, the probability that the movie will be a “success” is .5.


Before discussing the optimal strategy for Mr. Fox’s decision problem, we will indicate some difficulties that may arise in more complicated and realistic situations. Also, some definitions are given.

Requirements about Forks

A fork is either an event fork or an act fork; any situation which might appear to be a mixture of chance and choice should be represented by two or more forks, including at least one act fork and one event fork.

Regardless of whether a fork is an event fork or an act fork, the events or acts represented by its branches should be of sufficient number and so labelled that they:

1) Include all possibilities under consideration, and 2) Include each possibility only once.

The term collectively exhaustive is a technical term used to mean that all possibilities under consideration are included.

The following event fork represents the residential area of a randomly-selected full time U.B.C. student:

Vancouver

“Lower Mainland” excluding Vancouver

British Columbia excluding “Lower Mainland”

The branches of this fork do not exhaust all the possibilities. The student may live somewhere else, for instance in Bellingham (U.S.A.). One way of making the fork collectively exhaustive would be to add a branch “Outside British Columbia”. The technical term mutually exclusive means that each possibility is included only once. In other words, the descriptions of each of the branches do not overlap; the selection of one excludes all the others. For example, the following event fork - again representing the residential area of a randomly-selected full time U.B.C. student - does not have mutually exclusive branches:

Vancouver

“Lower Mainland”

British Columbia

Outside British Columbia

Figure 1.1. A fork whose branches are not mutually exclusive.

Living in the “Lower Mainland” does not exclude living in Vancouver. Moreover, living in British Columbia excludes neither living in Vancouver nor living in the “Lower Mainland”.


Questions:
1.1.a. Are the branches of the fork in Figure 1.1 collectively exhaustive?
1.1.b. A world traveller is deciding where to go on her next vacation and drew the decision diagram below. Are the branches mutually exclusive?

Travel to Europe first

Travel to Italy first

Travel to Amsterdam first

1.1.c. A student wishes to buy a typewriter and draws the diagram given below. Are the branches mutually exclusive?

Buy a second-hand typewriter

Buy an electric typewriter

Do not buy a typewriter

1.1.d. Can you say whether the branches of the fork in problem 1.1.c are collectively exhaustive? If not, make assumptions and modify (if necessary) the fork into one whose branches are collectively exhaustive.

Forks with a Large Number of Branches

There are many situations in which one would like to have a fork with a large number of branches. For instance, a production manager, in deciding on what quantity of a certain item to produce, may wish to consider a whole range of possibilities. Or there may be uncertainty about the demand for an item in a given time period, and this demand might have a great range of possible values. In such a case, it is impractical to include every branch on a decision diagram individually. Instead, such a fork is represented schematically by a “fan” which indicates that there is a great range of possibilities and shows a few typical branches. For example, if the above-mentioned production manager wishes to consider production runs lying between 50 and 275 items, a fan representing his act fork would look like that shown below.

[Fan: an act fork with branches for production runs ranging from 50 to 275 items, of which only a few typical branches are drawn.]

The analysis of diagrams involving fans is conceptually no more difficult than analysis of diagrams involving simpler forks. However, because a greater number of possibilities are implied by a fan, the amount of computation involved is often greater.

In the “War Stars” example a fan would appear if it had not been an “all or nothing” script, but instead several levels of “success” and “failure” had been recognized, each having its own end point.


Assignment of Probabilities

In many real life situations the probabilities for the different outcomes are not easy to obtain. It can be difficult and sometimes even impossible to assign the “right” probabilities. In the “War Stars” example advisors informed Mr. Fox about the probability of “War Stars” becoming a “success”. But even there, how did his advisors arrive at the probabilities?

In answering this question it may be useful to make the following distinction between sources of probabilities. From the point of view of decision analysis, relative frequency and subjective probabilities will be most useful.

a. Subjective Probability

Probabilities obtained in this manner are based on personal degrees of belief. A manager may claim that the chance of losing money this year is only 1/100. Or a Vancouver resident claims that the probability of the Canucks reaching the Stanley Cup finals is 9/10. In order to arrive at a subjective “degree of belief” probability it may be useful to perform a “lottery”. For example, suppose you may choose between the following two lotteries:

- Toss a coin: if it lands heads you will receive $1,000, but if it lands tails you will not receive anything.

- If you pass “Commerce 211” with a First Class you will receive $1,000, but if you do not, you will not receive anything.

Let us agree that in the first lottery your chance of winning $1,000 is equal to 1/2. (See Question 1.1.g.) Suppose you were not indifferent between the two lotteries and chose the second. This means you believe that the probability of your achieving a First Class in “Commerce 211” is greater than 1/2. By performing a sequence of similar lotteries it is possible to assess your own belief of what is the probability you receive a First Class. (See Question 1.1.h.)

b. Objective Probability

i. Deductive Logic. A technical term to indicate that the assignment of the probability is determined logically from symmetry or geometric considerations associated with the experiment. For example, without actually rolling a fair die many times, we could say that the probability that a “five” appears is equal to 1/6.

ii. Relative Frequency definition of probability. For example, suppose you have a box with many red and white balls, but you do not know how many are red and how many are white. Moreover, for some reason you are not able or allowed to count the balls one by one. To “estimate” the probability of drawing a red or white ball, you might perform the following experiment: Draw one ball at random, record its colour, and put it back into the box. Repeat this, say, 100 times. Suppose you have drawn 70 red and 30 white balls; then you could say that the probability of drawing a red ball from the box is close to .7. More generally, the probability defined by relative frequency is n/N, where n is the number of times the event “a red ball is drawn” occurs during N repeated experiments. In our example n = 70, whereas N = 100, thus n/N = 70/100 = .7. Of course, .7 is only an estimate of the “true” probability. The statistical theory developed in Commerce 212 will enable us to measure the accuracy of this estimate.
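The experiment just described is easy to mimic with a short simulation. Below is a minimal sketch, assuming a hypothetical box of 70 red and 30 white balls (a composition the experimenter would not actually know); it estimates P(red) by the relative frequency n/N.

```python
import random

# Hypothetical composition of the box; assumed only so the simulation can run.
box = ["red"] * 70 + ["white"] * 30

N = 100                        # number of draws (with replacement)
n = 0                          # number of times "a red ball is drawn" occurs
for _ in range(N):
    ball = random.choice(box)  # draw one ball at random, then replace it
    if ball == "red":
        n += 1

print(f"Relative-frequency estimate of P(red): {n / N:.2f}")
```

With 100 draws the estimate will usually land near .7, but it remains only an estimate of the “true” probability.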

Objective and subjective probabilities are fundamentally different. If we asked a number of persons to determine the probability objectively, each would arrive at the same answer, provided they were given the same set of assumptions. But, if we asked them to determine the probability subjectively, each individual would arrive at his or her own answer.


Finally, we often use a mathematical notation to express probabilities. For example, the fact that the probability that a head appears in a coin-tossing experiment is equal to 1/2 can be denoted as follows:

H = the event that a head appears
P(H) = 1/2

Questions:
1.1.e. A ball is randomly drawn from an urn containing four white and six red balls. What is the probability that this ball is a red one? And what does “random” mean? What definition of probability did you actually use?
1.1.f. Suppose that in the past one out of 50 creditors of a firm defaulted. One could say that the probability that a creditor will default in the future is .02. Is the “probability using historic data” objective or subjective? Is it possible to have a “probability using historic data” that is strictly objective?
1.1.g. What is the probability that a head appears in a coin-tossing experiment? What is the main assumption you made? What definition of probability did you use?
1.1.h. Create a sequence of lotteries to discover what your neighbour in class believes his probability of getting a First Class for Commerce 211 is.
1.1.i. What method(s) do you think Mr. Fox’s advisors used to arrive at the probability that “War Stars” would be a success?
1.1.j. Apply the new notation “P(...) = ...” to the “War Stars” example.


1.2. Decision Analysis: Criteria of Choice

In the previous chapter a general introduction to Decision Analysis was given, a decision tree was constructed for the “War Stars” example, some difficulties that may arise in more complicated and more realistic situations were indicated, and finally some basic definitions were introduced.

However, no decision in Mr. Fox’s decision problem has yet been made!

In this chapter we discuss a few of many possible criteria of choice in differing decision making settings.

A. Decisions Under Certainty: This is the situation when the decision tree does not contain any event forks. Decision problems under certainty are called deterministic. The case study in Chapter 1.4, the Vancouver Electronics Company (A), is the only deterministic example we will encounter in this course. Linear programming is another example of deterministic decision making. A suitable criterion might be maximization of profit or minimization of cost.

B. Decisions Under Risk: “Expected Monetary Value” (EMV). This criterion makes use of known or assessed probabilities. This course is mainly concerned with problems involving EMV. This criterion is applied to the “War Stars” example in Chapter 1.3. An alternative is to use utility theory (see 1.5).

C. Decisions Under Uncertainty: In this setting, we assume we have no knowledge of and refuse to make any assumptions about the probabilities of occurrence of the uncertain event. We distinguish three possible criteria of choice for decisions under uncertainty and provide brief comments on their interpretation and application to the “War Stars” example.

C.1. Maximin (or Minimax) Criterion: Maximize the minimum profit (or, in case the endpoints refer to losses, minimize the maximum loss). This very conservative criterion chooses the “least worst” decision and is most useful when bad consequences must be avoided at all costs. If Mr. Fox decides to produce the movie, the “worst” that can happen is a net loss of $4 million. If he decides not to produce the movie the “worst” that can happen is a “net loss” of $0 (in this case the only possibility). Clearly, the “least worst” is not to produce the movie.

C.2. Plunger Criterion (Maximax or Minimin): Choose the decision with the best of all consequences. This criterion is usually used by very optimistic decision makers or gamblers, in situations where consequences do not matter too much, or in desperate situations. If Mr. Fox wishes to apply this criterion he should decide to produce “War Stars” (and hope the movie is going to be a success).

C.3. Minimax Regret Criterion: This criterion is sometimes referred to as the “morning after” view. The regret is the difference between what you get from a decision and what you would have gotten if you had known the outcome before making the decision. If the movie were known to be a success, Mr. Fox would choose to produce the movie. If the movie were known to be a failure, he would not produce the movie. If Mr. Fox decides not to produce the movie his “maximum regret” is $6 million (namely, he lost the opportunity to produce a successful movie). If Mr. Fox decides to produce the movie, his “maximum regret” is $4 million (namely, if the movie is a “failure” he could have done better by not producing the movie and avoiding the $4 million net loss). Clearly, the minimum “maximum regret” criterion results in deciding to produce “War Stars”.

In practice we usually have a rough idea of the relevant probabilities and these criteria (C.1, C.2 and C.3) are of little practical importance. Notice that you do not need to know the probabilities for any of these criteria.
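The three criteria are mechanical enough to compute from a payoff table. Below is a minimal sketch applying them to the “War Stars” payoffs (in $ million); the table layout and variable names are illustrative, not part of the original notes.

```python
# Payoffs in $ million for each alternative under each outcome.
payoffs = {
    "Produce movie":        {"success": 6, "failure": -4},
    "Do not produce movie": {"success": 0, "failure": 0},
}
states = ["success", "failure"]

# Maximin: maximize the minimum (worst-case) payoff.
maximin = max(payoffs, key=lambda a: min(payoffs[a].values()))

# Plunger (maximax): maximize the maximum (best-case) payoff.
maximax = max(payoffs, key=lambda a: max(payoffs[a].values()))

# Minimax regret: regret = best payoff attainable in a state minus the payoff obtained.
best_in_state = {s: max(payoffs[a][s] for a in payoffs) for s in states}
max_regret = {a: max(best_in_state[s] - payoffs[a][s] for s in states) for a in payoffs}
minimax_regret = min(max_regret, key=max_regret.get)

print(maximin)         # Do not produce movie (worst case $0 versus -$4 million)
print(maximax)         # Produce movie (best case $6 million)
print(minimax_regret)  # Produce movie (maximum regret $4 million versus $6 million)
```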


Questions:
1.2.a. The decision tree below reflects the problem faced by Mr. S. Lake, manager of the student pub “The Pot”, just before a pending beer strike in the summer of 1978.

[Decision tree: an act fork with branches “Stock-pile beer” and “Maintain present inventory policy”; each branch leads to an event fork with branches “Strike” (.1) and “No strike” (.9). The four end points are $4,000 (stock-pile beer, strike), $2,000, -$6,000 and $6,000.]

In which way(s) is this diagram simplified and unrealistic?

1.2.b. Determine the strategy Mr. Lake should adopt assuming he would like to use:
- Maximin Criterion;
- Maximax (Plunger) Criterion;
- Minimax Regret Criterion.

Under what circumstances do you think the above criteria might be applied?



1.3 Decision Analysis: Expected Monetary Value (EMV)

In this chapter we discuss the strategy Mr. Fox should adopt assuming he wishes to use “Expected Monetary Value” (EMV) as his criterion of choice. Should “War Stars” be produced or not?

The EMV approach prescribes that the decision maker select the alternative with the best expected (average) payoff. The expected payoff (or the EMV) of an alternative is the sum of all possible payoffs of that alternative, weighted by the probabilities of those payoffs occurring. For example, the EMV of the event fork in the “War Stars” example (of which the tree is repeated below) can be calculated as follows:

Step 1: Multiply each payoff (endpoint) by its corresponding probability.

Example: “Success”: (.5)($6 million) = $3 million; “Failure”: (.5)(-$4 million) = -$2 million.

Step 2: Sum up the results of the multiplication of Step 1; the total is the EMV.

Example: ($3 million) + (-$2 million) = $1 million. Thus, the EMV of the event is $1 million.
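The two steps amount to a probability-weighted sum, which is easy to express directly. A minimal sketch (the function name emv is illustrative; payoffs are in $ million):

```python
def emv(outcomes):
    """Expected monetary value of an event fork.

    `outcomes` is a list of (probability, payoff) pairs; the probabilities should
    sum to one (mutually exclusive and collectively exhaustive branches).
    """
    return sum(p * x for p, x in outcomes)  # Step 1 and Step 2 combined

print(emv([(0.5, 6), (0.5, -4)]))  # 1.0, i.e. an EMV of $1 million
```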

At the initial decision point, the decision maker has a choice between “produce movie” and “do not produce movie”. The first choice has an EMV of $1 million, whereas the second choice has an EMV of $0 (which in this case is the same as the end point).

[Figure 1.2. Complete “War Stars” Decision Tree: “Produce movie” leads to an event fork with “success” (.5) and end point $6 million and “failure” (.5) and end point -$4 million, giving an EMV of $1 million; “Do not produce movie” has end point $0. The EMV of $1 million is carried back to the initial decision point.]

The choice is clear: the optimal strategy for “War Stars” using EMV is to produce the movie!

Folding Back the Decision Tree (Backward Induction)

Notice that the calculation of EMV starts at the rightmost end of the tree and continues to the left until the origin has been reached. The calculation of the EMV at a chance point is different from the EMV calculation at a decision point.

At a chance point the EMV is calculated as a probability-weighted average of all possible payoffs at that point. At a decision point the payoffs of the alternatives are compared and the best one is selected as the EMV (the expected payoff) for that decision point. All other alternatives are disregarded, or pruned. (See the “//” marks in the tree of Figure 1.2.)
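Folding back can be stated as a short recursive procedure. Below is a minimal sketch, assuming an illustrative encoding in which an act fork is ("act", {option: subtree}), an event fork is ("event", [(probability, subtree), ...]), and an end point is a plain number (here in $ million); this representation is not part of the original notes.

```python
def fold_back(node):
    """Return the EMV of a (sub)tree by backward induction."""
    if isinstance(node, (int, float)):   # end point: its payoff
        return node
    kind, body = node
    if kind == "event":                  # chance point: probability-weighted average
        return sum(p * fold_back(sub) for p, sub in body)
    # decision point: keep the best alternative, prune the rest
    return max(fold_back(sub) for sub in body.values())

war_stars = ("act", {
    "produce":        ("event", [(0.5, 6), (0.5, -4)]),
    "do not produce": 0,
})
print(fold_back(war_stars))  # 1, i.e. produce the movie (EMV of $1 million)
```

Note that `max` is appropriate for a maximization problem; a minimization problem would use `min` at the decision points (see Section 1.6).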



EMV: “Playing the long-run averages”

The Expected Monetary Value (EMV) decision criterion is typically used when many similar decisions have to be made under risk. Let us take the hypothetical situation where Mr. Fox will have to decide often, say 100 times, whether or not to produce a movie under similar circumstances. If in all cases he decides to produce the movie, what would be his total net profit or loss? Given the fifty-fifty chance of success in all cases, he would expect about one half of these 100 movies, say about 50, to become a “success”, and the other half to become a “failure”. The total net profit would then be: 50 x ($6 million) + 50 x (-$4 million) = $100 million, which means in the long run a net profit of $1 million per movie. Notice that this amount is exactly the EMV at point 1 in the tree in Figure 1.2. The above illustrates that using EMV implies “playing the long-run averages”. The above calculation illustrates the law of large numbers, an important result in probability.
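A small simulation makes the “long-run averages” interpretation concrete. This is a minimal sketch, assuming the fifty-fifty chance and the $6 million / -$4 million figures above (in $ million); the seed is only there so the example is reproducible.

```python
import random

random.seed(1)  # illustrative seed for reproducibility
profits = [6 if random.random() < 0.5 else -4 for _ in range(100)]  # 100 similar movies

print(sum(profits))        # total net profit over the 100 movies
print(sum(profits) / 100)  # average per movie; typically close to the EMV of 1
```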

Questions:
1.3.a. Suppose the endpoint in Figure 1.2 for “success” is $3 million. What strategy should Mr. Fox adopt, assuming he prefers to use EMV as his decision criterion?
1.3.b. Use EMV to determine Mr. Fox’s decision if the probability of “success” were .3, i.e., P(“success”) = .3 and P(“failure”) = .7.
1.3.c. Suppose the probability of “War Stars” becoming a success is p, where 0 ≤ p ≤ 1. For what value of p would Mr. Fox be indifferent between producing and not producing the movie, i.e., what are the break-even probabilities? (Use EMV.)
1.3.d. Referring to Problem 1.2.a, determine the strategy the manager of “The Pot” should adopt, assuming he prefers to use EMV as his criterion of choice.
1.3.e. Of course, Mount Para will not produce “War Stars” 100 times, but EMV might still be an appropriate decision making criterion. Why?


1.4 Case Study: Vancouver Electronics Company (A)

Vancouver Electronics Company (VEC) is a medium-sized manufacturer of electronic components founded in 1967. It produces specialized electronic parts that are purchased for use in automated factories.

In August of 1975, Mr. Andrew Howard, founder of VEC, was contacted by Mr. William Stone, president of Stone Manufacturing Company. Mr. Stone was preparing to build a new automated cement mix factory in White Rock and wanted to know if VEC could supply him with 100 electronic ampometers with housings at $1000 each for use in this factory.

Since his plant would be operating at fairly low capacity until January, Mr. Howard was inclined to accept this new job. VEC had never produced ampometers before, but it had produced a closely related product and no technical problems seemed to stand in the way. Mr. Howard called Mr. Peter Wong, his Plant Engineer, into his office and told him of Mr. Stone’s offer.

“Considering the slack capacity in the plant, I think we should accept this offer,” Mr. Wong said. “One worker could assemble one ampometer in about 10 hours, so at our current wage rates, that’s about $50. Raw materials would be about $450 per ampometer. There are two possible ways we could go with the housings for the ampometers. We could buy them at $300 each ... they are very similar to the ones we bought for that small subcontract last year ... or we could buy a mold from Farentox Burnaby and make them ourselves for about $50 each. The Farentox mold would cost us about $17,500, but then we’d have it if we ever needed to make additional ampometer housings.”

Questions:
1.4.a. According to the above text, is Mr. Howard faced with any uncertainties?
1.4.b. What should Mr. Howard decide?
1.4.c. How would Mr. Howard’s decision change if Mr. Stone had asked for 50 rather than 100 units?
1.4.d. At what number of units would Mr. Howard be indifferent as to which method is used?
1.4.e. If Mr. Howard decides to purchase the mold and make the housings, how many units must he sell to start making a profit? How many units must he sell in order to make a profit if he desires to buy the housings instead?
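For readers who want to organize the figures before tackling the questions, here is a rough sketch (an illustration only, not the case solution) of the two housing options' total costs as a function of the quantity ordered, using the numbers quoted by Mr. Wong: $50 labour and $450 raw materials per ampometer, housings bought at $300 each, or housings made at $50 each after a $17,500 mold.

```python
def cost_buy_housings(q):
    # labour + materials + purchased housing, per unit
    return q * (50 + 450 + 300)

def cost_make_housings(q):
    # fixed mold cost, then labour + materials + self-made housing, per unit
    return 17_500 + q * (50 + 450 + 50)

for q in (50, 100):
    print(q, cost_buy_housings(q), cost_make_housings(q))
# Revenue at $1,000 per unit can then be set against either cost figure
# when working through questions 1.4.b to 1.4.e.
```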


1.5 Decision Analysis: Utility

It is often assumed that expected profit or loss in dollars is the appropriate measure of the consequences of taking an action, given a state of nature. However, there are many situations where this is inappropriate. For example, suppose that an individual was offered the choice of accepting (1) a 50-50 chance of winning $10,000 or nothing, or (2) receiving $4,000 with certainty. Many people would prefer the $4,000 even though the expected payoff on the 50-50 chance of winning $10,000 is $5,000. A company may be unwilling to invest a large sum of money in a new product, even if the expected profit is substantial, if there is a risk of losing its investment and thereby becoming bankrupt. People buy insurance even though it is a poor investment with a negative EMV.

Do these examples invalidate the previous material? Fortunately, there is a way of transforming monetary or even non-monetary values into an appropriate scale that reflects the decision maker’s preferences. This scale is called the utility scale, and it can be used to measure the consequences of taking an action, given an outcome.

We will not study utility in depth in this course, but it is important to realize that the concept of utility is useful in certain situations.

Questions:
1.5.a. You may choose between the following two lotteries:
1) P(Win $100) = 1/2, P(Lose $10) = 1/2
2) P(Win $10,000) = 1/2, P(Lose $9,910) = 1/2

Assuming you are using EMV, show that you are indifferent between the two lotteries.
1.5.b. This time not using EMV, but instead your intuitive feeling, which lottery would you choose?
1.5.c. What would be your answer under 1.5.b if you were a millionaire?


1.6. Decision Analysis: The Case of Petro Enterprises

An example of how to construct a more complicated decision diagram appears in the article “Better Decisions with Preference Theory”, the Harvard Business Review, Nov/Dec 1967. In the article which we will refer to as “The Case of Petro Enterprises”, it is important to realize the objective of the decision maker. Of course this is important for every decision problem. The decision maker may want to maximize profit, minimize net loss, minimize man hours lost, maximize units produced, minimize amount of pollution, and so forth.

Although not relevant for Petro Enterprises, we take the opportunity now to indicate differences between a maximization and a minimization problem. Consider the following decision tree:

[Figure 1.3. Decision Tree for Maximization or Minimization Problem: Act I leads to an event fork with Event A (.5) and payoff 20, Event B (.3) and payoff 10, and Event C (.2) and payoff 5, for an EMV of 14; Act II leads to a certain payoff of 15.]

The EMV at a chance point is calculated in exactly the same way for both a minimization and a maximization problem. In Figure 1.3 the EMV at the chance event is: (.5)(20) + (.3)(10) + (.2)(5) = 14.

The EMV at a decision point is calculated differently for “min” and “max” problems. If Figure 1.3 refers to a maximization problem, the EMV at the decision point is 15. (Choose Act II and disregard (prune) Act I.) If Figure 1.3 refers to a minimization problem, the EMV is 14. (Choose Act I, and prune Act II.) Why?

In the Case of Petro Enterprises, the objective is to maximize the asset position of the firm. Notice that the ending asset positions include the beginning asset position of $130,000! This is particularly important if we decide to use utility theory to value our end points instead of EMV.

CASE OF PETRO ENTERPRISES

Petro Enterprises is a fledgling organization founded to wildcat in the Texas oil fields. Petro has a nontransferable short-term option to drill on a certain plot of land. The option is the only business deal in which the firm is involved now or that it expects to consider between now and December 31, 1967, the time drilling would be completed if the option were exercised. Two recent dry holes elsewhere have reduced Petro’s net liquid assets to $130,000, and William Snyder, president and principal stockholder, must decide whether Petro should exercise its option or allow it to expire. It will expire in two weeks if drilling is not commenced by then. Snyder has three possible choices:

1. Drill immediately.
2. Pay to have a seismic test run in the next few days, and then, depending on the result of the test, decide whether or not to drill.
3. Let the option expire.

In order to decide which of the three choices he will make, Snyder must resolve the following two decisions.



Drill / Don’t drill
Take the seismic test / Don’t take the seismic test

He also faces two uncertainties that will affect his choices; these are:

Oil / No oil
Test favorable / Test unfavorable

To create Mr. Snyder’s decision diagram requires connecting the forks together in proper sequence. This is done in the diagram below.

[Skeleton decision diagram: first the act fork “Take seismic test” / “Don’t take seismic test”; if the test is taken, the event fork “Test favorable” / “Test unfavorable”; then the act fork “Drill” / “Don’t drill”; if drilling is chosen, the event fork “Oil” / “No oil”.]

Note the resultant sequence; he first decides whether or not to take the test. If he decides to take the test, he then learns its outcome, decides on whether or not to drill, and finally learns (only if he drills) whether oil is present. If he decides against the seismic test, he then makes the drilling decision, learning whether oil is present only if he drills. The presence or absence of oil was determined eons before the time of Mr. Snyder’s decision, but he will not know whether oil exists until after a decision to drill has been made. Hence the event fork for the presence or absence of oil appears last.

Using a Decision Diagram to Analyze a Problem

Once the decision diagram is constructed, there are several remaining tasks necessary to analyze a problem:


1) Determine the appropriate probabilities that describe the relative likelihood of each branch on the event forks. Since each event fork should have branches which are both “mutually exclusive” and “collectively exhaustive”, the probabilities at each event fork should always sum to one.
2) Determine a criterion which is an appropriate measure of the economic consequences of the problem (for example, net cash flow) and evaluate this criterion at each end point of the diagram.
3) Use an expected value analysis to “fold back” the diagram, choosing that alternative course of action which has the highest expected monetary value (EMV) at each decision point (act fork).
4) From this expected value analysis, determine the best set of decisions or optimal strategy for the decision problem.

Having described Snyder’s possible choices, we consider their potential economic consequences. To conserve capital and maintain flexibility, Petro subcontracts all drilling and seismic tests; also, it immediately sells the rights to any oil discovered, instead of developing the oil fields itself. It can have the seismic test performed on short notice at overtime rates for a fixed fee of $30,000, and the well can be drilled for a fixed fee of $100,000. A large oil company has promised that if Petro drills and discovers oil, it will purchase all of Petro’s rights for a flat $400,000.

To complete the description, it is necessary to know the probabilities assigned to the various contingencies. The company’s geologist has examined the geology in the region and states that there is a .55 probability that if a well is sunk, oil will be discovered. Data on the reliability of the seismic test indicate that if the test result is favorable, the probability of finding oil will increase to .85; but if the test result is unfavorable, it will fall to .10. The geologist has computed that there is a .60 probability the result will be favorable if a test is made. (There is a simple, but important, logical interrelationship between these probabilities, but it is not discussed until Section 4.4 of these notes.)
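Although the interrelationship itself is postponed to Section 4.4, the assessed figures can already be checked for consistency; assuming the usual law of total probability, we have

\[
P(\text{oil}) = P(\text{fav})\,P(\text{oil}\mid\text{fav}) + P(\text{unfav})\,P(\text{oil}\mid\text{unfav}) = (.60)(.85) + (.40)(.10) = .51 + .04 = .55 ,
\]

which agrees with the geologist’s unconditional assessment of .55.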

This decision problem involving uncertainty can be structured in the form of the decision tree shown in Figure 1.4. The tree shows the probabilities, based on the judgment of the company geologist, for the various events; see the figures in parentheses on the event forks.


[Figure 1.4. The Decision Diagram with Probabilities, Cash Flows and Ending Cash Positions: the probabilities are Test favorable (.60) / Test unfavorable (.40), Oil (.85) / No oil (.15) after a favorable test, Oil (.10) / No oil (.90) after an unfavorable test, and Oil (.55) / No oil (.45) without a test; the cash flows are -$30,000 for the seismic test, -$100,000 for drilling, and +$400,000 if oil is found; the nine ending cash positions are $400,000, $0 and $100,000 on each test branch, and $430,000, $30,000 and $130,000 on the no-test branch.]

[Figure 1.5. The Complete Decision Diagram with Expected Cash Positions: the terminal event forks carry expected positions of $340,000 (drill after a favorable test), $40,000 (drill after an unfavorable test) and $250,000 (drill without a test).]


Expected Value Analysis

How would Snyder’s problem be analyzed assuming that he is interested in ‘playing the averages’ and maximizing the mathematical expectation of his asset position (which is equivalent in this case to maximizing the mathematical expectation of profit. Why?) These steps would be followed:

1) Determine the asset positions Petro Enterprises would have if it arrived at each of the nine end positions on the decision tree in Figure 1.4.

2) Determine Petro’s best strategy by working backward through the tree; that is, at each fork which represents a chance event (called an ‘event fork’) compute the expected value, and at each fork which represents a choice of action (an ‘act fork’) choose that act which has the highest expected value.

Computing Asset Positions

Having diagrammed the decision problem, we can put the cash flow associated with each act and event on the diagram as is shown in Figure 1.4. For example, taking the seismic test costs $30,000, so an outflow of this amount is indicated by writing ‘-$30,000’ beside ‘Take seismic test’. Similarly, the presence of oil results in an inflow of $400,000, so this figure appears by ‘Oil’.

The nine end positions of the tree represent the terminals of nine possible sequences of acts and events. Corresponding to each is an asset position for Petro Enterprises. These asset positions can be computed by summing the various cash flows from the origin of the diagram to each end position and adding the total to the firm’s current asset position of $130,000. The results of these calculations are shown at the nine end positions in Figure 1.4; the topmost end position, for example, shows an asset position of $400,000. This is the sum of the receipts for the oil and the current asset position, minus the costs of taking the seismic test and drilling.

The economic quantity which the decision maker uses to describe the result of a particular path on his decision tree is called his criterion. In this case Snyder has chosen a criterion of net liquid assets, since his liquid asset position determines his ability to consider future deals. Other businessmen in other situations might well select earnings, net cash flow, or some other criterion. Obviously, the use of different criteria can lead to different decisions in some situations.

Expectations and Choice

The terminal forks in Figure 1.5 are event forks representing uncertainty about the results of drilling. At each terminal fork we compute the expected value of the firm’s asset position, which is simply the weighted average of the numbers at the end positions emanating from the fork. Taking the topmost terminal fork again for illustration, the expected value is $340,000 (i.e., .85 x $400,000 + .15 x $0).

An analysis based on mathematical expectation assumes that Snyder would accept a $340,000 sure asset position in exchange for a .85 chance of assets of $400,000 plus a .15 chance of $0 in assets and vice versa. In other words, the asset position and the chance event are equivalent. Using utilities instead of asset positions might be more realistic, but for the moment we will go along with it because it allows us to replace the event fork by its mathematical expectation. As a matter of fact, since each terminal event fork is assumed to be equivalent to its mathematical expectation, we can discard the terminal set of forks and replace them by their mathematical expectations. We are left with the reduced diagram shown in Figure 1.6.

Now the terminal forks are act forks where Snyder’s choice is between drilling and not drilling. If he is maximizing EMV, his choice is easy. He simply chooses the act with the highest EMV. Following a favorable seismic test result, for example, the choice is between drilling, with an EMV of $340,000, and not drilling, with an EMV of $100,000. Obviously, Snyder should decide to drill. Hence, if he were to arrive at the position of the diagram following a favorable seismic test result, we know he would choose to drill and thus would look forward to an asset position whose expected value is $340,000. It follows that the fork is equivalent to an expected value of $340,000, so we put $340,000 at the base of the act fork.


Once the results of similar choices have been placed at the base of each of the terminal act forks in Figure 1.6, we can replace each act fork by its equivalent mathematical expectation, as shown in Figure 1.7.

Now we are faced with the reduction of the event fork representing the result of the test. The procedure is the same as with any event fork; we take the mathematical expectation of the numbers at the end positions - in this case $244,000 (i.e., .60 x $340,000 + .40 x $100,000).

After replacing the event fork by the expected value of its end positions, we are left with the single act fork in Figure 1.7. The resultant act choice is easy; since $250,000 is greater than $244,000, Snyder should not have his firm take the seismic test. Instead, he should drill immediately.

It is not necessary to redraw the tree after reduction, as was done for illustrative purposes in Figures 1.6 and 1.7. We can simply write the appropriate mathematical expectation at the base of each event or act fork and then prune the branch or branches not chosen.

The Optimal Strategy

In the example shown above there were several alternative strategies or sets of decisions that Mr. Snyder could have chosen before analyzing the problem. Three of the most reasonable were:

1) Take the seismic test; drill if the test is favorable, don’t drill if the test is unfavorable.
2) Don’t take the seismic test, but drill immediately.
3) Don’t take the seismic test and don’t drill.

The analysis has shown us that the expected asset position (expected monetary value) is highest for strategy #2. Therefore, we shall refer to this as the optimal strategy, or more colloquially, the best set of possible decisions. The analysis of this problem has assumed that Mr. Snyder was willing to “play the averages”, that is, he was willing to use the expected value of the economic consequences as a basis for evaluating uncertain events. In many situations this is a reasonable assumption. However, in other situations, particularly those involving large possible losses, a decision maker may be very “risk averse”, that is, he may be unwilling to evaluate uncertain events using their expected value because of what he perceives as significant risks.
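As a cross-check on the fold-back arithmetic, here is a minimal sketch that reproduces the Petro figures (in dollars). It reuses the illustrative tree encoding and fold_back function from the sketch in Section 1.3; the helper names are assumptions, not the article’s notation.

```python
def fold_back(node):
    """EMV of a (sub)tree: weighted average at event forks, best branch at act forks."""
    if isinstance(node, (int, float)):
        return node
    kind, body = node
    if kind == "event":
        return sum(p * fold_back(sub) for p, sub in body)
    return max(fold_back(sub) for sub in body.values())

START = 130_000                      # current net liquid assets
TEST, DRILL, OIL = -30_000, -100_000, 400_000

def drill_fork(p_oil, spent):
    """Event fork for drilling after `spent` has already been paid out."""
    return ("event", [(p_oil, START + spent + DRILL + OIL),
                      (1 - p_oil, START + spent + DRILL)])

petro = ("act", {
    "take seismic test": ("event", [
        (0.60, ("act", {"drill": drill_fork(0.85, TEST), "don't drill": START + TEST})),
        (0.40, ("act", {"drill": drill_fork(0.10, TEST), "don't drill": START + TEST})),
    ]),
    "don't take seismic test": ("act", {
        "drill": drill_fork(0.55, 0),
        "don't drill": START,
    }),
})
print(round(fold_back(petro)))  # 250000: drill immediately, without the seismic test
```

The intermediate values agree with the text: $340,000 after a favorable test, $100,000 after an unfavorable one, $244,000 for taking the test, and $250,000 for drilling at once.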

[Figure 1.6. First Reduction of the Decision Diagram: each terminal event fork has been replaced by its expected value, leaving terminal act forks with Drill $340,000 versus Don’t drill $100,000 after a favorable test, Drill $40,000 versus Don’t drill $100,000 after an unfavorable test, and Drill $250,000 versus Don’t drill $130,000 without a test.]


[Figure 1.7. Further Reductions of the Decision Diagram: after a favorable test (.60) the expected position is $340,000 and after an unfavorable test (.40) it is $100,000, so “Take seismic test” has an expected position of $244,000; “Don’t take seismic test” (and drill) has an expected position of $250,000.]


1.7 Decision Analysis: Constructing a Decision Tree

The following is a set of “cookbook rules” for constructing a decision tree for a case study.

1) Read the case carefully.

2) Read the case once more. Identify and indicate (maybe underline) the decisions to be made and the events that may occur.

3) List choices and events. Collect the indicated choices (decisions to be made) and events in a “T-account”:

Choices | Events

4) Chronological sequence of choices and events. Order the choices and events (in the T-account) according to a time sequence, simply by numbering the choices and events. [1]
5) Draw the tree. Once the chronological sequence of decisions and events is determined the tree follows immediately.
6) Inspect the tree. Simply “climb” through the tree and check whether the tree is complete and the sequences make sense.
7) Calculate the Partial Cash Flows. Many branches have direct consequences in terms of cash flow. Usually the partial cash flows are entered along the branches. (In large trees these can get in the way, so the final tree for analysis should not include them.)
8) Insert the Probabilities at the event forks.
9) Evaluate the End Points. The sum of the partial cash flows along each branch (plus possibly a starting cash flow) equals the end point value. (Of course, units other than monetary ones are possible as well.)
10) Fold Back and Prune the Tree. Starting at the right-most end positions the tree will be folded back by calculating the EMV at each decision point or chance event. (Keep the difference between decision and event points in mind.)
11) State the Optimal Strategy. As a result of Step 10, the optimal strategy can now be stated. For example: Choose “Act II”; if “Event A” occurs, choose “Act X”; if “Event B” occurs, choose “Act Y”; etc.
12) Further Analysis:
- Sensitivity to incorrect probability assessments and to the cash flow evaluation.
- Break-even probabilities.
- EVPI (see Chapter 2.3).
- Evaluation of intangibles.
13) Write a Coherent Report describing the decisions to be made and providing a rationale for your analysis.

[1] There are two important exceptions to the need to represent forks in sequence that are worth mentioning, because they allow the analyst flexibility. They are:
1) A series of events may be shown in any order as long as there are no intervening acts.
2) A series of acts may be shown in any order as long as there are no intervening events.

The above exceptions are mentioned because they can occasionally be exploited to simplify an analysis. For example, sometimes it is easier or better to obtain a manager’s probability judgment if events are represented in one order as opposed to another. In the meantime, remember that one can never go wrong by sticking to the time sequence.


1.8 Case Study: Vancouver Electronics Company (B)

Late in August 1975, Mr. Andrew Howard, founder of Vancouver Electronics Company (VEC) was trying to decide whether or not to accept a contract from Stone Manufacturing Company for 100 ampometers (see Vancouver Electronics Company (A)). Mr. Peter Wong, VEC’s Plant Engineer, is discussing the problem with Mr. Howard.

“I think we should have Farentox prepare the mold, Andrew, and make the 100 housings ourselves,” began Mr. Wong. “All of my projected costs show that we will definitely make more money than if we buy the housings.”

“The problem with that, Peter,” replied Mr. Howard, “is that there is a chance that the Farentox mold won’t give me a housing that’s acceptable to Stone. If this happens, then we’re back where we started, purchasing the housings, and we’ve sunk $17,500 in a useless mold.”

“I don’t think there’s much chance of that,” said Mr. Wong. “We’ve had good luck with this type of casting in the past, and it should work this time. If we go ahead with it, I can give you ten sample housings by early next week, and Stone can check them then. True, if Stone doesn’t like them we’ll have to buy the housings, but as I say, I’m almost sure this won’t happen.”

“There is still another option open to us,” Mr. Howard responded. “In the event that the housings are unacceptable to Stone, we can forfeit the contract and pay Stone a penalty of $1,000.”

Problems:
1.8.a. What is the uncertainty Mr. Howard is faced with?
1.8.b. Draw a decision tree that accurately portrays Mr. Howard’s decision problem. (You may like to use the “cookbook rules” in Chapter 1.7.)
1.8.c. If Mr. Howard decides to use expected monetary value as his decision making criterion, what policy should he adopt?


1.9 Case Study: Vancouver Electronics Company (C)

A few minutes before Mr. Howard was going to call Mr. Stone with his decision regarding the production of electronic ampometers he received a phone call. It was Mr. Stone.

“Mr. Howard, I have a problem, and it concerns the number of ampometers I’m going to need. We spoke earlier of 100, but now, due to the outside possibility of my automated factory being smaller than originally planned, I may need only 50. I won’t be able to tell you the quantity for two weeks, but I will have to know by tomorrow whether or not you will accept the contract. If you accept, I can test ten of your trial housings. I’ll be able to let you know if they’re okay within a week of delivery. If you accept the contract I would like to receive the finished ampometers no later than January, 1976.”

Mr. Howard had little bargaining power with Mr. Stone since he could take his ampometer contract to Burnaby Components, a new struggling electronics company that was eager for any business it could get. Therefore, Mr. Howard knew he had to accept Mr. Stone’s conditions. Even though he was not happy with this recent turn of events, Mr. Howard still felt there was a tidy profit to be made on this contract.

Before Mr. Andrew Howard made his decision, he felt that there was additional information he should consider. He was concerned that he didn’t have a better feel for the likelihood of the various outcomes. He called Peter Wong, and asked him to come to his office. Mr. Wong arrived about five minutes later.

“Peter,” Mr. Howard said, “if we buy the mold, the first thing that we’re going to find out is whether or not the housings are acceptable. Can you give me a better idea of how likely it is that we’ll be successful?”

“I don’t think we have too much to worry about, Andrew,” Mr. Wong replied. “As I mentioned before, we’ve had good luck with this type of casting -- I’d say that there’s an 80% chance of the mold producing good housings.”

After Mr. Wong left, Mr. Howard called Stone Manufacturing and asked, “Mr. Stone, you said earlier that there’s a chance that you won’t be needing 100 ampometers, but only 50. Can you give me some idea of how much of a chance there is of that?”

“Well, I don’t think there’s too much of a chance -- the problem is that two of the board members think the new White Rock plant should be a small one, and their principal plant in Portland, Oregon should be expanded. The rest of the board members are pretty much convinced that it’s in their best interests to expand Canadian operations and build a large facility in White Rock. My best guess at this point in time is that there’s about an 85% chance of their building the factory that would require 100 ampometers. Of course, this will be resolved when the board meets in two weeks.” After thanking Mr. Stone for his information, Mr. Howard addressed himself to the problems of assessing the value of the mold and deciding whether or not to manufacture the housings. After much consideration he decided to be very conservative and assume that no more housings would be needed. He felt that in the fairly touchy liquid asset position of Vancouver Electronics, he should not make any assumptions concerning the future demand for a specialty product like electronic ampometers. His best estimate of the salvage value of the $17,500 mold was $500.

Finally, Farentox, the mold manufacturer, informed Howard: “Sure, Mr. Howard, we could supply you with the mold within two days. But I would just like to remind you that starting next Monday we will be closed for three weeks, our usual holiday period for all personnel, and we will be tied up on the Morris Contract for six months.”

Somewhat disappointed by the last message, Mr. Howard started to analyze his problem and the effect of the recent turn of events.


Problems:
1.9.a. Diagram Mr. Howard’s decision problem, including all information you think necessary for him to make a decision. Recall that he must decide whether or not to buy the mold within two days.
1.9.b. If Mr. Howard decides that he will use expected monetary value as his decision making criterion, which policy should he adopt?
1.9.c. Why was Howard disappointed by the information he received from Farentox?
1.9.d. Suppose the date required for delivery of the ampometers was instead January, 1977. What should Howard do now? I.e., find his best decision rule using the EMV criterion.


1.10 Case Study: Central Valley Vineyards

Mr. Robert Burns, owner of Central Valley Vineyards, located in the Okanagan Valley, pondered a difficult decision late in August 1975. Burns raises grapes, and his problem concerned what to do with his “left-over” grapes.

The grape that Burns grows is a multiple-use variety that can be utilized for canning, fresh table consumption, wine production, or for sun-drying into raisins. The acreage of canning and table grapes is invariably contracted for at the beginning of the season, and then the remainder of the crop may be shifted late in the season to either wine grapes or to raisins. This is known as “going wet” or “going dry”, respectively.

This decision is usually made in August, near the end of the season, and once it has been made, it is irrevocable. The weather conditions after the decision is made are critical, and they are difficult to predict. Raisins in British Columbia are sun-dried completely in the open, and rain during this time can inflict heavy losses on a farmer who is going dry. If the farmer is going wet, the grapes remain on the vines for several weeks longer, and rain does not do as much damage.

Robert Burns has 100 acres of “uncommitted” grapes and wishes to consider three alternatives: 1) allocate all of his acreage to raisins; 2) allocate all of the acreage to wine grapes; 3) allocate approximately half of the acreage to each use. As for the weather, he feels that he can simplify his problem by assuming that the rainfall situation will either be none at all, light or heavy. Burns has available the past 20 years of September-October weather records, as listed in the table on the next page.

Mr. Burns summarized the rainfall conditions for the above Sept/Oct periods in the categories “no”, “light”, “heavy” according to whether the average rainfall was less than 1 cm, between 1.0 and 5.0 cm and greater than 5.0 cm. He then constructed the table below to show his expected dollar profit per acre under the various acreage alternatives and weather conditions.

Expected profit per acre ($) for each acreage allocation under each rainfall condition:

Rainfall Conditions    Raisins    Wine    Both
“No”                      60        40      50
“Light”                   50        30      40
“Heavy”                  -20        20      10
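Once probabilities for the three rainfall categories have been assessed (problem 1.10.a), the EMV comparison in problem 1.10.b is a small calculation. A minimal sketch; the probabilities shown are placeholders only, not estimates from the rainfall record.

```python
# Profit per acre ($) from the table above.
profit_per_acre = {
    "raisins": {"no": 60, "light": 50, "heavy": -20},
    "wine":    {"no": 40, "light": 30, "heavy": 20},
    "both":    {"no": 50, "light": 40, "heavy": 10},
}

# Placeholder probabilities; replace with your own estimates from problem 1.10.a.
p = {"no": 1/3, "light": 1/3, "heavy": 1/3}

emv = {alt: sum(p[r] * payoff[r] for r in p) for alt, payoff in profit_per_acre.items()}
print(emv, max(emv, key=emv.get))
```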

Problems:
1.10.a. Estimate the probability of “no”, “light” and “heavy” rainfall. Which method of probability assessment did you use?
1.10.b. Which alternative course of action should Mr. Burns take if he decides to use EMV as his decision making criterion?
1.10.c. Is the optimal decision rule sensitive to how you grouped the data?
1.10.d. In what way is this problem unrealistic? How would you change it?
1.10.e. Do you believe using the average rainfall during Sept./Oct. is an adequate measure of precipitation for making a decision? Why or why not?


RAINFALL DATA (IN CENTIMETERS), OKANAGAN CENTRE

[Table: weekly rainfall (Week 1 through Week 9) and the average Sept/Oct rainfall for each of the years 1955 through 1974.]