EXPECTED VALUE OF INFORMATION
Omkar Apahle

Page 1:

EXPECTED VALUE OF INFORMATION

Omkar Apahle

Page 2:

INFORMATION

[Diagram: information flowing between A, B and C.]

Page 3:

EXPECTED VALUE OF INFORMATION (EVI)

• EVI is required to counteract the effects of:
  - overconfidence
  - underestimation of risk
  - surprise

• EVI is often required in:
  - risk analysis
  - sensitivity analysis
  - decision problems

Page 4:

DEFINITION

• Expected Value of Information (EVI) is the integral, over all possible posterior distributions, of the opportunity loss prevented by improved information, weighted by the probability of that information.

Page 5:

CLASSIFICATION

Page 6:

• Example: weather conditions and a camping decision.

• EVPI = the highest price the decision maker is willing to pay to know the weather condition before making the camping decision.

• EVII = the highest price the decision maker is willing to pay to know the weather forecast (imperfect information) before making the camping decision.

Page 7:

CHARACTERISTICS OF EVI

• Expected Value of Information (EVI) can never be less than zero.

• No other information gathering / sharing activities can be more valuable than that quantified by value of perfect information.

Page 8:

SOURCES

• Sample data

• Expert judgments

Page 9:

EVI & BAYES RULE

• Bayesian analysis relies on both sample information and prior information about uncertain prospects.

• Bayesian analysis provides a formal representation of human learning: an individual updates his/her subjective beliefs after receiving new information.

Page 10:

EVI & Bayes rule continued ….

• Investment in stock market

• Expert will provide perfect information

• Perfect information = always correct

• P( Expert says “Market Up” | Market really goes up ) = 1

Page 11:

• Applying Bayes’ theorem

P( Market Up | Exp says “Up” ) = P( Exp “Up” | Market Up ) P( Market Up ) / [ P( Exp “Up” | Market Up ) P( Market Up ) + P( Exp “Up” | Market Down ) P( Market Down ) ]

EVI & Bayes rule continued ….
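As a minimal sketch (not from the slides), the posterior above can be computed for both a perfect and an imperfect expert; the 80% hit rate and 20% false-alarm rate below are illustrative assumptions:

```python
def posterior_up(p_up, p_say_up_given_up, p_say_up_given_down):
    """P(Market Up | Expert says 'Up') via Bayes' theorem."""
    num = p_say_up_given_up * p_up
    return num / (num + p_say_up_given_down * (1 - p_up))

# A perfect expert (hit rate 1, false-alarm rate 0) yields certainty.
assert posterior_up(0.5, 1.0, 0.0) == 1.0
# An imperfect expert (assumed 80% hit rate, 20% false alarms) only shifts belief.
assert abs(posterior_up(0.5, 0.8, 0.2) - 0.8) < 1e-12
```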

Page 12:

EVI & PRIOR DISTRIBUTION

• EVI depends on the prior distribution used to represent current information.

• Subject-matter experts and lay people often produce distributions that are far too tight.

• New or more precise measurements are often found to be outside the reported error bars of old measurements.

Page 13:

• The expected posterior probability of any region, before the results are known, is exactly the prior probability of that region.

• Expected value of posterior mean is equal to the prior mean.

• Prior variance = Expected posterior variance + Variance of the posterior mean
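These identities can be checked exactly in a conjugate Beta–Bernoulli model; the sketch below uses exact rational arithmetic, and the Beta(1,1) prior is an illustrative choice:

```python
from fractions import Fraction as F

def beta_mean(a, b):
    return F(a, a + b)

def beta_var(a, b):
    return F(a * b, (a + b) ** 2 * (a + b + 1))

# Prior Beta(1,1) on theta; one Bernoulli(theta) observation x in {0, 1}.
# Marginal P(x = 1) equals the prior mean; posteriors are Beta(2,1) / Beta(1,2).
p1 = beta_mean(1, 1)
exp_post_var = p1 * beta_var(2, 1) + (1 - p1) * beta_var(1, 2)
exp_post_mean = p1 * beta_mean(2, 1) + (1 - p1) * beta_mean(1, 2)
var_post_mean = (p1 * (beta_mean(2, 1) - exp_post_mean) ** 2
                 + (1 - p1) * (beta_mean(1, 2) - exp_post_mean) ** 2)

assert exp_post_mean == beta_mean(1, 1)                # prior mean preserved
assert beta_var(1, 1) == exp_post_var + var_post_mean  # law of total variance
```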

Page 14:

UNCERTAINTY

• A random variable y is more uncertain than another random variable z if:

  - y = z + random noise;
  - every risk averter prefers a gamble with payoffs equal to z to one with payoffs equal to y;
  - the density of y can be obtained from the density of z by shifting weight to the tails through a series of mean-preserving spreads.

Page 15:

Uncertainty continued…..

• Whether or not to include uncertainty in the analysis depends purely on the decision maker

• Expected Value of Including Uncertainty (EVIU)

• Expected Value of Ignoring (Excluding) Uncertainty (EVEU)

Page 16:

NOTATION

• d ∈ D is a decision chosen from the decision space D
• x ∈ X is an uncertain variable in the space X
• L(d, x) is the loss function of d and x
• f(x) is the prior subjective probability density on x
• x_iu = E(x) is the value assumed for x when uncertainty is ignored
• E[ L(d, x) ] = ∫_X L(d, x) f(x) dx is the prior expected loss over x for decision d

Page 17:

Notation continued …

• Bayes’ decision: d_y = argmin_d E[ L(d, x) ]

• Deterministic optimum decision ignoring uncertainty: d_iu = argmin_d L(d, x_iu)

Page 18:

EVIU

• Expected Value of Including Uncertainty (EVIU) is the expectation of the difference in loss between the optimal decision ignoring uncertainty and the Bayes’ decision.

EVIU = E[ L(d_iu, x) ] − E[ L(d_y, x) ]

Page 19:

EVPI

• Expected Value of Perfect Information (EVPI) is the expectation of the difference in loss between the Bayes’ decision and the decision d_pi(x) made after the uncertainty is removed by obtaining perfect information on x.

EVPI = E[ L(d_y, x) ] − E[ L(d_pi(x), x) ]

Page 20:

SOCRATIC RATIO

• A dimensionless index of the relative severity of EVIU and EVPI:

  S_iu = EVIU / EVPI
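A small discrete sketch (all costs and probabilities are illustrative) puts EVIU, EVPI and the Socratic ratio together. With an asymmetric bilinear loss, the Bayes decision differs from the decision based on the point estimate, so EVIU > 0:

```python
a, b = 4.0, 1.0                     # assumed unit costs of over- / under-shooting
xs = [10.0, 20.0, 30.0]             # possible values of the uncertain x
ps = [0.25, 0.5, 0.25]              # prior probabilities

def loss(d, x):
    return a * (d - x) if d > x else b * (x - d)

def e_loss(d):
    return sum(p * loss(d, x) for x, p in zip(xs, ps))

x_iu = sum(p * x for x, p in zip(xs, ps))  # point estimate E[x] = 20
d_iu = x_iu                                # argmin_d L(d, x_iu): zero loss at d = x_iu
d_y = min(xs, key=e_loss)                  # Bayes decision (optimum lies on the support)
eviu = e_loss(d_iu) - e_loss(d_y)          # value of including uncertainty
evpi = e_loss(d_y)                         # perfect info: match x exactly, loss 0
s_iu = eviu / evpi                         # Socratic ratio

assert (d_y, eviu, evpi, s_iu) == (10.0, 2.5, 10.0, 0.25)
```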

Page 21:

LOSS FUNCTIONS

• When does ignoring uncertainty not matter?

• Classes of common loss functions:
  1. Linear
  2. Quadratic
  3. Cubic

Page 22:

LINEAR LOSS FUNCTION

• Assume x_iu = x̄ (the prior mean)

• d ∈ { d_1, d_2, …, d_n }

• Loss function:

  L(d, x) = a_1 + b_1 x if d = d_1
            a_2 + b_2 x if d = d_2
            …
            a_n + b_n x if d = d_n

Page 23:

Linear loss function continued….

• Since each loss is linear in x: E[ L(d, x) ] = L(d, x̄) = L(d, x_iu)

• Bayes’ decision:
  d_y = argmin_d E[ L(d, x) ] = argmin_d L(d, x_iu) = d_iu

• EVIU = E[ L(d_iu, x) ] − E[ L(d_y, x) ] = 0

• Considering uncertainty makes no difference to the decision, and hence to the outcome.
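The linear case can be verified numerically; the coefficients and prior below are arbitrary illustrative choices:

```python
acts = [(2.0, 1.0), (0.0, 2.0), (5.0, -1.0)]  # (a_i, b_i) for actions d_1..d_3
xs = [1.0, 2.0, 3.0]                           # possible values of x
ps = [0.25, 0.5, 0.25]                         # prior probabilities
mean_x = sum(p * x for x, p in zip(xs, ps))    # x_iu = E[x] = 2.0

def e_loss(i):
    a, b = acts[i]
    return sum(p * (a + b * x) for x, p in zip(xs, ps))  # = a + b * mean_x

d_y = min(range(len(acts)), key=e_loss)                   # Bayes decision
d_iu = min(range(len(acts)), key=lambda i: acts[i][0] + acts[i][1] * mean_x)
assert d_y == d_iu    # identical decisions either way, so EVIU = 0
```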

Page 24:

QUADRATIC LOSS FUNCTION

• Let the loss function be L(d, x) = k (d − x)²

• E[ L(d, x) ] = k ( d² − 2d E(x) + E(x²) )

• Setting the derivative with respect to d to zero:

  2d − 2 E(x) = 0, so d_y = E(x) = d_iu

• Uncertainty makes no difference to the decision.
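A grid search confirms the quadratic result (the numbers are illustrative; the mean 2.25 was chosen to lie on the grid):

```python
k = 2.0
xs = [0.0, 1.0, 4.0]
ps = [0.25, 0.25, 0.5]
mean_x = sum(p * x for x, p in zip(xs, ps))  # E[x] = 2.25

def e_loss(d):
    return sum(p * k * (d - x) ** 2 for x, p in zip(xs, ps))

grid = [i / 100 for i in range(501)]          # candidate d in [0, 5]
d_y = min(grid, key=e_loss)
assert d_y == mean_x                          # Bayes decision = prior mean
```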

Page 25:

CUBIC ERROR LOSS FUNCTION

• Decisions involving uncertain future demand
• L(d, x) = r (d − x)² + s (d − x)³, with r, s > 0
• Henrion (1989) showed that

  S_iu = EVIU / EVPI < 1/3

• Obtaining better information about x is therefore worth at least three times as much as including uncertainty, in all cases.

Page 26:

BILINEAR LOSS FUNCTION

• The newsboy problem: how many newspapers to order?
• d = number of newspapers ordered
  x = uncertain demand
  a = $ lost per paper if too many are ordered
  b = $ forgone per paper if too few are ordered

Page 27:

Newsboy problem continued ….

• Loss function (a, b > 0):

  L(d, x) = a (d − x) if d > x
            b (x − d) if d < x

Page 28:

Newsboy problem continued ….

• Uniform prior on x, with mean x̄ and width w (w > 0):

  f(x) = 1/w if x̄ − w/2 < x < x̄ + w/2
         0 otherwise

• d_iu = x̄

Page 29:

Newsboy problem continued ….

[Figure: uniform probability density f(x) of the paper demand x, of width w and centred on x̄, with the order decision d marked.]

Page 30:

Newsboy problem continued ….

[Figure: loss as a function of the error (excess newspapers, d − x): the loss rises as b(x − d) when too few papers are ordered and as a(d − x) when too many are ordered.]

Page 31:

Newsboy problem continued ….

• Results:

  EVPI = w a b / ( 2 (a + b) )

  EVIU = w (a − b)² / ( 8 (a + b) )

  S_iu = EVIU / EVPI = (a − b)² / ( 4 a b )
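These formulas can be checked by direct numerical integration over the uniform prior; this is a sketch, and a = 3, b = 1, w = 10 and x̄ = 50 are illustrative values:

```python
a, b, w, m = 3.0, 1.0, 10.0, 50.0
lo, hi = m - w / 2, m + w / 2

def exp_loss(d, n=20000):
    """E[L(d, x)] under the uniform prior, by midpoint quadrature."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        total += (a * (d - x) if d > x else b * (x - d)) * h / w
    return total

d_y = lo + w * b / (a + b)  # Bayes decision: the b/(a+b) quantile of demand
d_iu = m                    # decision ignoring uncertainty: the mean
evpi = exp_loss(d_y)        # perfect information would drive the loss to zero
eviu = exp_loss(d_iu) - exp_loss(d_y)

assert abs(evpi - w * a * b / (2 * (a + b))) < 1e-3          # 3.75
assert abs(eviu - w * (a - b) ** 2 / (8 * (a + b))) < 1e-3   # 1.25
assert abs(eviu / evpi - (a - b) ** 2 / (4 * a * b)) < 1e-3  # 1/3
```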

Page 32:

Newsboy problem continued ….

• The Socratic ratio is independent of the width w of the uncertainty.

• EVIU does not shrink relative to EVPI as uncertainty grows.

• When the unit costs a and b are very asymmetric, considering uncertainty matters more than obtaining better information.

Page 33:

CATASTROPHIC LOSS PROBLEM

• The plane-catching problem: how long should one allow for the trip to the airport?
• d = time allowed for the trip (the decision)
  x = actual travel time (uncertain)
  k = marginal cost per minute of leaving earlier
  M = loss due to missing the plane

Page 34:

Plane-catching problem continued ….

• Loss function:

  L(d, x) = k (d − x) if d ≥ x (time wasted by arriving early)
            M if d < x (the plane is missed)

• k and M are positive

• M > k (d − x) for all d, x

Page 35:

Plane-catching problem continued ….

[Figure: loss L(d, x = 35) as a function of the departure decision d (minutes before the plane, −60 to 0), with the loss axis running from 0 to 300 minutes: k(d − x) is the wasted time for early departures, and M is the loss from missing the plane.]

Page 36:

Plane-catching problem continued ….

• x is uncertain and the density is subjective
• f(x) = subjective probability density of the travel time
• Bayes’ decision d_y allows a time such that

  f(d_y) = k / M

Page 37:

Plane-catching problem continued ….

• If we ignore uncertainty, d_iu = x_0.5, where x_0.5 is the median travel time

• d_iu will lead us to miss the plane half the time

• EVIU = E[ L(d_iu, x) ] − E[ L(d_y, x) ]

• EVPI = E[ L(d_y, x) ], since perfect information about x reduces the loss to zero
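A numerical sketch with an assumed Normal travel-time distribution (mean 30 min, s.d. 5 min, k = 1 per minute, M = 300; all parameters illustrative) confirms the Bayes condition and that the median decision misses the plane half the time:

```python
import math

mu, sd, k, M = 30.0, 5.0, 1.0, 300.0

def f(x):   # travel-time density
    return math.exp(-((x - mu) / sd) ** 2 / 2) / (sd * math.sqrt(2 * math.pi))

def F(x):   # travel-time CDF
    return 0.5 * (1 + math.erf((x - mu) / (sd * math.sqrt(2))))

xs = [mu - 6 * sd + i * 0.05 for i in range(int(12 * sd / 0.05))]

def exp_loss(d):
    # E[L(d, x)] = k * E[(d - x); x <= d] + M * P(x > d)
    early = sum(k * (d - x) * f(x) * 0.05 for x in xs if x <= d)
    return early + M * (1 - F(d))

d_y = min((mu + i * 0.05 for i in range(600)), key=exp_loss)
d_iu = mu                                 # median = mean for a Normal

assert abs(F(d_iu) - 0.5) < 1e-12         # miss the plane half the time
assert abs(f(d_y) - k / M) < 5e-4         # Bayes condition f(d_y) = k / M
assert exp_loss(d_iu) > exp_loss(d_y)     # hence EVIU > 0
```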

Page 38:

HEURISTIC FACTORS

• Four heuristic factors help in understanding EVI:

1.Uncertainty (about parameter value)

2. Informativeness (the extent to which the current uncertainty may be reduced)

Page 39:

Heuristic factors continued …

3. Promise (the probability that improved information will result in a different decision, and the magnitude of the resulting gain)

4. Relevance (the extent to which uncertainty about the parameter contributes to overall uncertainty)

Page 40:

Heuristic factors continued …

Example: whether to permit or prohibit the use of a food additive

• Expected social cost = number of life-years lost
• θ = risk from consuming the additive (excess cancer risk)
• K = expected social cost of using a substitute if the additive is prohibited

Page 41:

Heuristic factors continued …

• f_0 = probability distribution representing current information about θ
• f_+ = probability distribution representing “θ is hazardous”
• f_− = probability distribution representing “θ is safe”
• L_0 = expected social cost if the additive is permitted
• L_1 = expected social cost if the additive is permitted or prohibited after research

Page 42:

Heuristic factors continued …

• EVI = L0 – L1

• Condition: the additive is permitted if and only if L_0 < K

• Substantial chance that additive is riskier than alternative and should be prohibited
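A two-state numerical sketch of EVI = L_0 − L_1, for the case where research perfectly reveals whether θ is safe or hazardous (all numbers are illustrative assumptions):

```python
p_haz = 0.3                      # prior probability the additive is hazardous
cost_safe, cost_haz = 2.0, 40.0  # expected life-years lost if permitted
K = 10.0                         # expected social cost of the substitute

# Best action under current information: permit or prohibit outright.
L0 = min(p_haz * cost_haz + (1 - p_haz) * cost_safe, K)
# After research: permit when safe, prohibit (cost K) when hazardous.
L1 = (1 - p_haz) * min(cost_safe, K) + p_haz * min(cost_haz, K)
evi = L0 - L1

assert abs(L0 - 10.0) < 1e-9     # prohibiting is the best prior action here
assert abs(evi - 5.6) < 1e-9     # research is worth 5.6 life-years in expectation
```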

Page 43:

[Figure: distributions f_−, f_0 and f_+ over θ, plotted against expected social loss, with K, L_1 and L_0 marked and the θ-regions where additive use is permitted or prohibited indicated.]

Page 44:

Effect of greater prior uncertainty

[Figure: distributions f_−, f_0 and f_+ over θ against expected social loss, with K, L_1, L_0 and L_− marked, for a more dispersed prior.]

Page 45:

Effect of greater informativeness

[Figure: the same layout with K, L_1 and L_0 marked, for a more informative study (narrower f_− and f_+).]

Page 46:

Effect of greater promise

[Figure: the same layout with K, L_1 and L_0 marked, illustrating greater promise.]

Page 47:

Effect of relevance

[Figure: permit / prohibit decision regions in the (θ_1, θ_2) plane.]

Page 48:

RISK PREMIUM

• How can the monetary value of a risk be measured?
• Let:
  a = uncertain monetary reward
  w = initial wealth
  w + a = terminal wealth
  U(w + a) = utility of terminal wealth

Page 49:

Selling price of risk

• Rs = selling price of risk

= sure amount of money a decision maker would be willing to receive to sell the risk a

• { w + Rs } ~* { w + a }

• Under an expected utility function: U( w + R_s ) = EU( w + a )

Page 50:

Bid price of risk

• Rb = bid price of risk

= sure amount of money a decision maker would be willing to pay to buy the risk a

• { w } ~* { w + a - Rb }

• Under an expected utility function: U( w ) = EU( w + a − R_b )

Page 51:

Risk premium

• R = risk premium = the sure amount of money that makes one indifferent between receiving the risky return a and receiving the sure amount E(a) − R

• { w + a } ~* { w + E(a) − R }

• EU( w + a ) = U( w + E(a) − R )
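With log utility (an illustrative risk-averse choice, not from the slides) the selling price and risk premium follow directly from these definitions:

```python
import math

w = 100.0                                    # initial wealth (illustrative)
outcomes, probs = [20.0, -20.0], [0.5, 0.5]  # risky reward a: 50/50 gamble of +/-20

eu = sum(p * math.log(w + a) for a, p in zip(outcomes, probs))
cert_equiv = math.exp(eu)                    # v such that U(v) = EU(w + a)
r_s = cert_equiv - w                         # selling price: U(w + R_s) = EU(w + a)
e_a = sum(p * a for a, p in zip(outcomes, probs))  # E(a) = 0
R = e_a - r_s                                # risk premium: w + E(a) - R = v

assert abs(cert_equiv - math.sqrt(120 * 80)) < 1e-9  # geometric mean under log utility
assert R > 0                                 # a risk averter demands a positive premium
```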

Page 52:

Risk premium continued …

• v = w + E(a) − R is the certainty equivalent

• R = E(w + a) − v: the risk premium is the expected loss accepted to eliminate the risk

• It measures the willingness to insure

Page 53:

COMBINING PRIORS

• A single expert reports an idiosyncratic perception of a consensus

• It is therefore useful to combine the judgments of several experts

• Aggregation procedures:
  - weighted average
  - Bayes’ rule
  - copula method

Page 54:

Example

• Climate sensitivity (Morgan and Keith, 1995)

• ΔT_2x = equilibrium increase in global annual mean surface temperature resulting from a doubling of atmospheric CO2 from its pre-industrial concentration

Page 55:

Example continued …

• Estimates gathered from different experts.
• All experts are treated equally.
• Range given by the IPCC: 1.5 to 4.5 °C
• Most likely value: 2.5 °C
• Tails extended to account for underestimation
• Sensitivity analysis: include / exclude expert 5

Page 56:

Example continued …

[Figure: elicited distributions of ΔT_2x (°C, roughly −5 to 20) for experts 1 to 16, including experts 4a and 4b.]

Page 57:

Example continued …

[Figure: combined PDFs of ΔT_2x: all experts; all experts excluding expert 5; all experts with exponential tails.]

Page 58:

BEST ESTIMATE

• x_iu = mean of the subjective probability distribution (SPD)

• What should be chosen: the mean, median or mode?

• If the mean is close to the median, a decision based on that central estimate, ignoring uncertainty, may suffice.

• If the mean is far above the median, the distribution is heavily skewed, and the decision should consider the possibility of extreme scenarios.

Page 59:

IN CONCLUSION

• “As for me, all I know is I know nothing.” — Socrates

• Expected Value of Information depends upon the expected benefits of Socratic wisdom (i.e. admitting one’s limits of knowledge) relative to the expected benefits of perfect wisdom (i.e. knowing the truth).

Page 60:

THANK YOU!