Value of Choices
• Consider the value you derive from some choice
  Say there are 2 choices, each with n consequences: c1, c2, ..., cn
  One of the consequences ci will occur with probability pi
  Each consequence has some value: V(ci)
  Which choice do you make?
• Example: Buy a $1 lottery ticket (for a $1M prize)?
  Probability of winning is 1/10^7
  Buy: c1 = win, c2 = lose, V(c1) = 10^6 − 1, V(c2) = −1
  Don't Buy: c1 = lose, V(c1) = 0
  E(buy) = (1/10^7)(10^6 − 1) + (1 − 1/10^7)(−1) ≈ −0.9
  E(don't buy) = 1 · 0 = 0
  "You can't lose if you don't play!"
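The lottery calculation above can be checked with a few lines of Python, using exactly the values from the slide:

```python
# Expected value of buying a $1 lottery ticket for a $1M prize,
# with a 1-in-10^7 chance of winning (values from the slide).
p_win = 1 / 10**7
e_buy = p_win * (10**6 - 1) + (1 - p_win) * (-1)  # win nets $999,999; lose nets -$1
e_dont_buy = 0.0  # keep your dollar: no gain, no loss

print(round(e_buy, 2))   # ≈ -0.9: each ticket loses about 90 cents on average
print(e_dont_buy)        # 0.0
```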
Probability Tree
• Model outcomes of probabilistic events with a tree
  Branch points are also called "chance nodes"
  [Tree: a coin-flip chance node branches to Heads with probability p and Tails with probability 1 − p]
• Useful for modeling decisions
  [Tree: "Buy ticket?" — yes leads to a chance node paying $1,000,000 − 1 with probability p and −$1 with probability 1 − p; no leads to $0]
  Expected payoff:
  yes = p(1,000,000 − 1) + (1 − p)(−1)
  no = 0
Utility
• Utility U(x) is the "value" you derive from x
  [Tree: "Play?" — yes: $20,000 with probability 0.5, $0 with probability 0.5; no: $10,000 for sure. The same tree in utility terms: yes: U($20,000) with probability 0.5, U($0) with probability 0.5; no: U($10,000)]
• Utility can be monetary, but often includes intangibles
  o E.g., quality of life, life expectancy, personal beliefs, etc.
Micromort
• A micromort is a 1-in-1,000,000 chance of death
  How much would you need to be paid to take on the risk of a micromort?
  How much would you pay to avoid a micromort?
  o P(die in plane crash) ≈ 1 in 1,500,000
  o P(killed by lightning) ≈ 1 in 1,400,000
  How much would you need to be paid to take on a decimort (a 1-in-10 chance of death)?
• If you think this is morbid, companies actually make these calculations
  o Car manufacturers
  o Insurance companies
Non-Linear Utility of Money
• These two choices are different for most people
  [Tree 1: "Play?" — yes: $10 with probability 0.5, $0 with probability 0.5; no: $2 for sure]
  [Tree 2: "Play?" — yes: $100,000,000 with probability 0.5, $0 with probability 0.5; no: $20,000,000 for sure]
Utility Curves
[Figure: utility (y-axis, 0–180) vs. dollars (x-axis, $0–$100), showing three curves: Risk Preferring (convex), Risk Neutral (linear), and Risk Averse (concave)]
• Your utility curve determines your "risk preference"
  It can be different in different parts of the curve
Risk Neutral
[Figure: linear Risk Neutral utility curve, utility vs. dollars]
• The first $50 is worth the same to you as the "next" $50
Risk Averse
[Figure: concave Risk Averse utility curve, utility vs. dollars]
• The first $50 is worth more to you than the "next" $50
Risk Preferring
[Figure: convex Risk Preferring utility curve, utility vs. dollars]
• The first $50 is worth less to you than the "next" $50
Risk Profiles
• Most people are risk averse beyond some (reasonably small) amount
  Consider the notion of "necessities" vs. "luxuries"
• But there are some cases where people show risk-seeking behavior
  Small cost, high potential payoff (with very low probability)
  o E.g., playing the lottery
  Sometimes the "risk seeking" aspect is downplayed by assigning utility to the "fun of playing"
  o Total utility = expected payoff of the game + fun of playing the game
  o E.g., gambling
• Utility functions change over time
  People tend to become less risk averse as their economic viability increases
Utility Function Properties
• Increasing function
  More money is preferred to less
• Continuous (smooth) function
  Does not change "drastically": a small change in the input should not change the output significantly
• Only the ordinal rankings of the utility function matter for making a choice
  The actual utility value may not be meaningful
  Sometimes the unit of measurement is called "utils"
Maximizing Expected Utility
• Say your utility function is U(x) = √x
• Consider the following "gambles"
  [Gamble A: $100 with probability 0.5, $0 with probability 0.5. Gamble B: $36 with probability 1.0]
• Compute expected utility
  Gamble A: (0.5)U($100) + (0.5)U($0) = (0.5)(10) + (0.5)(0) = 5
  Gamble B: (1.0)U($36) = (1.0)(6) = 6
• Select the gamble that has maximal expected utility
  You would choose Gamble B here
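The comparison above is a one-liner once gambles are represented as (probability, payoff) pairs; a minimal sketch using the slide's U(x) = √x:

```python
import math

# Expected utility under U(x) = sqrt(x), the utility function from the slide.
def utility(x):
    return math.sqrt(x)

def expected_utility(gamble):
    """gamble: list of (probability, payoff) pairs."""
    return sum(p * utility(x) for p, x in gamble)

gamble_a = [(0.5, 100), (0.5, 0)]  # EU = 0.5*10 + 0.5*0 = 5
gamble_b = [(1.0, 36)]             # EU = 1.0*6 = 6

print(expected_utility(gamble_a))  # 5.0
print(expected_utility(gamble_b))  # 6.0 → choose Gamble B
```

Note that Gamble A has the higher expected *monetary* value ($50 vs. $36); the concave √x utility is what flips the preference.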
Compound Gamble
• I will flip a fair coin
  If "heads", you win $5
  If "tails", I roll a 6-sided die, and you win $X where X is the number rolled
  [Compound gamble: heads (1/2) → $5; tails (1/2) → chance node paying $1–$6, each with probability 1/6]
  [Reduced (simple) gamble: $1, $2, $3, $4, $6 with probability 1/12 each; $5 with probability 7/12]
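The reduction to a simple gamble is just the law of total probability; a short sketch with exact fractions:

```python
from fractions import Fraction

# Reduce the compound gamble (fair coin; heads -> $5, tails -> fair die roll)
# to a simple gamble over payoffs.
half = Fraction(1, 2)
sixth = Fraction(1, 6)

simple = {}
simple[5] = half  # heads pays $5
for face in range(1, 7):  # tails: win the die's face value
    simple[face] = simple.get(face, 0) + half * sixth

print(simple[5])  # 7/12 (heads, or tails then rolling a 5)
print(simple[3])  # 1/12
```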
Justifying Expected Utility Maximization
• If you subscribe to these properties, you should maximize expected utility:
  o You are indifferent between a compound gamble and the simple gamble to which it reduces using probability theory
  o For any two gambles A and B, you are willing to say A ≥ B or B ≥ A
  o If A ≥ B and B ≥ C, then A ≥ C (Transitivity)
  o If A > B and B > C, then E > B > D, where for very small p:
    Gamble D = [p: A, 1 − p: C] ("almost C")
    Gamble E = [p: C, 1 − p: A] ("almost A")
  o If A > B, then D > E, where for any p > 0:
    Gamble D = [p: A, 1 − p: C]
    Gamble E = [p: B, 1 − p: C]
Certain Equivalent
• Consider playing this game:
  [Tree: "Play?" — yes: $20 with probability 0.5, $0 with probability 0.5; no: $X for sure ← "Certain Equivalent" (CE)]
• For what value of X are you indifferent to playing?
  X = 3? X = 7? X = 9? X = 10? X = 11?
• The certain equivalent is the value of the game to you
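There is no single right X, but a utility function pins one down. A sketch assuming the U(x) = √x function used on an earlier slide, purely for illustration: the CE solves U(CE) = expected utility, so CE = U⁻¹(EU).

```python
import math

# Certain equivalent of the game (win $20 or $0, each with probability 0.5),
# assuming U(x) = sqrt(x) as on the earlier slide.
eu = 0.5 * math.sqrt(20) + 0.5 * math.sqrt(0)  # expected utility ~= 2.236
ce = eu ** 2  # invert U(x) = sqrt(x)

print(ce)  # 5.0 — this risk-averse player values the game well below its $10 EMV
```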
Risk Premium
• A slightly different game:
  [Tree: "Play?" — yes: $20,000 with probability 0.5, $0 with probability 0.5; no: $7,000 for sure — say this is our CE]
  Expected monetary value (EMV) = expected dollar value of the game (here = $10,000)
  Risk premium = EMV − CE = $10,000 − $7,000 = $3,000
  o How much you would pay (give up) to avoid risk
  o This is what insurance is all about
  [Tree: "Insure car?" — no: −$30,000 with probability 0.02, $0 with probability 0.98 (expected value −$600); yes: −$1,000 for sure]
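Both calculations on this slide are small arithmetic checks:

```python
# Risk premium for the slide's game: win $20,000 or $0 with equal probability,
# with a stated certain equivalent (CE) of $7,000.
emv = 0.5 * 20_000 + 0.5 * 0   # expected monetary value = $10,000
ce = 7_000                     # certain equivalent (given on the slide)
risk_premium = emv - ce        # $3,000: what you'd give up to avoid the risk

# Insurance example: a 2% chance of a $30,000 loss vs. a $1,000 premium.
e_uninsured = 0.02 * -30_000 + 0.98 * 0  # expected loss = -$600

print(risk_premium)   # 3000.0
print(e_uninsured)    # -600.0
```

The insurer profits because the $1,000 premium exceeds the $600 expected loss, yet a risk-averse owner still buys: the CE of facing the risk is worse than −$1,000.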
Let's Do a Real Test
• Game set-up
  I will flip a fair coin
  If "heads", you win $50; if "tails", you win $0
• How much would you be willing to pay me to play?
  o $1?
  o $10?
  o $20?
  o $24.99?
  o $25.01?
  o $30?
• Who is willing to bid highest?
  o How did you determine that value?
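The listed bids bracket the game's fair price; a quick sketch of the expected profit at each bid:

```python
# Coin-flip game from the slide: heads pays $50, tails pays $0.
emv = 0.5 * 50 + 0.5 * 0  # = $25, the risk-neutral fair price

for bid in [1, 10, 20, 24.99, 25.01, 30]:
    print(f"bid ${bid}: expected profit ${emv - bid:+.2f}")
# A risk-neutral player bids up to $25; $25.01 and $30 lose money on
# average, and a risk-averse player's certain equivalent is below $25.
```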