PHIL309P Methods in Philosophy, Politics and Economics
Eric Pacuit, University of Maryland


  1. PHIL309P Methods in Philosophy, Politics and Economics. Eric Pacuit, University of Maryland.

  2. Utility Function: A utility function on a set X is a function u : X → R.

  3. Utility Function: A utility function on a set X is a function u : X → R. A preference ordering is represented by a utility function iff x is (weakly) preferred to y provided u(x) ≥ u(y).

  4. Utility Function: A utility function on a set X is a function u : X → R. A preference ordering is represented by a utility function iff x is (weakly) preferred to y provided u(x) ≥ u(y). What properties does such a preference ordering have?
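
A small illustration of the definition (not from the slides): a hypothetical Python sketch that takes a utility function on a finite set, derives the preference ordering it represents, and checks that the induced ordering is complete and transitive, anticipating the answer on the next slide. The set X and the utility values are invented for the example.

```python
from itertools import product

# Hypothetical finite set of outcomes and a utility function on it.
X = ["x", "y", "z"]
u = {"x": 3, "y": 2, "z": 1}

# The ordering represented by u: a is weakly preferred to b iff u(a) >= u(b).
def weakly_preferred(a, b):
    return u[a] >= u[b]

# Completeness: for every pair, at least one direction of preference holds.
complete = all(weakly_preferred(a, b) or weakly_preferred(b, a)
               for a, b in product(X, X))

# Transitivity: if a is weakly preferred to b and b to c, then a to c.
transitive = all(weakly_preferred(a, c)
                 for a, b, c in product(X, X, X)
                 if weakly_preferred(a, b) and weakly_preferred(b, c))

print(complete, transitive)  # True True
```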

  5. Ordinal Utility Theory: Fact. Suppose that X is finite and ⪰ is a complete and transitive ordering over X; then there is a utility function u : X → R that represents ⪰ (i.e., x ⪰ y iff u(x) ≥ u(y)).

  6. Ordinal Utility Theory: Fact. Suppose that X is finite and ⪰ is a complete and transitive ordering over X; then there is a utility function u : X → R that represents ⪰ (i.e., x ⪰ y iff u(x) ≥ u(y)). Utility is defined in terms of preference (so it is an error to say that the agent prefers x to y because she assigns a higher utility to x than to y).
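
One standard way to construct the utility function whose existence the Fact asserts, sketched here as hypothetical Python under the assumption that X is finite: assign each outcome the number of outcomes it is weakly preferred to. The concrete ordering used below is made up for illustration.

```python
# A made-up complete, transitive weak ordering on a finite set,
# encoded by ranks: prefers(a, b) is True when a is weakly preferred to b.
X = ["x", "y", "z"]
rank = {"x": 0, "y": 1, "z": 2}   # x above y above z

def prefers(a, b):
    return rank[a] <= rank[b]

# Construction: u(a) = number of outcomes that a is weakly preferred to.
u = {a: sum(1 for b in X if prefers(a, b)) for a in X}
print(u)  # {'x': 3, 'y': 2, 'z': 1}

# Representation condition: a is weakly preferred to b iff u(a) >= u(b).
assert all(prefers(a, b) == (u[a] >= u[b]) for a in X for b in X)
```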

  7. Important: All three of the utility functions below represent the preference x ≻ y ≻ z.

    Item   u_1   u_2   u_3
    x      3     10    1000
    y      2     5     99
    z      1     0     1

    x ≻ y ≻ z is represented by both (3, 2, 1) and (1000, 999, 1), so one cannot say that y is “closer” to x than to z.
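
A quick check, in the same hypothetical Python style, that all three columns of the table induce the same strict ranking x ≻ y ≻ z (so the extra numerical spread in u_2 and u_3 carries no information):

```python
u1 = {"x": 3, "y": 2, "z": 1}
u2 = {"x": 10, "y": 5, "z": 0}
u3 = {"x": 1000, "y": 99, "z": 1}

def ranking(u):
    # Outcomes sorted from best to worst according to u.
    return sorted(u, key=u.get, reverse=True)

print(ranking(u1), ranking(u2), ranking(u3))  # ['x', 'y', 'z'] each time
assert ranking(u1) == ranking(u2) == ranking(u3)
```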

  8. What is utility?
    - usefulness
    - from the Principle of Utility: an object’s “tendency to produce benefit, advantage, pleasure, good, or happiness” (Broome, p. 19) for all people
    - a person’s personal, subjective good
    - “the value of a function that represents a person’s preferences” (Reiss, p. 21)

  9. Economists primarily use the last sense of utility (as will we). This is not problematic in itself; however, “[i]f...you use ‘utility’ to stand for a representation of a person’s preferences, and at the same time for the person’s good, you cannot even express the question [of whether or not persons always act so as to maximize their utility]. You will say: by definition, what a person prefers has more utility for her, so how can it fail to have more utility for her? The ambiguity is intolerable.” (Reiss, p. 21)

  10. Types of Choices:
    - certainty: highly confident about the relationship between actions and outcomes
    - risk: clear sense of the possibilities and their likelihoods
    - (Knightian) uncertainty: the relationship between actions and outcomes is so imprecise that it is not possible to assign likelihoods

  11. Lotteries: Suppose that X is a set of outcomes. A (simple) lottery over X is denoted [x_1 : p_1, x_2 : p_2, . . . , x_n : p_n], where for i = 1, . . . , n, x_i ∈ X and p_i ∈ [0, 1], and ∑_i p_i = 1. Let L be the set of (simple) lotteries over X. We identify elements x ∈ X with the lottery [x : 1].

  12. Lotteries: Suppose that X = {x_1, . . . , x_n} is a set of outcomes. A lottery over X is a tuple [x_1 : p_1, . . . , x_n : p_n] where ∑_i p_i = 1.

  13. Lotteries: Suppose that X = {x_1, . . . , x_n} is a set of outcomes. A lottery over X is a tuple [x_1 : p_1, . . . , x_n : p_n] where ∑_i p_i = 1.

    [Diagram: a lottery drawn as a tree, with branches labeled p_1, p_2, . . . , p_{n−1}, p_n leading to outcomes x_1, x_2, . . . , x_{n−1}, x_n.]

    Let L be the set of lotteries.
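
To make the notation concrete, here is a minimal, hypothetical Python representation of a simple lottery as a list of (outcome, probability) pairs, with a validity check that the probabilities lie in [0, 1] and sum to 1. The representation is a choice made for this sketch, not anything from the slides.

```python
# A simple lottery [x_1 : p_1, ..., x_n : p_n] as a list of (outcome, probability) pairs.
Lottery = list[tuple[str, float]]

def is_valid_lottery(lottery: Lottery) -> bool:
    """Each probability lies in [0, 1] and the probabilities sum to 1 (up to rounding)."""
    probs = [p for _, p in lottery]
    return all(0.0 <= p <= 1.0 for p in probs) and abs(sum(probs) - 1.0) < 1e-9

# A degenerate lottery identifies an outcome x with [x : 1].
print(is_valid_lottery([("x", 1.0)]))                              # True
print(is_valid_lottery([("x", 0.55), ("y", 0.25), ("z", 0.20)]))   # True
print(is_valid_lottery([("x", 0.7), ("y", 0.7)]))                  # False: sums to 1.4
```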

  14. Expected Value of a Lottery: Suppose that the outcomes of a lottery are monetary values. So, L = [x_1 : p_1, x_2 : p_2, . . . , x_n : p_n], where each x_i is an amount of money. Then EV(L) = ∑_i p_i × x_i.

  15. Expected Value of a Lottery: Suppose that the outcomes of a lottery are monetary values. So, L = [x_1 : p_1, x_2 : p_2, . . . , x_n : p_n], where each x_i is an amount of money. Then EV(L) = ∑_i p_i × x_i. E.g., if L = [$100 : 0.55, $50 : 0.25, $0 : 0.20], then EV(L) = 0.55 × 100 + 0.25 × 50 + 0.20 × 0 = 67.5.
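
A quick check of this example, reusing the hypothetical (outcome, probability)-pair representation sketched above, with monetary amounts as outcomes:

```python
def expected_value(lottery: list[tuple[float, float]]) -> float:
    """Expected monetary value: EV(L) = sum of p_i * x_i, where each x_i is an amount of money."""
    return sum(p * x for x, p in lottery)

L = [(100.0, 0.55), (50.0, 0.25), (0.0, 0.20)]
print(expected_value(L))  # 67.5
```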

  16. You are given a choice between two lotteries L_1 and L_2. The outcome of the lotteries is determined by flipping a fair coin. The payoffs for the two lotteries are given in the following table:

            Heads   Tails
    L_1     $1M     $1M
    L_2     $3M     $0

    Which of the two lotteries would you choose?
    1. L_1
    2. L_2
    3. I am indifferent between the two lotteries

  17. Expected monetary value: Suppose that the outcomes of a lottery are monetary values. So, L = [x_1 : p_1, x_2 : p_2, . . . , x_n : p_n], where each x_i is an amount of money. Then EV(L) = ∑_i p_i × x_i.

  18. Expected monetary value: Suppose that the outcomes of a lottery are monetary values. So, L = [x_1 : p_1, x_2 : p_2, . . . , x_n : p_n], where each x_i is an amount of money. Then EV(L) = ∑_i p_i × x_i. E.g., if L = [$100 : 0.55, $50 : 0.25, $0 : 0.20], then EV(L) = 0.55 × 100 + 0.25 × 50 + 0.20 × 0 = 67.5.

  19. Problems with using monetary payoffs:
    - Overly Restrictive: We care about more things than money.

  20. Problems with using monetary payoffs:
    - Overly Restrictive: We care about more things than money.
    - The St. Petersburg Paradox: Consider the following wager: I will flip a fair coin until it comes up heads; if the first time it comes up heads is on the nth toss, then I will pay you $2^n. What’s the most you’d be willing to pay for this wager? What is its expected monetary value?

  21. Problems with using monetary payoffs:
    - Overly Restrictive: We care about more things than money.
    - The St. Petersburg Paradox: Consider the following wager: I will flip a fair coin until it comes up heads; if the first time it comes up heads is on the nth toss, then I will pay you $2^n. What’s the most you’d be willing to pay for this wager? What is its expected monetary value?
    - Valuing Money: Doesn’t the value of a wager depend on more than merely how much it’s expected to pay out (e.g., your total fortune, how much you personally care about money, etc.)?

  22. Problems with using monetary payoffs:
    - Overly Restrictive: We care about more things than money.
    - The St. Petersburg Paradox: Consider the following wager: I will flip a fair coin until it comes up heads; if the first time it comes up heads is on the nth toss, then I will pay you $2^n. What’s the most you’d be willing to pay for this wager? What is its expected monetary value? (See the sketch after this list.)
    - Valuing Money: Doesn’t the value of a wager depend on more than merely how much it’s expected to pay out (e.g., your total fortune, how much you personally care about money, etc.)?
    - Risk-aversion: Is it irrational to prefer a sure thing of $x to a wager whose expected payout is $x?
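
To see why the St. Petersburg wager is a problem for expected monetary value, here is a hypothetical Python sketch that computes the expected payout when the game is capped at a maximum number of tosses: each extra allowed toss contributes another $1, so the uncapped expected value grows without bound.

```python
def st_petersburg_ev(max_tosses: int) -> float:
    """Expected payout if the game is cut off after max_tosses tosses.

    The probability that the first heads occurs on toss n is (1/2)**n and the
    payout in that case is $2**n, so every term of the sum contributes exactly $1.
    """
    return sum((0.5 ** n) * (2 ** n) for n in range(1, max_tosses + 1))

for cap in (10, 100, 1000):
    print(cap, st_petersburg_ev(cap))   # 10 10.0, 100 100.0, 1000 1000.0
```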

  23. We should move away from “monetary payouts” to “utility”.

  24. Comments on Expected Utility:

    Options   1/2   1/2
    L_1       1M    1M
    L_2       3M    0M

  25. Comments on Expected Utility:

    Options   1/2   1/2
    L_1       1M    1M
    L_2       3M    0M

    EVM(L_1) = 1/2 · 1 + 1/2 · 1 = 1
    EVM(L_2) = 1/2 · 3 + 1/2 · 0 = 1.5

  26. Comments on Expected Utility:

    Options   1/2   1/2
    L_1       1M    1M
    L_2       3M    0M

    EVM(L_1) = 1/2 · 1 + 1/2 · 1 = 1
    EVM(L_2) = 1/2 · 3 + 1/2 · 0 = 1.5

    What numbers should we use in place of monetary value?

  27. Comments on Expected Utility:

    Options   1/2   1/2
    L_1       1M    1M
    L_2       3M    0M

    EVM(L_1) = 1/2 · 1 + 1/2 · 1 = 1
    EVM(L_2) = 1/2 · 3 + 1/2 · 0 = 1.5

    What numbers should we use in place of monetary value? (Moral) value? Personal utility?

  28. Expected Utility: Suppose that X = {x_1, . . . , x_n} and u : X → R is a utility function on X. This can be extended to an expected utility function EU : L(X) → R, where EU([x_1 : p_1, . . . , x_n : p_n]) = p_1 × u(x_1) + · · · + p_n × u(x_n) = ∑_{i=1}^{n} p_i × u(x_i).
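
A minimal sketch of this extension in the same hypothetical Python style as above: a utility function on outcomes is extended to lotteries by probability-weighting the utilities. The utility values for o_1, o_2, o_3 are made up for illustration.

```python
def expected_utility(lottery: list[tuple[str, float]], u: dict[str, float]) -> float:
    """EU([x_1 : p_1, ..., x_n : p_n]) = sum of p_i * u(x_i)."""
    return sum(p * u[x] for x, p in lottery)

# Invented utility values on three outcomes.
u = {"o1": 10.0, "o2": 0.0, "o3": 6.0}
print(expected_utility([("o1", 0.4), ("o2", 0.6)], u))   # 4.0
print(expected_utility([("o3", 1.0)], u))                # 6.0
```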

  29. Suppose that Ann is faced with the choice between lotteries L_1 and L_2, where L_1 = [o_1 : 0.4, o_2 : 0.6] and L_2 = [o_3 : 1.0]. Can expected utility theory tell us how Ann should rank L_1 and L_2?

  30. Suppose that Ann is faced with the choice between lotteries L_1 and L_2, where L_1 = [o_1 : 0.4, o_2 : 0.6] and L_2 = [o_3 : 1.0]. Can expected utility theory tell us how Ann should rank L_1 and L_2? No! Suppose that Ann is also faced with the choice between lotteries L_3 and L_4, where L_3 = [o_1 : 0.2, o_2 : 0.8] and L_4 = [o_3 : 0.5, o_2 : 0.5].

  31. Suppose that Ann is faced with the choice between lotteries L_1 and L_2, where L_1 = [o_1 : 0.4, o_2 : 0.6] and L_2 = [o_3 : 1.0]. Can expected utility theory tell us how Ann should rank L_1 and L_2? No! Suppose that Ann is also faced with the choice between lotteries L_3 and L_4, where L_3 = [o_1 : 0.2, o_2 : 0.8] and L_4 = [o_3 : 0.5, o_2 : 0.5]. If we know that Ann ranks L_2 over L_1 (i.e., L_2 ≻ L_1), can we conclude anything about how Ann ranks L_3 and L_4?

  32. Suppose that Ann is faced with the choice between lotteries L_1 and L_2, where L_1 = [o_1 : 0.4, o_2 : 0.6] and L_2 = [o_3 : 1.0]. Can expected utility theory tell us how Ann should rank L_1 and L_2? No! Suppose that Ann is also faced with the choice between lotteries L_3 and L_4, where L_3 = [o_1 : 0.2, o_2 : 0.8] and L_4 = [o_3 : 0.5, o_2 : 0.5]. If we know that Ann ranks L_2 over L_1 (i.e., L_2 ≻ L_1), can we conclude anything about how Ann ranks L_3 and L_4? Yes: Ann must rank L_4 over L_3 (i.e., L_4 ≻ L_3). The reason is that L_3 and L_4 are the same half-and-half mixtures: L_3 mixes L_1 with [o_2 : 1] in equal proportions and L_4 mixes L_2 with [o_2 : 1] in equal proportions, so EU(L_3) = 1/2 EU(L_1) + 1/2 u(o_2) and EU(L_4) = 1/2 EU(L_2) + 1/2 u(o_2), and any expected utility ranking that puts L_2 above L_1 must put L_4 above L_3.
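
A hypothetical numerical check of this implication, reusing the expected_utility sketch above: for many randomly drawn utility assignments to o_1, o_2, o_3, whenever EU(L_2) > EU(L_1) it also holds that EU(L_4) > EU(L_3).

```python
import random

def expected_utility(lottery, u):
    return sum(p * u[x] for x, p in lottery)

L1 = [("o1", 0.4), ("o2", 0.6)]
L2 = [("o3", 1.0)]
L3 = [("o1", 0.2), ("o2", 0.8)]
L4 = [("o3", 0.5), ("o2", 0.5)]

random.seed(0)
for _ in range(10_000):
    # Draw an arbitrary utility assignment to the three outcomes.
    u = {o: random.uniform(-10, 10) for o in ("o1", "o2", "o3")}
    if expected_utility(L2, u) > expected_utility(L1, u):
        assert expected_utility(L4, u) > expected_utility(L3, u)
print("In every sampled case, EU(L2) > EU(L1) implied EU(L4) > EU(L3).")
```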
