Anthropic Decision Theory: I think, I am, therefore I am


  1. Anthropic Decision Theory. I think, I am, therefore I am... therefore I do? Why anthropic decisions make sense, but anthropic probabilities don’t.

  2. Anthropic questions: humanity on Earth implies... what about the universe?

  3.–4. Sleeping Beauty I. On Sunday, Beauty is put to sleep and a coin is tossed. Heads: she is woken on Monday only. Tails: she is woken on Monday and again on Tuesday, with amnesia in between, so the awakenings are indistinguishable. Upon awakening, what is the probability of Heads? Of Monday?
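
Both standard answers can be read off a simulation, depending on what you count. A minimal Monte Carlo sketch (my own encoding of the standard protocol, not from the slides):

```python
import random

def run(trials=100_000):
    """Simulate the Sleeping Beauty protocol and tally per awakening."""
    awakenings = []  # (coin, day) for every awakening across all trials
    for _ in range(trials):
        coin = random.choice(["heads", "tails"])
        awakenings.append((coin, "Monday"))       # Beauty wakes Monday either way
        if coin == "tails":
            awakenings.append((coin, "Tuesday"))  # tails adds a Tuesday awakening

    n = len(awakenings)
    print("per-awakening P(heads) :", sum(c == "heads" for c, _ in awakenings) / n)   # ~1/3
    print("per-awakening P(Monday):", sum(d == "Monday" for _, d in awakenings) / n)  # ~2/3
    # Counting per experiment instead, heads occurs in ~1/2 of trials: the halfer tally.

run()
```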

  5. Sleeping Beauty II: the incubator. Heads: an observer is created in Room 1 only. Tails: observers are created in both Room 1 and Room 2. Upon awakening, what is the probability of Heads? Of Room 1?

  6.–10. Standard resolutions: probability.
  • Halfer position: 1/2 on heads. Those are the initial odds, and you learn nothing new: no update.
  • Thirder position: 1/3 on heads. Because “(Monday, heads)”, “(Monday, tails)” and “(Tuesday, tails)” are indistinguishable, and the ruled-out “(Tuesday, heads)” must tell you something.

  11.–13. Standard resolutions: probability.
  • Halfer position: 1/2 on heads. Self-Sampling Assumption (SSA): an observer is randomly selected from the set of all actually existent observers in their reference class.
  • Thirder position: 1/3 on heads. Self-Indication Assumption (SIA): an observer is randomly selected from the set of all possible observers.
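
Read as recipes, the two assumptions turn the same world-priors into different observer-credences. A sketch of that bookkeeping for Sleeping Beauty (my own encoding, not from the slides):

```python
from fractions import Fraction

# Worlds: prior probability and the observer-moments they contain.
worlds = {
    "heads": (Fraction(1, 2), ["Monday"]),
    "tails": (Fraction(1, 2), ["Monday", "Tuesday"]),
}

def ssa(worlds):
    """SSA: keep each world's prior, split it among that world's observers."""
    return {(w, obs): p / len(obss)
            for w, (p, obss) in worlds.items() for obs in obss}

def sia(worlds):
    """SIA: weight every possible observer by its world's prior, renormalise."""
    raw = {(w, obs): p for w, (p, obss) in worlds.items() for obs in obss}
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

for name, dist in [("SSA", ssa(worlds)), ("SIA", sia(worlds))]:
    for (world, obs), p in dist.items():
        print(f"{name} P({world}, {obs}) = {p}")
# SSA: heads-Monday 1/2, tails-Monday 1/4, tails-Tuesday 1/4 -> P(heads) = 1/2.
# SIA: all three observer-moments 1/3 each -> P(heads) = 1/3.
```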

  14.–17. Adam and Eve paradox: SSA prefers small universes (present and future). If Adam and Eve reason by SSA, they can be near-certain that their actions will not found a large human race, since among billions of descendants they would be astonishingly early observers.

  18. Doomsday argument: SSA prefers small universes (present and future). Conditioning on your birth rank, SSA shifts credence towards hypotheses with fewer total observers – towards earlier doom.
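
As a worked example of that shift (illustrative numbers, not from the talk): with equal priors over a short and a long human future, an SSA reasoner who knows their birth rank updates hard towards the short one.

```python
from fractions import Fraction

# Illustrative hypotheses about the total number of humans ever born,
# with equal priors; the rank is roughly the number born so far.
hypotheses = {"doom_soon": 2 * 10**11, "doom_late": 2 * 10**14}
prior = Fraction(1, 2)
rank = 6 * 10**10

# SSA likelihood of this birth rank in a world of N observers: 1/N (0 if rank > N).
raw = {h: prior * (Fraction(1, n) if rank <= n else Fraction(0))
       for h, n in hypotheses.items()}
total = sum(raw.values())
for h, p in raw.items():
    print(h, float(p / total))  # doom_soon ~ 0.999, doom_late ~ 0.001
```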

  19.–25. Presumptuous philosopher: SIA prefers large universes (present, not future). The physicists are stuck on Λ = ?, torn between two theories, one of which predicts a universe a trillion times bigger. The presumptuous philosopher: “I know!!! I’ll bet you at odds of a trillion to one on the trillion times bigger universe. You can’t produce enough evidence to change my mind.”
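
The philosopher’s trillion-to-one odds fall straight out of the SIA weighting; a sketch, with an assumed two-theory encoding:

```python
from fractions import Fraction

# Two candidate theories with equal priors; T2's universe contains a
# trillion times more observers. (Illustrative encoding, not from the talk.)
prior = {"T1": Fraction(1, 2), "T2": Fraction(1, 2)}
observers = {"T1": 1, "T2": 10**12}  # relative observer counts

# SIA: weight each theory by how many observers it contains, renormalise.
raw = {t: prior[t] * observers[t] for t in prior}
total = sum(raw.values())
posterior = {t: raw[t] / total for t in prior}
print(posterior["T2"] / posterior["T1"])  # 10**12: the trillion-to-one bet
```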

  26.–38. Is anthropics the problem? Psy-Kosh’s non-anthropic problem. Two rooms, one ordinary occupant in each – no copying or amnesia, so nothing anthropic. A coin is tossed. Heads: one occupant, chosen at random, is the sole decider, and there is a gain if she guesses heads. Tails: both occupants are deciders, and there is a gain if both guess tails. The same puzzles reappear as questions about decisions and values:
  • “If I say tails, she says...” – Evidential Decision Theory vs Causal Decision Theory.
  • “How much do I care about her, anyway?” – Altruistic vs Selfish (precommit?).
  • “Do I do this, or did we do it together?” – Total responsibility vs Partial responsibility.
  The answers line up with the two anthropic assumptions:
  • Evidential Decision Theory + Altruistic + Total responsibility → SIA.
  • Causal Decision Theory + Selfish (precommit?) + Partial responsibility → SSA.
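
A hedged sketch of why the problem bites, with unit payoffs assumed (the slides give none): a precommitted policy sees a tie between the two guesses, while a decider who naively updates on being a decider favours tails two to one.

```python
from fractions import Fraction

half = Fraction(1, 2)

# Ex-ante (Sunday / precommitted) value of each uniform policy:
# "heads" wins on heads, "tails" wins on tails -> 1/2 each, a tie.
ev_policy = {"heads": half, "tails": half}

# A decider updates: P(I am a decider | heads) = 1/2, P(... | tails) = 1,
# so P(tails | I am a decider) = 2/3.
p_tails = (half * 1) / (half * 1 + half * half)

# EDT-style in-room value, assuming all deciders make the same guess:
ev_updated = {"heads": 1 - p_tails, "tails": p_tails}  # 1/3 vs 2/3

print("precommitted:", ev_policy)   # no reason to favour tails...
print("updated     :", ev_updated)  # ...but the in-room decider favours it
```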

  39.–40. Anthropic probabilities don’t really make sense. Frequentism: over many repetitions, which count matters – how many times were you right (SIA)? vs how many experiments were you right in (SSA)?
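
One way to make the two frequentist tallies concrete (my encoding, standard protocol assumed): always guess tails, then score yourself both ways.

```python
import random

def score(trials=100_000, guess="tails"):
    """Always guess `guess`; score by the two frequentist conventions."""
    right_awakenings = total_awakenings = right_experiments = 0
    for _ in range(trials):
        coin = random.choice(["heads", "tails"])
        n_awake = 1 if coin == "heads" else 2
        total_awakenings += n_awake
        if coin == guess:
            right_awakenings += n_awake   # SIA-style: count every awakening
            right_experiments += 1        # SSA-style: count the experiment once
    print("right per awakening :", right_awakenings / total_awakenings)  # ~2/3
    print("right per experiment:", right_experiments / trials)           # ~1/2

score()
```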

  41.–42. Anthropic probabilities don’t really make sense. Bayesianism: are you uncertain about the world with you in it (SSA)? vs uncertain about you in the world (SIA)?

  43. Anthropic probabilities don’t really make sense. Subjective credences and expectations: these were forged by evolution in non-anthropic situations.

  44. The morals of the talk: the Sleeping Beauty problem is underdefined – we need Beauty’s values. Even without anthropic probabilities, we can still reach the right decision.

  45.–49. Decisions and values, not probabilities. Upon each awakening, Beauty is offered a coupon at £X that pays £1 if the coin was tails. The payoffs: on heads, one awakening at −X; on tails, two awakenings at 1 − X each. What would Sunday Beauty want? If all cash goes towards a “cause”, the expected total is 0.5(−X) + 0.5 · 2(1 − X), so buy whenever X < £2/3. Axiom 1: precommitments are possible.
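
A quick check of that arithmetic (my own encoding):

```python
from fractions import Fraction

def expected_total(x):
    """Expected cash for the 'cause' if Beauty always buys at price x:
    heads -> one awakening, coupon void:  0.5 * (-x)
    tails -> two awakenings, each pays 1: 0.5 * 2 * (1 - x)"""
    return Fraction(1, 2) * (-x) + Fraction(1, 2) * 2 * (1 - x)

print(expected_total(Fraction(2, 3)))  # 0: break-even exactly at X = 2/3
print(expected_total(Fraction(1, 2)))  # 1/4 > 0: buy
print(expected_total(Fraction(3, 4)))  # -1/8 < 0: refuse
```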
