

  1. The Paradoxes of Confirmation

  2. Suppose we give up counting how often we see an event as the measure of its probability and instead move to looking at how expected an event is. At the start of the reading for today, John Leslie says the following: "Prima facie, we should prefer theories which make our observations fairly much to be expected, rather than highly extraordinary. Waking up in the night, you form two theories. Each has a half chance of being right, you estimate. The first, that you left the back door open, gives the chances as 10 per cent that the neighbour's cat is in your bedroom. The second, that you shut the door, puts those chances at 0.01 per cent. You switch on the light and see the cat. You should now much prefer the first theory." Here we have what seems to be a reasonable standard of evidence relative to what would be expected on a given theory (henceforth we will use P(H | E) to mean the probability of H given E): E is evidence for T1 over T2 if P(E | T1) > P(E | T2)

  3. E is evidence for T1 over T2 if P(E | T1) > P(E | T2) • Suppose there is a giant bowl holding some number of marbles, numbered in order starting with 1. A marble is drawn at random and it reads "3". Should we then favor a theory on which the bowl contains 10,000 marbles, or one on which it contains 10 marbles? • It seems obvious we should favor the theory that there are 10 marbles in the bowl, because if there were 10 marbles, there would be a 10% chance of getting "3" at random, whereas there would be only a 0.01% chance of getting "3" at random given 10,000 marbles. • Our standard above coincides with this; we should take the "3" marble as evidence in favor of there being 10 marbles, but it leaves a lot of other questions unanswered: ◮ How strong is this evidence? ◮ How strongly should we bet that there are only 10 marbles? ◮ If we thought that the giant container made the 10,000-marble hypothesis much more likely, is the "3" marble enough to overcome this initial intuition?
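The likelihood comparison on this slide can be sketched in a few lines of Python (the function name is illustrative, not from the reading):

```python
# Hypothetical sketch: probability of drawing the marble labeled "3"
# from a bowl of n marbles labeled 1..n, drawn uniformly at random.
def p_draw_3(n_marbles):
    return 1 / n_marbles

# 10% chance with 10 marbles, 0.01% with 10,000:
assert p_draw_3(10) == 0.1
assert p_draw_3(10_000) == 0.0001
# By the standard above, "3" is evidence for 10 marbles over 10,000:
assert p_draw_3(10) > p_draw_3(10_000)
```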

  4. Bayes' Theorem The standard answer to how we should update probabilities in light of evidence comes from the 18th-century mathematician Thomas Bayes. We start from the definition of conditional probability, P(A | B) = P(A & B) / P(B): the conditional probability of A given B is the probability of both of them occurring divided by the probability of B occurring. Bayes' Theorem: P(A | B) = P(B | A) · P(A) / P(B)
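Bayes' theorem is easy to express directly in code; here is a minimal sketch in Python (the function and argument names are our own, not from the reading):

```python
def bayes(p_e_given_h, p_h, p_e):
    # Posterior P(H | E) = P(E | H) * P(H) / P(E)
    return p_e_given_h * p_h / p_e

# Round-number sanity check: P(E|H)=0.5, P(H)=0.4, P(E)=0.25
# gives P(H|E) = 0.5 * 0.4 / 0.25 = 0.8
assert abs(bayes(0.5, 0.4, 0.25) - 0.8) < 1e-12
```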

  5. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) • One can think of this as saying that the probability of a hypothesis (H) given some evidence (E) depends on the probability of the evidence given the hypothesis, the prior probability of the hypothesis, and the prior probability of the evidence. • Often, it is easier to figure out the probability of the evidence given a hypothesis than the probability of the hypothesis given the evidence.

  6. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) Consider the marbles again. It is difficult to know directly what probabilities we should assign to the two hypotheses given that a "3" came out, but we can calculate them with Bayes' theorem. If H is the hypothesis that there are 10 marbles in the bowl, then P(E | H) = 0.1. If we suppose that each hypothesis is equally likely, then P(H) = 0.5. If each hypothesis has chance 0.5, then we can calculate the chance that "3" came out as P(E) = 0.5(0.1) + 0.5(0.0001) = 0.05005. Using Bayes' theorem, the chance that there are only 10 marbles in the bowl given that "3" came out is 0.1(0.5)/0.05005, which equals 0.999. This means that if you thought each hypothesis was equally probable, the fact that "3" came out should make you change your mind to thinking it is 99.9% likely that there are only 10 marbles in the bowl.
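The slide's arithmetic can be checked with a short Python sketch (variable names are illustrative):

```python
# Equal priors on the two hypotheses: 10 marbles vs. 10,000 marbles.
p_e_if_10 = 0.1        # P("3" | 10 marbles)
p_e_if_10000 = 0.0001  # P("3" | 10,000 marbles)
prior_10 = 0.5

# Total probability that a "3" comes out:
p_e = prior_10 * p_e_if_10 + (1 - prior_10) * p_e_if_10000  # 0.05005

# Bayes' theorem: posterior probability of the 10-marble hypothesis.
posterior_10 = p_e_if_10 * prior_10 / p_e  # about 0.999
```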

  7. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) Suppose on the other hand that you thought the fact that there was a giant bowl made the 10,000-marble hypothesis more likely. Let's say it has a 95% chance of being right. Given that H is still the claim that there are 10 marbles, this will not change P(E | H); what it will change is P(H) and P(E). P(H) becomes 0.05. P(E) becomes 0.05(0.1) + 0.95(0.0001) = 0.005095. Thus our new P(H | E) = 0.1(0.05)/0.005095 ≈ 0.981. So even if you were 95% sure that there were more than 10 marbles initially, upon seeing the "3" you should now be 98% sure that there are only 10 marbles. Using the same reasoning, if you were 99.9% sure that there were more than 10 marbles, you should now be almost exactly 50/50 on which hypothesis is right.
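Generalizing the last two slides, a small helper function (ours, not from the reading) shows how the posterior moves with the prior:

```python
def posterior_10_marbles(prior_10):
    # Posterior for the 10-marble hypothesis after a "3" is drawn,
    # for any prior on 10 marbles (the rest goes to 10,000 marbles).
    p_e_if_10, p_e_if_10000 = 0.1, 0.0001
    p_e = prior_10 * p_e_if_10 + (1 - prior_10) * p_e_if_10000
    return p_e_if_10 * prior_10 / p_e

assert abs(posterior_10_marbles(0.5) - 0.999) < 0.001    # equal priors
assert abs(posterior_10_marbles(0.05) - 0.981) < 0.001   # 95% sure of 10,000
assert abs(posterior_10_marbles(0.001) - 0.5) < 0.001    # 99.9% sure: ~50/50
```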

  8. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) • Bayesian probabilities are psychological probabilities; they are the degrees of confidence we assign to different claims being true. • It is sometimes helpful to think of them in terms of bets: the Bayesian probability is the rate at which you should accept a bet. If you refuse to follow these probabilities, you can be robbed of a lot of money in betting, which seems to be paradigmatically irrational. • The Bayesian understanding of probabilities also gives us a reason to reject the frequentist reasoning of the Ravens paradox. • Noticing the frequency of something can help us set initial probabilities, but it cannot by itself tell us exactly how to update our beliefs given some evidence.

  9. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) Leslie asks us to consider the following two hypotheses: Doom Soon: The human race will be eliminated by some great catastrophe by the year 2150, with the total number of humans having been born before then being 500 billion. Doom Delayed: The human race will survive at present population levels or higher for a few thousand centuries, with the total number of humans being 50 thousand billion. I then add in one other piece of information: I am one of the first 50 billion people to exist. This turns out to be drastic evidence. On DS, the chance of my being born among the first 50 billion people is 0.1, while on DD, the chance of my being born among the first 50 billion people is 0.001.

  10. Bayes' Theorem P(H | E) = P(E | H) · P(H) / P(E) P(me | DS) = 0.1; P(me | DD) = 0.001. Suppose that, despite growing evidence, we want to be optimists, and so we think the initial probability of DS is only 1%. On the current assumption that there are only two options regarding our doom, this makes the probability that I am born in the first 50 billion people 0.1(0.01) + 0.001(0.99) = 0.00199. Putting this together, we can conclude that P(DS | me) = 0.1(0.01)/0.00199 ≈ 0.503 (i.e., we should now slightly favor Doom Soon over Doom Delayed). The same reasoning says that if we were 50/50 initially, then upon reflecting on our place in the timeline, we should now in fact be over 99% sure that the human race will end by 2150. If we replace the eventual number of humans in DD with 50 million billion, then even if we assign a 1% initial chance to DS, we should now assign it a 99.9% chance.
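Leslie's Doomsday arithmetic follows exactly the same pattern as the marble case; a Python sketch (names ours):

```python
p_me_if_ds = 0.1    # P(first 50 billion | Doom Soon: 500 billion total)
p_me_if_dd = 0.001  # P(first 50 billion | Doom Delayed: 50,000 billion total)

def p_doom_soon(prior_ds):
    # Posterior on Doom Soon, given that I am among the first 50 billion.
    p_me = prior_ds * p_me_if_ds + (1 - prior_ds) * p_me_if_dd
    return p_me_if_ds * prior_ds / p_me

assert abs(p_doom_soon(0.01) - 0.503) < 0.001  # optimistic 1% prior
assert abs(p_doom_soon(0.5) - 0.990) < 0.001   # 50/50 prior: over 99% sure
```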

  11. • One reason to consider this a paradox is that the numbers seem unbelievable. • The bigger reason to consider this a paradox is the way we arrive at the conclusion. It seems crazy that whether or not we would successfully colonize other planets should determine whether we believe the human race will survive past 2150. • One way to object to the paradox is to say that it wrongly considers only two possible outcomes, rather than the infinitely many different possible outcomes. However, it is not clear that this helps. If we were open to any number of marbles in the bowl, and a "3" came out, how many should we think are in the bowl? • It seems more promising to challenge the analogy between my place in the birth order of humankind and a random drawing.

  12. Should we think of our place in the history of humankind as random? • What is the doomsday equivalent of the number "3" in the random drawing? • It is supposed to be my place in the birth order of humans, such that I am among the first 50 billion. • But if I am labeled "50 billionth human", this is only the case because of when I was born. • The marble equivalent would be writing numbers on each marble after it was drawn ("1" on the first, "2" on the second, and so on). • If we did this, the fact that the third marble drawn gets a "3" written on it counts as no evidence whatsoever about how many marbles there are. • I conclude that Bayesian reasoning stands, and that we need to be careful in applying it.
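The disanalogy on this slide can be made vivid with a small simulation (ours, not from the reading): if the numbers are written on marbles in the order they are drawn, the third marble drawn always reads "3" no matter how large the bowl is, so the observation cannot discriminate between the hypotheses.

```python
import random

def label_of_third_draw(n_marbles, rng):
    # Shuffle to get a random draw order, then write "1" on the first
    # marble drawn, "2" on the second, etc. (labels written AFTER drawing).
    draw_order = list(range(n_marbles))
    rng.shuffle(draw_order)
    labels = {marble: pos + 1 for pos, marble in enumerate(draw_order)}
    return labels[draw_order[2]]

rng = random.Random(0)
# The third draw is labeled "3" whether the bowl holds 10 or 10,000 marbles,
# so P(E | 10 marbles) = P(E | 10,000 marbles) = 1: no evidence either way.
assert all(label_of_third_draw(10, rng) == 3 for _ in range(100))
assert all(label_of_third_draw(10_000, rng) == 3 for _ in range(100))
```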

  13. Bonus Argument! David Hume famously offered the following argument against believing that a miracle has occurred: A miracle is a violation of the laws of nature; and as a firm and unalterable experience has established these laws, the proof against a miracle, from the very nature of the fact, is as entire as any argument from experience can possibly be imagined. ... The plain consequence is (and it is a general maxim worthy of our attention), ‘That no testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous, than the fact, which it endeavours to establish; This is a notoriously difficult argument to interpret, but on many interpretations, Hume is wrongly assuming here a frequentist understanding of the likelihood of an event. Given Bayesianism, it is easy for ordinary evidence to prove unlikely events occurred. Also notable is the fact that whether or not it is rational to believe a miracle claim will be relative to your background beliefs and the prior probabilities you assign to things. Theories of confirmation are incredibly interesting when applied to miracle claims.
