

  1. 15-251 Great Theoretical Ideas in Computer Science Lecture 21: Introduction to Randomness and Probability Theory Review April 4th, 2017

  2. Randomness and the Universe

  3. Randomness and the Universe Does the universe have “true” randomness? Newtonian physics: The universe evolves deterministically. Quantum physics: Wrong!

  4. Randomness and the Universe Does the universe have “true” randomness? “God does not play dice with the world.” - Albert Einstein “Einstein, don’t tell God what to do.” - Niels Bohr

  5. Randomness is an essential tool in modeling and analyzing nature. It also plays a key role in computer science.

  6. Randomness and Computer Science

  7. Statistics via Sampling Population: 300m. Random sample size: 2000. Theorem: With more than 99% probability, % in sample = % in population ± 2%.
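The sampling claim can be checked empirically. A minimal Monte Carlo sketch (the true fraction 0.42, the seed, and the trial count are arbitrary choices for illustration, not from the slides):

```python
import random

def sample_fraction(p, n, rng):
    """Fraction of n randomly sampled individuals that 'agree',
    when a p-fraction of the whole population does."""
    return sum(rng.random() < p for _ in range(n)) / n

rng = random.Random(251)   # fixed seed for reproducibility (arbitrary)
p_true = 0.42              # hypothetical population fraction
trials = 500
within = sum(abs(sample_fraction(p_true, 2000, rng) - p_true) <= 0.02
             for _ in range(trials))
hit_rate = within / trials   # fraction of samples landing within ±2%
print(hit_rate)
```

With n = 2000 the ±2% window is roughly 1.8 standard deviations of the sample fraction, so the estimate lands inside it the vast majority of the time; the exact constant depends on the confidence level chosen.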

  8. Randomized Algorithms Dimer Problem: Given a region, in how many different ways can you tile it with 2x1 rectangles (dominoes)? [Slide figures: a sample region with 1024 tilings; tilings of an m x n rectangle.] This captures thermodynamic properties of matter. Fast randomized algorithms can approximately count; no fast deterministic algorithm is known.
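For intuition about what is being counted, here is a standard exact dynamic program for small m x n boards (a sketch; this is not the randomized approximate counter the slide refers to, which is what scales to large regions):

```python
import functools

def count_tilings(rows, cols):
    """Count domino tilings of a rows x cols board (broken-profile DP)."""
    if (rows * cols) % 2:
        return 0   # odd area: no perfect tiling

    @functools.lru_cache(maxsize=None)
    def go(cell, mask):
        # Scan cells row-major; bit i of `mask` says whether cell + 1 + i
        # is already covered by a previously placed domino.
        if cell == rows * cols:
            return 1
        if mask & 1:                        # current cell already covered
            return go(cell + 1, mask >> 1)
        total = 0
        r, c = divmod(cell, cols)
        # horizontal domino covering (r, c) and (r, c+1)
        if c + 1 < cols and not (mask >> 1) & 1:
            total += go(cell + 1, (mask >> 1) | 1)
        # vertical domino covering (r, c) and (r+1, c)
        if r + 1 < rows:
            total += go(cell + 1, (mask >> 1) | (1 << (cols - 1)))
        return total

    return go(0, 0)

print(count_tilings(8, 8))   # 12988816 tilings of a chessboard
```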

  9. Distributed Computing Break symmetry with randomness. Many more examples in the field of distributed computing.

  10. Nash Equilibria in Games The Chicken Game, payoffs written (row player, column player):

                 Swerve     Straight
     Swerve      (1, 1)     (0, 2)
     Straight    (2, 0)     (-3, -3)

  Theorem [Nash]: Every game has a Nash Equilibrium provided players can pick a randomized strategy. Exercise: What is a NE for the game above?
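One way to work the exercise (a sketch using the payoff numbers above): besides the two pure equilibria where one player swerves and the other goes straight, there is a symmetric mixed equilibrium, found by choosing the swerve probability that makes each player indifferent between their two actions:

```python
from fractions import Fraction

# Row player's payoffs against each column action, from the matrix above.
a = 1    # Swerve   vs Swerve
b = 0    # Swerve   vs Straight
c = 2    # Straight vs Swerve
d = -3   # Straight vs Straight

# If the opponent swerves with probability p, indifference requires
#   a*p + b*(1-p) = c*p + d*(1-p),  i.e.  p = (d - b) / (a - b - c + d).
p_swerve = Fraction(d - b, a - b - c + d)
print(p_swerve)   # each player swerves with probability 3/4
```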

  11. Cryptography [Slide figure: an eavesdropping adversary.] Sent in the clear, the message “I will cut your throat” is read verbatim by the eavesdropper.

  12. Cryptography With encryption, the sender encrypts “I will cut your throat” to “loru23n8uladjkfb!#@”, the eavesdropper sees only “loru23n8uladjkfb!#@”, and the receiver decrypts it back. Shannon: A secret is as good as the amount of entropy/uncertainty/randomness in it.

  13. Error-Correcting Codes Alice sends “bit.ly/vrxUBN” to Bob over a noisy channel. Each symbol can be corrupted with a certain probability. How can Alice still get the message across?
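One simple answer, sketched here: add redundancy. The repetition code below (parameters chosen arbitrarily, and far weaker than the codes used in practice) repeats each bit and majority-votes at the receiver:

```python
import random

def encode(bits, r):
    """Repeat each bit r times."""
    return [b for b in bits for _ in range(r)]

def corrupt(bits, p, rng):
    """Flip each transmitted bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(coded, r):
    """Majority vote within each block of r copies."""
    return [int(2 * sum(coded[i:i + r]) > r) for i in range(0, len(coded), r)]

rng = random.Random(0)
msg = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(corrupt(encode(msg, 11), 0.05, rng), 11)
print(received == msg)
```

A block decodes wrongly only if a majority of its 11 copies flip, which at p = 0.05 is extremely unlikely; the price is an 11x blowup in message length, which better codes avoid.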

  14. Communication Complexity Want to check if the contents of two databases are exactly the same. How many bits need to be communicated?
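Far fewer bits than the databases themselves. A classic randomized answer, sketched here with toy 16-bit primes (a real protocol would use much larger ones): both sides exchange fingerprints of the data modulo shared random primes.

```python
import random

def is_prime(n):
    """Trial division; fine for the small primes used in this sketch."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def random_prime(rng, lo=2**15, hi=2**16):
    while True:
        cand = rng.randrange(lo, hi) | 1
        if is_prime(cand):
            return cand

def probably_equal(a, b, trials=20, seed=0):
    """Compare two byte strings by exchanging only short fingerprints
    (their values mod shared random primes), not the data itself.
    Equal inputs always pass; unequal inputs are caught w.h.p."""
    rng = random.Random(seed)     # in a real protocol, shared randomness
    x = int.from_bytes(a, "big")
    y = int.from_bytes(b, "big")
    for _ in range(trials):
        p = random_prime(rng)     # both sides draw the same prime
        if x % p != y % p:
            return False
    return True

print(probably_equal(b"same contents", b"same contents"))       # True
print(probably_equal(b"database copy A", b"database copy B"))   # False w.h.p.
```

If the contents differ, their difference has only a few prime factors of this size, so a random prime detects the mismatch with high probability; this is the idea behind O(log n)-bit randomized protocols for Equality.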

  15. Interactive Proofs Verifier (poly-time, skeptical) vs. Prover (omniscient, untrustworthy). Can I convince you that I have proved P ≠ NP without revealing any information about the proof?

  16. Quantum Computing

  17. Some Probability Puzzles (Test Your Intuition) and Origins of Probability Theory

  18. Origins of Probability Theory France, 1654. “Chevalier de Méré” (Antoine Gombaud): Let’s bet: I will roll a die four times. I win if I get a 1.

  19. Origins of Probability Theory France, 1654 Hmm. No one wants to take this bet anymore. :-( “Chevalier de Méré” Antoine Gombaud

  20. Origins of Probability Theory France, 1654 New bet: I will roll two dice, 24 times. I win if I get double-1’s. “Chevalier de Méré” Antoine Gombaud

  21. Origins of Probability Theory France, 1654 Hmm. I keep losing money! :-( “Chevalier de Méré” Antoine Gombaud
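Probability theory explains both experiences. A quick check in exact arithmetic:

```python
from fractions import Fraction

# Bet 1: at least one 1 in four rolls of a die.
p_bet1 = 1 - Fraction(5, 6) ** 4       # 671/1296, about 0.518: favorable

# Bet 2: at least one double-1 in 24 rolls of two dice.
p_bet2 = 1 - Fraction(35, 36) ** 24    # about 0.491: a losing bet

print(float(p_bet1), float(p_bet2))
```

The first bet wins slightly more often than it loses, so takers dried up; the second, despite looking comparable, is slightly worse than even, so de Méré kept losing money.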

  22. Origins of Probability Theory France, 1654. Alice and Bob are flipping a coin. Alice gets a point for heads. Bob gets a point for tails. First one to 4 points wins 100 Fr. Alice is ahead 3-2 when gendarmes arrive to break up the game. How should they divide the stakes? “Chevalier de Méré” Antoine Gombaud
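The Pascal-Fermat answer: split the stakes in proportion to each player's probability of going on to win. A short recursive computation (a sketch; "first to 4, Alice leads 3-2" means Alice needs 1 more point and Bob needs 2):

```python
from fractions import Fraction

def p_alice(a_need, b_need):
    """Probability Alice collects a_need points before Bob collects
    b_need, with a fair coin deciding each point."""
    if a_need == 0:
        return Fraction(1)
    if b_need == 0:
        return Fraction(0)
    return (p_alice(a_need - 1, b_need) + p_alice(a_need, b_need - 1)) / 2

share = p_alice(1, 2)   # Alice needs 1 point, Bob needs 2
print(share, "->", float(share) * 100, "Fr for Alice")   # 3/4 -> 75 Fr
```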

  23. Origins of Probability Theory Pascal Fermat Probability Theory is born!

  24. Probability Theory: The CS Approach

  25. The Big Picture The Non-CS Approach: a Real World (random) experiment/process is mapped to a Mathematical Model (a probability space).

  26. The Big Picture Real World: flip a coin. Mathematical Model: Ω = {H, T} with Pr[H] = Pr[T] = 1/2. Ω is the “sample space”, the set of all possible outcomes. Pr : Ω → [0, 1] is a probability distribution, with Σ_{ℓ ∈ Ω} Pr[ℓ] = 1.

  27. The Big Picture Real World: flip a coin. Mathematical Model: a unit pie of area 1, with H and T as slices. Pr[outcome] = area of outcome / area of pie = area of outcome.

  28. The Big Picture Real World: flip two coins. Mathematical Model: Ω = {HH, HT, TH, TT}, each outcome with probability 1/4.

  29. The Big Picture Real World: flip a coin; if it is Heads, throw a 3-sided die; if it is Tails, throw a 4-sided die. Mathematical Model: ? (What are Ω and Pr here?)

  30. The Big Picture The CS Approach: Real World -> Code -> Probability Tree = Mathematical Model.

  31. The Big Picture Real World: flip a coin; if it is Heads, throw a 3-sided die; if it is Tails, throw a 4-sided die. Code:

      flip <— Bernoulli(1/2)
      if flip = H:    # i.e. Heads
          die <— RandInt(3)
      else:
          die <— RandInt(4)

  32. Probability Tree

      flip <— Bernoulli(1/2)
      if flip = H:
          die <— RandInt(3)
      else:
          die <— RandInt(4)

  The tree: Bernoulli(1/2) branches to H and T, each with probability 1/2. Under H, RandInt(3) branches to 1, 2, 3, each with probability 1/3. Under T, RandInt(4) branches to 1, 2, 3, 4, each with probability 1/4.

  Outcomes: (H,1) (H,2) (H,3) (T,1) (T,2) (T,3) (T,4)
  Prob:      1/6   1/6   1/6   1/8   1/8   1/8   1/8
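The pseudocode translates directly into a program that enumerates every leaf of the tree, as a sanity check (Python used here for illustration):

```python
from fractions import Fraction

# Enumerate every leaf of the coin-then-die probability tree.
outcomes = {}
for flip, p_flip in (("H", Fraction(1, 2)), ("T", Fraction(1, 2))):
    sides = 3 if flip == "H" else 4      # 3-sided die on Heads, else 4-sided
    for die in range(1, sides + 1):
        outcomes[(flip, die)] = p_flip * Fraction(1, sides)

print(outcomes[("H", 1)], outcomes[("T", 1)])   # 1/6 1/8
print(sum(outcomes.values()))                   # 1: a valid distribution
```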

  33. Events Real World: flip a coin; Heads -> 3-sided die, Tails -> 4-sided die (code and tree as above). What is the probability the die roll is ≥ 3? An “event” is a subset of outcomes/leaves.

  34. Events In the tree above, the event E = “die roll is 3 or higher” consists of the leaves (H,3), (T,3), (T,4). Extend Pr to subsets of outcomes: Pr : P(Ω) → [0, 1]. Then Pr[E] = 1/6 + 1/8 + 1/8 = 5/12.

  35. Conditional Probability Real World: flip a coin; Heads -> 3-sided die, Tails -> 4-sided die (code as above). What is the probability of flipping Heads given the die roll is ≥ 3? This is conditional probability: conditioning on partial information.

  36. Conditional Probability Revising probabilities based on ‘partial information’. ‘Partial information’ = an event E. Conditioning on E = assuming/promising that E has happened.

  37. Conditional Probability In the tree above, with E = “die roll is 3 or higher”:

  Outcomes:  (H,1) (H,2) (H,3) (T,1) (T,2) (T,3) (T,4)
  Prob:       1/6   1/6   1/6   1/8   1/8   1/8   1/8
  Prob | E:    0     0    2/5    0     0    3/10  3/10

  Pr[(H,1) | E] = 0, Pr[(H,3) | E] = 2/5.

  38. Conditional Probability With the same table, let E = “die roll is 3 or higher” and A = “Tails was flipped”. Then Pr[A | E] = 3/10 + 3/10 = 3/5.
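These conditional values can be recomputed mechanically from the leaf probabilities (a small sketch in Python):

```python
from fractions import Fraction

# Leaf probabilities of the coin-then-die tree.
pr = {("H", d): Fraction(1, 6) for d in (1, 2, 3)}
pr.update({("T", d): Fraction(1, 8) for d in (1, 2, 3, 4)})

def prob(event):
    return sum(pr[o] for o in event)

E = {o for o in pr if o[1] >= 3}     # die roll is 3 or higher
A = {o for o in pr if o[0] == "T"}   # Tails was flipped

pr_E = prob(E)                       # 5/12
# Condition on E: zero outside E, rescale inside E.
cond = {o: (pr[o] / pr_E if o in E else Fraction(0)) for o in pr}
print(cond[("H", 3)])                # 2/5
print(prob(A & E) / pr_E)            # Pr[A | E] = 3/5
```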

  39. Conditioning [Slide figure: Ω with the event E highlighted.] Restricting to E turns Pr : Ω → [0, 1] into Pr_E : E → [0, 1]: outcomes outside E are discarded, and those inside E are rescaled, e.g. (H,3): 1/6 -> 2/5, and (T,3), (T,4): 1/8 -> 3/10 each.

  40. Conditioning Pr : Ω → [0, 1] restricts to Pr_E : E → [0, 1]. Definition: Pr[ℓ | E] = Pr_E[ℓ] = 0 if ℓ ∉ E, and Pr[ℓ] / Pr[E] if ℓ ∈ E.

  41. Conditioning For an event A: Pr[A | E] = Pr[A ∩ E] / Pr[E]. (We cannot condition on an event with probability 0.)

  42. Conditional Probability —> Chain Rule Pr[A ∩ B] = Pr[A] · Pr[B | A]. “For A and B to occur: first A must occur (probability Pr[A]), then B must occur given that A occurred (probability Pr[B | A]).” Generalizes to more than two events, e.g. Pr[A ∩ B ∩ C] = Pr[A] · Pr[B | A] · Pr[C | A ∩ B].
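A standard worked example of the chain rule (not from the slides): drawing two cards from a deck without replacement.

```python
from fractions import Fraction

# A = first card is an ace, B = second card is an ace.
pr_A = Fraction(4, 52)            # 4 aces among 52 cards
pr_B_given_A = Fraction(3, 51)    # one ace already gone, 51 cards left
pr_both = pr_A * pr_B_given_A     # chain rule: Pr[A ∩ B] = Pr[A] · Pr[B | A]
print(pr_both)                    # 1/221
```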

  43. Conditional Probability —> LTP LTP = Law of Total Probability. [Slide figure: Ω split into A and A^c, with E straddling both.] Pr[E] = Pr[E ∩ A] + Pr[E ∩ A^c] = Pr[A] · Pr[E | A] + Pr[A^c] · Pr[E | A^c].

  44. Conditional Probability —> LTP If A_1, A_2, . . . , A_n partition Ω, then Pr[E] = Pr[A_1] · Pr[E | A_1] + Pr[A_2] · Pr[E | A_2] + · · · + Pr[A_n] · Pr[E | A_n].
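The LTP recovers the earlier Pr[E] = 5/12 from the coin-then-die example, partitioning on the coin flip:

```python
from fractions import Fraction

# E = "die roll >= 3"; partition by the flip: A1 = Heads, A2 = Tails.
pr_A1 = pr_A2 = Fraction(1, 2)
pr_E_given_A1 = Fraction(1, 3)    # only the roll 3, out of {1, 2, 3}
pr_E_given_A2 = Fraction(2, 4)    # rolls 3 and 4, out of {1, 2, 3, 4}

pr_E = pr_A1 * pr_E_given_A1 + pr_A2 * pr_E_given_A2
print(pr_E)   # 5/12, matching the direct sum over outcomes
```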

  45. Conditional Probability —> Independence Two events A and B are independent if Pr[A | B] = Pr[A]. This is equivalent to Pr[B | A] = Pr[B], and also to Pr[A ∩ B] = Pr[A] · Pr[B] (except that this last equality can be used even when Pr[A] = 0 or Pr[B] = 0). So this product form is actually used as the definition of independence.
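A quick check of the product definition in the two-coin model from earlier (each of HH, HT, TH, TT has probability 1/4):

```python
from fractions import Fraction

pr = {o: Fraction(1, 4) for o in ("HH", "HT", "TH", "TT")}

def prob(event):
    return sum(pr[o] for o in event)

A = {o for o in pr if o[0] == "H"}   # first flip is Heads
B = {o for o in pr if o[1] == "H"}   # second flip is Heads
C = {o for o in pr if "H" in o}      # at least one Heads

print(prob(A & B) == prob(A) * prob(B))   # True: A and B are independent
print(prob(A & C) == prob(A) * prob(C))   # False: A and C are not
```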

  46. Problem with Independence Definition We want to calculate Pr[A ∩ B]. If A and B are independent, we can use Pr[A ∩ B] = Pr[A] · Pr[B] (but to show independence, we would need to establish this very equality). So in practice we argue independence informally: if B happens, this cannot affect the probability of A happening. Then we use Pr[A ∩ B] = Pr[A] · Pr[B].

  47. Problem with Independence Definition Real World: some informal notion of independence of A and B. Mathematical Model: Pr[A ∩ B] = Pr[A] · Pr[B] (the secret definition of independence). Problem: the real-world description is not always very rigorous.
