
The Source Coding Theorem Mathias Winther Madsen - PowerPoint PPT Presentation



  1. The Source Coding Theorem Mathias Winther Madsen mathias.winther@gmail.com Institute for Logic, Language, and Computation University of Amsterdam March 2015

  2. The Convergence of Averages Problem Which of the following is more probable? 1. a total of 4,000 points in 1,000 dice rolls; 2. a total of 4,000,000 points in 1,000,000 dice rolls.
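
As a rough check of the comparison (not from the slides), the Python sketch below approximates both point probabilities with a normal (local central limit) approximation; the function name and the approximation itself are my own illustration.

```python
import math

def log10_point_prob(total, n):
    """Rough normal (local CLT) approximation to log10 Pr(sum of n fair dice = total).

    A single die has mean 3.5 and variance 35/12, so the sum of n rolls is
    approximately normal with mean 3.5*n and variance (35/12)*n.
    """
    mean = 3.5 * n
    var = (35.0 / 12.0) * n
    z = (total - mean) / math.sqrt(var)
    log_density = -0.5 * z * z - 0.5 * math.log(2.0 * math.pi * var)
    return log_density / math.log(10.0)

# Both events amount to an average of 4 points per roll, but the same
# deviation from the true mean (3.5) becomes vastly less probable as the
# number of rolls grows.
print(log10_point_prob(4_000, 1_000))          # roughly -21
print(log10_point_prob(4_000_000, 1_000_000))  # roughly -18,600
```

So the first event is overwhelmingly more probable, which is what the law of large numbers on the next slide formalizes.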

  3. The Convergence of Averages The Weak Law of Large Numbers For every ε > 0 and α > 0 there is a t such that for all n > t, $\Pr\left(\left|\frac{\sum_{i=1}^{n} X_i}{n} - E[X]\right| > \varepsilon\right) \leq \alpha$. [Figure: two plots over 0-20 rolls illustrating the convergence of averages]
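
For illustration (not part of the slides), a minimal Python simulation of this statement for fair dice, where E[X] = 3.5; the sample sizes, ε, and trial count are arbitrary choices.

```python
import random

def deviation_probability(n, epsilon=0.25, trials=2000, seed=0):
    """Estimate Pr(|average of n fair dice rolls - 3.5| > epsilon) by simulation."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        average = sum(rng.randint(1, 6) for _ in range(n)) / n
        if abs(average - 3.5) > epsilon:
            exceed += 1
    return exceed / trials

# The estimated deviation probability shrinks towards zero as n grows,
# which is exactly what the weak law of large numbers guarantees.
for n in (10, 100, 1000):
    print(n, deviation_probability(n))
```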

  4. Sequence Probabilities Problem With the point probabilities

      x      t     s     e
      p(x)   .25   .50   .25

  given that we draw 10 letters from this distribution, 1. what is Pr(stetsesses)? 2. what is the most probable sequence?
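
A worked answer in Python (the numbers follow directly from the table above; the code itself is only an illustration):

```python
import math

p = {"t": 0.25, "s": 0.50, "e": 0.25}

def sequence_probability(seq):
    """Probability of drawing seq letter by letter, i.i.d. from p."""
    return math.prod(p[ch] for ch in seq)

# "stetsesses" has five s's, two t's and three e's:
print(sequence_probability("stetsesses"))  # 0.5**5 * 0.25**5 = 2**-15 ≈ 3.05e-05

# The single most probable length-10 sequence just repeats the most probable letter:
print(sequence_probability("s" * 10))      # 0.5**10 ≈ 9.77e-04
```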

  5. Sequence Probabilities [Figure: the probability plotted letter by letter over "stetsesses", on a linear scale from 0 to 1]

  6. Sequence Probabilities [Figure: the same probabilities on a logarithmic scale, from 10^0 down to 10^-6]

  7. Sequence Probabilities [Figure: logarithmic probability plotted letter by letter over "stetsesses", from 0 down to -20]

  8. Sequence Probabilities [Figure: logarithmic probability per letter of "stetsesses", from 0 down to -3]

  9. Typical Sequences Definition The entropy of a random variable X is $H = E\left[\log \frac{1}{p(X)}\right] = -E[\log p(X)]$. Definition An ε-typical sequence of length n is a sequence for which $\left|\log \frac{1}{p(x_1, x_2, \ldots, x_n)} - Hn\right| < \varepsilon$.
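
Using the three-letter distribution from slide 4, a small Python check of both definitions (the ε-typicality test below assumes the condition exactly as stated above; the code is an illustration, not part of the slides):

```python
import math

p = {"t": 0.25, "s": 0.50, "e": 0.25}

def entropy_bits(dist):
    """Entropy H = -E[log2 p(X)] of a discrete distribution, in bits."""
    return -sum(q * math.log2(q) for q in dist.values())

def is_epsilon_typical(seq, dist, eps):
    """Check |log2(1/p(x_1,...,x_n)) - H*n| < eps for an i.i.d. source."""
    log_inv_prob = -sum(math.log2(dist[ch]) for ch in seq)
    return abs(log_inv_prob - entropy_bits(dist) * len(seq)) < eps

print(entropy_bits(p))                           # 1.5 bits per letter
print(is_epsilon_typical("stetsesses", p, 0.1))  # True: -log2 p = 15 = H*10 exactly
print(is_epsilon_typical("tttttttttt", p, 0.1))  # False: -log2 p = 20, far from 15
```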

  10. Typical Sequences The Asymptotic Equipartition Property Eventually, everything has the same probability. The Source Coding Theorem For large n, there are only $2^{Hn}$ sequences worth caring about.
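
To make "$2^{Hn}$ sequences worth caring about" concrete, the sketch below exhaustively counts ε-typical sequences of length 10 from the same three-letter source (an illustration under the distribution of slide 4; the choice ε = 1.5 is arbitrary):

```python
import itertools
import math

p = {"t": 0.25, "s": 0.50, "e": 0.25}
H = 1.5            # entropy of this source in bits per letter
n, eps = 10, 1.5

typical_count = 0
typical_mass = 0.0
for seq in itertools.product(p, repeat=n):      # all 3**10 = 59,049 sequences
    log_inv_prob = -sum(math.log2(p[ch]) for ch in seq)
    if abs(log_inv_prob - H * n) < eps:
        typical_count += 1
        typical_mass += 2.0 ** (-log_inv_prob)

print(int(2 ** (H * n)))            # 2^(Hn) = 32,768: the rough size of the typical set
print(typical_count, typical_mass)  # the eps-typical sequences and their total
                                    # probability mass (which tends to 1 as n grows)
```

At n = 10 the typical set still misses a fair share of the probability mass; the theorem's guarantee only takes hold for large n.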

  11. Typical Sequences
