

  1. Combinatorial Characterization of Transducers with Bounded Variance. Sara Kropf, Alpen-Adria-Universität Klagenfurt. Joint work with Clemens Heuberger and Stephan Wagner. AofA, Strobl, June 12, 2015.

  2. Motivation. Theorem (Hwang's Quasi-Power Theorem). Let Ω_n be a sequence of real random variables. Suppose the moment generating function satisfies E(e^{Ω_n s}) = e^{u(s)Φ(n) + v(s)} (1 + O(κ_n^{-1})) under some conditions. Then E Ω_n = u'(0)Φ(n) + O(1) and V Ω_n = u''(0)Φ(n) + O(1). If σ² := u''(0) ≠ 0, then (Ω_n − E Ω_n)/√(V Ω_n) is asymptotically normally distributed. When is the variance bounded?

  3. Transducers. A transducer T has a finite number of states. [Diagram: two-state transducer with states 0 and 1; transitions are labeled input | output with labels 1|0, 0|1, 1|1, 0|0.]

  4. Transducers. Output(X_n) = sum of the output; the random word X_n ∈ A^n is the input; today: equidistribution on A^n; the input is read from right to left.

  12. Transducers. Example with X_n = 11001: reading the input 11001 from right to left, the output word is built up digit by digit as 1, 11, 011, 1011, 01011, 101011, so Output(11001) = 1 + 0 + 1 + 0 + 1 + 1 = 4.
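
To make the mechanics explicit, here is a small Python sketch (my illustration, not from the slides) of running a transducer on an input word read from right to left and summing its output. The exact transition table of the two-state example above is not fully recoverable from the diagram, so the demo uses the one-state digit-sum transducer from a later Applications slide instead.

def run_transducer(transitions, initial_state, word):
    """transitions: dict (state, input_letter) -> (next_state, output_letter)."""
    state, outputs = initial_state, []
    for letter in reversed(word):          # the input is read from right to left
        state, out = transitions[(state, letter)]
        outputs.append(out)
    return outputs

# Digit-sum transducer: one state, copies every input digit to the output.
digit_sum_T = {(0, 0): (0, 0), (0, 1): (0, 1)}

word = [1, 1, 0, 0, 1]                     # the input 11001 from the example
output = run_transducer(digit_sum_T, 0, word)
print(output, sum(output))                 # for this transducer, Output(11001) = 3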

  13. Other Probability Model and Several Outputs. All results are also possible for: inputs coming from a Markov chain; every transition carries a probability; the probabilities of the outgoing transitions of each state sum to 1. Some results are independent of the choice of this Markov chain. Several simultaneous outputs are also allowed. [Diagram: two-state probabilistic transducer with transitions labeled probability | output pair: 0.2|(0,3), 0.8|(1,9), 0.6|(1,7), 0.4|(0,1).]
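
A minimal Python sketch of this more general model (my illustration): inputs are generated by a Markov chain on the states and a two-dimensional output is accumulated. The probabilities and output pairs below are the labels from the diagram, but which states the transitions connect is an assumption.

import random

transitions = {
    "a": [(0.2, "a", (0, 3)), (0.8, "b", (1, 9))],   # outgoing probabilities sum to 1
    "b": [(0.6, "a", (1, 7)), (0.4, "b", (0, 1))],
}

def sample_output(n, start="a"):
    """Random walk of length n on the Markov chain, accumulating the 2-dim output."""
    state, total = start, (0, 0)
    for _ in range(n):
        options = transitions[state]
        _, state, out = random.choices(options, weights=[p for p, _, _ in options])[0]
        total = (total[0] + out[0], total[1] + out[1])
    return total

print(sample_output(1000))   # one sample of Output(X_1000) for this chain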

  15. Applications: algorithms with finite memory usage; many digit expansions (Hamming weight, sum-of-digits function, ...); many recursions; motifs; completely q-additive functions; digital sequences; q-automatic sequences.

  18. Applications. Digit sum of the binary expansion. [Diagram: one-state transducer with transitions 0|0 and 1|1.] Hamming weight of the non-adjacent form (NAF): digits {0, ±1}, base 2; at least one of any two adjacent digits is 0. [Diagram: transducer computing the Hamming weight of the NAF from the binary expansion.] Hamming weight of the width-w NAF: digits {0, ±1, ±3, ..., ±(2^{w−1} − 1)}, base 2; at least w − 1 of any w consecutive digits are 0. [Diagram: transducer computing the Hamming weight of the width-w NAF.]
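
The NAF itself is easy to compute with the standard digit recurrence; the following Python sketch (my illustration, not from the slides) computes the NAF of an integer and compares its Hamming weight with that of the binary expansion.

def naf(n):
    """Non-adjacent form of n: digits in {0, 1, -1}, least significant first,
    with no two adjacent nonzero digits."""
    digits = []
    while n != 0:
        if n % 2 == 1:
            d = 2 - (n % 4)        # 1 if n ≡ 1 (mod 4), -1 if n ≡ 3 (mod 4)
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def hamming_weight(digits):
    return sum(1 for d in digits if d != 0)

n = 7
print(naf(n))                                       # [-1, 0, 0, 1]
print(hamming_weight(naf(n)), bin(n).count("1"))    # NAF weight 2 vs. binary weight 3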

  19. Variability Condition. Recall Hwang's Quasi-Power Theorem: if E(e^{Ω_n s}) = e^{u(s)Φ(n) + v(s)} (1 + O(κ_n^{-1})) under some conditions, then E Ω_n = u'(0)Φ(n) + O(1), V Ω_n = u''(0)Φ(n) + O(1), and if σ² := u''(0) ≠ 0, then (Ω_n − E Ω_n)/√(V Ω_n) is asymptotically normally distributed. Assume that T is strongly connected. Then Output(X_n) satisfies all assumptions, except possibly the variability condition σ² ≠ 0.

  21. Bounded Variance. Theorem (Heuberger–K.–Wagner 2015). Let T be strongly connected. Then the following assertions are equivalent:
  (1) The asymptotic variance σ² is 0.
  (2) There is a constant k such that the average output of every cycle is k.
  (3) There is a constant k such that Output(X_n) = kn + O(1).
  Corollary (Heuberger–K.–Wagner 2015). Let T be strongly connected and aperiodic with output alphabet {0, 1}. Then the asymptotic variance σ² is 0 if and only if all output letters are the same.
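
Condition (2) can be tested mechanically on a small transducer by enumerating its simple cycles and comparing their average outputs. The following Python sketch is my own illustration of such a check, not code from the paper; transitions are given as (source_state, target_state, output).

from fractions import Fraction

def simple_cycles(edges):
    """All simple cycles of a small directed multigraph, each as a list of edges.
    Brute-force DFS; cycles are reported once per starting state, which does not
    affect the test below."""
    cycles = []

    def dfs(start, current, path, visited):
        for e in edges:
            src, dst, _ = e
            if src != current:
                continue
            if dst == start:
                cycles.append(path + [e])
            elif dst not in visited:
                dfs(start, dst, path + [e], visited | {dst})

    for s in {v for e in edges for v in e[:2]}:
        dfs(s, s, [], {s})
    return cycles

def cycle_averages(edges):
    """Set of average outputs over all simple cycles, as exact fractions."""
    return {Fraction(sum(out for _, _, out in c), len(c)) for c in simple_cycles(edges)}

# Digit-sum transducer (self-loops with outputs 0 and 1): averages 0 and 1 differ,
# so by the theorem the asymptotic variance is nonzero.
print(cycle_averages([(0, 0, 0), (0, 0, 1)]))

# A machine whose every cycle has average output 1/2: here sigma^2 = 0.
print(cycle_averages([(0, 1, 0), (0, 1, 0), (1, 0, 1), (1, 0, 1)]))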

  23. Small Example. [Diagram: a larger strongly connected transducer over the input alphabet {0, 1} with output letters in {0, 1, 2}.] ⇒ asymptotic variance ≠ 0. Sage: σ² = 432/2197.
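
The value 432/2197 was obtained with SageMath. The transitions of the transducer above are not fully recoverable from the slide, so the sketch below only shows the general pattern on the one-state digit-sum transducer, assuming that Sage's Transducer class and its asymptotic_moments method are available as in current SageMath versions (run inside Sage).

# Transitions are (from_state, to_state, input_letter, output_letter).
T = Transducer([(0, 0, 0, 0), (0, 0, 1, 1)],
               initial_states=[0], final_states=[0])
moments = T.asymptotic_moments()     # expectation, variance, covariance as functions of n
print(moments['expectation'])        # mean output, 1/2*n + O(1) for this machine
print(moments['variance'])           # 1/4*n + O(1), so sigma^2 != 0 here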

  26. Example: τ-adic Digit Expansion. Setting: an algebraic integer τ; joint expansions of d-dimensional vectors in Z[τ]^d; a redundant digit set D which satisfies D ∩ τZ[τ]^d = {0} and a subadditivity condition. Input: τ-adic expansions over the irredundant digit set A of length ≤ n, with equidistribution. Theorem (Heigl–Heuberger 2012). If the asymptotic variance σ² of the minimal Hamming weight with digit set D is ≠ 0, then the minimal Hamming weight is asymptotically normally distributed.

  29. Example: τ-adic Digit Expansion. Heigl–Heuberger construct a transducer for each τ and D. It has a cycle with average output 0 (the loop 0|0 at the initial state, see the diagram). But not all minimal weights are 0, and the input 0···0 always leads back to the initial state, so there is also a cycle with average output ≠ 0. Since not all cycles have the same average output, the variability condition σ² ≠ 0 is satisfied ⇒ asymptotic normality. [Diagram: transducer with a 0|0 loop at the initial state.]

  30. Bounded Variance. Theorem (Heuberger–K.–Wagner 2015). Let T be strongly connected. Then the following assertions are equivalent: (1) the asymptotic variance σ² is 0; (2) there is a constant k such that the average output of every cycle is k; (3) there is a constant k such that Output(X_n) = kn + O(1).

  32. Idea of the Proof of the Theorem, 1 ⇔ 2. Assume that the asymptotic expected value of Output(X_n) is 0. Probability generating function:
  A(y, z) = Σ_{n ≥ 0} Σ_{l ∈ R} a_{ln} K^{−n} y^l z^n
  with K = |A| and a_{ln} = number of input words of length n with output sum l. A(1, z) has a simple dominant pole at z = 1. Then
  E(Output(X_n)) = [z^n] A_y(1, z) = O(1),
  V(Output(X_n)) = [z^n] A_{yy}(1, z) + O(1).
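
To make the generating-function step concrete, here is a sympy sketch (my own illustration on the one-state digit-sum transducer, which is not mean-zero normalized, so its expected value is n/2 rather than O(1)): A(y, z) is built from the transfer matrix and moments of Output(X_n) are read off from the coefficients of A_y(1, z) and A_yy(1, z).

import sympy as sp

y, z = sp.symbols('y z')

# Transfer matrix: entry (i, j) sums K^{-1} * y^{output} over transitions i -> j.
# Here K = 2 and the single state has loops with outputs 0 and 1.
M = sp.Matrix([[(1 + y) / 2]])
A = sp.simplify(((sp.eye(1) - z * M).inv())[0, 0])   # A(y, z) = 1/(1 - z*(1 + y)/2)

Ay  = sp.diff(A, y).subs(y, 1)       # sum over n of E(Output(X_n)) * z^n
Ayy = sp.diff(A, y, 2).subs(y, 1)    # sum over n of E(Output*(Output - 1)) * z^n

def coeff_zn(f, n):
    """Extract [z^n] f via a Taylor expansion around z = 0."""
    return sp.series(f, z, 0, n + 1).removeO().coeff(z, n)

n = 8
E = coeff_zn(Ay, n)                  # n/2 = 4 for this machine
V = coeff_zn(Ayy, n) + E - E**2      # factorial moment -> variance: n/4 = 2
print(E, V)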

  33. Idea of the Proof of the Theorem. Decomposition: [diagram: a walk in the transducer decomposed into pieces, each belonging to a set of cycles C or to a set of paths P].
