Pseudo-Random Number Generators: Functional Programming and Intelligent Algorithms


  1. Pseudo-Random Number Generators
     Functional Programming and Intelligent Algorithms
     Prof Hans Georg Schaathun, Høgskolen i Ålesund, 14th February 2017

  2. Randomness
     1. What is randomness?

  3. Randomness
     1. What is randomness?
     2. How do we create probabilistic computer programs?

  4. Randomness
     1. What is randomness?
     2. How do we create probabilistic computer programs?
     3. I.e., how do we make the computer act at random?

  5. Two options

  6. Two options
     True randomness uses physical sources of entropy:
     1. /dev/random on many systems
     2. random-fu in Haskell

  7. Two options
     True randomness uses physical sources of entropy:
     1. /dev/random on many systems
     2. random-fu in Haskell
     Pseudo-random number generators (PRNG) are deterministic but random-looking:
     — random, the standard package in Haskell
     — random-tf, a more recent Haskell package

  8. Linear Congruential Generators
     x_i = (a + c*x_{i-1}) mod m
     — x_0 is a given seed
     — Pseudo-random sequence [x_0, x_1, x_2, ...]
     — Aka. Lehmer's algorithm
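
     A minimal Haskell sketch of this recurrence follows; the constants a, c, m and the example seed are arbitrary illustration values, not parameters from the lecture:

        -- A linear congruential generator as a pure step function.
        -- The constants a, c and m are arbitrary example values.
        lcgStep :: Integer -> Integer
        lcgStep x = (a + c*x) `mod` m
          where
            a = 12345
            c = 1103515245
            m = 2^31

        -- Infinite pseudo-random sequence [x0, x1, x2, ...] from a seed x0.
        lcgSequence :: Integer -> [Integer]
        lcgSequence = iterate lcgStep

        -- Usage sketch: take 5 (lcgSequence 42)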

  9. Ciphers in counter mode
     (diagram: Alice sends e_k(m) to Bob while Eve listens in)
     x_i = e_k(i)
     — x_0 is a given seed
     — Pseudo-random sequence [x_0, x_1, x_2, ...]

  10. The PRNG is a state machine
      (diagram: states s_1, ..., s_7 connected by "next" transitions)

  11. The PRNG is a state machine
      (diagram as before, now emitting outputs x_1, ..., x_6 on the transitions)

  12. The PRNG is a state machine
      (state diagram as before)
      — next :: State -> (State,Int)

  13. The PRNG is a state machine
      (state diagram as before)
      — next :: State -> (State,Int)
      — Lehmer: next s = (s', s') where s' = (a + c*s) `mod` m

  14. The PRNG is a state machine
      (state diagram as before)
      — next :: State -> (State,Int)
      — Lehmer: next s = (s', s') where s' = (a + c*s) `mod` m
      — Cipher: next s = ((s + 1) `mod` m, encrypt k s)
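
      To make the state-machine view concrete, here is a self-contained sketch of the two next variants above. The constants and the toy encrypt function are invented placeholders for illustration only, not the lecture's definitions, and Integer is used in place of the slide's Int:

        type State = Integer

        -- Lehmer / linear congruential step: the new state doubles as the output.
        nextLehmer :: State -> (State, Integer)
        nextLehmer s = (s', s')
          where
            s' = (a + c*s) `mod` m
            a  = 12345
            c  = 1103515245
            m  = 2^31

        -- Counter-mode step: the state is just a counter, and the output is the
        -- "encryption" of the counter under key k.
        nextCipher :: Integer -> State -> (State, Integer)
        nextCipher k s = ((s + 1) `mod` m, encrypt k s)
          where
            m = 2^31
            encrypt key x = (key * x + 7919) `mod` m  -- toy placeholder, not a real cipher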

  15. The random-tf package
      1. next :: TFGen -> (TFGen,Word32)
      Exercise: Given a TFGen object, how do you generate a random, infinite list of Word32 objects? (See the sketch below.)
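
      One possible answer, sketched with Data.List.unfoldr against the next signature exactly as written on the slide; if the installed library orders the result pair differently, swap the components in step:

        import Data.List (unfoldr)
        import Data.Word (Word32)

        -- Infinite list of outputs from any generator with a
        -- next :: g -> (g, Word32) style step function.
        randomStream :: (g -> (g, Word32)) -> g -> [Word32]
        randomStream next = unfoldr step
          where
            step g = let (g', w) = next g in Just (w, g')

        -- Usage sketch: randomStream next gen0 :: [Word32]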

  16. Splitting a PRNG
      1. split :: TFGen -> (TFGen,TFGen)
      2. (g', newstate) = split g
      3. Use g' to generate the list
      4. newstate is your new state
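
      Building on the randomStream sketch above, one way to peel off an infinite list while keeping a fresh generator for later use, again taking the slide's split and next as parameters:

        -- Produce an infinite list of outputs and a new generator to carry on with.
        streamAndContinue :: (g -> (g, g))        -- split, as on the slide
                          -> (g -> (g, Word32))   -- next, as on the slide
                          -> g -> ([Word32], g)
        streamAndContinue split next g = (randomStream next g', newstate)
          where
            (g', newstate) = split g

      Because split yields two generators intended to be statistically independent, the consumer of the infinite list never has to hand its generator back.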

  17. Where do you get the initial state?

  18. Where do you get the initial state?
      1. Hardcode an arbitrary seed
      2. Use initialisation functions in the library
         2.1 initTFGen
      3. Use a library which provides true random values
         • random-fu
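
      A small sketch of options 1 and 2 using the standard random package, whose mkStdGen and newStdGen play the role the slide assigns to hardcoded seeds and to initTFGen in random-tf:

        import System.Random (StdGen, mkStdGen, newStdGen)

        -- Option 1: hardcode an arbitrary seed (reproducible runs).
        fixedGen :: StdGen
        fixedGen = mkStdGen 2017

        -- Option 2: let the library pick an initial state (different every run).
        freshGen :: IO StdGen
        freshGen = newStdGen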

  19. Tuning parameters
      1. Distribution of random initial weights?
      2. β in the sigmoid function?
      3. Number of iterations?

  20. Some guidelines
      — Weights: −1/√n ≤ w ≤ 1/√n
        • where n is the number of inputs to the layer
      — The weights should have similar magnitude
      — Small β: β ≤ 3
        1. β = 1 is a good starting point
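
      A possible sketch of drawing initial weights uniformly from [−1/√n, 1/√n] with the standard random package; the use of StdGen and the name randomWeights are assumptions made here for illustration, not the course code:

        import System.Random (StdGen, randomR)

        -- n random weights uniformly in [-1/sqrt n, 1/sqrt n], threading the generator.
        randomWeights :: Int -> StdGen -> ([Double], StdGen)
        randomWeights n g0 = go n g0
          where
            bound = 1 / sqrt (fromIntegral n)
            go 0 g = ([], g)
            go k g = let (w, g')   = randomR (-bound, bound) g
                         (ws, g'') = go (k-1) g'
                     in (w : ws, g'')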

  21. Number of epochs

  22. Exercise
      — Random starting weights (see the sketch below)
        1. initNeuron
        2. initNetwork
      — Test your network
      — Experiment by varying
        1. magnitude of initial weights
        2. β
        3. number of epochs
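
      One way such initialisation functions might be structured, splitting the generator so each neuron gets an independent one. The Neuron type, initLayer, and the exact signatures are hypothetical, build on the randomWeights sketch above, and may differ from the course's initNeuron and initNetwork:

        import System.Random (StdGen, split)

        -- Hypothetical neuron: a list of input weights (shape assumed for illustration).
        data Neuron = Neuron [Double] deriving Show

        -- Initialise one neuron with n random weights, returning the unused generator.
        initNeuron :: Int -> StdGen -> (Neuron, StdGen)
        initNeuron n g = let (ws, g') = randomWeights n g in (Neuron ws, g')

        -- Initialise a layer of m neurons, splitting the generator per neuron.
        initLayer :: Int -> Int -> StdGen -> ([Neuron], StdGen)
        initLayer 0 _ g = ([], g)
        initLayer m n g = let (g1, g2)   = split g
                              (neu, _)   = initNeuron n g1
                              (rest, g') = initLayer (m-1) n g2
                          in (neu : rest, g')

      An initNetwork along these lines would split once per layer in the same way, so every layer (and every neuron) draws from its own generator.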
