Musings on the Logistic Map
Shane Celis and Yun Tao
PHY 256
Hypothesis
• The entropy rate measures randomness.
• The Lyapunov exponent measures randomness.
• Is there a relationship between the two?
• Perhaps there is a functional form something like λ = lim_{p→∞} f(h_µ).
• And how does the partition resolution affect the transient entropy rate?
Method
• Start simple: use a 1D map, the beloved logistic map.
  x_{n+1} = r x_n (1 − x_n)
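The map above is a one-liner to iterate; a minimal sketch (the starting point, parameter value, and orbit length below are illustrative choices, not values from the slides):

```python
def logistic_orbit(r, x0, n):
    """Return the first n iterates of the logistic map x_{n+1} = r*x*(1-x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Example: a short orbit at r = 4.0 starting from x0 = 0.2.
orbit = logistic_orbit(4.0, 0.2, 5)
```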
Transient Uncertainty of Length One (2 Partitions)
[Plot: transient uncertainty vs. r, r ∈ [0, 4]]

Transient Uncertainty of Length One (4 Partitions)
[Plot: transient uncertainty vs. r, r ∈ [0, 4]]

Transient Uncertainty of Length One (8 Partitions)
[Plot: transient uncertainty vs. r, r ∈ [3, 4]]
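The uncertainty of length-one words can be computed by coarse-graining the orbit into equal-width partitions of [0, 1] and taking the Shannon entropy of the symbol distribution. The sketch below assumes this procedure; the function name, equal-width binning, sample sizes, and initial condition are our own choices, not necessarily what the slides used:

```python
import math

def length_one_uncertainty(r, n_parts=2, n_iter=2000, burn=0, x0=0.3):
    """Shannon entropy (bits) of length-one symbol words for the logistic
    map at parameter r, under an equal-width n_parts partition of [0, 1]."""
    x = x0
    counts = [0] * n_parts
    for i in range(burn + n_iter):
        x = r * x * (1.0 - x)
        if i >= burn:
            # Map x in [0, 1] to a partition index, clamping x == 1.0.
            k = min(int(x * n_parts), n_parts - 1)
            counts[k] += 1
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)
```

With 2 partitions the entropy is bounded by 1 bit, with 8 partitions by 3 bits, matching the vertical scales of the plots above.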
Transient Entropy Convergence Rate
[Plot: convergence rate vs. r, r ∈ [0, 4]]
Lovely
[Plots: two panels vs. r, r ∈ [1.5, 4]]
Entropy
[Plot: entropy vs. r, r ∈ [2.5, 4]]
Entropy and Lyapunov
[Plots: entropy and Lyapunov exponent vs. r, r ∈ [2.25, 4]]
Bifurcation and Entropy
[Plots: bifurcation diagram and entropy vs. r, r ∈ [2.25, 4]]
Entropy Rate
[Plot: entropy rate vs. r, r ∈ [3.2, 4]]
Entropy Rate vs. Lyapunov Exponent
[Scatter plot: entropy rate vs. Lyapunov exponent]
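The Lyapunov exponent plotted on this axis can be estimated by a standard method: averaging log|f′(x)| = log|r(1 − 2x)| along an orbit. This is a sketch of that estimator, not necessarily the slides' computation; the burn-in length, sample size, and initial condition are our choices:

```python
import math

def lyapunov_logistic(r, x0=0.3, n=10000, burn=1000):
    """Estimate the Lyapunov exponent of the logistic map at parameter r
    as the orbit average of log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):                     # discard the transient
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(n):
        s += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # guard log(0)
        x = r * x * (1.0 - x)
    return s / n

# Sanity check: at r = 4 the exact value is ln 2 ≈ 0.693.
lam = lyapunov_logistic(4.0)
```

Negative values correspond to periodic behavior and positive values to chaos, which is why the scatter plot spans both signs of the horizontal axis.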
FOILED!
[Scatter plot: entropy rate vs. Lyapunov exponent, annotated "really?"]
Just for fun
[Plot vs. Lyapunov exponent]
Entropy vs Lyapunov Connected
[Plot: entropy vs. Lyapunov exponent, points connected]