Part 2: Cellular Automata — Lecture 4 (8/27/04)

Langton's Investigation
• Under what conditions can we expect complex dynamics of information to emerge spontaneously and come to dominate the behavior of a CA?

Approach
• Investigate 1D CAs with:
  – random transition rules
  – starting in random initial states
• Systematically vary a simple parameter characterizing the rule
• Evaluate qualitative behavior (Wolfram class)

Assumptions
• Periodic boundary conditions
  – no special place
• Strong quiescence:
  – if all the states in the neighborhood are the same, then the new state will be the same
  – persistence of uniformity
• Spatial isotropy:
  – all rotations of a neighborhood state result in the same new state
  – no special direction
• Totalistic [not used by Langton]:
  – depends only on the sum of the states in the neighborhood
  – implies spatial isotropy

Langton's Lambda
• Designate one state to be the quiescent state
• Let K = number of states
• Let N = 2r + 1 = size of the neighborhood
• Let T = K^N = number of entries in the table
• Let n_q = number of entries mapping to the quiescent state
• Then λ = (T − n_q) / T
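The definition above can be sketched directly in code (a Python sketch; the function and argument names are illustrative, not from the lecture):

```python
def lambda_param(table, quiescent=0):
    """Langton's lambda for a transition table given as a list of
    T = K**N next-states: lambda = (T - n_q) / T, where n_q is the
    number of entries mapping to the quiescent state."""
    T = len(table)
    n_q = sum(1 for new_state in table if new_state == quiescent)
    return (T - n_q) / T

# Example sizes: K = 5 states, r = 1, so N = 2*r + 1 = 3 and T = 5**3 = 125
```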
Range of the Lambda Parameter
• If all configurations map to the quiescent state: λ = 0
• If no configurations map to the quiescent state: λ = 1
• If every state is represented equally: λ = 1 − 1/K
• A sort of measure of "excitability"

Example
• States: K = 5
• Radius: r = 1
• Initial state: random
• Transition function: random (given λ)

Class I (λ = 0.2)
[Figure: space-time diagram; time runs downward]

Class I (λ = 0.2) Closeup
[Figure]
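Such an experiment can be sketched as follows (an assumed Python sketch; the "random-table" construction shown, where each entry is quiescent with probability 1 − λ, is one common way to realize a random rule with a given λ; the quiescence and isotropy assumptions from the previous slide are not enforced here):

```python
import random

def random_rule(K, N, lam, rng):
    """Random transition table with expected lambda = lam: each of the
    T = K**N entries maps to the quiescent state 0 with probability
    1 - lam, otherwise to one of the other K - 1 states uniformly."""
    T = K ** N
    return [0 if rng.random() < 1 - lam else rng.randrange(1, K)
            for _ in range(T)]

def step(cells, rule, K, r):
    """One synchronous update of a 1D CA with periodic boundaries.
    Each neighborhood of 2r + 1 cells is read as a base-K number
    that indexes into the rule table."""
    n = len(cells)
    out = []
    for i in range(n):
        idx = 0
        for j in range(i - r, i + r + 1):
            idx = idx * K + cells[j % n]
        out.append(rule[idx])
    return out
```

Starting from a random row of cells and iterating `step` produces the space-time diagrams shown in the following slides.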
Class II (λ = 0.4)
[Figure: space-time diagram]

Class II (λ = 0.4) Closeup
[Figure]

Class II (λ = 0.31)
[Figure: space-time diagram]

Class II (λ = 0.31) Closeup
[Figure]
Class II (λ = 0.37)
[Figure: space-time diagram]

Class II (λ = 0.37) Closeup
[Figure]

Class III (λ = 0.5)
[Figure: space-time diagram]

Class III (λ = 0.5) Closeup
[Figure]
Class IV (λ = 0.35)
[Figure: space-time diagram]

Class IV (λ = 0.35) Closeup
[Figure]

Class IV (λ = 0.34)
[Figures: space-time diagram and closeup]
[Figures: further Class IV examples]

Class IV Shows Some of the Characteristics of Computation
• Persistent, but not perpetual, storage
• Terminating cyclic activity
• Global transfer of control/information

[Figure]
λ of Life
• For Life, λ ≈ 0.273
• which is near the critical region for CAs with:
  K = 2
  N = 9

Transient Length (I, II)
[Figure: transient lengths for Class I and II rules]

Transient Length (III)
[Figure: transient lengths for Class III rules]

Shannon Information (very briefly!)
• Information varies directly with surprise
• Information varies inversely with probability
• Information is additive
• ⇒ The information content of a message is proportional to the negative log of its probability:
  I{s} = −lg Pr{s}
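The negative-log measure and its additivity can be sketched in a few lines (Python; the function name is illustrative):

```python
import math

def info(p):
    """Shannon information content of an event with probability p,
    in bits (lg = log base 2): I = -lg p."""
    return -math.log2(p)
```

A fair-coin outcome (p = 1/2) carries 1 bit; for independent events with probabilities p and q, `info(p * q) == info(p) + info(q)`, which is the additivity property above.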
Entropy
• Suppose we have a source S of symbols from the ensemble {s_1, s_2, …, s_N}
• Average information per symbol:
  Σ_{k=1}^N Pr{s_k} I{s_k} = −Σ_{k=1}^N Pr{s_k} lg Pr{s_k}
• This is the entropy of the source:
  H{S} = −Σ_{k=1}^N Pr{s_k} lg Pr{s_k}

Maximum and Minimum Entropy
• Maximum entropy is achieved when all signals are equally likely
  – no ability to guess; maximum surprise
  H_max = lg N
• Minimum entropy occurs when one symbol is certain and the others are impossible
  – no uncertainty; no surprise
  H_min = 0

Entropy of Transition Rules
• Among other things, a way to measure the uniformity of a distribution:
  H = −Σ_i p_i lg p_i
• The distinction of the quiescent state is arbitrary
• Let n_k = number of entries mapping into state k
• Then p_k = n_k / T, so
  H = lg T − (1/T) Σ_{k=1}^K n_k lg n_k

Entropy Examples
[Figure: example distributions and their entropies]
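The two forms of the rule-table entropy above agree, as a quick Python sketch can confirm (names are illustrative):

```python
import math
from collections import Counter

def entropy(probs):
    """H = -sum p lg p over a probability distribution (0 lg 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def table_entropy(table):
    """Entropy of a transition table via H = lg T - (1/T) sum_k n_k lg n_k,
    where n_k counts the entries mapping into state k and T = len(table)."""
    T = len(table)
    counts = Counter(table).values()
    return math.log2(T) - sum(n * math.log2(n) for n in counts) / T
```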
Entropy Range
• Maximum entropy (λ = 1 − 1/K):
  – as uniform as possible: all n_k = T/K
  – H_max = lg K
• Minimum entropy (λ = 0 or λ = 1):
  – as non-uniform as possible: one n_s = T, all other n_r = 0 (r ≠ s)
  – H_min = 0

Further Investigations by Langton
• 2-D CAs
• K = 8
• N = 5
• 64 × 64 lattice
• periodic boundary conditions

Avg. Transient Length vs. λ (K = 4, N = 5)
[Figure]

Avg. Cell Entropy vs. λ (K = 4, N = 5)
[Figure; H(A) = −Σ_{k=1}^K p_k lg p_k]
Avg. Cell Entropy vs. λ (K = 4, N = 5)
[Figures: four further views of average cell entropy vs. λ]
Avg. Cell Entropy vs. λ (K = 4, N = 5)
[Figure]

Entropy of Independent Systems
• Suppose sources A and B are independent
• Let p_j = Pr{a_j} and q_k = Pr{b_k}
• Then Pr{a_j, b_k} = Pr{a_j} Pr{b_k} = p_j q_k
  H(A,B) = −Σ_{j,k} Pr{a_j, b_k} lg Pr{a_j, b_k}
         = −Σ_{j,k} p_j q_k lg(p_j q_k)
         = −Σ_{j,k} p_j q_k (lg p_j + lg q_k)
         = −Σ_j p_j lg p_j − Σ_k q_k lg q_k
         = H(A) + H(B)

Mutual Information
• Mutual information measures the degree to which two sources are not independent
  – a measure of their correlation
  I(A,B) = H(A) + H(B) − H(A,B)
• I(A,B) = 0 for completely independent sources
• I(A,B) = H(A) = H(B) for completely correlated sources

Avg. Mutual Info vs. λ (K = 4, N = 5)
[Figure]
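These identities can be checked with a small Python sketch (the joint-distribution representation as a dict of pair probabilities is an assumption of the sketch):

```python
import math

def entropy(probs):
    """H = -sum p lg p (0 lg 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_info(joint):
    """I(A,B) = H(A) + H(B) - H(A,B), with the joint distribution
    given as {(a, b): probability}; marginals are summed out."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return entropy(pa.values()) + entropy(pb.values()) - entropy(joint.values())
```

Two independent fair bits give I = 0; two completely correlated fair bits give I = 1 bit = H(A) = H(B), matching the two limiting cases above.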
Avg. Mutual Info vs. λ (K = 4, N = 5)
[Figure]

Complexity vs. λ
[Figure]

Schematic of CA Rule Space vs. λ
[Figure from Langton, "Life at the Edge of Chaos"]

Additional Bibliography
1. Langton, Christopher G. "Computation at the Edge of Chaos: Phase Transitions and Emergent Computation," in Emergent Computation, ed. Stephanie Forrest. North-Holland, 1990.
2. Langton, Christopher G. "Life at the Edge of Chaos," in Artificial Life II, ed. Langton et al. Addison-Wesley, 1992.
3. Emmeche, Claus. The Garden in the Machine: The Emerging Science of Artificial Life. Princeton, 1994.