Simple nonunifilar binary word generators
Sarah Marzen
June 1, 2013
Outline
✤ Motivation
✤ Results
✤ Simple nonunifilar source (SNS)
✤ Variation on the SNS
✤ A different set of nonunifilar binary word generators (preliminary)
✤ Future work
Motivation
✤ ε-machines are most useful when we have no understanding of the system -- perfect for biological modeling
✤ Problem: neurobiological data is highly subsampled: fMRI, EEG, ECoG, electrophysiology
(figure: cartoon captioned "Physics is fun!" / "Observe behavior")
Motivation
✤ These problems can maybe be couched as nonunifilar HMMs.
(figure: two-state HMM with hidden states Active and Asleep)
Outline: Results
✤ Simple nonunifilar source (the one studied in class)
✤ Simple nonunifilar source with adjustable transition probabilities
✤ Attempt to extend to the continuous case
✤ Binary subsampled HMMs of a particular form, to be described
SNS
(figure: two-state machine -- A: 1|1/2 self-loop, 1|1/2 to B; B: 1|1/2 self-loop, 0|1/2 to A)
\[
T^{(0)} = \begin{pmatrix} 0 & 0 \\ 1/2 & 0 \end{pmatrix}, \qquad
T^{(1)} = \begin{pmatrix} 1/2 & 1/2 \\ 0 & 1/2 \end{pmatrix}, \qquad
\pi = \begin{pmatrix} 1/2 & 1/2 \end{pmatrix}
\]
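The machine above can be simulated directly. A minimal sketch (plain Python; the helper name `sns_symbols` is my own) that draws symbols and checks that 0s occur with the stationary frequency Pr(0) = π_B · 1/2 = 1/4:

```python
import random

def sns_symbols(n, seed=0):
    """Draw n symbols from the simple nonunifilar source (SNS).

    Hidden states A, B. From A: emit 1, go to A or B w.p. 1/2 each.
    From B: emit 1 and stay in B w.p. 1/2, or emit 0 and reset to A w.p. 1/2.
    """
    rng = random.Random(seed)
    state = 'A'
    out = []
    for _ in range(n):
        if state == 'A':
            out.append(1)                      # A always emits 1
            state = 'A' if rng.random() < 0.5 else 'B'
        else:
            if rng.random() < 0.5:
                out.append(1)                  # B emits 1, stays in B
            else:
                out.append(0)                  # B emits 0, resets to A
                state = 'A'
    return out

syms = sns_symbols(100_000)
frac = sum(s == 0 for s in syms) / len(syms)
print(frac)   # close to 0.25
```

Note the nonunifilarity: a 1 emitted from A can lead to either A or B, so an observer of the symbols alone cannot track the hidden state.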
SNS
(figure: causal states s_0: ...0; s_1: ...01; s_2: ...01^2; ...; s_n: ...01^n; ...; s_∞, with transitions s_0 --1|1--> s_1, s_{n-1} --1|M_{n-1,n}--> s_n, and s_n --0|M_{n,0}--> s_0)
\[
M_{n-1,n} = \frac{\mathbf{1}^T \left(T^{(1)}\right)^n T^{(0)} \pi}{\mathbf{1}^T \left(T^{(1)}\right)^{n-1} T^{(0)} \pi}, \qquad
M_{n,0} = 1 - M_{n,n+1}
\]
SNS
(same causal-state diagram as on the previous slide)
\[
\pi_n = M_{n-1,n} \, \pi_{n-1}, \qquad
\pi_n = \frac{n+1}{4} \cdot \frac{1}{2^n}, \qquad
\sum_{n=0}^{\infty} \pi_n = 1
\]
SNS: Stat. complexity and entropy rate
\[
\pi_n = \frac{n+1}{4 \cdot 2^n}, \qquad M_{n-1,n} = \frac{n+1}{2n}
\]
\[
C_\mu = -\sum_{n=0}^{\infty} \pi_n \log_2 \pi_n = 2.71 \text{ bits}
\]
\[
h_\mu = \sum_{n=0}^{\infty} \pi_n H[M_{n,0}] = 0.678 \text{ bits}
\]
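Both sums converge quickly, so the quoted values can be checked numerically. A minimal sketch (plain Python), using M_{n,0} = 1 − M_{n,n+1} = n / (2(n+1)):

```python
from math import log2

def H(x):
    """Binary entropy in bits; H(0) = H(1) = 0."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

C_mu = 0.0
h_mu = 0.0
for n in range(200):                 # the tails decay like n / 2**n
    pi_n = (n + 1) / 4 * 0.5**n      # stationary weight of causal state s_n
    M_n0 = n / (2 * (n + 1))         # Pr(emit 0 | s_n) = 1 - M_{n,n+1}
    C_mu -= pi_n * log2(pi_n)
    h_mu += pi_n * H(M_n0)

print(f"C_mu = {C_mu:.3f} bits")     # ~2.71
print(f"h_mu = {h_mu:.3f} bits")     # ~0.678
```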
SNS: E from causal shielding
(same causal-state diagram as on the previous slides)
\[
\mathbf{E} = \sum_{L=0}^{\infty} \left[ h_\mu(L) - h_\mu \right] \simeq 0.147 \text{ bits}
\]
\[
h_\mu(L) = H(L+1) - H(L) = H[X_{L+1} \mid \mathcal{R}_{L+1}, \mathcal{R}_0 = \mu_0]
\]
\[
\chi = C_\mu - \mathbf{E} = 2.56 \text{ bits}
\]
SNS: Time reversed process?
(figure: the same two-state machine and matrices as the forward SNS)
\[
C_\mu^+ = C_\mu^-, \qquad \chi^+ = \chi^-, \qquad \Xi = 0
\]
SNS v. 2
(figure: two-state machine -- A: 1|1−p self-loop, 1|p to B; B: 1|1−q self-loop, 0|q to A)
\[
T^{(1)} = \begin{pmatrix} 1-p & p \\ 0 & 1-q \end{pmatrix}, \qquad
T^{(0)} = \begin{pmatrix} 0 & 0 \\ q & 0 \end{pmatrix}, \qquad
\pi = \begin{pmatrix} \frac{q}{p+q} & \frac{p}{p+q} \end{pmatrix}
\]
SNS v. 2
(same causal-state diagram as for the SNS, now with s_n --0|M_{n,0}--> s_0 and s_0 --1|1--> s_1)
Same recurrent causal states!
\[
M_{n-1,n} = \frac{\mathbf{1}^T \left(T^{(1)}\right)^n T^{(0)} \pi}{\mathbf{1}^T \left(T^{(1)}\right)^{n-1} T^{(0)} \pi}
= \frac{p(1-q)^n - q(1-p)^n}{p(1-q)^{n-1} - q(1-p)^{n-1}}
\]
SNS v. 2
\[
\pi_n = \frac{pq}{(p+q)(p-q)} \times \left[ p(1-q)^n - q(1-p)^n \right]
\]
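As a sanity check, this closed form can be compared against the matrix expression 1^T (T^(1))^n T^(0) π from the previous slide. A sketch assuming a column-stochastic convention (column = from-state); the values p = 0.3, q = 0.6 are arbitrary examples with p ≠ q so the closed form is nonsingular:

```python
import numpy as np

p, q = 0.3, 0.6
one = np.ones(2)

# Labeled transition matrices over states (A, B), column-stochastic convention
T1 = np.array([[1 - p, 0.0], [p, 1 - q]])   # transitions that emit a 1
T0 = np.array([[0.0, q], [0.0, 0.0]])       # transitions that emit a 0
pi = np.array([q, p]) / (p + q)             # stationary distribution

def pi_n_matrix(n):
    """Pr(0 1^n): the matrix expression from the slides."""
    return one @ np.linalg.matrix_power(T1, n) @ T0 @ pi

def pi_n_closed(n):
    """Closed form for the causal-state weights of SNS v.2."""
    return p * q / ((p + q) * (p - q)) * (p * (1 - q)**n - q * (1 - p)**n)

assert all(np.isclose(pi_n_matrix(n), pi_n_closed(n)) for n in range(20))
assert np.isclose(sum(pi_n_closed(n) for n in range(500)), 1.0)
print("matrix and closed forms agree; weights sum to 1")
```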
SNS v. 2: Calculating E from causal shields
(figure: the forward causal states s_0, s_1, ..., s_∞ together with the reverse-time states s_{-1}, s_{-2}, ..., s_{-∞})
\[
M^-_{n,0} = \frac{\mathbf{1}^T T^{(0)} \left(T^{(1)}\right)^{n-1} \pi_0}{\mathbf{1}^T \left(T^{(1)}\right)^{n-1} \pi_0}
\]
SNS v. 2: Time reversed process?
(figures: the forward machine, and the reversed machine with p and q exchanged -- A: 1|1−q self-loop, 1|q to B; B: 1|1−p self-loop, 0|p to A)
\[
C_\mu(p,q) = C_\mu(q,p) \;\Rightarrow\; C_\mu^+ = C_\mu^- \;\Rightarrow\; \chi^+_\mu = \chi^-_\mu \;\Rightarrow\; \Xi = 0
\]
SNS v. 2: Attempt at continuous time
Continuous time:
\[
\frac{d}{dt} \begin{pmatrix} p(A,t) \\ p(B,t) \end{pmatrix}
= \begin{pmatrix} -k_{AB} & k_{BA} \\ k_{AB} & -k_{BA} \end{pmatrix}
\begin{pmatrix} p(A,t) \\ p(B,t) \end{pmatrix}
\]
Discretized time:
\[
\begin{pmatrix} p(A,t+\Delta t) \\ p(B,t+\Delta t) \end{pmatrix}
= \begin{pmatrix} 1 - k_{AB}\Delta t & k_{BA}\Delta t \\ k_{AB}\Delta t & 1 - k_{BA}\Delta t \end{pmatrix}
\begin{pmatrix} p(A,t) \\ p(B,t) \end{pmatrix}
\]
\[
\Rightarrow \; p = k_{AB}\Delta t, \qquad q = k_{BA}\Delta t
\]
SNS v. 2
\[
\pi_t \, \Delta t = \lim_{\Delta t \to 0,\; n\Delta t = t} \pi_n\!\left(p = k_{AB}\Delta t,\; q = k_{BA}\Delta t\right)
\]
\[
\pi_t = \frac{k_{AB} k_{BA}}{k_{AB} + k_{BA}} \cdot \frac{k_{AB} e^{-k_{BA} t} - k_{BA} e^{-k_{AB} t}}{k_{AB} - k_{BA}}
\]
(figure: plot of P(s_t) vs. s_t for k_AB = 2, k_BA = 3)
Statistical complexity: differential entropy of this probability distribution?
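A quick numerical check that π_t is a normalized density and that the discretized chain converges to it (plain Python; the rates are the slide's k_AB = 2, k_BA = 3, and the integration cutoff t = 20 and step Δt = 10⁻⁴ are arbitrary):

```python
from math import exp

kAB, kBA = 2.0, 3.0

def pi_t(t):
    """Continuous-time limit of the causal-state density."""
    return (kAB * kBA / (kAB + kBA)) * \
           (kAB * exp(-kBA * t) - kBA * exp(-kAB * t)) / (kAB - kBA)

# pi_t should be a probability density over t in [0, inf)
dt = 1e-4
total = sum(pi_t(i * dt) for i in range(200_000)) * dt   # integrate to t = 20
print(total)   # close to 1

# It should also match the discretized chain: pi_n with p = kAB*dt, q = kBA*dt
p, q = kAB * dt, kBA * dt
n = int(1.0 / dt)                                        # t = 1
pi_n = p * q / ((p + q) * (p - q)) * (p * (1 - q)**n - q * (1 - p)**n)
print(pi_n / dt, pi_t(1.0))                              # nearly equal
```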
SNS v. 2: Continuous time stat. comp.
SNS v. 2
\[
h_t = H\!\left[ k_A k_B \Delta t \, \frac{k_A e^{-k_B t} - k_B e^{-k_A t}}{k_A^2 e^{-k_B t} - k_B^2 e^{-k_A t}} \right]
\]
\[
= k_A k_B \Delta t \, \frac{k_A e^{-k_B t} - k_B e^{-k_A t}}{k_A^2 e^{-k_B t} - k_B^2 e^{-k_A t}}
\log_2 \frac{k_A^2 e^{-k_B t} - k_B^2 e^{-k_A t}}{k_A k_B \left( k_A e^{-k_B t} - k_B e^{-k_A t} \right)}
\;-\; k_A k_B \Delta t \, \frac{k_A e^{-k_B t} - k_B e^{-k_A t}}{k_A^2 e^{-k_B t} - k_B^2 e^{-k_A t}} \log_2 \Delta t
\]
Not sure what to do with these weird factors of time resolution -- they seem to suggest the entropy rate is 0.
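The Δt factors come from the small-argument behavior of the binary entropy: H[ε] ≈ ε log₂(1/ε) as ε → 0, so the per-symbol entropy itself vanishes with the time resolution. A minimal illustration (the rate factor x = 0.37 is an arbitrary example value):

```python
from math import log2

def H(x):
    """Binary entropy in bits."""
    return -x * log2(x) - (1 - x) * log2(1 - x)

x = 0.37                         # arbitrary fixed rate factor
for dt in (1e-2, 1e-3, 1e-4, 1e-5):
    eps = x * dt
    # Leading behavior: H[eps] ~ eps * log2(1/eps), which -> 0 as dt -> 0
    print(f"dt={dt:.0e}  H={H(eps):.3e}  leading={eps * log2(1 / eps):.3e}")
```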
SNS v. 2: Excess entropy in cont. time?
✤ Did not unifilarize the time-reversed epsilon machine, so did not get a closed-form analytic expression for excess entropy
✤ However, if excess entropy mainly comes from the rule "a 0 must be followed by a 1," then
\[
\mathbf{E} \sim \frac{k_{AB}^2 \, k_{BA}^2 \, \Delta t}{(k_{AB} + k_{BA})^3}
\]
SNS v. 2
✤ Excess entropy and statistical complexity capture very different ideas.
✤ E captures how often you are synchronized to internal states
✤ Stat. comp. captures how long-tailed the probability distribution over causal states is
✤ Going to continuous time maybe introduces an uncountable infinity of causal states, differential entropies (negative stat. comp.?), and discontinuities in stat. comp. as a function of the parameters
✤ E captures relaxation of the probability distribution over all mixed states to stationarity
Last nonunifilar model
(figure: Group 0 contains state A; Group 1 contains states B_1, B_2, ..., B_n)
Fully connected, with randomly chosen kinetic rates between states
Last nonunifilar model
(same causal-state diagram as for the SNS)
Same recurrent causal states!
\[
M_{n-1,n} = \frac{\mathbf{1}^T \left(T^{(1)}\right)^n T^{(0)} \pi}{\mathbf{1}^T \left(T^{(1)}\right)^{n-1} T^{(0)} \pi}
\]
Preliminary results
(This n is the number of hidden states.)
\[
\pi_n = \frac{\mathbf{1}^T \left(T^{(1)}\right)^n T^{(0)} \pi}{\mathbf{1}^T \left(I - T^{(1)}\right)^{-1} T^{(0)} \pi}
\]
\[
h_n = H\!\left[ \frac{\mathbf{1}^T \left(T^{(1)}\right)^{n+1} T^{(0)} \pi}{\mathbf{1}^T \left(T^{(1)}\right)^n T^{(0)} \pi} \right]
\]
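These formulas can be evaluated for a randomly generated instance of the model. A sketch under two assumptions of mine: transition weights are drawn uniformly at random, and the observed binary symbol is the group the chain lands in (0 for A, 1 for any B_i):

```python
import numpy as np

rng = np.random.default_rng(1)
n_hidden = 4                      # hypothetical size of group 1

# Column-stochastic transition matrix over states {A, B_1..B_n}: fully
# connected, randomly chosen weights, as in the slides.
m = n_hidden + 1
T = rng.random((m, m))
T /= T.sum(axis=0)

# Assumption: symbol 0 = transitions into A (group 0), symbol 1 = the rest.
T0 = np.zeros_like(T)
T0[0, :] = T[0, :]                # transitions emitting 0
T1 = T - T0                       # transitions emitting 1

# Stationary distribution of T (eigenvector for the eigenvalue 1).
w, v = np.linalg.eig(T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

one = np.ones(m)
norm = one @ np.linalg.inv(np.eye(m) - T1) @ T0 @ pi

def pi_n(k):
    return one @ np.linalg.matrix_power(T1, k) @ T0 @ pi / norm

# Since sum_k (T1)^k = (I - T1)^{-1}, the weights pi_n must sum to 1.
total = sum(pi_n(k) for k in range(2000))
print(total)   # close to 1
```

The branch entropies h_n follow the same pattern, replacing the ratio of consecutive numerators with its binary entropy.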
Future directions
✤ Finish up calculating stuff for the last nonunifilar model.
✤ Maybe this has a practical application -- you can estimate the number of hidden states by knowing the average transition rates and calculating crypticity? We'll see.
✤ More nonunifilar models, continuous time, everything.