  1. Approximated Newton Algorithm for the Ising Model Inference Speeds Up Convergence, Performs Optimally and Avoids Over-fitting
Ulisse Ferrari, Institut de la Vision, Sorbonne Universités, UPMC
New Frontiers in Non-equilibrium Physics 2015

  2. Outlook of the seminar
1 Introduction, with an application of the pairwise Ising Model to Neuroscience
2 Maximal Entropy models and the Vanilla (Standard) Learning Algorithm
3 Approximated Newton Method
4 The Long-Time Limit: Stochastic Dynamics
5 Properties of the Stationary Distribution
6 Conclusions and Perspectives

  3-6. Introduction
Model Inference: finding the probability distribution that reproduces the statistics of the data. Useful for characterizing the behavior of systems of many strongly correlated units: neurons, proteins, viruses, species distributions, bird flocks... but which distribution?
Maximum Entropy (MaxEnt) Inference: search for the largest-entropy distribution satisfying a set of constraints.

  7-8. Introduction. Example: pairwise Ising Model.
Given a data set of $B$ configurations of $N$ binary units, $\{\sigma_i(b)\}_{i=1}^{N}$, $b = 1, \dots, B$, find the MaxEnt model reproducing the single-site and pairwise correlations:
$$\langle \sigma_i \rangle_{\rm MODEL} = \langle \sigma_i \rangle_{\rm DATA} \equiv \frac{1}{B} \sum_b \sigma_i(b)$$
$$\langle \sigma_i \sigma_j \rangle_{\rm MODEL} = \langle \sigma_i \sigma_j \rangle_{\rm DATA} \equiv \frac{1}{B} \sum_b \sigma_i(b)\, \sigma_j(b)$$
That is, finely tune the parameters $\{h, J\}$ of the pairwise Ising model:
$$P_{h,J}(\sigma) = \exp\Big( \sum_i h_i \sigma_i + \sum_{ij} J_{ij} \sigma_i \sigma_j \Big) \Big/ Z[h, J]$$

  9-10. Introduction. In vivo Pre-Frontal Cortex Recording: 97 experimental sessions of Peyrache et al., Nat. Neurosci. (2009).

  11-13. Introduction. Ising Model Inference: set $\sigma_i(b) = 1$ if neuron $i$ spiked during time-bin $b$, and ask the model to reproduce the neurons' firing rates and pairwise correlations. This yields $97 \times 3$ inferred coupling networks ($97$ sessions $\times$ {PRE, TASK, POST} epochs). Schneidman et al., Nature (2006); Cocco, Monasson, PRL (2011).
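The binarization step described above can be sketched as follows; the helper name and the spike-time format (one array of spike times per neuron, in seconds) are assumptions for illustration, not details from the talk:

```python
import numpy as np

def binarize(spike_times, T, dt):
    """Return a (B, N) binary matrix with B = ceil(T / dt) time-bins:
    sigma[b, i] = 1 iff neuron i spiked at least once during bin b."""
    B = int(np.ceil(T / dt))
    sigma = np.zeros((B, len(spike_times)), dtype=int)
    for i, times in enumerate(spike_times):
        bins = (np.asarray(times) // dt).astype(int)  # bin index of each spike
        sigma[bins[bins < B], i] = 1
    return sigma

# Two neurons, 30 ms of recording, 10 ms bins:
sigma = binarize([[0.005, 0.012], [0.021]], T=0.03, dt=0.01)
```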

  14-16. Introduction. Learning-related coupling adjustment:
$$A = \mathrm{sign}\Big[ \big( J_{ij}^{\rm TASK} - J_{ij}^{\rm PRE} \big) \cdot \big( J_{ij}^{\rm POST} - J_{ij}^{\rm PRE} \big) \Big], \qquad i, j : J_{ij}^{\rm TASK}, J_{ij}^{\rm POST} \neq 0$$

  17. Outline
1 Maximal Entropy Models and the Vanilla (standard) Learning Algorithm
2 Approximated Newton Method
3 The Long-Time Limit: Stochastic Dynamics
4 Properties of the Stationary Distribution

  18. Maximal Entropy Models and the Vanilla (standard) Learning Algorithm. General MaxEnt: given a list of $D$ observables to reproduce, $\{\Sigma_a(\sigma)\}_{a=1}^{D}$ (generic functions of the system units), find the MaxEnt model parameters $\{X_a\}_{a=1}^{D}$,
$$P_X(\sigma) = \exp\Big( \sum_a X_a \Sigma_a(\sigma) \Big) \Big/ Z[X],$$
reproducing the observable averages:
$$\langle \Sigma_a \rangle_{\rm DATA} \equiv P_a = Q_a[X] \equiv \langle \Sigma_a \rangle_X$$

  19-22. Maximal Entropy Models and the Vanilla (standard) Learning Algorithm. Equivalent to log-likelihood maximization:
$$X^* = \arg\max_X \log L[X] \equiv \arg\max_X \big\{ X \cdot P - \log Z[X] \big\}$$
in fact:
$$\nabla_a \log L[X] = \frac{d}{dX_a} \big( X \cdot P - \log Z[X] \big) = P_a - Q_a[X]$$
This cannot be solved analytically. Ackley, Hinton and Sejnowski (Vanilla Gradient):
$$X_{t+1} = X_t + \delta X_t^{\rm VG}; \qquad \delta X_t^{\rm VG} = \alpha \big( P - Q[X_t] \big)$$
If $0 < P_a < 1$ for all $a = 1, \dots, D$, the problem is well posed: $X^*$ exists, is unique, and the dynamics converges (for infinitesimally small $\alpha$).
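The Vanilla Gradient update above can be demonstrated end-to-end on a toy problem. A sketch under stated assumptions: the model averages $Q_a[X]$ are computed by exact enumeration over all $2^N$ configurations, which is feasible only for very small $N$ (real applications estimate them by Monte Carlo); all function names are illustrative:

```python
import itertools
import numpy as np

def model_averages(X, observables, configs):
    """Q_a[X] = <Sigma_a>_X under P_X(s) proportional to exp(sum_a X_a Sigma_a(s))."""
    Sig = np.array([[f(s) for f in observables] for s in configs])  # (n_conf, D)
    w = np.exp(Sig @ X)
    p = w / w.sum()          # P_X over all configurations
    return p @ Sig           # Q_a[X]

def vanilla_gradient(P, observables, configs, alpha=0.5, steps=2000):
    """X_{t+1} = X_t + alpha * (P - Q[X_t])."""
    X = np.zeros(len(P))
    for _ in range(steps):
        X += alpha * (P - model_averages(X, observables, configs))
    return X

# Tiny 2-unit example: observables sigma_0, sigma_1, sigma_0 * sigma_1.
configs = list(itertools.product([0, 1], repeat=2))
obs = [lambda s: s[0], lambda s: s[1], lambda s: s[0] * s[1]]
P = np.array([0.6, 0.4, 0.3])          # target data averages, all in (0, 1)
X = vanilla_gradient(P, obs, configs)
Q = model_averages(X, obs, configs)    # matches P at convergence
```

Since all $P_a$ lie strictly between 0 and 1, the fixed point exists and is unique, and the iteration drives $Q[X_t]$ onto $P$.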

  23-24. Maximal Entropy Models and the Vanilla (standard) Learning Algorithm. A 2-dimensional example:
$$\log L[u, v] = -\frac{a}{2}\,(u - u_\infty)^2 - \frac{b}{2}\,(v - v_\infty)^2$$
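This separable quadratic makes the motivation for a Newton-like method concrete: vanilla gradient uses one learning rate $\alpha$ for both directions, so stability is limited by the stiff direction ($\alpha < 2/a$) and the shallow direction crawls, while rescaling each component by its own curvature converges in a single step. A sketch with illustrative parameter values (not from the talk):

```python
import numpy as np

a, b = 10.0, 0.1          # curvatures of the two directions (a >> b)
u_inf, v_inf = 1.0, -2.0  # location of the maximum
grad = lambda u, v: np.array([-a * (u - u_inf), -b * (v - v_inf)])

# Vanilla gradient, 100 steps at alpha = 1/a (safe for the stiff direction):
x = np.zeros(2)
for _ in range(100):
    x = x + (1.0 / a) * grad(*x)
# u converges immediately, but the v-error only contracts by
# (1 - b/a) = 0.99 per step, so it is still ~37% of its initial size.

# Newton-like step: rescale by the inverse curvature, direction by direction.
xn = np.zeros(2) + np.array([1.0 / a, 1.0 / b]) * grad(0.0, 0.0)
```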
