Multiplicative chaos in random matrix theory and related fields
Christian Webb, Aalto University, Finland
ICMP 2018, Montréal – July 24, 2018

The GUE eigenvalue counting function

• Let $\lambda_1 \le \dots \le \lambda_N$ be the ordered eigenvalues of a GUE$(N)$ random matrix, normalized to have limiting spectrum $[-1, 1]$.

• For $x \in (-1, 1)$, let

    V_N(x) = \sum_{k=1}^{N} \mathbf{1}\{\lambda_k \le x\}.

• Consider the stochastic process $e^{\gamma V_N(x)} / \mathbb{E}\, e^{\gamma V_N(x)}$ for $x \in (-1, 1)$ and $\gamma \in \mathbb{R}$.

• Moments converge as $N \to \infty$:

  Theorem (Charlier 2017). Let $x_1, \dots, x_k \in (-1, 1)$ be fixed and distinct. Then

    \lim_{N \to \infty} \mathbb{E} \prod_{j=1}^{k} \frac{e^{\gamma V_N(x_j)}}{\mathbb{E}\, e^{\gamma V_N(x_j)}} = \prod_{1 \le p < q \le k} \left( \frac{1 - x_p x_q + \sqrt{1 - x_p^2}\,\sqrt{1 - x_q^2}}{|x_p - x_q|} \right)^{\frac{\gamma^2}{2\pi^2}}.

• Is there a process with such moments? Does $e^{\gamma V_N(x)} / \mathbb{E}\, e^{\gamma V_N(x)}$ converge to it? What would this say about the GUE?
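The objects on this slide are easy to simulate. A minimal numerical sketch (not from the talk; the normalization by $2\sqrt{N}$, the seed, and the function names are assumptions) samples a GUE matrix and evaluates the counting function $V_N$:

```python
import numpy as np

rng = np.random.default_rng(0)

def gue_eigenvalues(N):
    # Hermitian matrix with independent Gaussian entries; dividing by
    # 2*sqrt(N) puts the limiting spectrum on [-1, 1] (semicircle law).
    A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    H = (A + A.conj().T) / 2
    return np.linalg.eigvalsh(H) / (2 * np.sqrt(N))   # ascending order

def counting_function(eigs, x):
    # V_N(x) = #{k : lambda_k <= x}
    return int(np.searchsorted(eigs, x, side="right"))

eigs = gue_eigenvalues(200)
print(counting_function(eigs, 0.0))   # near N/2 by symmetry
```

Since $V_N$ is a nondecreasing step function, `searchsorted` on the sorted eigenvalues evaluates it in $O(\log N)$ per query.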

The limiting process – heuristics

• For $(Y_k)_{k=1}^{\infty}$ i.i.d. standard Gaussians and $(U_j)_{j=0}^{\infty}$ the Chebyshev polynomials of the second kind, let (formally):

    X(x) = \frac{1}{\pi} \sum_{k=1}^{\infty} \frac{Y_k}{\sqrt{k}}\, U_{k-1}(x) \sqrt{1 - x^2}.

• Covariance structure (formally): for $x, y \in (-1, 1)$,

    \mathbb{E}\, X(x) X(y) = \frac{1}{2\pi^2} \log \frac{1 - xy + \sqrt{1 - x^2}\,\sqrt{1 - y^2}}{|x - y|}.

• For $\mu_\gamma(x) = e^{\gamma X(x) - \frac{\gamma^2}{2} \mathbb{E}\, X(x)^2}$ (formally),

    \mathbb{E} \prod_{j=1}^{k} \mu_\gamma(x_j) = \prod_{1 \le p < q \le k} \left( \frac{1 - x_p x_q + \sqrt{1 - x_p^2}\,\sqrt{1 - x_q^2}}{|x_p - x_q|} \right)^{\frac{\gamma^2}{2\pi^2}}.

• Precisely the moments we want!

• But for each $x$, the sum defining $X(x)$ diverges almost surely and $\mathbb{E}\, X(x)^2 = \infty$. What does $\mu_\gamma$ mean?
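The covariance identity above can be checked numerically. With $x = \cos\theta$ one has $U_{k-1}(x)\sqrt{1-x^2} = \sin(k\theta)$, so the covariance becomes a sine series with a known closed form. A small sketch (the truncation level `K` and the sample points are arbitrary choices):

```python
import numpy as np

def cov_series(x, y, K=200_000):
    # Partial sum of (1/pi^2) * sum_k sin(k*theta) sin(k*phi) / k,
    # using U_{k-1}(cos t) * sin t = sin(k t).
    th, ph = np.arccos(x), np.arccos(y)
    k = np.arange(1, K + 1)
    return float(np.sum(np.sin(k * th) * np.sin(k * ph) / k)) / np.pi**2

def cov_closed(x, y):
    # Closed form from the slide: (1/(2 pi^2)) log((1 - xy + sqrt..sqrt..)/|x-y|)
    num = 1 - x * y + np.sqrt(1 - x**2) * np.sqrt(1 - y**2)
    return float(np.log(num / abs(x - y))) / (2 * np.pi**2)

print(cov_series(0.3, -0.5), cov_closed(0.3, -0.5))
```

For fixed $x \ne y$ the partial sums converge (conditionally, by oscillation), so the two numbers agree to several digits; at $x = y$ the series diverges, which is exactly the $\mathbb{E}\,X(x)^2 = \infty$ statement above.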

Gaussian multiplicative chaos – rigorous construction

• Problem: $X$ doesn't exist in a pointwise sense – $\mathbb{E}\, X(x)^2 = \infty$.

• "$\int_{-1}^{1} X(x) f(x)\, dx$" does make sense for smooth enough $f$ → the sum defining $X$ converges as a random generalized function, but how does one exponentiate such an object?

• Solution: regularize, and treat the exponential as a measure or distribution:

    X_N(x) = \frac{1}{\pi} \sum_{k=1}^{N} \frac{Y_k}{\sqrt{k}}\, U_{k-1}(x) \sqrt{1 - x^2},

    \langle \mu_\gamma, f \rangle := \lim_{N \to \infty} \int_{-1}^{1} f(x)\, \frac{e^{\gamma X_N(x)}}{\mathbb{E}\, e^{\gamma X_N(x)}}\, dx.

• One can check that for nice test functions $f$ and for $-\sqrt{2}\,\pi < \gamma < \sqrt{2}\,\pi$ the limits exist, as we are dealing with $L^2$-bounded martingales (in fact this works for $-2\pi < \gamma < 2\pi$, where the martingales are $L^p$-bounded).

• This procedure defines random measures/distributions. These are the objects we are after – their moments agree with the limiting GUE moments.
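The regularized construction can be mimicked by Monte Carlo. The sketch below is an illustration only (the truncation `N`, sample count `M`, test function `f`, grid, and seed are all assumptions, not from the talk); since $X_N(x)$ is Gaussian, the normalization $\mathbb{E}\, e^{\gamma X_N(x)} = e^{\gamma^2 \operatorname{Var} X_N(x)/2}$ is available in closed form:

```python
import numpy as np

rng = np.random.default_rng(1)
gamma, N, M = 0.5, 50, 2000          # gamma well inside the L^2 regime |gamma| < sqrt(2)*pi

xs = np.linspace(-0.99, 0.99, 400)   # grid avoiding the endpoints
f = 1 - xs**2                        # an arbitrary smooth test function

k = np.arange(1, N + 1)
th = np.arccos(xs)
# U_{k-1}(x) sqrt(1 - x^2) = sin(k arccos x): truncated field on the grid
basis = np.sin(np.outer(k, th)) / (np.pi * np.sqrt(k)[:, None])   # (N, len(xs))
X = rng.normal(size=(M, N)) @ basis                               # M samples of X_N

# Exact normalization: Var X_N(x) = sum_k basis_k(x)^2 for i.i.d. standard Y_k
var = np.sum(basis**2, axis=0)
weights = np.exp(gamma * X - gamma**2 * var / 2)   # e^{gamma X_N} / E e^{gamma X_N}

dx = xs[1] - xs[0]
mu_f = (weights * f).sum(axis=1) * dx   # M samples of <mu_gamma, f> (Riemann sum)
print(mu_f.mean())                      # close to the integral of f
```

Each normalized weight has mean one pointwise, so the sample average of $\langle \mu_\gamma, f \rangle$ should sit near $\int_{-1}^{1} f(x)\,dx$, while individual samples fluctuate: that fluctuation is the chaos measure.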

Real Gaussian multiplicative chaos – the general picture

• A centered log-correlated Gaussian field $G(x)$ is (formally) a Gaussian process on $\mathbb{R}^d$ with covariance

    C(x, y) := \mathbb{E}\, G(x) G(y) = -\log |x - y| + \text{continuous}.

• Under mild conditions on $C$, honest Gaussian processes $G_N$ with covariance converging to $C$ exist (Karhunen–Loève expansion, convolution, ...).
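As a concrete instance of this general picture (my example, not from the talk): on the unit circle, the random Fourier series $\sum_k (A_k \cos kt + B_k \sin kt)/\sqrt{k}$ with i.i.d. standard Gaussian coefficients has covariance $\sum_k \cos(k(s-t))/k = -\log(2\sin(|s-t|/2))$, which is $-\log|s-t|$ plus a continuous function. The deterministic covariance identity can be checked numerically (the truncation `K` is an arbitrary choice):

```python
import numpy as np

def circle_cov_partial(alpha, K=100_000):
    # Partial sum of sum_{k>=1} cos(k*alpha)/k, the covariance of the
    # truncated Fourier field at angular separation alpha.
    k = np.arange(1, K + 1)
    return float(np.sum(np.cos(k * alpha) / k))

def circle_cov_closed(alpha):
    # Closed form: -log(2 sin(|alpha|/2)) = -log|alpha| + continuous
    return float(-np.log(2 * np.sin(abs(alpha) / 2)))

print(circle_cov_partial(1.0), circle_cov_closed(1.0))
```

Here the truncated fields play the role of the honest Gaussian processes $G_N$: their covariances converge to the log-singular limit as the truncation grows.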
