On the Rényi Entropy of Log-Concave Sequences
James Melbourne, University of Minnesota, melbo013@umn.edu
Tomasz Tkocz, Carnegie Mellon University, ttkocz@andrew.cmu.edu
ISIT, June 8, 2020
Outline
M. & Tkocz, "Reversals of Rényi Entropy Inequalities under Log-Concavity," arXiv:2005.10930.
1. Definitions
2. Results
3. Methods
Definitions: Rényi Entropy
Definition: For $f$ a density with respect to a measure $\gamma$, and $\alpha \in (0,1) \cup (1,\infty)$,
$$h_{\alpha,\gamma}(f) = \frac{1}{1-\alpha} \log \int f^\alpha \, d\gamma, \qquad h_{\infty,\gamma}(f) = -\log \|f\|_{\gamma,\infty},$$
$$h_{1,\gamma}(f) = h_\gamma(f) = -\int f \log f \, d\gamma, \qquad h_{0,\gamma}(f) = \log \gamma(\mathrm{supp}(f)).$$
For $X \sim f$, $h_{\alpha,\gamma}(X) := h_{\alpha,\gamma}(f)$.
$\gamma$ = Lebesgue measure: $h_\alpha(f)$. $\gamma$ = counting measure: $H_\alpha(f)$.
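A minimal sketch (illustrative, not from the talk) of the counting-measure case: a Python helper computing $H_\alpha$ of a probability vector, including the limiting cases $\alpha \in \{0, 1, \infty\}$.

```python
# Illustrative helper (ours): Renyi entropy H_alpha of a pmf under counting
# measure, with the limiting cases alpha = 0, 1, infinity.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # restrict to the support
    if alpha == 0:
        return np.log(len(p))           # log of the support size
    if alpha == 1:
        return -np.sum(p * np.log(p))   # Shannon entropy
    if alpha == np.inf:
        return -np.log(p.max())         # min-entropy
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

# A fair coin has H_alpha = log 2 for every alpha.
print([round(renyi_entropy([0.5, 0.5], a), 4) for a in [0, 0.5, 1, 2, np.inf]])
```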
Definitions: Log-Concavity
Log-concavity on the integers: $f : \mathbb{Z} \to [0,\infty)$ with interval support is log-concave when
$$f(n)^2 \ge f(n+1) f(n-1).$$
Closed under convolution and under weak limits.
Examples: Bernoulli, Binomial, Poisson, Geometric, Hypergeometric.
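To make the condition concrete, a small check (the function name `is_log_concave` is ours): the condition holds for a Binomial pmf and fails for a bimodal mixture.

```python
# Illustrative check (function name ours): discrete log-concavity of a pmf
# given on consecutive integers with no internal zeros.
from math import comb

def is_log_concave(f):
    return all(f[i]**2 >= f[i-1] * f[i+1] for i in range(1, len(f) - 1))

n, p = 10, 0.3
binomial = [comb(n, k) * p**k * (1-p)**(n-k) for k in range(n + 1)]
print(is_log_concave(binomial))                  # True: Binomial is log-concave

print(is_log_concave([0.45, 0.05, 0.05, 0.45]))  # False: a "valley" violates it
```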
Definitions: Log-Concavity
Where it appears:
Combinatorics: if $\sum_{i=0}^n a_i x^i$ has real roots, then $\{a_i\}$ is log-concave. Stanley '89, Brenti '89, Brändén '14.
Convex Geometry: the Alexandrov-Fenchel inequality implies the "intrinsic volumes" associated to convex bodies are log-concave. Stanley '81, Amelunxen, Lotz, McCoy & Tropp '14, McCoy & Tropp '14.
Probability: theory of negative dependence. Joag-Dev & Proschan '83, Pemantle '00, Borcea, Brändén & Liggett '09.
Information Theory: maximum entropy properties of the Poisson distribution. Johnson '06, Johnson, Kontoyiannis & Madiman '11.
Definitions: Continuous Log-Concavity
$f : \mathbb{R}^d \to [0,\infty)$ is log-concave when $f((1-t)x + ty) \ge f^{1-t}(x) f^t(y)$.
A rich theory at the intersection of functional analysis, convex geometry, and probability, with connections to a multitude of fields: Statistics, Economics, and Physics, as well as Information Theory. For background on connections to information theory, see Madiman, M., & Xu '17.
Inspiration from the following:
Theorem (Bobkov & Madiman '11). For $X$ log-concave on $\mathbb{R}^d$ and $\alpha < \beta$,
$$h_\beta(X) \le h_\alpha(X) \le h_\beta(X) + d \log \frac{\alpha^{\frac{1}{\alpha-1}}}{\beta^{\frac{1}{\beta-1}}}. \qquad (1)$$
Results
Theorem (M. & Tkocz). For $X$ log-concave on $\mathbb{Z}$ and $\alpha \in [0,\infty]$,
$$H_\alpha(X) < H_\infty(X) + \log \alpha^{\frac{1}{\alpha-1}}, \qquad H(X) < H_\infty(X) + \log e.$$
Entropy is preserved under rearrangement; log-concavity is not.
Jensen's inequality $\Rightarrow$ $H_\infty(X) \le H_\alpha(X)$.
Theorem (M. & Tkocz). For $X$ a discrete distribution with a log-concave rearrangement on $\mathbb{Z}$ and $\alpha \in [0,\infty]$,
$$H_\infty(X) \le H_\alpha(X) < H_\infty(X) + \log \alpha^{\frac{1}{\alpha-1}}.$$
Results
Theorem (M. & Tkocz). For $X$ a discrete distribution with a log-concave rearrangement on $\mathbb{Z}$ and $\alpha \in [0,\infty]$,
$$H_\infty(X) \le H_\alpha(X) < H_\infty(X) + \log \alpha^{\frac{1}{\alpha-1}}.$$
The upper bound is strict but sharp: take Geometric($p$) with $p \to 0$.
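A numerical illustration (ours, using a long truncation of the pmf): for Geometric($p$), the gap $H_\alpha(X) - H_\infty(X)$ stays below $\log \alpha^{1/(\alpha-1)}$ and approaches it as $p \to 0$.

```python
# Numerical illustration (ours): for X ~ Geometric(p), pmf f(n) = (1-p)^n p,
# the gap H_alpha(X) - H_inf(X) approaches log(alpha^(1/(alpha-1))) as p -> 0.
import numpy as np

def gap(p, alpha, n_terms=10**6):
    f = p * (1 - p) ** np.arange(n_terms)    # truncated pmf; tail is negligible
    H_alpha = np.log(np.sum(f ** alpha)) / (1 - alpha)
    H_inf = -np.log(f.max())
    return H_alpha - H_inf

alpha = 3.0
bound = np.log(alpha) / (alpha - 1)          # = log alpha^(1/(alpha-1))
for p in [0.5, 0.1, 0.01, 0.001]:
    print(f"p={p}: gap={gap(p, alpha):.6f} < bound={bound:.6f}")
```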
Results
Corollary (M. & Tkocz). For $X, Y$ iid and log-concave,
$$H_\alpha(X - Y) < H_\alpha(X) + \log c(\alpha),$$
$$c(\alpha) = \begin{cases} 2\alpha^{\frac{1}{\alpha-1}}, & \text{if } \alpha \in (2,\infty], \\ \alpha^{\frac{1}{\alpha-1}}, & \text{if } \alpha \in (0,2]. \end{cases}$$
In particular, $H(X - Y) < H(X) + \log e$.
The upper bounds in the Theorem and Corollary are strict. Sharp when $\alpha \in \{2, \infty\}$: take $X$ to be Geometric($p$), $p \to 0$.
Methods: Technical Definitions
Definition (two-sided geometric distribution). A density $\varphi$ on $\mathbb{Z}$ is two-sided geometric for $p, q \in [0,1)$ when
$$\varphi(n) = \frac{(1-p)(1-q)}{1-pq}\, f(n), \quad \text{with} \quad f(n) = \begin{cases} p^n & \text{for } n \ge 0, \\ q^{-n} & \text{for } n \le 0. \end{cases}$$
Take $0^0 = 1$.
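A quick sketch (ours) that builds this density on a truncated window of $\mathbb{Z}$ and confirms the normalizing constant.

```python
# Illustrative construction (ours) of the two-sided geometric density on a
# truncated window of Z; it should sum to ~1 with the stated constant.
import numpy as np

def two_sided_geometric(p, q, N=2000):
    n = np.arange(-N, N + 1)
    f = np.where(n >= 0, p ** np.maximum(n, 0), q ** np.maximum(-n, 0))
    C = (1 - p) * (1 - q) / (1 - p * q)      # normalizing constant
    return n, C * f

n, phi = two_sided_geometric(0.6, 0.3)
print(phi.sum())                             # ~ 1.0
print(phi[n == 0][0], (1 - 0.6) * (1 - 0.3) / (1 - 0.18))  # mode value phi(0) = C
```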
Methods: Technical Definitions
Majorization: a density $f$ majorizes $g$, written $f \succ g$, when $\sum_{i=1}^k f_i^\downarrow \ge \sum_{i=1}^k g_i^\downarrow$ holds for all $k$, where $f_i^\downarrow$ denotes the decreasing rearrangement. $X \succ Y$ when $X \sim f$, $Y \sim g$, and $f \succ g$.
Schur-concavity: $\Phi$ is Schur-concave when
$$f \succ g \implies \Phi(f) \le \Phi(g). \qquad (2)$$
Rényi entropy is Schur-concave.
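An illustrative check of both definitions (the helper names are ours): confirm $f \succ g$ via sorted partial sums, and observe $H_2(f) \le H_2(g)$, consistent with Schur-concavity.

```python
# Illustrative helpers (ours): majorization via sorted partial sums, and a
# Schur-concavity check for H_2. Assumes pmfs of equal length.
import numpy as np

def majorizes(f, g, tol=1e-12):
    F = np.sort(np.asarray(f))[::-1].cumsum()
    G = np.sort(np.asarray(g))[::-1].cumsum()
    return np.all(F >= G - tol)

def renyi2(p):
    return -np.log(np.sum(np.asarray(p) ** 2))

f = [0.6, 0.25, 0.1, 0.05]
g = [0.4, 0.3, 0.2, 0.1]
print(majorizes(f, g))             # True: f is "more concentrated" than g
print(renyi2(f) <= renyi2(g))      # True, as Schur-concavity predicts
```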
Methods: Reduction
Lemma (M. & Tkocz). For $Y$ log-concave, there exists $X$ two-sided geometric with $Y \succ X$ and $H_\infty(X) = H_\infty(Y)$.
Figure: $Y \sim f$, $X \sim a$ (the plot shows $\log f$ together with the two geometric branches $\log a_-$ and $\log a_+$).
Methods: Reduction
Lemma (M. & Tkocz). For $Y$ log-concave, there exists $X$ two-sided geometric with $Y \succ X$ and $H_\infty(X) = H_\infty(Y)$.
By Schur-concavity of Rényi entropy, $H_\alpha(X) \ge H_\alpha(Y)$.
Problem reduced to the two-sided geometric:
$$H_\alpha(Y) - H_\infty(Y) \le H_\alpha(X) - H_\infty(X). \qquad (3)$$
Methods: Reduced Problem
Suffices to prove: for $p, q \in (0,1)$,
$$H_\alpha(X) - H_\infty(X) = \frac{1}{1-\alpha} \log \frac{\frac{1}{1-p^\alpha} + \frac{1}{1-q^\alpha} - 1}{\frac{1}{1-p} + \frac{1}{1-q} - 1} < \log \alpha^{\frac{1}{\alpha-1}}. \qquad (4)$$
After some algebra, it is enough to show (for $\alpha \ne 1$) that
$$F(\alpha) = \alpha \left( \frac{1}{1-p^\alpha} + \frac{1}{1-q^\alpha} - 1 \right)$$
is strictly increasing. Calculus and some substitutions show $F'(\alpha) > 0$; the case $\alpha = 1$ is a corollary of the argument.
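The monotonicity of $F$ is proven by calculus; as a sanity check (ours, not a proof), it can be probed numerically over random $(p, q)$.

```python
# Sanity check (ours, not a proof): F(alpha) looks strictly increasing for
# random p, q in (0, 1).
import numpy as np

def F(a, p, q):
    return a * (1 / (1 - p ** a) + 1 / (1 - q ** a) - 1)

alphas = np.linspace(0.05, 20, 2000)
rng = np.random.default_rng(0)
for _ in range(100):
    p, q = rng.uniform(0.01, 0.99, size=2)
    assert np.all(np.diff(F(alphas, p, q)) > 0), (p, q)
print("F increasing on all sampled (p, q)")
```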
Methods: Proof
Proof: Given $Y$, by the majorization argument there exists a two-sided geometric $X$ such that
$$H_\alpha(Y) - H_\infty(Y) \le H_\alpha(X) - H_\infty(X).$$
By direct argument,
$$H_\alpha(X) - H_\infty(X) < \log \alpha^{\frac{1}{\alpha-1}}.$$
Direct computation on the geometric distribution $f(n) = (1-p)^n p$ with $p \to 0$ yields equality in the limit.
Methods: Entropic Rogers-Shephard for Discrete Log-Concave Variables
Theorem (M. & Tkocz). For $X$ and $Y$ iid and log-concave on $\mathbb{Z}$,
$$H_\alpha(X - Y) < H_\alpha(X) + \log c(\alpha), \qquad (5)$$
for a universal $c(\alpha)$.
Proof sketch: $H_\infty(X - Y) = H_2(X)$. By Rényi entropy comparison, $H_\alpha(X - Y)$ can be compared to $H_\infty(X - Y) = H_2(X)$, which can be compared to $H_\alpha(X)$.
Methods: Entropic Rogers-Shephard for Discrete Log-Concave Variables
Theorem (M. & Tkocz). For $X$ and $Y$ iid and log-concave on $\mathbb{Z}$,
$$H_\alpha(X) \le H_\alpha(X - Y) \le H_\alpha(X) + \log c(\alpha), \qquad (6)$$
with $c(\alpha) = 2\alpha^{\frac{1}{\alpha-1}}$ for $\alpha > 2$ and $c(\alpha) = \alpha^{\frac{1}{\alpha-1}}$ for $\alpha \le 2$.
Consequences:
$H_2(X - Y) < H_2(X) + \log 2$ (sharp)
$H_\infty(X - Y) < H_\infty(X) + \log 2$ (sharp)
$H(X - Y) < H(X) + \log e$
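An illustrative check (ours, Binomial example) of the identity $H_\infty(X - Y) = H_2(X)$ behind the proof sketch, together with the $\alpha = 2$ bound.

```python
# Illustrative check (ours): the identity H_inf(X - Y) = H_2(X) and the
# alpha = 2 bound, for X, Y iid Binomial(12, 0.4).
import numpy as np
from math import comb

def renyi(p, alpha):
    p = p[p > 0]
    if alpha == np.inf:
        return -np.log(p.max())
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

n, pr = 12, 0.4
f = np.array([comb(n, k) * pr**k * (1 - pr)**(n - k) for k in range(n + 1)])
diff = np.convolve(f, f[::-1])   # pmf of X - Y (shifted; entropy is unaffected)

print(np.isclose(renyi(diff, np.inf), renyi(f, 2)))  # True: H_inf(X-Y) = H_2(X)
print(renyi(diff, 2) < renyi(f, 2) + np.log(2))      # True: the alpha = 2 bound
```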
Methods: Entropic Rogers-Shephard for Log-Concave Vectors
Conjecture (Madiman & Kontoyiannis '15). For $X$ and $Y$ iid log-concave random vectors in $\mathbb{R}^d$,
$$h(X - Y) \le h(X) + d \log 2. \qquad (7)$$
Rogers-Shephard '57:
$$h_0(X - Y) \le h_0(X) + \log \binom{2d}{d}, \qquad (8)$$
with equality for $X$ uniform on a simplex. Since $\binom{2d}{d} \sim 4^d$, this gives $h_0(X - Y) \lesssim h_0(X) + d \log 4$.
Methods: Entropic Rogers-Shephard for Log-Concave Vectors
Theorem (M. & Tkocz). For $X$ and $Y$ iid random vectors in $\mathbb{R}^d$ and $\alpha \in [2,\infty]$,
$$h_\alpha(X) \le h_\alpha(X - Y) \le h_\alpha(X) + d \log 2; \qquad (9)$$
when $\alpha \in (0,2)$,
$$h_\alpha(X) \le h_\alpha(X - Y) \le h_\alpha(X) + d \log \alpha^{\frac{1}{\alpha-1}}. \qquad (10)$$
For $\alpha \ge 2$, sharp for the exponential distribution (tensorize for the $d$-dimensional result).
Compare: $h(X - Y) \le h(X) + d \log e$ (Bobkov & Madiman '11), and $h_0(X - Y) \lesssim h_0(X) + d \log 4$ from Rogers-Shephard since $\binom{2d}{d} \sim 4^d$.
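A sharpness sketch in one dimension (ours; it assumes SciPy for the quadrature): for $X, Y$ iid Exp(1), $X - Y$ is Laplace, and the gap $h_\alpha(X - Y) - h_\alpha(X)$ equals $\log 2$ for every $\alpha$, matching the constant in (9).

```python
# Sharpness sketch in d = 1 (ours; uses scipy.integrate.quad): for X, Y iid
# Exp(1), X - Y is Laplace, and h_alpha(X - Y) - h_alpha(X) = log 2 exactly.
import numpy as np
from scipy import integrate

def h_alpha(pdf, alpha, lo, hi):
    val, _ = integrate.quad(lambda x: pdf(x) ** alpha, lo, hi)
    return np.log(val) / (1 - alpha)

exp_pdf = lambda x: np.exp(-x)                # Exp(1) density on [0, inf)
lap_pdf = lambda x: 0.5 * np.exp(-abs(x))     # density of X - Y: Laplace(1)

for a in [2.0, 3.0, 5.0]:
    gap = h_alpha(lap_pdf, a, -np.inf, np.inf) - h_alpha(exp_pdf, a, 0, np.inf)
    print(a, round(gap, 6), round(np.log(2), 6))
```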
Summary
For a broad class of discrete variables, Rényi entropies are equivalent up to an additive constant.
Deepens parallels between the discrete and continuous log-concavity theories.
Reversals of "Rényi entropy power inequalities."
Furthers connections between convex geometric and information theoretic inequalities.
The End
Thank you!
An open question:
Conjecture (M. & Tkocz). Let $(y_n)_{n=1}^N$ be a finite positive monotone and concave sequence, that is, $y_n \ge \frac{y_{n-1} + y_{n+1}}{2}$ for $1 < n < N$. Then for every $\gamma > 0$, the function
$$K(t) = (t + \gamma) \sum_{n=1}^N y_n^{t/\gamma}$$
is log-concave, that is, $\log K(t)$ is concave on $(-\gamma, +\infty)$.
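A numerical probe of the conjecture (ours, not a proof; it assumes the reconstructed form of $K$ above): check discrete concavity of $\log K$ on a grid for the concave sequence $y_n = \sqrt{n}$.

```python
# Numerical probe of the conjecture (ours, not a proof), assuming the form
# K(t) = (t + gamma) * sum_n y_n^(t/gamma) reconstructed above.
import numpy as np

y = np.sqrt(np.arange(1, 21))        # positive, increasing, concave sequence
gamma = 1.0

def logK(t):
    return np.log(t + gamma) + np.log(np.sum(y ** (t / gamma)))

ts = np.linspace(-0.9, 10, 500)
vals = np.array([logK(t) for t in ts])
# Concavity of log K <=> nonpositive second differences on a grid (up to rounding)
print(np.all(np.diff(vals, 2) <= 1e-9))
```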