
Adaptativity of Stochastic Gradient Descent - Aymeric Dieuleveut



  1. Setting

  Adaptativity of Stochastic Gradient Descent. Aymeric Dieuleveut, joint work with F. Bach: "Nonparametric stochastic approximation with large step-sizes", in the Annals of Statistics.

  Setting: random-design least-squares regression in an RKHS framework.

  Risk: for $g : \mathcal{X} \to \mathbb{R}$,
  $$\varepsilon(g) := \mathbb{E}_\rho\big[(g(X) - Y)^2\big].$$
  We thus want to minimize the prediction error.

  Regression function: $g_\rho(X) = \mathbb{E}[Y \mid X]$ minimizes $\varepsilon$ over $L^2_{\rho_X}$. We build a sequence $(g_k)$ of estimators in an RKHS $\mathcal{H}$.

  Why consider an RKHS?
  - It is a hypothesis space for nonparametric regression and high-dimensional problems ($d \gg n$).
  - It is an analysis framework: the natural setting when mapping data into feature space via a positive-definite kernel.
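To make the setting concrete, here is a minimal sketch, assuming a toy data distribution (the regression function, noise level, and names below are illustrative assumptions, not from the talk): it draws random-design pairs $(X, Y)$ and estimates the risk $\varepsilon(g)$ by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    """Draw random-design pairs (X, Y) with Y = g_rho(X) + noise (toy model)."""
    X = rng.uniform(-1.0, 1.0, size=n)
    Y = np.sin(np.pi * X) + 0.1 * rng.standard_normal(n)  # toy g_rho and noise
    return X, Y

def risk(g, n=100_000):
    """Monte Carlo estimate of eps(g) = E[(g(X) - Y)^2]."""
    X, Y = sample(n)
    return float(np.mean((g(X) - Y) ** 2))

# g_rho(x) = E[Y | X = x] minimizes the risk; any other g has larger risk.
print(risk(lambda x: np.sin(np.pi * x)))  # ~ 0.01, the noise variance
print(risk(lambda x: 0.0 * x))            # strictly larger
```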

  2. Regularity assumptions

  Algorithm (stochastic approximation): simple one-pass stochastic gradient descent with constant step size and averaging (a sketch follows below).

  Difficulty of the problem: let $\Sigma = \mathbb{E}[K_x K_x^\top]$ be the covariance operator. We assume:
  - $\mathrm{tr}(\Sigma^{1/\alpha}) < \infty$;
  - $g_\rho \in \Sigma^r(L^2_{\rho_X})$.

  The pair $(\alpha, r)$ encodes the difficulty of the problem.
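Here is a minimal sketch of the algorithm named on this slide, assuming a Gaussian kernel and the toy data from the previous snippet (kernel, bandwidth, and step size are illustrative choices, not the paper's): one pass of SGD in the RKHS with constant step size $\gamma$, keeping the averaged iterate $\bar g_n$.

```python
import numpy as np

rng = np.random.default_rng(1)

def kernel(x, y, bandwidth=0.5):
    """Gaussian positive-definite kernel on R (illustrative choice)."""
    return np.exp(-((x - y) ** 2) / (2 * bandwidth**2))

def kernel_sgd(X, Y, gamma=0.5):
    """One pass of SGD in the RKHS with constant step size and averaging.

    Iterate: g_k = g_{k-1} - gamma * (g_{k-1}(x_k) - y_k) * K_{x_k}, stored as
    kernel coefficients a; averaging g_bar_n = (1/n) sum_k g_k amounts to
    reweighting coefficient a_j by the fraction of iterates containing it.
    """
    n = len(X)
    a = np.zeros(n)
    for k in range(n):
        g_xk = a[:k] @ kernel(X[:k], X[k])  # evaluate g_{k-1} at x_k
        a[k] = -gamma * (g_xk - Y[k])       # new coefficient on K_{x_k}
    w = (n - np.arange(n)) / n              # averaging weights

    def g_bar(x):
        x = np.atleast_1d(x)
        return (a * w) @ kernel(X[:, None], x[None, :])

    return g_bar

# Usage with toy data as in the previous sketch:
X = rng.uniform(-1.0, 1.0, size=2000)
Y = np.sin(np.pi * X) + 0.1 * rng.standard_normal(2000)
g = kernel_sgd(X, Y)
print(np.mean((g(X) - np.sin(np.pi * X)) ** 2))  # small approximation error
```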

  3. Results

  Theorem (nonparametric regression): under a suitable choice of the learning rate, we get the optimal rate of convergence for nonparametric regression.

  Theorem (adaptativity in Euclidean spaces): if $\mathcal{H}$ is a $d$-dimensional Euclidean space, then
  $$\mathbb{E}\big[\varepsilon(\bar g_n) - \varepsilon(g_\rho)\big] \le \min_{1 \le \alpha,\; -\frac{1}{2} \le q \le \frac{1}{2}} \left( \frac{16\,\sigma^2\,\mathrm{tr}(\Sigma^{1/\alpha})\,(\gamma n)^{1/\alpha}}{n} + \frac{8\,\big\|\Sigma^{-q}\theta_{\mathcal{H}}\big\|^2_{\mathcal{H}}}{(n\gamma)^{2q+1}} \right).$$

  SGD is adaptive to the regularity of the objective function and to the decay of the spectrum of the covariance matrix. ⇒ This explains the behaviour for $d \gg n$.
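As a numeric illustration of the bound above (the constants $\sigma^2$, $\mathrm{tr}(\Sigma^{1/\alpha})$, and $\|\Sigma^{-q}\theta_{\mathcal{H}}\|^2$ below are made-up placeholders, not values from the paper), this sketch sweeps the step size $\gamma$ for fixed $(\alpha, q)$ and shows how the variance term $(\gamma n)^{1/\alpha}/n$ and the bias term $1/(n\gamma)^{2q+1}$ are balanced:

```python
import numpy as np

def bound(gamma, n, alpha, q, sigma2=1.0, trace=1.0, bias2=1.0):
    """Right-hand side of the Euclidean bound for fixed (alpha, q)."""
    variance = 16 * sigma2 * trace * (gamma * n) ** (1 / alpha) / n
    bias = 8 * bias2 / (n * gamma) ** (2 * q + 1)
    return variance + bias

n = 10_000
gammas = np.logspace(-4, 0, 400)
vals = np.array([bound(g, n, alpha=2.0, q=0.25) for g in gammas])
best = int(np.argmin(vals))
print(f"balancing gamma ~ {gammas[best]:.4g}, bound ~ {vals[best]:.4g}")
```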
