SLIDE 15 ⋆ We will invoke this when we discuss the proximal gradient descent algorithm
More Subgradient Calculus: Proximal Operator
The following functions are again convex but, again, may not be differentiable everywhere. How does one compute their subgradients at points of non-differentiability?
Infimum: If c(x,y) is convex in (x,y) and C is a convex set, then d(x) = inf_{y∈C} c(x,y) is convex in x.
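As a sanity check on the infimum rule, here is a minimal numerical sketch. The choices c(x,y) = (x − y)² + y² and C = [0,1] are illustrative assumptions, not from the slide: we approximate d(x) = inf_{y∈C} c(x,y) by a grid search over C and test midpoint convexity of d.

```python
import numpy as np

# Demo choices (assumptions): c(x, y) = (x - y)**2 + y**2, C = [0, 1].
# c is jointly convex in (x, y) and C is convex, so the infimum rule says
# d(x) = inf_{y in C} c(x, y) should be convex in x.

def c(x, y):
    return (x - y) ** 2 + y ** 2

def d(x, n_grid=2001):
    # Crude approximation of the infimum over a fine grid of C = [0, 1].
    ys = np.linspace(0.0, 1.0, n_grid)
    return np.min(c(x, ys))

# Midpoint-convexity check: d((a+b)/2) <= (d(a) + d(b)) / 2 for sampled pairs.
rng = np.random.default_rng(0)
ok = True
for _ in range(200):
    a, b = rng.uniform(-3.0, 3.0, size=2)
    if d(0.5 * (a + b)) > 0.5 * (d(a) + d(b)) + 1e-6:  # tolerance covers grid error
        ok = False
print(ok)
```

For this c one can also minimize over y in closed form (y* = x/2 clipped to [0,1]) and verify that the resulting piecewise-quadratic d has nondecreasing slopes, i.e., is convex.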
▶ Let d(x,C) denote the distance of a point x from a convex set C. That is,
d(x,C) = inf_{y∈C} ||x − y|| = ||x − P_C(x)||, where P_C(x) = argmin_{y∈C} ||x − y|| is the projection of x onto C.
Then d(x,C) is a convex function and, for x ∉ C,
∇d(x,C) = (x − P_C(x)) / ||x − P_C(x)||.
(A point in the intersection of convex sets C_1, C_2, ..., C_m can be found by minimizing the sum of the distances d(x, C_i): Subgradients and Alternating Projections.)
▶ The projection P_C(x) = argmin_{y∈C} ||x − y|| is a special case of the proximity operator prox_c(x) = argmin_y PROX_c(x) of a convex function c(x). Here, PROX_c(x) = c(y) + (1/2)||x − y||^2. The special case is when c(y) is the indicator function I_C(y), introduced earlier to eliminate the constraints of an optimization problem. (Proximal methods will be treated in detail later.)
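To make the special case concrete, here is a minimal sketch (the box C = [0,1]^n and the test points are assumptions chosen for the demo): the proximity operator of the indicator I_C is exactly the projection P_C, which for a box is coordinate-wise clipping, and the gradient formula for d(x,C) can be checked by finite differences.

```python
import numpy as np

def prox_indicator_box(x, lo, hi):
    # prox_{I_C}(x) = argmin_y I_C(y) + 0.5 * ||x - y||^2 = P_C(x);
    # for a box C = [lo, hi]^n this is coordinate-wise clipping.
    return np.clip(x, lo, hi)

def prox_numeric(x, lo, hi, n_grid=4001):
    # Brute-force argmin of 0.5 * (x - y)^2 over a grid of C (1-D sanity check).
    ys = np.linspace(lo, hi, n_grid)
    return ys[np.argmin(0.5 * (x - ys) ** 2)]

x = np.array([-1.7, 0.3, 2.5])
print(prox_indicator_box(x, 0.0, 1.0))   # each coordinate clipped into [0, 1]
print(prox_numeric(2.5, 0.0, 1.0))       # -> 1.0

def dist(x, lo, hi):
    return np.linalg.norm(x - prox_indicator_box(x, lo, hi))

def grad_dist(x, lo, hi):
    p = prox_indicator_box(x, lo, hi)
    return (x - p) / np.linalg.norm(x - p)   # valid for x outside C

# Finite-difference check of grad d(x, C) = (x - P_C(x)) / ||x - P_C(x)||.
x0 = np.array([2.0, -1.0, 0.5])              # a point outside C
g = grad_dist(x0, 0.0, 1.0)
eps = 1e-6
g_fd = np.array([(dist(x0 + eps * e, 0.0, 1.0) - dist(x0 - eps * e, 0.0, 1.0)) / (2 * eps)
                 for e in np.eye(3)])
print(np.allclose(g, g_fd, atol=1e-5))       # True
```

The brute-force grid search agrees with the closed-form clipping, which is why projections (and more generally prox steps with cheap closed forms) are the workhorse of the algorithms discussed later.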
Recall that ∂I_C(y) = N_C(y) = {h ∈ ℜ^n : h^T y ≥ h^T z for any z ∈ C}.
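This normal-cone characterization can be checked numerically (a minimal sketch; the box C = [0,1]^3 and the sample point are assumptions): at y = P_C(x), the vector h = x − y should lie in N_C(y), i.e., h^T y ≥ h^T z for every z ∈ C, which is exactly the optimality condition of the prox problem for c = I_C.

```python
import numpy as np

rng = np.random.default_rng(1)
lo, hi = 0.0, 1.0                       # box C = [0, 1]^3 (demo assumption)

x = np.array([2.0, -0.5, 0.7])          # a point outside C
y = np.clip(x, lo, hi)                  # y = P_C(x) = prox_{I_C}(x)
h = x - y                               # candidate normal-cone element

# h in N_C(y) iff h^T y >= h^T z for all z in C; test against random z in C.
zs = rng.uniform(lo, hi, size=(10000, 3))
print(bool(np.all(h @ y >= zs @ h - 1e-12)))   # True
```

Here h^T y = 1 is attained at a vertex of the box, and every sampled z in C gives a smaller (or equal) inner product, consistent with h ∈ N_C(y).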
The subdifferential (with respect to y) is ∂PROX_c(x) = ∂c(y) + y − x, which can now be obtained for the special case c(y) = I_C(y).
September 1, 2018