We consider the problem of minimizing a function with Lipschitz continuous gradient over a proximally smooth, smooth manifold in a finite-dimensional Euclidean space, and study the Lezanski-Polyak-Lojasiewicz (LPL) conditions for this constrained optimization problem.
No! Suppose f is convex and has a Lipschitz continuous gradient with Lipschitz constant L_f. Define a new function in a transformed coordinate system: f̃(z) ≜ f(Tz). Using the properties on p. 3.6, this function also has a Lipschitz continuous gradient, say with constant L̃, and applying GD to f̃, i.e., z_{k+1} = z_k − (1/L̃) ∇f̃(z_k), is equivalent to the preconditioned update x_{k+1} = x_k − (1/L̃) P ∇f(x_k), where x_k ≜ T z_k and P ≜ TT′.
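The effect of the coordinate transform on the gradient Lipschitz constant can be checked numerically. The sketch below uses a quadratic f(x) = (1/2) x′Ax (an illustrative choice, not from the notes), for which ∇f̃(z) = T′AT z by the chain rule, so the Lipschitz constant of ∇f̃ is the largest eigenvalue of T′AT:

```python
import numpy as np

# Illustrative sketch (not from the notes): for a quadratic f(x) = 0.5 x' A x,
# the gradient Lipschitz constant is L_f = lambda_max(A).  The transformed
# function ftilde(z) = f(T z) has gradient T' A T z, so its Lipschitz constant
# is lambda_max(T' A T).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A.T @ A                      # symmetric positive semidefinite
T = rng.standard_normal((4, 4))  # coordinate transform

L_f = np.linalg.eigvalsh(A).max()
L_tilde = np.linalg.eigvalsh(T.T @ A @ T).max()

# Check ||grad ftilde(z1) - grad ftilde(z2)|| <= L_tilde ||z1 - z2||
g = lambda z: T.T @ A @ (T @ z)  # gradient of ftilde via the chain rule
z1, z2 = rng.standard_normal(4), rng.standard_normal(4)
lhs = np.linalg.norm(g(z1) - g(z2))
rhs = L_tilde * np.linalg.norm(z1 - z2)
print(lhs <= rhs + 1e-12)  # prints True
```

The inequality holds for any z1, z2 because T′AT is symmetric positive semidefinite, so its spectral norm equals its largest eigenvalue.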
Here we review the general form of gradient descent (GD) for convex minimization problems; the least-squares (LS) application is simply a special case. We focus initially on the numerous SIPML applications where the cost function is convex and smooth, meaning it has a Lipschitz continuous gradient.
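A minimal sketch of the GD iteration x_{k+1} = x_k − (1/L) ∇f(x_k), applied to the LS special case f(x) = (1/2)‖Ax − y‖²; the data and function names below are illustrative assumptions, not taken from the notes:

```python
import numpy as np

def gd(grad, x0, L, niter):
    """Gradient descent with the classical step size 1/L,
    where L is a Lipschitz constant of grad."""
    x = x0.copy()
    for _ in range(niter):
        x = x - (1.0 / L) * grad(x)
    return x

# Illustrative LS problem: f(x) = 0.5 ||A x - y||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
y = rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2        # ||A||_2^2 is a Lipschitz constant of grad
grad = lambda x: A.T @ (A @ x - y)   # gradient of the LS cost

x_gd = gd(grad, np.zeros(5), L, niter=2000)
x_star = np.linalg.lstsq(A, y, rcond=None)[0]
print(np.allclose(x_gd, x_star, atol=1e-6))
```

With the 1/L step size the iterates converge to the LS solution; larger steps (up to 2/L) can also converge but without the standard descent guarantee.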
where L is the Lipschitz constant of the gradient ∇f(x). In contrast, Nesterov's fast gradient method (p. 3.40) has a worst-case cost function decrease at rate at least O(1/k²), which can be improved (and has been) by only a constant factor.
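A hedged sketch of Nesterov's fast gradient method (the t_k momentum variant) on an LS cost; the problem data here are illustrative, not from the notes:

```python
import numpy as np

def fgm(grad, x0, L, niter):
    """Nesterov fast gradient method: O(1/k^2) worst-case cost decrease."""
    x = x0.copy()
    z = x0.copy()
    t = 1.0
    for _ in range(niter):
        x_new = z - (1.0 / L) * grad(z)                 # GD step at momentum point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Illustrative LS problem: f(x) = 0.5 ||A x - y||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 8))
y = rng.standard_normal(30)
L = np.linalg.norm(A, 2) ** 2
grad = lambda x: A.T @ (A @ x - y)
f = lambda x: 0.5 * np.linalg.norm(A @ x - y) ** 2

x_star = np.linalg.lstsq(A, y, rcond=None)[0]
gap_fgm = f(fgm(grad, np.zeros(8), L, niter=500)) - f(x_star)
print(f"FGM optimality gap after 500 iterations: {gap_fgm:.2e}")
```

The classical bound f(x_k) − f(x*) ≤ 2L‖x_0 − x*‖²/(k+1)² guarantees the gap shrinks quadratically in k, versus the O(1/k) rate of plain GD with the same per-iteration cost of one gradient evaluation.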