Category Archives: Convex Optimization
A very nice reference for this post is [this blog post]. Mirror Descent is a first-order algorithm for solving a convex optimization problem. In this post we will go through several equivalent formulations of this algorithm and discuss how they are related to each other.
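To make the algorithm concrete before diving into the formulations, here is a minimal sketch, my addition rather than the post's own code: it assumes the negative-entropy mirror map on the probability simplex, for which the mirror descent update reduces to the exponentiated-gradient rule.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta, n_steps):
    """Mirror descent on the probability simplex with the
    negative-entropy mirror map (a common illustrative choice;
    the post may work with a different setup).

    With this mirror map the update is the exponentiated-gradient
    rule: x_{t+1} is proportional to x_t * exp(-eta * grad(x_t)).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x * np.exp(-eta * grad(x))  # gradient step in the dual (mirror) space
        x = x / x.sum()                 # Bregman projection back onto the simplex
    return x

# Toy usage: minimize the linear function f(x) = <c, x> over the simplex;
# the minimizer puts all mass on the smallest coordinate of c.
c = np.array([0.3, 0.1, 0.6])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, eta=0.5, n_steps=200)
print(x_star)  # approximately [0, 1, 0]
```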
A recurring trick in optimization is the characterization of the Lagrange dual problem in terms of the much simpler Fenchel duality. This trick is used in dual methods for Machine Learning and Signal Processing, in ADMM, in the Augmented Lagrangian method, etc.
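As an illustration of the trick (a standard textbook computation, not the post's own derivation): for the equality-constrained problem $\min_x f(x)$ subject to $Ax = b$, the Lagrange dual function collapses to an expression in the Fenchel conjugate $f^*(y) = \sup_x \{\langle y, x\rangle - f(x)\}$:

$$ g(\nu) = \inf_x \left\{ f(x) + \nu^\top (Ax - b) \right\} = -b^\top \nu - \sup_x \left\{ \langle -A^\top \nu, x \rangle - f(x) \right\} = -b^\top \nu - f^*(-A^\top \nu). $$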
Strong convexity provides an improved lower bound for a convex function; we are therefore entitled to expect an improved rate of convergence for projected gradient descent (PGD). In this post we will find out how much we actually gain.
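For reference, the improved lower bound in question is the standard strong-convexity inequality, with $\alpha > 0$ the strong-convexity parameter:

$$ f(y) \ge f(x) + \nabla f(x)^\top (y - x) + \frac{\alpha}{2} \|y - x\|^2 \quad \text{for all } x, y. $$

Compared with the plain convexity bound (the case $\alpha = 0$), the quadratic term pins the function down from below, and this is what drives the faster rates.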
In this post we find the dimension-free oracle complexity for the following problem class: the function $f$ is $L$-Lipschitz on $\mathcal{X}$ (with respect to the Euclidean norm), i.e. $|f(x) - f(y)| \le L \|x - y\|$ for any $x, y \in \mathcal{X}$, and $\mathcal{X}$ is contained in the Euclidean ball of radius $R$.
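For orientation, the classical dimension-free guarantee for this class, achieved by projected subgradient descent with step size $\eta = \frac{R}{L\sqrt{t}}$, is:

$$ f\Big( \frac{1}{t} \sum_{s=1}^{t} x_s \Big) - \min_{x \in \mathcal{X}} f(x) \le \frac{R L}{\sqrt{t}}. $$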
To learn convex optimization methods and organize this structure in my head, I decided to write the material down by topic. The focus will be on (constrained) large-scale, or dimension-free, optimization, corresponding to first-order methods, with applications to machine learning in mind. I don’t claim any original contribution … Continue reading
A useful little fact for constrained optimization. Let $\mathcal{X}$ be a convex set, and consider a point $x$ and its projection $\Pi_{\mathcal{X}}(x)$ on $\mathcal{X}$. As quickly follows from the separation theorem with a hyperplane containing $\Pi_{\mathcal{X}}(x)$, for any $y \in \mathcal{X}$ the angle between $x - \Pi_{\mathcal{X}}(x)$ and $y - \Pi_{\mathcal{X}}(x)$ is obtuse. … Continue reading
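In inner-product form, with the notation above, the fact reads:

$$ \langle x - \Pi_{\mathcal{X}}(x),\; y - \Pi_{\mathcal{X}}(x) \rangle \le 0 \quad \text{for all } y \in \mathcal{X}, $$

which in particular gives $\|\Pi_{\mathcal{X}}(x) - y\|^2 \le \|x - y\|^2 - \|x - \Pi_{\mathcal{X}}(x)\|^2$, the inequality usually invoked in convergence proofs for projected methods.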
If a function $f$ has a $\beta$-Lipschitz gradient, i.e. $\|\nabla f(x) - \nabla f(y)\| \le \beta \|x - y\|$ for any $x$ and $y$, then $|f(y) - f(x) - \nabla f(x)^\top (y - x)| \le \frac{\beta}{2} \|y - x\|^2$.
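A one-line justification, assuming the quadratic bound above is indeed the intended conclusion: by the fundamental theorem of calculus and Cauchy-Schwarz,

$$ \big| f(y) - f(x) - \nabla f(x)^\top (y - x) \big| = \Big| \int_0^1 \langle \nabla f(x + t(y - x)) - \nabla f(x),\, y - x \rangle \, dt \Big| \le \int_0^1 t \beta \|y - x\|^2 \, dt = \frac{\beta}{2} \|y - x\|^2. $$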