# Category Archives: Course on Convex Optimization

## Formulations of Mirror Descent

A very nice reference for this post is [this blog post]. Mirror descent is a first-order algorithm for solving a convex optimization problem $\min_{x \in \mathcal{X}} f(x)$. In this post we will learn several equivalent formulations of this algorithm and discuss how they are related to each other.
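For orientation, here is a sketch of two of the standard equivalent formulations (my notation: $\Phi$ is a mirror map, $D_\Phi$ its Bregman divergence, $\eta > 0$ a step size; the post's own notation may differ):

```latex
% Proximal formulation: a linearized step regularized by a Bregman divergence
x_{t+1} = \operatorname*{argmin}_{x \in \mathcal{X}}
  \Big\{ \eta \, \langle g_t, x \rangle + D_\Phi(x, x_t) \Big\},
  \qquad g_t \in \partial f(x_t).

% Mirror-map formulation: gradient step in the dual, then Bregman projection
\nabla \Phi(y_{t+1}) = \nabla \Phi(x_t) - \eta \, g_t,
  \qquad
x_{t+1} = \operatorname*{argmin}_{x \in \mathcal{X}} D_\Phi(x, y_{t+1}).
```

With $\Phi(x) = \tfrac{1}{2}\|x\|_2^2$ both formulations reduce to projected gradient descent, which is one way to see that they describe the same algorithm.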

## Projected Gradient Descent III — Strong convexity

Strong convexity provides an improved lower bound for a convex function; we are therefore entitled to expect an improved rate of convergence for PGD. In this post we will find out how much we actually gain.
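As a reference point, the standard rates for projected subgradient descent on a set of radius $R$ read as follows (the exact constants depend on the step-size and averaging scheme, and may differ from those derived in the post):

```latex
% Convex, L-Lipschitz: a 1/sqrt(t) rate with a dimension-free constant
f\!\left(\tfrac{1}{t}\textstyle\sum_{s=1}^{t} x_s\right) - f(x^\star)
  \;\le\; \frac{RL}{\sqrt{t}}.

% Additionally alpha-strongly convex: the rate improves to 1/t
f\!\left(\textstyle\sum_{s=1}^{t} \tfrac{2s}{t(t+1)} x_s\right) - f(x^\star)
  \;\le\; \frac{2L^2}{\alpha\,(t+1)}.
```

Note that strong convexity also removes the dependence on the radius $R$ from the bound.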

## Projected Gradient Descent I — Lipschitz functions

In this post we find the dimension-free oracle complexity for the following problem class: the function $f$ is $L$-Lipschitz on $\mathcal{X}$ (with respect to the Euclidean norm), i.e. $|f(x) - f(y)| \le L \|x - y\|$ for any $x, y \in \mathcal{X}$, and $\mathcal{X}$ is contained in the Euclidean ball of radius $R$.
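A minimal sketch of projected subgradient descent for this problem class, on a toy instance of my own (the objective, set, and step-size choice below are illustrative assumptions, not taken from the post):

```python
import numpy as np

def project_ball(x, R=1.0):
    """Euclidean projection onto the ball of radius R centered at the origin."""
    n = np.linalg.norm(x)
    return x if n <= R else (R / n) * x

def projected_subgradient(subgrad, x0, R, L, T):
    """Projected subgradient descent with the fixed step eta = R / (L * sqrt(T)).

    Returns the average iterate; for a convex L-Lipschitz objective on a set
    of radius R its suboptimality is O(RL / sqrt(T)), independent of dimension.
    """
    x = x0.copy()
    avg = np.zeros_like(x0)
    eta = R / (L * np.sqrt(T))
    for _ in range(T):
        avg += x / T
        g = subgrad(x)
        x = project_ball(x - eta * g, R)
    return avg

# Toy instance: f(x) = ||x - c||_2 is 1-Lipschitz; its minimum over the
# unit ball (for c outside the ball) is attained at c / ||c||.
c = np.array([3.0, 4.0])
subgrad = lambda x: (x - c) / max(np.linalg.norm(x - c), 1e-12)
x_avg = projected_subgradient(subgrad, np.zeros(2), R=1.0, L=1.0, T=2000)
```

Averaging the iterates (rather than returning the last one) is what the standard analysis of this method actually controls.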

## Black box approach to optimization

To learn convex optimization methods and organize this structure in my head, I decided to write the material down by topics. The focus will be on (constrained) large-scale, or dimension-free, optimization corresponding to first-order methods, with applications to machine learning in mind. I don't claim any original contribution …

## Projection on a convex set is closer to any point of the set

A useful little fact for constrained optimization. Let $\mathcal{X}$ be a convex set, and consider a point $y$ and its projection $\Pi_{\mathcal{X}}(y)$ on $\mathcal{X}$. As quickly follows from the separation theorem with a hyperplane containing $\Pi_{\mathcal{X}}(y)$, for any $x \in \mathcal{X}$ the angle between $y - \Pi_{\mathcal{X}}(y)$ and $x - \Pi_{\mathcal{X}}(y)$ is obtuse. …
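A quick numerical sanity check of both the obtuse-angle property and the fact in the title, using projection onto a Euclidean ball as the convex set (the specific set and test points are my own illustrative choices):

```python
import numpy as np

def project_ball(y, R=1.0):
    """Euclidean projection onto the ball of radius R (a convex set)."""
    n = np.linalg.norm(y)
    return y if n <= R else (R / n) * y

rng = np.random.default_rng(0)
y = rng.normal(size=3) * 5.0   # a point, here typically outside the ball
p = project_ball(y)            # its projection on the set

max_inner = -np.inf
dist_gap = np.inf
for _ in range(1000):
    x = project_ball(rng.normal(size=3))  # an arbitrary point of the set
    # Obtuse angle between y - p and x - p  <=>  <y - p, x - p> <= 0.
    max_inner = max(max_inner, np.dot(y - p, x - p))
    # Consequence (the title): the projection p is closer to x than y is.
    dist_gap = min(dist_gap, np.linalg.norm(y - x) - np.linalg.norm(p - x))
```

The second property follows from the first by expanding $\|y - x\|^2 = \|y - p\|^2 + \|p - x\|^2 + 2\langle y - p, p - x\rangle \ge \|p - x\|^2$.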