 PhD student with interests in statistics, optimization, and machine learning.

## Thresholding for sparse signals. Square-Root Lasso

In this post we continue with soft thresholding. Our estimator is an optimal solution (unique, thanks to strong convexity) of a convex program; note that the noise variance enters the observation model explicitly. We are interested … Continue reading
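The soft-thresholding operator the excerpt refers to has a simple closed form. Below is a generic sketch in NumPy (the function name is mine, not from the post): `soft_threshold` is the elementwise minimizer of $\tfrac{1}{2}(t - y)^2 + \lambda |t|$.

```python
import numpy as np

def soft_threshold(y, lam):
    """Closed-form minimizer of 0.5*(t - y)**2 + lam*|t|, applied elementwise.

    Shrinks each entry of y toward zero by lam, and sets entries with
    magnitude below lam exactly to zero (hence the induced sparsity).
    """
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)
```

Entries smaller than the threshold are killed exactly, which is why this map appears whenever an $\ell_1$ penalty is involved.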


## Thresholding over the l1-ball

I have long wanted to write a post giving an elementary introduction to soft and hard thresholding. The time has come to pay my dues, so here we go.
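To make the two operators concrete, here is a minimal sketch (names and the bisection-free projection routine are my own, not taken from the post): hard thresholding keeps large entries untouched, and soft thresholding with a suitably chosen level gives the Euclidean projection onto the $\ell_1$-ball.

```python
import numpy as np

def hard_threshold(y, lam):
    # keep entries whose magnitude exceeds lam; zero out the rest
    return np.where(np.abs(y) > lam, y, 0.0)

def project_l1_ball(y, radius=1.0):
    """Euclidean projection onto {x : ||x||_1 <= radius} via soft thresholding."""
    if np.abs(y).sum() <= radius:
        return y.copy()          # already inside the ball
    # sort magnitudes, then find the soft-threshold level tau such that
    # sum_i max(|y_i| - tau, 0) = radius
    u = np.sort(np.abs(y))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(y) + 1) > css - radius)[0][-1]
    tau = (css[k] - radius) / (k + 1)
    return np.sign(y) * np.maximum(np.abs(y) - tau, 0.0)
```

Note the contrast: hard thresholding is a projection onto an $\ell_0$-type constraint set, while the $\ell_1$ projection is soft thresholding at a data-dependent level.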

## Formulations of Mirror Descent

A very nice reference for this post is [this blog post]. Mirror Descent is a first-order algorithm for solving a convex optimization problem $\min_{x \in \mathcal{X}} f(x)$, where $f$ is convex and $\mathcal{X}$ is a convex set. In this post we will go through several equivalent formulations of the algorithm and discuss how they relate to each other.
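As a concrete instance of one such formulation: with the negative-entropy mirror map on the probability simplex, mirror descent reduces to multiplicative updates followed by normalization (a minimal sketch; the function names and parameters are mine, not from the post).

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps=100, eta=0.1):
    """Mirror descent with the negative-entropy mirror map on the simplex.

    The mirror step x <- x * exp(-eta * grad(x)) followed by normalization
    is exactly the entropic (KL) Bregman projection back onto the simplex,
    i.e. exponentiated gradient / multiplicative weights.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()
    return x

# usage sketch: minimize a linear function <c, x> over the simplex;
# the mass should concentrate on the smallest coordinate of c
c = np.array([1.0, 0.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

The same template with the mirror map $\tfrac{1}{2}\|x\|_2^2$ recovers plain projected gradient descent, which is one way the equivalent formulations differ only in the choice of geometry.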

## Brunn-Minkowski, Prékopa-Leindler, and isoperimetry

I found a great overview of the subject, with a lot of insight. Another brief introduction is here.
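For reference, the two inequalities in the title (statements in their standard form, not quoted from the linked overview):

```latex
% Brunn--Minkowski: for nonempty compact $A, B \subset \mathbb{R}^n$,
\operatorname{vol}(A + B)^{1/n} \;\ge\; \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}.

% Pr\'ekopa--Leindler: for $\lambda \in (0,1)$ and measurable $f, g, h \ge 0$
% satisfying $h(\lambda x + (1-\lambda) y) \ge f(x)^{\lambda} g(y)^{1-\lambda}$ for all $x, y$,
\int_{\mathbb{R}^n} h \;\ge\; \Big(\int_{\mathbb{R}^n} f\Big)^{\lambda} \Big(\int_{\mathbb{R}^n} g\Big)^{1-\lambda}.
```

Prékopa–Leindler is a functional strengthening of Brunn–Minkowski: applying it to indicator functions of $A$ and $B$ recovers the dimension-free form of the latter.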

## Lagrange duality via the Fenchel conjugate. The dual of ERM

A recurring trick in optimization is characterizing the Lagrange dual problem in terms of the much simpler Fenchel conjugate. This trick underlies dual methods in machine learning and signal processing: ADMM, the augmented Lagrangian method, and so on.
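As a concrete instance of the trick (a standard derivation sketched in my own notation, not necessarily the post's), consider $\ell_2$-regularized ERM; conjugating each loss term turns the Lagrange dual into a clean expression in the Fenchel conjugates $\ell_i^*$:

```latex
% Primal ($\ell_2$-regularized ERM):
\min_{w} \; \sum_{i=1}^{n} \ell_i(x_i^\top w) + \frac{\lambda}{2}\|w\|_2^2.
% Introduce $z_i = x_i^\top w$ and dualize the constraints with multipliers $\alpha_i$:
% minimizing the Lagrangian over each $z_i$ produces $-\ell_i^*(\alpha_i)$, and
% minimizing over $w$ gives $w = -\tfrac{1}{\lambda}\sum_i \alpha_i x_i$, so the dual is
\max_{\alpha} \; -\sum_{i=1}^{n} \ell_i^*(\alpha_i) - \frac{1}{2\lambda}\Big\|\sum_{i=1}^{n} \alpha_i x_i\Big\|_2^2.
```

The Lagrangian minimization splits coordinate-wise in $z$, which is exactly why the dual only involves the one-dimensional conjugates $\ell_i^*$.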


## ‘Shear’ convex polyhedra: solution

Here I describe the solution to the problem from the last post. Spoilers!