### My personal webpage


# Author Archives: Dmitry Ostrovsky

## Thresholding for sparse signals. Square-Root Lasso

In this post we continue with soft thresholding. Our estimator is an optimal solution (unique due to strong convexity) to $\min_{\theta} \tfrac{1}{2}\|y - \theta\|_2^2 + \lambda\|\theta\|_1$, and the observation model is $y = \theta^* + \xi$ with $\xi \sim \mathcal{N}(0, \sigma^2 I_n)$; note that we included the noise variance in $\xi$. We are interested … Continue reading
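As a standalone illustration (mine, not code from the post), the soft-thresholding operator — the closed-form solution of the $\ell_1$-penalized least-squares prox step — in NumPy:

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator S_lam(x) = sign(x) * max(|x| - lam, 0):
    the closed-form prox of lam * ||.||_1, which shrinks every coordinate
    toward zero and zeroes out the small ones."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Coordinates with |x_i| <= lam are killed; the rest are shrunk by lam.
x = np.array([3.0, -0.5, 1.2, -2.0])
print(soft_threshold(x, 1.0))
```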

Posted in Sparsity, Statistics, Uncategorized

## Thresholding over the l1-ball

For a long time I have wanted to write a post with an elementary introduction to soft / hard thresholding. The time has come to pay my dues, so here we go.
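A minimal sketch (mine, not from the post) of the hard-thresholding operator, the all-or-nothing counterpart of soft thresholding:

```python
import numpy as np

def hard_threshold(x, lam):
    """Hard thresholding: keep coordinates with |x_i| > lam, zero out the rest.
    Unlike soft thresholding, the surviving coordinates are not shrunk."""
    return np.where(np.abs(x) > lam, x, 0.0)

# Small coordinates vanish; large ones pass through unchanged.
print(hard_threshold(np.array([3.0, -0.5, 1.2, -2.0]), 1.0))
```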

Posted in Memos, Sparsity, Statistics

## Formulations of Mirror Descent

A very nice reference for this post is [this blog post]. Mirror Descent is a first-order algorithm for solving a convex optimization problem $\min_{x \in X} f(x)$, where $X$ is a convex set. In this post we will go through several equivalent formulations of this algorithm and discuss how they are related to each other.
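As a concrete illustration (my own sketch, not the post's material), here is one well-known instance: entropic Mirror Descent on the probability simplex, whose mirror step becomes a multiplicative-weights update:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, steps, eta):
    """Entropic mirror descent on the probability simplex:
    x <- x * exp(-eta * grad(x)), renormalized to sum to 1.
    This is the mirror step for the negative-entropy mirror map."""
    x = x0.copy()
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()
    return x

# Example: minimize the linear function f(x) = <c, x> over the simplex;
# the optimum puts all mass on the smallest coordinate of c.
c = np.array([0.5, 0.2, 0.9])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, steps=200, eta=0.5)
```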

Posted in Course on Convex Optimization, Memos

## Brunn-Minkowski, Prékopa-Leindler, and isoperimetry

Found a great overview of the subject with a lot of insight. Another brief intro is here.

## Lagrange duality via the Fenchel conjugate. The dual of ERM

A recurring trick in optimization is the characterization of the Lagrange dual problem in terms of the much simpler Fenchel conjugate. This trick is used in dual methods for Machine Learning and Signal Processing, in ADMM, in the Augmented Lagrangian method, etc.
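In its simplest form (a sketch in standard notation, assumed rather than taken from the post): for the linearly constrained problem $\min_x f(x)$ subject to $Ax = b$, the Lagrange dual function reduces to a Fenchel conjugate:

```latex
g(\lambda) = \inf_x \Big\{ f(x) + \lambda^\top (Ax - b) \Big\}
           = -\sup_x \Big\{ \langle -A^\top \lambda, x \rangle - f(x) \Big\} - \lambda^\top b
           = -f^*(-A^\top \lambda) - \lambda^\top b,
```

so the dual problem $\max_\lambda g(\lambda)$ is expressed entirely through the conjugate $f^*$.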

Posted in Optimization tricks

## ‘Shear’ convex polyhedra: solution

Here I describe the solution to the problem from the last post. Spoilers!

## ‘Shear’ convex polyhedra

I came up with what seems to be quite a beautiful geometric construction solving the problem described below. On a compact convex set, consider the function where we denote In other words, "hanging" at the point we measure the "width" of the … Continue reading