Author Archives: Dmitry Ostrovsky
In this post we continue with soft thresholding. Our estimator is the optimal solution (unique due to strong convexity) of the penalized problem, under the observation model stated in the post; note that we include the noise variance in the penalty. We are interested …
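The closed form behind soft thresholding is easy to sketch. Here is a minimal illustration, assuming the standard $\ell_1$-penalized denoising objective $\tfrac12\|x - y\|_2^2 + \lambda\|x\|_1$ (the excerpt elides the exact problem, so this objective is an assumption):

```python
import numpy as np

def soft_threshold(y, lam):
    """Proximal operator of lam * ||.||_1.

    Solves min_x 0.5 * ||x - y||^2 + lam * ||x||_1 coordinate-wise:
    each coordinate of y is shrunk toward zero by lam, and clipped at zero.
    NOTE: illustrative sketch; the post's actual problem may differ.
    """
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Example: coordinates with |y_i| <= lam are set exactly to zero.
y = np.array([3.0, -0.5, 1.0])
x_hat = soft_threshold(y, 1.0)  # -> [2.0, 0.0, 0.0]
```

The strong convexity mentioned above comes from the quadratic term, which is why the minimizer is unique.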
For a long time I have wanted to write a post giving an elementary introduction to soft/hard thresholding. The time has come to pay my dues, so here we go.
A very nice reference for this post is [this blog post]. Mirror Descent is a first-order algorithm for solving a convex optimization problem $\min_{x \in X} f(x)$. In this post we will go through several equivalent formulations of this algorithm and discuss how they are related to each other.
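One concrete instance of Mirror Descent is entropic mirror descent (exponentiated gradient) over the probability simplex. A minimal sketch, where the mirror map (negative entropy), the step size, and the linear objective are all illustrative assumptions rather than anything taken from the post:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta, steps):
    """Entropic mirror descent on the probability simplex.

    With the negative-entropy mirror map, the update is multiplicative:
    x <- x * exp(-eta * grad(x)), followed by renormalization.
    Illustrative sketch; step-size schedule and averaging are omitted.
    """
    x = x0.copy()
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()
    return x

# Example: minimize <c, x> over the simplex; the mass concentrates
# on the coordinate with the smallest cost.
c = np.array([1.0, 2.0, 3.0])
x = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, eta=0.5, steps=100)
```

The multiplicative form is exactly the "proximal" formulation of Mirror Descent specialized to the entropy Bregman divergence; other mirror maps recover plain projected gradient descent.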
Found a great overview of the subject, with a lot of insight. Another brief introduction is here.
A recurring trick in optimization is characterizing the Lagrange dual problem in terms of the much simpler Fenchel duality. This trick is used in dual methods for Machine Learning and Signal Processing, in ADMM, in augmented Lagrangian methods, etc.
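To make the trick concrete, consider the textbook equality-constrained case (an illustrative instance, not necessarily the one treated in the post). The Lagrange dual function is exactly a Fenchel conjugate:

```latex
\min_x \; f(x) \quad \text{s.t.} \quad Ax = b,
\qquad
g(\lambda) \;=\; \inf_x \big( f(x) + \lambda^\top (Ax - b) \big)
\;=\; -f^*(-A^\top \lambda) \;-\; \lambda^\top b,
```

where $f^*(y) = \sup_x \big( y^\top x - f(x) \big)$ is the Fenchel conjugate of $f$. So computing the Lagrange dual reduces to looking up (or computing) a single conjugate function.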
Here I describe the solution to the problem from the last post. Spoilers!