In this post we will finish our affair with subexponential distributions. We will consider yet another example where they can be applied, an important statement called the Johnson–Lindenstrauss lemma. Essentially, it states that $N$ points in a high-dimensional Euclidean space may be projected into a subspace of dimension only $O(\varepsilon^{-2} \log N)$ such that all pairwise distances are well-preserved.

**Theorem (Johnson–Lindenstrauss). **Let $\varepsilon \in (0, 1)$ and $x_1, \dots, x_N \in \mathbb{R}^n$. For any

$$m \ge C \varepsilon^{-2} \log N$$

there exists a linear map $A \colon \mathbb{R}^n \to \mathbb{R}^m$ such that for any $i \ne j$

$$(1 - \varepsilon) \|x_i - x_j\|_2^2 \le \|A x_i - A x_j\|_2^2 \le (1 + \varepsilon) \|x_i - x_j\|_2^2.$$

**Remark 1. **Note that $m$ depends on $N$ but not on $n$. It is no wonder, since the affine dimension of $N$ points is obviously at most $N - 1$.

**Remark 2. **One may actually require that $A$ is a projection, but then the proof becomes a bit more technical.

**Remark 3. **In the discrete case, the analogous statement holds for the Hamming distance.

**Proof. **Consider the following random map, which is ‘almost’ a projection due to the normalization:

$$A x = \frac{1}{\sqrt{m}}\, G x$$

for the random $m \times n$ matrix $G$ with i.i.d. $\mathcal{N}(0, 1)$ entries. Note that

$$\|A x\|_2^2 = \frac{1}{m} \sum_{k=1}^{m} \langle g_k, x \rangle^2,$$

where $g_1, \dots, g_m$ are the rows of $G$ and each $\langle g_k, x \rangle \sim \mathcal{N}(0, \|x\|_2^2)$, so $\langle g_k, x \rangle^2 - \|x\|_2^2$ is $O(\|x\|_2^2)$-subexponential. As a result, we get that for any fixed $x$

$$\mathbb{P}\left( \bigl| \|A x\|_2^2 - \|x\|_2^2 \bigr| > \varepsilon \|x\|_2^2 \right) \le 2 e^{-c m \varepsilon^2}.$$

Applying the union bound over the $\binom{N}{2} \le N^2/2$ points of the form $x_i - x_j$, we obtain

$$\mathbb{P}\left( \exists\, i \ne j : \bigl| \|A(x_i - x_j)\|_2^2 - \|x_i - x_j\|_2^2 \bigr| > \varepsilon \|x_i - x_j\|_2^2 \right) \le N^2 e^{-c m \varepsilon^2},$$

and picking $m \ge C \varepsilon^{-2} \log N$ with $C$ large enough, as in the premise of the theorem, forces this probability to be less than $1$, which guarantees the existence of the required map. Moreover, choosing

$$m \ge C \varepsilon^{-2} \log \frac{N}{\delta}$$

guarantees that a random map will preserve distances up to precision $\varepsilon$ with probability at least $1 - \delta$. ♦
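The argument above is easy to check numerically. Below is a small sketch with numpy: we draw $N$ random points, project them with $A = G/\sqrt{m}$, and measure the worst relative distortion of squared pairwise distances. The constant $C = 8$ and all sizes here are arbitrary illustrative choices, not values from the proof.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 1_000, 50                 # ambient dimension, number of points
eps = 0.25                       # target precision
m = int(np.ceil(8 * np.log(N) / eps**2))   # C = 8 is an illustrative constant

X = rng.standard_normal((N, n))  # N points in R^n
G = rng.standard_normal((m, n))  # i.i.d. N(0, 1) entries
A = G / np.sqrt(m)               # the 'almost-projection' A = G / sqrt(m)
Y = X @ A.T                      # projected points in R^m

# worst relative distortion of squared pairwise distances
worst = 0.0
for i in range(N):
    for j in range(i + 1, N):
        d_orig = np.sum((X[i] - X[j]) ** 2)
        d_proj = np.sum((Y[i] - Y[j]) ** 2)
        worst = max(worst, abs(d_proj - d_orig) / d_orig)

print(f"m = {m}, worst relative distortion = {worst:.3f}")
```

With these parameters the worst distortion typically lands well below the target $\varepsilon$-scale, even though $m \approx 500$ is tiny compared to $n = 1000$.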

**Remark. **As an immediate consequence, we get that the random Gaussian matrix will satisfy RIP with high probability. A matrix $A$ satisfies the Restricted Isometry Property (RIP) with constant $\varepsilon$ if the double inequality of the Theorem holds for any $s$-sparse vector $x$, i.e. any vector with at most $s$ non-zero coordinates. Note that since this inequality is homogeneous in $x$, we can just prove that it holds only for sparse unit vectors; up to an $\varepsilon$-net discretization, there are roughly $\binom{n}{s} (C/\varepsilon)^s$ such vectors. Taking the logarithm, we get that the random Gaussian matrix will be $(s, \varepsilon)$-RIP w.h.p. if

$$m \ge C \varepsilon^{-2}\, s \log \frac{n}{s}.$$
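This, too, can be probed empirically. The sketch below samples random $s$-sparse unit vectors and checks how far $\|A x\|_2^2$ strays from $1$; again, the constant $C = 8$ and the dimensions are illustrative assumptions, and sampling random sparse vectors is only a spot check, not the full net argument.

```python
import numpy as np

rng = np.random.default_rng(1)
n, s, eps = 200, 5, 0.3
m = int(np.ceil(8 * s * np.log(n / s) / eps**2))  # ~ C eps^-2 s log(n/s), C = 8 illustrative

A = rng.standard_normal((m, n)) / np.sqrt(m)

# spot-check the RIP inequality on random s-sparse unit vectors
worst = 0.0
for _ in range(200):
    x = np.zeros(n)
    idx = rng.choice(n, size=s, replace=False)
    x[idx] = rng.standard_normal(s)
    x /= np.linalg.norm(x)            # unit vector: the inequality is homogeneous in x
    worst = max(worst, abs(np.sum((A @ x) ** 2) - 1.0))

print(f"m = {m}, worst deviation of ||Ax||^2 from 1: {worst:.3f}")
```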

The Compressed Sensing theory then shows that an $s$-sparse signal $x \in \mathbb{R}^n$ in a high-dimensional ambient space may be recovered from $m \sim s \log(n/s)$ noisy linear measurements $y = A x + \xi$.
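To make the recovery claim concrete, here is a minimal sketch using Orthogonal Matching Pursuit, one standard recovery algorithm (the post does not specify which method it has in mind). For simplicity the measurements are noiseless, and the sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, s, m = 400, 4, 120              # ambient dim, sparsity, measurements ~ s log(n/s)
A = rng.standard_normal((m, n)) / np.sqrt(m)

x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = 2.0 * rng.choice([-1.0, 1.0], size=s)
y = A @ x_true                     # noiseless measurements for simplicity

# Orthogonal Matching Pursuit: greedily pick the column most
# correlated with the residual, then re-fit on the picked columns.
residual, picked = y.copy(), []
for _ in range(s):
    j = int(np.argmax(np.abs(A.T @ residual)))
    if j not in picked:
        picked.append(j)
    coef, *_ = np.linalg.lstsq(A[:, picked], y, rcond=None)
    residual = y - A[:, picked] @ coef

x_hat = np.zeros(n)
x_hat[picked] = coef
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {err:.2e}")
```

With $m = 120$ measurements of a $4$-sparse signal in $\mathbb{R}^{400}$, the greedy method recovers the signal essentially exactly, illustrating that $m \sim s \log(n/s) \ll n$ measurements suffice.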