Abstract: Under which conditions, and with which distortions, can we preserve the pairwise distances of low-complexity vectors, e.g., structured sets such as the set of sparse vectors or that of low-rank matrices, when these are mapped into a finite set of vectors? This work addresses this general question through the specific use of a quantized and dithered random linear mapping which combines, in this order, a sub-Gaussian random projection of vectors in \(\mathbb R^N\) into \(\mathbb R^M\), a random translation, or “dither”, of the projected vectors, and a uniform scalar quantizer of resolution \(\delta>0\) applied componentwise. Thanks to this quantized mapping, we first show that, with high probability, an embedding of a bounded set \(\mathcal K \subset \mathbb R^N\) into \(\delta \mathbb Z^M\) can be achieved when distances in the quantized and in the original domains are measured with the \(\ell_1\)- and \(\ell_2\)-norm, respectively, provided the number of quantized observations \(M\) is large compared to the square of the “Gaussian mean width” of \(\mathcal K\). In this case, we show that the embedding is in fact “quasi-isometric” and suffers only from both multiplicative and additive distortions, whose magnitudes decrease as \(M^{-1/5}\) for general sets and as \(M^{-1/2}\) for structured sets as \(M\) increases. Second, when one is only interested in characterizing the maximal distance separating two elements of \(\mathcal K\) mapped to the same quantized vector, i.e., the “consistency width” of the mapping, we show that, for a similar number of measurements and with high probability, this width decays as \(M^{-1/4}\) for general sets and as \(1/M\) for structured ones as \(M\) increases. Finally, as an important aspect of our work, we also establish how the non-Gaussianity of the mapping impacts the class of vectors that can be embedded or whose consistency width provably decays as \(M\) increases.
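For concreteness, a minimal sketch of the mapping just described, in notation we adopt here for illustration (the symbols \(\boldsymbol\Phi\), \(\boldsymbol\xi\), \(\mathcal Q_\delta\), and \(\mathsf A\) are assumptions of this sketch, and the exact quantizer convention may differ in the body of the work): writing \(\boldsymbol\Phi \in \mathbb R^{M\times N}\) for the sub-Gaussian random matrix, \(\boldsymbol\xi\) for a dither drawn uniformly in \([0,\delta]^M\), and taking a floor-type uniform scalar quantizer \(\mathcal Q_\delta(t) := \delta\,\lfloor t/\delta \rfloor\) applied componentwise, the quantized mapping reads
\[
\mathsf A(\boldsymbol x) \;:=\; \mathcal Q_\delta\big(\boldsymbol\Phi \boldsymbol x + \boldsymbol\xi\big) \;\in\; \delta\mathbb Z^M, \qquad \boldsymbol x \in \mathbb R^N,
\]
so that the \(\ell_1\)-distance \(\|\mathsf A(\boldsymbol x) - \mathsf A(\boldsymbol y)\|_1\) (suitably normalized) is the quantity compared with \(\|\boldsymbol x - \boldsymbol y\|_2\) in the quasi-isometric embedding above, and two vectors with \(\mathsf A(\boldsymbol x) = \mathsf A(\boldsymbol y)\) are the “consistent” pairs whose maximal separation the consistency width measures.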