Error Decay of (almost) Consistent Signal Estimations from Quantized Gaussian Random Projections

Publication
IEEE Transactions on Information Theory

Abstract: This paper provides new error bounds on “consistent” reconstruction methods for signals observed from quantized random projections. These signal estimation techniques guarantee a perfect matching between the available quantized data and a new observation of the estimated signal under the same sensing model. Focusing on dithered uniform scalar quantization of resolution \(\delta>0\), we prove first that, given a Gaussian random frame of \(\mathbb R^N\) with \(M\) vectors, the worst-case \(\ell_2\)-error of consistent signal reconstruction decays with high probability as \(O(\frac{N}{M} \log \frac{M}{\sqrt N})\) uniformly for all signals of the unit ball \(\mathbb B^N \subset \mathbb R^N\). Up to a log factor, this matches a known lower bound in \(\Omega(N/M)\) and former empirical validations in \(O(N/M)\). Equivalently, if \(M\) exceeds a minimal number of frame coefficients growing like \(M_0 = O(\frac{N}{\epsilon_0} \log \frac{\sqrt N}{\epsilon_0})\), any vectors in \(\mathbb B^N\) with \(M\) identical quantized projections are at most \(\epsilon_0\) apart with high probability. Second, in the context of Quantized Compressed Sensing with \(M\) Gaussian random measurements and under the same scalar quantization scheme, consistent reconstructions of \(K\)-sparse signals of \(\mathbb R^N\) have a worst-case error that decreases with high probability as \(O(\frac{K}{M} \log \frac{MN}{\sqrt{K^3}})\) uniformly for all such signals. Finally, we show that the proximity of vectors whose quantized random projections are only approximately consistent can still be bounded with high probability. A certain level of corruption is thus allowed in the quantization process, up to the appearance of a systematic bias in the reconstruction error of (almost) consistent signal estimates.
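
The sensing model discussed in the abstract (dithered uniform scalar quantization of Gaussian random projections) and the notion of a consistent estimate can be illustrated numerically. The sketch below is not taken from the paper: the floor-based quantizer, the dimensions \(N\), \(M\), and the resolution \(\delta\) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantized_projections(x, Phi, dither, delta):
    """Uniform scalar quantization of resolution delta applied to the
    dithered Gaussian random projections Phi @ x (floor-based quantizer,
    chosen here only for illustration)."""
    return delta * np.floor((Phi @ x + dither) / delta)

# Illustrative dimensions and quantizer resolution (assumed values).
N, M, delta = 32, 512, 0.1

Phi = rng.standard_normal((M, N))          # Gaussian random frame of R^N
dither = rng.uniform(0, delta, size=M)     # uniform dither in [0, delta)

x = rng.standard_normal(N)
x /= np.linalg.norm(x)                     # signal in the unit ball B^N

q = quantized_projections(x, Phi, dither, delta)

# "Consistency" of a candidate estimate x_hat: it reproduces exactly the
# same quantized observations under the same sensing model.
x_hat = x + 1e-4 * rng.standard_normal(N)  # small perturbation of x
consistent = np.array_equal(q, quantized_projections(x_hat, Phi, dither, delta))
print("estimate is consistent with the observations:", consistent)
```

In this setting, the abstract's first result says that any two vectors of the unit ball sharing all \(M\) quantized projections are close, with the worst-case distance shrinking like \(O(\frac{N}{M} \log \frac{M}{\sqrt N})\) as \(M\) grows.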
