Thursday, April 14, 2011

CS: A clarification by Ori Shental

On top of what I just said in the previous entry, Ori Shental kindly added:

This work doesn't contradict in any way the well-known fact that WGN is non-compressible. In this work we discuss sparse representation of noise *realizations* given certain dictionary *realizations*. Still, the indices of the non-zero entries of the sparse representation of WGN will differ from one WGN instance to another, so that in order to describe the ensemble of *all* WGN realizations one would need at least as many coefficients as the length of the observed WGN vector (i.e., k >= m). So there is no real compression here, just a sparse representation of individual noise realizations. Thus, saying the noise is "compressible" (in quotation marks) is appropriate.

In the second part of the paper, we use this observed "compressibility" of WGN to translate the noisy CS/channel problem into a noiseless one with a denser effective input, allowing us to derive new results for the former from already-known results about the latter. Still, I share your feeling that there are many other applications of this (e.g., the multiplicative noise channel, as you say). Another interesting task down the road is to repeat this analysis under the practical L1-norm constraint, rather than the theoretical L0-norm.
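
To make Ori's first point concrete, here is a quick numerical sketch of per-realization "compressibility": two independent WGN realizations are each approximated by a few atoms of the same overcomplete random dictionary, yet their supports differ. Everything here is my own illustration, not the paper's construction: the dimensions and sparsity budget are arbitrary, and OMP stands in for the paper's L0-norm analysis.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
m, n, k = 64, 512, 20            # noise length, dictionary size, sparsity budget

# One dictionary *realization*: an overcomplete random matrix, unit-norm atoms.
D = rng.standard_normal((m, n))
D /= np.linalg.norm(D, axis=0)

supports = []
for trial in range(2):
    noise = rng.standard_normal(m)                    # a WGN *realization*
    s = orthogonal_mp(D, noise, n_nonzero_coefs=k)    # greedy sparse fit
    err = np.linalg.norm(D @ s - noise) / np.linalg.norm(noise)
    supports.append(set(np.flatnonzero(s)))
    print(f"trial {trial}: {k} of {n} atoms used, relative error {err:.3f}")

# The active atoms typically differ from one realization to the next,
# which is why the *ensemble* of realizations is not actually compressed.
print("atoms shared by the two realizations:", len(supports[0] & supports[1]))
```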
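
And in terms of shapes, the noisy-to-noiseless translation in the second part is just an augmentation of the sensing matrix: if the noise realization admits a sparse representation noise = D s, then y = A x + D s = [A D][x; s], a noiseless problem whose unknown [x; s] is denser than x alone. A minimal sketch of that algebra follows; the sizes are arbitrary, and I synthesize the sparse noise representation directly rather than deriving it, which is what the first part of the paper is about.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, q = 32, 128, 256

# Noisy CS measurement y = A x + noise, with x sparse.
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=5, replace=False)] = rng.standard_normal(5)

# Assume the noise realization has a k-sparse representation noise = D s
# in some dictionary realization D (here constructed synthetically).
D = rng.standard_normal((m, q))
D /= np.linalg.norm(D, axis=0)
s = np.zeros(q)
s[rng.choice(q, size=12, replace=False)] = 0.1 * rng.standard_normal(12)
noise = D @ s

y = A @ x + noise                 # the original noisy problem

# Absorbing the noise into the model gives y = [A D] [x; s] exactly:
# a *noiseless* CS problem with a denser effective sparse input.
B = np.hstack([A, D])
z = np.concatenate([x, s])
assert np.allclose(B @ z, y)
print("effective sparsity:", np.count_nonzero(z), "of", z.size, "entries")
```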

Thanks, Ori!
