Thursday, November 27, 2014

Upcoming CSJobs and more ....

Yves Wiaux, who will soon have many more friends, just sent me the following:


  Hi Igor

Jason McEwen and I have just been awarded £2M from the UK research councils for research on compressive imaging in astronomy and medicine, along with Mike Davies and other colleagues.

Press releases about the funded initiative may be found on the Heriot-Watt University (Edinburgh) website at http://www.hw.ac.uk/news-events/news/pioneering-work-helps-join-dots-across-known-19548.htm



Do not hesitate to post [this] on Nuit Blanche if you find it suitable.

Multiple positions will open very soon.


Cheers

Yves
___________________________________
Dr Yves Wiaux, Assoc. Prof., BASP Director
Institute of Sensors, Signals & Systems
School of Engineering & Physical Sciences
Heriot-Watt University, Edinburgh
 
Fantastic news, Yves!
One small note: while the Conversation piece seems to have actively removed several instances of it, one occurrence of the word "partial" still made it into the write-up, which, in my view, was not necessary.
 
Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.

Wednesday, November 26, 2014

Stable Autoencoding: A Flexible Framework for Regularized Low-Rank Matrix Estimation

In the Matrix Factorization Jungle page, there is a section for subspace clustering that reads:

Subspace Clustering: A = AX  with unknown X, solve for sparse/other conditions on X 


In recent times, subspace clustering algorithms have focused on conditions requiring X to have a zero main diagonal and otherwise sparse entries. In the following paper, the authors instead seek a low-rank X and call that matrix/operator a linear autoencoder.
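As a rough illustration of that "zero main diagonal, sparse elsewhere" condition (and not the algorithm of the paper featured below), here is a minimal Python sketch that builds X column by column with a lasso, excluding each sample from its own representation; the data matrix A and the penalty lam are placeholders:

import numpy as np
from sklearn.linear_model import Lasso

def sparse_self_representation(A, lam=0.1):
    # SSC-style sketch: find X such that A is approximately A X,
    # with diag(X) = 0 and X sparse. Each column a_i is regressed
    # on the other columns of A with an l1 penalty.
    n = A.shape[1]
    X = np.zeros((n, n))
    for i in range(n):
        mask = np.arange(n) != i          # drop column i from the dictionary
        reg = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
        reg.fit(A[:, mask], A[:, i])
        X[mask, i] = reg.coef_
    return X

# toy usage: samples drawn from two low-dimensional subspaces of R^5
rng = np.random.default_rng(0)
A = np.hstack([rng.normal(size=(5, 2)) @ rng.normal(size=(2, 10)) for _ in range(2)])
X = sparse_self_representation(A, lam=0.05)
print(np.abs(np.diag(X)).max())           # 0.0 by construction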


Let us note that AX now provides an evaluation of the SVD of A. Following this line of thought, subspace clustering could also be construed as another instance of a Linear Autoencoder. Can this help design better noisy autoencoders, as we saw recently [1]? Without further ado, here is: Stable Autoencoding: A Flexible Framework for Regularized Low-Rank Matrix Estimation by Julie Josse and Stefan Wager

We develop a framework for low-rank matrix estimation that allows us to transform noise models into regularization schemes via a simple parametric bootstrap. Effectively, our procedure seeks an autoencoding basis for the observed matrix that is robust with respect to the specified noise model. In the simplest case, with an isotropic noise model, our procedure is equivalent to a classical singular value shrinkage estimator. For non-isotropic noise models, however, our method does not reduce to singular value shrinkage, and instead yields new estimators that perform well in experiments. Moreover, by iterating our stable autoencoding scheme, we can automatically generate low-rank estimates without specifying the target rank as a tuning parameter.  
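One way to see the singular value shrinkage mentioned in the abstract (my own back-of-the-envelope check, not code from the paper) is to take the ridge-regularized linear autoencoder B = argmin ||A - AB||_F^2 + lam ||B||_F^2, whose closed form is (A'A + lam I)^{-1} A'A; the reconstruction AB then shrinks each singular value sigma of A to sigma^3 / (sigma^2 + lam):

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 8))
lam = 3.0

# closed-form ridge-regularized linear autoencoder
B = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ A)

# the same reconstruction, written as singular value shrinkage of A
U, s, Vt = np.linalg.svd(A, full_matrices=False)
shrunk = U @ np.diag(s**3 / (s**2 + lam)) @ Vt

print(np.allclose(A @ B, shrunk))         # True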



From [2]

Let us note that in the subspace clustering approach, if diag(X) = 0 then Tr(X) = 0, which means the transformation X generates is volume preserving (see Lie algebras). One wonders if, as in fluid mechanics, we should be aiming for a mixed decomposition for the autoencoders: a volume-preserving transformation (really quantifying the deformation) together with one that quantifies (low) volume change (a low-rank matrix, as in the paper featured today). Let us also note that, while the aim of these subspace clustering algorithms is a zero diagonal, the regularizer may return a solution that is only close to that but is also low rank (see LRR, for instance). Let us hope that this approach can shed some light on how to devise nonlinear autoencoders!
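To make the Lie-algebra remark concrete, here is a tiny numerical check of the general fact being invoked: a matrix X with a zero main diagonal is trace-free, and det(expm(X)) = exp(Tr(X)) = 1, so the map it generates preserves volume:

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 6))
np.fill_diagonal(X, 0.0)                  # zero main diagonal, as in subspace clustering

print(np.trace(X))                        # 0: trace-free generator
print(np.linalg.det(expm(X)))             # 1 (up to roundoff): volume-preserving map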

Relevant:

 

Tuesday, November 25, 2014

NIPS 2014 proceedings are out, what paper did you fancy?

The NIPS 2014 proceedings are out. What paper did you fancy?

Advances in Neural Information Processing Systems 27 (NIPS 2014)

The papers below appear in Advances in Neural Information Processing Systems 27, edited by Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, and K. Q. Weinberger. They are the proceedings of the Neural Information Processing Systems 2014 conference.
 
 
 
 
