Monday, March 27, 2017

CSjob; Internship (Spring/Summer/Fall 2017), IFP Energies nouvelles, France

Laurent just sent me the following:

Dear Igor


I have a late internship proposal at IFP Energies nouvelles. I would be delighted if you could advertise it. The update page is:
http://www.laurent-duval.eu//lcd-2017-intern-sparse-regression-dim-reduction.html
and the pdf file is here:
http://www.laurent-duval.eu//Documents/IFPEN_2017_SUBJ_Robust-sparse-regression.pdf


A text (same as the webpage if you need html code): 
Sparse regression and dimension reduction for sensor measurements and data normalization

The instrumental context is that of multiple 1D data or measurements y_m related to the same phenomenon x, corrupted by random effects n_m and a different scaling parameter a_m, due to uncontrolled sensor calibrations or measurement variability. The model is thus:
y_m(k) = a_m x(k) + n_m(k).

The aim of the internship is to robustly estimate the scaling parameters a_m (with confidence bounds) in the presence of missing data or outliers, for potentially small, real-life signals x with large amplitude variations. The estimation should be as automated as possible, based on data properties and priors (e.g. sparsity, positivity), so that it can be used by non-expert users. Signals under study include, for instance, vibration, analytical chemistry, or biological data. Of particular interest for this internship is the study and performance assessment of robust loss or penalty functions (around the l2,1-norm), such as R1-PCA or low-rank decomposition.
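As a toy illustration of the kind of robustness the subject calls for (this is not the internship's proposed method, just a classical baseline): with the model y_m(k) = a_m x(k) + n_m(k), a least-absolute-deviations estimate of a_m reduces to a weighted median of the per-sample ratios y_m(k)/x(k), which is far less sensitive to outliers than least squares. All names and parameters below are hypothetical.

```python
import numpy as np

def robust_scale_estimate(y, x, eps=1e-12):
    """Estimate a in y ~ a*x by least absolute deviations (LAD).

    Since sum_k |y[k] - a*x[k]| = sum_k |x[k]| * |y[k]/x[k] - a|,
    the LAD minimizer is the weighted median of the ratios y[k]/x[k]
    with weights |x[k]| -- a classical outlier-resistant baseline.
    """
    mask = np.abs(x) > eps              # skip samples where x is ~0
    r = y[mask] / x[mask]               # per-sample scale candidates
    w = np.abs(x[mask])                 # LAD weights
    order = np.argsort(r)
    r, w = r[order], w[order]
    cw = np.cumsum(w)
    idx = np.searchsorted(cw, 0.5 * cw[-1])  # weighted median index
    return r[idx]

# Synthetic check: positive signal, small noise, a few gross outliers.
rng = np.random.default_rng(0)
x = rng.exponential(size=200)
a_true = 2.5
y = a_true * x + 0.01 * rng.standard_normal(200)
y[:10] += 5.0                           # inject outliers
a_hat = robust_scale_estimate(y, x)
```

Ordinary least squares on the same data would be pulled away from a_true by the injected outliers; the weighted-median estimate stays close. Confidence bounds, missing data, and the l2,1-type penalties mentioned above are exactly where the internship goes beyond such a baseline.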


Best


Sure Laurent !

Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
