Tuesday, March 04, 2014

Slides: Greedy Algorithms, Frank-Wolfe and Friends - A modern perspective




Motivation: Greedy algorithms and related first-order optimization algorithms are at the core of many state-of-the-art sparse methods in machine learning, signal processing, harmonic analysis, statistics, and other seemingly unrelated areas with very different goals at first sight. Examples include matching pursuit, boosting, greedy methods for submodular optimization, structured prediction, and many more. In the field of optimization, the recent renewed interest in Frank-Wolfe/conditional gradient algorithms opens up an interesting perspective towards a unified understanding of these methods, with great potential to translate the rich existing knowledge about the respective greedy methods between the different fields.
The scope of this workshop is to gather renowned experts working on these algorithms in machine learning, optimization, signal processing, statistics, and harmonic analysis, in order to foster a fruitful exchange of ideas and discussions and to push the boundaries of scalable and efficient optimization for learning problems further.
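For readers who have not met the method before, the whole family hinges on one loop: linearize the objective at the current iterate, call a linear minimization oracle (LMO) over the feasible set, and move towards the atom it returns. Below is a minimal Python sketch of that loop; it is not taken from any of the slides, and `grad` and `lmo` are placeholder callables standing in for the problem-specific gradient and oracle.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=100):
    """Minimal sketch of the Frank-Wolfe / conditional gradient loop.

    grad : callable, returns the gradient of the objective f at x
    lmo  : linear minimization oracle, returns argmin over s in D of <g, s>
    x0   : a feasible starting point in the domain D
    """
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        g = grad(x)
        s = lmo(g)                         # greedy step: best atom w.r.t. the linearization
        gamma = 2.0 / (k + 2.0)            # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination, so x stays feasible
    return x
```

The projection-free structure is the selling point: each iteration needs only one gradient and one LMO call, which is often far cheaper than a projection onto the same set.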

Goals: The goals of the workshop are threefold. First, to provide an accessible review and synthesis of not only recent results but also old-but-forgotten results describing the behavior and performance of related greedy algorithms. Second, to cover recent advances on extensions of such algorithms relevant to machine learning researchers, such as learning with atomic-norm regularization (Rao et al., 2012; Tewari et al., 2011; Harchaoui et al., 2012), submodular optimization (Bach, 2011), herding algorithms (Chen et al., 2010; Bach et al., 2012), and structured prediction (Lacoste-Julien et al., 2013). One important example of interest here is the study of lower bounds, as in (Lan, 2013; Jaggi, 2013), to better understand the limitations of such greedy algorithms and of sparse methods. Third, to provide a forum for open problems and to serve as a stepping stone for cutting-edge research on this family of scalable algorithms for big data problems.
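To make the connection between conditional gradient and sparse greedy methods concrete, here is an illustrative LMO for the l1 ball (a hypothetical example, not code from any of the cited papers), usable with the generic loop sketched above. Each call returns a single signed coordinate vector, so k iterations of Frank-Wolfe produce a k-sparse iterate: the same mechanism that drives matching-pursuit-style methods. The toy least-squares data at the end is made up purely for illustration.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """LMO for the l1 ball: argmin over ||s||_1 <= radius of <g, s>.

    The minimizer is a signed, scaled standard basis vector (an "atom"),
    picked at the coordinate where the gradient is largest in magnitude.
    """
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g, dtype=float)
    s[i] = -radius * np.sign(g[i])  # move against the gradient on that coordinate
    return s

# Toy usage: least squares over the l1 ball (lasso-type problem).
A, b = np.random.randn(50, 100), np.random.randn(50)
grad = lambda x: A.T @ (A @ x - b)
x_hat = frank_wolfe(grad, lmo_l1_ball, x0=np.zeros(100), num_iters=200)
```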


Here are the presentation slides:


Time | Speaker | Title | Links
7:30 - 7:40am | Organizers | Introduction and Poster Setup | [ Slides ]
7:40 - 8:20am | Robert M. Freund | Remarks on Frank-Wolfe and Structural Friends | [ Slides ]
8:20 - 9:00am | Ben Recht | The Algorithmic Frontiers of Atomic Norm Minimization: Relaxation, Discretization, and Greedy Pursuit | [ Slides ]
9:00 - 9:30am | | Coffee Break |
9:30 - 9:45am | Nikhil Rao, Parikshit Shah and Stephen Wright | Conditional Gradient with Enhancement and Truncation for Atomic Norm Regularization | [ Slides ]
9:45 - 9:55am | Hector Allende, Emanuele Frandi, Ricardo Ñanculef and Claudio Sartori | Pairwise Away Steps for the Frank-Wolfe Algorithm |
9:55 - 10:05am | Simon Lacoste-Julien and Martin Jaggi | An Affine Invariant Linear Convergence Analysis for Frank-Wolfe Algorithms | [ Slides ]
10:05 - 10:15am | Vamsi Potluru, Jonathan Le Roux, Barak Pearlmutter, John Hershey and Matthew Brand | Coordinate Descent for Mixed-Norm NMF | [ Slides ]
10:15 - 10:30am | Robert M. Freund and Paul Grigas | New Analysis and Results for the Conditional Gradient Method | [ Slides ]
10:30am - 3:30pm | | Lunch Break |
3:30 - 3:45pm | Marguerite Frank | Honorary Guest |
3:45 - 4:25pm | Shai Shalev-Shwartz | Efficiently Training Sum-Product Neural Networks using Forward Greedy Selection | [ Slides ]
4:25 - 4:40pm | Xiaocheng Tang and Katya Scheinberg | Complexity of Inexact Proximal Newton Methods | [ Slides ]
| Vladimir Temlyakov | From Greedy Approximation to Greedy Optimization | [ Slides ]
4:40 - 4:50pm | Jacob Steinhardt and Jonathan Huggins | A Greedy Framework for First-Order Optimization | [ Slides ]
4:50 - 5:00pm | Ahmed Farahat, Ali Ghodsi and Mohamed Kamel | A Fast Greedy Algorithm for Generalized Column Subset Selection | [ Slides ] [ Poster ]
5:00 - 5:30pm | | Coffee Break |
5:30 - 6:10pm | Francis Bach | Conditional Gradients Everywhere | [ Slides ]
6:10 - 6:25pm | David Belanger, Dan Sheldon and Andrew McCallum | Marginal Inference in MRFs using Frank-Wolfe | [ Slides ]