Monday, October 10, 2016

Approximate Sparse Linear Regression

Some new results for the sparse regression problem and some extensions of it:


Approximate Sparse Linear Regression by Sariel Har-Peled, Piotr Indyk, Sepideh Mahabadi
In the Sparse Linear Regression (SLR) problem, given a d×n matrix M and a d-dimensional vector q, we want to compute a k-sparse vector τ such that the error ||Mτ − q|| is minimized. In this paper, we present algorithms and conditional lower bounds for several variants of this problem. In particular, we consider (i) the Affine SLR where we add the constraint that ∑_i τ_i = 1 and (ii) the Convex SLR where we further add the constraint that τ ≥ 0. Furthermore, we consider (i) the batched (offline) setting, where the matrix M and the vector q are given as inputs in advance, and (ii) the query (online) setting, where an algorithm preprocesses the matrix M to quickly answer such queries. All of the aforementioned variants have been well-studied and have many applications in statistics, machine learning and sparse recovery.
We consider the approximate variants of these problems in the "low sparsity regime" where the value of the sparsity bound k is low. In particular, we show that the online variant of all three problems can be solved with query time Õ(n^{k-1}). This provides non-trivial improvements over the naive algorithm that exhaustively searches all (n choose k) subsets. We also show that solving the offline variant of all three problems would require an exponential dependence of the form Ω̃(n^{k/2}/e^k), under a natural complexity-theoretic conjecture. Improving this lower bound for the case of k=4 would imply a nontrivial lower bound for the famous Hopcroft's problem. Moreover, solving the offline variant of affine SLR in o(n^{k-1}) would imply an upper bound of o(n^d) for the problem of testing whether a given set of n points in a d-dimensional space is degenerate. However, this is conjectured to require Ω(n^d) time.
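To make the three variants concrete, here is a minimal Python sketch (illustrative only, not the paper's algorithms) of the subproblem that remains once a support of k columns has been fixed: the plain variant is an unconstrained least squares on the chosen columns, the affine variant adds ∑_i τ_i = 1, and the convex variant additionally requires τ ≥ 0. The function name and the use of a generic SLSQP solver are my own choices.

```python
import numpy as np
from scipy.optimize import minimize

def restricted_slr(M, q, support, variant="plain"):
    """Minimize ||M[:, support] @ t - q|| over t for a fixed support.
    variant: "plain" (unconstrained), "affine" (sum(t) == 1),
             "convex" (sum(t) == 1 and t >= 0)."""
    A = M[:, list(support)]                      # d x k submatrix of the chosen columns
    if variant == "plain":
        t, *_ = np.linalg.lstsq(A, q, rcond=None)
        return t
    k = A.shape[1]
    objective = lambda t: np.linalg.norm(A @ t - q)
    constraints = [{"type": "eq", "fun": lambda t: np.sum(t) - 1.0}]   # affine constraint
    bounds = [(0.0, None)] * k if variant == "convex" else None        # nonnegativity for convex SLR
    t0 = np.full(k, 1.0 / k)                     # feasible starting point on the simplex
    res = minimize(objective, t0, method="SLSQP", bounds=bounds, constraints=constraints)
    return res.x
```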

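For contrast with the Õ(n^{k-1}) query time above, here is a hedged sketch of the naive baseline mentioned in the abstract: enumerate all (n choose k) supports, solve an unconstrained least squares on each, and keep the best. Names are illustrative; the affine and convex variants would substitute the constrained solve from the previous sketch for np.linalg.lstsq.

```python
import itertools
import numpy as np

def naive_sparse_regression(M, q, k):
    """Exhaustive-search baseline: one least-squares solve per size-k column subset."""
    d, n = M.shape
    best_err, best_tau = np.inf, None
    for support in itertools.combinations(range(n), k):   # all (n choose k) supports
        A = M[:, list(support)]
        t, *_ = np.linalg.lstsq(A, q, rcond=None)
        err = np.linalg.norm(A @ t - q)
        if err < best_err:
            tau = np.zeros(n)
            tau[list(support)] = t
            best_err, best_tau = err, tau
    return best_tau, best_err
```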

