We present an efficient algorithm and theoretical analysis for an infinite-dimensional analogue of the Lasso.
Compressed sensing seeks to explain an observed signal as the noisy measurement of a few weighted sources. Identifying a weighted collection of parametric sources with a measure on the parameter space allows one to estimate the underlying sources using an infinite-dimensional analogue of the Lasso. In many applications the mapping from parameters to observations is differentiable; we propose an algorithm and theoretical analysis that leverage this structure in the measurement mapping.
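Under one common set of conventions (the symbols below are illustrative, not fixed by this abstract), the infinite-dimensional Lasso can be written as an optimization over signed measures \(\mu\) on the parameter space \(\Theta\), where \(\psi : \Theta \to \mathbb{R}^d\) is the measurement map and \(|\mu|(\Theta)\) denotes the total variation norm:

\[
\min_{\mu}\;\; \tfrac{1}{2}\Bigl\| y - \int_{\Theta} \psi(\theta)\, d\mu(\theta) \Bigr\|_2^2 \;+\; \lambda\, |\mu|(\Theta).
\]

A sparse collection of sources at locations \(\theta_1,\dots,\theta_k\) with weights \(w_1,\dots,w_k\) corresponds to the atomic measure \(\mu = \sum_{j=1}^{k} w_j\, \delta_{\theta_j}\).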
Our algorithm interleaves conditional gradient steps on the infinite-dimensional Lasso with nonconvex local search over the parameter space. We outline applications of our algorithm to super-resolution imaging, low-rank matrix completion, linear system identification, and radiation treatment planning.
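The interleaving described above can be sketched on a toy one-dimensional instance. Everything below is an illustrative assumption, not the paper's implementation: sources are Gaussian bumps on [0, 1], the conditional gradient step adds the candidate location most correlated with the residual, and the local search alternates nonnegative least-squares weight fits with per-source 1-D location refinement.

```python
import numpy as np
from scipy.optimize import nnls, minimize_scalar

# Toy measurement model (an assumption for illustration): each source at
# location theta produces a Gaussian bump sampled on a fixed grid.
grid = np.linspace(0.0, 1.0, 100)

def psi(theta):
    """Measurement of a single unit-weight source at location theta."""
    return np.exp(-((grid - theta) ** 2) / (2 * 0.05 ** 2))

# Ground truth: two positive sources; noiseless observation.
y = 1.0 * psi(0.3) + 0.5 * psi(0.7)

def adcg(y, iters=5):
    """Conditional gradient steps interleaved with nonconvex local search."""
    thetas, weights = [], np.zeros(0)
    for _ in range(iters):
        # Residual of the current reconstruction.
        model = sum((w * psi(t) for t, w in zip(thetas, weights)),
                    np.zeros_like(y))
        resid = y - model
        # Conditional gradient step: add the candidate source most
        # correlated with the residual (coarse grid search).
        cand = np.linspace(0.0, 1.0, 200)
        thetas.append(cand[int(np.argmax([psi(t) @ resid for t in cand]))])
        # Local search: alternate a nonnegative least-squares weight fit
        # with a bounded 1-D refinement of each source location.
        for _ in range(3):
            A = np.column_stack([psi(t) for t in thetas])
            weights, _ = nnls(A, y)
            for j in range(len(thetas)):
                rest = y - A @ weights + weights[j] * psi(thetas[j])
                res = minimize_scalar(
                    lambda s: np.sum((rest - weights[j] * psi(s)) ** 2),
                    bounds=(0.0, 1.0), method="bounded")
                thetas[j] = float(res.x)
                A[:, j] = psi(thetas[j])
        # Prune sources that received zero weight.
        keep = weights > 1e-8
        thetas = [t for t, k in zip(thetas, keep) if k]
        weights = weights[keep]
    return np.array(thetas), weights
```

On this noiseless toy problem the loop recovers both source locations and weights to high accuracy within a few iterations; the pruning step discards duplicate atoms that the greedy step proposes once the residual is small.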
We prove that in ideal settings the optimization problem recovers the true parameters. Precisely, we restrict our theoretical analysis to the setting with a one-dimensional parameter space, positive weights, and noiseless observations. We establish conditions on the measurement function such that minimizing the weighted total variation recovers the true parameters and weights, with no minimum separation condition on the source locations.