We study connections between Dykstra's algorithm for projecting onto an intersection of convex sets, the alternating direction method of multipliers (ADMM), and block coordinate descent. We prove that coordinate descent for a regularized regression problem, in which the penalty is a separable sum of support functions, is exactly equivalent to Dykstra's algorithm applied to the dual problem. ADMM applied to the dual problem is also seen to be equivalent in the special case of two sets, one of which is a linear subspace. These connections, aside from being interesting in their own right, suggest new ways of analyzing and extending coordinate descent. For example, from existing convergence theory on Dykstra's algorithm over polyhedra, we discern that coordinate descent for the lasso problem converges at an (asymptotically) linear rate. We also develop two parallel versions of coordinate descent, based on the Dykstra and ADMM connections.
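To make the lasso claim concrete, here is a minimal sketch of cyclic coordinate descent for the lasso, the algorithm the abstract says converges at an asymptotically linear rate. The function names (`soft_threshold`, `lasso_cd`) and the fixed-sweep stopping rule are illustrative choices, not anything specified in the abstract.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for the lasso:
       minimize (1/2) ||y - X b||_2^2 + lam * ||b||_1.
    Each coordinate update solves the one-dimensional
    subproblem in b[j] exactly via soft-thresholding."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = np.sum(X**2, axis=0)  # squared column norms
    r = y - X @ b                  # running residual
    for _ in range(n_sweeps):
        for j in range(p):
            r = r + X[:, j] * b[j]                        # add coord j back
            b[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r = r - X[:, j] * b[j]                        # remove updated coord j
    return b
```

With an orthonormal design the exact solution is componentwise soft-thresholding of `X.T @ y`, which gives a simple sanity check; for `X = np.eye(3)`, `y = [3, -0.5, 1.5]`, `lam = 1` the method returns `[2, 0, 0.5]` after a single sweep.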

**Bio:**

Ryan Tibshirani is an associate professor in the Department of Statistics and the Machine Learning Department at Carnegie Mellon University, which he joined as a faculty member in the fall of 2011. He obtained his Ph.D. in Statistics from Stanford University in 2011, under the supervision of Jonathan Taylor. Prior to that, he obtained a B.S. in Mathematics with a minor in Computer Science from Stanford University in 2007. His research interests lie broadly in statistics, machine learning, and optimization. More specifically, he is interested in high-dimensional statistics, post-selection inference, nonparametric estimation, convex optimization, and convex geometry. He is also interested in developing methods for epidemiological forecasting, particularly flu forecasting.