IT Forum: Low rank approximation for faster optimization
Abstract: Low-rank structure is pervasive in real-world datasets. This talk shows how to accelerate the solution of fundamental computational problems, including eigenvalue decomposition, linear system solves, composite convex optimization, and stochastic optimization (including deep learning), by exploiting this low-rank structure. We present a simple method based on randomized numerical linear algebra for efficiently computing approximate top eigendecompositions, which can be used to replace large matrices (such as Hessians and constraint matrices) with low-rank surrogates that are faster to apply and invert. The resulting solvers for linear systems (NystromPCG), composite convex optimization (NysADMM), and deep learning (SketchySGD) enjoy strong theoretical and numerical support, outperforming state-of-the-art methods in both speed and robustness to hyperparameters.
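To give a flavor of the approach, below is a minimal sketch of a randomized Nyström approximation for a symmetric positive semidefinite matrix, in the spirit of the method the abstract describes. The function name, stabilizing shift, and all implementation details here are illustrative assumptions, not the speaker's actual code:

```python
import numpy as np

def randomized_nystrom(A, rank, seed=0):
    """Sketch of a randomized Nystrom approximation of a symmetric PSD matrix A.

    Returns U (n x rank) and lam (rank,) so that A is approximately
    U @ diag(lam) @ U.T, giving a low-rank surrogate that is cheap
    to apply and invert. Details are illustrative, not the talk's code.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    Omega, _ = np.linalg.qr(rng.standard_normal((n, rank)))  # orthonormal test matrix
    Y = A @ Omega                                            # sketch: n x rank
    # Small shift for numerical stability before the Cholesky factorization
    nu = np.sqrt(n) * np.finfo(Y.dtype).eps * np.linalg.norm(Y, 2)
    Y_nu = Y + nu * Omega
    C = np.linalg.cholesky(Omega.T @ Y_nu)                   # rank x rank, lower triangular
    B = np.linalg.solve(C, Y_nu.T).T                         # B = Y_nu @ inv(C).T
    U, s, _ = np.linalg.svd(B, full_matrices=False)
    lam = np.maximum(s**2 - nu, 0.0)                         # undo the shift
    return U, lam
```

A surrogate like `U @ diag(lam) @ U.T` (plus a regularizer) can then stand in for a Hessian or constraint matrix inside a solver, since applying or inverting it costs only O(n * rank) per vector.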
Bio: Madeleine Udell is Assistant Professor of Management Science and Engineering at Stanford University, affiliated with the Institute for Computational and Mathematical Engineering (ICME) and holding a courtesy appointment in Electrical Engineering. Her research aims to accelerate and simplify large-scale data analysis and optimization, with impact on challenges in healthcare, finance, marketing, operations, and engineering systems design, among others. Her work in optimization seeks to detect and exploit novel structures, leading to faster and more memory-efficient algorithms. Her work in machine learning centers on challenges in data preprocessing, interpretability, and causality, which are critical to the practical application of machine learning methods. She is a Kavli Fellow (2023) and an Alfred P. Sloan Research Fellow (2021). Her other awards include a National Science Foundation CAREER award (2020) and an Office of Naval Research (ONR) Young Investigator Award (2020).