I will summarize some recent work on differentially private (DP) algorithms for stochastic convex optimization: the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions. In the standard l2/l2 setting, we will see two approaches to obtaining optimal rates for this problem. We show that for a wide range of parameters, privacy causes no additional overhead in accuracy or running time. Along the way, we will develop techniques for private stochastic optimization that extend to other geometries. In the LASSO setting, where we optimize over the l1 ball, we will see private algorithms that achieve optimal rates. Based on joint works with various subsets of Hilal Asi, Raef Bassily, Vitaly Feldman, Tomer Koren and Abhradeep Thakurta.
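As a rough illustration of the kind of mechanism this line of work builds on (not the talk's specific algorithms), the sketch below shows noisy minibatch SGD for a least-squares loss: per-example gradients are clipped to bound each sample's influence, and Gaussian noise is added to the averaged update. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def dp_sgd(X, y, clip=1.0, noise_mult=0.5, lr=0.1, epochs=5, batch=10, seed=0):
    """Illustrative DP-SGD sketch: clip per-example gradients to norm
    `clip`, add Gaussian noise scaled by `noise_mult * clip` to the
    summed update. Privacy accounting is omitted for brevity."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            b = idx[start:start + batch]
            # Per-example gradients of the squared loss 0.5 * (x.w - y)^2.
            grads = (X[b] @ w - y[b])[:, None] * X[b]
            # Clip each example's gradient to l2 norm at most `clip`.
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads / np.maximum(1.0, norms / clip)
            # Gaussian noise calibrated to the clipping bound.
            noise = rng.normal(0.0, noise_mult * clip, size=d)
            w -= lr * (grads.sum(axis=0) + noise) / len(b)
    return w
```

One of the points of the talk is that, in the l2/l2 setting, variants of such private first-order methods can match the non-private optimal rates for a wide parameter range.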
Bio: Kunal Talwar is a Research Scientist at Apple, leading a research group on the foundations of ML and Private Data Analysis. His research interests span various aspects of Computer Science, including Differential Privacy, Machine Learning, Algorithms and Data Structures. He received his B.Tech. from IIT Delhi (2000) and his PhD from UC Berkeley (2004). Prior to joining Apple, he worked at Microsoft Research in Silicon Valley from 2005 to 2014, and at Google Brain from 2014 to 2019. He has made major contributions to Differential Privacy, Metric Embeddings and Discrepancy Theory. His work has been recognized by the Privacy Enhancing Technologies Award in 2009 and the ICLR Best Paper Award in 2017.
This talk is hosted by the ISL Colloquium. To receive talk announcements, subscribe to the mailing list firstname.lastname@example.org.