IT Forum presents "Distribution-Free, Risk-Controlling Prediction Sets"

Topic: Distribution-Free, Risk-Controlling Prediction Sets
Date: Friday, February 19, 2021, 1:00pm to Saturday, February 20, 2021, 12:55pm
Speaker: Anastasios Angelopoulos (UC Berkeley)
Abstract / Description: 

To communicate instance-wise uncertainty for prediction tasks, we show how to generate set-valued predictions for black-box predictors that control the expected loss on future test points at a user-specified level. Our approach provides explicit finite-sample guarantees for any dataset by using a holdout set to calibrate the size of the prediction sets. This framework enables simple, distribution-free, rigorous error control for many tasks, and we demonstrate it in five large-scale machine learning problems: (1) classification problems where some mistakes are more costly than others; (2) multi-label classification, where each observation has multiple associated labels; (3) classification problems where the labels have a hierarchical structure; (4) image segmentation, where we wish to predict a set of pixels containing an object of interest; and (5) protein structure prediction.
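To give a concrete sense of the calibration idea described above, here is a minimal illustrative sketch, not the speakers' exact algorithm: it assumes the loss is miscoverage (the true label falls outside the set), that prediction sets are formed by thresholding softmax scores, and that a Hoeffding-style upper confidence bound on the holdout risk is used to pick the threshold. All function names and parameters are hypothetical.

```python
# Illustrative sketch only: a simplified holdout calibration loop in the spirit
# of risk-controlling prediction sets. Assumptions (not from the announcement):
# miscoverage loss, softmax-thresholded sets, Hoeffding upper confidence bound.
import numpy as np

def calibrate_threshold(scores, labels, alpha=0.1, delta=0.1):
    """Pick the strictest score threshold whose risk UCB stays below alpha.

    scores: (n, K) array of softmax scores on a holdout set
    labels: (n,) array of true class indices
    alpha:  user-specified risk level; delta: tolerance of the guarantee
    """
    n = scores.shape[0]
    hoeffding_margin = np.sqrt(np.log(1.0 / delta) / (2.0 * n))
    # Scan thresholds from strict (small sets) to lenient (large sets);
    # miscoverage risk shrinks monotonically as the sets grow.
    for lam in np.linspace(1.0, 0.0, 1001):
        sets = scores >= lam                      # prediction set per example
        miscovered = ~sets[np.arange(n), labels]  # loss: true label missing
        risk_ucb = miscovered.mean() + hoeffding_margin
        if risk_ucb <= alpha:                     # first (smallest) valid sets
            return lam
    return 0.0  # fall back to predicting every label

def predict_set(scores_row, lam):
    """Return the labels whose score clears the calibrated threshold."""
    return np.where(scores_row >= lam)[0]

# Toy usage on random data, purely for illustration:
rng = np.random.default_rng(0)
holdout_scores = rng.dirichlet(np.ones(10), size=500)
holdout_labels = rng.integers(0, 10, size=500)
lam_hat = calibrate_threshold(holdout_scores, holdout_labels, alpha=0.2)
print(lam_hat, predict_set(holdout_scores[0], lam_hat))
```

The key design point the sketch tries to convey is that the underlying predictor is treated as a black box: only its holdout scores enter the calibration, and the finite-sample guarantee comes from the concentration bound on the holdout risk rather than from any distributional assumption about the model or the data.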

Bio: Anastasios Nikolas Angelopoulos is a second-year Ph.D. student at the University of California, Berkeley. He is privileged to be advised by Michael I. Jordan and Jitendra Malik. From 2016 to 2019, he was an electrical engineering student at Stanford University advised by Stephen P. Boyd.