Statistics Seminar: Improving knockoffs with conditional calibration

Topic:
Improving knockoffs with conditional calibration
Date / Time:
Tuesday, January 4, 2022 - 4:30pm
Speaker: 
Will Fithian (UC Berkeley)
Abstract / Description: 

The knockoff filter (Barber and Candès, 2015) is a flexible framework for multiple testing in supervised learning models, based on introducing synthetic predictor variables to control the false discovery rate. I will discuss why knockoffs can outperform the (dependence-adjusted) Benjamini–Hochberg (BH) procedure in some contexts, and introduce the calibrated knockoff filter, which uses the conditional calibration framework of Fithian and Lei (2020) to uniformly improve the power of any knockoffs method. The improvement is especially notable in two contexts where baseline knockoff methods underperform: when the rejection set is small, and when the structure of the design matrix prevents us from constructing good knockoff variables. In these contexts, where baseline knockoff methods can be nearly powerless, calibrated knockoffs can nevertheless often outperform BH and other competitors.
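For reference, the selection rule at the heart of the knockoff filter is the "knockoff+" threshold of Barber and Candès (2015). A standard statement of it, included here only as background for the abstract: given feature statistics $W_1, \dots, W_m$ (whose signs are independent and symmetric for null features) and a target FDR level $q$, the filter computes

$$
\tau \;=\; \min\left\{ t > 0 \;:\; \frac{1 + \#\{j : W_j \le -t\}}{\max\bigl(1,\; \#\{j : W_j \ge t\}\bigr)} \;\le\; q \right\}
$$

and rejects $H_j$ for every $j$ with $W_j \ge \tau$. The "$1+$" in the numerator also makes the small-rejection-set weakness mentioned above concrete: the ratio cannot drop to $q$ unless at least $1/q$ statistics clear the threshold, so the filter can never make fewer than $1/q$ rejections.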

This is joint work with Yixiang Luo and Lihua Lei.