
# IT-Forum presents "High-accuracy Optimality and Limitation of the Profile Maximum Likelihood"

Topic: High-accuracy Optimality and Limitation of the Profile Maximum Likelihood
Date: Friday, September 25, 2020, 1:15 pm
Venue: Zoom (registration required)
Speaker: Yanjun Han (Stanford)
Abstract / Description:

Symmetric properties of distributions arise in multiple settings, and separate estimators have historically been developed for each of them. Recently, Orlitsky et al. showed that a single estimator, the profile maximum likelihood (PML), achieves the optimal sample complexity universally for many properties and any accuracy parameter larger than $n^{-1/4}$, where $n$ is the sample size. They also raised the question of whether this low-accuracy restriction is an artifact of the analysis or a fundamental limitation of the PML, a question that remained open despite several subsequent works.
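As a point of reference, the "profile" of a sample is the multiset of symbol multiplicities with the labels discarded; estimators of symmetric properties depend on the sample only through this summary. A minimal illustration (the helper name `profile` is ours, not from the papers):

```python
from collections import Counter

def profile(sample):
    """Return the profile of a sample: the multiset of symbol
    multiplicities, ignoring symbol labels (sorted for a canonical form)."""
    return sorted(Counter(sample).values())

# Two samples over different alphabets share the same profile, so any
# symmetric-property estimator treats them identically.
print(profile("aabbc"))  # -> [1, 2, 2]
print(profile("xxyyz"))  # -> [1, 2, 2]
```

The PML distribution is then the distribution that maximizes the probability of observing this profile, rather than the specific labeled sample.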

In this talk, we provide a complete answer to this question and characterize the tight performance of PML in the high-accuracy regime. On the positive side, we show that the PML remains sample-optimal for any accuracy parameter larger than $n^{-1/3}$ using a novel chaining property of the PML distributions. In particular, the PML distribution itself is an optimal estimator of the sorted hidden distribution. On the negative side, we show that the PML, as well as any adaptive approach, cannot be universally sample-optimal when the accuracy parameter is below $n^{-1/3}$, and characterize the exact penalty for adaptation via a matching adaptation lower bound.

Based on joint work with Kirankumar Shiragur. The full papers are available online at https://arxiv.org/abs/2004.03166 and https://arxiv.org/abs/2008.11964.

The Information Theory Forum (IT-Forum) at Stanford ISL is an interdisciplinary academic forum that focuses on mathematical aspects of information processing. While its primary emphasis is information theory, the forum also welcomes speakers from signal processing, learning and statistical inference, and control and optimization, as well as industrial affiliates in these fields. The forum is typically held every Friday at 1:15 pm during the academic year.

Until further notice, the IT Forum convenes exclusively via Zoom (Fridays at 1:15 pm PT) due to the ongoing pandemic. To prevent "Zoom-bombing," attendees are asked to register with their email address at https://stanford.zoom.us/meeting/register/tJwkf-uvqjoqHNIWxY4HHon4K107QMo22PVR to receive the Zoom meeting details by email.

Bio:

Yanjun Han is a 6th-year PhD candidate in Electrical Engineering at Stanford University, advised by Tsachy Weissman. His research interests include high-dimensional and nonparametric statistics, information theory, online learning, applied probability, and their applications.