Mutual information emerged in Shannon’s 1948 masterpiece as the answer to the most fundamental questions of compression and communication. Since that time, however, it has been adopted and widely used in a variety of other disciplines. In particular, its estimation has emerged as a key component in fields such as machine learning, computer vision, systems biology, medical imaging, neuroscience, genomics, economics, ecology, and physics. The two parts of this talk will respectively address two questions: why should we care about estimating mutual information, and how should we go about estimating it?
The first part will present a recent set of results establishing the status of mutual information as the “canonical” measure of dependence. Specifically, we show that, when measuring dependence by the extent to which one variable is helpful in estimating the other, the only loss function for estimation satisfying a natural data processing stipulation is the logarithmic loss, and mutual information is the resulting dependence measure. Other objects with mutual information at their core inherit analogous justifications. A notable example is directed information, which emerges as the only measure of the “degree to which one process is helpful in predicting the other” to satisfy a natural data processing property.
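To make the log-loss connection concrete, here is the standard identity underlying this viewpoint (a textbook fact, not one of the talk's new results): under logarithmic loss, the best prediction of $Y$ is a probability distribution, the optimal expected loss is the entropy, and the reduction in that loss afforded by observing $X$ is precisely the mutual information.

```latex
\min_{\hat P} \, \mathbb{E}\!\left[-\log \hat P(Y)\right] = H(Y),
\qquad
\min_{\hat P} \, \mathbb{E}\!\left[-\log \hat P(Y \mid X)\right] = H(Y \mid X),
```
```latex
I(X;Y) \;=\; H(Y) - H(Y \mid X).
```

In this sense $I(X;Y)$ measures exactly how much $X$ helps in predicting $Y$ when prediction quality is scored by log loss.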
The second part of the talk will showcase a new approach to the estimation of mutual information between random objects with distributions residing in high-dimensional spaces (e.g., large alphabets), as is the case in increasingly many applications. We will discuss the shortcomings of traditional estimators, and suggest a new estimator that achieves essentially optimal worst-case performance under L2 risk (i.e., attains the minimax rates). Finally, we will exhibit examples illustrating the benefits afforded by this estimator in practice.
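The shortcoming of the traditional plug-in (empirical) estimator is easy to demonstrate numerically. The sketch below (a minimal illustration, not the minimax-optimal estimator from the talk; the function name and parameters are mine) estimates mutual information by plugging empirical frequencies into the MI formula. When the alphabet size is comparable to the sample size, the plug-in estimate is badly biased upward, reporting several bits of dependence between variables that are in fact independent.

```python
import numpy as np

def plugin_mi(x, y, kx, ky):
    """Plug-in mutual information estimate (in bits) from paired samples
    over alphabets {0,...,kx-1} and {0,...,ky-1}."""
    n = len(x)
    joint = np.zeros((kx, ky))
    for a, b in zip(x, y):
        joint[a, b] += 1.0
    joint /= n                                  # empirical joint distribution
    px = joint.sum(axis=1, keepdims=True)       # empirical marginal of X
    py = joint.sum(axis=0, keepdims=True)       # empirical marginal of Y
    mask = joint > 0                            # 0 log 0 = 0 convention
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)

# Large alphabet, few samples: X and Y independent, so true MI = 0,
# yet the plug-in estimate comes out far above zero (upward bias).
k, n = 200, 500
x = rng.integers(0, k, n)
y = rng.integers(0, k, n)
mi_large = plugin_mi(x, y, k, k)   # well above the true value of 0

# Small alphabet, many samples: the same estimator is nearly unbiased.
x2 = rng.integers(0, 2, 5000)
y2 = rng.integers(0, 2, 5000)
mi_small = plugin_mi(x2, y2, 2, 2)  # close to the true value of 0
```

The contrast between the two regimes is the point: classical consistency results cover the fixed-alphabet, large-sample regime, while the large-alphabet regime is where bias-corrected, minimax-rate-optimal estimators of the kind discussed in the talk are needed.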
The talk is based on recent collaborations with Jiantao Jiao, Thomas Courtade, Yanjun Han, Albert No, and Tsachy Weissman.
Kartik Venkat is a fifth-year graduate student at Stanford EE, working with Tsachy Weissman.