We discuss two separate results, related only through the list of authors.
First, we consider bounding the mutual information for distributions p(x,y) with "sparse" supports. To that end, we introduce the notion of "intrinsic uncertainty", which measures the uncertainty remaining as to which "action" has been taken by the channel p(y|x) after observing both its input and its output. Using the Donsker-Varadhan variational principle, we derive a lower bound for the intrinsic uncertainty that depends only on the marginals and support of p(x,y). This bound easily translates into a bound on the mutual information. We apply this method to the binary deletion channel, and show that for the special case of an i.i.d. input, our bound outperforms the best known lower bound for the mutual information over a wide range of deletion probabilities.
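For readers less familiar with the variational principle invoked above, the standard Donsker-Varadhan representation writes relative entropy as a supremum over test functions, and mutual information is the relative entropy between the joint distribution and the product of its marginals; the talk's specific "intrinsic uncertainty" bound is not reproduced here.

```latex
% Donsker-Varadhan representation of relative entropy;
% the supremum runs over (bounded measurable) test functions T.
D(P \,\|\, Q) \;=\; \sup_{T} \Big\{ \mathbb{E}_{P}[T] \;-\; \log \mathbb{E}_{Q}\big[e^{T}\big] \Big\}

% Mutual information as a relative entropy, to which the
% representation above is applied with P = p(x,y), Q = p(x)p(y):
I(X;Y) \;=\; D\big( p(x,y) \,\|\, p(x)\,p(y) \big)
```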
Next, we consider the AWGN channel with noisy feedback. Schalkwijk and Kailath have shown that for noiseless feedback, capacity can be attained via a simple scalar linear scheme. This elegant scheme fails, however, in the presence of arbitrarily low feedback noise, essentially since the uncertainty at the transmitter regarding the receiver's state grows rapidly, decoupling the terminals. In fact, a general negative result of Kim, Lapidoth and Weissman indicates that linear encoding schemes cannot attain any positive rate with asymptotically vanishing error. Is there nevertheless a "simple" scheme robust to feedback noise in some sense? We move away from the asymptotic approach, and present a simple scalar interactive scheme that can operate close to capacity with a finite (but very small) error probability, in a small number of rounds. Our scheme is a variation on the Schalkwijk-Kailath theme with active feedback, where transmitter uncertainty growth is curbed using modulo arithmetic. For example, for an error probability of 1e-6, if the feedback SNR exceeds the forward SNR by 20dB, our scheme operates 0.8dB from capacity with only 19 rounds of interaction. In comparison, error correcting codes attaining the same performance require two orders of magnitude increase in delay and complexity, and a minimal delay/complexity uncoded system is bounded 9dB away from capacity.
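For intuition on the scheme this result builds on, the following is a minimal sketch of the *classical* Schalkwijk-Kailath iteration with noiseless feedback (not the speakers' modulo variant): the transmitter repeatedly sends its scaled estimation error, and the receiver's MMSE update shrinks the error variance by a factor 1/(1 + SNR) per channel use. All function and parameter names here are illustrative assumptions, not taken from the talk.

```python
import math
import random

def simulate_sk(theta, snr, rounds, seed=0):
    """Classical Schalkwijk-Kailath scheme over AWGN with noiseless feedback.

    theta  : message point (think of it as drawn from a unit-variance prior,
             so the power accounting below holds in expectation)
    snr    : forward SNR = P/N
    Returns the receiver's final estimate and the tracked error variance.
    """
    rng = random.Random(seed)
    N = 1.0            # forward noise variance
    P = snr * N        # transmit power budget
    var = 1.0          # variance of (hat - theta), tracked by both parties
    hat = 0.0          # receiver's estimate; transmitter learns it via feedback
    for _ in range(rounds):
        # Transmitter sends its scaled estimation error.
        x = math.sqrt(P / var) * (hat - theta)
        # AWGN forward channel.
        y = x + rng.gauss(0.0, math.sqrt(N))
        # Receiver's MMSE correction of its estimate based on y.
        hat -= math.sqrt(P * var) / (P + N) * y
        # Error variance contracts by 1/(1 + SNR) every round.
        var *= N / (P + N)
    return hat, var
```

After n rounds the error variance is (1 + SNR)^-n, so roughly (n/2)·log2(1 + SNR) message bits can be resolved, matching capacity; the talk's contribution is making this behavior survive *noisy* feedback via modulo arithmetic, which this sketch does not attempt.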
Joint work with Or Ordentlich and Assaf Ben-Yishai.
The Information Theory Forum (IT-Forum) at Stanford ISL is an interdisciplinary academic forum which focuses on mathematical aspects of information processing. With a primary emphasis on information theory, we also welcome researchers from signal processing, learning and statistical inference, control and optimization to deliver talks at our forum. We also warmly welcome industrial affiliates in the above fields. The forum is typically held in Packard 202 every Friday at 1:00 pm during the academic year.
The Information Theory Forum is organized by graduate students Jiantao Jiao and Kartik Venkat. To suggest speakers, please contact any of the students.