IT-Forum

IT-Forum

Topic: 
Capacity bounds for diamond networks with an orthogonal broadcast channel
Abstract / Description: 

A class of diamond networks is studied where the broadcast component is orthogonal and modeled by two independent bit-pipes. New upper and lower bounds on the capacity are derived. The proof technique for the upper bound generalizes bounding techniques of Ozarow for the Gaussian multiple description problem (1981) and Kang and Liu for the Gaussian diamond network (2011). The lower bound is based on Marton's coding technique and superposition coding. The bounds are evaluated for Gaussian and binary adder multiple access channels (MACs). For Gaussian MACs, both the lower and upper bounds strengthen the Kang-Liu bounds and establish capacity for interesting ranges of bit-pipe capacities. For binary adder MACs, the capacity is established for all ranges of bit-pipe capacities.
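For orientation, a standard benchmark for such networks (not stated in the abstract, added here for context) is the cut-set bound. With bit-pipes of capacities $C_1$ and $C_2$ feeding the two relays, which then transmit over a MAC $p(y \mid x_1, x_2)$, it reads

$$ C \;\le\; \max_{p(x_1,x_2)} \min\Big\{\, C_1 + C_2,\; C_1 + I(X_2;Y \mid X_1),\; C_2 + I(X_1;Y \mid X_2),\; I(X_1,X_2;Y) \,\Big\}. $$

Upper bounds of the kind described above improve on this benchmark essentially because the relays cannot fully coordinate their inputs through finite-capacity bit-pipes.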


The Information Theory Forum (IT-Forum) at Stanford ISL is an interdisciplinary academic forum focused on mathematical aspects of information processing. While the primary emphasis is on information theory, we also welcome researchers from signal processing, learning and statistical inference, and control and optimization to deliver talks at our forum, and we warmly welcome industrial affiliates in these fields. The forum is typically held in Packard 202 every Friday at 1:00 pm during the academic year.

 

 

Date and Time: 
Friday, December 4, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Super-Resolution of Positive Sources
Abstract / Description: 

The resolution of all microscopes is limited by diffraction. The observed signal is a convolution of the emitted signal with a low-pass kernel, the point-spread function (PSF) of the microscope. The frequency cut-off of the PSF is inversely proportional to the wavelength of light. Hence, features of the object that are smaller than the wavelength of light are difficult to observe. In single-molecule microscopy the emitted signal is a collection of point sources, produced by blinking molecules. The goal is to recover the locations of these sources with a precision far finer than the wavelength of light. This leads to the problem of super-resolution of positive sources in the presence of noise. We show that the problem can be solved using convex optimization in a stable fashion. The stability of reconstruction depends on the Rayleigh regularity of the support of the signal, i.e., on how many point sources can occur within an interval of one wavelength. The stability estimate is complemented by a converse result: the performance of the convex algorithm is nearly optimal. I will also give a brief summary of an ongoing project, developed in collaboration with the group of Prof. W.E. Moerner, in which we use these theoretical ideas to improve microscopes.
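As a rough sketch of the model behind this (notation added here, not taken verbatim from the talk): the recorded signal is a positive superposition of shifted copies of the PSF $\phi$ plus noise,

$$ y(t) \;=\; \sum_{j} a_j\, \phi(t - t_j) + z(t), \qquad a_j > 0, $$

and a convex recovery program of the kind alluded to searches over nonnegative candidate signals consistent with the low-pass data, e.g. find $x \ge 0$ such that $\|\phi * x - y\| \le \delta$. The Rayleigh regularity of the support then controls how far the estimated locations $t_j$ can deviate from the truth.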


 


Date and Time: 
Friday, November 20, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Understanding synergistic interactions and complex information sharing
Abstract / Description: 

The interactions among three or more variables are frequently nontrivial and poorly understood, yet they are paramount for future advances in fields such as multiuser information theory, neuroscience, and complexity science. In this talk, we will study the interactions that can exist within a group of random variables by introducing an axiomatic framework that characterizes the ways in which information can be shared. The framework is based on the novel notion of information synergy, which measures statistical structure that is present in the whole set of variables but not in any of them individually. Finally, the framework will be applied to several communication scenarios, providing a more intuitive understanding of their fundamental limits.
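A canonical illustration of synergy (added here as an example, not taken from the abstract) is the binary XOR: if $X_1$ and $X_2$ are independent uniform bits and $Y = X_1 \oplus X_2$, then

$$ I(X_1;Y) = I(X_2;Y) = 0, \qquad I(X_1,X_2;Y) = 1 \ \text{bit}, $$

so the pair reveals one full bit about $Y$ even though neither variable alone reveals anything; this is precisely the kind of structure a synergy measure is designed to capture.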


 


Date and Time: 
Friday, November 13, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
A Stronger Soft-Covering Lemma that assures Semantic Security in Wiretap Channels
Abstract / Description: 

In 1975, Wyner published two very different papers that are unexpectedly connected. One introduced the wiretap channel, showing that information-theoretic secrecy is possible without a secret key by taking advantage of channel noise; this is the foundation for much of physical-layer security. The other introduced a notion of common information relevant to generating random variables at different terminals, and in that work he introduced a soft-covering tool for proving achievability. Coincidentally, soft covering has since become the tool of choice for proving strong secrecy in wiretap channels, although Wyner does not appear to have made any connection between the two results.

We present a sharpening of the soft-covering tool by showing that the soft-covering phenomenon happens with doubly-exponential certainty with respect to a randomly generated codebook. Through the union bound, this enables security proofs in settings where many security constraints must be satisfied simultaneously. The "type II" wiretap channel is a great example of this, where the eavesdropper can actively influence his observations but security must hold in all cases. We demonstrate the effectiveness of this tool by deriving the secrecy capacity of wiretap channels of type II with a noisy main channel---previously an open problem. Additionally, this stronger soft covering allows information-theoretic security proofs to be easily upgraded to semantic security, which is the gold standard in cryptography.
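In rough terms, the classical soft-covering lemma can be paraphrased as follows (a standard statement given here for context, not the precise result of the talk): if a codebook of $2^{nR}$ codewords is drawn i.i.d. from $P_X$ and one codeword is selected uniformly at random and passed through the memoryless channel $P_{Y|X}$, then

$$ R > I(X;Y) \;\Longrightarrow\; \mathbb{E}\, \Big\| P_{Y^n} - \prod_{i=1}^{n} P_Y \Big\|_{TV} \to 0, $$

i.e. the induced output distribution becomes indistinguishable from the i.i.d. output distribution. The sharpening discussed above shows that, over the choice of random codebook, this distance is in fact exponentially small except on a set of codebooks whose probability is doubly exponentially small, which is what survives a union bound over exponentially many simultaneous security constraints.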

Date and Time: 
Friday, November 6, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Rate-distortion of sub-Nyquist sampled processes
Abstract / Description: 

Consider the task of analog-to-digital conversion, in which a continuous-time random process is mapped into a stream of bits. The optimal trade-off between the bitrate and the minimal average distortion in recovering the waveform from its bit representation is described by the Shannon rate-distortion function of the continuous-time source. Traditionally, in solving for the optimal mapping and the rate-distortion function, we assume that the analog waveform has a discrete-time version, as in the case of a band-limited signal sampled above its Nyquist frequency. Such an assumption, however, may not hold in many scenarios due to wideband signaling and A/D technology limitations. A more relevant assumption in such scenarios is that only a sub-Nyquist sampled version of the source can be observed, and that the error in analog-to-digital conversion is due to both sub-sampling and finite bit representation. This assumption gives rise to a combined sampling and source coding problem, in which the quantities of merit are the sampling frequency, the bitrate, and the average distortion.

In this talk we will characterize the optimal trade-off among these three parameters. The resulting rate-distortion-sampling-frequency function can be seen as a generalization of the classical Shannon-Kotelnikov-Whittaker sampling theorem to the case where a finite bitrate representation is required. This characterization also provides us with a new critical sampling rate: the minimal sampling rate required to achieve the rate-distortion function of a Gaussian stationary process for a given rate-distortion pair. That is, although the Nyquist rate is the minimal sampling frequency that allows perfect reconstruction of a band-limited signal from its samples, relaxing perfect reconstruction to a prescribed distortion allows sampling below the Nyquist rate while achieving the same rate-distortion trade-off.
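As a point of reference (standard background, not a result of the talk): for a memoryless Gaussian source of variance $\sigma^2$ under mean-squared-error distortion, the classical rate-distortion function is

$$ R(D) = \tfrac{1}{2} \log \frac{\sigma^2}{D}, \qquad 0 < D \le \sigma^2, $$

and for stationary Gaussian processes the analogous characterization is given by reverse water-filling over the power spectral density. The result described above asks how much of this trade-off survives when the encoder only observes samples taken below the Nyquist rate.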

Date and Time: 
Friday, October 23, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Learning (from) networks: fundamental limits, algorithms, and applications
Abstract / Description: 

Network models provide a unifying framework for understanding dependencies among variables in medical, biological, and other sciences. Networks can be used to reveal underlying data structures, infer functional modules, and facilitate experiment design. In practice, however, the size, uncertainty, and complexity of the underlying associations render these applications challenging.

In this talk, we illustrate the use of spectral, combinatorial, and statistical inference techniques in several significant network science problems. First, we consider the problem of network alignment where the goal is to find a bijective mapping between nodes of two networks to maximize their overlapping edges while minimizing mismatches. To solve this combinatorial problem, we present a new scalable spectral algorithm, and establish its efficiency theoretically and experimentally over several synthetic and real networks. Next, we introduce network maximal correlation (NMC) as an essential measure to capture nonlinear associations in networks. We characterize NMC using geometric properties of Hilbert spaces and illustrate its application in learning network topology when variables have unknown nonlinear dependencies. Finally, we discuss the problem of learning low dimensional structures (such as clusters) in large networks, where we introduce logistic Random Dot Product Graphs, a new class of networks which includes most stochastic block models as well as other low dimensional structures. Using this model, we propose a spectral network clustering algorithm that possesses robust performance under different clustering setups. In all of these problems, we examine underlying fundamental limits and present efficient algorithms for solving them. We also highlight applications of the proposed algorithms to data-driven problems such as functional and regulatory genomics of human diseases, and cancer.
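To give a flavor of the spectral machinery involved, here is a minimal spectral clustering sketch in Python (a generic, textbook version built on numpy and scikit-learn; it is not the speaker's algorithm, and the logistic Random Dot Product Graph model is not implemented here):

import numpy as np
from sklearn.cluster import KMeans

def spectral_clusters(A, k):
    # Cluster the nodes of an undirected graph with adjacency matrix A into k
    # groups using the symmetric normalized Laplacian (generic spectral clustering).
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # Eigenvectors of the k smallest eigenvalues carry the cluster structure.
    _, eigvecs = np.linalg.eigh(L)
    U = eigvecs[:, :k]
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    # k-means on the row-normalized spectral embedding assigns cluster labels.
    return KMeans(n_clusters=k, n_init=10).fit_predict(U)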

Date and Time: 
Thursday, October 15, 2015 - 3:00pm to 4:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Communication in Strategic Environments: Crawford-Sobel Meet Shannon
Abstract / Description: 

Over thirty years ago, economists Vincent Crawford and Joel Sobel introduced the concepts of strategic information transmission (SIT) and cheap talk in their seminal Econometrica paper, as a way of understanding how information is strategically revealed (or not) by agents whose interests are only partially aligned. This theory has had tremendous success in explaining situations ranging from advertising to expert advice sharing, and many extensions of the original SIT model and the broader "principal-agent" class of problems have been extensively studied in the economics literature since. However, despite its name and even superficially obvious connection with information theory (IT), SIT has so far received very little attention from the IT community.

In this talk, I will present approaches to such strategic communication problems through the lens of information theory and game theory. Specifically, I will focus on a strategic communication paradigm in which the better-informed transmitter communicates with a receiver who makes the ultimate decision concerning both agents. While economists have extensively studied the Nash equilibrium variant of this problem, the more relevant Stackelberg equilibrium enables the use of Shannon-theoretic tools. I will present the fundamental limits of strategic compression and communication problems in the SIT context. In particular, three problem settings will be considered, focusing on quadratic distortion measures and jointly Gaussian variables: compression, communication, and the simple equilibrium conditions without any compression or communication. The analysis will then be extended to the receiver side-information setting, where the strategic aspect of the problem yields rather surprising results regarding the optimality of uncoded communication. Finally, several applications of the results within the broader context of decision theory will be presented.
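To fix ideas, the classical quadratic formulation of this problem can be sketched as follows (a standard textbook version, which may differ in details from the settings treated in the talk): the sender observes a source $X$, transmits a message, the receiver then chooses an action $a$, and the two agents minimize misaligned quadratic costs

$$ J_{\text{sender}} = \mathbb{E}\big[(a - X - b)^2\big], \qquad J_{\text{receiver}} = \mathbb{E}\big[(a - X)^2\big], $$

where the bias $b \neq 0$ quantifies the conflict of interest. In the Stackelberg variant the sender commits to its encoding strategy before the game is played, and it is this commitment that makes Shannon-theoretic compression and channel-coding arguments applicable.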

Date and Time: 
Friday, October 9, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Recursive Estimation and Application of Multivariate Markov Processes
Abstract / Description: 

A multivariate Markov process is a vector of random processes which are jointly, but not necessarily individually, Markov. Multivariate Markov processes form a rich family of mathematically tractable models which have been found useful in numerous applications. Examples include the hidden Markov process in discrete time and the Markov-modulated Poisson process in continuous time. Multivariate Markov processes lend themselves to the mathematical tractability of Markov processes while allowing non-Markovian features for the individual process components. For example, while the distribution of the sojourn time of a multivariate Markov chain in each of its states is geometric or exponential, the distribution of the sojourn time of an individual process component in each of its states is phase-type. In this talk we present forward recursions for some statistics of discrete- and continuous-time multivariate Markov processes which are relevant to their parameter estimation. We shall also briefly review applications in cognitive radio and network tomography.
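For the discrete-time hidden Markov special case mentioned above, the flavor of such forward recursions can be illustrated with the standard textbook filter (a generic sketch in Python, not the recursions derived in the talk):

import numpy as np

def hmm_forward(P, B, pi, obs):
    # Forward (filtering) recursion for a hidden Markov chain.
    #   P[i, j] : transition probability from state i to state j
    #   B[i, k] : probability of emitting symbol k from state i
    #   pi[i]   : initial state distribution
    #   obs     : observed symbol sequence (list of symbol indices)
    # Returns the filtered state posteriors and the observation log-likelihood.
    alpha = pi.copy()
    log_lik = 0.0
    posteriors = []
    for t, y in enumerate(obs):
        if t > 0:
            alpha = alpha @ P          # predict: propagate through the chain
        alpha = alpha * B[:, y]        # update: weight by the emission likelihood
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha = alpha / scale          # normalize to avoid numerical underflow
        posteriors.append(alpha.copy())
    return np.array(posteriors), log_lik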

This presentation is based on joint work with Brian L. Mark. This work was supported in part by the National Science Foundation under grant CCF-0916568.

Date and Time: 
Friday, July 24, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Deep Speech, a Deep Learning based speech recognition system
Abstract / Description: 

Speech recognition is still an unsolved problem in AI. Humans transcribe speech substantially better than machines, particularly when the speech is noisy, accented, or spoken in a natural, unaffected manner. Over the past half-century, slow yet steady progress has been made in speech recognition, punctuated by rare breakthroughs, including the Hidden Markov Model in the 1970s and, more recently, Deep Learning.

In this talk I will give a brief overview of the current state of speech recognition technology and discuss the challenges that must be overcome in order to make progress and eventually approach human-level performance. I will also focus on Deep Speech, a Deep Learning based speech recognition system built at Baidu Research's Silicon Valley AI lab. Key to the success of Deep Speech is a significantly simpler architecture than traditional speech systems, a well-optimized recurrent neural network training system that uses multiple GPUs, and a set of novel data synthesis techniques. Taken together, these ingredients have shown great potential for rapid progress in speech recognition.
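For readers unfamiliar with this class of models, a heavily stripped-down sketch of a Deep-Speech-flavored pipeline is shown below (PyTorch, with made-up layer sizes; the actual Deep Speech architecture and training system differ in many details):

import torch
import torch.nn as nn

class TinySpeechRNN(nn.Module):
    # Minimal sketch: spectrogram frames -> recurrent layers -> per-frame
    # character log-probabilities, trained with the alignment-free CTC loss.
    def __init__(self, n_features=161, n_hidden=256, n_chars=29):
        super().__init__()
        self.rnn = nn.GRU(n_features, n_hidden, num_layers=3, bidirectional=True)
        self.out = nn.Linear(2 * n_hidden, n_chars)   # n_chars includes the CTC blank

    def forward(self, x):                             # x: (time, batch, n_features)
        h, _ = self.rnn(x)
        return self.out(h).log_softmax(dim=-1)        # (time, batch, n_chars)

# One illustrative training step on random data.
model, ctc = TinySpeechRNN(), nn.CTCLoss(blank=0)
x = torch.randn(200, 8, 161)                          # 200 frames, batch of 8
targets = torch.randint(1, 29, (8, 30))               # character indices; 0 is the blank
log_probs = model(x)
loss = ctc(log_probs, targets,
           torch.full((8,), 200, dtype=torch.long),   # input (frame) lengths
           torch.full((8,), 30, dtype=torch.long))    # target (character) lengths
loss.backward()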

Date and Time: 
Friday, June 5, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202

IT-Forum

Topic: 
Capacity of Energy Harvesting Communication
Abstract / Description: 

Recent advances in energy harvesting technologies enable wireless devices to harvest the energy they need from the natural resources in their environment. This development opens the exciting new possibility of building wireless networks that are self-powered and self-sustaining, with lifetimes limited by their hardware rather than the size of their batteries.

However, energy harvesting also brings a fundamental shift in communication system design principles. In conventional systems, energy (or power) is a deterministic quantity continuously available to the transmitter, and communication is typically constrained only in terms of average power. In harvesting systems, the energy available for communication is a stochastic rather than deterministic process, one that has memory and is input-dependent. In this talk, we will investigate the information-theoretic capacity of this non-traditional communication system. We will characterize the capacity as an n-letter mutual information rate under various assumptions on the availability of energy arrival information, and use these n-letter expressions to approximate the capacity in terms of a power control problem that has been extensively studied in the recent communication theory literature. We will then derive an approximate solution to this power control problem, which will allow us to approximate the capacity by a simple and insightful formula, within a constant gap that is independent of system parameters.
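To fix notation (a rough paraphrase of the standard formulation, not a statement from the talk): with a battery of size $\bar{B}$ and energy arrivals $E_t$, the channel inputs must satisfy the causal energy constraints $X_t^2 \le B_t$, where the battery state evolves as $B_{t+1} = \min\{B_t - X_t^2 + E_{t+1}, \bar{B}\}$. The capacity then admits a multi-letter characterization of the form

$$ C = \lim_{n \to \infty} \frac{1}{n} \max I(X^n; Y^n), $$

with the maximization taken over input distributions that respect these constraints (and over the appropriate strategies when the energy arrivals are observed causally). The approximation described above collapses this multi-letter expression into a single-letter power-control problem whose solution matches the capacity to within a constant gap.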

Joint work with Dor Shaviv, Phan-Minh Nguyen and Yishun Dong.

Date and Time: 
Friday, May 29, 2015 - 1:00pm to 2:00pm
Venue: 
Packard 202
