What is the minimal mean squared error in recovering a Brownian motion path from an encoded version that uses only finitely many bits per unit time? Is it possible to attain a bounded mean squared error in this case?

Berger's PhD thesis answers these two questions by providing a coding theorem for the Brownian motion, showing that its optimal distortion-rate performance is described by Shannon's distortion-rate function. That is, Berger's result implies that for any positive bitrate, there exists a coding scheme that allows recovery of the Brownian path with mean squared error given by Shannon's distortion-rate function (first evaluated by Iaglom).

Berger's result, however, does not take into account practical considerations in encoding analog processes. Indeed, hardware and other implementation constraints restrict access to samples of the continuous-time path, taken at a finite time resolution. The question that we ask in this talk is: what is the minimal mean squared error in recovering the original Brownian path from a code that is only a function of these samples? This question is answered by deriving the distortion-rate function as a function of the sampling rate. It can be shown that the ratio between this function and Berger's optimal distortion-rate performance without a sampling constraint is only a function of the number of bits per sample. For example, our result implies that using one bit per sample, one can attain a distortion no more than 1.12 times the distortion-rate function.
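In symbols (the notation here is illustrative, not taken from the talk): let $D(R)$ denote Iaglom's distortion-rate function of the Brownian motion, $D_{f_s}(R)$ its counterpart under sampling at rate $f_s$, and $\bar{R} = R/f_s$ the number of bits per sample. The claim that the ratio depends only on the bits per sample can then be sketched as:

```latex
\frac{D_{f_s}(R)}{D(R)} \;=\; \rho\!\left(\bar{R}\right),
\qquad \bar{R} = \frac{R}{f_s},
\qquad \rho(1) \le 1.12 .
```

Here $\rho(\cdot)$ is a hypothetical name for the ratio function; the abstract only asserts its existence and its value bound at one bit per sample.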

Another interesting question arises in the following case: assume that the source encoder receives the discrete-time samples but is unaware of their underlying continuous-time origin. As a result, the encoder applies an optimal source code with respect to the empirical distribution of these samples, namely with respect to a discrete-time Brownian motion. We show that the maximal loss in this case, compared to an encoder that considers the continuous-time origin, does not exceed 1/6 times the intensity of the process.
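Using the same illustrative notation, with $\sigma^2$ denoting the intensity (variance parameter) of the process and $D^{\mathrm{mis}}_{f_s}(R)$ the distortion of the mismatched encoder that treats the samples as a discrete-time Brownian motion, the bound on the maximal loss can be written as:

```latex
\sup_{R > 0} \left[ D^{\mathrm{mis}}_{f_s}(R) - D_{f_s}(R) \right] \;\le\; \frac{\sigma^2}{6} .
```

This is only a schematic restatement of the claim in the abstract; the precise definitions are given in the talk.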

The Information Theory Forum (IT-Forum) at Stanford ISL is an interdisciplinary academic forum which focuses on mathematical aspects of information processing. With a primary emphasis on information theory, we also welcome researchers from signal processing, learning and statistical inference, control and optimization to deliver talks at our forum. We also warmly welcome industrial affiliates in the above fields. The forum is typically held in Packard 202 every Friday at 1:00 pm during the academic year.

The Information Theory Forum is organized by graduate students Jiantao Jiao and Yanjun Han. To suggest speakers, please contact either of the students.

**Bio:**

Alon Kipnis is a fifth-year PhD candidate in the Department of Electrical Engineering at Stanford. Previously, he received his M.Sc. in mathematics from Ben-Gurion University.