
Latent Variables and Lossless Compression

Summary

Speaker: James Townsend, University of Amsterdam
Date: Oct 28 (*note time)
Location: Zoom only

Zoom link password: 032264

Abstract: I will give some background/history of ‘latent variable models’, and explain how a last-in-first-out compression technique such as asymmetric numeral systems (ANS) allows you to introduce latent random variables during lossless compression. I will then discuss known examples where this is useful. These examples include (going from simple to more elaborate) rANS itself; ANS with the ‘alias method’; and a method for compressing images using variational auto-encoders (VAEs). Some prior familiarity with ANS will be useful for understanding the talk.
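As background for the last-in-first-out behaviour the abstract refers to, here is a toy sketch of rANS in Python. It uses unbounded Python integers and omits the renormalization a real streaming coder needs; the two-symbol alphabet and frequencies are purely illustrative, not taken from the talk.

```python
# Toy rANS sketch: unbounded-precision integer state, no renormalization.
# The alphabet and frequencies below are illustrative assumptions.
freqs = {'a': 3, 'b': 1}        # symbol frequencies
M = sum(freqs.values())         # total frequency (here 4)

# Cumulative frequencies: cum[s] is the start of s's slot range [cum[s], cum[s]+freqs[s]).
cum, c = {}, 0
for s, f in freqs.items():
    cum[s] = c
    c += f

def encode(x, s):
    """Push symbol s onto the integer state x."""
    f = freqs[s]
    return (x // f) * M + cum[s] + (x % f)

def decode(x):
    """Pop the most recently pushed symbol (last-in-first-out)."""
    slot = x % M
    s = next(t for t in freqs if cum[t] <= slot < cum[t] + freqs[t])
    return freqs[s] * (x // M) + slot - cum[s], s

# Encode a short message, then decode: symbols come back in reverse order,
# which is exactly the stack-like (LIFO) property the talk exploits.
x = 1
for s in 'aab':
    x = encode(x, s)
out = []
for _ in range(3):
    x, s = decode(x)
    out.append(s)
assert ''.join(out) == 'baa'   # reversed 'aab'
```

The LIFO property is what makes it possible to "decode" latent variables out of the compressed state and later encode them back in, which is the mechanism behind the bits-back-style methods discussed in the talk.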


Bio: James Townsend is a post-doc machine learning researcher, based at the Amsterdam Machine Learning Lab (AMLab) at the University of Amsterdam. He completed his PhD, on lossless compression with latent variable models, in 2020, supervised by Professor David Barber at the UCL AI Centre in London. Most of his research to date has been on deep generative models and lossless compression. He is also interested in unsupervised learning more generally, approximate inference, Monte Carlo methods, optimization and the design of machine learning software systems.