Building on recent theory that establishes a connection between implicit generative modeling (IGM) and optimal transport, in this study we propose a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem that aims to find a measure that is as close as possible to the data distribution while remaining expressive enough for generative modeling purposes. We formulate the problem as a gradient flow in the space of probability measures. The connection between gradient flows and stochastic differential equations allows us to develop a computationally efficient algorithm for solving the optimization problem. We provide a formal theoretical analysis in which we prove finite-time error guarantees for the proposed algorithm. Our experimental results support the theory and show that the algorithm successfully captures the structure of different types of data distributions.
The talk will be based on the following paper:
"Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions", ICML 2019
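To make the idea concrete, here is a minimal illustrative sketch (not the authors' implementation) of a sliced-Wasserstein flow in the spirit of the paper: particles are repeatedly pushed along one-dimensional optimal transport maps computed on random projections of the data, with a small Gaussian term standing in for the diffusion part of the SDE discretization. All names, step sizes, and the toy dataset below are assumptions chosen for illustration.

```python
import numpy as np

def swf_step(particles, data, n_projections=50, step=0.5, noise=0.01, rng=None):
    """One Euler step of a sliced-Wasserstein flow (illustrative sketch).

    For each random direction, particles are moved along the 1D optimal
    transport map (monotone rearrangement) toward the projected data; a
    Gaussian perturbation plays the role of the diffusion term.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = particles.shape
    drift = np.zeros_like(particles)
    for _ in range(n_projections):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)        # random direction on the sphere
        proj_p = particles @ theta            # 1D projections of particles
        proj_x = data @ theta                 # 1D projections of data
        # The data quantiles at the particles' ranks give the 1D optimal
        # transport targets (sorting solves 1D optimal transport exactly).
        ranks = np.argsort(np.argsort(proj_p))
        targets = np.sort(proj_x)[np.floor(ranks * len(proj_x) / n).astype(int)]
        drift += np.outer(targets - proj_p, theta)
    particles = particles + step * drift / n_projections
    particles = particles + np.sqrt(2 * noise * step) * rng.standard_normal((n, d))
    return particles

# Toy run: push standard-Gaussian particles toward a shifted Gaussian "dataset".
rng = np.random.default_rng(0)
data = rng.standard_normal((2000, 2)) + np.array([4.0, -3.0])
particles = rng.standard_normal((500, 2))
for _ in range(200):
    particles = swf_step(particles, data, rng=rng)
```

Because each 1D transport map is obtained by sorting, every step costs only O(n log n) per projection, which is what makes the approach computationally attractive compared with solving high-dimensional optimal transport directly.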
Umut Şimşekli received his PhD in 2015 from Boğaziçi University, İstanbul, Turkey. His current research interests are large-scale Bayesian machine learning, diffusion-based Markov chain Monte Carlo, non-convex optimization, and audio/music processing. He is an assistant professor in the Signals, Statistics, and Machine Learning Group at Telecom ParisTech, Paris, France, and currently a long-term visitor at the Department of Statistics, University of Oxford.