Deep unsupervised learning using nonequilibrium thermodynamics
Abstract
A central problem in machine learning involves modeling complex data-sets using highly flexible families of probability distributions in which learning, sampling, inference, and evaluation are still analytically or computationally tractable. Here, we develop an approach that simultaneously achieves both flexibility and tractability. The essential idea, inspired by non-equilibrium statistical physics, is to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process. We then learn a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data. This approach allows us to rapidly learn, sample from, and evaluate probabilities in deep generative models with thousands of layers or time steps, as well as to compute conditional and posterior probabilities under the learned model. We additionally release an open source reference implementation of the algorithm.
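To make the forward process described in the abstract concrete, here is a minimal sketch (not the paper's reference implementation) of an iterative Gaussian forward diffusion: each step shrinks the signal slightly and adds noise according to an assumed variance schedule `betas`, so that after many steps the data distribution is destroyed and approaches a standard Gaussian. The function name `forward_diffusion` and the linear schedule are illustrative assumptions.

```python
import numpy as np

def forward_diffusion(x0, betas, rng=None):
    """Iteratively add Gaussian noise to data x0 following a fixed
    variance schedule `betas`, gradually destroying its structure.
    Returns the full trajectory x_0 ... x_T (illustrative sketch)."""
    rng = rng or np.random.default_rng()
    trajectory = [x0]
    x = x0
    for beta in betas:
        # Each step keeps sqrt(1 - beta) of the signal and adds
        # Gaussian noise with variance beta.
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
        trajectory.append(x)
    return trajectory

# Example: a 1000-step linear schedule applied to a toy 2-D data point.
betas = np.linspace(1e-4, 0.02, 1000)
traj = forward_diffusion(np.array([1.0, -0.5]), betas)
print(traj[-1])  # by the last step, approximately a standard Gaussian sample
```

The generative model in the paper is the learned reverse of this chain: a parameterized process trained to undo each small noising step, which is tractable precisely because every individual step is a small Gaussian perturbation.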
Related Papers
- Generative Modeling by Estimating Gradients of the Data Distribution (2019)
- Score-Based Generative Modeling through Stochastic Differential Equations (2021)
- Auto-Encoding Variational Bayes (2013)
- Denoising Diffusion Probabilistic Models (2020)
- Deep Unsupervised Learning using Nonequilibrium Thermodynamics (2015)