Generalized Normalizing Flows via Markov Chains
Paul Lyonel Hagemann, Johannes Hertrich, Gabriele Steidl
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework for handling these approaches via Markov chains. The authors consider stochastic normalizing flows as pairs of Markov chains fulfilling certain properties and show how many state-of-the-art models for data generation fit into this framework. Numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables the coupling of deterministic layers, such as invertible neural networks, with stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework thus establishes a useful mathematical tool for combining these approaches.
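To illustrate the idea of alternating deterministic and stochastic layers, here is a minimal sketch, not taken from the Element itself: the toy two-mode target, the fixed affine map standing in for a learned invertible network, and the Langevin step sizes are all illustrative assumptions. It shows how a stochastic (Langevin) layer can move samples from a unimodal latent distribution toward a multimodal target.

```python
import numpy as np

# Hypothetical toy target: unnormalized log-density of a two-mode 1D Gaussian mixture.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def grad_log_target(x):
    # Gradient of log_target; w is the responsibility of the +3 mode.
    w = np.exp(-0.5 * (x - 3.0) ** 2 - log_target(x))
    return -w * (x - 3.0) - (1.0 - w) * (x + 3.0)

def deterministic_layer(x, scale=1.5, shift=0.0):
    # Stand-in for an invertible neural network layer (here a fixed affine map).
    return scale * x + shift

def langevin_layer(x, step=0.05, n_steps=20, rng=None):
    # Stochastic layer: unadjusted Langevin dynamics targeting log_target.
    rng = rng or np.random.default_rng(0)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_target(x) + np.sqrt(2.0 * step) * noise
    return x

rng = np.random.default_rng(0)
z = rng.standard_normal(5000)      # unimodal latent samples
x = deterministic_layer(z)         # deterministic (invertible) layer
x = langevin_layer(x, rng=rng)     # stochastic layer spreads mass over both modes
print("fraction of samples near the +3 mode:", np.mean(x > 0))
```

In the framework described above, such deterministic and stochastic layers would be composed into a pair of forward and reverse Markov chains; this sketch only conveys why the stochastic layers add expressivity.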