Normalizing flows are a class of probabilistic generative models that allow both fast density computation and efficient sampling, and are effective at modelling complex distributions such as images. A drawback of current methods is their significant training cost, sometimes requiring months of GPU training time to achieve state-of-the-art results. This paper introduces Wavelet Flow, a multi-scale normalizing flow architecture based on wavelets. A Wavelet Flow has an explicit representation of signal scale that inherently includes models of lower-resolution signals and conditional generation of higher-resolution signals, i.e., super-resolution. A major advantage of Wavelet Flow is the ability to construct generative models for high-resolution data (e.g., 1024 × 1024 images) that are impractical with previous models. Furthermore, Wavelet Flow is competitive with previous normalizing flows in terms of bits per dimension on standard (low-resolution) benchmarks while being up to 15× faster to train.
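The multi-scale structure the abstract describes rests on a wavelet decomposition: each level splits a signal into a half-resolution approximation and detail coefficients, so lower-resolution models fall out of the representation for free. The sketch below illustrates this with a plain 2D Haar transform in NumPy; it is an illustrative assumption about the decomposition, not the paper's actual implementation, and the function names (`haar_step`, `haar_pyramid`) are hypothetical.

```python
import numpy as np

def haar_step(x):
    """One level of an orthonormal 2D Haar transform: split an (H, W)
    signal into a half-resolution average band and three detail bands."""
    a = x[0::2, 0::2]  # top-left of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    low = (a + b + c + d) / 2.0           # half-resolution approximation
    details = np.stack([
        (a - b + c - d) / 2.0,            # horizontal detail
        (a + b - c - d) / 2.0,            # vertical detail
        (a - b - c + d) / 2.0,            # diagonal detail
    ])
    return low, details

def haar_pyramid(x, levels):
    """Recursive decomposition: returns the coarsest band plus
    the detail bands collected at each level (finest first)."""
    all_details = []
    for _ in range(levels):
        x, details = haar_step(x)
        all_details.append(details)
    return x, all_details

img = np.random.rand(64, 64)
coarse, details = haar_pyramid(img, levels=3)
# coarse is 8x8; details[0] holds three 32x32 bands, etc.
```

Because the transform is invertible, modelling the coarse band plus the detail bands at every level is equivalent to modelling the full-resolution signal, which is what lets a flow over wavelet coefficients double as a super-resolution model.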
Author Information
Jason Yu (York University)
Konstantinos Derpanis (Ryerson University)
Marcus Brubaker (York University)