Spotlight
Invertible Convolutional Flow
Mahdi Karami · Dale Schuurmans · Jascha Sohl-Dickstein · Laurent Dinh · Daniel Duckworth

Tue Dec 10th 04:35 -- 04:40 PM @ West Exhibition Hall C + B3

Normalizing flows can be used to construct high-quality generative probabilistic models, but training and sample generation require repeated evaluation of Jacobian determinants and function inverses. To make such computations feasible, current approaches employ highly constrained architectures that produce diagonal, triangular, or low-rank Jacobian matrices. As an alternative, we investigate a set of novel normalizing flows based on circular and symmetric convolutions. We show that these transforms admit efficient Jacobian determinant computation and inverse mapping (deconvolution) in O(N log N) time. Additionally, element-wise multiplication, widely used in normalizing flow architectures, can be combined with these transforms to increase modeling flexibility. We further propose an analytic approach to designing nonlinear element-wise bijectors that induce special properties in the intermediate layers, by implicitly introducing specific regularizers in the loss. We show that these transforms allow more effective normalizing flow models to be developed for generative image models.
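The O(N log N) claim for circular convolutions follows from the fact that a circular convolution's Jacobian is a circulant matrix, whose eigenvalues are the DFT coefficients of the kernel. The following is a minimal NumPy sketch of that idea (not the authors' implementation): the forward transform, its log-absolute Jacobian determinant, and the inverse mapping are all computed with FFTs.

```python
import numpy as np

def circular_conv(w, x):
    # Circular convolution computed in the Fourier domain: O(N log N).
    return np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

def log_abs_det_jacobian(w):
    # The Jacobian of a circular convolution is a circulant matrix whose
    # eigenvalues are the DFT coefficients of the kernel w, so the
    # log-absolute-determinant is a sum of log-magnitudes.
    return np.sum(np.log(np.abs(np.fft.fft(w))))

def circular_deconv(w, y):
    # Inverse mapping (deconvolution) by division in the Fourier domain,
    # valid as long as no DFT coefficient of w is zero.
    return np.fft.ifft(np.fft.fft(y) / np.fft.fft(w)).real

rng = np.random.default_rng(0)
N = 8
w = rng.normal(size=N)  # convolution kernel
x = rng.normal(size=N)  # input signal

y = circular_conv(w, x)
x_rec = circular_deconv(w, y)
print(np.allclose(x, x_rec))  # exact round-trip inversion

# Sanity check against the dense circulant Jacobian built explicitly.
J = np.stack([np.roll(w, i) for i in range(N)], axis=0)
print(np.isclose(log_abs_det_jacobian(w),
                 np.log(np.abs(np.linalg.det(J)))))
```

The symmetric-convolution case discussed in the paper admits an analogous treatment with real symmetric transforms; the circular case above is just the simplest illustration of why determinants and inverses cost no more than an FFT.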

Author Information

Mahdi Karami (University of Alberta)
Dale Schuurmans (Google)
Jascha Sohl-Dickstein (Google Brain)
Laurent Dinh (Google Brain)
Daniel Duckworth (Google Brain)
