
Structured Dropout Variational Inference for Bayesian Neural Networks
Son Nguyen · Duong Nguyen · Khai Nguyen · Khoat Than · Hung Bui · Nhat Ho

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

Approximate inference in Bayesian deep networks faces a dilemma: how to obtain high-fidelity posterior approximations while maintaining computational efficiency and scalability. We tackle this challenge by introducing a novel variational structured approximation inspired by the Bayesian interpretation of Dropout regularization. Concretely, we focus on the inflexibility of the factorized structure of the Dropout posterior and propose an improved method called Variational Structured Dropout (VSD). VSD employs an orthogonal transformation to learn a structured representation of the variational Gaussian noise at manageable complexity, and consequently induces statistical dependencies in the approximate posterior. Theoretically, VSD addresses the pathologies of previous Variational Dropout methods and thus offers a standard Bayesian justification. We further show that VSD induces an adaptive regularization term with several desirable properties that contribute to better generalization. Finally, we conduct extensive experiments on standard benchmarks to demonstrate the effectiveness of VSD over state-of-the-art variational methods on predictive accuracy, uncertainty estimation, and out-of-distribution detection.
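The core idea in the abstract — applying a learned orthogonal transformation to factorized Gaussian dropout noise so that the multiplicative noise (and hence the implied posterior) becomes correlated — can be illustrated with a minimal sketch. This is not the authors' code: the function names (`householder`, `vsd_forward`), the choice of a single Householder reflection as the orthogonal map, and the per-input-unit noise parameterization are all assumptions made for illustration; the paper's actual parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def householder(v):
    """Orthogonal Householder reflection H = I - 2 vv^T / ||v||^2.

    One cheap way to parameterize an orthogonal matrix by a single
    learnable vector v (an illustrative choice, not necessarily the paper's).
    """
    v = v / np.linalg.norm(v)
    return np.eye(v.size) - 2.0 * np.outer(v, v)

def vsd_forward(x, W, mu, sigma, v):
    """One stochastic forward pass of a linear layer with structured
    multiplicative Gaussian noise (hypothetical sketch of the VSD idea).

    x:  input activations, shape (d_in,)
    W:  weight matrix, shape (d_out, d_in)
    mu, sigma: mean/std of the factorized Gaussian dropout noise, shape (d_in,)
    v:  vector defining the learned orthogonal transform, shape (d_in,)
    """
    eps = rng.standard_normal(mu.shape)   # factorized base noise
    z = mu + sigma * eps                  # plain Gaussian dropout noise
    z_struct = householder(v) @ z         # orthogonal map -> correlated noise
    return W @ (x * z_struct)             # multiplicative noise on the inputs

# Toy usage: one noisy forward pass through a 4 -> 3 linear layer.
d_in, d_out = 4, 3
x = rng.standard_normal(d_in)
W = rng.standard_normal((d_out, d_in))
y = vsd_forward(x, W, np.ones(d_in), 0.1 * np.ones(d_in),
                rng.standard_normal(d_in))
```

Because the transform is orthogonal, it mixes the noise dimensions (inducing dependencies) without changing the noise magnitude, which is what keeps the added complexity modest.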

Author Information

Son Nguyen (VinAI Artificial Intelligence Application and Research JSC)
Duong Nguyen (Hanoi University of Science and Technology)
Khai Nguyen (University of Texas at Austin)
Khoat Than (VinAI Research)
Hung Bui (Google DeepMind)
Nhat Ho (University of Texas at Austin)
