Academic Article

Efficient Priors for Scalable Variational Inference in Bayesian Deep Neural Networks
Document Type
Conference
Source
2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW), pp. 773-777, Oct. 2019
Subject
Computing and Processing
Bayes methods
Uncertainty
Neural networks
Training
Mathematical model
Convergence
Bayesian deep neural networks
variational inference
uncertainty estimates
Bayesian Priors
Language
English
ISSN
2473-9944
Abstract
Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors and approximate posterior distributions for the neural network weights. Specifying meaningful weight priors is a challenging problem, particularly when scaling variational inference to deeper architectures with high-dimensional weight spaces. Based on the empirical Bayes approach, we propose the Bayesian MOdel Priors Extracted from Deterministic DNN (MOPED) method, which chooses meaningful prior distributions over the weight space using deterministic weights derived from a pretrained DNN of equivalent architecture. We empirically evaluate the proposed approach on real-world applications, including image classification, video activity recognition, and audio classification, with neural network architectures of varying complexity. The proposed method enables scalable variational inference with faster training convergence and provides reliable uncertainty quantification.
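The abstract's core recipe, centering the weight prior (and, plausibly, the initial variational posterior) on pretrained deterministic weights, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name `moped_prior_and_init`, the perturbation factor `delta`, and the mean-field Gaussian family for both prior and posterior are assumed for illustration.

```python
import numpy as np

def moped_prior_and_init(w_pretrained, delta=0.1):
    """Sketch of an empirical-Bayes, MOPED-style setup (assumed form):
    center a mean-field Gaussian prior on pretrained deterministic
    weights and start the variational posterior near those weights.
    """
    # Prior N(w_pretrained, I): the mean comes from the pretrained DNN
    # of equivalent architecture, the scale is left at unit variance.
    prior_mean = w_pretrained
    prior_std = np.ones_like(w_pretrained)

    # Variational posterior initialization (assumed scheme): mean at
    # the pretrained weights, standard deviation proportional to the
    # weight magnitudes via the hypothetical factor `delta`.
    post_mean = w_pretrained.copy()
    post_std = delta * np.abs(w_pretrained)

    return (prior_mean, prior_std), (post_mean, post_std)

# Example: weights from one layer of a pretrained network.
w = np.random.randn(256, 128).astype(np.float32)
(prior_mu, prior_sigma), (q_mu, q_sigma) = moped_prior_and_init(w)
```

Starting the posterior at weights that already fit the data, rather than at a random initialization, is consistent with the faster training convergence the abstract reports.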