Academic Paper

A General Framework for Ensemble Distribution Distillation
Document Type
Conference
Source
2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP), pp. 1-6, Sep. 2020
Subject
Signal Processing and Analysis
Uncertainty
Predictive models
Data models
Computational modeling
Training
Neural networks
Ensemble
distillation
Language
English
Abstract
Ensembles of neural networks have been shown to give better predictive performance and more reliable uncertainty estimates than individual networks. Additionally, ensembles allow the uncertainty to be decomposed into aleatoric (data) and epistemic (model) components, giving a more complete picture of the predictive uncertainty. Ensemble distillation is the process of compressing an ensemble into a single model, often resulting in a leaner model that still outperforms the individual ensemble members. Unfortunately, standard distillation erases the natural uncertainty decomposition of the ensemble. We present a general framework for distilling both regression and classification ensembles in a way that preserves the decomposition. We demonstrate the desired behaviour of our framework and show that its predictive performance is on par with standard distillation.
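For intuition, the aleatoric/epistemic split mentioned in the abstract is, for classification ensembles, commonly computed as a mutual-information decomposition of the predictive entropy: total uncertainty is the entropy of the ensemble-averaged prediction, aleatoric uncertainty is the average entropy of the individual members, and epistemic uncertainty is their difference. The sketch below illustrates only that standard decomposition; it is not taken from the paper, whose framework generalizes it, and the function name uncertainty_decomposition is hypothetical.

```python
import numpy as np

def uncertainty_decomposition(member_probs):
    """Split the predictive uncertainty of a classification ensemble.

    member_probs: array of shape (M, K), softmax outputs of M ensemble
    members over K classes for a single input. Returns (total, aleatoric,
    epistemic) in nats, via the standard mutual-information decomposition.
    """
    eps = 1e-12  # guard against log(0)
    mean_probs = member_probs.mean(axis=0)
    # Total: entropy of the ensemble-averaged predictive distribution.
    total = -np.sum(mean_probs * np.log(mean_probs + eps))
    # Aleatoric: average entropy of the individual member predictions.
    aleatoric = -np.mean(np.sum(member_probs * np.log(member_probs + eps), axis=1))
    # Epistemic: mutual information between the prediction and the model.
    epistemic = total - aleatoric
    return total, aleatoric, epistemic

# Members that agree give low epistemic uncertainty; members that
# disagree give high epistemic uncertainty at similar total uncertainty.
agree = np.array([[0.90, 0.10], [0.88, 0.12], [0.92, 0.08]])
disagree = np.array([[0.95, 0.05], [0.50, 0.50], [0.05, 0.95]])
print(uncertainty_decomposition(agree))     # epistemic near zero
print(uncertainty_decomposition(disagree))  # epistemic dominates
```

Standard distillation fits a single network to the averaged prediction alone, which is why the individual member outputs, and hence this split, are lost.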