Academic Paper

Adaptive Bounding Box Uncertainties via Two-Step Conformal Prediction
Document Type
Working Paper
Source
Subject
Computer Science - Computer Vision and Pattern Recognition
Computer Science - Machine Learning
Statistics - Machine Learning
Language
English
Abstract
Quantifying a model's predictive uncertainty is essential for safety-critical applications such as autonomous driving. We consider quantifying such uncertainty for multi-object detection. In particular, we leverage conformal prediction to obtain uncertainty intervals with guaranteed coverage for object bounding boxes. One challenge in doing so is that bounding box predictions are conditioned on the object's class label. Thus, we develop a novel two-step conformal approach that propagates uncertainty in predicted class labels into the uncertainty intervals of bounding boxes. This broadens the validity of our conformal coverage guarantees to include incorrectly classified objects, thus offering more actionable safety assurances. Moreover, we investigate novel ensemble and quantile regression formulations to ensure the bounding box intervals are adaptive to object size, leading to a more balanced coverage. Validating our two-step approach on real-world datasets for 2D bounding box localization, we find that desired coverage levels are satisfied with practically tight predictive uncertainty intervals.
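The core mechanism the abstract builds on, split conformal prediction, can be illustrated for a single bounding-box coordinate. The sketch below is a minimal, hedged example on synthetic data: the residual distribution, the coordinate name `x_min`, and the value `y_hat` are illustrative assumptions, and it shows only plain one-step conformal intervals, not the paper's two-step class-conditional procedure.

```python
import math
import random

# Hedged sketch: split conformal prediction for one bounding-box
# coordinate on synthetic data. All values are illustrative; this is
# NOT the paper's two-step method.
random.seed(0)

# Calibration residuals |y_pred - y_true| for one coordinate (pixels),
# simulated here as the absolute values of Gaussian noise.
residuals = sorted(abs(random.gauss(0.0, 5.0)) for _ in range(500))

alpha = 0.1                            # target miscoverage (>= 90% coverage)
n = len(residuals)
k = math.ceil((n + 1) * (1 - alpha))   # rank with finite-sample correction
q = residuals[min(k, n) - 1]           # conformal quantile of the residuals

# Interval for a new prediction y_hat: covers the true coordinate with
# probability >= 1 - alpha, marginally over calibration and test data.
y_hat = 120.0                          # hypothetical predicted x_min (pixels)
interval = (y_hat - q, y_hat + q)
```

The finite-sample correction (using the ceil((n+1)(1-alpha))-th smallest residual rather than the plain empirical quantile) is what makes the coverage guarantee hold exactly for any residual distribution, which is the property the abstract's "guaranteed coverage" refers to.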
Comment: European Conference on Computer Vision (ECCV) 2024; 37 pages, 14 figures, 6 tables (incl. appendix)