Academic Paper

SFME: Score Fusion from Multiple Experts for Long-tailed Recognition
Document Type
Conference
Source
2022 IEEE International Conference on Networking, Sensing and Control (ICNSC), pp. 1-6, Dec. 2022
Subject
Aerospace
Communication, Networking and Broadcast Technologies
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Image recognition
Self-supervised learning
Tail
Predictive models
Benchmark testing
Probability distribution
Sensors
Long-tailed Learning
Self-supervised Learning
Score Fusion Module
Language
Abstract
In real-world scenarios, datasets often follow a long-tailed distribution, making it difficult to train neural network models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning for the purpose of learning generalized features and propose a score fusion module that integrates the outputs of multiple expert models into a unified prediction. Specifically, we take inspiration from the observation that networks trained on a less imbalanced subset of the distribution tend to outperform networks trained on the entire dataset. However, subsets drawn from tail classes are not adequately represented due to their limited size, so their performance is actually unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, yielding a substantial improvement in subset performance. Unlike previous work that used knowledge distillation to compress the subset-trained models into a unified student model, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. We conduct extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our proposed model.
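The abstract does not specify how the score fusion module combines expert outputs; a minimal sketch of the general idea, assuming a simple weighted average of per-expert softmax scores (the function names and uniform-weight default here are illustrative, not the paper's actual method), might look like:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def fuse_scores(expert_logits, weights=None):
    """Fuse predictions from multiple experts by weighted-averaging
    their softmax scores (a common, simple fusion baseline).

    expert_logits: list of (batch, num_classes) arrays, one per expert.
    weights: optional per-expert weights; uniform if None.
    """
    probs = np.stack([softmax(l) for l in expert_logits])  # (E, B, C)
    if weights is None:
        weights = np.full(len(expert_logits), 1.0 / len(expert_logits))
    weights = np.asarray(weights).reshape(-1, 1, 1)
    return (weights * probs).sum(axis=0)  # (B, C)

# Toy example: two experts trained on different subsets of a
# long-tailed dataset produce logits for the same 3-class input.
expert_a = np.array([[2.0, 1.0, 0.0]])
expert_b = np.array([[1.8, 0.5, 0.2]])
fused = fuse_scores([expert_a, expert_b])
pred = fused.argmax(axis=-1)  # the fused score selects class 0
```

In practice, the weights could be learned or set per class (e.g. favoring a tail-specialized expert on tail classes), but that is beyond what the abstract states.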