Journal Article

Critical Classes and Samples Discovering for Partial Domain Adaptation
Document Type
Periodical
Source
IEEE Transactions on Cybernetics, 53(9):5641-5654, Sep. 2023
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Robotics and Control Systems
General Topics for Engineers
Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Handheld computers
Training
Task analysis
Temperature distribution
Space exploration
Research and development
Knowledge transfer
Adversarial learning
ambiguous target score
partial domain adaptation (PDA)
source class weighting
Language
English
ISSN
2168-2267
2168-2275
Abstract
Partial domain adaptation (PDA) attempts to learn transferable models from a large-scale labeled source domain to a small unlabeled target domain with fewer classes, and has attracted a recent surge of interest in transfer learning. Most conventional PDA approaches endeavor to design delicate source weighting schemes that leverage target predictions to align cross-domain distributions in the shared class space. However, two crucial issues are overlooked in these methods. First, target prediction is a double-edged sword, and inaccurate predictions will inevitably result in negative transfer. Second, not all target samples have equal transferability during adaptation; thus, "ambiguous" target data predicted with high uncertainty should be paid more attention. In this article, we propose a critical classes and samples discovering network (CSDN) to identify the most relevant source classes and critical target samples, such that more precise cross-domain alignment in the shared label space can be enforced by co-training two diverse classifiers. Specifically, during the training process, CSDN introduces an adaptive source class weighting scheme to select the most relevant classes dynamically. Meanwhile, based on the designed target ambiguous score, CSDN places greater emphasis on ambiguous target samples with larger prediction inconsistency to enable fine-grained alignment. Taking a step further, the weighting schemes in CSDN can be easily coupled with other PDA and domain adaptation (DA) methods to further boost their performance, thereby demonstrating its flexibility. Extensive experiments on four highly competitive benchmark datasets verify that CSDN attains excellent results compared with state-of-the-art methods.
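The abstract describes two weighting mechanisms: an adaptive source class weighting scheme derived from target predictions, and a target ambiguous score that emphasizes target samples on which two co-trained classifiers disagree. The snippet below is a minimal PyTorch sketch of how such weights could be computed under plain assumptions; the function names, the averaging-based class weights, and the L1 disagreement score are illustrative choices, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def source_class_weights(target_logits_a, target_logits_b):
    # Average the two classifiers' softmax predictions over all target samples;
    # source classes that target data is rarely predicted as receive small weights,
    # approximating the "adaptive source class weighting" idea (assumed form).
    probs = 0.5 * (F.softmax(target_logits_a, dim=1) + F.softmax(target_logits_b, dim=1))
    w = probs.mean(dim=0)               # shape: (num_source_classes,)
    return w / w.max()                  # normalize so the most relevant class gets weight 1

def target_ambiguity_score(target_logits_a, target_logits_b):
    # Per-sample disagreement between the two classifiers (L1 distance between
    # their softmax outputs); larger values mark more "ambiguous" target samples.
    p_a = F.softmax(target_logits_a, dim=1)
    p_b = F.softmax(target_logits_b, dim=1)
    return (p_a - p_b).abs().sum(dim=1)  # shape: (num_target_samples,)

# Toy usage: 8 target samples, 5 source classes, two classifier heads.
logits_a = torch.randn(8, 5)
logits_b = torch.randn(8, 5)
class_w = source_class_weights(logits_a, logits_b)
ambiguity = target_ambiguity_score(logits_a, logits_b)

# The class weights could then down-weight outlier source classes in the source
# classification loss, e.g. F.cross_entropy(source_logits, source_labels, weight=class_w),
# while the ambiguity scores could reweight per-sample alignment terms on the target side.
```

This sketch only illustrates the weighting signals; the paper's adversarial alignment and co-training losses are not reproduced here.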