Academic Paper

Refined Consistency for Semi-Supervised Learning with Knowledge Distillation / Refined Consistencyによる知識蒸留を用いた半教師あり学習
Document Type
Journal Article
Source
Proceedings of the Annual Conference of JSAI. 2021, :4
Subject
Semi-Supervised Learning
Language
Japanese
Abstract
Semi-supervised learning trains a model on both labeled and unlabeled data. Dual Student (DS), which transfers knowledge between two networks, and Multiple Student (MS), which expands the number of DS networks to four or more, have been proposed as semi-supervised learning methods. MS achieves higher accuracy than DS, but training MS is inefficient because knowledge is not transferred between all networks at once. In this paper, we propose refined consistency, which transfers knowledge between all networks at once, improving accuracy through a more efficient knowledge transfer scheme. In experiments on the CIFAR-100 dataset, we show that the proposed method achieves higher accuracy than MS.
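The abstract contrasts pairwise knowledge transfer (DS/MS) with transfer between all networks at once (refined consistency). The paper's exact loss is not given here, so the following is only a minimal sketch under assumptions: function names, the use of mean squared error, and the ensemble-mean target are all illustrative, not the authors' formulation.

```python
# Illustrative sketch (assumed formulation, not the paper's exact loss):
# pairwise transfer averages a consistency loss over network pairs,
# while an all-at-once scheme pulls every network toward the ensemble mean.

def mse(p, q):
    """Mean squared error between two prediction vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) / len(p)

def pairwise_consistency(preds):
    """DS/MS-style transfer: average MSE over all distinct network pairs."""
    n = len(preds)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += mse(preds[i], preds[j])
            pairs += 1
    return total / pairs

def all_at_once_consistency(preds):
    """All-networks-at-once transfer: each network's prediction is
    compared against the mean prediction of the whole ensemble."""
    n, k = len(preds), len(preds[0])
    mean = [sum(p[c] for p in preds) / n for c in range(k)]
    return sum(mse(p, mean) for p in preds) / n

# Four hypothetical student networks' class-probability outputs
preds = [
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.5, 0.3, 0.2],
    [0.8, 0.1, 0.1],
]
```

The all-at-once form needs only one ensemble target per step, rather than a separate transfer for each of the n(n-1)/2 pairs, which is the efficiency argument the abstract makes.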
