Label Reuse for Efficient Semi-Supervised Learning

TH Hsieh, JC Chen, CS Chen - ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020 - ieeexplore.ieee.org
In this paper, we propose a new learning strategy for semi-supervised deep learning algorithms, called label reuse, which aims to significantly reduce the computational cost of generating pseudo labels for each unlabeled training instance, a cost that is expensive because pseudo labels must be repeatedly re-evaluated throughout training. For label reuse, we first divide the unlabeled training data into several partitions, replicate each partition into several copies, and place the copies consecutively in the training queue, so that the pseudo labels computed for the first copy can be reused until they are invalidated. To evaluate the effectiveness of the proposed approach, we conduct extensive experiments on CIFAR-10 [1] and SVHN [2], applying it to the recent state-of-the-art semi-supervised deep learning approach MixMatch [3]. The results demonstrate that the proposed approach substantially reduces the pseudo-label computation cost of MixMatch while maintaining comparable classification performance.
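To make the partition-and-replicate schedule concrete, the following is a minimal Python sketch of the idea as described in the abstract, not the authors' implementation. The callables guess_pseudo_labels (standing in for an expensive pseudo-label step such as MixMatch's label guessing) and train_step (one gradient update) are hypothetical placeholders.

    import random

    def label_reuse_schedule(unlabeled_data, num_partitions, num_copies,
                             guess_pseudo_labels, train_step):
        """Partition the unlabeled data, then train on each partition
        num_copies times in a row, generating pseudo labels only once
        per partition and reusing them for the remaining copies."""
        random.shuffle(unlabeled_data)
        size = len(unlabeled_data) // num_partitions
        partitions = [unlabeled_data[i * size:(i + 1) * size]
                      for i in range(num_partitions)]
        for partition in partitions:
            # Expensive step: evaluate pseudo labels once per partition.
            pseudo_labels = guess_pseudo_labels(partition)
            # Cheap steps: the consecutive copies of this partition reuse
            # the cached labels, so the pseudo-label cost is amortized
            # over num_copies updates before the labels are discarded.
            for _ in range(num_copies):
                train_step(partition, pseudo_labels)

Under this schedule the pseudo-label step runs once per partition rather than once per pass over the data, which is the source of the claimed cost reduction; the trade-off is that the reused labels grow stale as the model is updated, hence the eventual invalidation.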