The Majority Can Help the Minority: Context-rich Minority Oversampling for Long-tailed Classification
Top 1% of 2022 papers
Abstract
The problem with class-imbalanced data is that the generalization performance of the classifier deteriorates due to the lack of data from minority classes. In this paper, we propose a novel minority oversampling method that augments diversified minority samples by leveraging the rich context of the majority classes as background images. Our key idea is to paste an image from a minority class onto a rich-context image from a majority class, which serves as the background. Our method is simple and can easily be combined with existing long-tailed recognition methods. We empirically demonstrate the effectiveness of the proposed oversampling method through extensive experiments and ablation studies. Without any architectural changes or complex algorithms, our method achieves state-of-the-art performance on various long-tailed classification benchmarks. Our code is available at https://github.com/naver-ai/cmo.
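To make the core idea concrete, the following is a minimal sketch of CutMix-style pasting of a minority-class patch onto a majority-class background. It is an illustration under stated assumptions, not the authors' exact implementation: the function name, the area-proportional patch sizing via `lam`, and the toy images are all hypothetical choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def paste_minority_on_majority(minority_img, majority_img, lam=0.5):
    """Paste a rectangular patch of a minority-class image onto a
    majority-class background image (a sketch of the abstract's idea)."""
    h, w = majority_img.shape[:2]
    # Patch side lengths follow the mixing ratio lam (area proportion).
    cut_h, cut_w = int(h * np.sqrt(lam)), int(w * np.sqrt(lam))
    # Random patch center, clipped to stay inside the image.
    cy, cx = rng.integers(0, h), rng.integers(0, w)
    y1, y2 = np.clip(cy - cut_h // 2, 0, h), np.clip(cy + cut_h // 2, 0, h)
    x1, x2 = np.clip(cx - cut_w // 2, 0, w), np.clip(cx + cut_w // 2, 0, w)
    out = majority_img.copy()
    out[y1:y2, x1:x2] = minority_img[y1:y2, x1:x2]
    return out

# Toy example: 32x32 RGB "images" (black background, white foreground).
bg = np.zeros((32, 32, 3), dtype=np.uint8)      # majority-class background
fg = np.full((32, 32, 3), 255, dtype=np.uint8)  # minority-class foreground
mixed = paste_minority_on_majority(fg, bg, lam=0.25)
```

In a training loop, the background would typically come from a majority-biased sampler and the foreground from a minority-biased one, with the label mixed in proportion to the pasted area.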