Robust Contrastive Learning against Noisy Views
Top 10% of 2022 papers
Abstract
Contrastive learning relies on an assumption that positive pairs contain related views that share certain underlying information about an instance, e.g., patches of an image or co-occurring multimodal signals of a video. What if this assumption is violated? The literature suggests that contrastive learning produces suboptimal representations in the presence of noisy views, e.g., false positive pairs with no apparent shared information. In this work, we propose a new contrastive loss function that is robust against noisy views. We provide rigorous theoretical justifications by showing connections to robust symmetric losses for noisy binary classification and by establishing a new contrastive bound for mutual information maximization based on the Wasserstein distance measure. The proposed loss is completely modality-agnostic and a simple drop-in replacement for the InfoNCE loss, which makes it easy to apply to existing contrastive frameworks. We show that our approach provides consistent improvements over the state-of-the-art on image, video, and graph contrastive learning benchmarks that exhibit a variety of real-world noise patterns.
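For context, below is a minimal NumPy sketch of the standard InfoNCE objective that the abstract says the proposed loss replaces as a drop-in. The in-batch negative sampling, cosine similarity, and temperature value here are common illustrative conventions, not the paper's exact implementation.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """Batched InfoNCE: row i of `positives` is the positive view for
    anchor i; all other rows in the batch serve as negatives."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature  # (N, N) similarity scores
    # cross-entropy with the diagonal (the true pair) as the target class
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

A false positive pair (a "noisy view") makes the diagonal target of this cross-entropy wrong for that row, which is the failure mode the paper's robust loss is designed to tolerate.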