Towards Fair Federated Learning with Zero-Shot Data Augmentation
Abstract
Federated learning has emerged as an important distributed learning paradigm in which a server aggregates a global model from many client-trained models while having no access to the client data. Although it is well recognized that statistical heterogeneity of the clients' local data slows global model convergence, it is less commonly recognized that it also yields a biased federated global model with high variance in accuracy across clients. In this work, we aim to provide federated learning schemes with improved fairness. To tackle this challenge, we propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy across clients in federated networks. We study two variants of this scheme: Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server). Empirical results on a suite of datasets demonstrate the effectiveness of our methods in simultaneously improving test accuracy and fairness.
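The abstract does not detail the augmentation procedure itself, but the overall structure it describes — a FedAvg-style round where under-represented data is augmented either before local training (the Fed-ZDAC placement) or at the server (the Fed-ZDAS placement) — can be sketched roughly as follows. All function names here are illustrative, and `zero_shot_augment` is a hypothetical placeholder (simple noise-based synthesis), not the paper's actual zero-shot method:

```python
import numpy as np

def zero_shot_augment(data, rng):
    # HYPOTHETICAL placeholder for the paper's zero-shot data augmentation:
    # here we merely synthesize extra samples by perturbing existing ones.
    noise = rng.normal(scale=0.01, size=data.shape)
    return np.concatenate([data, data + noise])

def local_update(global_weights, data, lr=0.1):
    # Toy local "training": one least-squares gradient step, only so the
    # round is executable end to end; real clients would run SGD epochs.
    grad = data.T @ (data @ global_weights) / len(data)
    return global_weights - lr * grad

def fedavg_round(global_weights, client_datasets, augment_at_client=True, seed=0):
    """One communication round: clients train locally, server averages.

    augment_at_client=True mimics the Fed-ZDAC placement (augment before
    local training); False leaves augmentation to the server side.
    """
    rng = np.random.default_rng(seed)
    updates, sizes = [], []
    for data in client_datasets:
        if augment_at_client:
            data = zero_shot_augment(data, rng)
        updates.append(local_update(global_weights, data))
        sizes.append(len(data))
    # Standard FedAvg: dataset-size-weighted average of client models.
    return np.average(np.stack(updates), axis=0,
                      weights=np.array(sizes, dtype=float))
```

This is only a structural sketch under the stated assumptions; the paper's contribution lies in how the augmented samples are generated without access to other clients' data, which the abstract leaves unspecified.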