What is it about?

Federated learning (FL) is a promising machine learning paradigm that allows many clients to jointly train a model without sharing their raw data. Because standard FL is designed from the server's perspective, unfairness can arise throughout the learning process, including the global model optimization phase. Some existing works have attempted to address this issue by guaranteeing that the global model achieves similar accuracy across different classes (i.e., labels), but they fail to consider the implicit classes (different representations of one label) beneath them, where the fairness issue persists. In this paper, we focus on the fairness issue in the global model optimization phase and bridge this research gap by introducing the Implicit Class Balancing (ICB) Federated Learning framework with a Single Class Training Scheme (SCTS). In ICB FL, the server first broadcasts the current global model and assigns a particular class (label) to each client. Each client then locally trains the model only on the data of its assigned class (SCTS) and sends the resulting gradient back to the server. The server subsequently performs unsupervised learning to identify the implicit classes and generates a balanced weight for each client. Finally, the server averages the received gradients with these weights and updates the global model. We evaluate ICB FL on three datasets, and the experimental results show that it can effectively enhance fairness across both explicit and implicit classes.
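The server-side steps described above (cluster client gradients to find implicit classes, then weight clients so each implicit class contributes equally) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tiny k-means routine, the inverse-cluster-size weighting rule, and all function names (`kmeans`, `balanced_weights`, `aggregate`) are assumptions made for the example.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Hypothetical stand-in for the paper's unsupervised step: group
    # per-client gradient vectors into k implicit classes.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def balanced_weights(labels):
    # Assumed balancing rule: give every implicit class equal total weight,
    # split evenly among the clients assigned to it, so a rare
    # representation is not drowned out by a common one.
    k = len(np.unique(labels))
    counts = np.bincount(labels, minlength=labels.max() + 1)
    w = 1.0 / (k * counts[labels])
    return w / w.sum()          # normalize so weights sum to 1

def aggregate(client_grads, k):
    # Weighted averaging of client gradients, replacing plain FedAvg.
    labels = kmeans(client_grads, k)
    w = balanced_weights(labels)
    return w @ client_grads, w
```

For example, with four clients whose gradients cluster together and one outlier client representing a rarer implicit class, the outlier's weight (0.5) exceeds each common client's weight (0.125), because each of the two discovered clusters receives half of the total weight.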



This page is a summary of: ICB FL: Implicit Class Balancing Towards Fairness in Federated Learning, January 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3579375.3579392.
