Combating Data Imbalances in Federated Semi-supervised Learning with Dual Regulators

Abstract

Federated learning has become a popular method for learning from decentralized heterogeneous data. Federated semi-supervised learning (FSSL) has emerged to train models from a small fraction of labeled data, owing to label scarcity on decentralized clients. Existing FSSL methods assume independent and identically distributed (IID) labeled data across clients and a consistent class distribution between labeled and unlabeled data within a client. This work studies a more practical and challenging scenario of FSSL, where the data distribution differs not only across clients but also within a client, between its labeled and unlabeled data. To address this challenge, we propose a novel FSSL framework with dual regulators, FedDure. FedDure lifts the previous assumption with a coarse-grained regulator (C-reg) and a fine-grained regulator (F-reg): C-reg regularizes the updating of the local model by tracking the learning effect on the labeled data distribution; F-reg learns an adaptive weighting scheme tailored for unlabeled instances in each client. We further formulate client model training as bi-level optimization that adaptively optimizes the model in the client with the two regulators. Theoretically, we show the convergence guarantee of the dual regulators. Empirically, we demonstrate that FedDure is superior to existing methods across a wide range of settings, notably by more than 11% on the CIFAR-10 and CINIC-10 datasets.
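The bi-level idea behind the fine-grained regulator can be illustrated with a minimal, self-contained sketch. This is not the paper's actual algorithm: per-instance weights on pseudo-labeled data stand in for F-reg, and a one-step finite-difference meta-gradient of the labeled loss stands in for the bi-level optimization. All names, hyperparameters, and the synthetic data below are invented for the example.

```python
import numpy as np

# Illustrative sketch only: per-instance weights (an F-reg analogue) on
# pseudo-labeled data are tuned so that a one-step model update reduces the
# loss on the client's small labeled set.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def labeled_loss(w, X, y):
    p = sigmoid(X @ w)
    return float(np.mean(-(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))))

# Tiny synthetic client: few labeled points, more unlabeled points.
X_lab = rng.normal(size=(20, 3))
y_lab = (X_lab[:, 0] > 0).astype(float)
X_unl = rng.normal(size=(60, 3))

w = np.zeros(3)                       # client model (logistic regression)
alpha = np.full(len(X_unl), 0.5)      # per-instance weights (F-reg analogue)
lr, meta_lr, eps = 0.5, 0.1, 1e-3

for _ in range(50):
    # Pseudo-labels from the current model.
    pseudo = (sigmoid(X_unl @ w) > 0.5).astype(float)

    def one_step(a):
        """Inner update: labeled gradient plus alpha-weighted pseudo-label gradient."""
        g_lab = X_lab.T @ (sigmoid(X_lab @ w) - y_lab) / len(y_lab)
        g_unl = X_unl.T @ (a * (sigmoid(X_unl @ w) - pseudo)) / (a.sum() + 1e-9)
        return w - lr * (g_lab + g_unl)

    # Outer update: finite-difference meta-gradient of the labeled loss with
    # respect to each instance weight (a crude stand-in for bi-level optimization).
    base = labeled_loss(one_step(alpha), X_lab, y_lab)
    grad = np.empty_like(alpha)
    for i in range(len(alpha)):
        bumped = alpha.copy()
        bumped[i] += eps
        grad[i] = (labeled_loss(one_step(bumped), X_lab, y_lab) - base) / eps
    alpha = np.clip(alpha - meta_lr * grad, 0.0, 1.0)

    # Commit the inner update with the refreshed weights.
    w = one_step(alpha)

print(labeled_loss(w, X_lab, y_lab))
```

The outer loop downweights unlabeled instances whose pseudo-labels hurt the labeled objective, which is the weighting behavior F-reg learns; FedDure additionally applies C-reg and runs this inside a federated training round, which the sketch omits.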

Authors

  • Sikai Bai*
  • Shuaicheng Li*
  • Weiming Zhuang
  • Jie Zhang*
  • Kunlin Yang*
  • Jun Hou*
  • Shuai Yi*
  • Shuai Zhang*
  • Junyu Gao*

*External Authors

Venue

AAAI 2024

Date

2024
