FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation

Abstract

The statistical heterogeneity of non-independent and identically distributed (non-IID) data on local clients significantly limits the performance of federated learning. Previous attempts such as FedProx, SCAFFOLD, MOON, FedNova, and FedDyn take an optimization perspective, adding an auxiliary term or re-weighting local updates to calibrate the learning bias or the objective inconsistency. However, beyond these improvements to federated averaging, our analysis shows that another critical bottleneck is the poorer optima reached by client models under more heterogeneous conditions. We therefore introduce a data-driven approach called FedSkip, which improves the client optima by periodically skipping federated averaging and scattering local models across devices. We provide a theoretical analysis of the potential benefit of FedSkip and conduct extensive experiments on a range of datasets to demonstrate that FedSkip achieves much higher accuracy, better aggregation efficiency, and competitive communication efficiency.
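The core idea of the abstract — alternating between standard federated averaging and "skip" rounds in which local models are scattered across clients instead of being aggregated — can be illustrated with a minimal Python sketch. The helpers `local_update` and `average`, the toy one-parameter models, and the skip schedule below are all hypothetical simplifications for illustration, not the paper's actual implementation.

```python
import random

def local_update(model, data=None):
    # Hypothetical stand-in for local training: nudge each weight.
    return [w + 0.1 for w in model]

def average(models):
    # Federated averaging: element-wise mean across client models.
    return [sum(ws) / len(ws) for ws in zip(*models)]

def fedskip_round(client_models, skip):
    if skip:
        # Skip aggregation: scatter (here, shuffle) local models
        # across clients so each keeps training on new data.
        shuffled = client_models[:]
        random.shuffle(shuffled)
        return shuffled
    # Normal round: aggregate and broadcast the averaged model.
    avg = average(client_models)
    return [list(avg) for _ in client_models]

# Toy run: 3 clients with one-parameter models, skipping every other round.
clients = [[0.0], [1.0], [2.0]]
for rnd in range(4):
    clients = [local_update(m) for m in clients]
    clients = fedskip_round(clients, skip=(rnd % 2 == 1))
```

After the first aggregation the clients are synchronized, so in this toy run every model ends at the same value; in practice the skip rounds matter because clients hold different non-IID data, letting each model visit more of the overall distribution before the next averaging step.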

Authors

  • Ziqing Fan*
  • Yanfeng Wang*
  • Jiangchao Yao*
  • Lingjuan Lyu
  • Ya Zhang*
  • Qi Tian*

*External Authors

Venue

ICDM 2022

Date

2022
