HADFL: Heterogeneity-aware Decentralized Federated Learning Framework
Event Type
Research Manuscript
Virtual Programs
Hosted in Virtual Platform
Keywords
AI/ML System Design
Topics
Design
Description
Federated learning (FL) supports training models on geographically distributed devices. However, traditional FL systems adopt a centralized synchronous strategy, which imposes high communication pressure and poses model generalization challenges. Existing optimizations either perform poorly on heterogeneous devices or suffer from communication bottlenecks. In this paper, we propose HADFL, a framework that supports decentralized training on heterogeneous devices. Devices train the model with heterogeneity-aware local steps, and in each aggregation cycle devices are selected based on probability to perform model aggregation. HADFL relieves communication pressure and achieves a maximum speedup of 3.15x over decentralized-FedAvg, with almost no loss of convergence accuracy.
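The two mechanisms named in the abstract, heterogeneity-aware local steps and probabilistic selection of aggregation devices, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the speed-proportional step count and the speed-weighted selection probability are assumptions made for the example, and the `device_speed` values are invented.

```python
import random

def local_steps(device_speed, base_steps=10, max_speed=1.0):
    # Heterogeneity-aware local steps: a faster device runs more
    # local training steps per aggregation cycle. The linear
    # scaling here is an illustrative assumption.
    return max(1, round(base_steps * device_speed / max_speed))

def pick_aggregator(devices):
    # Probability-based selection of the device that performs
    # model aggregation this cycle; weighting by compute speed
    # is an assumption for the sketch.
    weights = [d["speed"] for d in devices]
    return random.choices(devices, weights=weights, k=1)[0]

# Hypothetical heterogeneous device pool (relative compute speeds).
devices = [
    {"id": 0, "speed": 1.0},
    {"id": 1, "speed": 0.5},
    {"id": 2, "speed": 0.25},
]

steps = {d["id"]: local_steps(d["speed"]) for d in devices}
aggregator = pick_aggregator(devices)
```

Under this sketch, slow devices neither stall the cycle (they simply take fewer local steps) nor funnel all traffic through one central server, which is the intuition behind the claimed relief of communication pressure.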