Presentation

DeltaNet: High-Performance Federated Learning with Hybrid Data & Model Parallelism
Time: Tuesday, December 7th, 6:00pm - 7:00pm PST
Location: Level 2 - Lobby
Event Type: Networking Reception, Work-in-Progress Poster
Virtual Programs
Presented In-Person
Description: Federated Learning (FL) has attracted great attention for its collaborative training performance. However, current FL adopts only data parallelism, so training performance is bound by each local edge node's limited computation capacity. To enable full model and data parallelism for collaborative CNN training on the edge, we propose DeltaNet, an FL framework with hybrid data and model parallelism. The core mechanism of DeltaNet is to decouple a target CNN model into independent sub-models, which not only enables efficient model parallelism across different local nodes but also greatly reduces communication cost. Extensive experiments demonstrate our framework's effectiveness and high performance in terms of computation, communication, and accuracy.
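
The sketch below is not the authors' implementation; it is a minimal illustration of the general idea described in the abstract, assuming PyTorch and a channel-wise split of a toy CNN into independent sub-models. Names such as SubCNN, split_into_submodels, local_update, and fedavg are hypothetical.

```python
# Hypothetical sketch (not DeltaNet itself): a toy CNN is decoupled into
# independent channel-wise sub-models, so each federated node trains and
# communicates only its own slice of the model.
import copy
import torch
import torch.nn as nn

class SubCNN(nn.Module):
    """One independent sub-model: a narrow slice of the full CNN."""
    def __init__(self, in_ch=1, hidden_ch=8, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, hidden_ch, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(hidden_ch, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def split_into_submodels(num_submodels, total_hidden=32, num_classes=10):
    """Decouple a target CNN of width total_hidden into independent slices."""
    width = total_hidden // num_submodels
    return [SubCNN(hidden_ch=width, num_classes=num_classes) for _ in range(num_submodels)]

def local_update(model, data, target, lr=0.01, steps=1):
    """One node trains only its assigned sub-model, so only that slice's
    parameters need to be sent back (reducing communication cost)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(data), target).backward()
        opt.step()
    return model.state_dict()

def fedavg(state_dicts):
    """FedAvg-style aggregation over nodes that trained the same sub-model."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(0)
    return avg

if __name__ == "__main__":
    torch.manual_seed(0)
    submodels = split_into_submodels(num_submodels=4)
    # Two nodes share sub-model 0 (data parallelism) while the other sub-models
    # could train concurrently on other nodes (model parallelism).
    x0, y0 = torch.randn(16, 1, 28, 28), torch.randint(0, 10, (16,))
    x1, y1 = torch.randn(16, 1, 28, 28), torch.randint(0, 10, (16,))
    sd_a = local_update(copy.deepcopy(submodels[0]), x0, y0)
    sd_b = local_update(copy.deepcopy(submodels[0]), x1, y1)
    submodels[0].load_state_dict(fedavg([sd_a, sd_b]))
    print("sub-model 0 parameter count:", sum(p.numel() for p in submodels[0].parameters()))
```

In this toy setup, each node only exchanges the parameters of its assigned narrow sub-model rather than the full CNN, which is the intuition behind the communication savings claimed above; how DeltaNet actually decouples the model and aggregates sub-models is described in the poster itself.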