SPATL: Salient Parameter Aggregation and Transfer Learning for Heterogeneous Federated Learning
Description
Federated learning (FL) facilitates training and deploying AI models on edge devices. Preserving user data privacy in FL introduces several challenges, including expensive communication costs, limited resources, and data heterogeneity. In this paper, we propose SPATL, an FL method that addresses these issues by: (a) introducing a salient parameter selection agent and communicating only the selected parameters; (b) splitting a model into a shared encoder and a local predictor, and transferring the encoder's knowledge to heterogeneous clients via locally customized predictors. Additionally, we leverage a gradient control mechanism to further speed up model convergence and increase the robustness of training. Experiments demonstrate that SPATL reduces communication overhead, accelerates model inference, and enables stable training with better results than state-of-the-art methods. Our approach reduces communication cost by up to 86.45%, accelerates local inference by reducing up to 39.7% of FLOPs on VGG-11, and requires 7.4× less communication overhead when training ResNet-20.
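The abstract's two core ideas can be illustrated with a minimal sketch. This is not the authors' implementation: the magnitude-based top-k selection below is a simple stand-in for SPATL's learned selection agent, and the FedAvg-style mean is a stand-in for the server-side aggregation; all names (`select_salient`, `aggregate`, the toy clients) are hypothetical.

```python
# Sketch of two SPATL ideas (assumed, simplified):
# (a) each client uploads only its top-k "salient" shared-encoder parameters;
# (b) each client keeps a local predictor that is never communicated.

def select_salient(params, k):
    """Pick the k parameters with largest magnitude (a simple stand-in
    for SPATL's reinforcement-learning-based selection agent)."""
    ranked = sorted(params.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return dict(ranked[:k])

def aggregate(updates):
    """FedAvg-style mean over whichever parameters each client uploaded."""
    sums, counts = {}, {}
    for update in updates:
        for name, value in update.items():
            sums[name] = sums.get(name, 0.0) + value
            counts[name] = counts.get(name, 0) + 1
    return {name: sums[name] / counts[name] for name in sums}

# Two heterogeneous clients: encoders are shared, predictors stay local.
client_a = {"encoder": {"w1": 0.9, "w2": -0.1, "w3": 0.5}, "predictor": {"p": 1.0}}
client_b = {"encoder": {"w1": 0.7, "w2": 0.3, "w3": -0.8}, "predictor": {"p": -2.0}}

# Each client uploads only its 2 most salient encoder parameters,
# cutting per-round communication; predictors never leave the clients.
uploads = [select_salient(c["encoder"], k=2) for c in (client_a, client_b)]
global_encoder = aggregate(uploads)
print(global_encoder)  # → {'w1': 0.8, 'w3': -0.15}
```

The global encoder is then broadcast back, while each client fine-tunes its own predictor on local data, which is how knowledge transfers to heterogeneous clients.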
Event Type
Paper
Time
Wednesday, 16 November 2022, 2:30pm - 3pm CST
Location
C146
Session Formats
Recorded
Tags
Machine Learning and Artificial Intelligence
Software Engineering
State of the Practice
Registration Categories
TP