Li, Y., Wang, X., Li, H., Donta, P. K., Huang, M., & Dustdar, S. (2025). Communication-Efficient Federated Learning for Heterogeneous Clients. ACM Transactions on Internet Technology, 25(2), 1–37. https://doi.org/10.1145/3716870
CCS Concepts: Computing methodologies → Learning paradigms; Computer systems organization → Cloud computing
Abstract:
Federated learning stands out as a promising approach within the domain of edge computing, providing a framework for collaborative training on distributed datasets without requiring data sharing. However, federated learning involves frequent transmission of machine learning model updates between the server and clients, resulting in high communication costs. Heterogeneous clients further complicate the federated learning process and degrade performance. To address these challenges, we propose Adaptive Self-Knowledge Distillation-based Quality- and Reputation-Aware Cross-Device Federated Learning (ASDQR), a communication- and inference-efficient framework designed for heterogeneous clients. ASDQR begins by selecting high-reputation, high-quality clients to participate in federated learning, which improves both communication efficiency and inference effectiveness. ASDQR also introduces an adaptive local self-knowledge distillation model that incorporates multiple levels of local personalized historical knowledge for more accurate inference, allowing the historical level to be dynamically adjusted over time. Finally, we present an inference-effective aggregation scheme that assigns higher weights to important and reliable local model updates, based on clients' contribution degrees, when performing global model aggregation. ASDQR consistently outperforms baseline methods across all datasets and communication rounds: on the MNIST dataset at 100 communication rounds, it achieves 9.0% higher accuracy than FedAvg, 6.59% higher than MOON, 0.29% higher than FedProx, 0.2% higher than PFedSD, and 0.08% higher than FedMD. Similar improvements are observed on the CIFAR, HAR, and WISDM datasets, demonstrating the robustness and efficiency of ASDQR in federated learning with non-IID data.
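The abstract names three mechanisms: reputation/quality-based client selection, adaptive self-knowledge distillation against historical local models, and contribution-weighted global aggregation. The sketch below is a minimal, hypothetical illustration of how such components could fit together; the function names (`select_clients`, `self_distillation_loss`, `aggregate_weighted`), the per-teacher weights `alphas`, and the temperature `T` are illustrative assumptions, not the paper's actual formulation, which is defined only in the full text.

```python
import torch
import torch.nn.functional as F

def select_clients(scores, k):
    """Pick the k clients with the highest combined reputation/quality score.

    `scores` is a hypothetical dict {client_id: float}; the paper's actual
    reputation and quality metrics are specified in the full text.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:k]

def self_distillation_loss(logits, labels, teacher_logits_history, alphas, T=2.0):
    """Cross-entropy plus KL terms against several historical local models.

    `teacher_logits_history` holds logits produced by past rounds'
    personalized models on the same batch; `alphas` are per-teacher weights
    that could be adapted over time (one reading of the abstract's
    "dynamically adjusted historical level").
    """
    loss = F.cross_entropy(logits, labels)
    log_p = F.log_softmax(logits / T, dim=1)
    for a, t_logits in zip(alphas, teacher_logits_history):
        q = F.softmax(t_logits.detach() / T, dim=1)
        # Standard temperature-scaled distillation term.
        loss = loss + a * (T * T) * F.kl_div(log_p, q, reduction="batchmean")
    return loss

def aggregate_weighted(client_states, contributions):
    """Weighted average of client state_dicts, weights ∝ contribution degree."""
    total = sum(contributions)
    weights = [c / total for c in contributions]
    global_state = {}
    for key in client_states[0]:
        global_state[key] = sum(
            w * s[key].float() for w, s in zip(weights, client_states)
        )
    return global_state
```

With uniform contributions this aggregation reduces to FedAvg's sample-weighted mean; the scheme described in the abstract differs in that the weights reflect each client's contribution degree rather than dataset size alone.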
Project (external):
National Natural Science Foundation of China