Ahmad, S., & Aral, A. (2023). Hierarchical Federated Transfer Learning: A Multi-Cluster Approach on the Computing Continuum. In Proceedings of the 22nd International Conference on Machine Learning and Applications (ICMLA 2023), Jacksonville, Florida, USA.
Federated Learning (FL) trains models over a set of geographically distributed users. We address the problem in which a single global model is not sufficient to meet the needs of geographically distributed, heterogeneous clients. This setup captures settings where different groups of users pursue their own objectives; however, clients can be grouped by geographical location or task similarity and, through inter-cluster knowledge sharing, leverage strength in numbers and better generalization to perform FL more efficiently. We introduce a Hierarchical Multi-Cluster Computing Continuum for Federated Learning Personalization (HC3FL) that clusters similar clients and trains one edge model per cluster. HC3FL incorporates federated transfer learning to enhance the performance of edge models by leveraging a global model that captures collective knowledge from all edge models. Furthermore, we introduce dynamic clustering based on task similarity to handle client drift and to dynamically recluster mobile (non-stationary) clients. We evaluate the HC3FL approach through extensive experiments on real-world datasets. The results demonstrate that our approach effectively improves the performance of edge models compared to traditional FL approaches.
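The abstract describes the architecture at a high level only; the sketch below is a minimal illustration of hierarchical, cluster-based federated averaging with a transfer step, not the authors' HC3FL implementation. The function names (fedavg, cluster_by_similarity, hierarchical_round), the k-means clustering on flat parameter vectors as a proxy for task similarity, and the blending weight alpha are all assumptions made for illustration.

import numpy as np

def fedavg(weight_list, sizes):
    # Weighted average of flat parameter vectors (standard FedAvg).
    return np.average(np.stack(weight_list), axis=0,
                      weights=np.asarray(sizes, dtype=float))

def cluster_by_similarity(client_weights, num_clusters, seed=0):
    # Group clients whose parameter vectors are close, as a crude proxy for
    # task similarity; re-running this periodically stands in for the paper's
    # dynamic reclustering of drifting or mobile clients.
    rng = np.random.default_rng(seed)
    data = np.stack(client_weights)
    centers = data[rng.choice(len(data), num_clusters, replace=False)]
    for _ in range(20):  # plain k-means iterations
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(num_clusters):
            if np.any(labels == k):
                centers[k] = data[labels == k].mean(axis=0)
    return labels

def hierarchical_round(client_weights, client_sizes, labels, alpha=0.3):
    # One round: per-cluster FedAvg yields edge models; averaging the edge
    # models yields a global model; each edge model then absorbs a fraction
    # alpha of the global model as a simple stand-in for transfer learning.
    edge_models, edge_sizes = {}, {}
    for k in np.unique(labels):
        idx = np.where(labels == k)[0]
        edge_models[k] = fedavg([client_weights[i] for i in idx],
                                [client_sizes[i] for i in idx])
        edge_sizes[k] = sum(client_sizes[i] for i in idx)
    global_model = fedavg(list(edge_models.values()), list(edge_sizes.values()))
    edge_models = {k: (1 - alpha) * w + alpha * global_model
                   for k, w in edge_models.items()}
    return edge_models, global_model

# Toy usage with random stand-ins for locally trained client weights.
rng = np.random.default_rng(1)
weights = [rng.normal(loc=g, size=8) for g in (0, 0, 3, 3, 6)]
sizes = [100, 120, 80, 90, 110]
labels = cluster_by_similarity(weights, num_clusters=3)
edges, global_model = hierarchical_round(weights, sizes, labels)

In an actual deployment, the flat vectors would be model parameter updates, the similarity measure and clustering schedule would follow the system's own criteria, and the cluster-level and global aggregations would run at the edge and cloud tiers of the computing continuum, respectively.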
Index Terms—federated transfer learning, hierarchical collaborative learning, dynamic clustering.
Research Areas: Computer Science Foundations (90%), Modeling and Simulation (10%)