Intelligent Model Aggregation in Hierarchical Clustered Federated Multitask Learning

Moqbel Hamood*, Abdullatif Albaseer*, Mohamed Abdallah*, Ala Al-Fuqaha*, Amr Mohamed

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Clustered federated multitask learning (CFL) is an effective and efficient approach for addressing statistical challenges such as non-independent and identically distributed (non-IID) data among workers. Workers in CFL are clustered into groups based on the similarity (e.g., cosine similarity) of their data distributions, and each cluster is equipped with an efficient specialized model. However, this approach can be costly and time-consuming when implemented in hierarchical wireless networks (HWNs), since several models must be uploaded every round for the cloud server to capture the incongruent data distributions across different edge networks. This motivates the need for novel solutions to address these challenges. To this end, this paper introduces a framework with two cloud-based model aggregation approaches, round-based and split-based, that minimize latency and resource consumption while attaining satisfactory personalized accuracy. In the round-based scheme, the cloud aggregates the models from the edge servers after a predetermined number of rounds. In the split-based scheme, the cloud collects the models only when edge servers perform a cluster split. Extensive experiments are conducted to evaluate and compare the proposed heuristics against approaches presented in the recent literature. The numerical results demonstrate that the proposed heuristics significantly conserve resources, reducing energy consumption by 60% and saving time, while accelerating the convergence rate for cluster workers across various edge networks.
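
As a rough illustration only (not the authors' implementation), the two cloud-aggregation triggers described in the abstract can be sketched as below. The cosine-similarity split check, the aggregation period K, and all function and parameter names are assumptions made for the sketch.

```python
# Hypothetical sketch of the round-based and split-based cloud-aggregation
# triggers in hierarchical clustered federated learning. All names and the
# split criterion are illustrative assumptions, not the paper's algorithm.

import numpy as np


def cosine_similarity(u, v):
    """Cosine similarity between two flattened model updates."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))


def edge_should_split(worker_updates, threshold=0.0):
    """Edge-side check (assumed): split a cluster when some pair of worker
    updates points in sufficiently dissimilar directions, i.e. the minimum
    pairwise cosine similarity falls below a threshold."""
    n = len(worker_updates)
    min_sim = min(
        cosine_similarity(worker_updates[i], worker_updates[j])
        for i in range(n) for j in range(i + 1, n)
    )
    return min_sim < threshold


def cloud_should_aggregate(scheme, edge_round, K=5, split_occurred=False):
    """Decide whether edge models are uploaded to the cloud this round."""
    if scheme == "round-based":
        # Upload after a predetermined number of edge rounds.
        return edge_round % K == 0
    if scheme == "split-based":
        # Upload only when some edge server performed a cluster split.
        return split_occurred
    raise ValueError(f"unknown scheme: {scheme}")
```

In this reading, the round-based trigger trades staleness for a predictable upload schedule, while the split-based trigger uploads only when the cluster structure changes, which is what the abstract credits with saving energy and time.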

Original language: English
Title of host publication: GLOBECOM 2023 - 2023 IEEE Global Communications Conference
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3009-3014
Number of pages: 6
ISBN (Electronic): 9798350310900
DOIs
Publication status: Published - 2023
Event: 2023 IEEE Global Communications Conference, GLOBECOM 2023 - Kuala Lumpur, Malaysia
Duration: 4 Dec 2023 - 8 Dec 2023

Publication series

Name: Proceedings - IEEE Global Communications Conference, GLOBECOM
ISSN (Print): 2334-0983
ISSN (Electronic): 2576-6813

Conference

Conference: 2023 IEEE Global Communications Conference, GLOBECOM 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 4/12/23 - 8/12/23

Keywords

  • CFL
  • Federated learning
  • Hierarchical networks
  • Model aggregation
  • Resource allocation
  • Client scheduling
