
Optimization of Decentralized Federated Multi-Task Learning: Automated Task Grouping and Performance Enhancement

MA, MP
State: Open
Published: 2025-01-31

Decentralized Federated Multi-Task Learning (DFMTL) extends federated learning by allowing multiple tasks to be learned simultaneously across distributed nodes without a central server. However, optimizing DFMTL remains a challenge due to data heterogeneity and varying task similarities. This thesis focuses on improving the efficiency and effectiveness of DFMTL by (1) developing an automated task grouping mechanism based on data representation and task similarity metrics to dynamically cluster related tasks, (2) optimizing the DFMTL framework with novel aggregation and learning strategies to enhance performance, and (3) evaluating the proposed methods in terms of learning efficiency, accuracy, and robustness in real-world decentralized scenarios. The research aims to advance the adaptability and scalability of DFMTL in applications such as personalized recommendation systems, distributed healthcare analytics, and autonomous systems.
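To make the two core ideas above concrete, the minimal sketch below illustrates (1) grouping nodes by the cosine similarity of their local model updates and (2) serverless parameter averaging within each group. It is an illustrative assumption only, not the method to be developed in the thesis: the similarity measure, the threshold-based grouping heuristic, and all function names (flatten_update, group_tasks, aggregate_group) are hypothetical placeholders.

```python
# Illustrative sketch (assumptions, not the thesis design): task grouping via
# cosine similarity of model updates, then per-group decentralized averaging.
import torch
import torch.nn.functional as F


def flatten_update(model: torch.nn.Module) -> torch.Tensor:
    """Concatenate all parameters of a node's local model into one vector."""
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])


def similarity_matrix(updates: list[torch.Tensor]) -> torch.Tensor:
    """Pairwise cosine similarity between per-node update vectors."""
    stacked = torch.stack(updates)        # (num_nodes, dim)
    normed = F.normalize(stacked, dim=1)
    return normed @ normed.T              # (num_nodes, num_nodes)


def group_tasks(sim: torch.Tensor, threshold: float = 0.5) -> list[list[int]]:
    """Greedy grouping: connected components of the thresholded similarity graph."""
    n = sim.shape[0]
    adj = sim >= threshold
    visited, groups = set(), []
    for start in range(n):
        if start in visited:
            continue
        stack, component = [start], []
        while stack:
            node = stack.pop()
            if node in visited:
                continue
            visited.add(node)
            component.append(node)
            stack.extend(j for j in range(n) if adj[node, j] and j not in visited)
        groups.append(sorted(component))
    return groups


def aggregate_group(models: list[torch.nn.Module]) -> None:
    """Serverless aggregation inside one group: in-place parameter averaging,
    standing in for the aggregation strategies the thesis would study."""
    with torch.no_grad():
        for params in zip(*(m.parameters() for m in models)):
            mean = torch.stack(list(params)).mean(dim=0)
            for p in params:
                p.copy_(mean)


# Toy usage: four nodes with identical architectures but different local weights.
nodes = [torch.nn.Linear(8, 2) for _ in range(4)]
sims = similarity_matrix([flatten_update(m) for m in nodes])
for group in group_tasks(sims, threshold=0.5):
    aggregate_group([nodes[i] for i in group])
```

In a full DFMTL setting, the grouping would be recomputed dynamically as local data and tasks evolve, and the plain averaging step would be replaced by the aggregation and learning strategies proposed in the thesis.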

[1] Feng, C., Kohler, N.F., Celdran, A.H., Bovet, G. and Stiller, B., 2025. ColNet: Collaborative Optimization in Decentralized Federated Multi-task Learning Systems. arXiv preprint arXiv:2501.10347.


40% Design, 40% Implementation, 20% Documentation
Python [PyTorch]

Supervisor: Chao Feng
