
ANDREA AUGELLO

DCFL: Dynamic Clustered Federated Learning under Differential Privacy Settings

  • Authors: Augello A.; Falzone G.; Lo Re G.
  • Year of publication: 2023
  • Type: Conference proceedings paper published in volume
  • OA Link: http://hdl.handle.net/10447/661473

Abstract

Federated Learning (FL) allows training machine learning models on a dataset distributed amongst multiple clients without disclosing sensitive data. Each FL client, however, might have a different data distribution, with a detrimental effect on the performance of the trained model. In this paper, we present a dynamic clustering algorithm (DCFL) that allows the server to cluster FL clients based on their model updates, letting the server adapt to changes in the data distribution and supporting the addition of new clients. Moreover, we propose a novel distance metric to estimate the distance between model updates by different clients. We evaluate our approach in a wide range of experimental settings, comparing it against the standard FedAvg algorithm and divisive clustering on the EMNIST dataset. Our approach outperforms the baselines, yielding higher accuracy and lower variance for the participating clients.
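The abstract describes clustering FL clients by the distance between their model updates, with support for adding new clients over time. As a minimal illustrative sketch (not the paper's actual DCFL algorithm or its proposed distance metric), the idea can be shown with a greedy grouping of flattened update vectors by cosine distance; the `threshold` parameter and the single-pass assignment are assumptions for illustration:

```python
import math

def cosine_distance(u, v):
    # 1 - cosine similarity between two flattened model-update vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (nu * nv)

def cluster_updates(updates, threshold=0.5):
    # Greedy clustering: assign each client update to the first cluster
    # whose representative update is within `threshold`, otherwise start
    # a new cluster. A newly joining client can be assigned the same
    # way in a later round, which is what makes the clustering dynamic.
    clusters = []  # list of (representative_update, [client_ids])
    for cid, upd in enumerate(updates):
        for rep, members in clusters:
            if cosine_distance(upd, rep) < threshold:
                members.append(cid)
                break
        else:
            clusters.append((upd, [cid]))
    return [members for _, members in clusters]

# Two clients with similar update directions, one pointing elsewhere
updates = [[1.0, 0.9, 0.1], [0.9, 1.0, 0.0], [-1.0, 0.1, 0.9]]
print(cluster_updates(updates))  # -> [[0, 1], [2]]
```

In a full FL pipeline, aggregation (e.g. FedAvg) would then run separately within each cluster rather than over all clients at once.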