FedSoup: Improving Generalization and Personalization in Federated Learning via Selective Model Interpolation

By Minghui Chen in MICCAI

July 20, 2023

Authors: Minghui Chen, Meirui Jiang, Qi Dou, Zehua Wang, Xiaoxiao Li

Published in: The 26th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2023)

Abstract

Cross-silo federated learning (FL) enables machine learning models to be developed on datasets distributed across data centers such as hospitals and clinical research laboratories. However, recent research has found that current FL algorithms face a trade-off between local and global performance when confronted with distribution shifts. Specifically, personalized FL methods tend to overfit to local data, leading to a sharp valley in the loss landscape of the local model and inhibiting its ability to generalize to out-of-distribution data. In this paper, we propose a novel federated model soup method (i.e., selective interpolation of model parameters) to optimize the trade-off between local and global performance. During the federated training phase, each client maintains its own pool of global models by monitoring the performance of the model interpolated between the local and global models. This allows us to alleviate overfitting and seek flat minima, which can significantly improve the model’s generalization performance.
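The client-side selection described above can be sketched roughly as follows. This is a minimal, hypothetical illustration of the "selective interpolation" idea only: the function names, the greedy keep-if-not-worse rule, and the use of a plain dict of parameters are assumptions for clarity, not the authors' actual implementation.

```python
# Illustrative sketch of selective model interpolation (not the authors' code).
# Parameters are represented as plain dicts of name -> float for simplicity;
# in practice these would be tensors (e.g., a PyTorch state_dict).

def interpolate(local_params, global_params, alpha):
    """Element-wise interpolation: alpha * local + (1 - alpha) * global."""
    return {k: alpha * local_params[k] + (1 - alpha) * global_params[k]
            for k in local_params}

def select_soup(local_params, global_pool, evaluate, alpha=0.5):
    """Greedily keep an interpolated model only if it does not hurt the
    client's local validation performance (hypothetical selection rule)."""
    best = dict(local_params)
    best_score = evaluate(best)
    for global_params in global_pool:
        candidate = interpolate(best, global_params, alpha)
        score = evaluate(candidate)
        if score >= best_score:  # keep only if performance is maintained
            best, best_score = candidate, score
    return best
```

For example, with a toy one-parameter model and an `evaluate` function that rewards closeness to a target value, the routine accepts an interpolation that moves the model toward the target and rejects one that moves it away.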

Categories:
MICCAI
Tags:
Federated Learning, Machine Learning, Medical Imaging, Model Interpolation, Personalization
See Also:
Can Textual Gradient Work in Federated Learning?
Local Superior Soups: A Catalyst for Model Merging in Cross-Silo Federated Learning