Can Textual Gradient Work in Federated Learning?
We systematically explore the potential and challenges of incorporating textual gradients into Federated Learning, introducing FedTextGrad, a novel FL paradigm for optimizing LLMs.
A model interpolation-based technique that enhances local training across clients through regularized interpolation of model parameters, acting as a catalyst for the seamless adaptation of pre-trained models in federated learning.
A federated learning framework that enables certified data removal through linear approximation and efficient removal strategies, providing theoretical guarantees for the right to be forgotten.
Nanyang Technological University, working with Prof. Xiaoxiao Li and Prof. Han Yu on federated learning and multi-agent systems.
Vector Institute, working with Prof. Xiaoxiao Li on federated learning.
A novel federated model soup method that optimizes the trade-off between local and global performance through selective interpolation of model parameters, alleviating overfitting and seeking flat minima for improved generalization.
University of British Columbia, working with Prof. Xiaoxiao Li and Prof. Zehua Wang on federated learning.