For MOON (μ = 1), it outperforms FedAvg and FedProx by more than 2% in 200-round accuracy with 50 parties, and by 3% in 500-round accuracy with 100 parties. Moreover, for MOON (μ = 10), although the large model-contrastive loss slows down training at the beginning, as shown in Figure 8, MOON can far outperform the other approaches given more communication rounds. Compared with FedAvg and FedProx, ...
Today we introduce one of these papers, "Model-Contrastive Federated Learning". I. Motivation: A key challenge in federated learning is data heterogeneity across clients (non-IID). Although many methods (e.g., FedProx, SCAFFOLD) have been proposed to address this problem, their performance on image datasets is poor (see Table 1 in the experiments). Traditional contrastive learning is data-level; this paper instead improves the local model training phase of FedAvg...
Therefore, the authors propose MOON, model-contrastive federated learning, which handles the non-IID problem through model-level contrastive learning. 2.1 Federated Learning: Existing methods for the non-IID problem fall into two categories, improving local training and improving aggregation; this paper belongs to the former, since the contrastive loss is computed locally. Another research direction is personalized federated learning, which tries to learn a separate model for each client; this paper belongs to classic federated learning, which learns a single best global model. 2.2 Contrastive Learning...
We find that they fail to achieve high performance on image datasets with deep learning models. In this paper, we propose MOON: model-contrastive federated learning. MOON is a simple and effective federated learning framework. The key idea of MOON is to utilize the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning at the model level...
FedAvg is a classic algorithm for federated learning. The MOON algorithm is based on the idea of contrastive learning: the representation from the previous round's local model serves as the negative sample, and the representation from the global model serves as the positive sample, which improves the local training step of FedAvg. However, in the study of MOON...
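The positive/negative construction above can be written down directly as MOON's model-contrastive loss term, which in the paper compares representations with cosine similarity under a temperature τ. A minimal NumPy sketch (function and variable names are chosen here for illustration):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def model_contrastive_loss(z_local, z_glob, z_prev, tau=0.5):
    """MOON's model-contrastive term for a single input.

    z_local: representation from the local model being trained
    z_glob:  representation from the global model (positive sample)
    z_prev:  representation from the previous round's local model (negative sample)
    """
    pos = np.exp(cosine_sim(z_local, z_glob) / tau)
    neg = np.exp(cosine_sim(z_local, z_prev) / tau)
    # Pull the local representation toward the global one, and push it
    # away from the stale previous-round representation.
    return float(-np.log(pos / (pos + neg)))
```

The local objective then adds this term, weighted by μ, to the usual supervised cross-entropy loss; that weight is the μ whose effect (μ = 1 vs. μ = 10) is discussed in the experiments above.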
(1) improvement on local training (MOON belongs to this category); (2) improvement on aggregation. The authors study a new approach to handling non-IID image datasets with deep learning methods. P.S.: another research direction is personalized federated learning, which tries to learn personalized local models for each party...
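For contrast with category (1), the aggregation step that category (2) methods try to improve is, in plain FedAvg, just a data-size-weighted average of client parameters. A minimal sketch, assuming each client's parameters are flattened into a NumPy vector (names are mine):

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """FedAvg server step: average the clients' parameter vectors,
    weighted by each client's local dataset size (n_k / n)."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_params))
```

MOON keeps this server-side step unchanged and only modifies the clients' local loss, which is why it falls under category (1).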