The authors observe that a global model trained on all the data performs better than local models trained on partial subsets; even when the local models are aggregated with FedAvg, the global model still outperforms the local ones. They therefore propose MOON (model-contrastive federated learning), which handles the non-IID problem through contrastive learning at the model level.

2.1 Federated Learning

Existing approaches to the non-IID problem fall into two categories: improving local training and improving aggregation. This paper belongs to the former, because the contrastive loss ...
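For context, the FedAvg aggregation mentioned above can be sketched as a sample-weighted average of the parties' model weights. This is a minimal illustration, not the paper's code; the weights are represented as plain parameter dictionaries for simplicity.

```python
def fedavg(local_weights, sample_counts):
    """FedAvg aggregation sketch: w = sum_i (n_i / n) * w_i, where
    n_i is party i's sample count and n is the total sample count.

    local_weights: list of dicts mapping parameter name -> value
    sample_counts: list of per-party sample counts (same order)
    """
    total = sum(sample_counts)
    aggregated = {}
    for key in local_weights[0]:
        # Each parameter is the average of the parties' values,
        # weighted by each party's share of the data.
        aggregated[key] = sum(w[key] * (n / total)
                              for w, n in zip(local_weights, sample_counts))
    return aggregated
```

With equal sample counts this reduces to a plain average; a party holding more data pulls the global model proportionally toward its local solution.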
Since there is always drift in local training, and the global model learns a better representation than the local model, MOON aims to decrease the distance between the representations learned by the local model and the global model, and increase the distance between the...
For MOON (μ = 1), it outperforms FedAvg and FedProx by more than 2% in 200-round accuracy with 50 parties, and by 3% in 500-round accuracy with 100 parties. Moreover, for MOON (μ = 10), although the large model-contrastive loss slows down training at the beginning, as shown in Figure 8, MOON can far outperform the other approaches given more communication rounds. Compared with FedAvg and FedProx, MOON with 50 parties...
we find that they fail to achieve high performance in image datasets with deep learning models. In this paper, we propose MOON: model-contrastive federated learning. MOON is a simple and effective federated learning framework. The key idea of MOON is to utilize the similarity between model representations to correct the local training of individual parties, i.e., conducting contrastive learning at the model level. Our extensive experiments show that MOON ...