Communication-Efficient Algorithms for Distributed Computation and Machine Learning

The rise of big data has brought the design of distributed algorithms to the forefront. For example, in many practical settings,
J. Mota, "Communication-Efficient Algorithms for Distributed Optimization," Ph.D. thesis, Carnegie Mellon University, Pittsburgh, PA, and Technical University of Lisbon, Lisbon, Portugal, 2013.
J. F. Mota, "Communication-efficient algorithms for distributed optimization," arXiv preprint arXiv:1312.0263, 2013. ...
4. Differential privacy (DP): federated learning can be combined with global DP to provide an additional layer of privacy. Training only the ensemble weights via federated learning is well suited for DP, since the utility-privacy trade-off depends on the number of parameters being trained. Fur...
In the distributed learning literature on communication efficiency, most existing works on distributed machine learning fall into two categories: (1) how to design communication-efficient algorithms that reduce the number of communication rounds. For instance, [4] proposed DANE (Distributed Approximate NEwton...
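As an illustration of the first category (reducing the number of communication rounds), here is a minimal local-update sketch in the style of local SGD — not DANE itself, and all names are hypothetical — where each worker runs several local gradient steps before a single averaging round:

```python
import numpy as np

def local_sgd_round(w, shards, lr=0.1, local_steps=5):
    """One communication round: each worker refines the global model
    locally for several steps, then the results are averaged once."""
    updated = []
    for X, y in shards:                      # one (X, y) shard per worker
        w_local = w.copy()
        for _ in range(local_steps):         # local computation, no communication
            grad = X.T @ (X @ w_local - y) / len(y)   # least-squares gradient
            w_local -= lr * grad
        updated.append(w_local)
    return np.mean(updated, axis=0)          # single averaging step = one round

# toy problem: two workers whose data share the same true weights
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
shards = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    shards.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(20):                          # 20 communication rounds
    w = local_sgd_round(w, shards)           # 5 local steps each, so 100
                                             # gradient steps for 20 rounds
```

With 5 local steps per round, 20 rounds of communication buy the equivalent of 100 gradient steps — the core trade-off this line of work studies.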
Fast multi-thread FEC simulator & library of efficient digital communication algorithms for SDR. - aff3ct/aff3ct
Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks

Abstract: In this paper, we study distributed algorithms for large-scale AUC maximization with a deep neural network as a predictive model. Althou...
Our experiments on real-world datasets show that PV-Tree significantly outperforms existing parallel decision tree algorithms in the trade-off between accuracy and efficiency. (Microsoft Research; presented at NIPS 2016.)
Federated Optimization

We refer to the optimization problem implicit in federated learning as federated optimization, establishing a connection (and contrast) with distributed optimization. Federated optimization has several key properties that distinguish it from a typical distributed optimization problem. Non-IID data: the training data on a given client is typically based on that particular user's usage of their mobile device, so any individual user's local dataset will not be representative of the population distribution.
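The server-side aggregation at the heart of this setting can be sketched in a few lines. This is a FedAvg-style weighted average (the names and toy values below are illustrative, not from the paper): clients contribute in proportion to their local dataset size, precisely because no single non-IID client is representative on its own.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg server step: average client models weighted by local
    dataset size, so larger (but still non-representative) clients
    contribute proportionally more."""
    total = sum(client_sizes)
    agg = np.zeros_like(client_weights[0])
    for w, n in zip(client_weights, client_sizes):
        agg += (n / total) * w
    return agg

# three clients with non-IID local models and very different data sizes
clients = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
sizes = [100, 300, 600]
global_w = fedavg_aggregate(clients, sizes)
# weighted mean: 0.1*[1,0] + 0.3*[0,1] + 0.6*[1,1] = [0.7, 0.9]
```

Weighting by dataset size keeps the aggregate an unbiased estimate of the update that would be computed if all the data were pooled in one place.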
My reading notes on federated learning begin with the seminal paper "Communication-Efficient Learning of Deep Networks from Decentralized Data". This paper explores the core concepts and methods of federated learning and has significant theoretical and practical value. As a newcomer to research, I am sharing my reading notes here in the hope of better understanding federated learning. If I...
Compared with conventional federated algorithms such as FedAvg, existing methods for clustered federated learning (CFL) require either higher communication costs or multi-stage computation overhead. In this paper, we propose an iterative CFL framework with almost the same per-round communication cost as FedAvg, based on a ...
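A minimal sketch of the cluster-assignment step that any such iterative CFL framework needs (the snippet above does not show the paper's actual criterion, so the rule below — each client joins the cluster whose model best fits its local data — is an assumption for illustration). Communication stays close to plain FedAvg because each client sends back only its chosen cluster index, not extra models:

```python
import numpy as np

def assign_clusters(cluster_models, client_data):
    """Each client evaluates every cluster model on its local data and
    joins the cluster whose model fits best (lowest squared error).
    Only the chosen cluster index travels back to the server."""
    assignments = []
    for X, y in client_data:
        losses = [np.mean((X @ w - y) ** 2) for w in cluster_models]
        assignments.append(int(np.argmin(losses)))
    return assignments

# two cluster models and two clients whose data matches one model each
models = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
rng = np.random.default_rng(1)
data = []
for w_true in (models[0], models[1]):
    X = rng.normal(size=(30, 2))
    data.append((X, X @ w_true))

print(assign_clusters(models, data))  # → [0, 1]
```

After assignment, a FedAvg-style aggregation would run separately within each cluster, which is why the per-round cost can match FedAvg's.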