Journal Articles
2 articles found
1. MSRA-Fed: A Communication-Efficient Federated Learning Method Based on Model Split and Representation Aggregate
Authors: LIU Qinbo, JIN Zhihao, WANG Jiabo, LIU Yang, LUO Wenjian
ZTE Communications, 2022, No. 3, pp. 35-42
Recent years have witnessed a spurt of progress in federated learning, which coordinates model training across multiple participants while protecting the privacy of their data. However, low communication efficiency is a bottleneck when deploying federated learning to edge computing and IoT devices, because a huge number of parameters must be transmitted during co-training. In this paper, we verify that the outputs of the last hidden layer can record the characteristics of the training data. Accordingly, we propose a communication-efficient strategy based on model split and representation aggregation. Specifically, each client uploads the outputs of its last hidden layer instead of all model parameters when participating in aggregation, and the server distributes gradients computed from the global information to revise the local models. Empirical evidence from experiments verifies that our method can complete training by uploading less than one-tenth of the model parameters while preserving the usability of the model.
Keywords: federated learning; communication load; efficient communication; privacy protection
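The communication pattern described in this abstract (clients upload last-hidden-layer representations instead of full parameter sets; the server returns a correction signal) can be illustrated with a minimal sketch. The code below is a hypothetical NumPy illustration of the general idea, not the authors' MSRA-Fed implementation; the class and function names (Client, server_round, apply_correction) and all dimensions are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical sketch of representation-based aggregation (not the MSRA-Fed code).
# Each client keeps its model and data local; instead of uploading all parameters,
# it uploads only the last-hidden-layer outputs of its local batch. The server
# aggregates these representations and returns a correction gradient that each
# client uses to revise its local feature extractor.

rng = np.random.default_rng(0)

class Client:
    def __init__(self, dim_in=256, dim_hidden=8, n_classes=4, n_samples=32):
        self.W1 = rng.normal(scale=0.1, size=(dim_in, dim_hidden))     # feature extractor
        self.W2 = rng.normal(scale=0.1, size=(dim_hidden, n_classes))  # local classifier head
        self.X = rng.normal(size=(n_samples, dim_in))                  # private local data
        self.y = rng.integers(0, n_classes, size=n_samples)

    def last_hidden_outputs(self):
        # Representations to upload: 32 x 8 = 256 values here, versus
        # 256*8 + 8*4 = 2080 parameters for the full model.
        return np.tanh(self.X @ self.W1)

    def apply_correction(self, rep_grad, lr=0.1):
        # Backpropagate the server-provided representation gradient through tanh into W1.
        H = np.tanh(self.X @ self.W1)
        self.W1 -= lr * self.X.T @ (rep_grad * (1.0 - H ** 2)) / len(self.X)

def server_round(clients):
    # The server sees only representations, never raw data or full parameter vectors.
    reps = [c.last_hidden_outputs() for c in clients]
    global_mean = np.concatenate(reps, axis=0).mean(axis=0)
    # Toy correction: pull each client's representations toward the global summary.
    for c, r in zip(clients, reps):
        c.apply_correction(r - global_mean)

clients = [Client() for _ in range(3)]
for _ in range(5):
    server_round(clients)
print("values uploaded per client per round:", clients[0].last_hidden_outputs().size)
```

The toy correction here only pulls representations toward a global mean; the paper's method computes gradients from global information on the server side, so this sketch should be read as demonstrating the communication pattern (small representations up, gradients down), not the training algorithm itself.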
2. Coded Distributed Computing for System with Stragglers
Authors: Xu Jiasheng, Kang Huquan, Zhang Haonan, Fu Luoyi, Long Fei, Cao Xinde, Wang Xinbing, Zhou Chenghu
China Communications, 2025, No. 8, pp. 298-313
Distributed computing is an important topic in the field of wireless communications and networking, and its high efficiency in handling large amounts of data is particularly noteworthy. Although distributed computing benefits from its ability to process data in parallel, it incurs a communication burden between servers, which delays the overall computation. Recent research has applied coding to distributed computing to reduce this communication burden: repetitive computation is used to create multicast opportunities so that the same coded information can be reused across different servers. To handle computation tasks in practical heterogeneous systems, we propose a novel coding scheme to effectively mitigate the "straggling effect" in distributed computing. We assume that there are two types of servers in the system whose only difference is their computational capability; the servers with lower computational capability are called stragglers. Given any ratio of fast servers to slow servers and any gap in computational capability between them, we achieve approximately the same computation time for both fast and slow servers by assigning different amounts of computation to them, thus reducing the overall computation time. Furthermore, we investigate the information-theoretic lower bound of the inter-server communication load and show that the lower bound is within a constant multiplicative gap of the upper bound achieved by our scheme. Various simulations also validate the effectiveness of the proposed scheme.
Keywords: coded computation; communication load; distributed computing; straggling effect
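The load-assignment idea in this abstract (give fast and slow servers different shares of the computation so that both finish at roughly the same time) can be sketched as follows. This is a hypothetical Python illustration of the load-balancing step only; it does not implement the paper's coding or multicast scheme, and the speed values and function names are assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: servers differ only in speed (tasks per unit time).
# Assign each server a share of the map tasks proportional to its speed so
# that fast servers and stragglers finish at approximately the same time.

def balanced_assignment(n_tasks, speeds):
    """Split n_tasks across servers in proportion to their speeds."""
    speeds = np.asarray(speeds, dtype=float)
    shares = speeds / speeds.sum()
    counts = np.floor(shares * n_tasks).astype(int)
    counts[np.argmax(speeds)] += n_tasks - counts.sum()  # hand the remainder to the fastest server
    return counts

def completion_times(counts, speeds):
    """Per-server finish time = assigned tasks / speed."""
    return np.asarray(counts) / np.asarray(speeds, dtype=float)

speeds = [4.0, 4.0, 1.0, 1.0]            # two fast servers, two stragglers (4x capability gap)
counts = balanced_assignment(1000, speeds)
print("tasks per server:", counts)                              # e.g. [400 400 100 100]
print("completion times:", completion_times(counts, speeds))    # approximately equal
```

In the paper's setting this balancing is combined with coded, repetitive computation to create multicast opportunities that further reduce the inter-server communication load; the sketch above covers only the task-splitting intuition.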