Prototype-wise momentum-based federated contrast learning
Authors: Yong Zhang, Jingrui Zhang, Yanjie Dong, Feng Liang, Aohan Li, Xiping Hu. Journal of Information and Intelligence, 2025, No. 6, pp. 515–525 (11 pages).
Federated learning (FL) has gained significant attention for enabling privacy preservation and knowledge sharing by transmitting model parameters from clients to a central server. However, with increasing network scale and limited bandwidth, uploading complete model parameters has become increasingly impractical. To address this challenge, we leverage the high informativeness of prototypes, feature centroids representing samples of the same class, and propose federated prototype momentum contrastive learning (FedPMC). At the communication level, FedPMC reduces communication overhead by using prototypes as carriers instead of full model parameters. At the local model update level, to mitigate overfitting, we construct an expanded batch sample space to incorporate richer visual information, design a supervised contrastive loss between global and real-time local prototypes, and adopt momentum contrast to gradually update the model. At the framework level, to fully exploit the sample's feature space, we employ three different pre-trained models for feature extraction and concatenate their outputs as input to the local model. FedPMC supports personalized local models and utilizes both global and local prototypes to address data heterogeneity among clients. We evaluate FedPMC alongside other state-of-the-art FL algorithms on the Digit-5 dataset within a unified lightweight framework to assess their comparative performance. The code is available at https://github.com/zhy665/fedPMC.
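The three core ingredients the abstract names — class prototypes as feature centroids, a supervised contrastive loss between local features and global prototypes, and a MoCo-style momentum update — can be sketched as follows. This is a minimal, hypothetical illustration assuming simple numpy arrays for features and labels; the function names (`class_prototypes`, `proto_contrastive_loss`, `momentum_update`) and the temperature/momentum values are illustrative, not taken from the paper's released code at the GitHub link above.

```python
import numpy as np

def class_prototypes(features, labels):
    """Prototype = per-class mean (centroid) of feature vectors."""
    return {int(c): features[labels == c].mean(axis=0)
            for c in np.unique(labels)}

def momentum_update(key_params, query_params, m=0.99):
    """MoCo-style EMA: slowly move key-encoder params toward query's."""
    return [m * k + (1.0 - m) * q for k, q in zip(key_params, query_params)]

def proto_contrastive_loss(features, labels, global_protos, tau=0.5):
    """Supervised contrastive loss against global prototypes:
    pull each sample toward its own class prototype, push it away
    from the other classes' prototypes."""
    classes = sorted(global_protos)
    P = np.stack([global_protos[c] for c in classes])       # (C, d)
    P = P / np.linalg.norm(P, axis=1, keepdims=True)        # unit norm
    F = features / np.linalg.norm(features, axis=1, keepdims=True)
    logits = F @ P.T / tau                                  # (N, C)
    logits -= logits.max(axis=1, keepdims=True)             # stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.array([classes.index(int(y)) for y in labels])
    return -log_prob[np.arange(len(labels)), idx].mean()
```

In a full FedPMC-like round, clients would upload only the prototype dictionary (a handful of d-dimensional vectors) rather than the model weights, which is what makes the communication cost independent of model size.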
Keywords: Federated learning; Prototype contrastive loss; Momentum contrast