Federated learning (FL) has gained significant attention for enabling privacy preservation and knowledge sharing by transmitting model parameters from clients to a central server. However, with increasing network scale and limited bandwidth, uploading complete model parameters has become increasingly impractical. To address this challenge, we leverage the high informativeness of prototypes (feature centroids representing samples of the same class) and propose federated prototype momentum contrastive learning (FedPMC). At the communication level, FedPMC reduces communication overhead by using prototypes as carriers instead of full model parameters. At the local model update level, to mitigate overfitting, we construct an expanded batch sample space to incorporate richer visual information, design a supervised contrastive loss between global and real-time local prototypes, and adopt momentum contrast to update the model gradually. At the framework level, to fully exploit the feature space of the samples, we employ three different pre-trained models for feature extraction and concatenate their outputs as input to the local model. FedPMC supports personalized local models and utilizes both global and local prototypes to address data heterogeneity among clients. We evaluate FedPMC alongside other state-of-the-art FL algorithms on the Digit-5 dataset within a unified lightweight framework to assess their comparative performance. The code is available at https://github.com/zhy665/fedPMC.
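The core ideas named in the abstract can be sketched compactly: a prototype is the centroid of same-class feature vectors, the momentum encoder is an exponential moving average of the local model, and the contrastive loss pulls each local class prototype toward the same-class global prototype while pushing it away from other classes. The sketch below is a minimal NumPy illustration under these assumptions; the function names, the softmax-over-classes loss form, and the EMA coefficient are illustrative and not taken from the paper's implementation.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    # Prototype for class c = mean (centroid) of the feature vectors
    # of all samples with label c.
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def momentum_update(theta_momentum, theta_local, m=0.99):
    # Momentum-contrast-style update: the momentum parameters move
    # slowly toward the current local parameters (EMA).
    return m * theta_momentum + (1.0 - m) * theta_local

def proto_contrastive_loss(local_protos, global_protos, tau=0.5):
    # Supervised contrastive loss between local and global prototypes:
    # for each class, the same-class global prototype is the positive,
    # all other classes' global prototypes are negatives.
    ln = local_protos / np.linalg.norm(local_protos, axis=1, keepdims=True)
    gn = global_protos / np.linalg.norm(global_protos, axis=1, keepdims=True)
    logits = ln @ gn.T / tau                       # (C, C) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -np.log(np.diag(probs)).mean()          # diagonal = positive pairs
```

In a full round, each client would upload its class prototypes (a C x D matrix, far smaller than the model), the server would aggregate them into global prototypes, and clients would then minimize this loss alongside their task loss while updating a momentum copy of the encoder.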
Funding: supported in part by the Innovation Team Project of Guangdong Province of China (2024KCXTD017) and the Guangdong-Hong Kong-Macao University Joint Laboratory (2023LSYS005).