FedBS: Solving data heterogeneity issue in federated learning using balanced subtasks
Authors: Chuxiao Su, Jing Wu, Rui Zhang, Zi Kang, Hui Xia, Cheng Zhang. High-Confidence Computing, 2025, Issue 4, pp. 142–150 (9 pages).
Federated learning has emerged as a popular paradigm for distributed machine learning, enabling participants to collaborate on model training while preserving local data privacy. However, a key challenge in deploying federated learning in real-world applications arises from the substantial heterogeneity in local data distributions across participants. These differences can have negative consequences, such as degraded performance of aggregated models. To address this issue, we propose a novel approach that advocates decomposing the skewed original task into a series of relatively balanced subtasks. Decomposing the task allows us to derive unbiased feature extractors for the subtasks, which are then utilized to solve the original task. Based on this concept, we have developed the FedBS algorithm. Through comparative experiments on various datasets, we have demonstrated that FedBS outperforms traditional federated learning algorithms such as FedAvg and FedProx in terms of accuracy, convergence speed, and robustness. The main reason behind these improvements is that FedBS addresses the data heterogeneity problem in federated learning by decomposing the original task into smaller, more balanced subtasks, thereby more effectively mitigating imbalances during model training.
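The abstract does not specify how the decomposition into balanced subtasks is performed, so the following is only a plausible illustration of the general idea, not the paper's actual algorithm. One simple way to turn a label-skewed multi-class task into a set of balanced subtasks is to form pairwise (one-vs-one) subtasks and downsample the larger class in each pair to the size of the smaller; the function name `balanced_subtasks` and this construction are assumptions for illustration only.

```python
import random
from collections import defaultdict
from itertools import combinations

def balanced_subtasks(labels, classes, seed=0):
    """Hypothetical sketch: for each pair of classes, build a subtask whose
    sample set is balanced by downsampling the larger class to match the
    smaller one, so each subtask sees an even class distribution even when
    the original labels are heavily skewed."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)  # group sample indices by class label
    subtasks = []
    for a, b in combinations(classes, 2):
        n = min(len(by_class[a]), len(by_class[b]))
        # take n samples from each class, yielding a 50/50 subtask
        idx = rng.sample(by_class[a], n) + rng.sample(by_class[b], n)
        subtasks.append({"classes": (a, b), "indices": idx})
    return subtasks

# Skewed toy labels: class 0 dominates (90 vs. 6 vs. 4 samples).
labels = [0] * 90 + [1] * 6 + [2] * 4
for st in balanced_subtasks(labels, [0, 1, 2]):
    print(st["classes"], len(st["indices"]))  # each subtask is balanced
```

In a federated setting, each client could train feature extractors on such subtasks locally; because every subtask is balanced by construction, the extractors are less biased toward the globally dominant classes before being combined for the original task.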
Keywords: Distributed machine learning; Federated learning; Data heterogeneity; Non-IID data; Unbalanced data