Journal Articles
2 articles found
1. Division in Unity: Towards Efficient and Privacy-Preserving Learning of Healthcare Data
Authors: Panyu Liu, Tongqing Zhou, Guofeng Lu, Huaizhe Zhou, Zhiping Cai. Computers, Materials & Continua, 2025, No. 11, pp. 2913-2934 (22 pages).
The isolation of healthcare data among hospitals and institutes worldwide forms barriers to fully realizing the promise of data-hungry artificial intelligence (AI) models in renewing medical services. To overcome this, privacy-preserving distributed learning frameworks, represented by swarm learning and federated learning, have recently been investigated, with the sensitive healthcare data remaining on local premises. However, existing frameworks use a one-size-fits-all mode that tunes one model for all healthcare situations, which can hardly fit the usually diverse disease-prediction tasks encountered in practice. This work introduces the idea of ensemble learning into privacy-preserving distributed learning and presents the En-split framework, where the predictions of multiple expert models with specialized diagnostic capabilities are jointly explored. Considering the exacerbated communication and computation burdens of learning with multiple models, model splitting is used to partition each target model into two parts, with hospitals focusing on building the feature-enriched shallow layers. Meanwhile, dedicated noise is applied to the edge layers for differential privacy protection. Experiments on two public datasets demonstrate En-split's superior accuracy and efficiency compared with existing distributed learning frameworks.
Keywords: collaborative learning; federated learning; split learning
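The En-split abstract above combines model splitting with noise at the split layer. Below is a minimal PyTorch sketch of that general idea, not the paper's implementation: the hospital keeps a shallow feature extractor and sends clipped, Gaussian-noised activations to the server, where several expert heads are ensembled. All class and parameter names (ClientShallow, ServerExpert, noise_scale, clip) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ClientShallow(nn.Module):
    """Hospital-side shallow layers (hypothetical architecture)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x, clip=1.0, noise_scale=0.1):
        h = self.features(x)
        # Clip the per-sample activation norm, then add Gaussian noise
        # at the split ("edge") layer, in the spirit of differential privacy.
        norms = h.flatten(1).norm(dim=1).clamp(min=1e-12)
        scale = (clip / norms).clamp(max=1.0).view(-1, 1, 1, 1)
        return h * scale + noise_scale * torch.randn_like(h)

class ServerExpert(nn.Module):
    """Server-side deep layers of one specialized expert (hypothetical)."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(16 * 14 * 14, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, h):
        return self.head(h)

client = ClientShallow()
experts = [ServerExpert() for _ in range(3)]  # multiple expert models

x = torch.randn(8, 1, 28, 28)          # a batch of toy single-channel images
smashed = client(x)                    # only noisy activations leave the hospital
logits = torch.stack([e(smashed) for e in experts]).mean(dim=0)  # ensemble prediction
```

Under this reading, raw records never cross the network; the hospital computes only the cheap shallow layers, while the heavier expert models run server-side.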
2. FedBone: Towards Large-Scale Federated Multi-Task Learning (cited by 1)
Authors: Yi-Qiang Chen, Teng Zhang, Xin-Long Jiang, Qian Chen, Chen-Long Gao, Wu-Liang Huang. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2024, No. 5, pp. 1040-1057 (18 pages).
Federated multi-task learning (FMTL) has emerged as a promising framework for learning multiple tasks simultaneously with client-aware personalized models. While the majority of studies have focused on dealing with the non-independent and identically distributed (non-IID) characteristics of client datasets, the issue of task heterogeneity has largely been overlooked. Dealing with task heterogeneity often requires complex models, making it impractical for federated learning in resource-constrained environments. In addition, the varying nature of these heterogeneous tasks introduces inductive biases, leading to interference during aggregation and potentially resulting in biased global models. To address these issues, we propose a hierarchical FMTL framework, referred to as FedBone, to facilitate the construction of large-scale models with improved generalization. FedBone leverages server-client split learning and gradient projection to split the entire model into two components: 1) a large-scale general model on the cloud server, and 2) multiple task-specific client models on edge clients, accommodating devices with limited compute power. To enhance the robustness of the large-scale general model, we incorporate the conflicting-gradient projection technique into FedBone to rectify the skewed gradient directions caused by aggregating gradients from heterogeneous tasks. The proposed FedBone framework is evaluated on three benchmark datasets and one real ophthalmic dataset. Comprehensive experiments demonstrate that FedBone efficiently adapts to the heterogeneous local tasks of each client and outperforms existing federated learning algorithms on various dense prediction and classification tasks while utilizing off-the-shelf computational resources on the client side.
Keywords: federated learning; multi-task learning; split learning; heterogeneous tasks
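The conflicting-gradient projection that FedBone's abstract mentions can be illustrated with a PCGrad-style sketch: when two tasks' gradients point in conflicting directions (negative inner product), the conflicting component is projected away before the gradients are aggregated into the general model. This is a sketch of the general technique under assumed flattened per-task gradient vectors, not the paper's exact algorithm; project_conflicting and the sample gradients are hypothetical.

```python
import torch

def project_conflicting(task_grads):
    """PCGrad-style projection over flattened per-task gradient vectors:
    if g_i conflicts with g_j (negative dot product), subtract g_i's
    projection onto g_j, then average the de-conflicted gradients."""
    projected = [g.clone() for g in task_grads]
    for i, g_i in enumerate(projected):
        for j, g_j in enumerate(task_grads):
            if i != j and torch.dot(g_i, g_j) < 0:
                # Remove the component of g_i that opposes g_j.
                g_i -= torch.dot(g_i, g_j) / (g_j.norm() ** 2 + 1e-12) * g_j
    return torch.stack(projected).mean(dim=0)

# Two heterogeneous tasks whose gradients partially conflict.
g_dense = torch.tensor([1.0, 1.0, 0.0])  # e.g., a dense-prediction task
g_cls = torch.tensor([-1.0, 0.5, 0.0])   # e.g., a classification task
g_server = project_conflicting([g_dense, g_cls])  # update for the general model
```

Naive averaging would let the two tasks partially cancel each other; projecting away the conflicting components first is one standard way to rectify skewed gradient directions when aggregating heterogeneous tasks.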