
A Local Weighted Mean Based Domain Adaptation Learning Framework (Cited by: 10)
Abstract: Maximum mean discrepancy (MMD), a criterion that effectively and efficiently measures the distribution discrepancy between source and target domains, has been used with success. However, MMD is a global measure and therefore reflects, to some extent, only the global distribution discrepancy and global structural difference between domains. We propose the projected maximum local weighted mean discrepancy (PMLWD), which gains locality-preserving ability by integrating the theory and method of the local weighted mean into MMD, and we show theoretically that PMLWD is a generalization of MMD. Building on PMLWD and classical learning theory, we then present a local weighted mean based domain adaptation learning framework (LDAF), from which two domain adaptation learning methods are derived: a multi-label classification algorithm (LDAF MLC) and a domain adaptation support vector machine (LDAF SVM). Finally, experiments on artificial data sets, high-dimensional text data sets, and face data sets show that the LDAF methods outperform other domain adaptation approaches.
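The global criterion the abstract starts from can be illustrated with a minimal sketch of the empirical (biased) squared MMD under a Gaussian kernel. This is not code from the paper: the paper's PMLWD additionally incorporates local weighted means and a projection, which this sketch omits; function names and the kernel bandwidth `gamma` are illustrative choices.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of a and b.
    sq = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-gamma * sq)

def mmd_squared(x, y, gamma=1.0):
    # Biased empirical estimate of the squared maximum mean discrepancy:
    # mean k(x,x') + mean k(y,y') - 2 mean k(x,y).
    kxx = rbf_kernel(x, x, gamma).mean()
    kyy = rbf_kernel(y, y, gamma).mean()
    kxy = rbf_kernel(x, y, gamma).mean()
    return kxx + kyy - 2.0 * kxy

rng = np.random.default_rng(0)
# Two samples from the same distribution vs. a sample shifted by 3 units:
same = mmd_squared(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
diff = mmd_squared(rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2)))
print(same < diff)  # the shifted pair yields a larger discrepancy
```

Because this estimate averages the kernel over all pairs, it is a global summary of the two samples, which is exactly the limitation the paper's locality-preserving PMLWD is designed to address.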
Source: Acta Automatica Sinica (《自动化学报》), 2013, No. 7, pp. 1037-1052 (16 pages). Indexed in EI, CSCD, and the PKU Core Journals list.
Funding: National Natural Science Foundation of China (61272210, 60903100); Natural Science Foundation of Jiangsu Province (BK2011417); Open Project of the Jiangsu Provincial Key Laboratory of Computer Information Processing Technology, Soochow University (KJS1126); Open Project of the Jiangsu Key Laboratory for Novel Environmental Protection (AE201068); Jiangsu Overseas Research and Training Program for University Prominent Young and Middle-aged Teachers and Presidents.
Keywords: Transfer learning (TL), domain adaptation learning (DAL), local weighted mean (LWM), projected maximum local weighted mean discrepancy (PMLWD), local weighted mean based domain adaptation learning framework (LDAF)


