
深度特征选择方法研究综述 (Review Research on Deep Feature Selection Methods), cited by: 2
Abstract: Feature selection removes noise and redundant information from data, reducing computational complexity and the difficulty of data analysis, and is therefore of significant research value in data mining, machine learning, and related fields. With the development of deep learning, deep neural networks have been applied to feature selection and have achieved better results than traditional methods, yet a comprehensive description and discussion of this line of research has been lacking. This paper first reviews traditional feature selection algorithms, then summarizes recent progress in deep feature selection algorithms, dividing them into two categories: input-layer embedding and encoding-layer embedding. Several typical deep feature selection algorithms are evaluated on public datasets, and key directions for future research are discussed.
Authors: 陈挺 (Chen Ting), 刘香君 (Liu Xiangjun), 臧璇 (Zang Xuan), 池明旻 (Chi Mingmin) (Shanghai Key Laboratory of Data Science, School of Computer Science, Fudan University, Shanghai 200438, China; Zhongshan Fudan Joint Innovation Center, Zhongshan PoolNet Technology Co., Ltd., Zhongshan 528400, Guangdong, China)
Source: Computer Applications and Software (《计算机应用与软件》, PKU Core Journal), 2025, No. 7, pp. 1-11, 32 (12 pages)
Funding: National Natural Science Foundation of China (Grant No. 62171139).
Keywords: feature selection; data mining; deep learning; deep feature selection
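The abstract names the two categories of deep feature selection only at a high level. As an illustration (not taken from the paper itself), the input-layer embedding idea can be sketched as a trainable mask applied to the model's inputs, sparsified with an L1 penalty so that uninformative features are driven toward zero. All names and hyperparameter values below are hypothetical, and a plain linear model stands in for a deep network in this minimal numpy sketch:

```python
import numpy as np

# Synthetic data: only features 0 and 1 carry signal; the rest are noise.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

# Input-layer embedding (illustrative): a trainable mask m on the inputs,
# followed here by linear weights w instead of a deep network.
m = np.ones(d)           # feature mask (the "selection" layer)
w = np.zeros(d)          # downstream model weights
lr, lam = 0.1, 0.01      # learning rate and L1 strength (hypothetical values)

for _ in range(500):
    u = m * w                         # effective per-feature weight
    r = X @ u - y                     # residuals of the masked model
    g_u = (2.0 / n) * X.T @ r         # gradient w.r.t. the effective weight
    g_m = g_u * w + lam * np.sign(m)  # L1 subgradient sparsifies the mask
    g_w = g_u * m
    m -= lr * g_m
    w -= lr * g_w

# Rank features by the magnitude of mask * weight; the top-k are "selected".
scores = np.abs(m * w)
top2 = set(np.argsort(-scores)[:2].tolist())
print(top2)
```

Under these assumptions the two informative features end up with effective weights near 3 and 2 while the noise features are shrunk toward zero, which is the behavior the surveyed input-layer methods aim for at the scale of real networks.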
