Linearity Property Testing Approach to Gaussian Kernel Selection
Abstract  Kernel selection is critical to the performance of kernel methods. The computational complexity of existing approaches to Gaussian kernel selection is Ω(n²), which impedes the development of large-scale kernel methods. To address this issue, a linearity property testing approach to Gaussian kernel selection is proposed. Unlike existing approaches, the proposed approach needs only O(ln(1/δ)/ε²) query complexity, and its computational complexity is independent of the sample size. Firstly, a concept called the ε-linearity level is defined, and it is proved that the ε-linearity level can approximate the distance between a function and the linear function class. On this basis, a linearity property testing criterion for Gaussian kernel selection is presented. The criterion is then applied in random Fourier feature space to efficiently assess and select a suitable Gaussian kernel. Theoretical analysis and experimental results demonstrate that the linearity property testing approach to Gaussian kernel selection is feasible and effective.
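The abstract rests on two standard components: mapping data into a random Fourier feature space that approximates a Gaussian kernel, and estimating the distance of a function from the linear function class with O(ln(1/δ)/ε²) queries. The sketch below is an illustrative reconstruction, not the paper's exact criterion: `random_fourier_features` is the standard Rahimi–Recht approximation of the Gaussian kernel, and `blr_linearity_test` is a generic Blum–Luby–Rubinfeld-style additivity check whose sample count follows the quoted query complexity; the tolerance `tol` and the Gaussian query distribution are assumptions made for the sketch.

```python
import numpy as np

def random_fourier_features(X, gamma, D, rng):
    """Map X (n, d) into D random Fourier features so that
    z(x) @ z(y) approximates the Gaussian kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)                # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def blr_linearity_test(f, d, eps, delta, rng, tol=1e-8):
    """Blum-Luby-Rubinfeld-style additivity check with
    m = ceil(ln(1/delta) / eps^2) queries: sample pairs (x, y) and count
    violations of f(x + y) = f(x) + f(y).  The rejection rate estimates
    the distance of f from the linear function class."""
    m = int(np.ceil(np.log(1.0 / delta) / eps ** 2))
    fails = sum(
        abs(f(x + y) - f(x) - f(y)) > tol
        for x, y in ((rng.normal(size=d), rng.normal(size=d)) for _ in range(m))
    )
    return fails / m
```

A linear target such as f(x) = Σᵢ xᵢ is never rejected, while a nonlinear one such as f(x) = ‖x‖² fails on essentially every query. In the abstract's setting, one would run such a test on the learning problem expressed in the random Fourier feature space of each candidate kernel width and prefer the width under which the problem is closest to linear.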
Source  Pattern Recognition and Artificial Intelligence (EI, CSCD, Peking University Core Journal), 2017, No. 9, pp. 815-821 (7 pages)
Funding  Supported by the National Natural Science Foundation of China (No. 61673293)
Keywords  Gaussian Kernel Selection, Linearity Property Testing, Random Fourier Features, Query Complexity
