Abstract
The fundamentals of two kernel learning algorithms, kernel principal component analysis (KPCA) and support vector machines (SVM), are introduced, and an upper-bound estimate of the generalization error is given as a criterion for optimally selecting the SVM kernel parameters and penalty factor. Based on multivariate autoregressive model theory, features are extracted from the EEG signals of four subjects performing three different mental tasks; KPCA is then used for dimensionality-reduction preprocessing, and the SVM is trained and tested for classification. The results show that the KPCA algorithm has strong feature-selection capability in high-dimensional feature space, that the classification accuracy of the SVM with optimized kernel parameters is significantly higher than that of a radial basis function network, and that the average classification accuracy over the three mental tasks reaches 78.6%.
The fundamentals of two kernel-based learning algorithms, kernel principal component analysis (KPCA) and support vector machines (SVM), are introduced. An estimation formula for an upper bound on the generalization error is given to select the optimal kernel parameters and penalization factor of the SVM. Six-channel EEG data were recorded from four subjects while they performed three different mental tasks. A multivariate autoregressive (MVAR) model is applied to extract features from the EEG. The dimensionality of the feature vectors formed by the MVAR model coefficients is first reduced by KPCA. The lower-dimensional feature vectors are then used as inputs to the SVM with optimal parameters for training and for testing classification accuracy on the three mental tasks. The classification accuracies indicate that the KPCA technique is a powerful feature selector in high-dimensional feature space, and that the optimized SVM achieves results significantly better than those of a Radial Basis Function (RBF) network. The average classification accuracy over the three mental tasks of the four subjects reaches 78.6%.
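The abstract describes a two-stage pipeline: KPCA reduces the dimensionality of MVAR-coefficient feature vectors, and an RBF-kernel SVM with tuned kernel parameter and penalty factor classifies the reduced vectors. A minimal sketch of that pipeline, using scikit-learn and synthetic stand-in data (the feature sizes, parameter values, and random labels are illustrative assumptions, not the paper's data or its generalization-bound parameter selection):

```python
# Illustrative sketch (not the paper's code): KPCA dimensionality
# reduction followed by an RBF-kernel SVM on synthetic stand-in data.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for MVAR-coefficient feature vectors:
# 120 samples x 36 features (sizes chosen only for illustration).
X = rng.normal(size=(120, 36))
y = rng.integers(0, 3, size=120)  # three "mental task" labels
# Shift class means so the classes are weakly separable.
X[y == 1, :5] += 2.0
X[y == 2, 5:10] -= 2.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: KPCA projects the features onto leading kernel
# principal components, reducing dimensionality from 36 to 10.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.05)
Z_tr = kpca.fit_transform(X_tr)
Z_te = kpca.transform(X_te)

# Stage 2: RBF-kernel SVM with kernel parameter gamma and penalty
# factor C. The paper selects these via a generalization-error
# upper bound; here they are simply fixed for illustration.
clf = SVC(kernel="rbf", C=10.0, gamma=0.1).fit(Z_tr, y_tr)
acc = clf.score(Z_te, y_te)
print(f"test accuracy: {acc:.3f}")
```

In practice the fixed `C` and `gamma` above would be replaced by a search that minimizes the paper's upper-bound criterion (or, more commonly today, cross-validated error).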
Source
《电子学报》
EI
CAS
CSCD
北大核心
2004, No. 10, pp. 1749-1753 (5 pages)
Acta Electronica Sinica
Funding
National Natural Science Foundation of China (No. 30370395)
Keywords
kernel principal component analysis
KPCA
support vector machine
SVM
mental task
EEG signal
EEG
feature extraction
Algorithms
Brain
Classification (of information)
Estimation
Feature extraction
Optimization
Principal component analysis