Abstract
To overcome the sensitivity of the conventional SIFT (scale-invariant feature transform) descriptor to image noise and non-linear intensity transformations in image matching, a global structured SIFT descriptor and its generation method are proposed. The method replaces the rectangular region around each feature point with concentric circular regions centered on the point and spreading outward, and accumulates curvature values over ten directions within these regions. From these accumulated values, a feature vector is built whose description range is a function of the feature point's scale; sorting this vector endows it with complete rotation and scale invariance, yielding the global structured SIFT descriptor. Euclidean distance is adopted as the matching metric. Experimental results show that jointly encoding global and local structural information improves robustness to illumination, translation, rotation, and other transformations, raises matching accuracy by 18%, and greatly improves the matching results.
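The pipeline described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: the number of concentric rings, the binning scheme, the normalization, and the ratio-test threshold are all assumptions; only the broad steps (curvature accumulation over concentric rings in ten directions, sorting for rotation invariance, Euclidean-distance matching) come from the abstract.

```python
import numpy as np

N_RINGS = 4          # number of concentric rings around the keypoint (assumed)
N_DIRECTIONS = 10    # curvature directions, as stated in the abstract

def build_descriptor(curvature_patch):
    """Accumulate curvature over concentric rings in 10 angular directions,
    then sort each ring's bins so the descriptor no longer depends on the
    patch orientation (the sorting step that gives rotation invariance)."""
    h, w = curvature_patch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_r = min(cy, cx)
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(ys - cy, xs - cx)
    theta = np.mod(np.arctan2(ys - cy, xs - cx), 2 * np.pi)
    # Assign every pixel to a (ring, direction) bin.
    ring = np.minimum((r / max_r * N_RINGS).astype(int), N_RINGS - 1)
    direction = np.minimum((theta / (2 * np.pi) * N_DIRECTIONS).astype(int),
                           N_DIRECTIONS - 1)
    desc = np.zeros((N_RINGS, N_DIRECTIONS))
    np.add.at(desc, (ring, direction), curvature_patch)
    desc = np.sort(desc, axis=1)      # sorting removes dependence on orientation
    desc = desc.ravel()
    n = np.linalg.norm(desc)
    return desc / n if n > 0 else desc

def match(descs_a, descs_b, ratio=0.8):
    """Nearest-neighbour matching by Euclidean distance with a ratio test
    (the 0.8 threshold is an assumption, not from the paper)."""
    matches = []
    for i, d in enumerate(descs_a):
        dists = np.linalg.norm(descs_b - d, axis=1)
        order = np.argsort(dists)
        if len(order) > 1 and dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches
```

A 180-degree rotation of the patch permutes the direction bins cyclically, so after sorting the descriptor is unchanged, which is the invariance property the sorting step is meant to provide.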
Source
《华中科技大学学报(自然科学版)》
EI
CAS
CSCD
Peking University Core (北大核心)
2012, No. 1, pp. 15-20 (6 pages)
Journal of Huazhong University of Science and Technology(Natural Science Edition)
Funding
Supported by the National Science and Technology Major Project of China (2010ZX03004-002, 2011ZX002-4)
Keywords
image matching
feature matching
feature descriptor
scale invariant feature transform
robustness