Abstract
We study a supermemory gradient algorithm based on a nonmonotone line search technique for solving large-scale signal recovery problems. The smoothly clipped absolute deviation (SCAD) penalty function is used in place of the ℓ1-norm penalty of the ℓ1-regularized least-squares problem; because a local quadratic approximation of SCAD is convex and differentiable, the gradient and Hessian of the objective function are easy to compute. The algorithm has two key features: each iteration makes full use of the information from several previous iterations, and it avoids storing and computing the Hessian of the objective function, so it is well suited to large-scale signal recovery problems. Under certain assumptions, the convergence of the proposed algorithm is proved, and numerical experiments show that the algorithm is feasible.
We study a nonmonotone supermemory gradient algorithm for solving large-scale sparse signal recovery problems. The ℓ1 penalty function of the constrained ℓ1-regularized least-squares recovery problem is replaced by the smoothly clipped absolute deviation (SCAD) sparsity-promoting penalty function. In addition, a convex and differentiable local quadratic approximation of the SCAD function is employed to make the computation of the gradient and Hessian tractable. The proposed method makes full use of the previous multi-step iterative information at each iteration and avoids the storage and computation of matrices associated with the Hessian of the objective function, so it is suitable for solving large-scale sparse signal recovery problems. Under some assumptions, the convergence properties of the proposed algorithm are analyzed. Numerical results are also reported to show the efficiency of the proposed method.
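To make the smoothing step concrete, the sketch below illustrates the standard SCAD penalty of Fan and Li (2001), its local quadratic approximation (LQA), and a generic nonmonotone Armijo-type acceptance test of Grippo-Lampariello-Lucidi form. This is a minimal illustration under standard textbook definitions, not the authors' implementation; the parameter names (lam, a, delta, the memory of past function values) and function names are placeholders chosen here.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): SCAD penalty, its local
# quadratic approximation (LQA), and a generic nonmonotone Armijo-type test.

def scad_penalty(t, lam, a=3.7):
    """Elementwise SCAD penalty p_lam(|t|); expects a NumPy array t, a > 2."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,
        np.where(
            t <= a * lam,
            -(t**2 - 2.0 * a * lam * t + lam**2) / (2.0 * (a - 1.0)),
            (a + 1.0) * lam**2 / 2.0,
        ),
    )

def scad_derivative(t, lam, a=3.7):
    """Derivative p'_lam(t) of the SCAD penalty for t >= 0 (elementwise)."""
    t = np.abs(np.asarray(t, dtype=float))
    return lam * ((t <= lam) + (t > lam) * np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam))

def scad_lqa(t, t0, lam, a=3.7, eps=1e-8):
    """Convex, differentiable LQA of SCAD around t0 (elementwise):
       p(|t|) ~= p(|t0|) + p'(|t0|) / (2|t0|) * (t^2 - t0^2)."""
    t = np.asarray(t, dtype=float)
    t0 = np.asarray(t0, dtype=float)
    t0_abs = np.maximum(np.abs(t0), eps)  # guard against division by zero
    return scad_penalty(t0, lam, a) + scad_derivative(t0_abs, lam, a) / (2.0 * t0_abs) * (t**2 - t0**2)

def nonmonotone_accept(f_new, f_hist, alpha, g, d, delta=1e-4):
    """Grippo-type nonmonotone Armijo test: accept the step size alpha if
       f(x + alpha*d) <= max(recent objective values) + delta*alpha*g'd."""
    return f_new <= max(f_hist) + delta * alpha * float(np.dot(g, d))
```

In a supermemory gradient method the search direction typically has the form d_k = -g_k + sum_{i=1}^{m} beta_{k,i} d_{k-i}, so only gradients of the smoothed objective and a few stored past directions are required; the step size is then chosen so that a nonmonotone test of the kind sketched above is satisfied. The smoothness of the LQA is what makes these gradients cheap to evaluate at large problem sizes.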
Source
Journal of Shandong University (Natural Science) (《山东大学学报(理学版)》)
Indexed in: CAS, CSCD, Peking University Core Journals (北大核心)
2017, No. 1, pp. 65-73, 80 (10 pages)
Funding
Key Scientific Research Project of Higher Education Institutions of Henan Province (17A110032)
Key Science and Technology Research Project of the Education Department of Henan Province (12B110011)