Journal Articles
5 articles found
1. A splicing algorithm for best subset selection in sliced inverse regression
Authors: Borui Tang, Jin Zhu, Tingyin Wang, Junxian Zhu. Journal of University of Science and Technology of China (Peking University Core Index), 2025, No. 5, pp. 22-34, 21, I0001 (15 pages).
In this study, we examine the problem of sliced inverse regression (SIR), a widely used method for sufficient dimension reduction (SDR). SIR was designed to find reduced-dimensional versions of multivariate predictors by replacing them with a minimally adequate collection of their linear combinations without loss of information. Recently, regularization methods have been proposed for SIR to incorporate a sparse structure of predictors for better interpretability. However, existing methods rely on convex relaxation to bypass the sparsity constraint, which may not lead to the best subset and, in particular, tends to include irrelevant variables when predictors are correlated. In this study, we approach sparse SIR as a nonconvex optimization problem and tackle the sparsity constraint directly by establishing the optimal conditions and solving them iteratively by means of the splicing technique. Without employing convex relaxation on either the sparsity constraint or the orthogonality constraint, our algorithm exhibits superior empirical merits, as evidenced by extensive numerical studies. Computationally, our algorithm is much faster than the relaxed approach for the natural sparse SIR estimator. Statistically, it surpasses existing methods in accuracy for central subspace estimation and best subset selection, and it sustains high performance even with correlated predictors.
Keywords: splicing technique; best subset selection; sliced inverse regression; nonconvex optimization; sparsity constraint; optimal conditions
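The splicing algorithm above builds on the classical SIR estimator of Li (1991). As background, here is a minimal NumPy sketch of plain (non-sparse) SIR, not the paper's splicing procedure; the function name `sir`, the slice count, and the toy single-index model are illustrative assumptions.

```python
import numpy as np

def sir(X, y, n_slices=10, n_dirs=1):
    """Classical sliced inverse regression: estimate e.d.r. directions."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    mu = X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice on y and average Z within each slice
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)   # weighted cov of slice means
    # Leading eigenvectors of M, mapped back to the original scale
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy single-index model: y depends on X only through X @ beta
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
beta = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.25 * rng.normal(size=2000)
B = sir(X, y)
```

The paper's method additionally enforces a cardinality constraint on the rows of `B`, searched by splicing rather than by convex relaxation.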
2. Projective Resampling Functional Sliced Inverse Regression
Authors: QU Wenxin, LIANG Beiting, WANG Guochang. Journal of Systems Science & Complexity, 2025, No. 5, pp. 2185-2203 (19 pages).
For functional data, the most popular dimension reduction methods are functional sliced inverse regression (FSIR) and functional sliced average variance estimation (FSAVE). Both FSIR and FSAVE are based on the slicing approach to estimate the conditional expectation E[x(t)|y]. While slice-based methods are effective for scalar responses, they often perform poorly, or even fail, for multivariate responses and small sample sizes because of the so-called "curse of dimensionality". To avoid this problem, this study proposes a projective resampling method that first projects the multivariate response onto scalar responses and then applies an SDR method for univariate responses to estimate the effective dimension reduction (e.d.r.) space. The proposed projective resampling method is insensitive to the number of slices and the dimensionality of the response variable. In theory, it can fully recover the effective dimension reduction space. Furthermore, this study investigates the performance of the proposed method through simulation studies and a real data analysis and compares it with other methods.
Keywords: effective dimension reduction; functional sliced inverse regression (FSIR); functional data; multivariate response; projective resampling
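For intuition, the projective resampling idea can be sketched in a finite-dimensional (non-functional) setting: draw random unit vectors, project the multivariate response onto each, accumulate the scalar-response SIR kernel, and average. This is a simplified analogue of the paper's functional method, not the authors' estimator; all names and parameters below are illustrative.

```python
import numpy as np

def sir_kernel(Z, y, n_slices=5):
    """SIR kernel matrix for standardized predictors Z and a scalar response y."""
    n, p = Z.shape
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    return M

def projective_resampling_sir(X, Y, n_proj=200, n_slices=5, n_dirs=1, seed=0):
    """Average the SIR kernel over random projections of a multivariate response."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Standardize the predictors once
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    M = np.zeros((p, p))
    for _ in range(n_proj):
        b = rng.normal(size=Y.shape[1])
        b /= np.linalg.norm(b)               # uniform direction on the sphere
        M += sir_kernel(Z, Y @ b, n_slices)  # scalar-response SIR on the projection
    _, evecs = np.linalg.eigh(M / n_proj)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy bivariate response driven by a single index u = X @ beta
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))
beta = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
u = X @ beta
Y = np.column_stack([u + 0.2 * rng.normal(size=2000),
                     u ** 2 + 0.2 * rng.normal(size=2000)])
B = projective_resampling_sir(X, Y)
```

Averaging over projections is what makes the estimate insensitive to the response dimension: each projection needs only a univariate slicing.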
3. Cluster-Based Regularized Sliced Inverse Regression for Forecasting Macroeconomic Variables (cited 1 time)
Authors: YU Yue, CHEN Zhihong, YANG Jie. Journal of Systems Science & Complexity (SCIE, EI, CSCD), 2014, No. 1, pp. 75-91 (17 pages).
This paper concerns dimension reduction in regression for large data sets. The authors introduce a new method based on the sliced inverse regression approach, called cluster-based regularized sliced inverse regression. The proposed method not only keeps the merit of considering both the response's and the predictors' information, but also enhances the capability of handling highly correlated variables. It is justified under certain linearity conditions. An empirical application to a macroeconomic data set shows that the proposed method outperforms the dynamic factor model and other shrinkage methods.
Keywords: cluster-based; forecast; macroeconomics; sliced inverse regression
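The paper's full method clusters correlated predictors before regularizing; as a simpler illustration of the regularization idea alone, the sketch below runs SIR with a ridge-regularized predictor covariance. This is not the paper's algorithm; the `lam` parameter and the AR(1) toy design are assumptions for the example.

```python
import numpy as np

def ridge_sir(X, y, lam=0.05, n_slices=10, n_dirs=1):
    """SIR with a ridge-regularized predictor covariance, for correlated predictors."""
    n, p = X.shape
    Sigma = np.cov(X, rowvar=False) + lam * np.eye(p)  # regularization step
    vals, vecs = np.linalg.eigh(Sigma)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Correlated design: AR(1)-type covariance with rho = 0.5
rng = np.random.default_rng(3)
p = 5
Sigma_true = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.normal(size=(2000, p)) @ np.linalg.cholesky(Sigma_true).T
beta = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.25 * rng.normal(size=2000)
B = ridge_sir(X, y)
```

Regularizing the covariance keeps the standardization step stable when predictors are nearly collinear, which is the failure mode the paper targets at macroeconomic scale.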
4. On spline approximation of sliced inverse regression
Authors: Li-ping ZHU, Zhou YU. Science China Mathematics (SCIE), 2007, No. 9, pp. 1289-1302 (14 pages).
Dimension reduction is helpful and often necessary in exploring nonparametric regression structure. In this area, sliced inverse regression (SIR) is a promising tool for estimating the central dimension reduction (CDR) space. To estimate the kernel matrix of SIR, we suggest a spline approximation using least squares regression. Heteroscedasticity can be incorporated well by introducing an appropriate weight function. Root-n asymptotic normality can be achieved for a wide range of knot choices, which is essentially analogous to kernel estimation. Moreover, we also propose a modified Bayes information criterion (BIC) based on the eigenvalues of the SIR matrix. This modified BIC can be applied to any form of SIR and other related methods. The methodology and some practical issues are illustrated through the horse mussel data. Empirical studies demonstrate the performance of the proposed spline approximation in comparison with existing estimators.
Keywords: asymptotic normality; spline; Bayes information criterion; dimension reduction; sliced inverse regression; structural dimensionality; MSC: 62H12, 62J02
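The slicing step of SIR can be replaced by a least-squares spline fit of the standardized predictors on the response, in the spirit of the abstract above. This is a rough sketch, not the authors' exact estimator; the truncated-power basis and quantile knot placement are illustrative choices.

```python
import numpy as np

def spline_sir(X, y, n_knots=6, degree=3, n_dirs=1):
    """Estimate E[Z|y] by least-squares regression on a truncated-power spline
    basis in y, then take the covariance of the fitted values as the SIR kernel."""
    n, p = X.shape
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    # Truncated-power basis: 1, y, ..., y^degree, (y - k)_+^degree at interior knots
    knots = np.quantile(y, np.linspace(0, 1, n_knots + 2)[1:-1])
    cols = [y ** d for d in range(degree + 1)]
    cols += [np.clip(y - k, 0.0, None) ** degree for k in knots]
    Phi = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(Phi, Z, rcond=None)
    Zhat = Phi @ coef                    # fitted values approximate E[Z | y]
    M = (Zhat.T @ Zhat) / n              # covariance of E[Z | y] (Z is centered)
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Toy single-index model
rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 5))
beta = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.25 * rng.normal(size=2000)
B = spline_sir(X, y)
```

Unlike slicing, the spline fit yields a smooth estimate of E[Z|y], which is what permits the root-n normality over a wide range of knot choices.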
5. Dimension reduction based on weighted variance estimate
Authors: ZHAO JunLong (Department of Mathematics, Beihang University; Laboratory of Mathematics, Information and Behavior of the Ministry of Education, Beijing 100083, China); XU XingZhong (Department of Mathematics, Beijing Institute of Technology, Beijing 100081, China). Science China Mathematics (SCIE), 2009, No. 3, pp. 539-560 (22 pages).
In this paper, we propose a new estimate for dimension reduction, called the weighted variance estimate (WVE), which includes the sliced average variance estimate (SAVE) as a special case. A bootstrap method is used to select the best estimate from the WVE family and to estimate the structural dimension, and this selected best estimate usually performs better than existing methods such as sliced inverse regression (SIR) and SAVE. Methods such as SIR and SAVE usually put the same weight on each observation when estimating the central subspace (CS). By introducing a weight function, WVE puts different weights on different observations according to their distance from the CS. The weight function gives WVE very good performance in general and in complicated situations, for example, when the distribution of the regressor deviates severely from the elliptical distribution on which many methods, such as SIR, are based. Compared with many existing methods, WVE is insensitive to the distribution of the regressor. The consistency of WVE is established. Simulations comparing the performance of WVE with that of other existing methods confirm its advantage.
Keywords: central subspace; contour regression; sliced average variance estimate; sliced inverse regression; sufficient dimension reduction; weight function; MSC: 62G08, 62H05
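WVE reduces to SAVE under constant weights, so a minimal SAVE sketch shows the second-moment machinery involved. This is classical SAVE (Cook and Weisberg), not the paper's weighted estimator; all parameter choices are illustrative. SAVE compares within-slice covariances to the identity and can therefore recover symmetric links that defeat SIR:

```python
import numpy as np

def save_directions(X, y, n_slices=10, n_dirs=1):
    """Sliced average variance estimation (SAVE): compare within-slice
    covariances of the standardized predictors to the identity."""
    n, p = X.shape
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        V = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M += (len(idx) / n) * (V @ V)    # weighted sum of (I - Cov)^2
    _, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

# Symmetric link y = (X @ beta)^2, where slice means vanish and SIR is blind
rng = np.random.default_rng(2)
X = rng.normal(size=(4000, 5))
beta = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
y = (X @ beta) ** 2 + 0.1 * rng.normal(size=4000)
B = save_directions(X, y)
```

WVE generalizes this by replacing the uniform 1/n observation weights with a data-dependent weight function, which is what makes it robust to non-elliptical regressors.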