Journal Articles
10 articles found
1. A splicing algorithm for best subset selection in sliced inverse regression
Authors: Borui Tang, Jin Zhu, Tingyin Wang, Junxian Zhu. 《中国科学技术大学学报》 (Journal of University of Science and Technology of China), PKU Core, 2025, Issue 5, pp. 22-34, 21, I0001 (15 pages).
In this study, we examine the problem of sliced inverse regression (SIR), a widely used method for sufficient dimension reduction (SDR). It was designed to find reduced-dimensional versions of multivariate predictors by replacing them with a minimally adequate collection of their linear combinations without loss of information. Recently, regularization methods have been proposed in SIR to incorporate a sparse structure of predictors for better interpretability. However, existing methods consider convex relaxation to bypass the sparsity constraint, which may not lead to the best subset and, in particular, tends to include irrelevant variables when predictors are correlated. In this study, we approach sparse SIR as a nonconvex optimization problem and directly tackle the sparsity constraint by establishing the optimal conditions and iteratively solving them by means of the splicing technique. Without employing convex relaxation on the sparsity constraint and the orthogonality constraint, our algorithm exhibits superior empirical merits, as evidenced by extensive numerical studies. Computationally, our algorithm is much faster than the relaxed approach for the natural sparse SIR estimator. Statistically, our algorithm surpasses existing methods in terms of accuracy for central subspace estimation and best subset selection and sustains high performance even with correlated predictors.
Keywords: splicing technique; best subset selection; sliced inverse regression; nonconvex optimization; sparsity constraint; optimal conditions
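The splicing algorithm itself is not spelled out in the abstract, but the classical slice-based SIR estimator it sparsifies is standard. Below is a minimal Python sketch of that baseline (slice the sorted response, form the between-slice covariance of the slice means, and solve a generalized eigenproblem against the predictor covariance); the function name sir_directions, the toy data, and all tuning values are ours, not the authors'.

```python
import numpy as np
from scipy.linalg import eigh

def sir_directions(X, y, n_slices=10, n_dir=2):
    """Classical sliced inverse regression: eigen-directions of the between-slice covariance."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)                       # center the predictors
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):   # slices of the sorted response
        m_h = Xc[idx].mean(axis=0)                # slice mean of the centered predictors
        M += (len(idx) / n) * np.outer(m_h, m_h)  # between-slice covariance of E[X | y]
    Sigma = np.cov(Xc, rowvar=False)
    # Generalized eigenproblem M b = lambda * Sigma b; leading b span the estimated subspace.
    vals, vecs = eigh(M, Sigma)
    return vecs[:, np.argsort(vals)[::-1][:n_dir]]

# Toy check: y depends on X only through a single linear combination.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
u = X @ np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])
y = u + 0.25 * u**3 + 0.1 * rng.normal(size=500)
print(sir_directions(X, y, n_dir=1).ravel())
```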
2. Projective Resampling Functional Sliced Inverse Regression
Authors: QU Wenxin, LIANG Beiting, WANG Guochang. 《Journal of Systems Science & Complexity》, 2025, Issue 5, pp. 2185-2203 (19 pages).
For functional data, the most popular dimension reduction methods are functional sliced inverse regression (FSIR) and functional sliced average variance estimation (FSAVE). Both FSIR and FSAVE are based on the slicing approach to estimate the conditional expectation E[x(t)|y]. While slice-based methods are effective for scalar responses, they often perform poorly, or even fail, for multivariate responses and small sample sizes because of the so-called "curse of dimensionality". To avoid this problem, this study proposes a projective resampling method that first projects the multivariate response onto a scalar response and then applies a sufficient dimension reduction method for the univariate response to estimate the effective dimension reduction (e.d.r.) space. The proposed projective resampling method is insensitive to the number of slices and the dimensionality of the response variable. In theory, the proposed resampling method can fully recover the effective dimension reduction space. Furthermore, this study investigates the performance of the proposed method through simulation studies and one real data analysis and compares it with other methods.
Keywords: effective dimension reduction; functional sliced inverse regression (FSIR); functional data; multivariate response; projective resampling
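As a rough illustration of the projective-resampling idea, the sketch below draws random unit vectors, turns the multivariate response into scalar projections, forms a slice-based SIR kernel for each projection, and averages the kernels. The paper treats functional predictors (FSIR); here X is plain multivariate to keep the sketch short, and the names sir_kernel and pr_sir, the number of projections, and the toy data are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh

def sir_kernel(Xc, y, n_slices=5):
    """Between-slice covariance of E[Xc | y] for an already-centered predictor matrix."""
    n, p = Xc.shape
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    return M

def pr_sir(X, Y, n_proj=200, n_slices=5, n_dir=1, seed=0):
    """Average the SIR kernels of randomly projected scalar responses."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    M_bar = np.zeros((X.shape[1], X.shape[1]))
    for _ in range(n_proj):
        b = rng.normal(size=Y.shape[1])
        b /= np.linalg.norm(b)                    # random direction on the unit sphere
        M_bar += sir_kernel(Xc, Y @ b, n_slices) / n_proj
    vals, vecs = eigh(M_bar, np.cov(Xc, rowvar=False))
    return vecs[:, np.argsort(vals)[::-1][:n_dir]]

# Toy bivariate response driven by a single index of X.
rng = np.random.default_rng(1)
X = rng.normal(size=(600, 5))
u = X @ np.array([1.0, -1.0, 0.0, 0.0, 0.0])
Y = np.column_stack([u + rng.normal(size=600), np.exp(0.3 * u) + rng.normal(size=600)])
print(pr_sir(X, Y).ravel())
```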
3. CLUSTER-BASED REGULARIZED SLICED INVERSE REGRESSION FOR FORECASTING MACROECONOMIC VARIABLES (cited: 1)
Authors: YU Yue, CHEN Zhihong, YANG Jie. 《Journal of Systems Science & Complexity》 (SCIE, EI, CSCD), 2014, Issue 1, pp. 75-91 (17 pages).
This paper concerns dimension reduction in regression for large data sets. The authors introduce a new method based on the sliced inverse regression approach, called cluster-based regularized sliced inverse regression. The proposed method not only keeps the merit of considering both response and predictors' information, but also enhances the capability of handling highly correlated variables. It is justified under certain linearity conditions. An empirical application on a macroeconomic data set shows that the proposed method outperforms the dynamic factor model and other shrinkage methods.
Keywords: cluster-based; forecast; macroeconomics; sliced inverse regression
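The cluster-based construction is specific to the paper, but the regularization step it relies on is easy to illustrate: add a ridge term to the predictor covariance so that highly correlated variables do not destabilize the inverse. The sketch below shows only that generic regularized SIR step, assuming a simple ridge penalty of our own choosing; it is not the authors' cluster-based procedure.

```python
import numpy as np
from scipy.linalg import eigh

def regularized_sir(X, y, n_slices=10, n_dir=1, ridge=0.1):
    """Slice-based SIR with a ridge-regularized predictor covariance."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    Sigma = np.cov(Xc, rowvar=False) + ridge * np.eye(p)   # ridge term stabilizes the inverse
    vals, vecs = eigh(M, Sigma)
    return vecs[:, np.argsort(vals)[::-1][:n_dir]]
```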
4. On spline approximation of sliced inverse regression
Authors: Li-ping ZHU, Zhou YU. 《Science China Mathematics》 (SCIE), 2007, Issue 9, pp. 1289-1302 (14 pages).
Dimension reduction is helpful and often necessary in exploring the nonparametric regression structure. In this area, sliced inverse regression (SIR) is a promising tool to estimate the central dimension reduction (CDR) space. To estimate the kernel matrix of SIR, we herein suggest a spline approximation using least squares regression. Heteroscedasticity can be incorporated well by introducing an appropriate weight function. Root-n asymptotic normality can be achieved for a wide range of knot choices. This is essentially analogous to kernel estimation. Moreover, we also propose a modified Bayes information criterion (BIC) based on the eigenvalues of the SIR matrix. This modified BIC can be applied to any form of SIR and other related methods. The methodology and some practical issues are illustrated through the horse mussel data. Empirical studies demonstrate the performance of our proposed spline approximation through comparison with existing estimators.
Keywords: asymptotic normality; spline; Bayes information criterion; dimension reduction; sliced inverse regression; structural dimensionality; 62H12; 62J02
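A minimal sketch of the spline approximation described above: regress each centered predictor on a spline basis in y by least squares, use the fitted values as an estimate of the inverse regression curve E[x|y], and take their covariance as the SIR kernel. A simple linear truncated-power basis is used purely for illustration; the paper's knot placement, basis order, and heteroscedasticity weighting are not reproduced, and the name spline_sir is ours.

```python
import numpy as np
from scipy.linalg import eigh

def spline_sir(X, y, n_knots=5, n_dir=1):
    """SIR with the inverse regression curve estimated by a least-squares spline fit."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    knots = np.quantile(y, np.linspace(0, 1, n_knots + 2)[1:-1])    # interior knots
    B = np.column_stack([np.ones(n), y] +
                        [np.clip(y - k, 0, None) for k in knots])   # linear spline basis in y
    coef, *_ = np.linalg.lstsq(B, Xc, rcond=None)   # least-squares fit of E[x | y], column by column
    Xhat = B @ coef                                 # fitted inverse regression curve
    M = (Xhat.T @ Xhat) / n                         # spline-based SIR kernel matrix
    vals, vecs = eigh(M, np.cov(Xc, rowvar=False))
    return vecs[:, np.argsort(vals)[::-1][:n_dir]]
```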
5. INVERSE REGRESSION METHOD IN DATA STRUCTURE ANALYSIS
Authors: 朱力行, 安鸿志. 《Acta Mathematicae Applicatae Sinica》 (SCIE, CSCD), 1991, Issue 4, pp. 344-353 (10 pages).
In order to explore the nonlinear structure hidden in high-dimensional data, some dimension reduction techniques have been developed, such as the projection pursuit (PP) technique. However, PP involves an enormous computational load. To overcome this, an inverse regression method is proposed. In this paper, we discuss and develop this method. To seek the interesting projective direction, the minimization of the residual sum of squares is used as a criterion, and spline functions are applied to approximate the general nonlinear transform function. The algorithm is simple and saves computational load. Under certain proper conditions, consistency of the estimators of the interesting direction is shown.
Keywords: inverse regression method; data structure analysis
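Read literally, the criterion above scores a candidate direction by the residual sum of squares of a spline fit of the response on the projected predictors and minimizes over directions. The sketch below implements that literal reading for a single direction with an off-the-shelf optimizer; the paper's actual inverse-regression formulation, its algorithm, and its consistency conditions may differ, and all names and tuning choices here are ours.

```python
import numpy as np
from scipy.optimize import minimize

def rss_of_direction(b, X, y, n_knots=4):
    """RSS of a linear-spline fit of y on the projection X @ b (b rescaled to unit length)."""
    b = b / np.linalg.norm(b)
    t = X @ b
    knots = np.quantile(t, np.linspace(0, 1, n_knots + 2)[1:-1])
    B = np.column_stack([np.ones(len(t)), t] +
                        [np.clip(t - k, 0, None) for k in knots])   # spline basis in the projection
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return np.sum((y - B @ coef) ** 2)

def best_direction(X, y, seed=0):
    """Minimize the RSS criterion over directions from a random start."""
    rng = np.random.default_rng(seed)
    res = minimize(rss_of_direction, rng.normal(size=X.shape[1]),
                   args=(X, y), method="Nelder-Mead")
    return res.x / np.linalg.norm(res.x)

# Toy single-index example.
rng = np.random.default_rng(4)
X = rng.normal(size=(300, 4))
y = np.tanh(X @ np.array([1.0, 0.5, 0.0, 0.0])) + 0.1 * rng.normal(size=300)
print(np.round(best_direction(X, y), 2))
```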
6. Federated Sufficient Dimension Reduction Through High-Dimensional Sparse Sliced Inverse Regression
Authors: Wenquan Cui, Yue Zhao, Jianjun Xu, Haoyang Cheng. 《Communications in Mathematics and Statistics》, 2025, Issue 3, pp. 719-756 (38 pages).
Federated learning has become a popular tool in the big data era. It trains a centralized model based on data from different clients while keeping the data decentralized. In this paper, we propose a federated sparse sliced inverse regression algorithm for the first time. Our method can simultaneously estimate the central dimension reduction subspace and perform variable selection in a federated setting. We transform this federated high-dimensional sparse sliced inverse regression problem into a convex optimization problem by constructing the covariance matrix safely and losslessly. We then use a linearized alternating direction method of multipliers algorithm to estimate the central subspace. We also provide Bayesian information criterion and holdout validation approaches to determine the dimension of the central subspace and the hyperparameter of the algorithm. We establish an upper bound on the statistical error rate of our estimator under the heterogeneous setting. We demonstrate the effectiveness of our method through simulations and real-world applications.
Keywords: federated learning; sliced inverse regression; sufficient dimension reduction; variable selection
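To make the federated construction concrete, the sketch below shows only the naive arithmetic by which client-level sums, cross-products, and slice sums (computed against shared slice boundaries) can be combined at a server into the pooled covariance and slice means without moving raw rows. The paper's "safe and lossless" construction, its privacy guarantees, the sparsity penalty, and the linearized ADMM solver are not reproduced; all function names here are hypothetical.

```python
import numpy as np

def local_stats(X, y, boundaries):
    """Per-client sufficient statistics: sample size, sums, cross-products, slice counts and sums."""
    slice_id = np.searchsorted(boundaries, y)            # slice membership from shared boundaries
    n_slices = len(boundaries) + 1
    counts = np.zeros(n_slices)
    sums = np.zeros((n_slices, X.shape[1]))
    for h in range(n_slices):
        mask = slice_id == h
        counts[h] = mask.sum()
        sums[h] = X[mask].sum(axis=0)
    return len(y), X.sum(axis=0), X.T @ X, counts, sums

def server_combine(stats):
    """Combine client statistics into the pooled covariance and centered slice means."""
    n = sum(s[0] for s in stats)
    sx = sum(s[1] for s in stats)
    sxx = sum(s[2] for s in stats)
    counts = sum(s[3] for s in stats)
    ssums = sum(s[4] for s in stats)
    mean = sx / n
    Sigma = sxx / n - np.outer(mean, mean)               # pooled predictor covariance
    slice_means = ssums / counts[:, None] - mean          # centered slice means of E[X | y]
    return Sigma, slice_means, counts / n

# Toy run with two "clients" holding disjoint rows of the same data set.
rng = np.random.default_rng(2)
X, y = rng.normal(size=(400, 4)), rng.normal(size=400)
boundaries = np.quantile(y, [0.25, 0.5, 0.75])
Sigma, slice_means, weights = server_combine(
    [local_stats(X[:200], y[:200], boundaries),
     local_stats(X[200:], y[200:], boundaries)])
print(Sigma.shape, slice_means.shape, weights)
```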
7. Recent progress on laser-induced breakdown spectroscopy for the monitoring of coal quality and unburned carbon in fly ash (cited: 3)
Authors: 张雷, 胡志裕, 尹王保, 黄丹, 马维光, 董磊, 武红鹏, 李志新, 肖连团, 贾锁堂. 《Frontiers of Physics》 (SCIE, CSCD), 2012, Issue 6, pp. 690-700 (11 pages).
We review our recent progress on the development of laser-induced breakdown spectroscopy (LIBS) based equipment for on-line monitoring of pulverized coal and the unburned carbon (UC) level of fly ash. A fully software-controlled LIBS instrument comprising a self-cleaning device has been developed for on-line coal quality monitoring in power plants. The system features an automated sampling device and is capable of elemental analysis (C, Ca, Mg, Ti, Si, H, Al, Fe, S, and organic oxygen) and proximate analysis (Qad and Aad) through optimal data processing methods. An automated prototype LIBS apparatus has also been developed for possible application in power plants for on-line analysis of the UC level in fly ash. New data processing methods are proposed to correct spectral interference and matrix effects, with the accuracy of UC level analysis estimated to be 0.26%.
Keywords: laser-induced breakdown spectroscopy (LIBS); on-line coal quality analysis; organic oxygen; proximate analysis; unburned carbon; multivariate inverse regression
8. THE STUDY OF RETRIEVAL THEORY AND METHODS FROM SATELLITE REMOTE SENSING FOR METEOROLOGICAL PARAMETERS OVER EASTERN ASIA - PART I: ISPRM AND SRRM (cited: 3)
Authors: 黎光清, 张文建, 董超华, 张凤英, 张丽霞, 冉茂农, 罗东风, 王保华. 《Acta Meteorologica Sinica》 (SCIE), 2000, Issue 3, pp. 257-267 (11 pages).
A review of ten years' practice in developing the improved simultaneous physical retrieval method (ISPRM) is given in the hope that some creative ideas can be drawn from it. The improvement upon the SPRM is associated with the under-determinedness of this ill-posed inverse problem. In our experiments, the precondition is observed that prior information must be independent of the satellite measurements. Well-posed retrieval theory tells us that the forward process is fundamental for the retrieval, serving as the bridge between the input satellite radiances and the output retrievals. In order to obtain a better result from the forward process, full advantage must be taken of all available prior information. It is necessary to turn the ill-posed inverse problem into a well-posed one. Then, by using ridge regression or a Bayes algorithm to find the optimal combination among the first guess, the theoretical analogue information, and the satellite observations, the impact of the under-determinedness of this inverse problem on the numerical solution is minimized.
Keywords: simultaneous physical retrieval model (SPRM); statistical regression retrieval model (SRRM); under-determinedness of ill-posed inverse problem; prior information; well-posed inverse theory; verification
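The combination of a first guess with satellite observations through ridge regression can be illustrated with the textbook ridge (Tikhonov) update for a linearized forward model y ≈ Kx. The sketch below is only that generic step; the ISPRM's actual forward model, its use of theoretical analogue information, and its iteration scheme are not represented, and the penalty gamma is an illustrative tuning constant.

```python
import numpy as np

def ridge_retrieval(K, y_obs, x_first_guess, gamma=1.0):
    """One ridge-regularized update of the retrieved state around a first guess."""
    resid = y_obs - K @ x_first_guess                    # departure of observations from the guess
    A = K.T @ K + gamma * np.eye(K.shape[1])             # regularized normal-equation matrix
    return x_first_guess + np.linalg.solve(A, K.T @ resid)

# Toy ill-posed problem: more unknowns than noisy measurements.
rng = np.random.default_rng(3)
K = rng.normal(size=(8, 20))
x_true = np.linspace(0.0, 1.0, 20)
y_obs = K @ x_true + 0.05 * rng.normal(size=8)
x_first_guess = np.full(20, 0.5)                          # crude prior state
print(np.round(ridge_retrieval(K, y_obs, x_first_guess), 2))
```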
9. Asymptotics for Kernel Estimation of Slicing Average Third-Moment Estimation
Authors: Li-ping Zhu, Li-xing Zhu. 《Acta Mathematicae Applicatae Sinica》 (SCIE, CSCD), 2006, Issue 1, pp. 103-114 (12 pages).
To estimate the central dimension-reduction space in multivariate nonparametric regression, Sliced Inverse Regression (SIR), Sliced Average Variance Estimation (SAVE), and Slicing Average Third-moment Estimation (SAT) have been developed. Since slicing estimation has very different asymptotic behavior for SIR and SAVE, the relevant studies have been made case by case, whereas the kernel estimators of SIR and SAVE share similar asymptotic properties. In this paper, we also investigate kernel estimation of SAT. We prove asymptotic normality and show that, compared with the existing results, kernel smoothing for SIR, SAVE, and SAT has very similar asymptotic behavior.
Keywords: asymptotic normality; bandwidth selection; dimension reduction; inverse regression method; kernel estimation
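The kernel-smoothing alternative to slicing is easy to sketch for the SIR piece: estimate E[X | Y = y_i] by Nadaraya-Watson smoothing and plug the smoothed values into the usual kernel matrix. The SAVE and SAT analogues use second- and third-order inverse moments and are not shown; the Gaussian kernel and rule-of-thumb bandwidth below are illustrative choices, not the paper's.

```python
import numpy as np
from scipy.linalg import eigh

def kernel_sir(X, y, bandwidth=None, n_dir=1):
    """SIR with E[X | y] estimated by Nadaraya-Watson kernel smoothing instead of slicing."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    h = bandwidth if bandwidth is not None else 1.06 * y.std() * n ** (-1 / 5)
    W = np.exp(-0.5 * ((y[:, None] - y[None, :]) / h) ** 2)
    W /= W.sum(axis=1, keepdims=True)                     # Nadaraya-Watson weights
    Xhat = W @ Xc                                         # smoothed inverse regression curve at each y_i
    M = (Xhat.T @ Xhat) / n                               # kernel-based SIR matrix
    vals, vecs = eigh(M, np.cov(Xc, rowvar=False))
    return vecs[:, np.argsort(vals)[::-1][:n_dir]]
```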
10. Dimension reduction based on weighted variance estimate
Authors: ZHAO JunLong, XU XingZhong. 《Science China Mathematics》 (SCIE), 2009, Issue 3, pp. 539-560 (22 pages).
In this paper, we propose a new estimate for dimension reduction, called the weighted variance estimate (WVE), which includes the Sliced Average Variance Estimate (SAVE) as a special case. A bootstrap method is used to select the best estimate from the WVE and to estimate the structure dimension, and this selected estimate usually performs better than existing methods such as Sliced Inverse Regression (SIR) and SAVE. Many methods, such as SIR and SAVE, put the same weight on each observation when estimating the central subspace (CS). By introducing a weight function, WVE puts different weights on different observations according to their distance from the CS. The weight function gives WVE very good performance in general and complicated situations, for example, when the distribution of the regressor deviates severely from the elliptical distribution on which many methods, such as SIR, are based. Compared with many existing methods, WVE is insensitive to the distribution of the regressor. The consistency of the WVE is established. Simulations comparing the performance of WVE with other existing methods confirm its advantage.
Keywords: central subspace; contour regression; sliced average variance estimate; sliced inverse regression; sufficient dimension reduction; weight function
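Since WVE reduces to SAVE when every observation receives the same weight, the sketch below shows that unweighted SAVE special case: standardize the predictors, average the squared deviations of within-slice covariances from the identity, and take the leading eigenvectors. The weight function, the bootstrap choice of the best estimate, and the structure-dimension estimation in the paper are not reproduced.

```python
import numpy as np
from numpy.linalg import eigh, inv
from scipy.linalg import sqrtm

def save_directions(X, y, n_slices=5, n_dir=1):
    """Sliced Average Variance Estimation, the equal-weight special case of WVE."""
    n, p = X.shape
    Sigma_inv_sqrt = inv(sqrtm(np.cov(X, rowvar=False))).real
    Z = (X - X.mean(axis=0)) @ Sigma_inv_sqrt            # standardized predictors
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        D = np.eye(p) - np.cov(Z[idx], rowvar=False)     # identity minus within-slice covariance
        M += (len(idx) / n) * (D @ D)
    vals, vecs = eigh(M)
    B = Sigma_inv_sqrt @ vecs[:, np.argsort(vals)[::-1][:n_dir]]
    return B / np.linalg.norm(B, axis=0)                 # directions back on the original X scale
```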