DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells; the main challenge it currently faces is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. For the data, 1176 genes from a laboratory mouse gene expression dataset were chosen under two experimental conditions, pneumococcal infection and no infection, and a mixed-effects model was constructed. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed, with GSEA used to biologically validate the preliminary results. The final dataset consists of 20 groups of gene expression data from pneumococcal infection; the model categorizes functionally related genes by the similarity of their expression profiles, facilitating the study of genes with unknown functions.
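The abstract does not spell out the permutation procedure; the following is a minimal sketch of a gene-wise label-permutation test for a two-condition design (all names, counts, and the mean-difference statistic are illustrative assumptions, not the paper's method).

```python
# Minimal sketch of a gene-wise permutation test for differential expression
# between two conditions (infected vs. control); data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_inf, n_ctl = 1176, 10, 10
expr = rng.normal(size=(n_genes, n_inf + n_ctl))
expr[:50, :n_inf] += 1.0                      # 50 truly differential genes
labels = np.array([1] * n_inf + [0] * n_ctl)

def stat(e, lab):
    # mean-difference statistic per gene
    return e[:, lab == 1].mean(axis=1) - e[:, lab == 0].mean(axis=1)

obs = stat(expr, labels)
n_perm = 1000
exceed = np.zeros(n_genes)
for _ in range(n_perm):
    perm = rng.permutation(labels)            # shuffle condition labels
    exceed += np.abs(stat(expr, perm)) >= np.abs(obs)
p_values = (exceed + 1) / (n_perm + 1)        # permutation p-values
print((p_values < 0.05).sum(), "genes flagged at p < 0.05")
```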
A comprehensive numerical investigation into mixed-mode delamination is presented in this study. It aims to assess the impact of thermal and friction effects through mixed-mode flexure crack propagation testing. Finite element analysis was employed to model the delamination process, incorporating a contact cohesive zone model. This model couples the traction-separation law, the contact law, and the Coulomb friction law simultaneously. The thermomechanical analysis is performed using a sequentially coupled approach, implemented in the finite element software ABAQUS. The findings highlight how thermal and friction effects influence mixed-mode delamination behavior.
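The abstract does not state which mixed-mode fracture criterion is used; for context only, a widely used choice in contact cohesive zone models is the Benzeggagh-Kenane (B-K) law for the mixed-mode critical energy release rate:

```latex
% Benzeggagh-Kenane mixed-mode criterion (shown for context; the paper's
% actual criterion and the exponent \eta are not given in the abstract)
G_c \;=\; G_{Ic} + \left(G_{IIc} - G_{Ic}\right)
          \left(\frac{G_{II}}{G_{I} + G_{II}}\right)^{\eta}
```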
A model of tree mortality across diameter classes is a useful tool for predicting changes in stand structure. Mortality data commonly contain a large fraction of zeros, so general discrete models show larger errors. Based on the traditional Poisson model and the negative binomial model, different forms of zero-inflated and hurdle models were applied to spruce-fir mixed forest data to simulate the number of dead trees. Comparing residuals and Vuong test statistics, the zero-inflated negative binomial model performed best. A random effect was added to improve model accuracy; however, the mixed-effects zero-inflated model did not show increased advantages. According to the model principle, the zero-inflated negative binomial model was the most suitable, indicating that the "0" events in this study, mainly from the sample "0", i.e., the zero mortality data, are largely due to the limitations of the experimental design and sample selection. These results also show that the number of dead trees in a diameter class is positively correlated with the number of trees in that class and the mean stand diameter, and inversely related to class size and to the slope and aspect of the site. Funding: supported by the "948" Project of the State Forestry Administration of China (No. 2013-4-66).
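A minimal sketch of a zero-inflated negative binomial fit of the kind the abstract compares, using statsmodels' ZeroInflatedNegativeBinomialP on synthetic data (the predictors below are illustrative stand-ins for the stand variables named in the abstract, not the paper's actual design).

```python
# Zero-inflated negative binomial regression for dead-tree counts (sketch).
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(0)
n = 500
X = sm.add_constant(rng.normal(size=(n, 3)))       # count-model design
X_infl = sm.add_constant(rng.normal(size=(n, 1)))  # zero-inflation design
mu = np.exp(X @ np.array([0.2, 0.5, -0.3, 0.1]))
y = rng.poisson(mu) * (rng.uniform(size=n) > 0.4)  # counts with excess zeros

model = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X_infl, p=2)
res = model.fit(maxiter=200, disp=False)
print(res.summary())
```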
Korean larch (Larix olgensis) is one of the main tree species for afforestation and timber production in northeast China. However, its timber quality and growth ability are largely influenced by crown size, structure and shape. The majority of crown models are static models based on tree size and stand characteristics from temporary sample plots, but crown dynamic models have seldom been constructed. Therefore, this study aimed to develop height to crown base (HCB) and crown length (CL) dynamic models using the branch mortality technique for a Korean larch plantation. A nonlinear mixed-effects model with random effects, variance functions and correlation structures was used to build the HCB and CL dynamic models. The data were obtained from 95 sample trees in 19 plots of the Meng JiaGang forest farm in Northeast China. The results showed that HCB increases progressively with tree age, tree height (HT) growth and diameter at breast height (DBH) growth. CL increased with tree age during the first 20 years and subsequently stabilized. HT growth, DBH growth, stand basal area (BAS) and crown competition factor (CCF) significantly influenced HCB and CL. HCB was positively correlated with BAS, HT growth and DBH growth, but negatively correlated with CCF. CL was positively correlated with BAS and CCF, but negatively correlated with DBH growth. Model fitting and validation confirmed that the mixed-effects model considering stand- and tree-level random effects was accurate and reliable for predicting HCB and CL dynamics. However, the models that added variance functions and a time-series correlation structure could not completely remove heterogeneity and autocorrelation, and their fitting precision was reduced. Therefore, from the point of view of application, over-complex models should be avoided. The HCB and CL dynamic models in this study may also be incorporated into stand growth and yield model systems in China. Funding: supported by the National Key Research and Development Program of China (2017YFD0600401) and the Fundamental Research Funds for the Central Universities (2572019CP08).
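The abstract does not give the model equations, so the sketch below is only a crude two-stage stand-in for a nonlinear mixed-effects HCB model: a saturating HCB-age curve (an assumed form) fitted to the population, with tree-level random intercepts read off the residuals. Full NLME estimation, variance functions and correlation structures are omitted.

```python
# Two-stage stand-in for a nonlinear mixed-effects HCB-age model (sketch).
import numpy as np
from scipy.optimize import curve_fit

def hcb_curve(age, a, b, c):
    # monotone saturating curve: HCB rises with age toward asymptote a
    return a * (1.0 - np.exp(-b * age)) ** c

rng = np.random.default_rng(1)
ages = np.tile(np.arange(5, 41, 5), 10)          # 8 measurements x 10 trees
tree = np.repeat(np.arange(10), 8)
u = rng.normal(0, 0.8, size=10)                  # tree-level random effect
hcb = hcb_curve(ages, 12, 0.06, 1.4) + u[tree] + rng.normal(0, 0.3, ages.size)

# Stage 1: population curve
pop, _ = curve_fit(hcb_curve, ages, hcb, p0=[10, 0.05, 1.0])
# Stage 2: tree-level random intercepts as mean residuals per tree
resid = hcb - hcb_curve(ages, *pop)
rand_int = {t: resid[tree == t].mean() for t in np.unique(tree)}
print(pop, np.var(list(rand_int.values())))
```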
Based on the empirical likelihood method and the QR decomposition technique, an orthogonality-based empirical likelihood estimation method for the fixed effects in linear mixed-effects models is proposed. Under some regularity conditions, the proposed empirical log-likelihood ratio is proved to be asymptotically chi-squared, and confidence intervals for the fixed effects are then constructed. The proposed estimation procedure is not affected by the random effects, so the resulting estimator is more efficient. Some simulations and a real data application further illustrate the performance of the proposed method. Funding: supported by the National Social Science Foundation of China (Grant No. 18BTJ035).
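A minimal numeric sketch of the QR device the abstract refers to: decompose the random-effects design Z and work in the orthogonal complement of its column space, where the random effects vanish. Here ordinary least squares stands in for the empirical-likelihood step, and all dimensions are illustrative.

```python
# Project the mixed model y = X b + Z u + e onto the orthogonal complement
# of Z, so the random effects drop out before estimating the fixed effects.
import numpy as np

rng = np.random.default_rng(2)
n, p, q = 200, 3, 10
X = rng.normal(size=(n, p))
Z = np.kron(np.eye(q), np.ones((n // q, 1)))   # group-indicator RE design
beta = np.array([1.0, -0.5, 2.0])
y = X @ beta + Z @ rng.normal(0, 1.5, q) + rng.normal(0, 0.5, n)

Q, _ = np.linalg.qr(Z, mode="complete")
Q2 = Q[:, q:]                  # columns orthogonal to the range of Z
y_t, X_t = Q2.T @ y, Q2.T @ X  # transformed model is free of random effects
beta_hat = np.linalg.lstsq(X_t, y_t, rcond=None)[0]
print(beta_hat)
```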
It is well known that viral load of the hepatitis C virus (HCV) is related to the efficacy of interferon therapy. The complex biological parameters that impact viral load are essentially unknown, and current knowledge of the hepatitis C virus does not provide a mathematical model for viral load dynamics within untreated patients. We carried out empirical modelling to investigate whether different fluctuation patterns exist and how these patterns (if they exist) are related to host-specific factors. Data were prospectively collected from 147 untreated patients chronically infected with hepatitis C, each contributing between 2 and 10 years of measurements. We propose a three-parameter logistic model to describe the overall pattern of viral load fluctuation, based on an exploratory analysis of the data. To incorporate the correlation structure of longitudinal data and patient-to-patient variation, we introduced random-effects components into the model. On the basis of this nonlinear mixed-effects modelling, we investigated the effects of host-specific factors on viral load fluctuation by incorporating covariates into the model. The proposed model provided a good fit for describing fluctuations of viral load measured with varying frequency over different time intervals. The average viral load growth time was significantly different between infection sources, and there was large patient-to-patient variation in the viral load asymptote.
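A minimal sketch of the three-parameter logistic curve the abstract proposes, fitted to one illustrative patient series; the random-effects layer of the full NLME model is omitted, and the parameter names and units are assumptions.

```python
# Three-parameter logistic viral-load curve for a single patient (sketch).
import numpy as np
from scipy.optimize import curve_fit

def logistic3(t, asym, xmid, scal):
    # asym: viral-load asymptote; xmid: time of half-asymptote;
    # scal: growth time scale
    return asym / (1.0 + np.exp((xmid - t) / scal))

t = np.linspace(0, 8, 20)                      # years of follow-up
rng = np.random.default_rng(3)
y = logistic3(t, 6.5, 2.0, 0.9) + rng.normal(0, 0.15, t.size)  # log10 scale

params, _ = curve_fit(logistic3, t, y, p0=[6, 2, 1])
print(dict(zip(["asymptote", "midpoint", "scale"], params)))
```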
The effect of tree age and climatic variables on stem radial growth of two hybrid clones of Eucalyptus was determined using longitudinal data from eastern South Africa. Stem radius, measured weekly, was the response variable. In addition to tree age, average weekly temperature, solar radiation, relative humidity and wind speed were recorded simultaneously with total rainfall at the site. An additive mixed-effects model incorporating a non-parametric smooth function was used. The results indicate that the relationship between stem radius and each of the covariates can be explained by nonlinear functions. Models accounting for the effects of clone and season, together with their interaction, in the parametric part of the additive mixed model were also fitted; the clone-by-season interaction was not significant in all cases. To analyze the joint effect of all the covariates, additive mixed models including two or more covariates were fitted. A significant effect of tree age was found in all cases. Although tree age was the key determinant of stem radial growth, weather variables also had a significant effect that depended on season.
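A sketch of the additive (smooth-term) part of such a model using statsmodels' GAM API with penalized B-splines; the random-effect layer and the clone/season parametric terms are omitted, and all variable names and data are illustrative assumptions.

```python
# Additive model: stem radius as smooth functions of age and temperature.
import numpy as np
from statsmodels.gam.api import GLMGam, BSplines

rng = np.random.default_rng(4)
n = 300
age = rng.uniform(1, 10, n)                    # tree age (years)
temp = rng.uniform(10, 30, n)                  # weekly mean temperature (C)
radius = (20 + 3 * np.log(age) + 0.02 * (temp - 20) ** 2
          + rng.normal(0, 0.5, n))

smoother = BSplines(np.column_stack([age, temp]), df=[8, 8], degree=[3, 3])
gam = GLMGam(radius, exog=np.ones((n, 1)),     # explicit intercept
             smoother=smoother, alpha=[1.0, 1.0])
print(gam.fit().summary())
```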
The Cox proportional hazards model is used extensively in oncology to study the relationship between survival times and prognostic factors. The main question regarding the applicability of the Cox PH model is whether the proportional hazards assumption is met; failure to verify this assumption will lead to misleading results. In addition, identifying the correct functional form of continuous covariates is an important aspect of developing a Cox proportional hazards model. The purpose of this study is to develop an extended Cox regression model for breast cancer survival data that takes into consideration the non-proportional hazards and non-linear effects that exist in prognostic factors. Non-proportional hazards and non-linear effects are detected using residual-based methods. An extended Cox model with non-linear and time-varying effects is proposed to adjust the Cox proportional hazards model. Age and tumor size were found to have nonlinear effects, while progesterone receptor assay status and age violated the proportional hazards assumption; the quadratic effect of age and progesterone receptor assay status had hazard ratios that change with time. We have introduced a statistical model to overcome violations of the proportional hazards assumption in the Cox model for breast cancer data. The proposed extended model considers the time-varying nature of the hazard ratio and the non-linear effects of the covariates, and gives better insight into the hazard rates associated with breast cancer risk factors.
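A sketch in the spirit of the extended model described, using lifelines' CoxTimeVaryingFitter on long-format (start, stop] data so that a covariate effect can differ across follow-up; the paper's actual covariates, spline forms, and data are not reproduced, and the toy data below are assumptions.

```python
# Time-varying Cox fit on long-format survival data (sketch).
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(5)
rows = []
for i in range(60):
    age = rng.uniform(35, 75)
    t_event = rng.exponential(40)
    # split follow-up at t = 12 so the age effect may differ afterwards
    for start, stop in [(0.0, min(12.0, t_event)), (12.0, t_event)]:
        if stop <= start:
            continue
        rows.append({"id": i, "start": start, "stop": stop,
                     "event": int(stop == t_event),
                     "age": age,
                     "age_x_late": age * (start >= 12.0)})
df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()
```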
We calculate the new physics contributions to the neutral B_d^0 and B_s^0 meson mass splittings ΔM_d and ΔM_s induced by the box diagrams involving the charged Higgs bosons in the top quark two-Higgs doublet model (T2HDM). Using the precision data, we obtain bounds on the parameter space of the T2HDM: (a) for fixed M_H = 400 GeV and δ ∈ [0°, 60°], the upper bound on tan β is tan β ≤ 30 after inclusion of the major theoretical uncertainties; (b) for tan β ≤ 20, a light charged Higgs boson with a mass around 300 GeV is allowed; and (c) the bounds on tan β and M_H are strongly correlated: a smaller (larger) tan β means a lighter (heavier) charged Higgs boson. Funding: partly supported by the National Natural Science Foundation of China under Grant No. 10575052 and the Specialized Research Fund for the Doctoral Program of Higher Education (SRFDP) under Grant No. 20050319008. One of the authors, Lin-Xia Lü, thanks Prof. C.S. Huang for his valuable help.
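For context (a standard result, not taken from this paper): in the Standard Model the W-box contribution to the neutral-B mass splitting has the form below; the charged-Higgs boxes of the T2HDM add analogous terms with H± replacing W± in the loop.

```latex
% Standard-model box-diagram mass splitting (context for the T2HDM analysis);
% \hat{B}_{B_q} and f_{B_q} are the bag parameter and decay constant,
% S_0 the Inami-Lim function, \eta_B the QCD correction factor.
\Delta M_q = \frac{G_F^2}{6\pi^2}\,\eta_B\, m_{B_q} f_{B_q}^2 \hat{B}_{B_q}\,
             M_W^2\, S_0(x_t)\, \left|V_{tq}^{*} V_{tb}\right|^2 ,
\qquad x_t = \frac{\overline{m}_t^2}{M_W^2}
```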
A scale-similarity model of a two-point two-time Lagrangian velocity correlation (LVC) was originally developed for the relative dispersion of tracer particles in isotropic turbulent flows (HE, G. W., JIN, G. D., and ZHAO, X. Scale-similarity model for Lagrangian velocity correlations in isotropic and stationary turbulence. Physical Review E, 80, 066313 (2009)). The model expresses the LVC in terms of a two-point Eulerian space correlation and a dispersion velocity V, which denotes the rate at which one moving particle departs from another, fixed particle. This paper numerically validates the robustness of the scale-similarity model at Taylor micro-scale Reynolds numbers up to 373, much higher than the original values (R_λ = 66, 102). The effect of the Reynolds number on the dispersion velocity in the scale-similarity model is carefully investigated. The results show that the scale-similarity model is more accurate at higher Reynolds numbers, because the two-point Lagrangian velocity correlations with different initial spatial separations collapse into a universal form when compared against a combination of the initial separation and the temporal separation via the dispersion velocity. Moreover, the dispersion velocity V normalized by the Kolmogorov velocity V_η ≡ η/τ_η, where η and τ_η are the Kolmogorov space and time scales, scales with the Reynolds number as V/V_η ∝ R_λ^1.39 in the numerical data. Funding: supported by the Science Challenge Program (No. TZ2016001), the National Natural Science Foundation of China (Nos. 11472277, 11572331, 11232011, and 11772337), the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB22040104), and the Key Research Program of Frontier Sciences, Chinese Academy of Sciences (No. QYZDJ-SSW-SYS002).
We use the Pair Approximation method to analyze the magnetic and magnetocaloric behavior of a diluted mixed spin S_A = 1 and spin S_B = 1/2 anisotropic Heisenberg model on a cubic lattice with coordination number z = 6. The system is described in the presence of an external magnetic field; the phase diagram and the thermodynamic properties related to the concentration of magnetic atoms (A or B) and the single-ion anisotropy are constructed and discussed. Special attention is paid to the magnetocaloric properties given by the isothermal entropy change and the cooling capacity. These key cooling-power quantities are plotted and discussed as functions of the interaction anisotropy and the magnetic component concentrations of the two sublattices of A and B ions. Numerical results show a double-peak structure in the entropy change curve and an inverse magnetocaloric effect related to the presence of negative single-ion anisotropy.
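For reference (standard magnetocaloric definitions, not specific to this paper): the isothermal entropy change is obtained from the magnetization via the Maxwell relation, and a common cooling-capacity measure is the relative cooling power:

```latex
% Maxwell-relation entropy change and relative cooling power
% (standard definitions, shown for context)
\Delta S_M(T, H_{\max}) = \int_{0}^{H_{\max}}
    \left(\frac{\partial M}{\partial T}\right)_{H} \mathrm{d}H ,
\qquad
\mathrm{RCP} = \left|\Delta S_M^{\max}\right| \times \delta T_{\mathrm{FWHM}}
```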
In this article, a robust generalized estimating equation for the analysis of partial linear mixed models for longitudinal data is used. The authors approximate the nonparametric function by a regression spline. Under some regularity conditions, the asymptotic properties of the estimators are obtained. To avoid the computation of a high-dimensional integral, a robust Monte Carlo Newton-Raphson algorithm is used. Simulations are carried out to study the performance of the proposed robust estimators, and the robustness and efficiency of the estimators are also examined by simulation. Finally, two real longitudinal data sets are analyzed. Funding: supported by the Natural Science Foundation of China (10371042, 10671038).
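A sketch of the non-robust core of this approach: a GEE fit for longitudinal data with the nonparametric trend approximated by a regression spline, via statsmodels and a patsy bs() basis. The robust weighting and Monte Carlo Newton-Raphson machinery of the paper are not reproduced; data and names are illustrative.

```python
# GEE with a regression-spline time trend for longitudinal data (sketch).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_subj, n_obs = 50, 6
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_obs),
    "time": np.tile(np.arange(n_obs), n_subj),
    "x": rng.normal(size=n_subj * n_obs),
})
b = rng.normal(0, 1, n_subj)            # subject effect -> within-subject correlation
df["y"] = (1 + 0.8 * df["x"] + np.sin(df["time"])
           + b[df["subject"]] + rng.normal(0, 0.5, len(df)))

# bs(time, df=4): B-spline basis for the nonparametric part
model = smf.gee("y ~ x + bs(time, df=4)", groups="subject", data=df,
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```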
It is well known that the spline smoothing estimator is related to the Bayesian estimate under a partially informative normal prior. In this paper, we derive conditions for the propriety of the posterior in the nonparametric mixed effects model under this class of partially informative normal priors for the fixed effects, with inverse gamma priors on the variance components and hierarchical priors for the covariance matrix of the random effects; we then explore the Gibbs sampling procedure. Funding: supported by the Natural Science Foundation of China (11201345, 11271136).
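A minimal sketch of the kind of Gibbs sampler such a setup leads to, for the simplest case y_ij = mu + b_i + e_ij with inverse-gamma priors on both variance components; the paper's nonparametric spline layer and hierarchical covariance priors are omitted, and all hyperparameters are illustrative.

```python
# Gibbs sampler for a one-way random-effects model with IG variance priors.
import numpy as np

rng = np.random.default_rng(7)
m, k = 20, 5                                   # groups, obs per group
b_true = rng.normal(0, 1.0, m)
y = 2.0 + np.repeat(b_true, k) + rng.normal(0, 0.5, m * k)
g = np.repeat(np.arange(m), k)

a0, c0 = 2.0, 1.0                              # IG(shape, scale) prior
mu, s2_b, s2_e = y.mean(), 1.0, 1.0
for it in range(2000):
    # b_i | rest : normal, shrunk toward 0
    ybar = np.array([y[g == i].mean() for i in range(m)])
    prec = k / s2_e + 1.0 / s2_b
    b = rng.normal((k / s2_e) * (ybar - mu) / prec, np.sqrt(1.0 / prec))
    # mu | rest (flat prior)
    mu = rng.normal((y - b[g]).mean(), np.sqrt(s2_e / (m * k)))
    # variance components | rest : inverse-gamma conjugate updates
    s2_b = 1.0 / rng.gamma(a0 + m / 2, 1.0 / (c0 + 0.5 * (b ** 2).sum()))
    resid = y - mu - b[g]
    s2_e = 1.0 / rng.gamma(a0 + y.size / 2,
                           1.0 / (c0 + 0.5 * (resid ** 2).sum()))
print(mu, s2_b, s2_e)
```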
Background: Novel models for the assessment of non-linear data are being developed to make better predictions from the data. Objective: To review traditional and modern models. Results and Conclusions: 1) Logit and probit transformations are often successfully used to mimic a linear model; logistic regression, Cox regression, Poisson regression, and Markov modeling are examples of logit transformation. 2) Either the x- or y-axis, or both, can be logarithmically transformed. Box-Cox transformation equations and the ACE (alternating conditional expectations) or AVAS (additive and variance stabilization for regression) packages are simple empirical methods that are often successful for linearly remodeling non-linear data. 3) Sinusoidal data can generally be modeled successfully using polynomial regression or Fourier analysis. 4) For exponential patterns such as plasma concentration-time relationships, exponential modeling with or without Laplace transformations is a possibility. Spline and Loess are computationally intensive modern methods, suitable for smoothing data patterns when the data plot leaves no idea of the relationship between the y- and x-values. There are no statistical tests to assess the goodness of fit of these methods, but it is always better than that of traditional models.
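A minimal sketch of one of the remodeling tools listed above, the Box-Cox transformation, with the power parameter estimated by maximum likelihood in scipy; the data are synthetic.

```python
# Box-Cox transformation for linearizing positively skewed data (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
y = rng.lognormal(mean=1.0, sigma=0.6, size=200)   # skewed outcome
y_bc, lam = stats.boxcox(y)                        # transformed data, fitted lambda
print(f"lambda = {lam:.3f}, skewness before/after: "
      f"{stats.skew(y):.2f} / {stats.skew(y_bc):.2f}")
```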
Today, Linear Mixed Models (LMMs) are fitted, mostly, by assuming that random effects and errors have Gaussian distributions, therefore using Maximum Likelihood (ML) or REML estimation. However, for many data sets, that double assumption is unlikely to hold, particularly for the random effects, a crucial component in which assessment of magnitude is key in such modeling. Alternative fitting methods not relying on that assumption (such as ANOVA methods and Rao's MINQUE) apply, quite often, only to the very constrained class of variance components models. In this paper, a new computationally feasible estimation methodology is designed, first for the widely used class of 2-level (or longitudinal) LMMs, with the only assumption (beyond the usual basic ones) that the residual errors are uncorrelated and homoscedastic, and with no distributional assumption imposed on the random effects. A major asset of this new approach is that it yields nonnegative variance estimates and covariance matrix estimates that are symmetric and, at least, positive semi-definite. Furthermore, it is shown that when the LMM is indeed Gaussian, this new methodology differs from ML only through a slight variation in the denominator of the residual variance estimate. The new methodology actually generalizes to LMMs a well-known nonparametric fitting procedure for standard Linear Models. Finally, the methodology is also extended to ANOVA LMMs, generalizing an old method by Henderson for ML estimation in such models under normality.
Adaptive fractional polynomial modeling of general correlated outcomes is formulated to address nonlinearity in means, variances/dispersions, and correlations. Means and variances/dispersions are modeled using generalized linear models in fixed effects/coefficients. Correlations are modeled using random effects/coefficients. Nonlinearity is addressed using power transforms of primary (untransformed) predictors. Parameter estimation is based on extended linear mixed modeling generalizing both generalized estimating equations and linear mixed modeling. Models are evaluated using likelihood cross-validation (LCV) scores and are generated adaptively using a heuristic search controlled by LCV scores. Cases covered include linear, Poisson, logistic, exponential, and discrete regression of correlated continuous, count/rate, dichotomous, positive continuous, and discrete numeric outcomes treated as normally, Poisson, Bernoulli, exponentially, and discrete numerically distributed, respectively. Example analyses are also generated for these five cases to compare adaptive random effects/coefficients modeling of correlated outcomes to previously developed adaptive modeling based on directly specified covariance structures. Adaptive random effects/coefficients modeling substantially outperforms direct covariance modeling in the linear, exponential, and discrete regression example analyses. It generates equivalent results in the logistic regression example analyses and it is substantially outperformed in the Poisson regression case. Random effects/coefficients modeling of correlated outcomes can provide substantial improvements in model selection compared to directly specified covariance modeling. However, directly specified covariance modeling can generate competitive or substantially better results in some cases while usually requiring less computation time.
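A simplified, single-predictor sketch of the adaptive idea: search a standard set of fractional-polynomial powers and score each candidate by held-out (cross-validated) log-likelihood, a stand-in for the LCV-controlled heuristic search described; the correlated-outcomes machinery is omitted and everything below is illustrative.

```python
# Fractional-polynomial power search scored by likelihood cross-validation.
import numpy as np

POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]      # standard FP power set

def transform(x, p):
    return np.log(x) if p == 0 else x ** p

def lcv_score(x, y, k=5):
    # mean held-out Gaussian log-likelihood under an OLS fit
    idx = np.arange(len(y)) % k
    ll = 0.0
    for f in range(k):
        tr, te = idx != f, idx == f
        X_tr = np.column_stack([np.ones(tr.sum()), x[tr]])
        beta = np.linalg.lstsq(X_tr, y[tr], rcond=None)[0]
        s2 = np.mean((y[tr] - X_tr @ beta) ** 2)
        pred = beta[0] + beta[1] * x[te]
        ll += np.sum(-0.5 * np.log(2 * np.pi * s2)
                     - (y[te] - pred) ** 2 / (2 * s2))
    return ll / len(y)

rng = np.random.default_rng(9)
x = rng.uniform(0.5, 5.0, 200)
y = 1.0 + 2.0 / x + rng.normal(0, 0.3, 200)    # true power: -1

best = max(POWERS, key=lambda p: lcv_score(transform(x, p), y))
print("selected power:", best)
```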