Funding: Supported by the National Natural Science Foundation of China (10271087)
Abstract: Some equivalent conditions on the classes of light-tailed, heavily heavy-tailed, and lightly heavy-tailed distribution functions (d.f.s) are introduced. The limit behavior of x^α F̄(x) and e^(λx) F̄(x) is discussed. Some properties of the subclasses DK_c and DK_1 are obtained.
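To make the two limits concrete, the following sketch (an illustration, not taken from the paper) evaluates x^α F̄(x) and e^(λx) F̄(x) for a Pareto (heavy) tail and an exponential (light) tail:

```python
import math

# Survival functions: Pareto(alpha = 2, scale 1) is heavy-tailed,
# Exp(rate 1) is light-tailed.
pareto_sf = lambda x: x ** -2.0     # F-bar(x) = x^(-2), x >= 1
exp_sf = lambda x: math.exp(-x)     # F-bar(x) = e^(-x)

# x^alpha * F-bar(x): identically 1 for this Pareto tail
vals_poly = [x ** 2.0 * pareto_sf(x) for x in (10, 100, 1000)]

# e^(lambda x) * F-bar(x) with lambda = 0.5: diverges for the heavy
# Pareto tail, vanishes for the light exponential tail
vals_exp_pareto = [math.exp(0.5 * x) * pareto_sf(x) for x in (10, 20, 40)]
vals_exp_light = [math.exp(0.5 * x) * exp_sf(x) for x in (10, 20, 40)]
```

The contrast between the two products is exactly what separates the heavy-tailed and light-tailed classes discussed in the abstract.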
Funding: Supported by the National Natural Science Foundation of China (10771163)
Abstract: For multivariate failure time data with auxiliary covariate information, an estimated pseudo-partial-likelihood estimator under the marginal hazard model with distinguishable baseline hazards has been proposed. However, the asymptotic properties of the corresponding estimated cumulative hazard function have not been studied. In this paper, based on counting-process martingales, we use the continuous mapping theorem and the Lenglart inequality to prove the consistency of the estimated cumulative hazard function in the estimated pseudo-partial-likelihood approach.
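For background, the classical Nelson-Aalen estimator of the cumulative hazard (the basic counting-process estimator underlying such consistency arguments; the paper's estimator additionally handles auxiliary covariates, which this sketch does not) can be written as:

```python
def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard estimate at each distinct event time.

    times  : observed (possibly right-censored) times
    events : 1 if the time is an observed failure, 0 if censored
    Returns a list of (time, H_hat) pairs.
    """
    data = sorted(zip(times, events))
    n = len(data)
    at_risk = n
    H = 0.0
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        d = sum(e for (s, e) in data if s == t)   # failures at t
        c = sum(1 for (s, e) in data if s == t)   # all leaving the risk set at t
        if d > 0:
            H += d / at_risk                      # increment d_i / n_i
            out.append((t, H))
        at_risk -= c
        i += c
    return out
```

With no censoring and failure times 1, 2, 3 the estimate accumulates 1/3, then 1/2, then 1.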
Funding: Supported by the National Natural Science Foundation of China (No. 11301084) and the Natural Science Foundation of Fujian Province (No. 2014J01010)
Abstract: Under some mild conditions, we derive the asymptotic normality of the Nadaraya-Watson and local linear estimators of the conditional hazard function for left-truncated and dependent data. The estimators were proposed by Liang and Ould-Saïd [1]. The results confirm the conjecture in Liang and Ould-Saïd [1].
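As a reminder of the building block behind the first estimator, a plain Nadaraya-Watson smoother (without the truncation weighting the paper requires, and with a Gaussian kernel assumed purely for illustration) looks like:

```python
import math

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson regression estimate at x0:
    m_hat(x0) = sum_i K_h(x0 - x_i) y_i / sum_i K_h(x0 - x_i),
    here with a Gaussian kernel of bandwidth h."""
    w = [math.exp(-0.5 * ((x0 - xi) / h) ** 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
```

Applied to noiseless pairs (x, x^2) on a grid, the estimate at an interior point recovers the regression function up to the usual O(h^2) smoothing bias.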
Abstract: In this paper, we define the Weibull kernel and use it for nonparametric estimation of the probability density function (pdf) and the hazard rate function for independent and identically distributed (iid) data. The bias, variance, and optimal bandwidth of the proposed estimator are derived, and its asymptotic normality is established. The performance of the proposed estimator is assessed through a simulation study and real data.
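A minimal sketch of a Weibull-kernel density estimator follows. The parameterization (shape 1/b, scale x/Γ(1+b), so that the kernel's mean sits at the evaluation point x) is one choice from the asymmetric-kernel literature and is assumed here; the paper's exact definition may differ.

```python
import math

def weibull_pdf(t, shape, scale):
    """Weibull density (k/s)(t/s)^(k-1) exp(-(t/s)^k) for t > 0."""
    if t <= 0:
        return 0.0
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-z ** shape)

def weibull_kde(x, data, b):
    """Weibull-kernel density estimate at x > 0:
    f_hat(x) = (1/n) sum_i K_{x,b}(X_i), where K_{x,b} is the Weibull
    density with shape 1/b and scale x / Gamma(1 + b) (assumed form)."""
    shape = 1.0 / b
    scale = x / math.gamma(1.0 + b)
    return sum(weibull_pdf(xi, shape, scale) for xi in data) / len(data)
```

Because the kernel's support is (0, ∞), the estimator avoids the boundary bias that symmetric kernels suffer near zero, which is the usual motivation for such kernels.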
Funding: Project of the Institute of Crustal Dynamics, China Earthquake Administration (ZDJ2007-1); Hundred Talents Program of the Chinese Academy of Sciences (99M2009M02); National Natural Science Foundation of China (40574022)
Abstract: A mature mathematical technique, the copula joint function, commonly used in financial risk analysis to estimate uncertainty, is introduced in this paper. The joint function is generalized to the n-dimensional Frank copula. In addition, we adopt two attenuation models, proposed by Yu and by Boore et al., respectively, and construct a two-dimensional copula joint probability function as an example to illustrate the treatment of uncertainty at low probabilities. The results show that the copula joint function gives a better prediction of peak ground motion than the simple linear weighting commonly used in the traditional logic-tree treatment of model uncertainties. In light of its widespread application in risk analysis, from financial investment to insurance assessment, we believe the copula-based technique has potential application in seismic hazard analysis.
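The two-dimensional case of the Frank copula family mentioned above can be written directly; the dependence parameter θ below is arbitrary, chosen only for illustration:

```python
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula
    C(u, v) = -(1/theta) ln(1 + (e^{-theta u} - 1)(e^{-theta v} - 1) / (e^{-theta} - 1)).
    theta > 0 gives positive dependence; theta -> 0 recovers independence."""
    if theta == 0.0:
        return u * v  # independence copula as the limiting case
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta
```

Given marginal probabilities from two attenuation models, C(u, v, θ) returns a joint probability; the copula satisfies the boundary conditions C(u, 1) = u and C(u, 0) = 0, and for θ > 0 exceeds the independence value uv.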
Funding: The National Natural Science Foundation of China (No. 50405021)
Abstract: In order to evaluate the reliability of long-lifetime products from degradation data, a new proportional hazard degradation model is proposed. Exploiting the similarity between time-degradation data and stress-accelerated lifetimes, and assuming that the failure rate function of the degradation data is proportional to the time covariate, reliability assessment based on the proportional hazard degradation model is realized. The least squares method is used to estimate the model's parameters. From the failure rate of the degradation data and the known proportion function of time, the failure rate and reliability function at a given time and a predetermined failure threshold can be extrapolated. A long-life GaAs laser is selected as a case study and its reliability is evaluated. The results show that the proposed method accurately describes the degradation process and is effective for the reliability assessment of long-lifetime products.
Abstract: This study provides a starting point for defining and working with Cox models in multivariate modeling. In medical research there may be situations where several risk factors potentially affect patient prognosis, yet only one or two predict the patient's outcome. To find out which risk factors contribute most to patients' survival times, researchers need to adjust for covariates to isolate each factor's impact. Aside from the multivariate nature of the covariates, some covariates may be categorical while others are quantitative. There are also cases where researchers need a model capable of extending survival analysis methods to assess simultaneously the effect of several risk factors on survival times. This study presents the Cox model as a robust technique that can accomplish the aforementioned cases. An investigation evaluating the ITN factor and its contribution to death due to malaria is used to exemplify the Cox model. Data were taken from hospitals in Ghana. We followed hospital in-patients with reported cases of malaria (origin state) to time until death or censoring (destination state), as a function of predictive factors (exposure to malaria parasites) and some socioeconomic variables. We used Cox models to quantify the effect of the ITN factor in the presence of other risk factors, obtaining measures of effect that describe the relationship between the exposure variable and time until death while adjusting for other variables. The proportional hazards assumption holds for all three covariates. Sex of patient was insignificant for deaths due to malaria; age of patient and ITN user status were both significant. The magnitude of the coefficient (0.384) of ITN user status indicates its high contribution to the variation in the dependent variable.
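The coefficient for a single covariate such as ITN user status is obtained by maximizing the Cox partial likelihood; a toy Newton-Raphson fit (one covariate, no tie correction; an illustrative sketch, not the authors' software) is:

```python
import math

def cox_beta(times, events, x, iters=25):
    """Newton-Raphson fit of a one-covariate Cox model.

    Maximizes the partial log-likelihood
      l(b) = sum_{i: event} [ b*x_i - log( sum_{j: t_j >= t_i} exp(b*x_j) ) ].
    """
    beta = 0.0
    n = len(times)
    for _ in range(iters):
        U = 0.0  # score (first derivative)
        I = 0.0  # observed information (negative second derivative)
        for i in range(n):
            if not events[i]:
                continue
            risk = [j for j in range(n) if times[j] >= times[i]]
            w = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * x[j] for wj, j in zip(w, risk))
            s2 = sum(wj * x[j] ** 2 for wj, j in zip(w, risk))
            U += x[i] - s1 / s0
            I += s2 / s0 - (s1 / s0) ** 2
        beta += U / I  # Newton step on the concave partial likelihood
    return beta
```

On data generated with a true log hazard ratio of 0.7 for a binary covariate, the fitted coefficient lands near 0.7, and exp(beta) is the estimated hazard ratio.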
Abstract: In this paper, based on randomly left-truncated and right-censored data, the authors derive strong representations of the cumulative hazard function estimator and the product-limit estimator of the survival function, which are valid up to a given order statistic of the observations. A precise bound for the errors is obtained which depends only on the index of the last order statistic to be included.
Abstract: Under the condition that the total distribution function is continuous and bounded on (−∞, ∞), we construct estimators for the distribution and hazard functions using the local polynomial method, and obtain the rate of strong convergence of these estimators.
Abstract: Let H_n(x) be a kernel-type estimator of the hazard function H(x) = f(x)/[1 − ∫_{−∞}^{x} f(t) dt], based on a random sample of size n from the density f(x). A sufficient condition is given for the central limit theorem to hold for the integrated squared error ∫ (H_n(x) − H(x))² ω(x) f(x) dx of the nonparametric hazard estimator, where ω(x) is a weight function.
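A concrete instance of such an estimator H_n(x) = f̂(x)/(1 − F̂(x)) can be built from a kernel density estimate and the matching integrated-kernel distribution estimate; the Gaussian kernel below is an assumption of this sketch, since the abstract does not fix the kernel:

```python
import math

def kernel_hazard(x, data, h):
    """Kernel estimate of the hazard H(x) = f(x) / (1 - F(x)).

    f is estimated with a Gaussian kernel of bandwidth h; F with the
    integrated kernel (the Gaussian cdf), so the ratio mirrors H_n."""
    n = len(data)
    f_hat = sum(math.exp(-0.5 * ((x - t) / h) ** 2)
                for t in data) / (n * h * math.sqrt(2 * math.pi))
    F_hat = sum(0.5 * (1 + math.erf((x - t) / (h * math.sqrt(2))))
                for t in data) / n
    return f_hat / (1.0 - F_hat)
```

For Exp(1) data the true hazard is identically 1, so the estimate at an interior point should be close to 1 for moderate bandwidths.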
Abstract: In this paper, the authors continue the research described in [1], a comparative study of two methods for eliminating the static hazard from logic functions expressed as a Product of Sums (POS), i.e. the static-0 hazard. The first method uses the consensus theorem to determine the cover term, which equals the product of the two residual implicants; the second solves a system of Boolean equations. The authors observed that with the second method the digital hazard can be detected earlier. If the Boolean equation system is incompatible (has no solutions), the considered logic function has no static hazard with respect to the coupled variable. Using logical computation, this method makes it possible to determine the transitions needed to eliminate the digital hazard.
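The consensus step of the first method can be sketched for POS sum terms represented as sets of literals; the set-of-strings representation is a convenience of this sketch, not the authors' notation:

```python
def consensus(term1, term2):
    """Consensus of two sum terms in a POS expression, each given as a set
    of literals ('A' or '~A'). Returns the cover term, or None if the terms
    do not clash in exactly one variable."""
    neg = lambda lit: lit[1:] if lit.startswith("~") else "~" + lit
    clashes = [l for l in term1 if neg(l) in term2]
    if len(clashes) != 1:
        return None
    l = clashes[0]
    return (term1 - {l}) | (term2 - {neg(l)})
```

For example, F = (A + B)(~A + C) has a static-0 hazard on the coupled variable A (with B = C = 0, both sum terms can be 1 momentarily while A switches); multiplying in the consensus term (B + C) removes it.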
Abstract: The paper deals with the use of logic function decomposition algorithms, with application to the implementation of classical circuits such as SSI, MSI, and PLDs. The decomposition methods use Boolean matrix calculation. The implementation costs are calculated, emphasizing the most economical solutions. One important aspect of serial decomposition is the task of selecting "best candidate" variables for the G function. Decomposition is essentially a process of substituting two or more input variables with a smaller number of new variables. This substitution results in a reduction of the number of rows in the truth table. Hence, we look for variables that are most likely to reduce the number of rows in the truth table as a result of decomposition. Consider an input variable, purposely ignoring all inter-relationships among the input variables. The only available parameter to evaluate its activity is the number of "1"s or "0"s that it has in the truth table. If the variable has only "1"s or only "0"s, it is the "best candidate" for decomposition, as it is practically redundant.
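The counting heuristic in the last paragraph can be sketched as follows; representing the truth table as a list of input-row tuples is an assumption of this sketch:

```python
def best_candidate(truth_table, num_vars):
    """Pick the input column whose 0/1 distribution is most skewed
    (closest to all-0s or all-1s), per the heuristic in the text.

    truth_table: list of input tuples (one tuple of 0/1 values per row).
    Returns the index of the best-candidate variable."""
    rows = len(truth_table)
    best, best_skew = None, -1
    for v in range(num_vars):
        ones = sum(row[v] for row in truth_table)
        skew = max(ones, rows - ones)  # a constant column scores rows (maximal)
        if skew > best_skew:
            best, best_skew = v, skew
    return best
```

A column containing only "1"s (or only "0"s) scores maximally and is picked first, matching the observation that such a variable is practically redundant.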
Abstract: The Cox proportional hazards model is a popular statistical technique for exploring the relationship between the survival time of neonates and several explanatory variables. It provides an estimate of each study variable's effect on survival after adjustment for the other explanatory variables, and allows us to estimate the hazard (or risk) of death of newborns in the NICUs of hospitals in River Nile State, Sudan, for the period 2018-2020. The study data comprised neonate gender, mode of delivery, birth type, neonate weight, residence type, gestational age, and survival time. The Kaplan-Meier method is used to estimate the survival and hazard functions for the survival times of newborns who had not completed their first month. Of the 700 neonates in the study area, 25% died during 2018-2020. In the Cox proportional hazards model analysis, all the variables of interest had an effect on neonatal death, but the variables with a significant effect were neonate weight, residence type, and gestational age.
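The Kaplan-Meier product-limit estimator used in the study can be sketched as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: S(t) = prod_{t_i <= t} (1 - d_i / n_i),
    where d_i failures occur at t_i out of n_i subjects still at risk.

    times  : observed (possibly right-censored) times
    events : 1 for an observed death, 0 for censoring
    Returns a list of (event time, S_hat) pairs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for s, e in data if s == t)   # deaths at t
        c = sum(1 for s, e in data if s == t)   # all leaving the risk set at t
        if d:
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= c
        i += c
    return out
```

Censored subjects leave the risk set without contributing a factor, which is how the method uses the incomplete follow-up times of newborns who survived past observation.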
Funding: Supported by postdoctoral funding at Tsinghua University
Abstract: An accelerated proportional degradation hazards-odds model is proposed. It is a non-parametric model and thus has path-free and distribution-free properties, avoiding the errors caused by faulty assumptions about degradation paths or the distribution of degradation measurements. It is built on a link function that combines the degradation cumulative hazard rate function and the degradation odds function through a transformation parameter, which makes the accelerated proportional degradation hazards model and the accelerated proportional degradation odds model special cases of it. Hypothesis tests are discussed, and the proposed model is applicable when certain model assumptions are satisfied. The model is used to estimate the reliability of miniature bulbs under low stress levels from degradation data obtained under high stress levels, validating its effectiveness.
Funding: Supported by the National Natural Science Foundation of China (No. 61575221)
Abstract: Low-pressure mercury lamps, with a main emission at about 254 nm, have been used for disinfecting air, water, and surfaces for nearly a century. However, only a few studies exist on the corneal damage threshold at the wavelength of 254 nm. In this paper, the in vivo corneal damage threshold was determined in a chinchilla rabbit model using a 254 nm laser system. The irradiance of the laser spot was nearly flat-top distributed, and the beam diameter on the animal corneal surface was about 3.44 mm and 3.28 mm along the horizontal and vertical directions, respectively. Damage lesion determinations were performed at 12 h post-exposure using fluorescein sodium staining. The ED50 was 17.7 mJ/cm² with a 95% confidence interval of 15.3-20.1 mJ/cm². The results may contribute to the knowledge base for the refinement of the UV hazard function.
Funding: Supported by the Fundamental Research Funds for the Central Universities (QN0914)
Abstract: This article discusses regression analysis of failure time data under the additive hazards model when the regression coefficients are time-varying. The regression coefficients are estimated locally, based on the pseudo-score function [12], in a window around each time point. The proposed method is easy to implement, and the resulting estimators are shown to be consistent and asymptotically normal with easily estimated variances. Simulation studies show that the estimation procedure is reliable and useful.