Chikungunya is a mosquito-borne viral infection caused by the chikungunya virus (CHIKV). It is characterized by acute onset of high fever, severe polyarthralgia, myalgia, headache, and maculopapular rash. The virus is rapidly spreading and may establish itself in new regions where competent mosquito vectors are present. This research analyzes the regulatory dynamics of a stochastic differential equation (SDE) model describing the transmission of CHIKV, incorporating seasonal variations, immunization efforts, and environmental fluctuations modeled through Poisson random measure noise under demographic heterogeneity. The model guarantees the existence of a global positive solution and exhibits periodic dynamics driven by environmental factors. A key contribution of this study is the formulation of a stochastic threshold parameter, R0L, which characterizes the conditions for disease persistence or extinction under random environmental influences. Although our analysis highlights age-specific heterogeneities to illustrate differential transmission risks, the framework is general and can incorporate other vulnerable demographic groups, ensuring broader applicability of the results. Using the Markov chain Monte Carlo (MCMC) method, we estimate R0L = 1.4978 (95% CI: 1.4968–1.5823) based on CHIKV data from Florida, USA, spanning 2005 to 2017, suggesting that the outbreak remains active and requires targeted control strategies. The effectiveness of immunization, screening, and treatment strategies varies depending on the prioritized demographic groups, owing to substantial differences in CHIKV incidence across age categories in the USA. Numerical simulations were conducted using the truncated Euler–Maruyama method to robustly capture the stochastic dynamics of CHIKV transmission with Poisson-driven jumps. Employing an iterative approach and assuming mild convexity conditions, we formulated and solved a parameterized near-optimality problem using the Ekeland variational principle. Our findings indicate that vaccination campaigns are significantly more effective when focused on vulnerable adults over the age of 66, as well as individuals aged 21 to 25. Furthermore, enhancements in vaccine efficacy, diagnostic screening, and treatment protocols all contribute substantially to minimizing infection rates compared to current standard approaches. These insights support the development of targeted, age-specific public health interventions that can significantly improve the management and control of future CHIKV outbreaks.
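A truncated Euler–Maruyama step with Poisson-driven jumps can be sketched for a generic scalar jump-diffusion. The drift, diffusion, and jump coefficients, and the positivity floor below, are illustrative assumptions, not the paper's actual CHIKV system:

```python
import math
import random

def em_jump_path(x0, mu, sigma, gamma, lam, T, n, seed=0):
    """Truncated Euler-Maruyama for the scalar jump-SDE
    dX = mu*X dt + sigma*X dW + gamma*X dN, where N is a Poisson process
    with rate lam.  Positivity is preserved by flooring the state at a
    small constant, a common truncation device for epidemic SDEs."""
    rng = random.Random(seed)
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        dW = rng.gauss(0.0, math.sqrt(dt))       # Brownian increment
        # Poisson increment over (t, t+dt] via inverse-CDF sampling
        u, k, p, c = rng.random(), 0, math.exp(-lam * dt), math.exp(-lam * dt)
        while u > c:
            k += 1
            p *= lam * dt / k
            c += p
        x = x + mu * x * dt + sigma * x * dW + gamma * x * k
        x = max(x, 1e-12)                        # truncation keeps the path positive
        path.append(x)
    return path

# illustrative parameters: mild drift, moderate noise, downward jumps
path = em_jump_path(x0=1.0, mu=0.05, sigma=0.2, gamma=-0.1, lam=2.0, T=1.0, n=1000)
```

The flooring step mirrors, in discrete time, the positivity that the global positive solution result guarantees for the exact process.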
This paper discussed the random distribution of the loading and unloading response ratio (LURR) of different definitions (Y<sub>1</sub>~Y<sub>5</sub>), using the assumptions that earthquakes occur following a Poisson process and their magnitudes obey the Gutenberg-Richter law. The results show that Y<sub>1</sub>~Y<sub>5</sub> are quite stable, or concentrated, when the expected number of events in the calculation time window is relatively large (>40); but when this occurrence rate becomes very small, Y<sub>2</sub>~Y<sub>5</sub> become quite variable, or unstable. That is to say, a high value of the LURR can be produced not only by seismicity before a large earthquake, but also by a random sequence of earthquakes obeying a Poisson process when the expected number of events in the window is too small. To check the influence of randomness in the catalogue on the LURR, the random distribution of the LURR under Poisson models has been calculated by simulation. The 90%, 95% and 99% confidence ranges of Y<sub>1</sub> and Y<sub>3</sub> are given in this paper, which is helpful for quantifying the randomness.
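The simulation can be sketched as follows. The LURR-type statistic used here (the ratio of summed Benioff strains, E^(1/2), of events randomly labeled loading or unloading, with log10 E = 4.8 + 1.5M), the magnitude cutoff, and all parameters are illustrative assumptions, not the paper's exact Y1–Y5 definitions:

```python
import math
import random

def simulate_lurr(expected_n, trials, b=1.0, seed=1):
    """Monte Carlo distribution of a LURR-type statistic under a Poisson null:
    the event count is Poisson(expected_n), magnitudes follow the G-R law
    (exponential with rate b*ln(10) above a cutoff), and each event is
    independently labeled 'loading' or 'unloading' with probability 1/2."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    out = []
    for _ in range(trials):
        # Poisson count: number of unit-rate arrivals before time expected_n
        n, t = 0, rng.expovariate(1.0)
        while t < expected_n:
            n += 1
            t += rng.expovariate(1.0)
        load = unload = 0.0
        for _ in range(n):
            m = 4.0 + rng.expovariate(beta)           # magnitude above cutoff 4.0
            strain = 10.0 ** (0.5 * (4.8 + 1.5 * m))  # Benioff strain E^(1/2)
            if rng.random() < 0.5:
                load += strain
            else:
                unload += strain
        if unload > 0.0:                              # ratio undefined otherwise
            out.append(load / unload)
    return out

small = simulate_lurr(expected_n=10, trials=500)
large = simulate_lurr(expected_n=100, trials=500)
```

Empirical quantiles of the returned samples give confidence ranges of the kind reported in the paper; the spread is visibly larger when the expected event count is small.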
Stop frequency models, as one of the elements of activity-based models, represent an important part of travel behavior. Unobserved heterogeneity across travelers should be taken into consideration to prevent bias and inconsistency in the estimated parameters of stop frequency models. Additionally, previous studies on stop frequency have mostly been done in larger metropolitan areas, and less attention has been paid to areas with smaller populations. This study addresses these gaps by using 2012 travel data from a medium-sized U.S. urban area, with the work tour as the case study. Stops in the work tour were classified into three groups: the outbound leg, the work-based subtour, and the inbound leg of the commute. Latent class Poisson regression models were used to analyze the data. The results indicate the presence of heterogeneity across commuters. Using latent class models significantly improves the predictive power of the models compared to regular one-class Poisson regression models. In contrast to one-class Poisson models, gender becomes insignificant in predicting the number of tours when unobserved heterogeneity is accounted for. Commuters are associated with increased stops on their work-based subtour when the employment density of service-related occupations increases in their work zone, but the employment density of retail employment does not significantly contribute to the stop-making likelihood of commuters. Additionally, an increase in the number of work tours was associated with fewer stops on the inbound leg of the commute.
The results of this study support the consideration of unobserved heterogeneity in stop frequency models and help transportation agencies and policy makers make better inferences from such models.
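The latent-class idea can be illustrated with the no-covariate core of such a model: a two-component Poisson mixture fitted by EM. The class structure and rates below are synthetic assumptions for the demonstration, not the study's estimates:

```python
import math
import random

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def em_poisson_mixture(data, iters=200):
    """EM for a two-component Poisson mixture pi*Pois(l1) + (1-pi)*Pois(l2):
    the latent class of each observation is unobserved heterogeneity."""
    m = sum(data) / len(data)
    pi, l1, l2 = 0.5, 0.5 * m + 1e-6, 1.5 * m + 1e-6   # crude initialisation
    for _ in range(iters):
        # E-step: responsibility of class 1 for each count
        r = []
        for k in data:
            a = pi * poisson_pmf(k, l1)
            b = (1 - pi) * poisson_pmf(k, l2)
            r.append(a / (a + b))
        # M-step: weighted means update the rates and the class share
        s = sum(r)
        pi = s / len(data)
        l1 = sum(ri * k for ri, k in zip(r, data)) / s
        l2 = sum((1 - ri) * k for ri, k in zip(r, data)) / (len(data) - s)
    return pi, l1, l2

rng = random.Random(42)
def draw_poisson(lam):
    """Inverse-CDF Poisson sampler."""
    u, k, p, c = rng.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

# synthetic commuters: 40% frequent-stoppers (rate 6), 60% infrequent (rate 1)
data = [draw_poisson(6.0) if rng.random() < 0.4 else draw_poisson(1.0)
        for _ in range(2000)]
pi, l1, l2 = em_poisson_mixture(data)
```

A full latent-class regression replaces the constant rates with log-linear predictors, but the E/M alternation is the same.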
In this work, some non-homogeneous Poisson models are considered to study the behaviour of ozone in the city of Puebla, Mexico. Several functions are used as the rate function for the non-homogeneous Poisson process. In addition to their dependence on time, these rate functions also depend on some parameters that need to be estimated. In order to estimate them, a Bayesian approach is taken. The expressions for the distributions of the parameters involved in the models are very complex; therefore, Markov chain Monte Carlo algorithms are used to estimate them. The methodology is applied to the ozone data from the city of Puebla, Mexico.
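A non-homogeneous Poisson process with a time-dependent rate of this kind can be simulated by Lewis–Shedler thinning. The Weibull-type intensity and its parameters below are illustrative assumptions:

```python
import random

def thinning_nhpp(rate, rate_max, T, seed=0):
    """Lewis-Shedler thinning: simulate a non-homogeneous Poisson process
    with intensity rate(t) <= rate_max on [0, T] by generating candidates
    from a homogeneous Poisson(rate_max) process and accepting each with
    probability rate(t)/rate_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)           # candidate from dominating HPP
        if t > T:
            break
        if rng.random() <= rate(t) / rate_max:   # accept w.p. rate(t)/rate_max
            events.append(t)
    return events

# Weibull-type intensity lambda(t) = (a/s)*(t/s)^(a-1); increasing for a > 1,
# so its maximum on [0, T] is attained at T
a, s, T = 2.0, 5.0, 10.0
rate = lambda t: (a / s) * (t / s) ** (a - 1)
events = thinning_nhpp(rate, rate_max=rate(T), T=T)
# mean event count equals the integrated intensity Lambda(T) = (T/s)^a = 4
```

Averaging the count over many replications recovers Lambda(T), which is a quick sanity check for any candidate rate function.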
Deep learning (DL) has proven to be important for computed tomography (CT) image denoising. However, such models are usually trained under supervision, requiring paired data that may be difficult to obtain in practice. Diffusion models offer unsupervised means of solving a wide range of inverse problems via posterior sampling. In particular, using the estimated unconditional score function of the prior distribution, obtained via unsupervised learning, one can sample from the desired posterior via hijacking and regularization. However, due to the iterative solvers used, the number of function evaluations (NFE) required may be orders of magnitude larger than for single-step samplers. In this paper, we present a novel image denoising technique for photon-counting CT by extending the unsupervised approach to inverse problem solving to the case of Poisson flow generative models (PFGM++). By hijacking and regularizing the sampling process we obtain a single-step sampler, that is, NFE = 1. Our proposed method incorporates posterior sampling using diffusion models as a special case. We demonstrate that the added robustness afforded by the PFGM++ framework yields significant performance gains. Our results indicate competitive performance compared to popular supervised methods, including state-of-the-art diffusion-style models with NFE = 1 (consistency models), as well as unsupervised and non-DL-based image denoising techniques, on clinical low-dose CT data and clinical images from a prototype photon-counting CT system developed by GE HealthCare.
We consider some non-homogeneous Poisson models to estimate the mean number of times that a given environmental threshold of interest is surpassed by a given pollutant. Seven different rate functions for the Poisson processes describing the models are taken into account. The rate functions considered are the Weibull, the exponentiated-Weibull, and their generalisation, the Beta-Weibull rate function. We also use the Musa-Okumoto, the Goel-Okumoto, a generalised Goel-Okumoto and the Weibull-geometric rate functions. Whenever thought justifiable, a model allowing the presence of change-points is also considered. The different models are applied to the daily maximum ozone measurements provided by the monitoring network of the Metropolitan Area of Mexico City. The aim is to compare how well the different rate functions adjust to the data. Even though some of the rate functions have been considered before, here we apply them all to the same data set; in previous works they were used on different data sets, so a comparison of the adequacy of those models was not possible. The measurements considered here were obtained after a series of environmental measures were implemented in Mexico City. Hence, the data present a different behaviour from that of earlier studies.
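Under any of these rate functions, the quantity of interest (the mean number of threshold exceedances in [0, T]) is the integrated intensity Λ(T). A minimal sketch for the Weibull rate, with illustrative parameters, checks a numerical integration against the closed form Λ(T) = (T/σ)^α:

```python
def weibull_rate(t, alpha, sigma):
    """Weibull intensity lambda(t) = (alpha/sigma) * (t/sigma)**(alpha-1)."""
    return (alpha / sigma) * (t / sigma) ** (alpha - 1)

def mean_exceedances(T, alpha, sigma, n=100000):
    """Mean number of exceedances in [0, T] under an NHPP: the integrated
    intensity Lambda(T), computed here by midpoint numerical integration."""
    h = T / n
    return sum(weibull_rate((i + 0.5) * h, alpha, sigma) for i in range(n)) * h

# illustrative parameters: one year of daily maxima, arbitrary Weibull shape/scale
alpha, sigma, T = 1.8, 120.0, 365.0
numeric = mean_exceedances(T, alpha, sigma)
closed_form = (T / sigma) ** alpha    # Lambda(T) for the Weibull rate
```

The other rate families (exponentiated-Weibull, Beta-Weibull, Musa-Okumoto, and so on) plug into the same integral; only the closed form of Λ(T) changes or disappears.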
The application of the Tikhonov regularization method to the ill-conditioned problems arising in regional gravity field modeling by Poisson wavelets is studied. In particular, the choices of the regularization matrices as well as the approaches for estimating the regularization parameters are investigated in detail. The numerical results show that the regularized solutions derived from first-order regularization are better than those obtained from zero-order regularization. For cross-validation, the optimal regularization parameters are estimated by the L-curve, variance component estimation (VCE) and minimum standard deviation (MSTD) approaches, respectively, and the results show that the regularization parameters derived from the different methods are consistent with each other. Together with first-order Tikhonov regularization and the VCE method, the optimal network of Poisson wavelets is derived, based on which the local gravimetric geoid is computed. The accuracy of the corresponding gravimetric geoid reaches 1.1 cm in the Netherlands, which validates the reliability of using the Tikhonov regularization method to tackle the ill-conditioned problem in regional gravity field modeling.
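The zero-order Tikhonov idea can be illustrated on a deliberately ill-conditioned toy system with two unknowns; the matrix and regularization parameter below are illustrative, not the geodetic Poisson-wavelet system:

```python
def tikhonov_2x2(A, b, alpha):
    """Zero-order Tikhonov solution x = (A^T A + alpha*I)^(-1) A^T b for a
    system with two unknowns, solved via Cramer's rule on the regularized
    normal equations."""
    n11 = sum(r[0] * r[0] for r in A) + alpha
    n12 = sum(r[0] * r[1] for r in A)
    n22 = sum(r[1] * r[1] for r in A) + alpha
    c1 = sum(r[0] * y for r, y in zip(A, b))
    c2 = sum(r[1] * y for r, y in zip(A, b))
    det = n11 * n22 - n12 * n12
    return ((c1 * n22 - c2 * n12) / det, (n11 * c2 - n12 * c1) / det)

# nearly collinear columns make the unregularized problem ill-conditioned
A = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
b = [2.0, 2.0, 2.0]
x_ls  = tikhonov_2x2(A, b, 0.0)   # unregularized: numerically unstable here
x_reg = tikhonov_2x2(A, b, 0.1)   # damped, stable, balanced solution
```

With α = 0 the tiny determinant suffers catastrophic cancellation, whereas a small α yields a stable estimate near (1, 1). First-order regularization replaces αI with αLᵀL for a difference operator L, penalizing roughness rather than magnitude.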
Road crash prediction models are very useful tools in highway safety, given their potential for determining both crash frequency and crash severity. Crash frequency refers to the prediction of the number of crashes that would occur on a specific road segment or intersection in a time period, while crash severity models generally explore the relationship between injury severity and contributing factors such as driver behavior, vehicle characteristics, roadway geometry, and road-environment conditions. Effective interventions to reduce the crash toll include design of safer infrastructure and incorporation of road safety features into land-use and transportation planning; improvement of vehicle safety features; improvement of post-crash care for victims of road crashes; and improvement of driver behavior, such as setting and enforcing laws relating to key risk factors and raising public awareness. Despite the great efforts that transportation agencies put into preventive measures, the annual number of traffic crashes has not yet significantly decreased. For instance, 35,092 traffic fatalities were recorded in the US in 2015, an increase of 7.2% compared to the previous year. Given this trend, this paper presents an overview of road crash prediction models used by transportation agencies and researchers, to gain a better understanding of the techniques used in predicting road crashes and the risk factors that contribute to crash occurrence.
In this paper, we consider a compound Poisson risk model with taxes paid according to a loss-carry-forward system and dividends paid under a threshold strategy. First, the closed-form expression of the probability function for the total number of taxation periods over the lifetime of the surplus process is derived. Second, an analytical expression for the expected accumulated discounted dividends paid between two consecutive taxation periods is provided. In addition, explicit expressions are also given for exponential individual claims.
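The underlying surplus process can be sketched with a Monte Carlo finite-horizon ruin probability for the classical compound Poisson model; taxes and dividends are omitted here, and the exponential claim law and all parameters are illustrative assumptions:

```python
import random

def ruin_probability(u, c, lam, mean_claim, T, trials=4000, seed=7):
    """Monte Carlo finite-horizon ruin probability for the classical
    compound Poisson surplus U(t) = u + c*t - S(t): claims arrive as a
    Poisson(lam) process and claim sizes are exponential with the given
    mean.  Ruin can only occur at claim instants, so only those are checked."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        t, total_claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)               # next claim arrival
            if t > T:
                break                               # survived the horizon
            total_claims += rng.expovariate(1.0 / mean_claim)
            if u + c * t - total_claims < 0.0:      # surplus below zero: ruin
                ruined += 1
                break
    return ruined / trials

# initial capital 10, premium rate 1.2 (20% safety loading), unit claim rate/mean
psi = ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0, T=100.0)
```

Loss-carry-forward taxation and threshold dividends modify the surplus path between claims but leave this simulation skeleton intact.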
Introduction: Studies have shown that Emergency Department (ED) crowding contributes to reduced quality of patient care, delays in starting treatments, and an increased number of patients leaving without being seen. This analysis shows how to theoretically and optimally align staffing to demand. Methods: The ED value stream was identified and mapped. Patients were stratified into three resource-driven care flow cells based on severity indices. Time observations were conducted for each of the key care team members, and the manual cycle times and service rates were calculated and stratified by severity index. Using X32 Healthcare's Online Staffing Optimization (OSO) tool, staffing inefficiencies were identified and an optimal schedule was created for each provider group. Results: Lower severity indices (higher-acuity patients) led to longer times for providers, nurses, patient care assistants, and clerks. The patient length of stay varied from under one hour to over five hours. The flow of patients varied considerably over the 24-hour period but was similar by day of the week. Using flow data, we showed that more nurses and care team members were needed during peak times of patient flow, and that eight-hour shifts would allow better flexibility. We showed that the additional salary hours added to the budget would be offset by increased revenue from decreasing the number of patients who leave without being seen. Conclusion: If implemented, these changes will improve ED flow by using lean tools and principles, ultimately leading to timeliness of care, reduced waits, and an improved patient experience.
In this note we study the optimal dividend problem for a company whose surplus process, in the absence of dividend payments, evolves as a generalized compound Poisson model in which the counting process is a generalized Poisson process. This model includes the classical risk model and the Pólya-Aeppli risk model as special cases. The objective is to find a dividend policy that maximizes the expected discounted value of dividends paid to the shareholders until the company is ruined. We show that under some conditions the optimal dividend strategy is formed by a barrier strategy. Moreover, two conjectures are proposed.
In recent years, there has been a trend in urban tunnel construction to build super-large-span tunnels for traffic diversion and route optimization purposes. However, the increased size makes tunnel support more difficult. Unfortunately, there are few studies on the failure and support mechanism of the surrounding rocks in the excavation of supported tunnels, while most model tests of super-large-span tunnels focus on the failure characteristics of surrounding rocks in tunnel excavation without supports. Based on the excavation compensation method (ECM), model tests of a super-large-span tunnel excavation with different anchor cable support methods in the initial support stage were carried out. The results indicate that during excavation of a super-large-span tunnel, the stress and displacement of the shallow surrounding rocks decrease, following a step-shaped pattern, and tunnel failure is mainly concentrated in the vault and spandrel areas. Compared with conventional anchor cable supports, the NPR (negative Poisson's ratio) anchor cable support is more suitable for the initial support stage of super-large-span tunnels. The tunnel support theory, model test materials, methods, and results obtained in this study can provide references for the study of similar super-large-span tunnels.
In this article, two relaxation time limits, namely the momentum relaxation time limit and the energy relaxation time limit, are considered. By a compactness argument, it is shown that the smooth solutions of the multidimensional nonisentropic Euler-Poisson problem converge to the solutions of an energy transport model or a drift diffusion model, respectively, with respect to different time scales.
Lifecycle data often occur as counts of vital events and are recorded as integers. The purpose of this article is to model fertility behavior based on religious, educational, economic, and occupational characteristics. The responses of groups classified according to these determinants are examined for significant influence on fertility using a Poisson regression model (PRM) based on the National Family Health Survey-3 dataset. The observed and predicted probabilities under the PRM indicate a modal value of two children for the Poisson-modeled data. The dominance of two-child families in the data motivates the authors to adopt a multinomial regression model (MRM) in order to link fertility with the various socioeconomic indicators responsible for fertility variation. The choice of explanatory factors is limited by the availability of data. Trends and patterns of preference for birth counts suggest that religion, caste, wealth, female education, and occupation are the dominant factors shaping the observed birth process. Empirical analysis suggests that both models used in the study perform similarly on the sample data. However, fitting the MRM with a birth count of two as the comparison category yields improved Akaike information criterion and consistent Akaike information criterion values. The current work contributes to the existing literature by attempting to provide more insight into the determinants of Indian fertility using Poisson and multinomial regression models.
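The PRM core is a log-linear Poisson model fitted by maximum likelihood. A minimal one-covariate Newton–Raphson sketch on synthetic data follows; the true coefficients are assumptions for the demonstration, not survey estimates:

```python
import math
import random

def poisson_regression(x, y, iters=25):
    """Newton-Raphson MLE for a Poisson regression with one covariate:
    y_i ~ Poisson(exp(b0 + b1*x_i)).  The log-likelihood is concave, so
    Newton steps on the 2x2 information matrix converge quickly."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # gradient of the log-likelihood
        g0 = sum(yi - mi for yi, mi in zip(y, mu))
        g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        # observed information (negative Hessian)
        h00 = sum(mu)
        h01 = sum(mi * xi for mi, xi in zip(mu, x))
        h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

rng = random.Random(3)
def draw_poisson(lam):
    """Inverse-CDF Poisson sampler."""
    u, k, p, c = rng.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

# synthetic counts with assumed true coefficients b0 = 0.2, b1 = 1.0
x = [rng.random() for _ in range(1500)]
y = [draw_poisson(math.exp(0.2 + 1.0 * xi)) for xi in x]
b0, b1 = poisson_regression(x, y)
```

Real fits add more covariates (religion, education, wealth, and so on), but the estimating equations generalize directly.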
It is difficult to measure the size of illegal drug user populations directly using survey methods, because of the many "hidden drug addicts" and the difficulty of obtaining truthful responses. In this study, systematic and routine information on treatment episodes of drug users is used to estimate the population size. Mixture models of zero-truncated Poisson distributions, fitted with nonparametric maximum likelihood estimators (NPMLE) from capture-recapture repeated count data, were used to project the number of drug users. The method was applied to surveillance data on drug users identified by treatment episodes in over 1140 health treatment centers in Thailand, from the Bureau of Health Service System Development, Ministry of Public Health. We show how this mixture model can be used to construct the unobserved frequency of drug users with no treatment episode and further estimate the total population size of drug users in the country from 2005 to 2007. A simulation study confirmed that the mixture model is suitable when the population is large. With the mixture models, the estimates of the number of drug users were fitted with excellent goodness-of-fit values, and they were also compared to the conventional Chao estimates. The NPMLE for the total number of drug users in Thailand in 2005, 2006, and 2007 were 184,045 (95% CI: 181,297-186,793), 230,665 (95% CI: 226,611-234,719), and 299,670 (95% CI: 294,217-305,123), respectively; the estimates were 125,265 (95% CI: 123,092-127,142), 166,287 (95% CI: 163,222-169,352), and 228,898 (95% CI: 224,766-233,030) for the number of methamphetamine (Yaba) users, and 11,559 (95% CI: 10,234-12,884), 11,333 (95% CI: 9,276-13,390), and 8,953 (95% CI: 7,878-10,028) for the number of heroin users.
The numbers of marijuana, kratom, opium, and inhalant users were underestimated because their symptoms were mild and not severe enough to bring them to health treatment centers, which leads to a smaller estimate of the total number of drug users. The estimated numbers of heroin and methamphetamine addicts are highly reliable because they are based on clearly evident counts of users whose addiction problems were severe enough to reach health treatment centers. Estimation by means of mixture models can be recommended for routinely monitoring drug demand trends and drug health services; it is easy to calculate via the available program MIXTP on request.
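The conventional Chao estimate mentioned above has a simple closed form: with f_k the number of individuals observed exactly k times, the unobserved count f0 is estimated (as a lower bound) by f1²/(2·f2). A sketch with hypothetical frequencies, not the Thai surveillance data:

```python
def chao_estimate(freq):
    """Chao's lower-bound estimator of a hidden population size from
    zero-truncated count data.  freq[k] is the number of individuals
    observed exactly k times; the unobserved f0 is estimated as
    f1^2 / (2*f2), and the total is observed + f0."""
    n_observed = sum(freq.values())
    f1, f2 = freq.get(1, 0), freq.get(2, 0)
    f0_hat = f1 * f1 / (2.0 * f2)
    return n_observed + f0_hat

# hypothetical treatment-episode frequencies: 1500 users seen once,
# 600 twice, 220 three times, 60 four times
freq = {1: 1500, 2: 600, 3: 220, 4: 60}
total = chao_estimate(freq)   # 2380 observed + 1875 estimated hidden = 4255
```

The NPMLE mixture approach replaces this moment-based bound with a fitted mixing distribution over Poisson rates, which is what allows the goodness-of-fit comparison reported above.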
Degradation process modeling is one of the research hotspots of prognostics and health management (PHM), and can be used to estimate system reliability and remaining useful life (RUL). In order to study the system degradation process, a cumulative damage model is used for degradation modeling. Assuming that the damage increment follows a Gamma distribution, the shock count follows a homogeneous Poisson process (HPP) when the degradation process is linear, and a non-homogeneous Poisson process (NHPP) when the degradation process is nonlinear. A two-stage degradation system is considered in this paper, for which the degradation process is linear in the first stage and nonlinear in the second stage. A nonlinear modeling method for the considered system is put forward, and a reliability model and a remaining useful life model are established. A case study is given to validate the established models.
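A sample path of such a two-stage cumulative damage process can be sketched as follows; the change point, shock rates, and Gamma increment parameters are illustrative assumptions, not the case-study values:

```python
import random

def two_stage_degradation(T, t_change, lam1, rate2, shape, scale, seed=5):
    """Cumulative-damage sample path: each shock adds a Gamma(shape, scale)
    damage increment.  Shocks arrive as an HPP(lam1) before t_change (the
    linear stage) and as an NHPP with a nondecreasing rate rate2(t)
    afterwards (the nonlinear stage, simulated by thinning)."""
    rng = random.Random(seed)
    times, damage = [], [0.0]
    # stage 1: homogeneous Poisson shocks on [0, t_change]
    t = rng.expovariate(lam1)
    while t < t_change:
        times.append(t)
        t += rng.expovariate(lam1)
    # stage 2: thinning against the maximum of rate2 on [t_change, T]
    rmax = rate2(T)                      # valid because rate2 is nondecreasing
    t = t_change
    while True:
        t += rng.expovariate(rmax)
        if t > T:
            break
        if rng.random() <= rate2(t) / rmax:
            times.append(t)
    # accumulate Gamma-distributed damage at each shock instant
    for _ in times:
        damage.append(damage[-1] + rng.gammavariate(shape, scale))
    return times, damage

rate2 = lambda t: 0.5 * t                # linearly increasing shock rate in stage 2
times, damage = two_stage_degradation(T=10.0, t_change=4.0, lam1=1.0,
                                      rate2=rate2, shape=2.0, scale=0.3)
```

The RUL at any time is then the first-passage time of this accumulated damage across a failure threshold, which the paper's reliability model formalizes.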
Several economists agree that the need for adjustment was essential for African countries during the decade of the 1980s. The econometric analysis of a sample of 28 sub-Saharan African countries, using variables regarded as representative of the adjustment objectives, shows that this assertion cannot be completely rejected.
This paper discusses the estimation of parameters in the zero-inflated Poisson (ZIP) model by the method of moments. The method of moments estimators (MMEs) are analytically compared with the maximum likelihood estimators (MLEs). The results of a modest simulation study are presented.
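The ZIP method-of-moments estimators have a closed form obtained by matching the sample mean and variance. A sketch with synthetic data, where the true π and λ are assumptions for the demonstration:

```python
import math
import random
import statistics

def zip_moment_estimates(data):
    """Method-of-moments estimators for the zero-inflated Poisson model,
    P(X=0) = pi + (1-pi)exp(-lam), P(X=k) = (1-pi)exp(-lam)lam^k/k!.
    With E[X] = (1-pi)*lam and Var[X] = E[X]*(1 + pi*lam), matching the
    sample mean m and variance s2 gives the estimators below."""
    m = statistics.mean(data)
    s2 = statistics.pvariance(data)
    lam_hat = m + s2 / m - 1.0        # lam = m + pi*lam, and pi*lam = s2/m - 1
    pi_hat = (s2 / m - 1.0) / lam_hat
    return pi_hat, lam_hat

rng = random.Random(11)
def draw_poisson(lam):
    """Inverse-CDF Poisson sampler."""
    u, k, p, c = rng.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > c:
        k += 1
        p *= lam / k
        c += p
    return k

# synthetic ZIP sample: 30% structural zeros, Poisson(2) otherwise
data = [0 if rng.random() < 0.3 else draw_poisson(2.0) for _ in range(5000)]
pi_hat, lam_hat = zip_moment_estimates(data)
```

The MLEs have no closed form and require iteration, which is one reason the MME-versus-MLE comparison in the paper is of practical interest.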
Funding: Ongoing Research Funding program (ORF-2025-1404), King Saud University, Riyadh, Saudi Arabia.
Funding: This project was sponsored by the National Science Foundation of China (19702060).
Funding: Supported by MedTechLabs, GE HealthCare, the Swedish Research Council (No. 2021-05103), and the Göran Gustafsson Foundation (No. 2114).
Funding: Financially supported by project PAPIIT No. IN104110-3 of the Direccion General de Apoyo al Personal Academico of the Universidad Nacional Autonoma de Mexico, Mexico; part of JMB's Ph.D., partially funded by the Consejo Nacional de Ciencias y Tecnologia, Mexico, through Ph.D. Scholarship No. 210347. JAA was partially funded by the Conselho Nacional de Pesquisa, Brazil, Grant No. 300235/2005-4.
Abstract: We consider some non-homogeneous Poisson models to estimate the mean number of times that a given environmental threshold of interest is surpassed by a given pollutant. Seven different rate functions for the Poisson processes describing the models are taken into account: the Weibull, the exponentiated-Weibull, and their generalisation the Beta-Weibull rate function, as well as the Musa-Okumoto, the Goel-Okumoto, a generalised Goel-Okumoto, and the Weibull-geometric rate functions. Where justifiable, models allowing the presence of change-points are also considered. The different models are applied to the daily maximum ozone measurements provided by the monitoring network of the Metropolitan Area of Mexico City, with the aim of comparing how well the different rate functions adjust to the data. Although some of these rate functions have been considered before, they were applied to different data sets, so a comparison of the adequacy of the models was not possible; here they are all applied to the same data set. The measurements considered were obtained after a series of environmental measures were implemented in Mexico City, so the data present a different behaviour from that of earlier studies.
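For any of the rate functions named above, a non-homogeneous Poisson process can be simulated by Lewis-Shedler thinning, which is a convenient way to check a fitted rate against observed exceedance counts. The Weibull parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_nhpp(rate, rate_max, T):
    # Lewis-Shedler thinning: propose candidate points from a homogeneous
    # Poisson process at rate_max, keep each with probability rate(t)/rate_max.
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > T:
            break
        if rng.uniform() < rate(t) / rate_max:
            events.append(t)
    return np.array(events)

# Weibull rate (increasing for beta > 1); its maximum on [0, T] is rate(T)
beta, sigma, T = 2.0, 50.0, 100.0
rate = lambda t: (beta / sigma) * (t / sigma) ** (beta - 1.0)
events = simulate_nhpp(rate, rate(T), T)
# The expected number of events equals the mean measure (T / sigma)**beta = 4
```

The same routine works for any bounded rate function on [0, T]; only `rate` and its upper bound `rate_max` change.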
Funding: Supported by the National Natural Science Foundation of China (Nos. 41374023, 41131067, 41474019), the National 973 Project of China (No. 2013CB733302), the China Postdoctoral Science Foundation (No. 2016M602301), the Key Laboratory of Geospace Environment and Geodesy, Ministry of Education, Wuhan University (No. 15-02-08), and the State Scholarship Fund of the Chinese Scholarship Council (No. 201306270014).
Abstract: The application of the Tikhonov regularization method to the ill-conditioned problems arising in regional gravity field modeling by Poisson wavelets is studied. In particular, the choices of the regularization matrices as well as the approaches for estimating the regularization parameters are investigated in detail. The numerical results show that the regularized solutions derived from first-order regularization are better than those obtained from zero-order regularization. For cross-validation, the optimal regularization parameters are estimated from the L-curve, variance component estimation (VCE), and minimum standard deviation (MSTD) approaches, respectively, and the results show that the regularization parameters derived from the different methods are consistent with each other. Together with first-order Tikhonov regularization and the VCE method, the optimal network of Poisson wavelets is derived, based on which the local gravimetric geoid is computed. The accuracy of the corresponding gravimetric geoid reaches 1.1 cm in the Netherlands, which validates the reliability of the Tikhonov regularization method in tackling ill-conditioned problems in regional gravity field modeling.
Abstract: Road crash prediction models are very useful tools in highway safety, given their potential for determining both the frequency and the severity of crashes. Crash frequency models predict the number of crashes that would occur on a specific road segment or intersection in a time period, while crash severity models generally explore the relationship between injury severity and contributing factors such as driver behavior, vehicle characteristics, roadway geometry, and road-environment conditions. Effective interventions to reduce the crash toll include design of safer infrastructure and incorporation of road safety features into land-use and transportation planning; improvement of vehicle safety features; improvement of post-crash care for victims of road crashes; and improvement of driver behavior, such as setting and enforcing laws relating to key risk factors and raising public awareness. Despite the great efforts that transportation agencies put into preventive measures, the annual number of traffic crashes has not yet significantly decreased. For instance, 35,092 traffic fatalities were recorded in the US in 2015, an increase of 7.2% compared to the previous year. Given this trend, this paper presents an overview of the road crash prediction models used by transportation agencies and researchers, to gain a better understanding of the techniques used in predicting road accidents and the risk factors that contribute to crash occurrence.
Funding: Supported in part by the National Natural Science Foundation of China, the Guangdong Natural Science Foundation (S2011010004511), and the Fundamental Research Funds for the Central Universities of China (201120102020005).
Abstract: In this paper, we consider a compound Poisson risk model with taxes paid according to a loss-carry-forward system and dividends paid under a threshold strategy. First, a closed-form expression for the probability function of the total number of taxation periods over the lifetime of the surplus process is derived. Second, an analytical expression for the expected accumulated discounted dividends paid between two consecutive taxation periods is provided. In addition, explicit expressions are given for exponential individual claims.
Abstract: Introduction: Studies have shown that Emergency Department (ED) crowding contributes to reduced quality of patient care, delays in starting treatments, and an increased number of patients leaving without being seen. This analysis shows how to theoretically and optimally align staffing to demand. Methods: The ED value stream was identified and mapped. Patients were stratified into three resource-driven care flow cells based on severity indices. Time observations were conducted for each of the key care team members, and the manual cycle times and service rates were calculated and stratified by severity index. Using X32 Healthcare's Online Staffing Optimization (OSO) tool, staffing inefficiencies were identified and an optimal schedule was created for each provider group. Results: Lower severity indices (higher-acuity patients) led to longer times for providers, nurses, patient care assistants, and clerks. Patient length of stay varied from under one hour to over five hours. The flow of patients varied considerably over the 24-hour period but was similar by day of the week. Using flow data, we showed that more nurses and care team members were needed during peak times of patient flow, and that eight-hour shifts would allow better flexibility. We also showed that the additional salary hours added to the budget would be offset by increased revenue from decreasing the number of patients who leave without being seen. Conclusion: If implemented, these changes will improve ED flow by using lean tools and principles, ultimately leading to more timely care, reduced waits, and an improved patient experience.
Abstract: In this note we study the optimal dividend problem for a company whose surplus process, in the absence of dividend payments, evolves as a generalized compound Poisson model in which the counting process is a generalized Poisson process. This model includes the classical risk model and the Pólya-Aeppli risk model as special cases. The objective is to find a dividend policy that maximizes the expected discounted value of dividends paid to the shareholders until the company is ruined. We show that under some conditions the optimal dividend strategy is a barrier strategy. Moreover, two conjectures are proposed.
Funding: Supported by the Innovation Fund Research Project of the State Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK202201); the Foundation for the Opening of the State Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK2129); and the Open Research Fund of the State Key Laboratory of Geomechanics and Geotechnical Engineering, Institute of Rock and Soil Mechanics, Chinese Academy of Sciences (Grant No. Z020007).
Abstract: In recent years, there has been a trend in urban tunnel construction toward super-large-span tunnels for traffic diversion and route optimization purposes. However, the increased size makes tunnel support more difficult. Unfortunately, there are few studies on the failure and support mechanism of the surrounding rocks during excavation of a supported tunnel; most model tests of super-large-span tunnels focus on the failure characteristics of surrounding rocks during excavation without supports. Based on the excavation compensation method (ECM), model tests of a super-large-span tunnel excavated with different anchor cable support methods in the initial support stage were carried out. The results indicate that during excavation of a super-large-span tunnel, the stress and displacement of the shallow surrounding rocks decrease, following a step-shaped pattern, and tunnel failure is mainly concentrated in the vault and spandrel areas. Compared with conventional anchor cable supports, the NPR (negative Poisson's ratio) anchor cable support is more suitable for the initial support stage of super-large-span tunnels. The tunnel support theory, model test materials, methods, and results obtained in this study can serve as references for the study of similar super-large-span tunnels.
Funding: Supported by the Chinese Postdoctoral Science Foundation, the Young Scientists Fund of the NSF of China (10401019), and the Tsinghua Basic Research Foundation.
Abstract: In this article, two relaxation time limits, namely the momentum relaxation time limit and the energy relaxation time limit, are considered. By a compactness argument, it is shown that the smooth solutions of the multidimensional non-isentropic Euler-Poisson problem converge to the solutions of an energy-transport model or a drift-diffusion model, respectively, with respect to the different time scales.
Funding: Supported by an R&D Grant from the University of Delhi, a DU-DST PURSE Grant, and ICMR Grant No. 3/1/3/JRF-2010/HRD-122(35831).
Abstract: Lifecycle data often occur as counts of vital events and are recorded as integers. The purpose of this article is to model fertility behavior based on religious, educational, economic, and occupational characteristics. The responses of groups classified according to these determinants are examined for significant influence on fertility using a Poisson regression model (PRM) based on the National Family Health Survey-3 dataset. The observed and predicted probabilities under the PRM indicate a modal value of two children for the Poisson-modeled data. The dominance of two-child outcomes in the data motivates the authors to also adopt a multinomial regression model (MRM) in order to link fertility with the various socioeconomic indicators responsible for fertility variation. The choice of explanatory factors is limited by the availability of data. Trends and patterns of preference for birth counts suggest that religion, caste, wealth, female education, and occupation are the dominant factors shaping the observed birth process. Empirical analysis suggests that both models perform similarly on the sample data; however, fitting the MRM with a birth count of two as the comparison category shows improved Akaike information criterion and consistent Akaike information criterion values. The current work contributes to the existing literature by providing more insight into the determinants of Indian fertility using Poisson and multinomial regression models.
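A Poisson regression model of this kind can be fitted by iteratively reweighted least squares (IRLS). The sketch below is a minimal illustration on simulated data, not the survey analysis itself; the coefficients, sample size, and single covariate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_irls(X, y, n_iter=30):
    # Fit a log-link Poisson regression by iteratively reweighted least
    # squares (Newton-Raphson on the GLM log-likelihood). Assumes the
    # first column of X is an intercept.
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-12)  # start from the constant-mean model
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                  # working response
        W = mu                                   # IRLS weights for the log link
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Simulated count data with log E[y] = 0.5 + 0.8 * x
n = 5000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = rng.poisson(np.exp(0.5 + 0.8 * x))
beta_hat = poisson_irls(X, y)
```

With categorical determinants such as religion or caste, the covariates would enter `X` as dummy-coded columns; the fitting loop is unchanged.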
Abstract: It is difficult to measure the sizes of illegal drug user populations directly by survey methods because of the many "hidden" drug addicts and the difficulty of obtaining true responses. In this study, systematic and routine information on treatment episodes of drug users is used to estimate the population size. Mixture models of zero-truncated Poisson distributions, fitted by nonparametric maximum likelihood estimation (NPMLE) from capture-recapture repeated count data, were used to project the number of drug users. The method was applied to surveillance data on drug users identified by treatment episodes in over 1140 health treatment centers in Thailand, from the Bureau of Health Service System Development, Ministry of Public Health. We show how this mixture model can be used to construct the unobserved frequency of drug users with no treatment episode and further estimate the total population size of drug users in the country from 2005 to 2007. A simulation study confirmed that the mixture model is suitable when the population is large. By means of the mixture models, the estimates of the number of drug users were fitted with excellent goodness-of-fit values and were also compared with the conventional Chao estimates. The NPMLE totals for drug users in Thailand in 2005, 2006, and 2007 were 184,045 (95% CI: 181,297-186,793), 230,665 (95% CI: 226,611-234,719), and 299,670 (95% CI: 294,217-305,123), respectively; correspondingly, 125,265 (95% CI: 123,092-127,142), 166,287 (95% CI: 163,222-169,352), and 228,898 (95% CI: 224,766-233,030) for methamphetamine (Yaba) users, and 11,559 (95% CI: 10,234-12,884), 11,333 (95% CI: 9,276-13,390), and 8,953 (95% CI: 7,878-10,028) for heroin users.
The numbers of marijuana, kratom-plant, opium, and inhalant users were underestimated because their symptoms were mild and not severe enough to bring them to health treatment centers, which leads to a smaller estimate of the total number of drug users. The estimated sizes of the heroin and methamphetamine user populations are highly reliable because they are based on clearly evidenced counts of individuals presenting to health treatment centers with severe addiction problems. Estimation by means of mixture models can be recommended for routinely monitoring drug demand trends and drug health services; it is easy to calculate via the program MIXTP, available on request.
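The core idea, estimating the unobserved zero class from a zero-truncated count distribution, can be sketched for the single-component (plain zero-truncated Poisson) case alongside Chao's lower-bound estimator, which the study uses for comparison. The simulated population, rate, and seed below are illustrative assumptions.

```python
import numpy as np

def chao_lower_bound(counts):
    # Chao's lower bound: N_hat = n + f1**2 / (2 * f2), with f1 and f2 the
    # numbers of individuals observed exactly once and exactly twice.
    counts = np.asarray(counts)
    f1 = np.sum(counts == 1)
    f2 = np.sum(counts == 2)
    return counts.size + f1 ** 2 / (2.0 * f2)

def ztp_population_size(counts, tol=1e-10):
    # Single-component zero-truncated Poisson: solve
    # lam / (1 - exp(-lam)) = mean(counts) by bisection, then inflate the
    # observed n by the estimated probability of being seen at least once.
    counts = np.asarray(counts)
    m = counts.mean()                    # truncated mean, > 1 for this data
    f = lambda lam: lam / (1.0 - np.exp(-lam)) - m
    lo, hi = 1e-8, max(10.0, 10.0 * m)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if f(mid) > 0 else (mid, hi)
    lam = 0.5 * (lo + hi)
    return counts.size / (1.0 - np.exp(-lam))

# Simulated "treatment episode" counts: true population 1000, episode rate 1.5;
# only individuals with at least one episode are observed
rng = np.random.default_rng(7)
episodes = rng.poisson(1.5, size=1000)
observed = episodes[episodes > 0]
N_ztp = ztp_population_size(observed)
N_chao = chao_lower_bound(observed)
```

The NPMLE mixture approach of the study generalizes this by allowing several Poisson components, which guards against the heterogeneity that a single rate cannot capture.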
Funding: National Outstanding Youth Science Fund Project, China (No. 71401173).
Abstract: Degradation process modeling is a research hotspot in prognostics and health management (PHM), and can be used to estimate system reliability and remaining useful life (RUL). In order to study the system degradation process, a cumulative damage model is used. Assuming that each damage increment follows a Gamma distribution, shock counting follows a homogeneous Poisson process (HPP) when the degradation process is linear, and a non-homogeneous Poisson process (NHPP) when the degradation process is nonlinear. A two-stage degradation system is considered in this paper, for which the degradation process is linear in the first stage and nonlinear in the second stage. A nonlinear modeling method for the considered system is put forward, and a reliability model and a remaining useful life model are established. A case study is given to validate the established models.
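A minimal simulation of such a two-stage cumulative damage process, assuming an HPP in the first stage and a power-law NHPP rate a * t**b in the second; the specific rate form and all parameter values are illustrative assumptions, not those of the cited case study.

```python
import numpy as np

rng = np.random.default_rng(3)

def cumulative_damage(T, t_change, r1, a, b, shape, scale):
    # Stage 1: homogeneous Poisson shocks at rate r1 on [0, t_change].
    n1 = rng.poisson(r1 * t_change)
    # Stage 2: NHPP with power-law rate a * t**b on (t_change, T];
    # its mean measure is a / (b + 1) * (T**(b+1) - t_change**(b+1)).
    mu2 = a / (b + 1.0) * (T ** (b + 1.0) - t_change ** (b + 1.0))
    n2 = rng.poisson(mu2)
    # Each shock contributes an independent Gamma(shape, scale) increment.
    return rng.gamma(shape, scale, size=n1 + n2).sum()

# Expected total damage = (expected shock count) * (mean increment)
T, t_change, r1, a, b, shape, scale = 10.0, 5.0, 1.0, 0.1, 1.0, 2.0, 0.5
expected = (r1 * t_change + a / (b + 1.0) * (T ** 2 - t_change ** 2)) * shape * scale
samples = np.array([cumulative_damage(T, t_change, r1, a, b, shape, scale)
                    for _ in range(2000)])
```

Comparing the Monte Carlo mean of `samples` against `expected` is a quick sanity check on the compound Poisson structure before layering a reliability or RUL model on top.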
Abstract: Several economists agree that the need for adjustment was essential for African countries over the decade of the 1980s. An econometric analysis of a sample of 28 sub-Saharan African countries, using variables regarded as representative of the adjustment objectives, shows that this assertion cannot be completely rejected.
Abstract: This paper discusses the estimation of parameters in the zero-inflated Poisson (ZIP) model by the method of moments. The method of moments estimators (MMEs) are analytically compared with the maximum likelihood estimators (MLEs), and the results of a modest simulation study are presented.
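For the ZIP model with inflation probability p and Poisson mean lam, the first two moments give E[X] = (1 - p) * lam and Var[X] = E[X] * (1 + lam - E[X]), so the MMEs follow in closed form. A minimal sketch with simulated data (the parameter values and sample size are illustrative):

```python
import numpy as np

def zip_mme(x):
    # ZIP model: P(X = 0) = p + (1 - p) * exp(-lam); nonzero counts come
    # from a Poisson(lam) thinned by (1 - p). From E[X] = (1 - p) * lam
    # and Var[X] = E[X] * (1 + lam - E[X]):
    #   lam_hat = s2 / m + m - 1,   p_hat = 1 - m / lam_hat
    m = x.mean()
    s2 = x.var()
    lam_hat = s2 / m + m - 1.0
    p_hat = 1.0 - m / lam_hat
    return lam_hat, p_hat

# Simulated ZIP sample with lam = 2.0 and inflation probability p = 0.3
rng = np.random.default_rng(5)
n = 20000
x = rng.poisson(2.0, size=n) * (rng.uniform(size=n) >= 0.3)
lam_hat, p_hat = zip_mme(x)
```

Unlike the MLE, which requires solving a nonlinear equation numerically, these estimators are closed-form, which is exactly what makes the analytical MME-versus-MLE comparison tractable.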