In this paper, we highlight some recent developments of a new route to evaluating macroeconomic policy effects, investigated within the potential outcomes framework. First, the paper begins with a brief introduction to the basic model setup in modern econometric analysis of program evaluation. Second, primary attention is given to causal effect estimation of macroeconomic policy with single time series data, together with some extensions to multiple time series data. Furthermore, we examine the connection of this new approach to traditional macroeconomic models for policy analysis and evaluation. Finally, we conclude by addressing some possible future research directions in statistics and econometrics.
Building a well-off society in an all-round way is the goal put forward at the 16th CPC National Congress for the first two decades of this century. According to the 'Statistical Monitoring Program on Building a Well-off Society' [1], the Institute of Statistical Science, National Bureau of Statistics of China, and local statistics research departments conducted statistical monitoring of the process of building a well-off society in an all-round way from 2000 to 2010, nationwide and locally. The results show that, over the past decade, under the correct leadership of the CPC Central Committee and the State Council, China succeeded in overcoming the impacts of many unfavorable factors, including the serious international financial crisis, rising production costs, the SARS epidemic, rare snow disasters, earthquakes, landslides, and the European sovereign debt crisis.
The value of a statistical life (VSL) is a crucial tool for monetizing health impacts. To explore the VSL in China, this study examines people's willingness to pay (WTP) to reduce the risk of death from air pollution in six representative cities in China, based on face-to-face contingent valuation interviews (n = 3936) conducted from March 7, 2019 to September 30, 2019. The results reveal that the WTP varied from CNY 455 to 763 in 2019 (USD 66-111), corresponding to a VSL range of CNY 3.79-6.36 million (USD 549,395-921,940). The VSL in China in 2019 is estimated to be CNY 4.76 million (USD 689,659). The statistics indicate that monthly expenditure levels, environmental concerns, risk attitudes, and assumed market acceptance, which have seldom been discussed in previous studies, significantly impact WTP and VSL. These findings will serve as a reference for analyzing mortality risk reduction benefits in future research and for policymaking.
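The VSL figures above follow from dividing annual WTP by the size of the mortality risk reduction being valued. The abstract does not state that reduction explicitly; the sketch below assumes a reduction of 1.2 in 10,000 (1.2 × 10⁻⁴ per year), which reproduces the reported range.

```python
# VSL = WTP / risk reduction. The 1.2e-4 risk reduction is an assumption
# inferred from the reported figures, not a value stated in the abstract.
delta_risk = 1.2e-4          # assumed annual mortality risk reduction

for wtp_cny in (455, 763):   # reported WTP range, CNY per year
    vsl = wtp_cny / delta_risk
    print(f"WTP {wtp_cny} CNY -> VSL {vsl / 1e6:.2f} million CNY")
# -> 3.79 and 6.36 million CNY, matching the reported VSL range
```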
This paper provides a concise description of the philosophy, mathematics, and algorithms for estimating, detecting, and attributing climate changes. The estimation follows the spectral method using empirical orthogonal functions, also called the method of reduced space optimal averaging. The detection follows the linear regression method, which can be found in most textbooks on multivariate statistical techniques. The detection algorithms are described using the space-time approach to avoid the non-stationarity problem. The paper includes (1) the optimal averaging method for minimizing the uncertainties of the global change estimate, (2) the weighted least squares detection of both single and multiple signals, (3) numerical examples, and (4) the limitations of the linear optimal averaging and detection methods.
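Empirical orthogonal functions are the climate-science name for principal components: the EOFs are the singular vectors of the space-time anomaly matrix. A minimal numpy sketch on synthetic data (the one-mode field and all names are illustrative, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_space = 200, 50

# Synthetic field: one dominant spatial pattern modulated in time, plus noise.
pattern = np.sin(np.linspace(0, np.pi, n_space))
amplitude = rng.standard_normal(n_time)
field = np.outer(amplitude, pattern) + 0.1 * rng.standard_normal((n_time, n_space))

anomalies = field - field.mean(axis=0)        # remove the time mean
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)               # variance fraction per mode
eof1, pc1 = vt[0], u[:, 0] * s[0]             # leading EOF and its PC series
print(f"mode 1 explains {explained[0]:.0%} of the variance")
```

The leading mode recovers the planted pattern and absorbs nearly all the variance; real reduced-space methods truncate to the first few such modes.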
The era of big data brings opportunities and challenges for developing new statistical methods and models to evaluate social programs, economic policies, or interventions. This paper provides a comprehensive review of some recent advances in statistical methodologies and models for evaluating programs with high-dimensional data. In particular, four kinds of methods for making valid statistical inferences about treatment effects in high dimensions are addressed. The first is the so-called doubly robust estimation, which models the outcome regression and propensity score functions simultaneously. The second is the covariate balance method for constructing treatment effect estimators. The third is the sufficient dimension reduction approach for causal inference. The last is the use of machine learning procedures, directly or indirectly, to make statistical inferences about treatment effects. Some of these methods and models are closely related to the de-biased Lasso type methods for high-dimensional regression models in the statistical literature. Finally, some future research topics are also discussed.
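The doubly robust idea can be sketched in a few lines via the AIPW (augmented inverse probability weighting) estimator: fit an outcome model and a propensity model, then combine them so the estimate stays consistent if either model is correct. A toy sketch on low-dimensional synthetic data with a known treatment effect of 2 (the data and models are illustrative, not the paper's high-dimensional estimators):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(1)
n = 4000
X = rng.standard_normal((n, 3))
p = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.5 * X[:, 1])))   # true propensity
T = rng.binomial(1, p)
Y = X @ np.array([1.0, -1.0, 0.5]) + 2.0 * T + rng.standard_normal(n)

# Nuisance models: propensity score and per-arm outcome regressions.
e_hat = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]
m1 = LinearRegression().fit(X[T == 1], Y[T == 1]).predict(X)
m0 = LinearRegression().fit(X[T == 0], Y[T == 0]).predict(X)

# AIPW estimator of the average treatment effect.
tau = np.mean(m1 - m0
              + T * (Y - m1) / e_hat
              - (1 - T) * (Y - m0) / (1 - e_hat))
print(f"AIPW ATE estimate: {tau:.2f}")   # close to the true effect of 2
```

In the high-dimensional setting the review addresses, the two nuisance fits would be replaced by regularized (e.g. Lasso-type) or machine learning estimators, with the same combining formula.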
Proposing new statistical distributions that are more flexible than existing ones has become a recent trend in the practice of distribution theory. Actuaries often search for new and appropriate statistical models to address data related to financial and risk management problems. In the present study, an extension of the Lomax distribution is proposed using the approach of the weighted T-X family of distributions. The mathematical properties, along with the characterization of the new model via truncated moments, are derived. The model parameters are estimated via a prominent approach, the maximum likelihood estimation method. A brief Monte Carlo simulation study is conducted to assess the performance of the parameter estimates. An application to medical care insurance data illustrates the potential of the newly proposed extension of the Lomax distribution. The proposed model is compared with (i) the two-parameter Lomax distribution, (ii) the three-parameter half logistic Lomax and exponentiated Lomax distributions, and (iii) a four-parameter model, the Kumaraswamy Lomax distribution. The statistical analysis indicates that the proposed model performs better than the competing models in analyzing data in financial and actuarial sciences.
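Maximum likelihood fitting of the baseline two-parameter Lomax distribution is available in scipy (the weighted T-X extension itself is this paper's contribution and is not in scipy). A sketch recovering a known shape parameter from simulated data, with location and scale held fixed so only the shape is estimated:

```python
from scipy.stats import lomax

true_shape = 2.5
data = lomax.rvs(true_shape, scale=1.0, size=5000, random_state=42)

# Fit only the shape, fixing loc=0 and scale=1 (the values used to simulate).
shape_hat, loc, scale = lomax.fit(data, floc=0, fscale=1)
print(f"fitted shape: {shape_hat:.2f} (true {true_shape})")
```

A Monte Carlo study like the one in the paper repeats this over many simulated samples and sample sizes to check bias and mean squared error of the estimates.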
Although hierarchical correlated data are increasingly available and are being used in evidence-based medical practice and health policy decision making, there is a lack of information about the strengths and weaknesses of the methods for analyzing such data. In this paper, we describe the use of hierarchical data in a family study of alcohol abuse conducted in Edmonton, Canada, that attempted to determine whether alcohol abuse in probands is associated with abuse in their first-degree relatives. We review three methods of analyzing discrete hierarchical data to account for correlations among the relatives. We conclude that the best analytic choice for typical correlated discrete hierarchical data is nonlinear mixed effects modeling using a likelihood-based approach, or multilevel (hierarchical) modeling using a quasi-likelihood approach, especially when dealing with heterogeneous patient data.
In the popular Baba-Engle-Kraft-Kroner (BEKK) and dynamic conditional correlation (DCC) multivariate generalized autoregressive conditional heteroskedasticity models, the large number of parameters and the requirement of positive definiteness of the covariance and correlation matrices pose difficulties during estimation. To avoid these issues, we propose two modifications to the BEKK and DCC models that employ two spherical parameterizations applied to the Cholesky decompositions of the covariance and correlation matrices. In their full specifications, the introduced Cholesky-BEKK and Cholesky-DCC models allow for a reduction in the number of parameters compared with their traditional counterparts. Moreover, the application of the spherical transformation does not require the imposition of inequality constraints on the parameters during estimation. An application to two crude oils, WTI and Brent, and the main exchange rate prices demonstrates that the Cholesky-BEKK and Cholesky-DCC models can capture the dynamics of covariances and correlations. In addition, the Kupiec test on different portfolio compositions confirms the satisfactory performance of the proposed models.
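The spherical idea: write each row of the Cholesky factor in spherical coordinates, so that any unconstrained angles produce a valid (unit-diagonal, positive definite) correlation matrix, removing the need for inequality constraints during estimation. A minimal numpy sketch of the static mapping (a 3×3 case with arbitrary angles; the BEKK/DCC dynamics are omitted):

```python
import numpy as np

def corr_from_angles(angles):
    """Map a lower-triangular array of angles in (0, pi) to a valid
    correlation matrix via a spherically parameterized Cholesky factor."""
    d = angles.shape[0]
    L = np.zeros((d, d))
    L[0, 0] = 1.0
    for i in range(1, d):
        prod = 1.0
        for j in range(i):
            L[i, j] = np.cos(angles[i, j]) * prod
            prod *= np.sin(angles[i, j])
        L[i, i] = prod          # each row of L has unit Euclidean norm
    return L @ L.T

angles = np.zeros((3, 3))
angles[1, 0], angles[2, 0], angles[2, 1] = 1.0, 0.8, 2.0   # arbitrary angles
R = corr_from_angles(angles)
print(np.round(R, 3))
```

Because every row of the factor has unit norm and a positive last entry, the resulting matrix is automatically symmetric, has a unit diagonal, and is positive definite for any angles in (0, π).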
Self-consolidating concrete (SCC) is an important innovation in concrete technology due to its superior properties. However, predicting its compressive strength remains challenging due to variability in its composition and uncertainty in prediction outcomes. This study combines machine learning (ML) models with conformal prediction (CP) to address these issues, offering prediction intervals that quantify uncertainty and reliability. A dataset of over 3000 samples with 17 input variables was used to train four ensemble methods, Random Forest (RF), Gradient Boosting Regressor (GBR), Extreme Gradient Boosting (XGBoost), and Light Gradient Boosting Machine (LGBM), along with CP techniques, including the cross-validation plus (CV+) and conformalized quantile regression (CQR) methods. Results demonstrate that LGBM and XGBoost outperform RF, improving R² by 4.5% and 5.7% and reducing root-mean-square error (RMSE) by 24.6% and 24.8%, respectively. While CV+ yielded narrower but constant intervals, CV+_Gamma and CQR provided adaptive intervals, highlighting trade-offs among precision, adaptability, and coverage reliability. The integration of CP offers a robust framework for uncertainty quantification in SCC strength prediction and marks a significant step forward in ML applications for concrete research.
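The core recipe behind CP methods such as CV+ and CQR can be illustrated with the simplest variant, split conformal prediction: calibrate an interval half-width on held-out residuals so that the interval covers new observations at the target rate, regardless of the underlying model. A numpy-only sketch with a linear fit standing in for the ensemble learners (data and split sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 3000
x = rng.uniform(0, 10, n)
y = 2.0 * x + rng.normal(0, 1.0, n)

# Split indices: proper training set, calibration set, and a test set.
tr, cal, te = np.split(rng.permutation(n), [1000, 2000])

coef = np.polyfit(x[tr], y[tr], 1)             # any point predictor works

def predict(xs):
    return np.polyval(coef, xs)

# Calibrate: the adjusted (1 - alpha) quantile of absolute residuals.
alpha = 0.1
resid = np.abs(y[cal] - predict(x[cal]))
q = np.quantile(resid, np.ceil((len(cal) + 1) * (1 - alpha)) / len(cal))

# Intervals predict(x) +/- q cover roughly 90% of fresh test points.
covered = np.abs(y[te] - predict(x[te])) <= q
print(f"empirical coverage: {covered.mean():.2%}")
```

CV+ replaces the single split with cross-validation folds, and CQR calibrates quantile-regression bounds instead of a symmetric residual width, which is what makes its intervals adaptive.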
Climate change is an essential topic in climate science, and the availability of accurate, high-resolution datasets in recent years has facilitated the extraction of more insights from big-data resources. Nonetheless, current research predominantly focuses on mean-value changes and largely overlooks changes in the probability distribution. In this study, a novel method called Wasserstein Stability Analysis (WSA) is developed to identify probability density function (PDF) changes, especially shifts in extreme events and variation in nonlinear physical value constraints, in climate change. WSA is applied to the early 21st century and compared with traditional mean-value trend analysis. The results indicate that, despite no significant trend, the equatorial eastern Pacific experienced a decline in hot extremes and an increase in cold extremes, indicating a La Niña-like temperature shift. Further analysis at two Arctic locations suggests that sea ice severely restricts the hot extremes of surface air temperature; this impact is diminishing as the sea ice melts. By revealing PDF shifts, WSA emerges as a powerful tool for re-examining climate change dynamics, providing enhanced data-driven insights for understanding climate evolution.
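The key point, that two distributions can share a mean while differing in their extremes, is exactly what the one-dimensional Wasserstein distance detects. A hedged scipy sketch on synthetic samples (not the study's temperature data, and not the WSA method itself):

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 20000)     # baseline climate sample
b = rng.normal(0.0, 2.0, 20000)     # same mean, heavier tails / more extremes

# A mean-value trend analysis sees essentially nothing...
print(f"mean difference:      {abs(a.mean() - b.mean()):.3f}")
# ...while the Wasserstein distance flags the PDF shift (~0.8 here).
print(f"Wasserstein distance: {wasserstein_distance(a, b):.3f}")
```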
In this paper, we consider the inhomogeneous pressureless Euler equations. First, we present a class of self-similar analytical solutions to the 1D Cauchy problem and investigate the large-time behavior of the solutions; in particular, we obtain slant kink-wave solutions for the inhomogeneous Burgers (InhB) type equation. Next, we prove the integrability of the InhB equation in the sense of a Lax pair. Furthermore, we study the spreading rate of the moving domain occupied by mass for the 1D Cauchy problem with compactly supported initial density. We find that the expanding domain grows exponentially in time, provided that the solutions exist and are smooth for all time. Finally, we extend the corresponding results for the inhomogeneous pressureless Euler equations to the radially symmetric multi-dimensional case.
We prove that for a smooth convex body K ⊂ ℝ^d, d ≥ 2, with positive Gauss curvature, its homothety with a certain associated convex body implies that K is either a ball or an ellipsoid, depending on the associated body considered.
In this study, the advanced machine learning algorithm NESTORE (Next Strong Related Earthquake) was applied to the Japan Meteorological Agency catalog (1973-2024). It calculates the probability that the aftershocks will reach or exceed a magnitude equal to the mainshock magnitude minus one, and classifies clusters as type A or type B depending on whether this condition is met. It has proven useful in tests in Italy, western Slovenia, Greece, and California. Due to Japan's high and complex seismic activity, new algorithms were developed to complement NESTORE: a hybrid cluster identification method, which uses both ETAS-based stochastic declustering and deterministic graph-based selection, and REPENESE (RElevant features, class imbalance PErcentage, NEighbour detection, SElection), an algorithm for detecting outliers in skewed class distributions that takes into account whether one class has a larger number of samples than the other (class imbalance). Trained with data from 1973 to 2004 (7 type A and 43 type B clusters) and tested on data from 2005 to 2023 (4 type A and 27 type B clusters), the method correctly forecasted 75% of A clusters and 96% of B clusters, achieving a precision of 0.75 and an accuracy of 0.94 six hours after the mainshock. It accurately classified the 2011 Tōhoku event cluster. Near-real-time forecasting was applied to the sequence after the April 17, 2024 M6.6 earthquake in Shikoku, correctly classifying it as a type B cluster. These results highlight the potential for forecasting strong aftershocks in regions with high seismicity and class imbalance, as evidenced by the high recall, precision, and accuracy values achieved in the test phase.
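The reported test-phase figures are mutually consistent: with 4 type A and 27 type B clusters, forecasting 3 of the 4 A clusters (75%) and 26 of the 27 B clusters (96%) reproduces the stated precision and accuracy exactly. A quick check (the confusion counts below are inferred from the abstract's rates, not taken from the paper):

```python
# Confusion counts inferred from the reported rates: 4 type A clusters with
# 3 correctly forecast, 27 type B clusters with 26 correctly classified.
tp, fn = 3, 1        # type A clusters: forecast as A / missed
tn, fp = 26, 1       # type B clusters: classified as B / falsely flagged as A

precision = tp / (tp + fp)
recall = tp / (tp + fn)
accuracy = (tp + tn) / (tp + fn + tn + fp)
print(precision, recall, round(accuracy, 2))   # 0.75 0.75 0.94
```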
AIM: To evaluate the efficacy of water supplementation treatment in patients with functional dyspepsia or irritable bowel syndrome (IBS) with predominant constipation. METHODS: A total of 3872 patients with functional dyspepsia and 3609 patients with irritable bowel syndrome were enrolled in the study at 18 Italian thermal centres. Patients underwent a first cycle of thermal therapy for 21 d. A year later, patients were re-evaluated at the same centre and received another cycle of thermal therapy. A questionnaire collecting personal data on social and occupational status, family and pathological case history, lifestyle, clinical records, and utilisation of welfare and health structures and devices was administered to each patient at baseline and one year after each thermal treatment. Sixty patients with functional dyspepsia, 20 with IBS, and 80 healthy controls received an evaluation of gastric output and oro-cecal transit time by breath test analysis. Breath tests were performed at baseline and after water supplementation therapy. Gastrointestinal symptoms were evaluated at the same time points. Breath samples were analyzed with a mass spectrometer and a gas chromatograph. Results were expressed as T1/2 and T-lag for the octanoic acid breath test and as oro-cecal transit time for the lactulose breath test. RESULTS: A significant reduction in the prevalence of symptoms was observed at the end of the first and second cycles of thermal therapy in dyspeptic and IBS patients. The analysis of variance showed a real and persistent improvement of symptoms in all patients. After water supplementation for 3 wk, a reduction of gastric output was observed in 49 (87.5%) of 56 dyspeptic patients. Both T1/2 and T-lag were significantly reduced after the therapy compared to basal values [91 ± 12 (T1/2) and 53 ± 11 (T-lag), Tables 1 and 2], with results of the octanoic acid breath test similar to those of healthy subjects. After water supplementation for 3 wk, oro-cecal transit time was shorter than at the beginning of the study. CONCLUSION: Mineral water supplementation treatment for functional dyspepsia or constipation-predominant IBS can improve gastric acid output and intestinal transit time.
Dear Editor, This letter presents a coverage optimization algorithm for underwater acoustic sensor networks (UASN) based on the Dijkstra method. Due to the particularity of the underwater environment, the multipath effect and the channel are easily disturbed, resulting in greater node energy consumption. Once the energy is exhausted, network transmission stability and network connectivity will be affected.
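The letter builds its coverage optimization on Dijkstra's shortest-path algorithm. For reference, a minimal heapq-based implementation (the toy graph and weights are illustrative, not a UASN channel model):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source in a weighted graph given as
    {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale queue entry, skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy network: edge weights could model acoustic-link energy cost.
net = {"a": [("b", 2), ("c", 5)], "b": [("c", 1), ("d", 4)],
       "c": [("d", 1)], "d": []}
print(dijkstra(net, "a"))   # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

In an energy-aware setting, weights set to per-hop transmission cost make the returned paths the minimum-energy routes between nodes.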
Quantitative precipitation estimation (QPE) plays an important role in meteorological and hydrological applications. Ground-based telemetered rain gauges are widely used to collect precipitation measurements, and spatial interpolation methods are commonly employed to estimate precipitation fields covering non-observed locations. Kriging is a simple and popular geostatistical interpolation method, but it has two known problems: uncertainty underestimation and violation of assumptions. This paper tackles these problems and seeks an optimal spatial interpolation for QPE, enhancing spatial interpolation by appropriately assessing prediction uncertainty and fulfilling the required assumptions. To this end, several methods are tested: transformation, detrending, multiple spatial correlation functions, and Bayesian kriging. In particular, we focus on a short-term and time-specific rather than a long-term and event-specific analysis. This paper analyzes a stratiform rain event with embedded convection linked to the passing monsoon front on 23 August 2012. Data from a total of 100 automatic weather stations are used, and rainfall intensities are calculated from the difference of 15-min accumulated rainfall observed every 1 min. The one-hour average rainfall intensity is then calculated to minimize random measurement error. Cross-validation is carried out to evaluate the interpolation methods at regional and local levels. As a result, transformation is found to play an important role in improving spatial interpolation and uncertainty assessment, and Bayesian methods generally outperform traditional ones in terms of the criteria.
This article attempts to give a short survey of recent progress on a class of elementary stochastic partial differential equations (for example, stochastic heat equations) driven by Gaussian noise of various covariance structures. The focus is on the existence and uniqueness of the classical (square integrable) solution (mild solution, weak solution). It is also concerned with the Feynman-Kac formula for the solution, the Feynman-Kac formula for the moments of the solution, and their applications to asymptotic moment bounds of the solution. It also briefly touches on the exact asymptotics of the moments of the solution.
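For orientation, the canonical example in this literature is the stochastic heat equation with multiplicative noise; the following LaTeX sketch records the equation and the moment Feynman-Kac formula in the form common for noise that is white in time with spatial covariance γ (integrability conditions on γ, and the general covariance structures surveyed, are omitted):

```latex
% Stochastic heat equation with multiplicative Gaussian noise
\partial_t u(t,x) = \tfrac{1}{2}\Delta u(t,x) + u(t,x)\,\dot W(t,x),
\qquad u(0,x) = u_0(x).

% Feynman-Kac formula for the k-th moment, with independent
% Brownian motions B^1, \dots, B^k started at x:
\mathbb{E}\big[u(t,x)^k\big]
 = \mathbb{E}_x\Big[\prod_{i=1}^{k} u_0\big(B^i_t\big)\,
   \exp\Big(\sum_{1\le i<j\le k}\int_0^t
   \gamma\big(B^i_s - B^j_s\big)\,ds\Big)\Big].
```

The exponential pair-interaction term is what drives the asymptotic moment bounds mentioned in the survey.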
This paper is concerned with the stability of the rarefaction wave for the Burgers equation, where 0 ≤ a < 1/4p (q is determined by (2.2)). Roughly speaking, under the assumption that u_- < u_+, the authors prove the existence of the global smooth solution to the Cauchy problem (I), and also show that the solution u(x, t) to the Cauchy problem (I) satisfies sup |u(x, t) - u^R(x/t)| → 0 as t → ∞, where u^R(x/t) is the rarefaction wave of the non-viscous Burgers equation u_t + f(u)_x = 0 with Riemann initial data u(x, 0) =
Funding: the National Natural Science Foundation of China (71631004, Key Project); the National Science Fund for Distinguished Young Scholars (71625001); the Basic Scientific Center Project of the National Science Foundation of China: Econometrics and Quantitative Policy Evaluation (71988101); the Science Foundation of the Ministry of Education of China (19YJA910003); China Scholarship Council Funded Project (201806315045).
Funding: supported by the National Natural Science Foundation of China [Grant No. 71773061].
Funding: supported by the National Natural Science Foundation of China (71631004, 72033008); the National Science Foundation for Distinguished Young Scholars (71625001); the Science Foundation of the Ministry of Education of China (19YJA910003).
Funding: financially supported by the Natural Sciences and Engineering Research Council of Canada (NSERC, Grant No. ALLRP 576708-22) and ten industrial partners.
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2021YFC3000904), the National Natural Science Foundation of China (42005039), and the Science and Technology Development Fund of CAMS (Grant No. 2024KJ013).
Abstract: Climate change is an essential topic in climate science, and the availability of accurate, high-resolution datasets in recent years has facilitated the extraction of more insights from big-data resources. Nonetheless, current research predominantly focuses on mean-value changes and largely overlooks changes in the probability distribution. In this study, a novel method called Wasserstein Stability Analysis (WSA) is developed to identify probability density function (PDF) changes, especially extreme-event shifts and variations in nonlinear physical value constraints, under climate change. WSA is applied to the early 21st century and compared with traditional mean-value trend analysis. The results indicate that, despite no significant trend, the equatorial eastern Pacific experienced a decline in hot extremes and an increase in cold extremes, indicating a La Niña-like temperature shift. Further analysis at two Arctic locations suggests that sea ice severely restricts the hot extremes of surface air temperature, and this impact is diminishing as the sea ice melts. By revealing PDF shifts, WSA emerges as a powerful tool for re-examining climate change dynamics, providing enhanced data-driven insights for understanding climate evolution.
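The core idea, that a Wasserstein distance between PDFs can flag a tail shift that a mean trend misses, can be illustrated with a toy 1-D sketch (synthetic data; this is not the authors' WSA procedure, only the distance it builds on): in one dimension, the optimal transport cost between equal-size samples is the mean absolute difference of their order statistics.

```python
import numpy as np

def wasserstein_1d(a, b):
    """W1 distance between two equal-size 1-D samples: the mean
    absolute difference of sorted values (in 1-D, optimal transport
    matches order statistics)."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

rng = np.random.default_rng(1)
early = rng.normal(0.0, 1.0, 10_000)          # baseline temperature PDF
late = rng.normal(0.0, 1.0, 10_000)
late[late > 1.5] -= 0.5                       # cool only the hot extremes

# The mean barely moves, but W1 still flags the tail shift.
print(abs(early.mean() - late.mean()) < 0.1)  # trend nearly invisible
print(wasserstein_1d(early, late) > 0.01)     # distributional change seen
```

Only about 7% of the mass moves here, so the mean shift is tiny, yet the Wasserstein distance registers the change, mirroring the abstract's point that trend analysis alone overlooks PDF changes.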
Funding: Supported by the Henan Natural Science Foundation (242300421397), the Basic Research Projects of the Key Scientific Research Projects Plan in Henan Higher Education Institutions (25ZX013), the Scientific Research Team Plan of Zhengzhou University of Aeronautics (23ZHTD01003), the National Natural Science Foundation of China (11971475), and the FLASS Internationalization and Exchange Scheme (FLASS/IE−D09/19-20−FLASS).
Abstract: In this paper, we consider the inhomogeneous pressureless Euler equations. First, we present a class of self-similar analytical solutions to the 1D Cauchy problem and investigate the large-time behavior of the solutions; in particular, we obtain slant kink-wave solutions for the inhomogeneous Burgers (InhB) type equation. Next, we prove the integrability of the InhB equation in the sense of a Lax pair. Furthermore, we study the spreading rate of the moving domain occupied by mass for the 1D Cauchy problem with compactly supported initial density. We find that the expanding domain grows exponentially in time, provided that the solutions exist and are smooth for all time. Finally, we extend the corresponding results for the inhomogeneous pressureless Euler equations to the radially symmetric multi-dimensional case.
Abstract: We prove that for a smooth convex body K ⊂ ℝ^(d), d ≥ 2, with positive Gauss curvature, homothety with a certain associated convex body implies that K is either a ball or an ellipsoid, depending on the associated body considered.
Funding: Funded by a grant from the Italian Ministry of Foreign Affairs and International Cooperation; co-funded within the RETURN Extended Partnership with funding from the European Union Next-GenerationEU (National Recovery and Resilience Plan-NRRP, Mission 4, Component 2, Investment 1.3-D.D. 12432/8/2022, PE0000005); by the grant "Progetto INGV Pianeta Dinamico: Near real-time results of Physical and Statistical Seismology for earthquakes observations, modelling and forecasting (NEMESIS)", code CUP D53J19000170001, funded by the Italian Ministry MIUR ("Fondo Finalizzato al rilancio degli investimenti delle amministrazioni centrali dello Stato e allo sviluppo del Paese", legge 145/2018); and supported by the Japan Ministry of Education, Culture, Sports, Science and Technology (MEXT) project for seismology Toward Research innovation with data of earthquakes (STAR-E), Grant Number JPJ010217.
Abstract: In this study, the advanced machine learning algorithm NESTORE (Next Strong Related Earthquake) was applied to the Japan Meteorological Agency catalog (1973-2024). NESTORE calculates the probability that the aftershocks will reach or exceed a magnitude equal to the mainshock magnitude minus one, and classifies clusters as type A or type B depending on whether this condition is met. It has proven useful in tests in Italy, western Slovenia, Greece, and California. Owing to Japan's high and complex seismic activity, new algorithms were developed to complement NESTORE: a hybrid cluster identification method, which uses both ETAS-based stochastic declustering and deterministic graph-based selection, and REPENESE (RElevant features, class imbalance PErcentage, NEighbour detection, SElection), an algorithm for detecting outliers in skewed class distributions that accounts for one class having far more samples than the other (class imbalance). Trained with data from 1973 to 2004 (7 type A and 43 type B clusters) and tested on data from 2005 to 2023 (4 type A and 27 type B clusters), the method correctly forecasted 75% of type A clusters and 96% of type B clusters, achieving a precision of 0.75 and an accuracy of 0.94 six hours after the mainshock. It accurately classified the 2011 Tōhoku event cluster. Near-real-time forecasting was applied to the sequence following the April 17, 2024 M6.6 earthquake in Shikoku, correctly classifying it as a type B cluster. These results highlight the potential for forecasting strong aftershocks in regions with high seismicity and class imbalance, as evidenced by the high recall, precision, and accuracy values achieved in the test phase.
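The reported scores are internally consistent with the cluster counts, which a few lines of arithmetic confirm (treating type A as the positive class; the forecast-count reconstruction below is inferred from the percentages, not taken from the paper):

```python
# Reconstructing the test-phase scores from the counts in the abstract
# (4 type A and 27 type B clusters; 75% of A and 96% of B correct).
true_A, true_B = 4, 27
correct_A = round(0.75 * true_A)   # 3 A clusters forecast as A
correct_B = round(0.96 * true_B)   # 26 B clusters forecast as B
false_A = true_B - correct_B       # 1 B cluster mislabelled as A

precision_A = correct_A / (correct_A + false_A)
accuracy = (correct_A + correct_B) / (true_A + true_B)
print(precision_A)                 # 0.75
print(round(accuracy, 2))          # 0.94
```

With only 4 positive cases against 27 negatives, accuracy alone would reward always predicting type B (27/31 ≈ 0.87), which is why the abstract stresses recall and precision alongside it.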
Abstract: AIM: To evaluate the efficacy of water supplementation treatment in patients with functional dyspepsia or irritable bowel syndrome (IBS) with predominant constipation. METHODS: A total of 3872 patients with functional dyspepsia and 3609 patients with irritable bowel syndrome were enrolled in the study by 18 Italian thermal centres. Patients underwent a first cycle of thermal therapy for 21 d. A year later, patients were re-evaluated at the same centre and received another cycle of thermal therapy. A questionnaire collecting personal data on social and occupational status, family and pathological case history, lifestyle, clinical records, and utilisation of welfare and health structures and devices was administered to each patient at baseline and one year after each thermal treatment. Sixty patients with functional dyspepsia, 20 with IBS, and 80 healthy controls received an evaluation of gastric emptying and oro-cecal transit time by breath test analysis. Breath tests were performed at baseline and after water supplementation therapy. Gastrointestinal symptoms were evaluated at the same time points. Breath samples were analyzed with a mass spectrometer and a gas chromatograph. Results were expressed as T1/2 and T-lag for the octanoic acid breath test and as oro-cecal transit time for the lactulose breath test. RESULTS: A significant reduction in the prevalence of symptoms was observed at the end of the first and second cycles of thermal therapy in dyspeptic and IBS patients. The analysis of variance showed a real and persistent improvement of symptoms in all patients. After water supplementation for 3 wk, a reduction of gastric emptying time was observed in 49 (87.5%) of 56 dyspeptic patients. Both T1/2 and T-lag were significantly reduced after the therapy compared with basal values [91 ± 12 (T1/2) and 53 ± 11 (T-lag), Tables 1 and 2], with octanoic acid breath test results similar to those of healthy subjects. After water supplementation for 3 wk, oro-cecal transit time was shorter than at the beginning of the study. CONCLUSION: Mineral water supplementation treatment for functional dyspepsia or constipation-predominant IBS can improve gastric emptying and intestinal transit time.
Funding: Supported by the Natural Science Foundation of Shandong Province (ZR2022MF247).
Abstract: Dear Editor, This letter presents a coverage optimization algorithm for underwater acoustic sensor networks (UASNs) based on the Dijkstra method. Owing to the particularity of the underwater environment, multipath effects and channel disturbances lead to greater node energy consumption. Once the energy is exhausted, network transmission stability and network connectivity are affected.
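The Dijkstra method the letter builds on can be sketched in a few lines (a textbook implementation, not the letter's coverage algorithm; the node names and energy costs below are invented for illustration): routing along least-cost paths, with edge weights standing in for per-hop transmission energy, is what lets the network avoid draining individual nodes.

```python
import heapq

def dijkstra(adj, src):
    """Least-cost distances from src; adj maps node -> [(nbr, cost), ...].
    In the UASN setting the edge cost would model per-hop transmission
    energy, so least-cost paths are least-energy routes."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy 4-node network with illustrative energy costs.
adj = {"sink": [("a", 1.0), ("b", 4.0)],
       "a": [("b", 2.0), ("c", 5.0)],
       "b": [("c", 1.0)]}
print(dijkstra(adj, "sink")["c"])  # 4.0 via sink -> a -> b -> c
```

Note that the cheap three-hop route (cost 4.0) beats both the direct edge to b (4.0 + 1.0) and the two-hop route through a alone (6.0), which is the kind of energy saving the letter targets.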
Funding: Funded by the Korea Meteorological Administration Research and Development Program (Grant No. CATER 2013-2040) and supported by the Brain Pool program of the Korean Federation of Science and Technology Societies (KOFST) (Grant No. 122S-1-3-0422).
Abstract: Quantitative precipitation estimation (QPE) plays an important role in meteorological and hydrological applications. Ground-based telemetered rain gauges are widely used to collect precipitation measurements, and spatial interpolation methods are commonly employed to estimate precipitation fields covering non-observed locations. Kriging is a simple and popular geostatistical interpolation method, but it has two known problems: uncertainty underestimation and violation of assumptions. This paper tackles these problems and seeks an optimal spatial interpolation for QPE, enhancing spatial interpolation by appropriately assessing prediction uncertainty and fulfilling the required assumptions. To this end, several methods are tested: transformation, detrending, multiple spatial correlation functions, and Bayesian kriging. In particular, we focus on a short-term, time-specific rather than a long-term, event-specific analysis. This paper analyzes a stratiform rain event with embedded convection linked to the passing monsoon front on 23 August 2012. Data from a total of 100 automatic weather stations are used, and rainfall intensities are calculated from the difference of 15-min accumulated rainfall observed every 1 min. The one-hour average rainfall intensity is then calculated to minimize measurement random error. Cross-validation is carried out to evaluate the interpolation methods at regional and local levels. As a result, transformation is found to play an important role in improving spatial interpolation and uncertainty assessment, and Bayesian methods generally outperform traditional ones in terms of the criteria.
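The kriging predictor and its variance, the quantities whose calibration the paper investigates, can be sketched minimally (an ordinary-kriging textbook formulation with an assumed exponential covariance and invented toy gauge data, not the paper's fitted model):

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng_par=2.0, nugget=1e-6):
    """Minimal ordinary kriging with an assumed exponential covariance
    C(h) = sill * exp(-h / rng_par); returns the prediction and the
    kriging variance at location xy0."""
    n = len(z)
    h = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    C = sill * np.exp(-h / rng_par) + nugget * np.eye(n)
    c0 = sill * np.exp(-np.linalg.norm(xy - xy0, axis=1) / rng_par)
    # Lagrange-multiplier system enforcing weights that sum to 1.
    A = np.block([[C, np.ones((n, 1))], [np.ones((1, n)), np.zeros((1, 1))]])
    b = np.concatenate([c0, [1.0]])
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w @ z, sill - w @ c0 - mu  # prediction, kriging variance

# Toy rain-gauge layout: four corners, predict at the centre.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([2.0, 3.0, 3.0, 4.0])       # mm/h at the gauges
pred, var = ordinary_kriging(xy, z, np.array([0.5, 0.5]))
print(abs(pred - 3.0) < 0.1, var > 0)
```

The kriging variance here depends only on the gauge geometry and the covariance model, not on the observed values, which is one reason uncertainty can be misassessed when the covariance model or the Gaussian assumption is wrong, motivating the transformation and Bayesian extensions the abstract tests.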
Funding: Supported by an NSERC grant and a startup fund of the University of Alberta.
Abstract: This article gives a short survey of recent progress on a class of elementary stochastic partial differential equations (for example, stochastic heat equations) driven by Gaussian noise with various covariance structures. The focus is on the existence and uniqueness of the classical (square integrable) solution (mild solution, weak solution). It is also concerned with the Feynman-Kac formula for the solution, the Feynman-Kac formula for the moments of the solution, and their applications to asymptotic moment bounds of the solution. It also briefly touches on the exact asymptotics of the moments of the solution.
Abstract: This paper is concerned with the stability of the rarefaction wave for the Burgers equation (I), where 0 ≤ a < 1/4p (q is determined by (2.2)). Roughly speaking, under the assumption that u_- < u_+, the authors prove the existence of the global smooth solution to the Cauchy problem (I), and also show that the solution u(x, t) to the Cauchy problem (I) satisfies sup |u(x, t) - u^R(x/t)| → 0 as t → ∞, where u^R(x/t) is the rarefaction wave of the non-viscous Burgers equation u_t + f(u)_x = 0 with Riemann initial data u(x, 0) =