The constitutive model is essential for predicting the deformation and stability of rock-soil mass. The estimation of constitutive model parameters is a necessary and important task for the reliable characterization of mechanical behaviors. However, constitutive model parameters cannot be evaluated accurately with a limited amount of test data, resulting in uncertainty in the prediction of stress-strain curves. This paper proposes a Bayesian analysis framework to address this issue. It combines Bayesian updating with structural reliability and adaptive conditional sampling methods to assess the equation parameters of constitutive models. Based on triaxial and ring shear tests on shear zone soils from the Huangtupo landslide, a statistical damage constitutive model and a critical state hypoplastic constitutive model were used to demonstrate the effectiveness of the proposed framework. Moreover, the effects of parameter uncertainty in the damage constitutive model on landslide stability were investigated. Results show that reasonable assessments of the constitutive model parameters can be realized. The variability of stress-strain curves is strongly related to the model prediction performance. The estimation uncertainty of constitutive model parameters should not be ignored in landslide stability calculations. Our study provides a reference for uncertainty analysis and parameter assessment of constitutive models.
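The Bayesian updating idea in this entry can be illustrated with a compact sketch. The snippet below fits the parameters of a hypothetical hyperbolic stress-strain law to synthetic test data with a plain random-walk Metropolis sampler; the model form, priors, noise level and data are all illustrative assumptions, not the paper's BUS/adaptive conditional sampling scheme or the Huangtupo test results.

```python
# Minimal sketch of Bayesian updating of constitutive model parameters with a
# random-walk Metropolis sampler. The hyperbolic stress-strain law, the priors
# and the noise level are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def stress(eps, a, b):
    """Hyperbolic stress-strain relation sigma = eps / (a + b*eps)."""
    return eps / (a + b * eps)

# Synthetic "triaxial test" data generated from assumed true parameters.
eps_obs = np.linspace(0.002, 0.12, 25)
sig_obs = stress(eps_obs, a=2e-4, b=8e-3) + rng.normal(0.0, 2.0, eps_obs.size)

def log_posterior(theta, sigma_e=2.0):
    a, b = theta
    if a <= 0.0 or b <= 0.0:            # uniform positive priors
        return -np.inf
    resid = sig_obs - stress(eps_obs, a, b)
    return -0.5 * np.sum((resid / sigma_e) ** 2)

# Random-walk Metropolis over (a, b).
theta = np.array([1e-4, 1e-2])
step = np.array([2e-5, 1e-3])
samples = []
lp = log_posterior(theta)
for _ in range(20000):
    prop = theta + step * rng.normal(size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])          # discard burn-in
print("posterior mean (a, b):", post.mean(axis=0))
print("posterior std  (a, b):", post.std(axis=0))
```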
Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
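As a concrete illustration of ensemble post-processing, the sketch below fits a simple nonhomogeneous Gaussian regression (EMOS-type) model, in which the calibrated forecast is a normal distribution whose mean and variance are affine functions of the ensemble mean and variance. The synthetic irradiance-like data are placeholders; this is not the Jacumba benchmark or the methods developed in the paper.

```python
# Hedged sketch of EMOS-style post-processing fitted by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
n_days, n_members = 300, 20

truth = rng.uniform(100.0, 900.0, n_days)          # "observed" GHI (W/m^2)
# Synthetic biased ensemble: each member is truth plus a common bias and noise.
ens = truth[:, None] + 60.0 + rng.normal(0.0, 80.0, (n_days, n_members))
m, v = ens.mean(axis=1), ens.var(axis=1)

def neg_log_lik(p):
    a, b, c, d = p
    mu = a + b * m
    var = np.maximum(c + d * v, 1e-3)               # keep the variance positive
    return -np.sum(norm.logpdf(truth, loc=mu, scale=np.sqrt(var)))

res = minimize(neg_log_lik, x0=[0.0, 1.0, 100.0, 1.0], method="Nelder-Mead")
a, b, c, d = res.x
print("fitted EMOS coefficients (a, b, c, d):", np.round(res.x, 3))
print("bias of post-processed mean:", round(float(np.mean(a + b * m - truth)), 2))
```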
The rapid expansion of offshore wind energy necessitates robust and cost-effective electrical collector system (ECS) designs that prioritize lifetime operational reliability. Traditional optimization approaches often simplify reliability considerations or fail to holistically integrate them with economic and technical constraints. This paper introduces a novel, two-stage optimization framework for offshore wind farm (OWF) ECS planning that systematically incorporates reliability. The first stage employs Mixed-Integer Linear Programming (MILP) to determine an optimal radial network topology, considering linearized reliability approximations and geographical constraints. The second stage enhances this design by strategically placing tie-lines using a Mixed-Integer Quadratically Constrained Program (MIQCP). This stage leverages a dynamic-aware adaptation of Multi-Source Multi-Terminal Network Reliability (MSMT-NR) assessment, with its inherent nonlinear equations successfully transformed into a solvable MIQCP form for loopy networks. A benchmark case study demonstrates the framework's efficacy, illustrating how increasing the emphasis on reliability leads to more distributed and interconnected network topologies, effectively balancing investment costs against enhanced system resilience.
The development of modern engineering components and equipment features large size, intricate shape and long service life, which places greater demands on valid methods for fatigue performance analysis. Achieving a smooth transformation between the fatigue properties of small-scale laboratory specimens and the fatigue strength of full-scale engineering components has been a long-term challenge. In this work, two dominant factors impeding this transformation, the notch effect and the size effect, were experimentally studied; fatigue tests on notched specimens of different scales made of Al 7075-T6511 (a very high-strength aviation alloy) were carried out. Fractography analyses identified evidence of the size effect on notch fatigue damage evolution. Accordingly, the Energy Field Intensity (EFI) approach, initially developed for multiaxial notch fatigue analysis, was improved by utilizing the volume ratio of the Effective Damage Zones (EDZs) for size effect correction. In particular, it was extended to a probabilistic model considering the inherent variability of the fatigue phenomenon. The experimental data of Al 7075-T6511 notched specimens and the model-predicted results were compared, indicating the high potential of the proposed approach for fatigue evaluation under combined notch and size effects.
An experimental study is performed on probabilistic models for the long fatigue crack growth rates (da/dN) of LZ50 axle steel. An equation for the crack growth rate was derived to account for the trend of the stress intensity factor range approaching the threshold and for the average stress effect. The probabilistic models were then built on this equation. They consist of the probabilistic da/dN-ΔK relations, the confidence-based da/dN-ΔK relations, and the probabilistic- and confidence-based da/dN-ΔK relations. These respectively characterize the effects on probabilistic assessments of the scattering regularity of the test data, the sample size, and both of them. The relations provide a wide range of choices for practical use. Analysis of the LZ50 steel test data indicates that the present models are applicable and feasible.
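A minimal sketch of a probabilistic da/dN-ΔK relation is given below: the Paris law is fitted in log-log space by least squares and a normal scatter band is attached to the residuals, so that growth-rate curves for chosen survival probabilities can be read off. The synthetic data stand in for the LZ50 measurements, and the confidence-based relations of the paper are not reproduced.

```python
# Hedged sketch: probabilistic scatter band around a fitted Paris law.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
dK = np.linspace(12.0, 40.0, 60)                      # MPa*sqrt(m)
true_C, true_m = 3e-12, 3.2
dadN = true_C * dK**true_m * 10**rng.normal(0.0, 0.08, dK.size)  # lognormal scatter

x, y = np.log10(dK), np.log10(dadN)
m_hat, logC_hat = np.polyfit(x, y, 1)                 # slope, intercept
resid = y - (logC_hat + m_hat * x)
s = resid.std(ddof=2)

def dadN_percentile(dK_val, p):
    """da/dN curve below which a fraction p of the scatter lies."""
    z = stats.norm.ppf(p)
    return 10 ** (logC_hat + m_hat * np.log10(dK_val) + z * s)

print(f"fitted m = {m_hat:.2f}, C = {10**logC_hat:.2e}")
print("median and upper 95% scatter-band da/dN at ΔK = 25:",
      dadN_percentile(25.0, 0.5), dadN_percentile(25.0, 0.95))
```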
The potential for devastating earthquakes in the Himalayan orogen has long been recognized. The 2015 MW 7.8 Gorkha, Nepal earthquake has heightened the likelihood that major earthquakes will occur along this orogenic belt in the future. Reliable seismic hazard assessment is a critical element in the development of policy for seismic hazard mitigation and risk reduction. In this study, we conduct probabilistic seismic hazard assessment using three different seismogenic source models (smoothed gridded, linear, and areal sources) based on the complicated tectonics of the study area. Two sets of ground motion prediction equations are combined in a standard logic tree, taking into account the epistemic uncertainties in hazard estimation. Long-term slip rates and paleoseismic records are also incorporated in the linear source model. Peak ground acceleration and spectral acceleration at 0.2 s and 1.0 s for 2% and 10% probabilities of exceedance in 50 years are estimated. The resulting maps show significant spatial variation in seismic hazard levels. The region of the Lesser Himalaya is found to have high seismic hazard potential. Along the Main Himalayan Thrust from east to west beneath the Main Central Thrust, large earthquakes have occurred regularly in history; hazard values in this region are found to be higher than those shown on existing hazard maps. In essence, the combination of long-span earthquake catalogs and multiple seismogenic source models gives improved seismic hazard constraints in Nepal.
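For orientation, the snippet below walks through the standard Cornell-McGuire hazard integral for a single source: a truncated Gutenberg-Richter magnitude distribution is combined with a ground-motion model with lognormal aleatory variability to obtain annual exceedance rates and the probability of exceedance in 50 years. The attenuation coefficients, activity rate and source-site distance are made-up placeholders, not the source models or logic tree used in the study.

```python
# Toy probabilistic seismic hazard calculation for a single source.
import numpy as np
from scipy.stats import norm

nu = 0.05            # annual rate of events with M >= Mmin in the source (assumed)
Mmin, Mmax, b_val = 5.0, 8.5, 1.0
R_km = 60.0          # source-to-site distance in km (assumed)
sigma_ln = 0.6       # aleatory standard deviation of ln(PGA)

mags = np.linspace(Mmin, Mmax, 200)
dm = mags[1] - mags[0]
beta = b_val * np.log(10.0)
# Truncated exponential (Gutenberg-Richter) magnitude density.
pdf_m = beta * np.exp(-beta * (mags - Mmin)) / (1.0 - np.exp(-beta * (Mmax - Mmin)))

def median_ln_pga(M, R):
    """Illustrative attenuation form, NOT a published GMPE."""
    return -3.5 + 1.0 * M - 1.3 * np.log(R + 10.0)

pga_grid = np.logspace(-2, 0, 50)        # 0.01 g to 1 g
lam = np.empty_like(pga_grid)            # annual exceedance rate
for i, a in enumerate(pga_grid):
    p_exceed = norm.sf((np.log(a) - median_ln_pga(mags, R_km)) / sigma_ln)
    lam[i] = nu * np.sum(p_exceed * pdf_m) * dm

poe_50yr = 1.0 - np.exp(-lam * 50.0)     # Poissonian probability of exceedance
i10 = np.argmin(np.abs(poe_50yr - 0.10))
print(f"PGA with ~10% probability of exceedance in 50 yr: {pga_grid[i10]:.3f} g")
```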
Online automatic fault diagnosis in industrial systems is essential for guaranteeing safe, reliable and efficient operations. However, difficulties associated with computational overload, ubiquitous uncertainties and insufficient fault samples hamper the engineering application of intelligent fault diagnosis technology. To address these problems, this paper introduces the dynamic uncertain causality graph method, a new attempt to model complex behaviors of real-world systems under uncertainties. The visual representation of causality pathways and the self-reliant "chaining" inference mechanism are analyzed. In particular, solutions are investigated for the diagnostic reasoning algorithm that aim at reducing its computational complexity and improving its robustness to potential losses and imprecision in observations. To evaluate the effectiveness and performance of this method, experiments are conducted using both synthetic calculation cases and generator faults of a nuclear power plant. The results demonstrate high diagnostic accuracy and efficiency, suggesting the method's practical significance in large-scale industrial applications.
In open pit mining, uncontrolled block instabilities have serious social, economic and regulatory consequences, such as casualties, disruption of operation and increased regulation difficulties. For this reason, the bench face angle, as one of the controlling parameters associated with block instabilities, should be carefully designed for sustainable mining. This study introduces a discrete fracture network (DFN)-based probabilistic block theory approach for the fast design of the bench face angle. A major advantage is the explicit incorporation of discontinuity size and spatial distribution in the procedure of key block testing. The proposed approach was applied to a granite mine in China. First, DFN models were generated from a multi-step modeling procedure to simulate the complex structural characteristics of pit slopes. Then, a modified key block searching method was applied to the modeled slope faces, and a cumulative probability of failure was obtained for each sector. Finally, a bench face angle was determined commensurate with an acceptable risk level of stability. The simulation results show that the number of hazardous traces exposed on the slope face can be significantly reduced when the suggested bench face angle is adopted, indicating an extremely low risk of uncontrolled block instabilities.
Background: With mounting global environmental, social and economic pressures, the resilience and stability of forests, and thus the provisioning of vital ecosystem services, is increasingly threatened. Intensified monitoring can help to detect ecological threats and changes earlier, but monitoring resources are limited. Participatory forest monitoring with the help of "citizen scientists" can provide additional resources for forest monitoring and at the same time help to communicate with stakeholders and the general public. Examples of citizen science projects in the forestry domain can be found, but a solid, applicable larger framework to utilise public participation in the area of forest monitoring seems to be lacking. We propose that a better understanding of shared and related topics in citizen science and forest monitoring might be a first step towards such a framework. Methods: We conduct a systematic meta-analysis of 1015 publication abstracts addressing "forest monitoring" and "citizen science" in order to explore the combined topical landscape of these subjects. We employ 'topic modelling', an unsupervised probabilistic machine learning method, to identify latent shared topics in the analysed publications. Results: We find that large shared topics exist, but that these are primarily topics that would be expected in scientific publications in general. Common domain-specific topics are under-represented and indicate a topical separation of the two document sets on "forest monitoring" and "citizen science" and thus the represented domains. While topic modelling as a method proves to be a scalable and useful analytical tool, we propose that our approach could deliver even more useful data if a larger document set and full-text publications were available for analysis. Conclusions: We propose that these results, together with the observation of non-shared but related topics, point at under-utilised opportunities for public participation in forest monitoring. Citizen science could be applied as a versatile tool in forest ecosystem monitoring, complementing traditional forest monitoring programmes, assisting early threat recognition and helping to connect forest management with the general public. We conclude that our presented approach should be pursued further as it may aid the understanding and setup of citizen science efforts in the forest monitoring domain.
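The topic-modelling step can be sketched in a few lines with latent Dirichlet allocation over a bag-of-words representation, as below; the handful of toy documents stands in for the 1015 abstracts analysed in the paper.

```python
# Small sketch of unsupervised topic modelling with LDA on toy documents.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "citizen science volunteers collect forest observation data",
    "remote sensing supports long term forest monitoring programmes",
    "public participation improves ecological monitoring coverage",
    "machine learning classifies satellite images of forest change",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```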
Because of their randomness and uncertainty, the integration of large-scale wind farms into a power system will exert significant influences on the distribution of power flow. This paper uses the polynomial normal transformation method to handle the correlation of non-normal random variables, and solves the probabilistic load flow based on the Kriging method. Kriging is a minimum-variance unbiased estimation method that estimates unknown quantities through a weighted linear combination of points within a confidence scope. Compared with traditional approaches, which require a large number of calculations, long simulation times, and large memory space, the Kriging method can rapidly estimate node state variables and the distribution of branch power flows. A wind farm connected as a generator node in the western Yunnan power grid is chosen for empirical analysis, with the actual turbine output modeled through PSD-BPA. The results are compared with an accurate Monte Carlo-based solution, which proves the validity and accuracy of the wind farm power model.
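The Kriging idea behind this entry can be sketched as a Gaussian-process surrogate: a few expensive power-flow evaluations train the surrogate, which is then sampled cheaply in place of full Monte Carlo. The one-dimensional "power flow" below is a toy analytic stand-in, not PSD-BPA or the western Yunnan grid model.

```python
# Hedged sketch: Kriging (Gaussian-process) surrogate for probabilistic load flow.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)

def branch_loading(wind_mw):
    """Toy nonlinear response of a branch flow to wind injection (placeholder)."""
    return 0.6 * wind_mw + 12.0 * np.sin(wind_mw / 40.0)

# A small design of "expensive" power-flow runs used to train the surrogate.
x_train = np.linspace(0.0, 200.0, 12)[:, None]
y_train = branch_loading(x_train.ravel())

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=50.0),
                              alpha=1e-6, normalize_y=True)
gp.fit(x_train, y_train)

# Cheap Monte Carlo through the surrogate with a non-normal (Weibull) wind input.
wind = np.minimum(200.0 * rng.weibull(2.0, 20000), 200.0)   # capped at rated power
flow = gp.predict(wind[:, None])
print("mean branch loading (MW):", round(flow.mean(), 1))
print("95th percentile loading (MW):", round(np.percentile(flow, 95), 1))
```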
The recent outbreak of COVID-19 has caused millions of deaths worldwide and a huge societal and economic impact in virtually all countries. A large variety of mathematical models to describe the dynamics of COVID-19 transmission have been reported. Among them, Bayesian probabilistic models of COVID-19 transmission dynamics have been very efficient in the interpretation of early data from the beginning of the pandemic, helping to estimate the impact of non-pharmacological measures in each country, and forecasting the evolution of the pandemic in different potential scenarios. These models use probability distribution curves to describe key dynamic aspects of the transmission, like the probability for every infected person of infecting other individuals, dying or recovering, with parameters obtained from experimental epidemiological data. However, the impact of vaccine-induced immunity, which has been key for controlling the public health emergency caused by the pandemic, has been more challenging to describe in these models, due to the complexity of experimental data. Here we report different probability distribution curves to model the acquisition and decay of immunity after vaccination. We discuss the mathematical background and how these models can be integrated in existing Bayesian probabilistic models to provide a good estimation of the dynamics of COVID-19 transmission during the entire pandemic period.
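The kind of probability distribution curves discussed here can be written down directly; the sketch below uses gamma distributions for the delay to peak protection and for the waning of protection, with parameter values chosen purely for exposition rather than taken from the paper.

```python
# Illustrative immunity-acquisition and immunity-waning curves (assumed parameters).
import numpy as np
from scipy.stats import gamma

days = np.arange(0, 361)

# Time from vaccination to effective protection (density, integrates to 1).
onset_pdf = gamma.pdf(days, a=4.0, scale=7.0)     # mode around three weeks

# Probability that protection is still present t days after it was acquired.
waning_sf = gamma.sf(days, a=3.0, scale=60.0)     # mean duration ~180 days

for t in (30, 90, 180, 270, 360):
    print(f"day {t:3d}: onset density {onset_pdf[t]:.4f}, "
          f"P(still protected) {waning_sf[t]:.2f}")
```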
Probabilistic model checking has been widely applied to the quantitative analysis of stochastic systems, e.g., analyzing the performance, reliability and survivability of computer and communication systems. In this paper, we extend the application of probabilistic model checking to vehicle-to-vehicle (V2V) networks. We first develop a continuous-time Markov chain (CTMC) model for the considered V2V network; after that, the PRISM language is adopted to describe the CTMC model, and continuous-time stochastic logic is used to describe the objective survivability properties. In the analysis, two typical failures are considered, namely node failure and link failure, induced respectively by external malicious attacks on a target V2V node and by interruption of a communication link. Considering these failures, their impacts on the network survivability are demonstrated. It is shown that with increasing failure strength, the network survivability is reduced. On the other hand, the network survivability can be improved with an increasing repair rate. The proposed probabilistic model checking-based approach can be effectively used in survivability analysis for V2V networks; moreover, it is anticipated that the approach can be conveniently extended to other networks.
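A minimal version of the CTMC analysis can be done without PRISM by building the generator matrix and computing the transient distribution with a matrix exponential, as sketched below for a hypothetical three-state V2V link; the rates are illustrative assumptions.

```python
# Minimal CTMC transient analysis: P(t) = P(0) * expm(Q t) for a toy V2V link
# with states [up, degraded, down]. Failure and repair rates are assumed.
import numpy as np
from scipy.linalg import expm

lam_node, lam_link, mu = 0.02, 0.05, 0.5   # per-hour rates (illustrative)

# Generator matrix Q: off-diagonal entries are transition rates, rows sum to zero.
Q = np.array([
    [-(lam_node + lam_link),  lam_link,            lam_node],
    [ mu,                    -(mu + lam_node),     lam_node],
    [ mu,                     0.0,                -mu      ],
])

p0 = np.array([1.0, 0.0, 0.0])             # start fully operational
for t in (1.0, 10.0, 100.0):
    pt = p0 @ expm(Q * t)
    print(f"t = {t:5.1f} h: P(up or degraded) = {pt[0] + pt[1]:.3f}")
```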
A simple probabilistic model for predicting crack growth behavior under random loading is presented. In the model, the parameters c and m in the Paris-Erdogan equation are taken as random variables, and their stochastic characteristic values are obtained through fatigue crack propagation tests on an offshore structural steel under constant amplitude loading. Furthermore, by using the Monte Carlo simulation technique, the fatigue crack propagation life to reach a given crack length is predicted. Tests are conducted to verify the applicability of the theoretical prediction of fatigue crack propagation.
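The Monte Carlo procedure described here is easy to sketch: sample c and m, integrate the Paris-Erdogan law from the initial to the critical crack length, and collect the resulting lives. The distributions, stress range and geometry factor below are illustrative assumptions, not the values identified from the offshore steel tests.

```python
# Hedged Monte Carlo sketch of Paris-law crack growth with random C and m.
import numpy as np

rng = np.random.default_rng(4)

a0, a_crit = 1e-3, 20e-3          # initial and critical crack lengths (m)
dsigma, Y = 80e6, 1.12            # stress range (Pa) and geometry factor (assumed)

def cycles_to_failure(C, m, n_steps=400):
    """Integrate dN = da / (C * dK(a)^m) from a0 to a_crit (trapezoid rule)."""
    a = np.linspace(a0, a_crit, n_steps)
    dK = Y * dsigma * np.sqrt(np.pi * a)          # Pa*sqrt(m)
    dNda = 1.0 / (C * dK**m)
    return np.sum(0.5 * (dNda[1:] + dNda[:-1]) * np.diff(a))

# Random Paris parameters: lognormal C and normal m (illustrative distributions).
n_sim = 5000
logC = rng.normal(np.log(1e-29), 0.3, n_sim)      # C in (m/cycle)/(Pa*sqrt(m))^m
m = rng.normal(3.0, 0.1, n_sim)
lives = np.array([cycles_to_failure(np.exp(lc), mm) for lc, mm in zip(logC, m)])

print(f"median life: {np.median(lives):.3e} cycles")
print(f"10th percentile life: {np.percentile(lives, 10):.3e} cycles")
```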
Based on interval mathematics and possibility theory, the variables existing in a hydraulic turbine blade are described. Considering the multiple failure modes of the turbine blade, a multi-variable model is established to match the actual situation. A non-probabilistic reliability index is then presented by comparing the output range with the given range.
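The non-probabilistic reliability index used in such interval approaches can be illustrated in a few lines: with resistance and load effect given as intervals, the margin is an interval too, and the ratio of its center to its radius measures how safely it stays above zero. The interval bounds below are hypothetical.

```python
# Sketch of a non-probabilistic (interval) reliability index, eta > 1 means
# the whole margin interval lies above zero. Interval bounds are illustrative.
def interval_reliability_index(R, S):
    R_lo, R_hi = R
    S_lo, S_hi = S
    M_lo, M_hi = R_lo - S_hi, R_hi - S_lo        # interval of the margin M = R - S
    center = 0.5 * (M_hi + M_lo)
    radius = 0.5 * (M_hi - M_lo)
    return center / radius

# Hypothetical blade strength and stress intervals (MPa).
eta = interval_reliability_index(R=(420.0, 520.0), S=(280.0, 380.0))
print(f"non-probabilistic reliability index eta = {eta:.2f}")
```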
In video multi-target tracking, the common particle filter cannot deal well with uncertain relations among multiple targets. To solve this problem, many researchers use data association methods to reduce the multi-target uncertainty. However, traditional data association methods have difficulty tracking accurately when a target is occluded. To handle occlusion in video, this paper combines the theory of data association with a probabilistic graphical model for multi-target modeling and analysis of the relationships among targets within the particle filter framework. Experimental results show that the proposed algorithm solves the occlusion problem better than the traditional algorithm.
The enhancement of radio frequency identification (RFID) technology to track and trace objects has attracted a lot of attention from the healthcare and supply chain industries. However, RFID systems do not always function reliably in complex and variable deployment environments. In many cases, RFID systems provide only probabilistic observations of object states. Thus, an approach to predict, record and track real-world object states based upon probabilistic RFID observations is required. The hidden Markov model (HMM) has been used in the field of probabilistic location determination, but the inherent duration probability density of a state in an HMM is exponential, which may be inappropriate for modeling object location transitions. Hence, in this paper, we put forward a hidden semi-Markov model (HSMM) based approach for probabilistic location determination and evaluate its performance against that of the HMM-based approach. The results show that the HSMM-based approach provides a more accurate determination of real-world object states based on observation data.
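The key modelling difference between HMM and HSMM state durations can be shown with a short simulation: the HMM dwell time is implicitly geometric, while the HSMM attaches an explicit duration distribution to each state. The parameters below are assumed for illustration.

```python
# Dwell-time comparison: implicit geometric durations (HMM) vs an explicit
# duration distribution (HSMM). Both are tuned to a mean dwell of 10 steps.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# HMM: self-transition probability 0.9 -> geometric dwell time, mode at 1 step.
hmm_dwell = rng.geometric(1.0 - 0.9, size=n)

# HSMM: explicit dwell-time distribution, e.g. 1 + Poisson(9), mode near 10.
hsmm_dwell = 1 + rng.poisson(9.0, size=n)

for name, d in (("HMM (geometric)", hmm_dwell), ("HSMM (explicit)", hsmm_dwell)):
    print(f"{name:16s} mean {d.mean():5.2f}  P(dwell <= 2) {np.mean(d <= 2):.3f}")
```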
The aim of this paper is to propose a theoretical approach for performing non-probabilistic reliability analysis of structures. Due to the large number of uncertainties and the limited measured data in engineering practice, the uncertain structural parameters were described as interval variables. The theoretical analysis model was developed starting from the 2-D plane and 3-D space. In order to avoid the loss of probable failure points, the 2-D plane and 3-D space were respectively divided into two parts and three parts for further analysis. The study pointed out that the probable failure points exist only among the extreme points and root points of the limit state function. Furthermore, the low-dimensional analytical scheme was extended to the high-dimensional case. Using the proposed approach, it is easy to find the most probable failure point and to acquire the reliability index directly through simple comparison. A number of equations used for calculating the extreme points and root points were also evaluated. This result is useful for avoiding the loss of probable failure points and meaningful for optimizing searches in this research field. Finally, two kinds of examples were presented and compared with existing computations. The good agreement shows that the proposed theoretical analysis approach is correct. These efforts improve the optimization method, indicate the search direction and path, and avoid searching only for a local optimal solution, which would result in missed probable failure points.
A Bayesian probabilistic prediction scheme for Yangtze River Valley (YRV) summer rainfall is proposed to combine forecast information from the multi-model ensemble dataset provided by the ENSEMBLES project. Due to the low forecast skill of rainfall in dynamical models, time series of regressed YRV summer rainfall are selected as ensemble members in the new scheme, instead of the commonly used YRV summer rainfall simulated by the models. Each time series of regressed YRV summer rainfall is derived from a simple linear regression. The predictor in each simple linear regression is a skillfully simulated circulation or surface temperature factor that is strongly linearly correlated with the observed YRV summer rainfall in the training set. The high correlation between the ensemble mean of these regressed rainfall series and the observations helps extract more sample information from the ensemble system. The results show that the cross-validated skill of the new scheme over the period 1960 to 2002 is much higher than that of the equally weighted ensemble, multiple linear regression, and a Bayesian ensemble with simulated YRV summer rainfall as ensemble members. In addition, the new scheme is also more skillful than reference forecasts (a random forecast at the 0.01 significance level for the ensemble mean and the climatology forecast for the probability density function).
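The regression-ensemble construction can be sketched with synthetic data: each predictor is turned into a regressed rainfall series by simple linear regression, and the members are then combined with likelihood-based weights. This is a crude stand-in for the paper's Bayesian scheme, and all numbers below are synthetic.

```python
# Sketch: ensemble of simple linear regressions combined with likelihood weights.
import numpy as np

rng = np.random.default_rng(6)
n_years, n_pred = 40, 4

rain = rng.normal(500.0, 80.0, n_years)                  # "observed" rainfall (mm)
# Synthetic circulation / surface-temperature indices loosely tied to the rainfall.
predictors = (rain[None, :] * rng.uniform(0.002, 0.01, (n_pred, 1))
              + rng.normal(0.0, 1.0, (n_pred, n_years)))

members, sigmas = [], []
for x in predictors:
    slope, intercept = np.polyfit(x, rain, 1)            # simple linear regression
    fit = intercept + slope * x
    members.append(fit)
    sigmas.append(np.std(rain - fit, ddof=2))
members, sigmas = np.array(members), np.array(sigmas)

# Gaussian log-likelihood of each member over the training period -> weights.
loglik = np.array([-0.5 * np.sum(((rain - f) / s) ** 2) - n_years * np.log(s)
                   for f, s in zip(members, sigmas)])
w = np.exp(loglik - loglik.max())
w /= w.sum()

ensemble_mean = w @ members
print("member weights:", np.round(w, 3))
print("correlation of weighted ensemble with observations:",
      round(np.corrcoef(ensemble_mean, rain)[0, 1], 2))
```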
This article presents the probabilistic modeling of hydrocarbon spills on the sea surface, using climatological data of oil spill trajectories produced by applying the Lagrangian model PETROMAR-3D. To achieve this goal, several computing and statistical tools were used to develop the probabilistic modeling solution based on the methodology of Guo. The solution was implemented using a database approach and the SQL language. A case study is presented, based on a hypothetical spill at a location inside the Exclusive Economic Zone of Cuba. Important outputs and products of the probabilistic modeling were obtained, which are very useful for decision-makers and operators in charge of responding to oil spill accidents and preparing contingency plans to minimize their effects. In order to study the relationship between the initial trajectory and the arrival of spilled hydrocarbons at the coast, a new approach is introduced as a perspective for future modeling work: it consists of storing in databases the direction of movement of the oil slick during the first 24 hours. The probabilistic modeling solution presented is of great importance for hazard studies of oil spills in Cuban coastal areas.