Sandy cobble soil exhibits pronounced heterogeneity. The assessment of the uncertainty surrounding its properties is crucial for the analysis of settlement characteristics resulting from volume loss during shield tunnelling. In this study, a series of probabilistic analyses of surface and subsurface settlements was conducted considering the spatial variability of the friction angle and reference stiffness modulus, under different volumetric block proportions (Pv) and tunnel volume loss rates (ηt). The non-intrusive random finite difference method was used to investigate the probabilistic characteristics of maximum surface settlement, width of the subsurface settlement trough, maximum subsurface settlement, and subsurface soil volume loss rate through Monte Carlo simulations. Additionally, a comparison between stochastic and deterministic analysis results is presented to underscore the significance of probabilistic analysis. Parametric analyses were subsequently conducted to investigate the impacts of the key input parameters of the random fields on the settlement characteristics. The results indicate that scenarios with higher Pv or greater ηt result in a higher dispersion of stochastic analysis results. Neglecting the spatial variability of soil properties and relying solely on the mean values of material parameters for deterministic analysis may result in an underestimation of surface and subsurface settlements. From a probabilistic perspective, deterministic analysis alone may prove inadequate in accurately capturing the volumetric deformation mode of the soil above the tunnel crown, potentially affecting the prediction of subsurface settlement.
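The reported trend, namely that a larger volume loss rate produces more scattered settlement predictions, can be reproduced with a minimal Monte Carlo sketch. The Gaussian (Peck-type) trough formula below is a textbook stand-in for the paper's random finite difference model, and every numeric value (tunnel radius, trough width, coefficient of variation) is an assumed illustration, not a value from the study:

```python
import math
import random
import statistics

random.seed(42)

def peck_smax(vol_loss, radius, trough_width):
    # Peck-type Gaussian trough: S_max = V_L * pi * R^2 / (sqrt(2*pi) * i)
    return vol_loss * math.pi * radius ** 2 / (math.sqrt(2.0 * math.pi) * trough_width)

def mc_settlement(n, vl_mean, cov, radius=3.0, trough_width=6.0):
    # Lognormal volume-loss samples with mean vl_mean and coefficient of variation cov
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(vl_mean) - 0.5 * sigma ** 2
    s = [peck_smax(random.lognormvariate(mu, sigma), radius, trough_width)
         for _ in range(n)]
    return statistics.mean(s), statistics.stdev(s)

mean_lo, sd_lo = mc_settlement(5000, 0.005, 0.2)  # eta_t = 0.5 %
mean_hi, sd_hi = mc_settlement(5000, 0.020, 0.2)  # eta_t = 2.0 %
```

As in the abstract, the higher-ηt scenario yields both a larger mean settlement and a larger spread across realizations.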
In order to describe and control the stress distribution and total deformation of bladed disk assemblies used in aeroengines, a highly efficient and precise probabilistic analysis method, called the extremum response surface method (ERSM), is developed based on previous deterministic analysis results obtained with a finite element model (FEM). In this work, many key nonlinear factors, such as the dynamic feature of the temperature load, the centrifugal force, and the boundary conditions, are taken into consideration in the model. The time-varying patterns of stress distribution and total deformation of the bladed disk assemblies are obtained from the deterministic analysis, and at the same time, the nodes with the largest deformation and stress are identified and taken as the input targets of the probabilistic analysis in a scientific and reasonable way. Not only their reliability, historical samples, extremum response surface (ERS), and cumulative probability distribution function but also their sensitivity and effect probability are obtained. The main factors affecting the stress distribution and total deformation of the bladed disk assemblies are investigated through a sensitivity analysis of the model. Finally, comparisons with the response surface method (RSM) and Monte Carlo simulation (MCS) show that this new approach is effective.
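The core idea of an extremum response surface, namely fitting a cheap surrogate to the extremum outputs of a few deterministic runs and then doing Monte Carlo on the surrogate instead of the expensive model, can be sketched in a few lines. The quadratic "max stress" function and all coefficients below are invented for illustration and do not come from the paper's FE model:

```python
import random

random.seed(0)

def true_extremum_response(x):
    # Hypothetical extremum response (e.g. max nodal stress) of an expensive model
    return 400.0 + 30.0 * x + 5.0 * x * x

def fit_quadratic(xs, ys):
    # Least-squares fit y ~ a + b*x + c*x^2 via the 3x3 normal equations
    n = len(xs)
    s = [sum(x ** k for x in xs) for k in range(5)]
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[n, s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = t[:]
    for i in range(3):                       # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            for k in range(3):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    coef = [0.0] * 3
    for i in (2, 1, 0):                      # back substitution
        coef[i] = (b[i] - sum(A[i][k] * coef[k] for k in range(i + 1, 3))) / A[i][i]
    return coef

# A handful of "deterministic FE runs" -> extremum responses -> surrogate
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [true_extremum_response(x) for x in xs]
a, b2, c = fit_quadratic(xs, ys)

# Cheap Monte Carlo on the surrogate instead of the full model
samples = [a + b2 * g + c * g * g for g in (random.gauss(0, 1) for _ in range(10000))]
```

Because the training points lie exactly on a quadratic, the fit recovers the coefficients, after which 10,000 surrogate evaluations cost essentially nothing compared with 10,000 finite element solves.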
The current safety factor method for evaluating earth embankment stability is not very rational, since the assessment of slope stability is really an uncertainty problem. In order to consider the random nature of this problem, a probabilistic analysis is introduced herein. Finally, the stability of a real beach earth embankment is analysed by means of the suggested probabilistic approach. It may be seen that the results of the analysis can represent a numerical assessment of the degree of seismic stability.
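The contrast between a single safety factor and a probability of failure can be made concrete with a small Monte Carlo sketch. The infinite-slope factor-of-safety formula is a standard textbook expression; the soil statistics and slope geometry below are assumed purely for illustration:

```python
import math
import random

random.seed(1)

def infinite_slope_fs(c, phi_deg, gamma=18.0, depth=5.0, beta_deg=30.0):
    # Infinite-slope factor of safety:
    # FS = (c + gamma*z*cos^2(beta)*tan(phi)) / (gamma*z*sin(beta)*cos(beta))
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    num = c + gamma * depth * math.cos(beta) ** 2 * math.tan(phi)
    den = gamma * depth * math.sin(beta) * math.cos(beta)
    return num / den

n = 20000
fails = sum(1 for _ in range(n)
            if infinite_slope_fs(random.gauss(10.0, 3.0),   # cohesion (kPa)
                                 random.gauss(28.0, 4.0))   # friction angle (deg)
            < 1.0)
pf = fails / n
```

The mean-value safety factor here is above 1.0, yet a non-negligible fraction of realizations still fails; this is exactly the information a deterministic safety factor hides.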
The stability of a slope is affected by a number of factors, some of which have not only random properties but also fuzzy characteristics. Therefore, the analysis of slope stability is really an uncertain problem. The customary safety factor does not, in reality, reflect stability scientifically, quantitatively, and practically. In order to obtain more practical results, slope stability is treated as a fuzzy random event for the evaluation of its fuzzy probability. Finally, the seismic stability of an existing coastal embankment is analyzed by means of the suggested fuzzy probabilistic method. It may be seen that the results of the analysis can more fully represent the numerical assessment of the degree of slope seismic stability.
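One common way to compute a fuzzy probability of failure is to replace the crisp indicator "FS < 1" with a membership function and average it over random realizations. The piecewise-linear membership and the safety-factor distribution below are assumed stand-ins, not the paper's functions:

```python
import random

random.seed(2)

def membership_failure(fs):
    # Piecewise-linear membership of the fuzzy event "slope fails":
    # certain failure below FS = 0.9, certain safety above FS = 1.1
    if fs <= 0.9:
        return 1.0
    if fs >= 1.1:
        return 0.0
    return (1.1 - fs) / 0.2

n = 20000
fs_samples = [random.gauss(1.2, 0.15) for _ in range(n)]  # random safety factor
fuzzy_pf = sum(membership_failure(fs) for fs in fs_samples) / n
crisp_pf = sum(1 for fs in fs_samples if fs < 1.0) / n
```

With these assumed numbers the fuzzy probability exceeds the crisp one, because realizations with safety factors just above 1.0 are counted as partially failed rather than fully safe.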
Person-borne improvised explosive devices (PBIEDs) are often used in terrorist attacks in Western countries. This study aims to predict the trajectories of PBIED fragments and the subsequent safety risks for people exposed to this hazard. An explosive field test with a typical PBIED, composed of a plastic explosive charge and steel nut enhancements, was performed to record initial fragment behaviour, including positions, velocities, and trajectory angles. These data were used to predict the full trajectories of PBIED fragments using a probabilistic analysis, in which the probability of fatality or serious injury was computed. Based on the results presented, many practical conclusions can be drawn, for instance regarding safe evacuation distances if a person were exposed to a suspected PBIED.
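Propagating measured initial fragment conditions through a trajectory model is the probabilistic step described above. The sketch below integrates a point-mass trajectory with quadratic air drag and samples launch velocity and angle; the fragment mass, drag coefficient, launch height, and input distributions are all assumed for illustration, not taken from the field test:

```python
import math
import random

random.seed(3)

def fragment_range(v0, angle_deg, mass=0.01, cd=0.5, area=1e-4, dt=1e-3):
    # Point-mass trajectory with quadratic drag, explicit Euler integration
    rho, g = 1.225, 9.81
    k = 0.5 * rho * cd * area / mass
    x, y = 0.0, 1.2                       # assumed waist-height launch point (m)
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    while y > 0.0:
        v = math.hypot(vx, vy)
        vx -= k * v * vx * dt
        vy -= (g + k * v * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# Monte Carlo over assumed launch-condition scatter
ranges = sorted(fragment_range(random.gauss(1500.0, 150.0),
                               random.uniform(-5.0, 15.0))
                for _ in range(200))
r95 = ranges[int(0.95 * len(ranges))]   # distance exceeded by ~5 % of fragments
```

A percentile of the simulated landing distances, rather than a single worst-case throw, is the kind of quantity that feeds an evacuation-distance recommendation.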
This paper mainly investigates the connectivity of unreliable sensor grid networks. We consider an unreliable sensor grid network with m×n nodes placed in a certain planar area A, and we assume that each node has an independent failure probability p and the same transmission range R. This paper presents a new method for calculating the connectivity probability of the network, which uses thorough mathematical methods to derive the relationship among the network connectivity probability, the probability that a node is "failed" (not active), the number of nodes, and the node transmission range in unreliable sensor networks. Our approach is more useful and efficient for the given problem and conditions. For example, the numerical results indicate that, for a 100×100 sensor network, if the node failure probability is bounded by 0.5%, then even with a small transmission range (such as R = 10), a very high connectivity probability (95.8%) can still be maintained. On the other hand, the simulation results show that achieving a high connectivity probability is entirely possible on unreliable sensor grid networks.
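The quantity being derived, i.e. the probability that all surviving nodes form a single connected component, is easy to estimate by simulation, which is how analytical results like the one above are usually cross-checked. The grid size, failure probabilities, and range below are illustrative, not the paper's 100×100 case:

```python
import random
from collections import deque

random.seed(4)

def grid_connected(m, n, p_fail, rng):
    # Nodes on an m x n unit grid survive with probability 1 - p_fail;
    # links exist between active nodes within Euclidean range rng.
    active = {(i, j) for i in range(m) for j in range(n)
              if random.random() >= p_fail}
    if not active:
        return False
    reach = int(rng)
    offsets = [(di, dj) for di in range(-reach, reach + 1)
               for dj in range(-reach, reach + 1)
               if (di, dj) != (0, 0) and di * di + dj * dj <= rng * rng]
    start = next(iter(active))
    seen, queue = {start}, deque([start])
    while queue:                        # BFS over the active nodes
        i, j = queue.popleft()
        for di, dj in offsets:
            nb = (i + di, j + dj)
            if nb in active and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return len(seen) == len(active)     # all active nodes in one component

trials = 200
p_conn_reliable = sum(grid_connected(20, 20, 0.05, 1.5) for _ in range(trials)) / trials
p_conn_flaky = sum(grid_connected(20, 20, 0.60, 1.5) for _ in range(trials)) / trials
```

Consistent with the abstract, a small failure probability leaves the grid almost surely connected, while heavy node failure destroys connectivity.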
Due to uncertainties in seismic pipeline damage and post-earthquake recovery processes, probabilistic characteristics such as the mean value, standard deviation, probability density function, and cumulative distribution function provide valuable information. In this study, a simulation-based framework to evaluate these probabilistic characteristics in water distribution systems (WDSs) during post-earthquake recovery is developed. The framework first calculates pipeline failure probabilities using seismic fragility models and then generates damage samples through quasi-Monte Carlo simulations with Sobol's sequence for faster convergence. System performance is assessed using a hydraulic model, and recovery simulations produce time-varying performance curves, where the dynamic importance of unrepaired damage determines repair sequences. Finally, the probabilistic characteristics of seismic performance indicators, resilience index, resilience loss, and recovery time are evaluated. The framework is applied to two benchmark WDSs with different layouts to investigate the probabilistic characteristics of their seismic performance and resilience. Application results show that the cumulative distribution function reveals the variations in resilience indicators for different exceedance probabilities, and there are dramatic differences among the recovery times corresponding to system performance recovery targets of 80%, 90%, and 100%.
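The convergence benefit of quasi-Monte Carlo sampling can be demonstrated with a low-discrepancy sequence in place of pseudo-random draws. The sketch below uses a Halton sequence (a simpler relative of the Sobol' sequence used in the paper) and a toy two-variable "resilience" function whose exact mean is known; the function itself is an invented stand-in for the hydraulic-model evaluation:

```python
import random

def halton(i, base):
    # Radical-inverse (van der Corput) sequence, the 1-D building block of Halton points
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def resilience(u1, u2):
    # Toy performance function of two uniform damage variables (illustrative only);
    # exact mean over [0,1]^2 is 1 - 0.3*0.5 - 0.2/3
    return 1.0 - 0.3 * u1 - 0.2 * u2 * u2

n = 4096
qmc_mean = sum(resilience(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)) / n

random.seed(5)
mc_mean = sum(resilience(random.random(), random.random()) for _ in range(n)) / n

exact = 1.0 - 0.3 * 0.5 - 0.2 / 3.0
```

For the same 4096 samples, the low-discrepancy estimate sits much tighter around the exact mean than plain Monte Carlo, which is the "faster convergence" claimed above.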
The constitutive model is essential for predicting the deformation and stability of rock-soil masses. The estimation of constitutive model parameters is a necessary and important task for the reliable characterization of mechanical behaviors. However, constitutive model parameters cannot be evaluated accurately from a limited amount of test data, resulting in uncertainty in the prediction of stress-strain curves. This paper proposes a Bayesian analysis framework to address this issue. It combines Bayesian updating with structural reliability and adaptive conditional sampling methods to assess the parameters of constitutive models. Based on triaxial and ring shear tests on shear zone soils from the Huangtupo landslide, a statistical damage constitutive model and a critical state hypoplastic constitutive model were used to demonstrate the effectiveness of the proposed framework. Moreover, the effects of parameter uncertainty in the damage constitutive model on landslide stability were investigated. Results show that reasonable assessments of the constitutive model parameters can be realized. The variability of stress-strain curves is strongly related to the model prediction performance. The estimation uncertainty of constitutive model parameters should not be ignored in landslide stability calculations. Our study provides a reference for uncertainty analysis and parameter assessment of constitutive models.
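Bayesian updating of a constitutive parameter from noisy test data can be sketched with a random-walk Metropolis sampler (a simpler alternative to the adaptive conditional sampling used in the paper). The hyperbolic stress-strain curve, the "true" modulus, and the noise level are all invented for illustration:

```python
import math
import random

random.seed(6)

def model(strain, e_mod):
    # Toy hyperbolic stress-strain curve (stand-in for the paper's damage /
    # hypoplastic models): stress = strain / (1/E + strain/s_ult)
    s_ult = 100.0
    return strain / (1.0 / e_mod + strain / s_ult)

# Synthetic "triaxial test" data generated with a true modulus of 5000
true_e, noise = 5000.0, 2.0
strains = [0.001 * k for k in range(1, 21)]
data = [model(s, true_e) + random.gauss(0, noise) for s in strains]

def log_post(e_mod):
    # Flat prior on (0, inf); Gaussian likelihood with known noise std
    if e_mod <= 0.0:
        return -math.inf
    return -sum((d - model(s, e_mod)) ** 2 for s, d in zip(strains, data)) / (2 * noise ** 2)

# Random-walk Metropolis sampling of the posterior of the modulus
e_cur, lp_cur, chain = 3000.0, log_post(3000.0), []
for _ in range(5000):
    e_prop = e_cur + random.gauss(0.0, 200.0)
    lp_prop = log_post(e_prop)
    if math.log(random.random()) < lp_prop - lp_cur:
        e_cur, lp_cur = e_prop, lp_prop
    chain.append(e_cur)
post_mean = sum(chain[1000:]) / len(chain[1000:])
```

The posterior mean recovers the generating modulus, and the spread of the chain is exactly the parameter uncertainty whose effect on stability the paper propagates forward.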
A landslide displacement (DLL) attenuation model has been developed using spectral intensity and the ratio of the critical acceleration coefficient to the ground acceleration coefficient. In the development of the model, a New Zealand earthquake record data set with magnitudes ranging from 5.0 to 7.2 within a source distance of 175 km is used. The model can be used to carry out deterministic landslide displacement analysis, and is readily extended to probabilistic seismic landslide displacement analysis. DLL attenuation models have also been developed using earthquake source terms, such as magnitude and source distance, that account for the effects of earthquake fault type, source type, and site conditions. Sensitivity analyses show that the predicted DLL values from the new models are close to those from the Romeo model, which was developed from an Italian earthquake record data set. The proposed models are also applied to an analysis of landslide displacements in the Wenchuan earthquake, and a comparison between the predicted and observed results shows that the proposed models are reliable and can be confidently used in mapping landslide potential.
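The dependence of sliding displacement on the critical-to-peak acceleration ratio that such models regress on can be illustrated with the widely cited Ambraseys and Menu (1988) relation; this is not the paper's New Zealand model, only a published formula of the same family, shown here to make the shape of the relationship concrete:

```python
import math

def newmark_disp_cm(ratio):
    # Ambraseys & Menu (1988): log10 D = 0.90 + 2.53*log10(1 - r) - 1.09*log10(r),
    # where r = k_c / k_max (critical over peak acceleration coefficient), D in cm
    if not 0.0 < ratio < 1.0:
        raise ValueError("ratio must lie in (0, 1)")
    return 10.0 ** (0.90 + 2.53 * math.log10(1.0 - ratio) - 1.09 * math.log10(ratio))

d_02, d_05, d_08 = (newmark_disp_cm(r) for r in (0.2, 0.5, 0.8))
```

Displacement falls off steeply as the critical acceleration approaches the peak ground acceleration, which is why this ratio is the natural predictor variable in DLL attenuation models.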
The failure to achieve the minimum design overlap between secant piles compromises the ability of a structure to perform as designed, resulting in water leakage or even ground collapse. To establish a more realistic simulation and provide guidelines for designing a safe and cost-effective secant-pile wall, a three-dimensional model of a secant pile, considering the geometric imperfections of the diameter and direction of the borehole, is introduced. An ultrasonic cross-hole test was performed during the construction of secant piles in a launching shaft in Beijing, China. Based on the test results, the statistical characteristics of the pile diameters and orientation parameters were obtained. By taking the pile diameter D, inclination angle β, and azimuth angle α as random variables, Monte Carlo simulations were performed to discuss the influence of different design parameters on the probability density functions of the overlap of secant piles. The obtained results show that the randomness of the inclination angle and pile diameter can be well described by a normal distribution, whereas the azimuth angle is more consistent with a uniform distribution. The integrity of the secant-pile wall can be overestimated without considering the uncertainty of geometric imperfections. The failure of the secant-pile wall increases substantially with increasing spatial variability in drilling inclination and diameter. A design flowchart for pile spacing under a target safety level is proposed to help engineers design a safe and economical pile wall.
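The Monte Carlo step described above, sampling D, β, and α and checking the remaining overlap at depth, can be sketched directly. The distributions follow the abstract's finding (normal diameter and inclination, uniform azimuth), but every numeric value below is an assumed illustration, not the Beijing test data:

```python
import math
import random

random.seed(7)

def overlap_at_depth(z, spacing, d_mean=0.85, d_std=0.02, inc_std_deg=0.3):
    # One Monte Carlo realization of the overlap between two adjacent piles at
    # depth z: normal inclination, uniform azimuth, normal diameter.
    def axis_drift():
        inc = math.radians(abs(random.gauss(0.0, inc_std_deg)))
        az = random.uniform(0.0, 2.0 * math.pi)
        r = z * math.tan(inc)
        return r * math.cos(az), r * math.sin(az)
    d1, d2 = random.gauss(d_mean, d_std), random.gauss(d_mean, d_std)
    (x1, y1), (x2, y2) = axis_drift(), axis_drift()
    centre_dist = math.hypot(spacing + x2 - x1, y2 - y1)
    return 0.5 * (d1 + d2) - centre_dist   # negative => piles no longer overlap

n = 10000
p_fail = {s: sum(1 for _ in range(n) if overlap_at_depth(20.0, s) <= 0.0) / n
          for s in (0.60, 0.75)}
```

Widening the design spacing sharply raises the probability of an overlap gap at depth, which is the trade-off the proposed design flowchart resolves against a target safety level.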
The cyclic stress-strain response (CSSR), Neuber's rule (NR), and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress-strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. Probabilistic analyses of local stress, local strain, and fatigue life are constructed based on first-order Taylor series expansions. Through the proposed method, fatigue reliability analysis can be accomplished.
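The first-order Taylor expansion referred to above is the classic FOSM (first-order second-moment) propagation: the mean of a response is approximated at the mean inputs, and its variance from the squared gradient times the input variance. The sketch below applies it to a Basquin-type S-N life curve with assumed coefficients and checks it against Monte Carlo:

```python
import math
import random

random.seed(8)

def life_log10(s, log_c=12.0, m=3.0):
    # Basquin-type S-N curve: N = C * S^(-m)  =>  log10 N = log10 C - m*log10 S
    return log_c - m * math.log10(s)

# FOSM: first-order Taylor expansion around the mean stress amplitude
s_mean, s_std = 200.0, 20.0
mu_fosm = life_log10(s_mean)
grad = -3.0 / (s_mean * math.log(10.0))   # d(log10 N)/dS at the mean
sd_fosm = abs(grad) * s_std

# Monte Carlo check of the first-order approximation
vals = [life_log10(max(random.gauss(s_mean, s_std), 1.0)) for _ in range(20000)]
mu_mc = sum(vals) / len(vals)
sd_mc = math.sqrt(sum((v - mu_mc) ** 2 for v in vals) / (len(vals) - 1))
```

For this mildly nonlinear curve the first-order mean and standard deviation of log-life agree closely with the sampled values, which is why a single Taylor expansion can replace a full simulation in the reliability analysis.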
This paper deals with the approximate solution of the Fredholm equation u − TKu = f of the second kind from a probabilistic point of view. With Wiener-type measures on the set of kernels and free terms, we determine statistical features of the approximation process, i.e., the most likely rate of convergence and the dominating individual behavior. The analysis is carried out for a Galerkin-like method.
In this paper, we propose a probabilistic method for analysing the collapse time of steel frame structures in a fire. The method considers the uncertainty of the influencing factors. Tornado diagrams are used for sensitivity analysis of the random variables. Structural analysis samples are selected by the Monte Carlo method, and the collapse times of the different structural samples are calculated by fire time-history analysis. A collapse time fragility curve is fitted to the calculated collapse times of the samples. A reliability index of the collapse time is used as a quantitative standard to evaluate the collapse performance of a steel frame in a fire. Finally, this method is applied to analyse the collapse time fragility of an eight-storey 3D steel frame structure under different compartment fire scenarios and fire protection levels. According to the collapse time fragility curve, the effects of the different fire scenarios and protection levels on the collapse resistance of the structure under fire are evaluated.
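Fitting a fragility curve to a set of simulated collapse times is commonly done with a lognormal model: estimate the median and logarithmic dispersion from the samples, then evaluate the normal CDF of log time. The synthetic collapse times below stand in for the paper's fire time-history results, and the fitting approach (method of moments on log times) is one standard choice, not necessarily the paper's:

```python
import math
import random

random.seed(9)

# Synthetic collapse times in minutes (stand-in for fire time-history analyses)
times = [random.lognormvariate(math.log(45.0), 0.25) for _ in range(100)]

# Method-of-moments lognormal fit: median theta and dispersion beta
logs = [math.log(t) for t in times]
mu = sum(logs) / len(logs)
beta = math.sqrt(sum((x - mu) ** 2 for x in logs) / (len(logs) - 1))
theta = math.exp(mu)

def fragility(t):
    # P(collapse time <= t) under the fitted lognormal model
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (beta * math.sqrt(2.0))))
```

By construction the fitted curve passes through 50% probability at the median collapse time, and reading it at a required evacuation time gives the kind of reliability statement the abstract describes.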
In this work, fragility analysis is performed to assess two groups of reinforced concrete structures. The first group is composed of buildings that implement three common design practices, namely fully infilled frames, a weak ground story, and short columns, applied during the design process of a reinforced concrete building. The structures of the second group vary according to the value of the behavior factor used to define the seismic forces as specified in design procedures. Most seismic design codes belong to the class of prescriptive procedures, where if certain constraints are fulfilled, the structure is considered safe. Prescriptive design procedures express the ability of the structure to absorb energy through inelastic deformation using the behavior factor. The basic objective of this work is to assess both groups of structures with reference to the limit-state probability of exceedance. Thus, four limit-state fragility curves are developed on the basis of nonlinear static analysis for both groups of structures. Moreover, the 95% confidence intervals of the fragility curves are also calculated, taking into account two types of random variables that influence structural capacity and seismic demand.
A novel approach named aligned mixture probabilistic principal component analysis (AMPPCA) is proposed in this study for fault detection in multimode chemical processes. In order to exploit within-mode correlations, the AMPPCA algorithm first estimates a statistical description for each operating mode by applying mixture probabilistic principal component analysis (MPPCA). As a comparison, a combined MPPCA is employed, where monitoring results are softly integrated according to the posterior probabilities of the test sample in each local model. To exploit the cross-mode correlations, which may be useful but are inadvertently neglected by separately maintained monitoring approaches, a global monitoring model is constructed by aligning all local models together. In this way, both within-mode and cross-mode correlations are preserved in the integrated space. Finally, the utility and feasibility of AMPPCA are demonstrated through a non-isothermal continuous stirred tank reactor and the TE benchmark process.
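The multimode-monitoring idea, fitting one statistical model per operating mode and flagging samples that are far from every mode, can be sketched with plain per-mode Gaussians and a Hotelling-style T² statistic. This is a deliberately simplified stand-in for MPPCA (no latent subspace, two variables, invented data), intended only to show the monitoring logic:

```python
import random

random.seed(10)

def fit_mode(samples):
    # Per-mode Gaussian description (stand-in for one local PPCA model):
    # mean vector and 2x2 sample covariance of the operating data
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in samples) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in samples) / (n - 1)
    return (mx, my), (sxx, sxy, syy)

def t2(point, mean, cov):
    # Hotelling-style T^2: squared Mahalanobis distance to the mode centre
    (mx, my), (sxx, sxy, syy) = mean, cov
    det = sxx * syy - sxy * sxy
    dx, dy = point[0] - mx, point[1] - my
    return (syy * dx * dx - 2.0 * sxy * dx * dy + sxx * dy * dy) / det

# Two operating modes with different centres (synthetic data)
mode1 = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(500)]
mode2 = [(random.gauss(8, 1), random.gauss(8, 1)) for _ in range(500)]
models = [fit_mode(mode1), fit_mode(mode2)]

def monitor(point):
    # A sample is healthy if it is close to at least one operating mode
    return min(t2(point, m, c) for m, c in models)

ok = monitor((0.2, -0.1))     # inside mode 1
fault = monitor((4.0, 4.0))   # far from both modes
```

A sample lying between the modes gets a large statistic from every local model and is flagged, which is the behaviour the mixture-of-local-models monitoring scheme formalizes.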
In recent years, the multimedia annotation problem has been attracting significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective search environment that lets users query their images more easily. In this paper, a semi-supervised learning based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it is often hard to obtain or create labeled images in large quantities while unlabeled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Furthermore, different image features with different magnitudes result in different performance for automatic image annotation. To this end, a Gaussian normalization method is utilized to normalize the different features extracted from effective image regions segmented by the normalized cuts algorithm, so as to preserve the intrinsic content of the images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed based on the expectation maximization (EM) algorithm to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model can significantly improve the performance of traditional PLSA for the task of automatic image annotation.
Potential sources are simplified as point sources or linear sources in current probabilistic seismic hazard analysis (PSHA) methods. However, the focal size of large earthquakes is considerable, and fault rupture attitudes may have a great influence on the seismic hazard of a site near the source. Under these circumstances, it is unreasonable to use the simplified potential source models in PSHA, so a potential rupture surface model is proposed in this paper. Adopting this model, we analyze the seismic hazard near the Chelungpu fault that generated the magnitude 7.6 Chi-Chi (Jiji) earthquake, and the following conclusions are reached: (1) this model is reasonable on the basis of the focal mechanism, especially for sites near potential earthquakes with large magnitudes; (2) the attitudes of potential rupture surfaces have a great influence on the results of probabilistic seismic hazard analysis and seismic zoning.
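Why the rupture-surface idealization matters near the fault can be shown with a distance calculation alone: a point source at focal depth and a dipping rupture plane give very different source-to-site distances, and hence different ground motions, for a near-fault site. The geometry and attenuation coefficients below are invented for illustration (a 2-D cross-section, not the paper's model):

```python
import math

def dist_point_segment(px, pz, ax, az, bx, bz):
    # Euclidean distance from point P to segment AB in a 2-D vertical section
    vx, vz = bx - ax, bz - az
    t = ((px - ax) * vx + (pz - az) * vz) / (vx * vx + vz * vz)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * vx), pz - (az + t * vz))

def median_pga(r_km, mag=7.6):
    # Generic attenuation shape with invented coefficients, only to show how
    # the source-to-site distance feeds the hazard integrand
    return math.exp(1.0 + 0.9 * mag - 1.3 * math.log(r_km + 10.0))

site_x = 2.0                        # site 2 km from the surface fault trace
r_point = math.hypot(site_x, 10.0)  # point source at an assumed 10 km focal depth

# Dipping rupture plane in section: top edge at 1 km, bottom at 15 km depth, dip 60 deg
x_bottom = (15.0 - 1.0) / math.tan(math.radians(60.0))
r_rupture = dist_point_segment(site_x, 0.0, 0.0, 1.0, x_bottom, 15.0)
```

For this near-fault site the closest distance to the rupture surface is a fraction of the point-source distance, so the simplified source model systematically underestimates the predicted ground motion, and therefore the hazard.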
The phenomenon of data explosion represents a severe challenge for the upcoming big data era. However, the current Internet architecture is insufficient for dealing with a huge amount of traffic, owing to the increase in redundant content transmission and the end-point-based communication model. Information-centric networking (ICN) is a paradigm for the future Internet that can be utilized to resolve the data explosion problem. In this paper, we focus on content-centric networking (CCN), one of the key candidate ICN architectures. CCN has been studied in various network environments with the aim of relieving network and server burden, especially through its name-based forwarding and in-network caching functionalities. This paper studies the effect of several caching strategies in the CCN domain from the perspective of network and server overhead. Thus, we comprehensively analyze the in-network caching performance of CCN under several popular cache replication methods (i.e., cache placement). We evaluate the performance with respect to well-known Internet traffic patterns that follow certain probabilistic distributions, such as the Zipf/Mandelbrot–Zipf distributions, and flash crowds. For the experiments, we developed an OPNET-based CCN simulator with a realistic Internet-like topology.
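The interaction between Zipf-distributed content popularity and in-network caching, which drives the results of studies like this one, can be sketched with a single LRU cache fed by Zipf requests. The catalogue size, Zipf exponent, and cache sizes below are assumed; the paper's OPNET simulator models a full topology rather than one node:

```python
import bisect
import random
from collections import OrderedDict

random.seed(11)

def zipf_probs(n_items, alpha=0.8):
    # Zipf popularity: p(k) proportional to 1 / k^alpha for rank k
    w = [1.0 / (k ** alpha) for k in range(1, n_items + 1)]
    tot = sum(w)
    return [x / tot for x in w]

def lru_hit_ratio(probs, cache_size, n_requests=50000):
    # Single in-network cache with least-recently-used replacement
    cum, acc = [], 0.0
    for p in probs:
        acc += p
        cum.append(acc)
    cache, hits = OrderedDict(), 0
    for _ in range(n_requests):
        item = bisect.bisect_left(cum, random.random())  # sample a Zipf rank
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # refresh recency on a hit
        else:
            cache[item] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)  # evict the least recently used item
    return hits / n_requests

probs = zipf_probs(1000)
hit_small = lru_hit_ratio(probs, 10)    # cache holds 1 % of the catalogue
hit_large = lru_hit_ratio(probs, 100)   # cache holds 10 %
```

Because Zipf popularity concentrates requests on a few hot items, even a cache holding a small fraction of the catalogue absorbs a disproportionate share of the traffic, which is the server-offload effect the caching strategies compete on.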
A Bayesian network approach is presented for the probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that the train encounters when circulating along a railway line, such as light and speed limit signals, tunnel or viaduct entries or exits, cuttings and embankments, acoustic signals received in the cabin, curves, etc. In addition, since human error is very relevant for safety evaluation, the automatic train protection (ATP) systems and the driver behaviour, including its time evolution, are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links, and the associated probability tables are automatically constructed based on the line data, which need to be carefully given. The conditional probability tables are reproduced by closed formulas, which facilitate the modelling and the sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about line safety and programming maintenance operations in order to optimize them and reduce maintenance costs substantially. The proposed methodology is illustrated by its application to several cases that include real lines, such as the Palencia–Santander and Dublin–Belfast lines.
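The basic Bayesian-network computation, combining conditional probability tables into joint and posterior probabilities, can be shown on a toy three-node chain. All probabilities below are invented for illustration; the paper's networks are far larger and generated automatically from the line data:

```python
# Minimal chain inspired by the railway setting: signal aspect (S) ->
# driver error (E) -> accident (A). All numbers are illustrative.
p_danger = 0.10                           # P(S = danger signal ahead)
p_error = {True: 0.05, False: 0.001}      # P(E = missed response | S)
p_accident = {True: 0.30, False: 0.0001}  # P(A | E)

def enumerate_joint():
    # Exact inference by full enumeration of the 2x2x2 joint distribution
    p_a = 0.0
    p_a_and_danger = 0.0
    for s in (True, False):
        ps = p_danger if s else 1.0 - p_danger
        for e in (True, False):
            pe = p_error[s] if e else 1.0 - p_error[s]
            contrib = ps * pe * p_accident[e]
            p_a += contrib
            if s:
                p_a_and_danger += contrib
    return p_a, p_a_and_danger / p_a      # P(A) and P(S = danger | A)

p_acc, p_danger_given_acc = enumerate_joint()
```

The posterior P(S = danger | A) shows how the network attributes observed risk back to individual line elements, which is what produces the sorted list of most dangerous elements mentioned above.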
The original internal flooding probabilistic safety analysis (PSA) study of the Krsko Nuclear Power Plant (a two-loop Pressurized Water Reactor (PWR) plant of Westinghouse design) was performed in the mid-1990s and was limited to reactor core damage risk (Level 1 PSA). In 2003, it was, together with other safety and hazard analyses, subject to the Periodic Safety Review (PSR). The PSR stated that methodological PSA approaches and guidelines had evolved during the past decade, and provided several observations concerning the area screening process, residual risk, and the treatment of plant damage states and the risk from radioactivity releases (i.e., Level 2 PSA). In order to address the PSR observations, an upgrade of the Krsko NPP internal flooding PSA was undertaken. The area screening process was revisited in order to cover the areas without automatic reactor trip equipment. The model was extended to Level 2. Residual risk was estimated at both Level 1 and Level 2, in terms of core damage frequency (CDF) and large early release frequency (LERF), respectively.
Funding: Supported by the Natural Science Foundation of Beijing Municipality (No. 8222004), China; the National Natural Science Foundation of China (No. 51978019); the Natural Science Foundation of Henan Province (No. 252300420445), China; the Doctoral Research Initiation Fund of Henan University of Science and Technology (No. 4007/13480062), China; the Henan Postdoctoral Foundation (No. 13554005), China; and the Joint Fund of the Science and Technology R&D Program of Henan Province (No. 232103810082), China.
Funding: Projects (51375032, 51175017, 51245027) supported by the National Natural Science Foundation of China.
Funding: Project supported by the National Natural Science Foundation of China.
Funding: This project is financially supported by the National Natural Science Foundation of China.
Funding: This work was supported by the Poland National Center for Research and Development, under grant DOB-BIO10/01/02/2019 within the Defence and Security Programme.
Funding: Supported by the National Natural Science Foundation of China (90412012), the Natural Science Foundation of Guangdong Province, and the Post-doctoral Science Foundation of China.
Abstract: This paper investigates the connectivity of unreliable sensor grid networks. We consider a grid network with mn nodes placed in a planar area A, assuming that each node fails independently with probability p and that all nodes share the same transmission range R. A new method for calculating the connectivity probability of the network is presented, which uses rigorous mathematical derivation to relate the network connectivity probability to the probability that a node is "failed" (not active), the number of nodes, and the transmission range. The approach is efficient for the given problem and conditions. For example, numerical results indicate that for a 100×100 sensor grid, if the node failure probability is bounded by 0.5%, a very high connectivity probability (95.8%) can be maintained even with a small transmission range (such as R = 10). Simulation results further show that achieving high connectivity probability is entirely possible on unreliable sensor grid networks.
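The connectivity question can also be explored numerically. The sketch below estimates, by Monte Carlo simulation over a small geometric grid graph, the probability that all surviving nodes form one connected component; the grid size, spacing, range, and failure probability are illustrative and are not the paper's analytical derivation.

```python
import random
from collections import deque

random.seed(0)

def grid_connect_prob(m, n, spacing, R, p_fail, trials=400):
    """Estimate the probability that the active nodes of an m x n grid
    (each failing independently with p_fail, radio range R) stay connected."""
    R2 = R * R
    connected = 0
    for _ in range(trials):
        active = [(i * spacing, j * spacing)
                  for i in range(m) for j in range(n)
                  if random.random() > p_fail]
        if not active:
            continue
        # BFS from the first active node over the geometric graph.
        seen, queue = {0}, deque([0])
        while queue:
            ux, uy = active[queue.popleft()]
            for v, (vx, vy) in enumerate(active):
                if v not in seen and (ux - vx) ** 2 + (uy - vy) ** 2 <= R2:
                    seen.add(v)
                    queue.append(v)
        connected += len(seen) == len(active)
    return connected / trials

# Small illustrative grid: 10 x 10 nodes, unit spacing, range 1.5, 1% failures.
print(grid_connect_prob(10, 10, 1.0, 1.5, 0.01))
```

With only 1% of nodes failing and a range that reaches diagonal neighbours, the estimated connectivity probability stays close to one, mirroring the qualitative conclusion of the abstract.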
Funding: National Key R&D Program of China under Grant No. 2022YFC3003600 and National Natural Science Foundation of China (NSFC) under Grant No. 51978023.
Abstract: Due to uncertainties in seismic pipeline damage and post-earthquake recovery processes, probabilistic characteristics such as the mean value, standard deviation, probability density function, and cumulative distribution function provide valuable information. In this study, a simulation-based framework is developed to evaluate these probabilistic characteristics in water distribution systems (WDSs) during post-earthquake recovery. The framework first calculates pipeline failure probabilities using seismic fragility models and then generates damage samples through quasi-Monte Carlo simulation with Sobol's sequence for faster convergence. System performance is assessed using a hydraulic model, and recovery simulations produce time-varying performance curves, where the dynamic importance of unrepaired damage determines repair sequences. Finally, the probabilistic characteristics of the seismic performance indicators, resilience index, resilience loss, and recovery time are evaluated. The framework is applied to two benchmark WDSs with different layouts to investigate the probabilistic characteristics of their seismic performance and resilience. Application results show that the cumulative distribution function reveals the variation of the resilience indicators for different exceedance probabilities, and that there are dramatic differences among the recovery times corresponding to system performance recovery targets of 80%, 90%, and 100%.
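The damage-sampling step can be sketched with a low-discrepancy sequence. The study uses Sobol's sequence; to keep this example self-contained, the stand-in below uses the simpler van der Corput/Halton construction, and the per-pipeline failure probabilities are hypothetical.

```python
def van_der_corput(k, base=2):
    """k-th element of the base-b van der Corput low-discrepancy sequence."""
    q, denom = 0.0, 1.0
    while k:
        k, rem = divmod(k, base)
        denom *= base
        q += rem / denom
    return q

# Hypothetical per-pipeline failure probabilities from a fragility model.
p_fail = [0.10, 0.25, 0.05, 0.40]
primes = [2, 3, 5, 7]  # one coprime base per pipeline (Halton construction)

# Each sample is a boolean damage state vector for the four pipelines.
N = 2000
samples = [
    [van_der_corput(k, b) < p for p, b in zip(p_fail, primes)]
    for k in range(1, N + 1)
]

# Empirical damage rates converge quickly to the target probabilities.
rates = [sum(s[i] for s in samples) / N for i in range(len(p_fail))]
print(rates)
```

The fast convergence of the empirical rates is exactly the property that motivates quasi-Monte Carlo sampling over plain pseudo-random draws in the framework.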
Funding: Supported by the Opening Fund of the Key Laboratory of Geological Survey and Evaluation of the Ministry of Education (No. GLAB 2024ZR03), the National Natural Science Foundation of China (No. 42407248), the Guizhou Provincial Basic Research Program (Natural Science) (No. QKHJC-[2023]-YB066), the Key Laboratory of Smart Earth (No. KF2023YB04-02), and the Fundamental Research Funds for the Central Universities.
Abstract: The constitutive model is essential for predicting the deformation and stability of rock and soil masses, and the estimation of constitutive model parameters is a necessary and important task for the reliable characterization of mechanical behaviour. However, constitutive model parameters cannot be evaluated accurately from a limited amount of test data, resulting in uncertainty in the prediction of stress-strain curves. This paper proposes a Bayesian analysis framework to address this issue. It combines Bayesian updating with structural reliability and adaptive conditional sampling methods to assess the parameters of constitutive models. Based on triaxial and ring shear tests on shear-zone soils from the Huangtupo landslide, a statistical damage constitutive model and a critical-state hypoplastic constitutive model were used to demonstrate the effectiveness of the proposed framework. Moreover, the effects of parameter uncertainty in the damage constitutive model on landslide stability were investigated. Results show that reasonable assessments of the constitutive model parameters can be realized, that the variability of the stress-strain curves is strongly related to the model's predictive performance, and that the estimation uncertainty of constitutive model parameters should not be ignored in landslide stability calculations. Our study provides a reference for uncertainty analysis and parameter assessment of constitutive models.
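A minimal sketch of Bayesian updating of a single constitutive parameter is shown below, using a simple grid approximation of the posterior. The exponential "damage-type" stress-strain law, its parameter values, and the synthetic observations are all invented for illustration; the paper itself uses structural reliability and adaptive conditional sampling rather than a grid.

```python
import math

# Hypothetical damage-type law: sigma = E * eps * exp(-theta * eps),
# with E known and theta the uncertain damage parameter.
E, true_theta, noise = 50.0, 8.0, 0.3
strains = [0.02, 0.05, 0.08, 0.12, 0.15]
# Synthetic "test data" (fixed offsets stand in for measurement noise).
obs = [E * e * math.exp(-true_theta * e) + d
       for e, d in zip(strains, [0.2, -0.1, 0.15, -0.2, 0.1])]

# Grid-based Bayesian updating: uniform prior on theta in [2, 14].
grid = [2 + 12 * i / 400 for i in range(401)]

def log_like(theta):
    """Gaussian log-likelihood of the observed stresses given theta."""
    return sum(-((o - E * e * math.exp(-theta * e)) ** 2) / (2 * noise ** 2)
               for e, o in zip(strains, obs))

w = [math.exp(log_like(t)) for t in grid]   # unnormalised posterior weights
Z = sum(w)
post_mean = sum(t * wi for t, wi in zip(grid, w)) / Z
print(f"posterior mean of theta ≈ {post_mean:.2f}")
```

The posterior concentrates near the value used to generate the data, and its spread quantifies exactly the parameter uncertainty whose effect on stability the abstract argues should not be ignored.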
Funding: Foundation for Research, Science and Technology of New Zealand (Nos. C05X0208 and C05X0301), and Major Project of Chinese National Programs for Fundamental Research and Development (973 Program, No. 2008CB425802).
Abstract: A landslide displacement (DLL) attenuation model has been developed using spectral intensity and the ratio of the critical acceleration coefficient to the ground acceleration coefficient. In developing the model, a New Zealand earthquake record data set with magnitudes ranging from 5.0 to 7.2 within a source distance of 175 km was used. The model can be used to carry out deterministic landslide displacement analysis and is readily extended to probabilistic seismic landslide displacement analysis. DLL attenuation models have also been developed using earthquake source terms, such as magnitude and source distance, that account for the effects of earthquake fault type, source type, and site conditions. Sensitivity analyses show that the predicted DLL values from the new models are close to those from the Romeo model, which was developed from an Italian earthquake record data set. The proposed models are also applied to an analysis of landslide displacements in the Wenchuan earthquake; a comparison between the predicted and observed results shows that the proposed models are reliable and can be confidently used in mapping landslide potential.
Funding: The authors thank the National Natural Science Foundation of China (Grant Nos. 51978040 and 51378054) and the National Basic Research Program of China (973 Program, No. 2015CB057800) for supporting this research.
Abstract: The failure to achieve the minimum design overlap between secant piles compromises the ability of the structure to perform as designed, resulting in water leakage or even ground collapse. To establish a more realistic simulation and provide guidelines for designing a safe and cost-effective secant-pile wall, a three-dimensional model of a secant pile is introduced that considers geometric imperfections in the diameter and direction of the borehole. An ultrasonic cross-hole test was performed during the construction of secant piles in a launching shaft in Beijing, China. Based on the test results, the statistical characteristics of the pile diameters and orientation parameters were obtained. Taking the pile diameter D, inclination angle β, and azimuth angle α as random variables, Monte Carlo simulations were performed to discuss the influence of different design parameters on the probability density functions of the overlap of secant piles. The results show that the randomness of the inclination angle and pile diameter can be well described by a normal distribution, whereas the azimuth angle is more consistent with a uniform distribution. The integrity of the secant-pile wall can be overestimated if the uncertainty of geometric imperfections is not considered, and the failure of the secant-pile wall increases substantially with increasing spatial variability in drilling inclination and diameter. A design flowchart for pile spacing under a target safety level is proposed to help engineers design a safe and economical pile wall.
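The Monte Carlo procedure can be sketched as follows: sample the diameter and inclination from normal distributions and the azimuth from a uniform distribution, as the abstract reports, then check whether two adjacent piles still overlap at a given depth. All numerical values (diameter, spacing, depth, standard deviations) are assumptions for illustration only.

```python
import math
import random

random.seed(1)

# Assumed design values (illustrative): 0.8 m piles at 0.6 m spacing.
D_MEAN, D_STD = 0.80, 0.01   # pile diameter ~ Normal
BETA_STD_DEG = 0.3           # inclination from vertical ~ Normal (deg)
SPACING, DEPTH = 0.60, 15.0  # centre-to-centre spacing (m), check depth (m)

def overlap_at_depth(z):
    """One sampled overlap (m) between two adjacent piles at depth z."""
    centres = []
    for x0 in (0.0, SPACING):
        beta = math.radians(abs(random.gauss(0.0, BETA_STD_DEG)))
        alpha = random.uniform(0.0, 2.0 * math.pi)  # azimuth ~ Uniform
        r = z * math.tan(beta)                      # horizontal drift
        centres.append((x0 + r * math.cos(alpha), r * math.sin(alpha)))
    (x1, y1), (x2, y2) = centres
    dist = math.hypot(x2 - x1, y2 - y1)
    D1, D2 = random.gauss(D_MEAN, D_STD), random.gauss(D_MEAN, D_STD)
    return (D1 + D2) / 2.0 - dist  # <= 0 means the piles no longer overlap

N = 20_000
p_no_overlap = sum(overlap_at_depth(DEPTH) <= 0.0 for _ in range(N)) / N
print(f"P(overlap failure at {DEPTH} m) ≈ {p_no_overlap:.3f}")
```

Repeating the estimate over a range of depths or spacings would trace out the kind of probability-of-failure curves from which a target-safety-level pile spacing can be read off.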
Abstract: The cyclic stress-strain response (CSSR), Neuber's rule (NR), and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress-strain method of low-cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analysis of local stress, local strain, and fatigue life is constructed based on first-order Taylor series expansions. With the proposed method, fatigue reliability analysis can be accomplished.
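The first-order Taylor propagation can be illustrated on a simple stress-life relation: the mean life is evaluated at the mean inputs, and the variance is assembled from finite-difference sensitivities weighted by the input standard deviations. The Basquin exponent and the input moments below are hypothetical, and the single-relation setup is a simplification of the paper's CSSR/NR/CSLR chain.

```python
import math

def life(sigma_a, sigma_f):
    """Basquin stress-life relation N = 0.5 * (sigma_a / sigma_f)**(1/b),
    with an illustrative fatigue strength exponent b."""
    b = -0.09
    return 0.5 * (sigma_a / sigma_f) ** (1.0 / b)

# Assumed means and standard deviations of the random inputs (MPa).
means = (300.0, 900.0)   # stress amplitude, fatigue strength coefficient
stds = (15.0, 45.0)

# First-order Taylor expansion: mean at the mean point, variance from
# central-difference sensitivities.
mu_N = life(*means)
var_N = 0.0
for i, (m, s) in enumerate(zip(means, stds)):
    hi, lo = list(means), list(means)
    hi[i] = m + 1e-3 * m
    lo[i] = m - 1e-3 * m
    dgdx = (life(*hi) - life(*lo)) / (2e-3 * m)  # sensitivity dN/dx_i
    var_N += (dgdx * s) ** 2

cov = math.sqrt(var_N) / mu_N
print(f"mean life ≈ {mu_N:.3e} cycles, COV ≈ {cov:.2f}")
```

The large life COV produced by only 5% input scatter shows why fatigue life must be treated probabilistically rather than through a single deterministic curve.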
Abstract: This paper deals with the approximate solution of the Fredholm equation u − TKu = f of the second kind from a probabilistic point of view. With Wiener-type measures on the set of kernels and free terms, we determine statistical features of the approximation process, i.e., the most likely rate of convergence and the dominating individual behaviour. The analysis is carried out for a Galerkin-like method.
Funding: Project supported by the National Natural Science Foundation of China (No. 51678358).
Abstract: In this paper, we propose a probabilistic method for analysing the collapse time of steel frame structures in fire, which considers the uncertainty of the influencing factors. Tornado diagrams are used for sensitivity analysis of the random variables. Structural analysis samples are selected by the Monte Carlo method, and the collapse times of the different structural samples are calculated by fire time-history analysis. A collapse-time fragility curve is fitted to the calculated collapse times of the samples, and a reliability index of the collapse time is used as a quantitative standard to evaluate the collapse performance of a steel frame in fire. Finally, the method is applied to analyse the collapse-time fragility of an eight-storey 3D steel frame structure under different compartment fire scenarios and fire protection levels. According to the collapse-time fragility curves, the effects of the different fire scenarios and protection levels on the collapse resistance of the structure under fire are evaluated.
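Fitting a collapse-time fragility curve can be sketched as below, assuming a lognormal form (a common choice for fragility curves; the abstract does not state the distribution used): estimate the mean and standard deviation of the log collapse times, then evaluate the normal CDF. The sample collapse times are invented for illustration.

```python
import math

# Hypothetical collapse times (min) from fire time-history analyses.
times = [38.2, 41.5, 44.0, 46.3, 48.1, 50.7, 53.9, 57.2, 61.0, 66.4]

# Lognormal fit: mu and beta are the mean and standard deviation of ln(t)
# (the maximum-likelihood estimates for lognormally distributed data).
logs = [math.log(t) for t in times]
mu = sum(logs) / len(logs)
beta = math.sqrt(sum((x - mu) ** 2 for x in logs) / len(logs))

def p_collapse(t):
    """Fragility: probability of collapse by time t, i.e. the standard
    normal CDF evaluated at (ln t - mu) / beta."""
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (beta * math.sqrt(2.0))))

median = math.exp(mu)
print(f"median collapse time ≈ {median:.1f} min, beta ≈ {beta:.2f}")
print(f"P(collapse within 45 min) ≈ {p_collapse(45.0):.2f}")
```

Evaluating `p_collapse` over a grid of times for each fire scenario and protection level would yield the family of fragility curves from which the collapse performance is compared.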
Abstract: In this work, fragility analysis is performed to assess two groups of reinforced concrete structures. The first group comprises buildings that implement three common design practices, namely fully infilled frames, weak ground storeys, and short columns, applied during the design process of a reinforced concrete building. The structures of the second group vary according to the value of the behavior factor used to define the seismic forces, as specified in design procedures. Most seismic design codes belong to the class of prescriptive procedures, in which a structure is considered safe if certain constraints are fulfilled; such procedures express the ability of the structure to absorb energy through inelastic deformation using the behavior factor. The basic objective of this work is to assess both groups of structures with reference to the limit-state probability of exceedance. Four limit-state fragility curves are therefore developed on the basis of nonlinear static analysis for both groups of structures. Moreover, the 95% confidence intervals of the fragility curves are calculated, taking into account two types of random variables that influence structural capacity and seismic demand.
Funding: Supported by the National Natural Science Foundation of China (61374140) and the Shanghai Pujiang Program (12PJ1402200).
Abstract: A novel approach named aligned mixture probabilistic principal component analysis (AMPPCA) is proposed in this study for fault detection in multimode chemical processes. To exploit within-mode correlations, the AMPPCA algorithm first estimates a statistical description for each operating mode by applying mixture probabilistic principal component analysis (MPPCA). As a comparison, a combined MPPCA is employed, in which monitoring results are softly integrated according to the posterior probabilities of the test sample in each local model. To exploit the cross-mode correlations, which may be useful but are inadvertently neglected by separately maintained monitoring approaches, a global monitoring model is constructed by aligning all local models together. In this way, both within-mode and cross-mode correlations are preserved in the integrated space. Finally, the utility and feasibility of AMPPCA are demonstrated on a non-isothermal continuous stirred-tank reactor and the TE benchmark process.
Funding: Supported by the National Program on Key Basic Research Project (No. 2013CB329502), the National Natural Science Foundation of China (No. 61202212), the Special Research Project of the Educational Department of Shaanxi Province of China (No. 15JK1038), and the Key Research Project of Baoji University of Arts and Sciences (No. ZK16047).
Abstract: In recent years, the multimedia annotation problem has attracted significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective search environment that lets users query their images more easily. In this paper, a semi-supervised-learning-based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it is often hard to obtain or create labelled images in large quantities while unlabelled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Because image features with different magnitudes yield different annotation performance, a Gaussian normalization method is used to normalize the features extracted from the image regions segmented by the normalized-cuts algorithm, so as to preserve the intrinsic content of the images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed, based on the expectation-maximization (EM) algorithm, to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model significantly improves on the performance of traditional PLSA for automatic image annotation.
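The asymmetric PLSA decomposition P(w|d) = Σ_z P(z|d) P(w|z) can be fitted with EM. The sketch below runs it on an invented toy term-document matrix (not image data); a mildly asymmetric initialisation is used to break the topic symmetry, which is a standard trick rather than the paper's procedure.

```python
# Tiny illustrative term-document counts n(d, w): 4 documents, 6 words,
# with documents 0-1 and documents 2-3 drawing on different vocabularies.
counts = [
    [4, 3, 2, 0, 0, 0],
    [3, 4, 3, 0, 1, 0],
    [0, 0, 1, 4, 3, 3],
    [0, 1, 0, 3, 4, 4],
]
D, W, Z = len(counts), len(counts[0]), 2

def normalise(v):
    s = sum(v)
    return [x / s for x in v]

# Mildly asymmetric initialisation of P(z|d); P(w|z) starts uniform.
p_z_d = [[0.6, 0.4], [0.55, 0.45], [0.45, 0.55], [0.4, 0.6]]
p_w_z = [[1.0 / W] * W for _ in range(Z)]

for _ in range(100):  # EM iterations
    new_zd = [[1e-12] * Z for _ in range(D)]
    new_wz = [[1e-12] * W for _ in range(Z)]
    for d in range(D):
        for w in range(W):
            if counts[d][w] == 0:
                continue
            # E-step: posterior P(z | d, w) proportional to P(z|d) P(w|z).
            post = normalise([p_z_d[d][z] * p_w_z[z][w] for z in range(Z)])
            for z in range(Z):  # M-step accumulators
                new_zd[d][z] += counts[d][w] * post[z]
                new_wz[z][w] += counts[d][w] * post[z]
    p_z_d = [normalise(row) for row in new_zd]
    p_w_z = [normalise(row) for row in new_wz]

# Dominant topic per document: the two document groups separate.
top = [max(range(Z), key=lambda z: p_z_d[d][z]) for d in range(D)]
print(top)
```

In the annotation setting, the same machinery links image-region features and caption words through the shared latent topics z, and P(w|d) for an unseen image ranks candidate annotations.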
Funding: Joint Seismological Science Foundation of China (104065) and Social Public Welfare Special Foundation of the National Research Institutes (2005DIB3J119).
Abstract: Potential seismic sources are simplified as point sources or line sources in current probabilistic seismic hazard analysis (PSHA) methods. However, the rupture dimensions of large earthquakes are considerable, and fault rupture attitudes may greatly influence the seismic hazard at a site near the source; under these circumstances, it is unreasonable to use simplified potential-source models in PSHA. A potential rupture surface model is therefore proposed in this paper. Adopting this model, we analyse the seismic hazard near the Chelungpu fault, which generated the magnitude-7.6 Chi-Chi (Jiji) earthquake, and reach the following conclusions: (1) the model is reasonable on the basis of focal mechanisms, especially for sites near potential earthquakes of large magnitude; and (2) the attitudes of potential rupture surfaces greatly influence the results of probabilistic seismic hazard analysis and seismic zoning.
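For contrast with the rupture-surface model, the classical point-source PSHA that the paper improves upon can be sketched as a sum over magnitude bins of a Gutenberg-Richter recurrence rate times an exceedance probability from an attenuation law. The recurrence parameters and attenuation coefficients below are hypothetical, chosen only to make the hazard curve computable.

```python
import math

def phi_comp(z):
    """Standard normal complementary CDF."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative point source: truncated Gutenberg-Richter recurrence between
# M_MIN and M_MAX, a hypothetical attenuation law, lognormal scatter.
A_RATE, B_VAL = 0.05, 1.0           # annual rate of M >= M_MIN, G-R b-value
M_MIN, M_MAX, DM = 5.0, 7.5, 0.1
R_KM, SIGMA = 20.0, 0.6             # source distance (km), sigma of ln PGA

def ln_pga_median(m, r):
    # Hypothetical attenuation: ln PGA(g) = -3.5 + 1.0*m - 1.2*ln(r + 10).
    return -3.5 + 1.0 * m - 1.2 * math.log(r + 10.0)

def annual_exceedance(a_g):
    """lambda(PGA > a) = sum over magnitude bins of rate * P(exceed | m, r)."""
    beta = B_VAL * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (M_MAX - M_MIN))
    lam, m = 0.0, M_MIN + DM / 2.0
    while m < M_MAX:
        # Truncated-exponential magnitude pdf times the bin width.
        f = beta * math.exp(-beta * (m - M_MIN)) / norm * DM
        z = (math.log(a_g) - ln_pga_median(m, R_KM)) / SIGMA
        lam += A_RATE * f * phi_comp(z)
        m += DM
    return lam

for a in (0.1, 0.2, 0.4):
    print(f"lambda(PGA > {a:.1f} g) = {annual_exceedance(a):.2e} /yr")
```

The rupture-surface model of the paper replaces the fixed distance `R_KM` with a distribution of site-to-rupture distances that depends on the rupture attitude, which is exactly where the two approaches diverge for near-source sites.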
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2014R1A1A2057796 and 2015R1D1A1A01059049).
Abstract: The phenomenon of data explosion represents a severe challenge for the upcoming big-data era, and the current Internet architecture is ill-suited to handling huge amounts of traffic, owing to the increase in redundant content transmission and its end-point-based communication model. Information-centric networking (ICN) is a paradigm for the future Internet that can be used to address the data explosion problem. In this paper, we focus on content-centric networking (CCN), one of the key candidate ICN architectures. CCN has been studied in various network environments with the aim of relieving network and server burden, especially through its name-based forwarding and in-network caching functionalities. This paper studies the effect of several caching strategies in a CCN domain from the perspective of network and server overhead, comprehensively analysing the in-network caching performance of CCN under several popular cache replication (i.e., cache placement) methods. We evaluate the performance with respect to well-known Internet traffic patterns that follow certain probabilistic distributions, such as the Zipf and Mandelbrot-Zipf distributions, and flash crowds. For the experiments, we developed an OPNET-based CCN simulator with a realistic Internet-like topology.
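One such caching experiment can be reproduced in miniature: draw requests from a Zipf popularity law and measure the hit ratio of an LRU content store. The catalogue size, Zipf exponent, and cache sizes are illustrative assumptions; the paper's actual evaluation uses an OPNET simulator, not this sketch.

```python
import bisect
import random
from collections import OrderedDict

random.seed(7)

# Zipf-distributed content popularity over a small catalogue; the exponent
# 0.8 is an assumed value of the kind reported for web request traces.
N_ITEMS, S = 1000, 0.8
weights = [1.0 / (i ** S) for i in range(1, N_ITEMS + 1)]
total = sum(weights)
cdf, acc = [], 0.0
for w in weights:
    acc += w / total
    cdf.append(acc)

def zipf_draw():
    """Draw one content index according to the Zipf popularity law."""
    return bisect.bisect_left(cdf, random.random())

def lru_hit_ratio(cache_size, n_requests=50_000):
    """Simulate an LRU content store and return its cache hit ratio."""
    cache, hits = OrderedDict(), 0
    for _ in range(n_requests):
        item = zipf_draw()
        if item in cache:
            cache.move_to_end(item)        # refresh recency on a hit
            hits += 1
        else:
            cache[item] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)  # evict the least recently used
    return hits / n_requests

for size in (10, 50, 200):
    print(f"cache = {size:4d} items  hit ratio ≈ {lru_hit_ratio(size):.3f}")
```

Even a cache holding a few percent of the catalogue captures a disproportionate share of requests under Zipf popularity, which is the effect in-network caching in CCN exploits.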
Abstract: A Bayesian network approach is presented for the probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that a train encounters when travelling along a railway line, such as light and speed-limit signals, tunnel or viaduct entries and exits, cuttings and embankments, acoustic signals received in the cab, curves, etc. In addition, since human error is highly relevant to safety evaluation, the automatic train protection (ATP) systems and the driver's behaviour and its evolution over time are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links, and the associated probability tables are constructed automatically from the line data, which must be given carefully. The conditional probability tables are reproduced by closed-form formulas, which facilitates modelling and sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about line safety and scheduling maintenance operations so as to optimize them and substantially reduce maintenance costs. The proposed methodology is illustrated by its application to several cases, including real lines such as the Palencia-Santander and Dublin-Belfast lines.
Abstract: The original internal-flooding probabilistic safety analysis (PSA) of the Krsko Nuclear Power Plant (a two-loop pressurized water reactor (PWR) plant of Westinghouse design) was performed in the mid-1990s and was limited to reactor core damage risk (Level 1 PSA). In 2003 it was, together with other safety and hazard analyses, subjected to a Periodic Safety Review (PSR). The PSR noted that methodological PSA approaches and guidelines had evolved during the past decade and provided several observations concerning the area screening process, residual risk, the treatment of plant damage states, and the risk from radioactivity releases (i.e., Level 2 PSA). To address the PSR observations, an upgrade of the Krsko NPP internal-flooding PSA was undertaken. The area screening process was revisited in order to cover areas without automatic reactor-trip equipment, and the model was extended to Level 2. Residual risk was estimated at both Level 1 and Level 2, in terms of core damage frequency (CDF) and large early release frequency (LERF), respectively.