Funding: Supported by the National Major Science and Technology Project, China (No. J2019-IV-0007-0075) and the Fundamental Research Funds for the Central Universities, China (No. JKF-20240036).
Abstract: To ensure the structural integrity of life-limiting components of aeroengines, Probabilistic Damage Tolerance (PDT) assessment is applied to evaluate the failure risk, as required by airworthiness regulations and military standards. The PDT method holds that defects such as machining scratches and service cracks exist in the tenon-groove structures of aeroengine disks. However, it is challenging to conduct PDT assessment because effective Probability of Detection (POD) models and anomaly distribution models are scarce. Through a series of Nondestructive Testing (NDT) experiments, a POD model of real cracks in tenon-groove structures is constructed for the first time by employing the Transfer Function Method (TFM). A novel anomaly distribution model is then derived from the POD model, instead of relying on the infeasible field-data accumulation method. Subsequently, a framework for calculating the Probability of Failure (POF) of the tenon-groove structures is established; both models exert a significant influence on the resulting POF.
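To make the POF workflow concrete, the following is a minimal Monte Carlo sketch of how an anomaly (initial crack size) distribution and a POD curve combine into a probability of failure; the log-logistic POD form, the exponential anomaly distribution, and the simple growth-to-critical-size rule are illustrative assumptions, not the TFM-based models constructed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pod(a, a50=0.8, beta=4.0):
    """Illustrative log-logistic POD curve: detection probability vs crack size a (mm)."""
    return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-9)) ** beta)

def fails_in_service(a0, a_crit=5.0, growth=3.0):
    """Placeholder damage-tolerance rule: a crack fails if it grows past a_crit."""
    return a0 * growth > a_crit

n = 200_000
a0 = rng.exponential(scale=0.5, size=n)          # assumed anomaly (initial crack) sizes, mm
detected = rng.random(n) < pod(a0)               # cracks removed at inspection
pof = np.mean(~detected & fails_in_service(a0))  # failure requires missed detection + growth to failure
print(f"estimated POF per feature: {pof:.2e}")
```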
Funding: The Young Investigator Group "Artificial Intelligence for Probabilistic Weather Forecasting" funded by the Vector Stiftung; funding from the Federal Ministry of Education and Research (BMBF) and the Baden-Württemberg Ministry of Science as part of the Excellence Strategy of the German Federal and State Governments.
Abstract: Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies for applying post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
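As a sketch of what statistical post-processing of an ensemble can look like, the snippet below fits a simple Gaussian ensemble model output statistics (EMOS)-style model, N(a + b·(ensemble mean), c + d·(ensemble variance)), by maximum likelihood. The synthetic data and the Gaussian predictive form are assumptions for illustration and do not reproduce the specific methods developed in the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# synthetic training data: raw ensemble forecasts (n_days x n_members) and observations
n, m = 500, 20
truth = rng.gamma(4.0, 100.0, size=n)                         # "true" GHI, W/m^2
ens = truth[:, None] * 1.1 + rng.normal(0, 60, (n, m)) + 30   # biased, under-dispersive ensemble
obs = truth + rng.normal(0, 40, n)

ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

def nll(p):
    a, b, c, d = p
    mu = a + b * ens_mean
    sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
    return -norm.logpdf(obs, mu, sigma).sum()

res = minimize(nll, x0=[0.0, 1.0, 100.0, 1.0], method="Nelder-Mead")
a, b, c, d = res.x
print("calibrated predictive mean/sd for first forecast case:",
      a + b * ens_mean[0], np.sqrt(c + d * ens_var[0]))
```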
Funding: Supported by the Basic Science Research Program through the National Natural Science Foundation of China (Grant No. 61867003).
Abstract: As the proportion of new energy increases, the traditional cumulant method (CM) produces significant errors when performing probabilistic load flow (PLF) calculations with large-scale wind power integrated. Considering wind speed correlation, a multi-scenario PLF calculation method that combines random sampling and segmented discrete wind farm power was proposed. Firstly, based on constructing discrete scenes of wind farms, the Nataf transform is used to handle the correlation between wind speeds. Then, the random sampling method determines the output probability of discrete wind power scenarios when wind speeds exhibit correlation. Finally, the PLF calculation results of each scenario are weighted and superimposed following the total probability formula to obtain the final power flow calculation result. Verified in the IEEE standard node system, the absolute percent errors (APE) for the mean and standard deviation (SD) of the node voltages and branch active power are all within 1%, and the average root mean square (AMSR) values of the probability curves are all less than 1%.
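The core of the Nataf transform step is sampling correlated non-Gaussian wind speeds through a Gaussian copula. The sketch below generates correlated Weibull wind speeds for two wind farms; the Weibull parameters and target correlation are assumed values, and the correction of the Gaussian correlation coefficient to match the target marginal correlation (the full Nataf step) is omitted for brevity.

```python
import numpy as np
from scipy.stats import norm, weibull_min

rng = np.random.default_rng(2)

rho_z = 0.7                               # correlation imposed in the Gaussian space (approximation)
cov = np.array([[1.0, rho_z], [rho_z, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

u = norm.cdf(z)                           # map standard normals to uniforms
# assumed Weibull marginals for the two wind farms: shape c, scale in m/s
v1 = weibull_min.ppf(u[:, 0], c=2.0, scale=8.0)
v2 = weibull_min.ppf(u[:, 1], c=2.2, scale=9.5)

print("sample correlation of wind speeds:", np.corrcoef(v1, v2)[0, 1])
```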
Funding: Supported by the Natural Science Foundation of Beijing Municipality (No. 8222004), China; the National Natural Science Foundation of China (No. 51978019); the Natural Science Foundation of Henan Province (No. 252300420445), China; the Doctoral Research Initiation Fund of Henan University of Science and Technology (No. 4007/13480062), China; the Henan Postdoctoral Foundation (No. 13554005), China; and the Joint Fund of Science and Technology R&D Program of Henan Province (No. 232103810082), China.
Abstract: Sandy cobble soil exhibits pronounced heterogeneity. The assessment of the uncertainty surrounding its properties is crucial for the analysis of settlement characteristics resulting from volume loss during shield tunnelling. In this study, a series of probabilistic analyses of surface and subsurface settlements was conducted considering the spatial variability of the friction angle and reference stiffness modulus, under different volumetric block proportions (Pv) and tunnel volume loss rates (ηt). The non-intrusive random finite difference method was used to investigate the probabilistic characteristics of the maximum surface settlement, width of the subsurface settlement trough, maximum subsurface settlement, and subsurface soil volume loss rate through Monte Carlo simulations. Additionally, a comparison between stochastic and deterministic analysis results is presented to underscore the significance of probabilistic analysis. Parametric analyses were subsequently conducted to investigate the impacts of the key input parameters of the random fields on the settlement characteristics. The results indicate that scenarios with higher Pv or greater ηt result in a higher dispersion of stochastic analysis results. Neglecting the spatial variability of soil properties and relying solely on the mean values of material parameters for deterministic analysis may result in an underestimation of surface and subsurface settlements. From a probabilistic perspective, deterministic analysis alone may prove inadequate in accurately capturing the volumetric deformation mode of the soil above the tunnel crown, potentially affecting the prediction of subsurface settlement.
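A minimal illustration of how spatially variable soil properties enter such a Monte Carlo study: the snippet samples a one-dimensional lognormal random field of friction angle with an exponential autocorrelation via Cholesky decomposition. The mean, coefficient of variation, and correlation length are assumed values, and a real analysis would feed each realization into the finite difference tunnel model rather than simply summarizing it.

```python
import numpy as np

rng = np.random.default_rng(3)

z = np.linspace(0.0, 20.0, 101)                # depth coordinates, m
theta = 4.0                                    # assumed vertical correlation length, m
corr = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)  # exponential autocorrelation
L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(z)))

mean_phi, cov_phi = 35.0, 0.10                 # assumed friction angle statistics (deg, COV)
sigma_ln = np.sqrt(np.log(1.0 + cov_phi**2))
mu_ln = np.log(mean_phi) - 0.5 * sigma_ln**2

n_sim = 1000
fields = np.exp(mu_ln + sigma_ln * (L @ rng.standard_normal((len(z), n_sim))))
print("per-realization minimum friction angle, 5th percentile:",
      np.percentile(fields.min(axis=0), 5))
```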
Abstract: Assessing the vulnerability of a platform is crucial in its design. In fact, the results obtained from vulnerability analyses provide valuable information, leading to precise design choices or corrective solutions that enhance the platform's chances of surviving different scenarios. Such scenarios can involve various types of threats that can affect the platform's survivability. Among these, blast waves impacting the platform's structure represent critical conditions that have not yet been studied in detail; that is, frameworks for vulnerability assessment that can deal with blast loading have not been presented yet. In this context, this work presents a fast-running engineering tool that can quantify the risk that a structure fails when it is subjected to blast loading from high explosive-driven threats detonating at various distances from the structure itself. The tool has been implemented in an in-house software package that calculates vulnerability to various impacting objects, and its capabilities have been shown through a simplified, yet realistic, case study. The novelty of this research lies in the development of an integrated computational environment capable of calculating the platform's vulnerability to blast waves without the need for running expensive finite element simulations. In fact, the proposed tool is fully based on analytical models integrated with a probabilistic approach for vulnerability calculation.
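A common building block for this kind of fast-running probabilistic vulnerability estimate is a fragility curve that maps a blast load metric to a failure probability. The sketch below evaluates a lognormal fragility in peak overpressure and combines it with a crude inverse-square decay of overpressure with stand-off distance; both the decay law and the fragility parameters are placeholders for illustration, not the analytical blast models used by the tool.

```python
import numpy as np
from scipy.stats import norm

def peak_overpressure(distance_m, p_ref=500.0, d_ref=5.0):
    """Placeholder decay law: peak overpressure (kPa) falling off with stand-off distance."""
    return p_ref * (d_ref / distance_m) ** 2

def failure_probability(p_kpa, median_kpa=150.0, beta=0.4):
    """Lognormal fragility: P(structural failure | peak overpressure)."""
    return norm.cdf(np.log(p_kpa / median_kpa) / beta)

for d in (5.0, 10.0, 20.0, 40.0):
    p = peak_overpressure(d)
    print(f"stand-off {d:5.1f} m -> overpressure {p:7.1f} kPa -> P(fail) = {failure_probability(p):.3f}")
```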
Funding: Support from the Key Program of the National Natural Science Foundation of China (No. 12232004) and the Training Program of the Sichuan Province Science and Technology Innovation Seedling Project (No. MZGC20230012) is acknowledged.
Abstract: The development of modern engineering components and equipment features large size, intricate shape, and long service life, which places greater demands on valid methods for fatigue performance analysis. Achieving a smooth transformation from the fatigue properties of small-scale laboratory specimens to the fatigue strength of full-scale engineering components has been a long-term challenge. In this work, two dominant factors impeding this transformation, the notch effect and the size effect, were studied experimentally through fatigue tests on notched specimens of Al 7075-T6511 (a very high-strength aviation alloy) at different scales. Fractography analyses identified evidence of the size effect on notch fatigue damage evolution. Accordingly, the Energy Field Intensity (EFI) approach, initially developed for multiaxial notch fatigue analysis, was improved by utilizing the volume ratio of the Effective Damage Zones (EDZs) for size effect correction. In particular, it was extended to a probabilistic model considering the inherent variability of the fatigue phenomenon. The experimental data of the Al 7075-T6511 notched specimens were compared with the model-predicted results, indicating the high potential of the proposed approach for fatigue evaluation under combined notch and size effects.
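For context on how a volume-based size correction enters probabilistic fatigue models, the snippet below evaluates the classical Weibull weakest-link relation, P_f = 1 - exp(-(V/V0)·(σ/σ0)^m), for specimens of different effective damage volumes. The Weibull parameters are assumed, and this generic relation stands in for, rather than reproduces, the EFI/EDZ formulation of the paper.

```python
import numpy as np

def weakest_link_pf(stress, volume, v0=100.0, sigma0=420.0, m=12.0):
    """Weibull weakest-link failure probability with a volume (size-effect) scaling term."""
    return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

stress = np.linspace(250.0, 450.0, 5)          # stress amplitude, MPa
for volume in (100.0, 1000.0):                 # effective damage volume, mm^3
    print(f"V = {volume:6.0f} mm^3:", np.round(weakest_link_pf(stress, volume), 3))
```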
Funding: The work was performed under research project no. 2023/51/D/ST10/01956, financed by the National Science Center, Poland.
Abstract: The study presents the results of over 30,000 numerical analyses on the stability of lava tubes under lunar conditions. The research considered random irregularities in cave geometry and their impact on stability, with a particular focus on the geometric characteristics of identified collapses. We propose a procedure for extracting the collapse areas and integrating it into the stability analysis results. The results were examined to assess the possibility of describing the geometric characteristics of collapses using commonly applied probability density distributions, such as the normal or lognormal distribution. Our aim is to facilitate future risk assessment of lunar caves. Such an assessment will be essential prior to robotically exploring caves beneath the lunar surface and can be extended to planetary caves beyond the Moon. Our findings indicate that several collapse characteristics can be represented by unimodal probability density distributions, which could significantly simplify the candidate selection process. Based on our results, we also highlight several key directions for future research and suggest implications for their future exploration.
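Checking whether a collapse characteristic is well described by a unimodal distribution typically comes down to fitting candidate distributions and testing goodness of fit. The sketch below fits a lognormal distribution to a synthetic sample of collapse widths and runs a Kolmogorov-Smirnov test; the data are simulated stand-ins, not the lunar collapse measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
collapse_width = rng.lognormal(mean=np.log(60.0), sigma=0.5, size=300)  # synthetic widths, m

# fit a lognormal with the location parameter fixed at zero
shape, loc, scale = stats.lognorm.fit(collapse_width, floc=0)
ks = stats.kstest(collapse_width, "lognorm", args=(shape, loc, scale))

print(f"fitted median ~ {scale:.1f} m, log-std ~ {shape:.2f}")
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```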
Funding: Financially supported by the Collaborative Research Fund (C5018-20GF) of the Research Grant Council (RGC) of the Hong Kong Special Administrative Region and the Shenzhen Science and Technology Innovation Commission Grant (KCXST20221021111203007).
Abstract: Pressure has been introduced into power systems owing to the intermittent and uncertain nature of renewable energy. As a result, energy resource aggregators are emerging in the electricity market to realize sustainable and economic advantages through distributed generation, energy storage, and demand response resources. However, resource aggregators face the challenge of dealing with the uncertainty of renewable energy generation and of setting appropriate incentives to exploit the substantial energy flexibility in the building sector. In this study, a risk-aware optimal dispatch strategy that integrates probabilistic renewable energy prediction and bi-level building flexibility engagements is proposed. A natural gradient boosting algorithm (NGBoost), which requires no prior knowledge of the uncertain variables, was adopted to develop a probabilistic photovoltaic (PV) forecasting model. The lack of suitable flexibility incentives is addressed by a novel interactive flexibility engagement scheme that takes into account building users' willingness and optimizes the building flexibility provision. The chance-constrained programming method was applied to manage the supply-demand balance of the resource aggregator and ensure risk-aware decision-making in power dispatch. The case study results show the strong economic and environmental performance of the proposed strategy. It leads to a win-win situation for different stakeholders, in which profit increases through a 13% load reduction and a 3% carbon emission reduction is achieved, while also revealing a trade-off between the economic benefits and the risk of supply shortage.
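The probabilistic PV forecast can be illustrated with the open-source ngboost package, which fits a full predictive distribution rather than a point estimate. The feature set, data, and hyperparameters below are assumptions, and the ngboost API (NGBRegressor, pred_dist, the params dictionary) is the library's interface as of recent releases rather than anything specified by the paper.

```python
import numpy as np
from ngboost import NGBRegressor
from ngboost.distns import Normal

rng = np.random.default_rng(5)

# synthetic features: clear-sky irradiance, cloud cover, temperature
n = 2000
X = np.column_stack([rng.uniform(0, 1000, n), rng.uniform(0, 1, n), rng.uniform(5, 35, n)])
pv = 0.8 * X[:, 0] * (1 - 0.7 * X[:, 1]) + rng.normal(0, 30, n)   # synthetic PV output, kW

model = NGBRegressor(Dist=Normal, n_estimators=300, learning_rate=0.03, verbose=False)
model.fit(X[:1500], pv[:1500])

dist = model.pred_dist(X[1500:])           # predictive Normal distribution per test point
mu, sigma = dist.params["loc"], dist.params["scale"]
print("first test point: mean %.1f kW, 90%% interval [%.1f, %.1f] kW"
      % (mu[0], mu[0] - 1.645 * sigma[0], mu[0] + 1.645 * sigma[0]))
```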
Abstract: Protocol Reverse Engineering (PRE) is of great practical importance in Internet security-related fields such as intrusion detection, vulnerability mining, and protocol fuzzing. For unknown binary protocols with fixed-length fields, the accurate identification of field boundaries has a great impact on the subsequent analysis and final performance. Hence, this paper proposes a new protocol segmentation method based on information-theoretic statistical analysis for binary protocols, formulating the field segmentation of unsupervised binary protocols as a probabilistic inference problem and modeling its uncertainty. Specifically, we design four related constructions between entropy changes and protocol field segmentation, introduce random variables, and construct joint probability distributions with traffic sample observations. Probabilistic inference is then performed to identify the possible protocol segmentation points. Extensive trials on nine common public and industrial control protocols show that the proposed method yields higher-quality protocol segmentation results.
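The entropy signal that drives such segmentation is easy to illustrate: for aligned fixed-length messages, compute the Shannon entropy of each byte offset over the traffic sample and look for large changes between neighbouring offsets. The message layout below (constant magic bytes, a counter field, random payload) and the change threshold are assumptions for demonstration, not the paper's four constructions or its inference model.

```python
import numpy as np

rng = np.random.default_rng(6)

# synthetic traffic: 1000 messages of 12 bytes = [2-byte magic][2-byte counter][8-byte payload]
n = 1000
msgs = np.zeros((n, 12), dtype=np.uint8)
msgs[:, 0:2] = [0xAB, 0xCD]                                  # constant header -> entropy ~ 0
msgs[:, 2:4] = np.array([(i >> 8, i & 0xFF) for i in range(n)], dtype=np.uint8)
msgs[:, 4:] = rng.integers(0, 256, (n, 8), dtype=np.uint8)   # random payload -> entropy ~ 8 bits

def offset_entropy(column):
    """Shannon entropy (bits) of the byte values observed at one message offset."""
    counts = np.bincount(column, minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

H = np.array([offset_entropy(msgs[:, j]) for j in range(msgs.shape[1])])
boundaries = [j for j in range(1, len(H)) if abs(H[j] - H[j - 1]) > 1.0]
print("per-offset entropy:", np.round(H, 2))
print("candidate field boundaries before offsets:", boundaries)
```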
Funding: Supported by the Opening Fund of the Key Laboratory of Geological Survey and Evaluation of the Ministry of Education (No. GLAB 2024ZR03); the National Natural Science Foundation of China (No. 42407248); the Guizhou Provincial Basic Research Program (Natural Science) (No. QKHJC-[2023]-YB066); the Key Laboratory of Smart Earth (No. KF2023YB04-02); and the Fundamental Research Funds for the Central Universities.
Abstract: The constitutive model is essential for predicting the deformation and stability of rock-soil mass. The estimation of constitutive model parameters is a necessary and important task for the reliable characterization of mechanical behaviors. However, constitutive model parameters cannot be evaluated accurately with a limited amount of test data, resulting in uncertainty in the prediction of stress-strain curves. This paper proposes a Bayesian analysis framework to address this issue. It combines Bayesian updating with the structural reliability and adaptive conditional sampling methods to assess the equation parameters of constitutive models. Based on triaxial and ring shear tests on shear zone soils from the Huangtupo landslide, a statistical damage constitutive model and a critical state hypoplastic constitutive model were used to demonstrate the effectiveness of the proposed framework. Moreover, the effects of the parameter uncertainty of the damage constitutive model on landslide stability were investigated. Results show that reasonable assessments of the constitutive model parameters can be achieved. The variability of stress-strain curves is strongly related to the model prediction performance. The estimation uncertainty of constitutive model parameters should not be ignored in landslide stability calculation. Our study provides a reference for uncertainty analysis and parameter assessment of constitutive models.
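The essence of such Bayesian parameter assessment can be shown with a toy example: infer one stiffness-like parameter of a simple hyperbolic stress-strain law from noisy test data using a random-walk Metropolis sampler. The hyperbolic law, prior bounds, and noise level are assumptions chosen for illustration; the paper's framework uses Bayesian updating with structural reliability and adaptive conditional sampling instead of plain Metropolis.

```python
import numpy as np

rng = np.random.default_rng(7)

def stress(strain, E):                        # toy hyperbolic constitutive law
    return E * strain / (1.0 + 10.0 * strain)

strain = np.linspace(0.005, 0.15, 20)
obs = stress(strain, E=60.0) + rng.normal(0, 0.15, strain.size)   # synthetic "test data"

def log_post(E, sigma=0.15):
    if not (10.0 < E < 200.0):                # uniform prior bounds on the parameter
        return -np.inf
    resid = obs - stress(strain, E)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, E = [], 40.0
for _ in range(20000):                        # random-walk Metropolis
    E_new = E + rng.normal(0, 2.0)
    if np.log(rng.random()) < log_post(E_new) - log_post(E):
        E = E_new
    samples.append(E)

post = np.array(samples[5000:])
print(f"posterior mean E = {post.mean():.1f}, "
      f"95% CI = [{np.percentile(post, 2.5):.1f}, {np.percentile(post, 97.5):.1f}]")
```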
Abstract: The published article titled "Comparison of Structural Probabilistic and Non-Probabilistic Reliability Computational Methods under Big Data Condition" [1] has been retracted from Structural Durability & Health Monitoring (SDHM), Vol. 16, No. 2, 2022, pp. 129–143.
Funding: Supported by Key Technology Research and Application of Online Control Simulation and Intelligent Decision Making for Active Distribution Network (5108-202218280A-2-377-XG).
Abstract: To address the increasing uncertainty of low-carbon generation in active distribution networks (ADNs) and the difficulty of distribution network security assessment, this paper proposes a two-phase scheduling model for flexible resources in the ADN based on probabilistic risk perception. First, a full-cycle probabilistic trend sequence is constructed from the source-load historical data; in the day-ahead scheduling phase, the response interval of the flexibility resources on the load and storage sides is optimized based on the probabilistic trend, with the probability of the security boundary as the security constraint and economy as the objective. Then, in the intraday phase, the core security and economic operation boundary of the ADN is screened in real time. From there, the model quantitatively senses the degree of threat to this boundary under the current source-load prediction information and identifies the strictly secure and low/high-risk time periods. Flexibility resources within the response interval are dynamically adjusted in real time, with a focus on high-risk periods, to cope with future core risks of the distribution grid. Finally, the improved IEEE 33-node distribution system is simulated to obtain the flexibility resource scheduling scheme on the load and storage sides. The scheduling results are evaluated from the perspectives of risk probability and flexible resource utilization efficiency, and the analysis shows that the proposed scheduling model can promote the consumption of low-carbon energy from wind and photovoltaic sources while reducing the operational risk of the distribution network.
基金Guangdong S&T Program(2024A1111120024)CMA Innovation and Development Fund(CXFZ2024J014)+3 种基金CMA Youth Innovation Team(CMA2024QN01)PRB Meteorological Open Research Fund(ZJLY202425-GD02)GBA Meteorological S&T Program(GHMA2024Y04)Guangzhou Meteorological Research Project(Z202401)。
Abstract: This study explores the initiation mechanisms of convective wind events, emphasizing their variability across different atmospheric circulation patterns. Historically, inadequate feature categorization within multi-faceted forecast models has led to suboptimal forecast efficacy, particularly for events under dynamically weak forcing conditions during the warm season. To improve the prediction accuracy of convective wind events, this research introduces a novel approach that combines machine learning techniques to identify varying meteorological flow regimes. Convective winds (CWs) are defined as wind speeds reaching or exceeding 17.2 m/s and severe convective winds (SCWs) as speeds surpassing 24.5 m/s. This study examines the spatial and temporal distribution of CW and SCW events from 2013 to 2021 and their circulation dynamics associated with three primary flow regimes: cold air advection, warm air advection, and quasi-barotropic conditions. Key circulation features are used as input variables to construct an effective weather system pattern recognition model. This model employs an Adaptive Boosting (AdaBoost) algorithm combined with Random Under-Sampling (RUS) to address the class imbalance issue, achieving a recognition accuracy of 90.9%. Furthermore, utilizing factor analysis and Support Vector Machine (SVM) techniques, three specialized and independent probabilistic prediction models are developed based on the variance in predictor distributions across the different flow regimes. By integrating the regime identification model with these prediction models, an enhanced comprehensive model is constructed. This advanced model autonomously identifies flow types and accordingly selects the most appropriate prediction model. Over a three-year validation period, the improved model outperformed the initially unclassified model in terms of prediction accuracy. Notably, for CWs and SCWs, the maximum Peirce Skill Score (PSS) increased from 0.530 and 0.702 to 0.628 and 0.726, respectively, and the corresponding maximum Threat Score (TS) improved from 0.087 and 0.024 to 0.120 and 0.026. These improvements were significant across all samples, with the cold air advection type showing the greatest enhancement owing to the significant spatial variability of each factor. Additionally, the model improved forecast precision by prioritizing thermal factors, which played a key role in modulating false alarm rates in the warm air advection and quasi-barotropic flow regimes. The results confirm the critical contribution of circulation feature recognition and segmented modeling to enhancing the adaptability and predictive accuracy of weather forecast models.
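The class-imbalance handling in the regime recognition step can be sketched with standard libraries: imbalanced-learn's RUSBoostClassifier combines random under-sampling with AdaBoost in one estimator. The synthetic imbalanced dataset and hyperparameters below are assumptions; the library choice is one reasonable way to realize the AdaBoost-plus-RUS idea and is not taken from the paper.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score
from imblearn.ensemble import RUSBoostClassifier

# imbalanced 3-class problem standing in for the three flow regimes
X, y = make_classification(n_samples=6000, n_features=12, n_informative=8,
                           n_classes=3, weights=[0.7, 0.2, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = RUSBoostClassifier(n_estimators=200, learning_rate=0.5, random_state=0)
clf.fit(X_tr, y_tr)

print("balanced accuracy:", round(balanced_accuracy_score(y_te, clf.predict(X_te)), 3))
```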
基金the sponsorship of the Key Technology for Geophysical Prediction of Ultra-Deep Carbonate Reservoirs(P24240)the National Natural Science Foundation of China(U24B2020)the National Science and Technology Major Project of China for New Oil and Gas Exploration and Development(Grant No.2024ZD1400102)。
Abstract: Fluid identification and anisotropic parameter characterization are crucial for shale reservoir exploration and development. However, the anisotropic reflection coefficient equation, based on the assumption of a transversely isotropic medium with a vertical axis of symmetry (VTI), involves numerous parameters to be inverted. This complexity reduces its stability and impacts the accuracy of seismic amplitude variation with offset (AVO) inversion results. In this study, a novel anisotropic equation that includes the fluid term and the Thomsen anisotropic parameters is rewritten, which reduces the equation's dimensionality and increases its stability. Additionally, the traditional Markov Chain Monte Carlo (MCMC) inversion algorithm exhibits a high rejection rate for random samples and relies on known parameter distributions such as the Gaussian distribution, limiting the algorithm's convergence and sample randomness. To address these limitations and evaluate the uncertainty of AVO inversion, the IADR-Gibbs algorithm is proposed, which combines the Independent Adaptive Delayed Rejection (IADR) algorithm with the Gibbs sampling algorithm. Grounded in Bayesian theory, the new algorithm introduces support points to construct a non-parametric proposal distribution and reselects rejected samples according to the Delayed Rejection (DR) strategy. Rejected samples are then added to the support points to update the proposal distribution function adaptively. The equation rewriting method and the IADR-Gibbs algorithm improve the accuracy and robustness of AVO inversion. The effectiveness and applicability of the proposed method are validated through synthetic gather tests and practical data applications.
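To illustrate the delayed rejection idea that IADR-Gibbs builds on, the snippet below implements a standard two-stage delayed-rejection Metropolis step for a one-dimensional target: when a wide first-stage proposal is rejected, a narrower second-stage proposal is tried with the Tierney-Mira acceptance probability. The Gaussian target and proposal scales are assumptions, and the adaptive, non-parametric proposal construction of the paper is not reproduced.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
log_pi = lambda x: norm.logpdf(x, loc=2.0, scale=1.0)       # toy target density

s1, s2 = 5.0, 1.0                                           # wide then narrow proposal scales
x, chain = 0.0, []
for _ in range(20000):
    y1 = x + rng.normal(0, s1)                              # stage 1: wide symmetric proposal
    a1 = min(1.0, np.exp(log_pi(y1) - log_pi(x)))
    if rng.random() < a1:
        x = y1
    else:                                                   # stage 2: delayed rejection
        y2 = x + rng.normal(0, s2)
        a1_rev = min(1.0, np.exp(log_pi(y1) - log_pi(y2)))  # prob. that a chain at y2 would accept y1
        num = np.exp(log_pi(y2)) * norm.pdf(y1, y2, s1) * (1.0 - a1_rev)
        den = np.exp(log_pi(x)) * norm.pdf(y1, x, s1) * (1.0 - a1)
        if den > 0 and rng.random() < min(1.0, num / den):
            x = y2
    chain.append(x)

post = np.array(chain[2000:])
print("posterior mean ~", round(post.mean(), 2), " sd ~", round(post.std(), 2))
```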
基金National Key R&D Program of China under Grant No.2022YFC3003600National Natural Science Foundation of China(NSFC)under Grant No.51978023。
Abstract: Due to uncertainties in seismic pipeline damage and post-earthquake recovery processes, probabilistic characteristics such as the mean value, standard deviation, probability density function, and cumulative distribution function provide valuable information. In this study, a simulation-based framework is developed to evaluate these probabilistic characteristics in water distribution systems (WDSs) during post-earthquake recovery. The framework first calculates pipeline failure probabilities using seismic fragility models and then generates damage samples through quasi-Monte Carlo simulation with Sobol's sequence for faster convergence. System performance is assessed using a hydraulic model, and recovery simulations produce time-varying performance curves, where the dynamic importance of unrepaired damage determines the repair sequence. Finally, the probabilistic characteristics of the seismic performance indicators, namely the resilience index, resilience loss, and recovery time, are evaluated. The framework is applied to two benchmark WDSs with different layouts to investigate the probabilistic characteristics of their seismic performance and resilience. Application results show that the cumulative distribution function reveals the variation of the resilience indicators for different exceedance probabilities, and there are dramatic differences among the recovery times corresponding to system performance recovery targets of 80%, 90%, and 100%.
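The damage-sampling step can be illustrated with SciPy's quasi-Monte Carlo tools: draw a scrambled Sobol' sequence, compare each coordinate against the corresponding pipeline failure probability, and obtain low-discrepancy damage scenarios. The failure probabilities below are placeholders; a real study would derive them from seismic fragility models and then push each scenario through a hydraulic solver.

```python
import numpy as np
from scipy.stats import qmc

# assumed per-pipeline failure probabilities (would come from fragility models)
p_fail = np.array([0.05, 0.12, 0.30, 0.08, 0.22, 0.15])

sampler = qmc.Sobol(d=len(p_fail), scramble=True, seed=9)
u = sampler.random_base2(m=12)                 # 2**12 = 4096 low-discrepancy points in [0,1)^d

damage = u < p_fail                            # True where a pipeline is damaged in a scenario
print("simulated failure rates :", np.round(damage.mean(axis=0), 3))
print("target probabilities    :", p_fail)
print("scenarios with >=2 damaged pipes:", int((damage.sum(axis=1) >= 2).sum()))
```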
基金supported by the National Natural Science Foundation of China(No.12472265)。
Abstract: High-Resolution (HR) data on flow fields are critical for accurately evaluating the aerodynamic performance of aircraft. However, acquiring such data through large-scale numerical simulations or wind tunnel experiments is highly resource-intensive. This paper proposes a FlowViT-Diff framework that integrates a Vision Transformer (ViT) with an enhanced denoising diffusion probabilistic model for the Super-Resolution (SR) reconstruction of HR flow fields from low-resolution inputs. It provides a quick initial prediction of the HR flow field by optimizing the ViT architecture, and incorporates this preliminary output as guidance within an enhanced diffusion model. The latter captures the Gaussian noise distribution during forward diffusion and progressively removes it during backward diffusion to generate the flow field. Experiments on various supercritical airfoils under different flow conditions show that FlowViT-Diff can robustly reconstruct the flow field across multiple levels of downsampling. It obtains more consistent global and local features than traditional SR methods, and achieves a 3.6-fold increase in training speed via transfer learning. Its flow field reconstruction accuracy is 99.7% under ultra-low downsampling. The results demonstrate that FlowViT-Diff not only exhibits effective flow field reconstruction capabilities, but also provides two reconstruction strategies, both of which show effective transferability.
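The forward-diffusion step that the framework's denoising model is trained to invert follows the standard DDPM closed form x_t = sqrt(alpha_bar_t)·x_0 + sqrt(1 - alpha_bar_t)·eps. The sketch below applies it to a toy 2-D "flow field" array with a linear beta schedule; the schedule values and the field are illustrative, and the ViT guidance and reverse (denoising) network of the paper are not shown.

```python
import numpy as np

rng = np.random.default_rng(10)

T = 1000
betas = np.linspace(1e-4, 0.02, T)             # linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)            # cumulative product alpha_bar_t

def q_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in closed form for a given timestep t."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps, eps

# toy normalized "flow field" (e.g., a pressure-coefficient map on a 64x64 grid)
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
field = np.exp(-4.0 * (x**2 + y**2))

for t in (0, 250, 999):
    noisy, _ = q_sample(field, t)
    print(f"t={t:4d}: signal scale {np.sqrt(alpha_bar[t]):.3f}, sample std {noisy.std():.3f}")
```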
基金Supported by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China(Grant No.24KJD110008)the National Natural Science Foundation of China(Grant No.12401469)。
Abstract: Given a graph F and a positive integer r, the size Ramsey number R(F, r) is defined as the smallest integer m such that there exists a graph G with m edges where every r-color edge coloring of G results in a monochromatic copy of F. Let P_n and C_n represent a path and a cycle on n vertices, respectively. In this paper, we establish that for sufficiently large n, R(P_n, P_n, P_n) < 772n. Furthermore, we demonstrate that for sufficiently large even integers n, R(P_n, P_n, C_n) ≤ 17093n. For sufficiently large odd integers n, we show that R(P_n, P_n, C_n) ≥ (7.5 − o(1))n.
基金funded by research grant OPUS no.2021/41/B/ST8/02432 entitled Probabilistic entropy in engineering computations sponsored by The National Science Center in Polandthe Institute of Structural Analysis of Poznan University of Technology in the framework of the internal research grant 0411/SBAD/0010.
Abstract: The analysis of the dynamics of surface girders is of great importance in the design of engineering structures such as steel welded bridge plane girders or concrete plate-column structures. This work is an extension of the classical deterministic problem of free vibrations of thin (Kirchhoff) plates. The main aim of this work is the study of stochastic eigenvibrations of thin (Kirchhoff) elastic plates resting on internal continuous and column supports by the Boundary Element Method (BEM). The work continues previous research related to the random approach in plate analysis using the BEM. The static fundamental solution (Green's function) is applied, coupled with a nonsingular formulation of the boundary and domain integral equations. These are derived using a modified and simplified formulation of the boundary conditions, in which there is no need to introduce the Kirchhoff forces on a plate boundary; the role of the Kirchhoff corner forces is played by the boundary elements placed close to a single corner. Internal column or linear continuous supports are introduced using the Bezine technique, where additional collocation points are introduced inside the plate domain. This allows for a significant simplification of the BEM computational algorithm. Polynomial approximations are applied in the Least Squares Method (LSM) recovery of the structural response. The probabilistic analysis employs three independent computational approaches: the semi-analytical method (SAM), the stochastic perturbation technique (SPT), and Monte Carlo simulations. Numerical investigations include the fundamental eigenfrequencies of an elastic, thin, homogeneous, and isotropic plate.
基金co-supported by the National Natural Science Foundation of China(No.12372045)the Guangdong Basic and Applied Basic Research Foundation,China(No.2023B1515120018)the Shenzhen Science and Technology Program,China(No.JCYJ20220818102207015).
Abstract: The increasing complexity of on-orbit tasks imposes great demands on the flexible operation of space robotic arms, prompting the development of space robots from single-arm manipulation to multi-arm collaboration. In this paper, a combined approach of Learning from Demonstration (LfD) and Reinforcement Learning (RL) is proposed for space multi-arm collaborative skill learning. The combination effectively resolves the trade-off between learning efficiency and solution feasibility in LfD, as well as the time-consuming pursuit of the optimal solution in RL. With the prior knowledge from LfD, space robotic arms can achieve efficient guided learning in a high-dimensional state-action space. Specifically, an LfD approach with Probabilistic Movement Primitives (ProMP) is first utilized to encode and reproduce the demonstrated actions, generating a distribution that initializes the policy. Then, in the RL stage, a Relative Entropy Policy Search (REPS) algorithm modified for continuous state-action spaces is employed for further policy improvement. More importantly, the learned behaviors maintain and reflect the characteristics of the demonstrations. In addition, a series of supplementary policy search mechanisms is designed to accelerate the exploration process. The effectiveness of the proposed method has been verified both theoretically and experimentally, and comparisons with state-of-the-art methods confirm that the approach outperforms them.
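The ProMP encoding step amounts to projecting each demonstration onto a set of basis functions and fitting a Gaussian over the resulting weight vectors, from which new trajectories (or a policy initialization) can be sampled. The sketch below does this for one-dimensional joint trajectories with Gaussian radial basis functions; the number of basis functions, the regularization, and the synthetic demonstrations are assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(11)

T, K, n_demo = 100, 15, 8
t = np.linspace(0.0, 1.0, T)
centers = np.linspace(0.0, 1.0, K)
Phi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.05) ** 2)   # T x K RBF basis
Phi /= Phi.sum(axis=1, keepdims=True)

# synthetic demonstrations: noisy reaching motions for one joint
demos = np.stack([np.sin(np.pi * t / 2) + rng.normal(0, 0.02, T) for _ in range(n_demo)])

# per-demonstration weights by ridge regression, then a Gaussian over the weights
lam = 1e-6
W = np.stack([np.linalg.solve(Phi.T @ Phi + lam * np.eye(K), Phi.T @ d) for d in demos])
mu_w, Sigma_w = W.mean(axis=0), np.cov(W.T) + 1e-8 * np.eye(K)

# sample a new trajectory from the learned ProMP distribution
w_new = rng.multivariate_normal(mu_w, Sigma_w)
traj = Phi @ w_new
print("sampled trajectory start/end:", round(traj[0], 3), round(traj[-1], 3))
```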