Sandy cobble soil exhibits pronounced heterogeneity. The assessment of the uncertainty surrounding its properties is crucial for the analysis of settlement characteristics resulting from volume loss during shield tunnelling. In this study, a series of probabilistic analyses of surface and subsurface settlements was conducted considering the spatial variability of the friction angle and reference stiffness modulus, under different volumetric block proportions (Pv) and tunnel volume loss rates (ηt). The non-intrusive random finite difference method was used to investigate the probabilistic characteristics of the maximum surface settlement, width of the subsurface settlement trough, maximum subsurface settlement, and subsurface soil volume loss rate through Monte Carlo simulations. Additionally, a comparison between stochastic and deterministic analysis results is presented to underscore the significance of probabilistic analysis. Parametric analyses were subsequently conducted to investigate the impacts of the key input parameters in the random fields on the settlement characteristics. The results indicate that scenarios with higher Pv or greater ηt result in a higher dispersion of the stochastic analysis results. Neglecting the spatial variability of soil properties and relying solely on the mean values of material parameters for deterministic analysis may result in an underestimation of surface and subsurface settlements. From a probabilistic perspective, deterministic analysis alone may prove inadequate in accurately capturing the volumetric deformation mode of the soil above the tunnel crown, potentially affecting the prediction of subsurface settlement.
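As a hedged illustration of the kind of spatially variable input a Monte Carlo random finite difference analysis consumes (not the paper's actual workflow), the following sketch generates one lognormal random-field realization of a friction angle with an exponential autocorrelation model; the mean, coefficient of variation, and correlation lengths are hypothetical.

    # Sketch: one lognormal random-field realization of a friction angle on a
    # regular grid, via Cholesky decomposition of an exponential autocorrelation.
    # All numerical values below are assumed for illustration only.
    import numpy as np

    mean_phi, cov_phi = 35.0, 0.15          # hypothetical mean (deg) and COV
    theta_h, theta_v = 20.0, 2.0            # hypothetical correlation lengths (m)
    x = np.arange(0.0, 30.0, 1.0)           # horizontal cell centres (m)
    z = np.arange(0.0, 15.0, 1.0)           # vertical cell centres (m)
    X, Z = np.meshgrid(x, z, indexing="ij")
    pts = np.column_stack([X.ravel(), Z.ravel()])

    # exponential autocorrelation between every pair of cells
    dx = np.abs(pts[:, 0:1] - pts[:, 0:1].T)
    dz = np.abs(pts[:, 1:2] - pts[:, 1:2].T)
    rho = np.exp(-2.0 * dx / theta_h - 2.0 * dz / theta_v)

    # lognormal parameters and correlated standard normals via Cholesky
    sigma_ln = np.sqrt(np.log(1.0 + cov_phi**2))
    mu_ln = np.log(mean_phi) - 0.5 * sigma_ln**2
    L = np.linalg.cholesky(rho + 1e-8 * np.eye(len(pts)))
    field = np.exp(mu_ln + sigma_ln * (L @ np.random.standard_normal(len(pts))))
    phi_field = field.reshape(X.shape)      # one realization (deg), cell by cell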
Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
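A minimal sketch of one statistical post-processing option of the kind compared in the study, assuming an EMOS-style Gaussian model whose mean is a linear correction of the ensemble mean and whose variance scales with the ensemble variance; the ensemble and observation arrays below are synthetic placeholders.

    # Sketch: EMOS-style post-processing of a GHI ensemble by maximum likelihood.
    # The data arrays are synthetic; a real application would use training pairs
    # of ensemble forecasts and observations.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    ens = np.random.rand(500, 20) * 800.0                    # hypothetical GHI ensembles (W/m2)
    obs = ens.mean(axis=1) + np.random.randn(500) * 40.0     # hypothetical GHI observations

    m, s = ens.mean(axis=1), ens.std(axis=1)

    def neg_log_lik(p):
        a, b, c, d = p
        mu = a + b * m
        sigma = np.sqrt(np.maximum(c + d * s**2, 1e-6))
        return -norm.logpdf(obs, mu, sigma).sum()

    res = minimize(neg_log_lik, x0=[0.0, 1.0, 100.0, 1.0], method="Nelder-Mead")
    a, b, c, d = res.x
    # post-processed forecast for a new ensemble: N(a + b*mean, c + d*variance)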
As the proportion of new energy increases, the traditional cumulant method (CM) produces significant errors when performing probabilistic load flow (PLF) calculations with large-scale wind power integrated. Considering the wind speed correlation, a multi-scenario PLF calculation method that combines random sampling and segmented discrete wind farm power is proposed. Firstly, based on the constructed discrete wind farm scenarios, the Nataf transform is used to handle the correlation between wind speeds. Then, the random sampling method determines the output probability of the discrete wind power scenarios when the wind speeds exhibit correlation. Finally, the PLF results of each scenario are weighted and superimposed following the total probability formula to obtain the final power flow result. Verified on the IEEE standard node system, the absolute percentage errors (APE) of the mean and standard deviation (SD) of the node voltages and branch active power are all within 1%, and the average root mean square (AMSR) values of the probability curves are all less than 1%.
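The following sketch illustrates, under simplifying assumptions, the two ingredients named above: drawing correlated wind speeds with Weibull marginals through a Gaussian copula (the core of the Nataf transform, here without the correlation-coefficient correction step) and weighting per-scenario results by the total probability formula; all shape, scale, correlation, and scenario values are hypothetical.

    # Sketch: correlated Weibull wind speeds via a Gaussian copula, and total
    # probability weighting of per-scenario power flow results.
    import numpy as np
    from scipy.stats import norm, weibull_min

    corr = np.array([[1.0, 0.6], [0.6, 1.0]])           # assumed correlation of two farms
    k, c = np.array([2.0, 2.2]), np.array([8.0, 7.5])   # assumed Weibull shape/scale (m/s)

    L = np.linalg.cholesky(corr)
    u = norm.cdf(L @ np.random.standard_normal((2, 10000)))   # correlated uniforms
    wind = weibull_min.ppf(u.T, k, scale=c)                   # 10000 x 2 correlated speeds

    # total probability formula: weight each discrete scenario's flow result
    scenario_prob = np.array([0.3, 0.5, 0.2])            # hypothetical scenario weights
    scenario_mean_flow = np.array([52.0, 60.0, 71.0])    # hypothetical branch flows (MW)
    expected_flow = np.sum(scenario_prob * scenario_mean_flow)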
The study presents the results of over 30,000 numerical analyses on the stability of lava tubes under lunar conditions. The research considered random irregularities in cave geometry and their impact on stability, with a particular focus on the geometric characteristics of identified collapses. We propose a procedure for extracting the collapse areas and integrating it into the stability analysis results. The results were examined to assess whether the geometric characteristics of collapses can be described by commonly applied probability density distributions, such as the normal or lognormal distribution. Our aim is to facilitate future risk assessment of lunar caves. Such an assessment will be essential prior to robotically exploring caves beneath the lunar surface and can be extended to planetary caves beyond the Moon. Our findings indicate that several collapse characteristics can be represented by unimodal probability density distributions, which could significantly simplify the candidate selection process. Based on our results, we also highlight several key directions for future research and suggest implications for the future exploration of these caves.
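A small sketch of the distribution-fitting step described above, assuming a synthetic sample of one collapse-geometry characteristic: fit a lognormal law and check it with a Kolmogorov-Smirnov test.

    # Sketch: fit a lognormal distribution to a collapse-geometry characteristic
    # (e.g., collapse width) and test the fit. The sample is synthetic.
    import numpy as np
    from scipy import stats

    widths = np.random.lognormal(mean=4.0, sigma=0.4, size=300)   # synthetic widths (m)

    shape, loc, scale = stats.lognorm.fit(widths, floc=0.0)
    ks_stat, p_value = stats.kstest(widths, "lognorm", args=(shape, loc, scale))
    print(f"fitted median = {scale:.1f} m, KS p-value = {p_value:.3f}")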
The constitutive model is essential for predicting the deformation and stability of rock-soil masses. The estimation of constitutive model parameters is a necessary and important task for the reliable characterization of mechanical behaviors. However, constitutive model parameters cannot be evaluated accurately with a limited amount of test data, resulting in uncertainty in the prediction of stress-strain curves. This paper proposes a Bayesian analysis framework to address this issue. It combines Bayesian updating with structural reliability and adaptive conditional sampling methods to assess the parameters of constitutive models. Based on triaxial and ring shear tests on shear zone soils from the Huangtupo landslide, a statistical damage constitutive model and a critical state hypoplastic constitutive model were used to demonstrate the effectiveness of the proposed framework. Moreover, the effects of parameter uncertainty in the damage constitutive model on landslide stability were investigated. Results show that reasonable assessments of the constitutive model parameters can be realized. The variability of the stress-strain curves is strongly related to the model prediction performance. The estimation uncertainty of constitutive model parameters should not be ignored in landslide stability calculations. Our study provides a reference for uncertainty analysis and parameter assessment of constitutive models.
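A much-simplified sketch of Bayesian parameter estimation in this spirit, using a plain Metropolis sampler instead of the paper's adaptive conditional sampling; the hyperbolic stress-strain stand-in, noise level, and synthetic data are assumptions for illustration only.

    # Sketch: Metropolis sampling of one constitutive parameter (ultimate strength
    # of a hyperbolic stress-strain stand-in) from synthetic test data.
    import numpy as np

    strain = np.linspace(0.001, 0.1, 25)
    def model(q_ult, strain, E0=30.0):            # hyperbolic stress-strain stand-in
        return strain / (1.0 / E0 + strain / q_ult)

    obs = model(1.2, strain) + np.random.randn(25) * 0.02   # synthetic test data (MPa)
    sigma_e = 0.02                                           # assumed measurement noise

    def log_post(q):                              # flat prior on [0.5, 3.0]
        if not 0.5 < q < 3.0:
            return -np.inf
        return -0.5 * np.sum((obs - model(q, strain)) ** 2) / sigma_e**2

    q, lp, chain = 1.0, log_post(1.0), []
    for _ in range(5000):                         # Metropolis random walk
        q_new = q + 0.05 * np.random.randn()
        lp_new = log_post(q_new)
        if np.log(np.random.rand()) < lp_new - lp:
            q, lp = q_new, lp_new
        chain.append(q)
    print(np.mean(chain[1000:]), np.std(chain[1000:]))   # posterior mean and SD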
The development of modern engineering components and equipment features large size, intricate shape, and long service life, which places greater demands on valid methods for fatigue performance analysis. Achieving a smooth transformation between the fatigue properties of small-scale laboratory specimens and the fatigue strength of full-scale engineering components has been a long-term challenge. In this work, two dominant factors impeding this transformation, the notch effect and the size effect, were experimentally studied through fatigue tests on notched specimens of Al 7075-T6511 (a very high-strength aviation alloy) at different scales. Fractography analyses identified evidence of the size effect on notch fatigue damage evolution. Accordingly, the Energy Field Intensity (EFI) approach, initially developed for multiaxial notch fatigue analysis, was improved by utilizing the volume ratio of the Effective Damage Zones (EDZs) for size effect correction. In particular, it was extended to a probabilistic model considering the inherent variability of the fatigue phenomenon. The experimental data of the Al 7075-T6511 notched specimens and the model-predicted results were compared, indicating the high potential of the proposed approach for fatigue evaluation under combined notch and size effects.
Protocol Reverse Engineering (PRE) is of great practical importance in Internet security-related fields such as intrusion detection, vulnerability mining, and protocol fuzzing. For unknown binary protocols with fixed-length fields, the accurate identification of field boundaries has a great impact on the subsequent analysis and final performance. Hence, this paper proposes a new protocol segmentation method based on information-theoretic statistical analysis for binary protocols, formulating the field segmentation of unsupervised binary protocols as a probabilistic inference problem and modeling its uncertainty. Specifically, we design four related constructions between entropy changes and protocol field segmentation, introduce random variables, and construct joint probability distributions with traffic sample observations. Probabilistic inference is then performed to identify the possible protocol segmentation points. Extensive trials on nine common public and industrial control protocols show that the proposed method yields higher-quality protocol segmentation results.
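A sketch of the entropy statistic underlying the segmentation idea above: compute the byte-value entropy at each offset over many equal-length messages and flag offsets where the entropy jumps as candidate field boundaries; the threshold and toy messages are hypothetical, and the paper's probabilistic inference over these statistics is not reproduced.

    # Sketch: byte-wise entropy over a set of equal-length messages, with entropy
    # jumps taken as candidate field boundaries.
    import numpy as np

    def offset_entropy(messages):
        """Shannon entropy (bits) of the byte value observed at each offset."""
        arr = np.frombuffer(b"".join(messages), dtype=np.uint8)
        arr = arr.reshape(len(messages), -1)      # assumes equal-length messages
        ent = []
        for col in arr.T:
            counts = np.bincount(col, minlength=256)
            p = counts[counts > 0] / len(col)
            ent.append(-np.sum(p * np.log2(p)))
        return np.array(ent)

    msgs = [bytes([1, 0, i % 4, np.random.randint(256)]) for i in range(200)]  # toy traffic
    ent = offset_entropy(msgs)
    boundaries = np.where(np.abs(np.diff(ent)) > 1.0)[0] + 1   # candidate cut points
    print(ent.round(2), boundaries)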
Due to uncertainties in seismic pipeline damage and post-earthquake recovery processes, probabilistic characteristics such as the mean value, standard deviation, probability density function, and cumulative distribution function provide valuable information. In this study, a simulation-based framework is developed to evaluate these probabilistic characteristics in water distribution systems (WDSs) during post-earthquake recovery. The framework first calculates pipeline failure probabilities using seismic fragility models and then generates damage samples through quasi-Monte Carlo simulations with Sobol's sequence for faster convergence. System performance is assessed using a hydraulic model, and recovery simulations produce time-varying performance curves, where the dynamic importance of unrepaired damage determines the repair sequences. Finally, the probabilistic characteristics of the seismic performance indicators (resilience index, resilience loss, and recovery time) are evaluated. The framework is applied to two benchmark WDSs with different layouts to investigate the probabilistic characteristics of their seismic performance and resilience. Application results show that the cumulative distribution function reveals the variations in resilience indicators for different exceedance probabilities, and there are dramatic differences among the recovery times corresponding to system performance recovery targets of 80%, 90%, and 100%.
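A minimal sketch of the quasi-Monte Carlo damage-sampling step, assuming placeholder pipeline failure probabilities; the hydraulic and recovery simulations that follow in the framework are omitted.

    # Sketch: scrambled Sobol' samples compared against per-pipeline failure
    # probabilities to obtain damage-state samples.
    import numpy as np
    from scipy.stats import qmc

    p_fail = np.array([0.05, 0.12, 0.30, 0.08])     # hypothetical pipeline fragilities
    sampler = qmc.Sobol(d=len(p_fail), scramble=True, seed=42)
    u = sampler.random_base2(m=10)                  # 2**10 low-discrepancy points
    damage = u < p_fail                             # True = pipeline fails in sample
    print(damage.mean(axis=0))                      # converges quickly to p_fail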
To address the increasing uncertainty of low-carbon generation in active distribution networks (ADNs) and the difficulty of distribution network security assessment, this paper proposes a two-phase scheduling model for flexible resources in an ADN based on probabilistic risk perception. First, a full-cycle probabilistic trend sequence is constructed from the source-load historical data; in the day-ahead scheduling phase, the response interval of the flexibility resources on the load and storage side is optimized based on the probabilistic trend, with the probability of the security boundary as the security constraint and economy as the objective. Then, in the intraday phase, the core security and economic operation boundary of the ADN is screened in real time. From there, the model quantitatively senses the degree of threat to the core security and economic operation boundary under the current source-load prediction information, and identifies strictly secure and low/high-risk time periods. Flexibility resources within the response interval are dynamically adjusted in real time, focusing on high-risk periods to cope with future core risks of the distribution grid. Finally, the improved IEEE 33-node distribution system is simulated to obtain the flexibility resource scheduling scheme on the load and storage side. The scheduling results are evaluated from the perspectives of risk probability and flexible resource utilization efficiency, and the analysis shows that the proposed scheduling model can promote the consumption of low-carbon energy from wind and photovoltaic sources while reducing the operational risk of the distribution network.
Layer pseudospins, exhibiting quantum coherence and precise multistate controllability, present significant potential for the advancement of future computing technologies. In this work, we propose an in-memory probabilistic computing scheme based on the electrical manipulation of layer pseudospins in layered materials, by exploiting the interaction between real spins and layer pseudospins.
Probabilistic assessment of seismic performance (SPPA) is a crucial aspect of evaluating the seismic behavior of structures. For complex bridges with inherent uncertainties, conducting precise and efficient seismic reliability analysis remains a significant challenge. To address this issue, the current study introduces a sample-unequal weight fractional moment assessment method based on an improved correlation-reduced Latin hypercube sampling (ICLHS) technique. This method integrates the benefits of importance sampling techniques with interpolatory quadrature formulas to enhance the accuracy of estimating the extreme value distribution (EVD) of the seismic response of complex nonlinear structures subjected to non-stationary ground motions. Additionally, the core theoretical approaches employed in seismic reliability analysis (SRA) are elaborated, such as dimension reduction for simulating non-stationary random ground motions and a fractional-maximum entropy single-loop solution strategy. The effectiveness of the proposed method is validated through a three-story nonlinear shear frame structure. Furthermore, a comprehensive reliability analysis of a real-world long-span, single-pylon suspension bridge is conducted using the developed theoretical framework within the OpenSees platform, leading to key insights and conclusions.
The reliable, rapid, and accurate Remaining Useful Life (RUL) prognostics of the aircraft power supply and distribution system are essential for enhancing the reliability and stability of the system and reducing life-cycle costs. To achieve reliable, rapid, and accurate RUL prognostics, the balance between accuracy and computational burden deserves more attention. In addition, uncertainty is intrinsically present in the RUL prognostic process. Due to the limitation of uncertainty quantification, a point-wise prognostics strategy is not trustworthy. A Dual Adaptive Sliding-window Hybrid (DASH) RUL probabilistic prognostics strategy is proposed to tackle these deficiencies. The DASH strategy contains two adaptive mechanisms: the adaptive Long Short-Term Memory-Polynomial Regression (LSTM-PR) hybrid prognostics mechanism and the adaptive sliding-window Kernel Density Estimation (KDE) probabilistic prognostics mechanism. Owing to the dual adaptive mechanisms, the DASH strategy can achieve the balance between accuracy and computational burden and obtain trustworthy probabilistic prognostics. Based on a degradation dataset of aircraft electromagnetic contactors, the superiority of the DASH strategy is validated. In terms of probabilistic, point-wise, and integrated prognostics performance, the proposed strategy improves by 66.89%, 81.73%, and 25.84% on average compared with the baseline methods and their variants.
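A hedged sketch of the sliding-window kernel density idea: estimate a density over the most recent prediction errors and turn it into an RUL prediction interval; the window length, error history, and point forecast are hypothetical stand-ins for the strategy's outputs.

    # Sketch: sliding-window KDE over recent RUL prediction errors, converted into
    # a 90% prediction interval around a point RUL forecast.
    import numpy as np
    from scipy.stats import gaussian_kde

    window = 50                                      # assumed sliding-window length (cycles)
    errors = np.random.randn(300) * 4.0              # hypothetical RUL error history
    recent = errors[-window:]                        # only the latest window is used

    kde = gaussian_kde(recent)
    grid = np.linspace(recent.min() - 10, recent.max() + 10, 400)
    cdf = np.cumsum(kde(grid)); cdf /= cdf[-1]

    rul_point = 120.0                                # hypothetical point RUL forecast
    lo = rul_point + grid[np.searchsorted(cdf, 0.05)]
    hi = rul_point + grid[np.searchsorted(cdf, 0.95)]
    print(f"90% RUL interval: [{lo:.1f}, {hi:.1f}] cycles")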
The published article titled "Comparison of Structural Probabilistic and Non-Probabilistic Reliability Computational Methods under Big Data Condition" [1] has been retracted from Structural Durability & Health Monitoring (SDHM), Vol. 16, No. 2, 2022, pp. 129–143.
Dear Editor, This letter presents a joint probabilistic scheduling and resource allocation method (PSRA) for 5G-based wireless networked control systems (WNCSs). As a control-aware optimization method, PSRA minimizes the linear quadratic Gaussian (LQG) control cost of WNCSs by optimizing the activation probability of subsystems, the number of uplink repetitions, and the durations of the uplink and downlink phases. Simulation results show that PSRA achieves smaller LQG control costs than existing works.
Risk assessment is a crucial component of collision warning and avoidance systems for intelligent vehicles. Reachability-based formal approaches have been developed to ensure driving safety by accurately detecting potential vehicle collisions. However, they suffer from over-conservatism, potentially resulting in false-positive risk events in complicated real-world applications. In this paper, we combine two reachability analysis techniques, a backward reachable set (BRS) and a stochastic forward reachable set (FRS), and propose an integrated probabilistic collision-detection framework for highway driving. Within this framework, we can first use a BRS to formally check whether a two-vehicle interaction is safe; otherwise, a prediction-based stochastic FRS is employed to estimate the collision probability at each future time step. Thus, the framework can not only identify non-risky events with guaranteed safety but also provide accurate collision risk estimation in safety-critical events. To construct the stochastic FRS, we develop a neural network-based acceleration model for surrounding vehicles and further incorporate a confidence-aware dynamic belief to improve the prediction accuracy. Extensive experiments were conducted to validate the performance of the acceleration prediction model based on naturalistic highway driving data. The efficiency and effectiveness of the framework with infused confidence beliefs were tested in both naturalistic and simulated highway scenarios. The proposed risk assessment framework is promising for real-world applications.
Deep reinforcement learning (DRL) has demonstrated significant potential in industrial manufacturing domains such as workshop scheduling and energy system management. However, due to the model's inherent uncertainty, rigorous validation is requisite for its application in real-world tasks. Specific tests may reveal inadequacies in the performance of pre-trained DRL models, while the "black-box" nature of DRL poses a challenge for testing model behavior. We propose a novel performance improvement framework based on probabilistic automata, which aims to proactively identify and correct critical vulnerabilities of DRL systems, so that the performance of DRL models in real tasks can be improved with minimal model modifications. First, a probabilistic automaton is constructed from the historical trajectory of the DRL system by abstracting the state to generate probabilistic decision-making units (PDMUs), and a reverse breadth-first search (BFS) method is used to identify the key PDMU-action pairs that have the greatest impact on adverse outcomes. This process relies only on the state-action sequence and final result of each trajectory. Then, under the key PDMU, we search for the new action that has the greatest impact on favorable results. Finally, the key PDMU, undesirable action, and new action are encapsulated as monitors to guide the DRL system to obtain more favorable results through real-time monitoring and correction mechanisms. Evaluations in two standard reinforcement learning environments and three actual job scheduling scenarios confirmed the effectiveness of the method, providing certain guarantees for the deployment of DRL models in real-world applications.
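A small sketch of the first step named above, assuming a toy state abstraction and toy trajectories: build the probabilistic automaton by counting state-action transitions and normalizing them into probabilities; the reverse BFS and monitor construction are not shown.

    # Sketch: probabilistic automaton from recorded (state, action, next state)
    # transitions. The abstraction (coarse rounding) and trajectories are toys.
    from collections import defaultdict

    def abstract(state):
        return tuple(round(x, 1) for x in state)     # hypothetical state abstraction

    counts = defaultdict(lambda: defaultdict(int))   # (s, a) -> {s': count}
    trajectories = [
        [((0.12, 0.98), "left", (0.23, 0.95)), ((0.23, 0.95), "left", (0.31, 0.90))],
        [((0.11, 0.97), "right", (0.05, 0.99))],
    ]
    for traj in trajectories:
        for s, a, s_next in traj:
            counts[(abstract(s), a)][abstract(s_next)] += 1

    # normalize counts into the automaton's transition probabilities
    automaton = {
        key: {s2: n / sum(nexts.values()) for s2, n in nexts.items()}
        for key, nexts in counts.items()
    }
    print(automaton[((0.1, 1.0), "left")])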
This paper introduces the Particle Swarm Optimization (PSO) algorithm to enhance the Latin Hypercube Sampling (LHS) process. The key objective is to mitigate the lengthy computation times and low computational accuracy typically encountered when applying Monte Carlo Simulation (MCS) to LHS for probabilistic trend calculations. The PSO method optimizes the sample distribution, enhances global search capabilities, and significantly boosts computational efficiency. To validate its effectiveness, the proposed method was applied to IEEE-34 and IEEE-118 node systems containing wind power. The performance was then compared with Latin Hypercube Importance Sampling (LHIS), which integrates importance sampling with the Monte Carlo method. The comparison results indicate that the PSO-enhanced method significantly improves the uniformity and representativeness of the sampling. This enhancement leads to a reduction in data errors and an improvement in both computational accuracy and convergence speed.
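A minimal sketch of the baseline Latin hypercube step being enhanced, with a discrepancy score as a uniformity measure; the PSO search over sample layouts is not reproduced, and the dimensions, sample size, and bounds are arbitrary.

    # Sketch: plain LHS from scipy plus a uniformity (discrepancy) measure, then
    # scaling the unit-cube sample to physical bounds.
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=3, seed=1)        # e.g., 3 uncertain injections
    sample = sampler.random(n=200)                   # 200 points in the unit cube
    print("centered L2 discrepancy:", qmc.discrepancy(sample, method="CD"))

    # map to physical ranges, e.g., wind power bounds per bus (hypothetical, MW)
    lower, upper = [0.0, 0.0, 0.0], [50.0, 80.0, 30.0]
    scaled = qmc.scale(sample, lower, upper)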
Probabilistic back-analysis is an important means to infer the statistics of uncertain soil parameters, making slope reliability assessment closer to engineering reality. However, multi-source information (including test data, monitored data, field observations, and slope survival records) is rarely used in current probabilistic back-analysis. Conducting the probabilistic back-analysis of spatially varying soil parameters and slope reliability prediction under rainfalls by integrating multi-source information is a challenging task, since thousands of random variables and a high-dimensional likelihood function are usually involved. In this paper, a framework integrating a modified Bayesian Updating with Subset simulation (mBUS) method with an adaptive Conditional Sampling (aCS) algorithm is established for the probabilistic back-analysis of spatially varying soil parameters and slope reliability prediction. Within this framework, the high-dimensional probabilistic back-analysis problem can be easily tackled, and the multi-source information (e.g., monitored pressure heads and slope survival records) can be fully used in the back-analysis. A real Taoyuan landslide case in Taiwan, China is investigated to illustrate the effectiveness and performance of the established framework. The findings show that the posterior knowledge of the soil parameters obtained from the established framework is in good agreement with the field observations. Furthermore, the updated knowledge of the soil parameters can be utilized to reliably predict the occurrence probability of a landslide caused by the heavy rainfall event on September 12, 2004, or to forecast potential landslides under future rainfalls in the Fuhsing District of Taoyuan City, Taiwan, China.
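A hedged, much-simplified sketch of the updating idea: weight prior samples of a soil parameter by the likelihood of a monitored pressure head, instead of the mBUS/aCS machinery used in the paper; the forward model, prior, and observation are hypothetical.

    # Sketch: likelihood-weighted prior samples (importance-sampling Bayesian
    # update) of a friction angle from one monitored pressure head.
    import numpy as np

    prior_phi = np.random.normal(30.0, 4.0, 20000)        # assumed prior friction angle (deg)
    def predicted_head(phi):                              # hypothetical forward model
        return 12.0 - 0.2 * phi

    h_obs, sigma_obs = 6.5, 0.5                           # assumed monitored head and noise (m)
    log_w = -0.5 * ((h_obs - predicted_head(prior_phi)) / sigma_obs) ** 2
    w = np.exp(log_w - log_w.max()); w /= w.sum()

    post_mean = np.sum(w * prior_phi)
    post_std = np.sqrt(np.sum(w * (prior_phi - post_mean) ** 2))
    print(f"posterior friction angle: {post_mean:.1f} +/- {post_std:.1f} deg")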
The task of modeling and analyzing intercepted multifunction radar (MFR) pulse trains is vital for cognitive electronic reconnaissance. Existing methodologies predominantly rely on prior information or heavily constrained models, posing challenges for non-cooperative applications. This paper introduces a novel approach to model MFRs using a Bayesian network, where the conditional probability density function is approximated by an autoregressive kernel mixture network (ARKMN). Utilizing the estimated probability density function, a dynamic programming algorithm is proposed for denoising and detecting change points in the intercepted MFR pulse trains. Simulation results affirm the proposed method's efficacy in modeling MFRs, outperforming the state-of-the-art in pulse train denoising and change point detection.
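A sketch of a dynamic-programming change-point search of the general kind referred to above, using a simple squared-error segment cost and a fixed penalty in place of the learned ARKMN density; the pulse-parameter sequence is synthetic.

    # Sketch: optimal-partitioning DP for change-point detection on a synthetic
    # pulse-parameter sequence with one level shift near index 40.
    import numpy as np

    x = np.concatenate([np.full(40, 1.0), np.full(60, 3.0)]) + np.random.randn(100) * 0.2
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x**2)))

    def seg_cost(i, j):                     # sum of squared deviations on x[i:j]
        n = j - i
        return (s2[j] - s2[i]) - (s1[j] - s1[i]) ** 2 / n

    penalty, n = 3.0, len(x)
    F = np.full(n + 1, np.inf); F[0] = -penalty
    back = np.zeros(n + 1, dtype=int)
    for t in range(1, n + 1):
        costs = [F[s] + seg_cost(s, t) + penalty for s in range(t)]
        back[t] = int(np.argmin(costs)); F[t] = costs[back[t]]

    cps, t = [], n                          # recover change points by backtracking
    while t > 0:
        t = back[t]
        if t > 0:
            cps.append(t)
    print(sorted(cps))                      # expected near index 40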
Deterministic inversion based on deep learning has been widely utilized in model parameter estimation. Constrained by logging data, seismic data, the wavelet, and the modeling operator, deterministic inversion based on deep learning can establish nonlinear relationships between seismic data and model parameters. However, seismic data lack low-frequency content and contain noise, which increases the non-uniqueness of the solutions. Conventional inversion methods based on deep learning can only establish a deterministic relationship between seismic data and parameters, and cannot quantify the uncertainty of the inversion. In order to quantify the uncertainty quickly, a physics-guided deep mixture density network (PG-DMDN) is established by combining the mixture density network (MDN) with a deep neural network (DNN). Compared with the Bayesian neural network (BNN) and network dropout, PG-DMDN has a lower computing cost and shorter training time. A low-frequency model is introduced in the training process of the network to help the network learn the nonlinear relationship between narrowband seismic data and low-frequency impedance. In addition, block constraints are added to the PG-DMDN framework to improve the horizontal continuity of the inversion results. To illustrate the benefits of the proposed method, PG-DMDN is compared with an existing semi-supervised inversion method. Four synthetic data examples based on the Marmousi II model are utilized to quantify the influence of the forward modeling part, the low-frequency model, noise, and the number of pseudo-wells on the inversion results, and to prove the feasibility and stability of the proposed method. In addition, the robustness and generality of the proposed method are verified with field seismic data.