Funding: the National Natural Science Foundation of China (No. 62063006); the Guangxi Natural Science Foundation (Grant Nos. 2023GXNSFAA026025, AA24010001); the Innovation Fund of Chinese Universities Industry-University-Research (ID: 2023RY018); the Special Guangxi Industry and Information Technology Department, Textile and Pharmaceutical Division (ID: 2021 No. 231); the Special Research Project of Hechi University (ID: 2021GCC028); and the Key Laboratory of AI and Information Processing, Education Department of Guangxi Zhuang Autonomous Region (Hechi University), No. 2024GXZDSY009.
Abstract: In dynamic scenarios, visual simultaneous localization and mapping (SLAM) algorithms often incorrectly incorporate dynamic points during camera pose computation, leading to reduced accuracy and robustness. This paper presents a dynamic SLAM algorithm that leverages object detection and regional dynamic probability. Firstly, a parallel thread employs the YOLOX object detection model to gather 2D semantic information and compensate for missed detections. Next, an improved K-means++ clustering algorithm clusters bounding box regions, adaptively determining the threshold for extracting dynamic object contours as dynamic points change. This process divides the image into low dynamic, suspicious dynamic, and high dynamic regions. In the tracking thread, the dynamic point removal module assigns dynamic probability weights to the feature points in these regions. Combined with geometric methods, it detects and removes the dynamic points. The final evaluation on the public TUM RGB-D dataset shows that the proposed dynamic SLAM algorithm surpasses most existing SLAM algorithms, providing better pose estimation accuracy and robustness in dynamic environments.
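To make the region-weighting idea concrete, the minimal Python sketch below assigns each feature point a dynamic probability from its region label and a geometric (epipolar) residual, then discards points above a threshold. The region priors, fusion rule, and threshold are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative prior weights for the three region types (assumed values).
REGION_PRIOR = {"low": 0.1, "suspicious": 0.5, "high": 0.9}

def filter_dynamic_points(points, regions, epipolar_errors,
                          error_scale=1.0, threshold=0.6):
    """Drop feature points whose combined dynamic probability is too high.

    points          : (N, 2) array of pixel coordinates
    regions         : list of region labels ("low", "suspicious", "high")
    epipolar_errors : (N,) geometric residuals from the fundamental-matrix check
    """
    priors = np.array([REGION_PRIOR[r] for r in regions])
    # Map geometric residuals to (0, 1) with a logistic squashing (assumed form).
    geo_prob = 1.0 / (1.0 + np.exp(-(epipolar_errors - error_scale)))
    dynamic_prob = 0.5 * priors + 0.5 * geo_prob      # simple fusion rule
    keep = dynamic_prob < threshold
    return points[keep], dynamic_prob

pts = np.random.rand(5, 2) * 640
labels = ["low", "high", "suspicious", "high", "low"]
errs = np.array([0.2, 3.5, 1.1, 2.8, 0.1])
static_pts, probs = filter_dynamic_points(pts, labels, errs)
```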
Funding: The National Natural Science Foundation of China (Nos. 52108274, 52208481, 52338011); the State Scholarship Fund of the China Scholarship Council (No. 202306090285).
Abstract: The fuzzy comfortability of a wind-sensitive super-high tower crane is critical to guarantee occupant health and improve construction efficiency. Therefore, the wind-resistant fuzzy comfortability of a super-high tower crane at the Ma'anshan Yangtze River (MYR) Bridge site is analyzed in this paper. First, the membership function model that represents fuzzy comfortability is introduced into the probability density evolution method (PDEM). Second, based on Fechner's law, the membership function curves are constructed according to three acceleration thresholds in ISO 2631. Then, the fuzzy comfortability of the super-high tower crane under stochastic wind loads is assessed on the basis of different cut-set levels λ. Results show that the comfortability is over 0.9 under the required maximum operating wind velocity. Low sensitivity to λ can be observed in the reliability curves of the ISO Ⅱ and Ⅲ membership functions. The reliability of the ISO Ⅰ membership function is not sensitive to λ when λ < 0.7, whereas it becomes sensitive to λ when λ > 0.7.
Funding: supported by the National Major Science and Technology Project, China (No. J2019-Ⅳ-0007-0075) and the Fundamental Research Funds for the Central Universities, China (No. JKF-20240036).
Abstract: To ensure the structural integrity of life-limiting components of aeroengines, Probabilistic Damage Tolerance (PDT) assessment is applied to evaluate the failure risk as required by airworthiness regulations and military standards. The PDT method holds that defects such as machining scratches and service cracks exist in the tenon-groove structures of aeroengine disks. However, it is challenging to conduct PDT assessment due to the scarcity of effective Probability of Detection (POD) models and anomaly distribution models. Through a series of Nondestructive Testing (NDT) experiments, the POD model of real cracks in tenon-groove structures is constructed for the first time by employing the Transfer Function Method (TFM). A novel anomaly distribution model is derived through the utilization of the POD model, instead of using the infeasible field-data accumulation method. Subsequently, a framework for calculating the Probability of Failure (POF) of the tenon-groove structures is established, and the aforementioned two models exert a significant influence on the resulting POF.
Funding: supported in part by the Chunhui Project of the Ministry of Education of China (HZKY20220429), the Department of Science & Technology of Liaoning Province (2022-MS-300), and the Educational Department of Liaoning Province (LJKMZ20220561).
Abstract: Dear Editor, As an important energy storage device, the lithium-ion battery plays a vital role in electric aircraft, which are a new and promising means of transportation with low carbon emissions. Accurate prediction of the state of charge (SOC) of lithium-ion batteries is of great importance in reducing the probability of abnormal accidents and ensuring flight safety.
Funding: funding support from the National Natural Science Foundation of China (Grant Nos. U22A20594, 52079045); Hong-Zhi Cui acknowledges the financial support of the China Scholarship Council (Grant No. CSC: 202206710014) for his research at Universitat Politecnica de Catalunya, Barcelona.
Abstract: Landslide susceptibility mapping (LSM) plays a crucial role in assessing geological risks. Current LSM techniques face a significant challenge in achieving accurate results due to uncertainties associated with regional-scale geotechnical parameters. To explore rainfall-induced LSM, this study proposes a hybrid model that combines a physically-based probabilistic model (PPM) with a convolutional neural network (CNN). The PPM effectively captures the spatial distribution of landslides by incorporating the probability of failure (POF) based on the slope stability mechanism under rainfall conditions, thereby characterizing the variation of POF caused by parameter uncertainties. The CNN was used as a binary classifier to capture the spatial and channel correlation between landslide conditioning factors and the probability of landslide occurrence. An OpenCV image enhancement technique was utilized to extract non-landslide points based on the POF of landslides. The proposed model comprehensively considers physical mechanics when selecting non-landslide samples, effectively filtering out samples that do not adhere to physical principles and reducing the risk of overfitting. The results indicate that the proposed PPM-CNN hybrid model achieves higher prediction accuracy on the landslide case of the Niangniangba area of Gansu Province, China, with an area under the curve (AUC) value of 0.85, compared with the individual CNN model (AUC = 0.61) and the PPM (AUC = 0.74). The model can also account for the statistical correlation and non-normal probability distributions of model parameters. These results offer practical guidance for future research on rainfall-induced LSM at the regional scale.
Funding: supported by the National Natural Science Foundation of China under Grants 62001359 and 61901201, by the Key Science and Technology Research Project of Henan Province under Grant 232102211059, and by the Natural Science Basic Research Program of Shaanxi under Grants 2022JQ-658 and 2022JQ-621.
Abstract: The utilization of unmanned aerial vehicle (UAV) relays in cooperative communication has gained considerable attention in recent years. However, current research is mostly based on fixed base stations and users, lacking sufficient exploration of scenarios where communication nodes are in motion. This paper presents a multi-destination vehicle communication system based on decode-and-forward (DF) UAV relays, where source and destination vehicles are moving and an internal eavesdropper intercepts messages from the UAV. Closed-form expressions for the system outage probability and secrecy outage probability are derived to analyze the reliability and security of the system. Furthermore, the impact of the UAV's position, signal transmission power, and system time allocation ratio on the system's performance is also analyzed. The numerical simulation results validate the accuracy of the derived formulas and confirm the correctness of the analysis. An appropriate time allocation ratio significantly enhances the security performance of the system under various environmental conditions.
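As a sanity check on the kind of closed-form outage expressions referred to above, the sketch below estimates the outage probability of a single Rayleigh-faded link by Monte Carlo and compares it with the textbook closed form for that simple case; the relayed, multi-destination, eavesdropping setting of the paper is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(4)

def outage_probability(snr_db, rate_threshold, n_trials=200_000):
    """Monte Carlo estimate of P(log2(1 + SNR*|h|^2) < R) for one
    Rayleigh-faded link, the basic ingredient of outage analysis."""
    snr = 10 ** (snr_db / 10)
    h2 = rng.exponential(1.0, n_trials)      # |h|^2 is exponential under Rayleigh fading
    capacity = np.log2(1 + snr * h2)
    return float(np.mean(capacity < rate_threshold))

snr_db, rate = 10.0, 2.0
analytic = 1 - np.exp(-(2 ** rate - 1) / 10 ** (snr_db / 10))   # closed form for this toy case
print(outage_probability(snr_db, rate), analytic)
```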
Abstract: Critical Height Sampling (CHS) estimates stand volume free from any model or tree-form assumptions. Despite its introduction more than four decades ago, CHS has not been widely applied in the field due to perceived challenges in measurement. The objectives of this study were to compare estimated stand volume between CHS and sampling methods that use volume or taper models, to test the equivalence of the sampling methods, and to assess their relative efficiency. We established 65 field plots in planted forests of two coniferous tree species and estimated stand volume for a range of Basal Area Factors (BAFs). Results showed that CHS produced the most similar mean stand volumes across BAFs and tree species, with maximum differences between BAFs of 5-18 m³·ha⁻¹. Horizontal Point Sampling (HPS) using volume models produced very large variability in mean stand volume across BAFs, with differences up to 126 m³·ha⁻¹. However, CHS was less precise and less efficient than HPS. Furthermore, none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of ≤ 55 m³·ha⁻¹. About 72% of critical height measurements were below the crown base, indicating that critical height was more accessible to measurement than expected. Our study suggests that the consistency of the CHS mean estimates is a major advantage when planning a forest inventory. When checked against CHS, the results hint that HPS estimates might contain model bias. These strengths of CHS could outweigh its lower precision. The choice of sampling method also carries serious financial implications. Lastly, CHS could potentially benefit forest management as an alternative option for estimating stand volume when volume or taper models are lacking or unreliable.
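For orientation, the per-hectare estimators being compared are usually written as below (this is the textbook form, stated here as background rather than taken from the paper), with F the basal area factor, h_{c,i} the critical height of the i-th tallied tree, and v_i and g_i its modeled volume and basal area:

```latex
\hat{V}_{\mathrm{CHS}} = F \sum_{i=1}^{n} h_{c,i},
\qquad
\hat{V}_{\mathrm{HPS}} = F \sum_{i=1}^{n} \frac{v_i}{g_i}
```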
Funding: supported by the National Undergraduate Innovation and Entrepreneurship Training Program of China (Project No. 202510559076) at Jinan University, a nationwide initiative administered by the Ministry of Education, and by the National Natural Science Foundation of China (NSFC) under Grant No. 62172189.
Abstract: Persistent flows are network flows that persist over multiple time intervals and continue to exhibit activity over extended periods; they are critical for identifying long-term behaviors and subtle security threats. Programmable switches provide line-rate packet processing to meet the requirements of high-speed network environments, yet they are fundamentally limited in computational and memory resources. Accurate and memory-efficient persistent flow detection on programmable switches is therefore essential. However, existing approaches often rely on fixed-window sketches or multiple sketch instances, which either suffer from insufficient temporal precision or incur substantial memory overhead, making them ineffective on programmable switches. To address these challenges, we propose SP-Sketch, a sliding-window-based sketch that leverages a probabilistic update mechanism to emulate slot expiration without maintaining multiple sketch instances. This design significantly reduces memory consumption while preserving high detection accuracy across multiple time intervals. We provide rigorous theoretical analyses of the estimation errors, deriving precise error bounds for the proposed method, and validate our approach through comprehensive implementations on both P4 hardware switches (with the Intel Tofino ASIC) and software switches (i.e., BMv2). Experimental evaluations using real-world traffic traces demonstrate that SP-Sketch outperforms traditional methods, improving accuracy by up to 20% over baseline sliding-window approaches and enhancing recall by 5% compared to non-sliding alternatives. Furthermore, SP-Sketch reduces memory consumption by up to 65% compared to traditional methods while maintaining a robust capability to accurately track persistent flow behavior over extended time periods.
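The abstract does not spell out the update rule, so the toy Python sketch below only illustrates the general idea of probabilistic expiry in a sketch cell: each touched counter is reset with probability 1/W, so stale contributions die out after roughly W updates to that cell in expectation, without keeping per-slot copies. It is not the SP-Sketch algorithm, its error bounds, or its P4 mapping.

```python
import random
import hashlib

class ProbabilisticWindowSketch:
    """Toy count-min-style table with probabilistic expiry (illustration only)."""

    def __init__(self, width=1024, depth=3, window=10_000):
        self.width, self.depth, self.window = width, depth, window
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, key, row):
        digest = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8).digest()
        return int.from_bytes(digest, "little") % self.width

    def update(self, key):
        for row in range(self.depth):
            idx = self._index(key, row)
            if random.random() < 1.0 / self.window:   # probabilistic "slot expiration"
                self.table[row][idx] = 0
            self.table[row][idx] += 1

    def estimate(self, key):
        return min(self.table[row][self._index(key, row)]
                   for row in range(self.depth))

sketch = ProbabilisticWindowSketch()
for _ in range(500):
    sketch.update("flow-A")
print(sketch.estimate("flow-A"))
```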
Funding: funded by the Norwegian Research Council (NFR project 302701 Climate Smart Forestry Norway).
Abstract: Assessing forest vulnerability to disturbances at a high spatial resolution and for regional and national scales has become attainable with the combination of remote sensing-derived high-resolution forest maps and mechanistic risk models. This study demonstrated large-scale and high-resolution modelling of wind damage vulnerability in Norway. The hybrid mechanistic wind damage model, ForestGALES, was adapted to map the critical wind speeds (CWS) of damage across Norway using a national forest attribute map at a 16 m × 16 m spatial resolution. Parametrization of the model for the Norwegian context was done using the literature and the National Forest Inventory data. This new parametrization for Norwegian forests yielded estimates of CWS significantly different from the default parametrization. Both parametrizations fell short of providing acceptable discrimination of the damaged area following the storm of November 19, 2021 in the central southern region of Norway when using unadjusted CWS. After adjusting the CWS and the storm wind speeds by a constant factor, the Norwegian parametrization provided acceptable discrimination and was thus deemed suitable for use in future studies, despite the lack of field and laboratory experiments to directly derive parameters for Norwegian forests. The windstorm event used for model validation in this study highlighted the challenges of predicting wind damage to forests in landscapes with complex topography. Future studies should focus on further developing ForestGALES and new datasets describing extreme wind climates, so as to better represent wind-tree interactions in complex topography and to predict the level of risk in support of local climate-smart forest management strategies.
Abstract: Risk prediction has long been a cornerstone of surgical oncology, enabling surgeons to anticipate complications, tailor perioperative care, and improve outcomes. With the rise of artificial intelligence, machine learning (ML) models are increasingly being applied to predict outcomes, highlighting the growing significance of data-driven methods for clinical decision-making. Currently, frequentist approaches dominate prediction models, including most ML algorithms; these rely exclusively on observed datasets and risk overlooking the cumulative value of prior clinical knowledge. In contrast, Bayesian reasoning formally integrates existing evidence with new data. In this letter, we examine the strengths of frequentist-based prediction models, discuss how Bayesian methods may improve predictive accuracy, and argue that combining both approaches offers a promising path toward more robust, interpretable, and clinically useful prediction tools in surgery. This integration can yield robust, interpretable, and clinically relevant tools that advance personalized surgical care.
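As a concrete illustration of the Bayesian reasoning the letter advocates, the sketch below folds prior clinical knowledge into a beta-binomial update for a complication rate. All numbers are hypothetical, and encoding prior knowledge as pseudo-counts is just one simple choice among many.

```python
from scipy import stats

# Hypothetical numbers, for illustrating the mechanics only.
prior_events, prior_n = 12, 100      # prior clinical knowledge: ~12% complication rate
new_events, new_n = 3, 40            # newly observed cohort

# Encode prior knowledge as pseudo-counts on top of a flat Beta(1, 1) prior.
a_post = 1 + prior_events + new_events
b_post = 1 + (prior_n - prior_events) + (new_n - new_events)
posterior = stats.beta(a_post, b_post)

print(f"posterior mean risk   = {posterior.mean():.3f}")
print(f"95% credible interval = {posterior.interval(0.95)}")
```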
Funding: supported by the National Key Research and Development Program of China (Grant No. 2021YFC3000904), the National Natural Science Foundation of China (42005039), and the Science and Technology Development Fund of CAMS (Grant No. 2024KJ013).
Abstract: Climate change is an essential topic in climate science, and the accessibility of accurate, high-resolution datasets in recent years has facilitated the extraction of more insights from big-data resources. Nonetheless, current research predominantly focuses on mean-value changes and largely overlooks changes in the probability distribution. In this study, a novel method called Wasserstein Stability Analysis (WSA) is developed to identify probability density function (PDF) changes, especially shifts in extreme events and variations in nonlinear physical value constraints under climate change. WSA is applied to the early 21st century and compared with traditional mean-value trend analysis. The results indicate that, despite no significant trend, the equatorial eastern Pacific experienced a decline in hot extremes and an increase in cold extremes, indicating a La Niña-like temperature shift. Further analysis at two Arctic locations suggests that sea ice severely restricts the hot extremes of surface air temperature; this impact is diminishing as the sea ice melts. By revealing PDF shifts, WSA emerges as a powerful tool to re-examine climate change dynamics, providing enhanced data-driven insights for understanding climate evolution.
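A minimal illustration of why a distributional metric sees what a mean trend misses: the two synthetic samples below have nearly identical means, yet the second has fattened tails, and the one-dimensional Wasserstein distance picks this up. This uses the generic SciPy distance on made-up anomalies, not the full WSA procedure.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Hypothetical daily temperature anomalies for two periods at one grid point.
period_a = rng.normal(0.0, 1.0, 3650)
period_b = np.concatenate([rng.normal(0.0, 1.0, 3290),
                           rng.normal(-2.8, 0.4, 180),    # more cold extremes
                           rng.normal(2.8, 0.4, 180)])    # more hot extremes

print("difference in means   :", period_b.mean() - period_a.mean())
print("Wasserstein-1 distance:", wasserstein_distance(period_a, period_b))
```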
Funding: supported by the Natural Science Foundation of Guangdong Province (Grant 2023A1515011667), the Science and Technology Major Project of Shenzhen (Grant KJZD20230923114809020), and the Key Basic Research Foundation of Shenzhen (Grant JCYJ20220818100205012).
Abstract: Estimating probability density functions (PDFs) is critical in data analysis, particularly for complex multimodal distributions. Traditional kernel density estimator (KDE) methods often face challenges in accurately capturing multimodal structures due to their uniform weighting scheme, leading to mode loss and degraded estimation accuracy. This paper presents the flexible kernel density estimator (F-KDE), a novel nonparametric approach designed to address these limitations. F-KDE introduces the concept of kernel unit inequivalence, assigning adaptive weights to each kernel unit, which better models local density variations in multimodal data. The method optimises an objective function that integrates estimation error and log-likelihood, using a particle swarm optimisation (PSO) algorithm that automatically determines optimal weights and bandwidths. Through extensive experiments on synthetic and real-world datasets, we demonstrate that (1) the weights and bandwidths in F-KDE stabilise as the optimisation algorithm iterates, (2) F-KDE effectively captures multimodal characteristics, and (3) F-KDE outperforms state-of-the-art density estimation methods in terms of accuracy and robustness. The results confirm that F-KDE provides a valuable solution for accurately estimating multimodal PDFs.
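The core object, a kernel mixture in which each kernel unit carries its own weight and bandwidth, can be written down in a few lines. The sketch below evaluates that weighted form on a grid; the PSO search over weights and bandwidths and the error/log-likelihood objective described above are omitted.

```python
import numpy as np

def weighted_kde(x_grid, samples, weights, bandwidths):
    """Evaluate a KDE whose kernel units carry individual weights and
    bandwidths (the idea behind kernel-unit inequivalence). Weights are
    normalised to sum to one; bandwidths are per-sample."""
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    density = np.zeros_like(x_grid, dtype=float)
    for xi, wi, hi in zip(samples, weights, bandwidths):
        density += wi * np.exp(-0.5 * ((x_grid - xi) / hi) ** 2) / (hi * np.sqrt(2 * np.pi))
    return density

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.3, 50), rng.normal(3, 1.0, 50)])
grid = np.linspace(-5, 7, 400)
# Uniform weights/bandwidths reproduce a classical KDE; in F-KDE both would be tuned.
density = weighted_kde(grid, data, np.ones_like(data), np.full_like(data, 0.5))
```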
Funding: supported by the National Natural Science Foundation of China (Nos. 12375244 and 12135009) and the Hunan Provincial Innovation Foundation for Postgraduate (Nos. CX20210007 and CX20230008).
Abstract: In this study, we explore the impact of state-of-the-art laser fields on the α decay half-life of deformed ground-state odd-A nuclei within the proton number range of 52–107. The calculations show that the presence of a laser field modulates the α decay half-life by altering the α decay penetration probability within a limited range. Moreover, the variance in the rate of change of the penetration probability between even–odd and odd–even nuclei is investigated. Furthermore, we investigate the rate of change of the penetration probability for the same parent nucleus with different neutron numbers, based on the characteristics of the odd-A nucleus. We found that the influence of the laser field on the penetration probability is determined by both the shell effect and odd–even staggering. This research contributes to the understanding of the nuanced interactions between laser fields and nuclear decay processes and offers valuable insights for future experiments in laser–nuclear physics.
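For orientation, the penetration probability referred to above is typically computed in the WKB (Gamow) approximation, shown below in its generic textbook form rather than the paper's specific model: μ is the reduced mass of the α–daughter system, Q_α the decay energy, V(r) the total barrier, and r_in, r_out the classical turning points. A laser field enters by perturbing V(r) and the effective Q_α, which shifts P and hence the half-life (ν₀ is an assault frequency, up to a preformation factor).

```latex
P = \exp\!\left[ -\frac{2}{\hbar} \int_{r_{\mathrm{in}}}^{r_{\mathrm{out}}}
      \sqrt{\,2\mu \bigl( V(r) - Q_{\alpha} \bigr)}\; \mathrm{d}r \right],
\qquad
T_{1/2} \propto \frac{1}{\nu_{0}\, P}
```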
Funding: funded by the National Natural Science Foundation of China (No. 12202049) and the Beijing Institute of Technology Research Fund Program for Young Scholars, China.
Abstract: Tracking multiple space objects using multiple surveillance sensors is a critical approach in many Space Situation Awareness (SSA) applications. In this process, the uncertainties of targets, dynamics, and observations are usually represented by probability distributions. However, precise characterization of uncertainty becomes challenging due to imperfect knowledge about some key aspects, such as birth targets and sensor detection profiles. To overcome this challenge, this paper proposes a multi-sensor possibility PHD filter based on the theory of outer probability measures. An effective compensation method is introduced to tackle variations in the fields of view of SSA sensors or instances of missed detections, aiming to mitigate the inconsistency in localized information. The proposed method is adapted to centralized and distributed sensor networks, offering effective solutions for multi-sensor multi-target tracking. The major innovation of the proposed method compared with typical methods is the proper description of epistemic uncertainty, which yields more robust performance in scenarios lacking some information about the system. The effectiveness of the multi-sensor possibility PHD filter is demonstrated by a comparison with conventional methods in two simulated scenarios.
Funding: supported by the Liaoning Revitalization Talents Program (XLYC2203148).
Abstract: Dear Editor, This letter presents a joint probabilistic scheduling and resource allocation method (PSRA) for 5G-based wireless networked control systems (WNCSs). As a control-aware optimization method, PSRA minimizes the linear quadratic Gaussian (LQG) control cost of WNCSs by optimizing the activation probability of subsystems, the number of uplink repetitions, and the durations of the uplink and downlink phases. Simulation results show that PSRA achieves smaller LQG control costs than existing works.
Abstract: Based on analysis and research of the airworthiness objectives of the integrated modular avionics (IMA) system, and on the characteristics of the IMA system's comprehensive and complex cross-linking with other airborne systems, an extraction strategy for IMA system compliance flight test subjects and a selection method for IMA system compliance flight test parameters are proposed. A data analysis method based on the abnormal probability matrix of the IMA system is proposed for the first time, with which abnormal state information of the IMA system can be quickly identified. The compliance flight test of the IMA system is completed with limited flight test resources, saving flight test sorties and improving flight test efficiency. This research has been successfully applied to the airworthiness certification flight test of a civil transport aircraft in China and can provide technical support for subsequent type flight tests.
Abstract: We apply methods of algebraic integral geometry to prove a special case of the Gaussian kinematic formula of Adler-Taylor. The idea, suggested already by Adler and Taylor, is to view the GKF as the limit of spherical kinematic formulas for spheres of large dimension N and curvature 1/N.
Abstract: Backscatter communication (BC) is considered a key technology for self-sustainable communications, and an unmanned aerial vehicle (UAV) acting as a data collector can improve the efficiency of data collection. We consider a UAV-aided BC system, where power beacons (PBs) are deployed as dedicated radio frequency (RF) sources to supply power to backscatter devices (BDs). After harvesting enough energy, the BDs transmit data to the UAV. We use stochastic geometry to model the large-scale BC system. Specifically, the PBs are modeled as a type II Matérn hard-core point process (MHCPP II) and the BDs are modeled as a homogeneous Poisson point process (HPPP). Firstly, the BDs' activation probability and average coverage probability are derived. Then, to maximize the energy efficiency (EE), we optimize the RF power of the PBs under different PB densities. Furthermore, we compare the coverage probability and EE performance of our system with a benchmark scheme in which the distribution of PBs is modeled as an HPPP. Simulation results show that modeling the PBs as an MHCPP II yields better performance, and that the higher the density of PBs, the smaller the RF power required and the higher the EE.
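For readers unfamiliar with the point processes involved, the sketch below draws one realization of the two deployments on a unit square: power beacons as a Matérn type-II hard-core process (a Poisson parent thinned so that, of any two parents closer than the hard-core radius, only the one with the smaller random mark survives) and backscatter devices as an ordinary HPPP. Densities and the hard-core radius are arbitrary illustrative values.

```python
import numpy as np

def matern_type2(intensity, hardcore_r, size=1.0, rng=None):
    """Matérn type-II hard-core process on [0, size]^2: keep a parent point
    only if no other parent within hardcore_r carries a smaller mark."""
    if rng is None:
        rng = np.random.default_rng()
    n = rng.poisson(intensity * size ** 2)
    pts = rng.uniform(0.0, size, (n, 2))
    marks = rng.uniform(0.0, 1.0, n)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        dist = np.linalg.norm(pts - pts[i], axis=1)
        neighbours = (dist < hardcore_r) & (dist > 0.0)
        if np.any(marks[neighbours] < marks[i]):
            keep[i] = False
    return pts[keep]

rng = np.random.default_rng(2)
pbs = matern_type2(intensity=200, hardcore_r=0.05, rng=rng)   # power beacons (MHCPP II)
bds = rng.uniform(0.0, 1.0, (rng.poisson(500), 2))            # backscatter devices (HPPP)
print(len(pbs), "PBs retained,", len(bds), "BDs")
```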
Funding: supported by the Horizon Europe Framework Programme (HORIZON), call Teaming for Excellence (HORIZON-WIDERA-2022-ACCESS-01-two-stage), Creation of the Centre of Excellence in Smart Forestry "Forest 4.0", No. 101059985. This research was cofunded by FOREST 4.0 - "Ekscelencijos centras tvariai miško bioekonomikai vystyti" (Nr. 10-042-P-0002).
Abstract: Models that predict a forest stand's evolution are essential for developing plans for sustainable management. A simple mathematical framework was developed that considers the individual tree and stand basal area under random resource competition and is based on two assumptions: (1) a sigmoid-type stochastic process governs the tree and stand basal area dynamics of living and dying trees, and (2) the total area that a tree may potentially occupy determines the number of trees per hectare. The most effective way to satisfy these requirements is to formalize each tree's diameter and potentially occupied area using Gompertz-type stochastic differential equations governed by fixed- and mixed-effect parameters. Data from permanent experimental plots of long-term Lithuanian experiments were used to construct the tree and stand basal area models. The new models were relatively unbiased for live trees of all species, including silver birch (Betula pendula Roth), downy birch (Betula pubescens Ehrh.), spruce (Picea abies), and pine (Pinus sylvestris). Less reliable predictions were made for the basal area of dying trees. Pine gave the highest prediction accuracy for mean basal area among all live trees, while the mean basal area prediction for dying trees was lower than that for live trees. The newly developed basal area growth and yield models can be recommended despite their complex formulation and implementation challenges, particularly when data are scarce, because a newly observed plot provides sufficient information to calibrate the random effects.
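The Gompertz-type stochastic differential equation mentioned above is usually written as below, where X_t is the size variable (e.g., tree diameter or potentially occupied area), α and β set the sigmoid growth toward the asymptote e^{α/β}, σ scales the noise, and W_t is standard Brownian motion; in the mixed-effects version some of these parameters carry plot- or tree-level random effects. This is the generic form of such models, not the exact parameterization used in the paper.

```latex
\mathrm{d}X_{t} = \bigl( \alpha - \beta \ln X_{t} \bigr) X_{t}\, \mathrm{d}t
                  + \sigma X_{t}\, \mathrm{d}W_{t}
```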
Funding: The National Key Research and Development Program of China under contract No. 2021YFC3101503; the Hunan Provincial Natural Science Foundation of China under contract No. 2023JJ10053; the National Natural Science Foundation of China under contract Nos. 42276205 and 42406195; and the Youth Independent Innovation Science Foundation under contract No. ZK24-54.
Abstract: The Argo program measures temperature and salinity in the upper ocean (0–2000 m). These observations are critical for weather and climate studies, ocean circulation analysis, and sea-level monitoring. To address the limitations of traditional thresholds in Argo data quality control (QC), this study proposes a novel probability distribution-based inference method (PDIM) for temperature-salinity threshold inference. By integrating historical observations with climatological data, the method uses the historical data corresponding to each latitude-longitude grid cell, calculates temperature and salinity frequency distributions for each depth, and determines "zero probability" boundaries from the combined frequency distribution and climatology data. A probability distribution model is then established to detect outliers automatically based on features of the probability density function, which eliminates the traditional dependence on the normal-distribution hypothesis. When applied to global Argo datasets from the China Argo Real-time Data Center (CARDC), PDIM successfully identifies suspicious profiles and sensor drifts with high reliability, achieving a low false positive rate (0.55% for temperature, 0.18% for salinity) while maintaining competitive true positive rates (28.29% for temperature, 55.15% for salinity). This method is expected to improve the reliability of Argo data QC.
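A stripped-down Python illustration of the per-depth frequency-distribution idea: build a histogram of historical values for one grid cell and depth, take the outermost bins with non-zero counts as the "zero probability" boundaries, and flag profile values that fall outside them. The real PDIM additionally blends climatology and fits a full probability model; the value range, bin width, and data here are hypothetical.

```python
import numpy as np

def zero_probability_bounds(history, value_range=(-3.0, 40.0), bin_width=0.1):
    """Histogram historical values for one grid cell and depth over a fixed
    physical range, and return the outermost bin edges that still have
    non-zero frequency (the "zero probability" boundaries)."""
    edges = np.arange(value_range[0], value_range[1] + bin_width, bin_width)
    counts, edges = np.histogram(history, bins=edges)
    nz = np.nonzero(counts)[0]
    return edges[nz[0]], edges[nz[-1] + 1]

def flag_profile(profile, bounds_per_depth):
    """True marks a level whose value lies outside the historical support."""
    return [not (lo <= v <= hi) for v, (lo, hi) in zip(profile, bounds_per_depth)]

rng = np.random.default_rng(3)
history = rng.normal(10.0, 0.8, 5000)          # hypothetical historical temperatures at one depth
bounds = zero_probability_bounds(history)
print(flag_profile([10.2, 14.9], [bounds, bounds]))   # second value flagged as suspicious
```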