Engineering tests can yield inaccurate data due to instrument errors, human factors, and environmental interference, introducing uncertainty into numerical model updating. This study employs the probability-box (p-box) method to represent observational uncertainty and develops a two-step approximate Bayesian computation (ABC) framework using time-series data. Within the ABC framework, the Euclidean and Bhattacharyya distances are employed as uncertainty quantification metrics to define the approximate likelihood functions in the first and second steps, respectively. A novel variational Bayesian Monte Carlo method is introduced to apply the ABC framework efficiently under observational uncertainty, yielding rapid convergence and accurate parameter estimation with few iterations. The efficacy of the proposed updating strategy is validated through application to a shear frame model excited by a seismic wave and to an aviation pump force sensor for thermal output analysis. The results affirm the efficiency, robustness, and practical applicability of the proposed method. Funding: supported by the National Natural Science Foundation of China (Grant No. U23B20105).
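The two distance metrics that define the approximate likelihoods are easy to state in code. The sketch below is a generic illustration of the Euclidean (step 1) and Bhattacharyya (step 2) distances between simulated and observed data, not the authors' implementation; the binning choice and acceptance rule are assumptions.

```python
import numpy as np

def euclidean_distance(sim, obs):
    """Step-1 metric: pointwise distance between simulated and observed feature vectors."""
    return np.linalg.norm(np.asarray(sim) - np.asarray(obs))

def bhattacharyya_distance(sim, obs, n_bins=30):
    """Step-2 metric: distance between the *distributions* of simulated and observed data."""
    lo, hi = min(sim.min(), obs.min()), max(sim.max(), obs.max())
    p, _ = np.histogram(sim, bins=n_bins, range=(lo, hi))
    q, _ = np.histogram(obs, bins=n_bins, range=(lo, hi))
    p, q = p / p.sum(), q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
    return -np.log(max(bc, 1e-12))       # guard against log(0)

# Tiny demo: close distributions give a small distance, shifted ones a large one
rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 1000)
print(bhattacharyya_distance(rng.normal(0.0, 1.0, 1000), obs))  # small
print(bhattacharyya_distance(rng.normal(3.0, 1.0, 1000), obs))  # large
# In ABC, a candidate parameter set is kept when its distance falls below a
# tolerance, which defines the approximate likelihood in each step.
```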
The effective and timely diagnosis and treatment of ocular diseases are key to the rapid recovery of patients. Today, the mass disease that needs attention in this context is cataracts. Although deep learning has significantly advanced the analysis of ocular disease images, a probabilistic model is needed to generate the distributions of potential outcomes and thus support decisions with uncertainty quantification. Therefore, this study implements a Bayesian Convolutional Neural Network (BCNN) model for predicting cataracts by assigning probability values to the predictions. It prepares convolutional neural network (CNN) and BCNN models; the proposed BCNN model is CNN-based, with reparameterization applied in the first and last layers of the CNN model. The study then trains both on a dataset of cataract images filtered from the ocular disease fundus images from Kaggle. The deep CNN model has an accuracy of 95%, while the BCNN model has an accuracy of 93.75% along with uncertainty estimates for cataract and normal eye conditions. Comparison with other methods shows that the proposed work can be a promising solution for cataract prediction with uncertainty estimation. Funding: supported in Saudi Arabia through the Small Research Group Project under Grant Number RGP.1/316/45.
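A common way to make selected layers of a CNN Bayesian is the reparameterization trick: each weight carries a learned mean and scale, and a fresh weight sample is drawn on every forward pass, so repeated passes yield a predictive distribution. The following is a minimal generic sketch of that idea; the layer sizes and architecture are illustrative, not the paper's network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer with a factorized Gaussian over weights (reparameterization trick)."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(n_out, n_in))
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -5.0))  # softplus(rho) = std
        self.bias = nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        std = F.softplus(self.w_rho)
        w = self.w_mu + std * torch.randn_like(std)   # fresh weight sample per call
        return F.linear(x, w, self.bias)

def predict_with_uncertainty(model, x, n_samples=50):
    """Average many stochastic forward passes: mean = prediction, std = uncertainty."""
    probs = torch.stack([torch.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(0), probs.std(0)

# Demo with random inputs standing in for image features
model = nn.Sequential(BayesianLinear(64, 32), nn.ReLU(), BayesianLinear(32, 2))
mean, std = predict_with_uncertainty(model, torch.randn(1, 64))
print(mean, std)   # class probabilities (e.g. cataract / normal) with their spread
```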
This study investigates photonuclear reaction (γ,n) cross-sections using Bayesian neural network (BNN) analysis. After determining the optimal network architecture, which features two hidden layers, each with 50 hidden nodes, training was conducted for 30,000 iterations to ensure comprehensive data capture. By analyzing the distribution of absolute errors, which correlate positively with the cross-section for the isotope ^(159)Tb, as well as the relative errors, which are unrelated to the cross-section, we confirmed that the network effectively captured the data features without overfitting. Comparison with the TENDL-2021 database demonstrated the BNN's reliability in fitting photonuclear cross-sections with lower average errors. The predictions for nuclei with single and double giant dipole resonance peaks in their cross-sections, the accurate determination of the photoneutron reaction threshold in the low-energy region, and the precise description of trends in the high-energy cross-sections further demonstrate the network's generalization ability on the validation set. This can be attributed to the consistency of the training data. By using consistent training sets from different laboratories, Bayesian neural networks can predict nearby unknown cross-sections based on existing laboratory data, thereby estimating the potential differences between other laboratories' existing data and their own measurement results. Experimental measurements of photonuclear reactions on the newly constructed SLEGS beamline will contribute to clarifying the differences in cross-sections within the existing data. Funding: supported by the National Key Research and Development Program (No. 2022YFA1602404), the National Natural Science Foundation of China (Nos. 12388102, 12275338, and 12005280), and the Key Laboratory of Nuclear Data Foundation (No. JCKY2022201C152).
Lithology identification has important significance for basic geological research and engineering applications, and this paper proposes a Bayesian-optimized lithology identification method based on machine learning of rock visible and near-infrared spectral data. First, the rock spectral data are preprocessed using Savitzky-Golay (SG) smoothing to remove noise; then, the preprocessed spectra are reduced in dimension using Principal Component Analysis (PCA) to remove redundancy, retain the effective discriminative information, and obtain the rock spectral features; finally, a lithology identification model is established on these features, its hyperparameters are tuned with the Bayesian optimization (BO) algorithm to keep the hyperparameter combination from falling into a local optimum, and the predicted rock type is output. In addition, this paper conducts comparative analyses of models based on Artificial Neural Network (ANN) versus Random Forest (RF), dimensionality reduction versus the full band, and different optimization algorithms, using the confusion matrix, accuracy, Precision (P), Recall (R), and F_(1) value (F_(1)) as the evaluation indexes of model accuracy. The results indicate that the BO-ANN model after dimensionality reduction achieves an accuracy of up to 99.80%, with the other evaluation indexes reaching up to 99.79%. Compared with the BO-RF model, it has higher identification accuracy and better stability for each type of rock. The experiments and reliability analysis show that the proposed method has good robustness and generalization performance, which is of great significance for realizing fast and accurate lithology identification at tunnel sites. Funding: supported by the National Natural Science Foundation of China (Grant Nos. 52379103 and 52279103) and the Natural Science Foundation of Shandong Province (Grant No. ZR2023YQ049).
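The preprocessing-plus-tuning pipeline maps directly onto standard libraries. The schematic sketch below uses synthetic stand-in spectra, an assumed (50, 50) hidden architecture, and scikit-optimize's BayesSearchCV as the BO step; it illustrates the workflow rather than reproducing the paper's model.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from skopt import BayesSearchCV
from skopt.space import Real

# Synthetic stand-in for visible/near-infrared spectra: rows = samples, cols = wavelengths
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 200))
y = rng.integers(0, 4, size=300)                # four rock classes (illustrative)

X_smooth = savgol_filter(X, window_length=11, polyorder=2, axis=1)  # SG denoising
X_feat = PCA(n_components=10).fit_transform(X_smooth)               # dimensionality reduction

bo_ann = BayesSearchCV(
    MLPClassifier(hidden_layer_sizes=(50, 50), max_iter=1000),
    {"alpha": Real(1e-6, 1e-1, prior="log-uniform"),
     "learning_rate_init": Real(1e-4, 1e-1, prior="log-uniform")},
    n_iter=25, cv=5, scoring="accuracy",
)
bo_ann.fit(X_feat, y)
print(bo_ann.best_params_, bo_ann.best_score_)
```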
In this paper, an advanced satellite navigation filter design, referred to as the Variational Bayesian Maximum Correntropy Extended Kalman Filter (VBMCEKF), is introduced to enhance robustness and adaptability in scenarios with non-Gaussian noise and heavy-tailed outliers. The proposed design modifies the extended Kalman filter (EKF) for the global navigation satellite system (GNSS), integrating the maximum correntropy criterion (MCC) and the variational Bayesian (VB) method. This adaptive algorithm effectively reduces non-line-of-sight (NLOS) reception contamination and improves estimation accuracy, particularly for time-varying GNSS measurements. Experimental results show that the proposed method significantly outperforms conventional approaches in estimation accuracy under heavy-tailed outliers and non-Gaussian noise. By combining MCC with VB approximation for real-time noise covariance estimation using fixed-point iteration, the VBMCEKF achieves superior filtering performance in challenging GNSS conditions. The method's adaptability and precision make it well suited to improving satellite navigation performance in stochastic environments. Funding: supported by the National Science and Technology Council, Taiwan, under grants NSTC 111-2221-E-019-047 and NSTC 112-2221-E-019-030.
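The maximum correntropy criterion typically enters a Kalman-type update as a Gaussian-kernel weight on each measurement residual, recomputed by fixed-point iteration so that outlying (e.g., NLOS-contaminated) measurements are de-weighted. Below is a simplified sketch of that reweighting, assuming a linear measurement model and diagonal R; it illustrates the mechanism, not the paper's full VBMCEKF.

```python
import numpy as np

def mcc_weights(residuals, sigma):
    """Gaussian-kernel correntropy weights: inliers get ~1, heavy-tailed outliers ~0."""
    return np.exp(-np.asarray(residuals) ** 2 / (2.0 * sigma ** 2))

def mcc_update(x_pred, P, H, R, z, sigma, n_iter=5):
    """Fixed-point measurement update: weights inflate R for suspect measurements."""
    x = x_pred.copy()
    for _ in range(n_iter):
        w = mcc_weights(z - H @ x, sigma)
        R_w = np.diag(np.diag(R) / np.maximum(w, 1e-8))   # assumes diagonal R
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R_w)
        x = x_pred + K @ (z - H @ x_pred)
    return x

# Demo: 2-state estimate where the second measurement is an NLOS-like outlier
x_pred, P, H, R = np.zeros(2), np.eye(2), np.eye(2), 0.1 * np.eye(2)
z = np.array([0.1, 5.0])
print(mcc_update(x_pred, P, H, R, z, sigma=1.0))   # second component barely trusted
```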
Integrating Bayesian Optimization with Volume of Fluid (VOF) simulations, this work optimizes the operational conditions and geometric parameters of T-junction microchannels for target droplet sizes. Bayesian Optimization uses a Gaussian Process (GP) as its core model and employs an adaptive search strategy to efficiently explore and identify optimal combinations of operational parameters within a limited parameter space, thereby enabling rapid optimization toward the target droplet size. Traditional methods typically rely on manually selecting a series of operational parameters and conducting multiple simulations to gradually approach the target droplet size; this process is time-consuming and prone to getting trapped in local optima. In contrast, Bayesian Optimization adaptively adjusts its search strategy, significantly reducing computational costs and effectively exploring global optima, thus greatly improving optimization efficiency. Additionally, the study investigates the impact of rectangular rib structures within the T-junction microchannel on droplet generation, revealing how the channel geometry influences droplet formation and size. After determining the target droplet size, we further applied Bayesian Optimization to refine the rib geometry. The integration of Bayesian Optimization with computational fluid dynamics (CFD) offers a promising tool and provides new insights into the optimal design of microfluidic devices. Funding: supported by the National Key Research and Development Program of China (2023YFC3905400), the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA0490102), and the National Natural Science Foundation of China (22178354, 22421003, 22408374).
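Conceptually, each VOF run is an expensive black-box function from operating and geometry parameters to droplet size, which Bayesian Optimization probes through a GP surrogate. A minimal scikit-optimize loop is sketched below; `run_vof_case` is a toy analytic surrogate standing in for the real CFD solve, and the parameter names and ranges are illustrative assumptions.

```python
from skopt import gp_minimize
from skopt.space import Real

TARGET = 120e-6   # target droplet diameter in metres (illustrative value)

def run_vof_case(q_disp, q_cont, rib_h):
    """Toy analytic surrogate standing in for one full VOF simulation."""
    return 250e-6 * q_disp / (q_disp + 0.2 * q_cont) + 0.3 * rib_h

def objective(params):
    return (run_vof_case(*params) - TARGET) ** 2    # squared miss from target size

res = gp_minimize(
    objective,
    dimensions=[Real(0.1, 5.0, name="q_disp"),      # dispersed-phase flow rate
                Real(1.0, 50.0, name="q_cont"),     # continuous-phase flow rate
                Real(0.0, 50e-6, name="rib_h")],    # rib height (assumed range)
    n_calls=30, acq_func="EI",                      # expected-improvement acquisition
    random_state=0,
)
print("best parameters:", res.x, "squared error:", res.fun)
```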
A Bayesian network reconstruction method based on norm minimization is proposed to address the sparsity and iterative divergence issues in network reconstruction caused by noise and missing values. The method achieves precise adjustment of the network structure by constructing a preliminary random network model and introducing small-world network characteristics, and it combines L1 norm minimization regularization to control model complexity and optimize the inference of variable dependencies. In the game network reconstruction experiment, when the L1 norm minimization model reaches a 100% success rate in reconstructing existing connections, the minimum data required is about 40%, versus about 45% for a sparse Bayesian learning network. In terms of operational efficiency, the running time of L1 norm minimization stays at roughly 1.0 s, whereas that of the sparse Bayesian method grows significantly with data volume, reaching a maximum of 13.2 s. Meanwhile, at a signal-to-noise ratio of 10 dB, the L1 model achieves a 100% success rate in reconstructing existing connections, while the sparse Bayesian network reaches at best a 90% success rate in reconstructing non-existent connections. In the analysis of an actual case, the maximum lift-and-drop trajectory obtained by the research method is 0.08 m, and the mean square error is 5.74 cm^(2). The results indicate that this norm minimization-based method performs well in data efficiency and model stability, effectively reducing the impact of outliers on the reconstruction results so as to reflect the actual situation more accurately. Funding: supported by the Scientific and Technological Developing Scheme of Jilin Province, China (No. 20240101371JC) and the National Natural Science Foundation of China (No. 62107008).
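L1-regularized network reconstruction is commonly cast as one sparse regression per node: each node's observations are regressed on all other nodes, and nonzero coefficients are read as links. The sketch below shows that generic idea with scikit-learn's Lasso; it is not the paper's exact model, and the regularization strength is an assumption.

```python
import numpy as np
from sklearn.linear_model import Lasso

def reconstruct_network(X, alpha=0.05):
    """X: (n_samples, n_nodes) observed node states; returns a binary adjacency guess."""
    n_nodes = X.shape[1]
    A = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        others = np.delete(np.arange(n_nodes), i)
        model = Lasso(alpha=alpha).fit(X[:, others], X[:, i])  # L1-regularized fit
        A[i, others] = np.abs(model.coef_) > 1e-6              # nonzero coef -> edge
    return A

# Toy check: three nodes where node 2 depends on nodes 0 and 1
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:, 2] = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)
print(reconstruct_network(X))
```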
Objective To investigate the spatiotemporal patterns and socioeconomic factors influencing the incidence of tuberculosis (TB) in Guangdong Province between 2010 and 2019. Method Spatial and temporal variations in TB incidence were mapped using heat maps and hierarchical clustering. Socioenvironmental influencing factors were evaluated using a Bayesian spatiotemporal conditional autoregressive (ST-CAR) model. Results The annual incidence of TB in Guangdong decreased from 91.85/100,000 in 2010 to 53.06/100,000 in 2019. Spatial hotspots were found in northeastern Guangdong, particularly in Heyuan, Shanwei, and Shantou, while Shenzhen, Dongguan, and Foshan in the Pearl River Delta had the lowest rates. The ST-CAR model showed that TB risk was lower with higher per capita Gross Domestic Product (GDP) [Relative Risk (RR), 0.91; 95% Confidence Interval (CI): 0.86–0.98], a higher ratio of licensed physicians and physician assistants (RR, 0.94; 95% CI: 0.90–0.98), and higher per capita public expenditure (RR, 0.94; 95% CI: 0.90–0.97), with a marginal effect of population density (RR, 0.86; 95% CI: 0.86–1.00). Conclusion The incidence of TB in Guangdong varies spatially and temporally. Areas with poor economic conditions and insufficient healthcare resources are at an increased risk of TB infection. Strategies focusing on equitable health resource distribution and economic development are key to TB control. Funding: supported by the Guangdong Provincial Clinical Research Center for Tuberculosis (No. 2020B1111170014).
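The relative risks reported from the ST-CAR model are exponentiated coefficients of a Poisson regression on case counts with a population offset. The simplified, non-spatial analogue below (synthetic data, illustrative variable names) shows where such RR and CI values come from; the full model additionally includes the CAR spatial and temporal random effects.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic district-year data standing in for the real covariates
rng = np.random.default_rng(1)
n = 200
gdp, phys, pub_exp, dens = rng.normal(size=(4, n))        # standardized covariates
population = rng.integers(50_000, 500_000, size=n).astype(float)
true_log_rate = np.log(6e-4) - 0.10 * gdp - 0.06 * phys - 0.06 * pub_exp - 0.05 * dens
cases = rng.poisson(population * np.exp(true_log_rate))

X = sm.add_constant(np.column_stack([gdp, phys, pub_exp, dens]))
fit = sm.GLM(cases, X, family=sm.families.Poisson(), offset=np.log(population)).fit()

print(np.exp(fit.params[1:]))       # relative risks per covariate (~0.90, 0.94, ...)
print(np.exp(fit.conf_int()[1:]))   # their 95% confidence intervals
```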
Objective: Esophageal cancer contributes substantially to the cancer burden in Jiangsu Province, East China. This study aimed to report the esophageal cancer incidence trend over 2009–2019 and its projection to 2030. Methods: The burden of esophageal cancer in Jiangsu in 2019 was estimated using data from 54 cancer registries selected from the Jiangsu Cancer Registry. Incident cases from 16 cancer registries were used for the temporal trend from 2009 to 2019. The burden of esophageal cancer by 2030 was projected using the Bayesian age-period-cohort (BAPC) model. Results: About 24,886 new cases of esophageal cancer (17,233 males and 7,653 females) occurred in Jiangsu in 2019. Rural regions of Jiangsu had the highest incidence rate. The age-standardized incidence rate (ASIR, per 100,000 population) of esophageal cancer in Jiangsu decreased from 27.72 per 100,000 in 2009 to 14.18 per 100,000 in 2019. The BAPC model projected that the ASIR would decline from 13.01 per 100,000 in 2020 to 4.88 per 100,000 in 2030. Conclusions: Esophageal cancer incidence rates are predicted to decline through 2030, yet the disease burden in Jiangsu remains significant. The existing approaches to prevention and control are effective and need to be maintained.
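The BAPC projection itself is usually run with the dedicated BAPC/INLA toolchain in R. As a rough sanity check only, a constant log-linear decline fitted to the two reported ASIR endpoints can be extrapolated as below; this simplified analogue ignores the age and cohort effects, which is why it lands above the BAPC figure of 4.88.

```python
import numpy as np

years = np.array([2009, 2019])
asir = np.array([27.72, 14.18])                 # reported ASIR per 100,000

slope, intercept = np.polyfit(years, np.log(asir), 1)  # exact fit through two points
for y in (2020, 2025, 2030):
    print(y, round(float(np.exp(intercept + slope * y)), 2))
# A constant log-linear decline gives ~6.8 per 100,000 by 2030; the BAPC model,
# which also captures age and cohort structure, projects a steeper fall to 4.88.
```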
Fire can cause significant damage to the environment, economy, and human lives. If fire can be detected early, the damage can be minimized. Advances in technology, particularly in computer vision powered by deep learning, have enabled automated fire detection in images and videos. Several deep learning models have been developed for object detection, including applications in fire and smoke detection. This study focuses on optimizing the training hyperparameters of YOLOv8 and YOLOv10 models using Bayesian Tuning (BT). Experimental results on the large-scale D-Fire dataset demonstrate that this approach enhances detection performance. Specifically, the proposed approach improves the mean average precision at an Intersection over Union (IoU) threshold of 0.5 (mAP50) of the YOLOv8s, YOLOv10s, YOLOv8l, and YOLOv10l models by 0.26, 0.21, 0.84, and 0.63, respectively, compared to models trained with the default hyperparameters. The performance gains are more pronounced in the larger models, YOLOv8l and YOLOv10l, than in their smaller counterparts, YOLOv8s and YOLOv10s. Furthermore, YOLOv8 models consistently outperform YOLOv10, with mAP50 improvements of 0.26 for YOLOv8s over YOLOv10s and 0.65 for YOLOv8l over YOLOv10l when trained with BT. These results establish YOLOv8 as the preferred model for fire detection applications where detection performance is prioritized. Funding: supported by the MSIT (Ministry of Science and ICT), Republic of Korea, under the ITRC (Information Technology Research Center) Support Program (IITP-2024-RS-2022-00156354) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation), and by the Technology Development Program (RS-2023-00264489) funded by the Ministry of SMEs and Startups (MSS, Republic of Korea).
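One way to realize Bayesian tuning of YOLO training hyperparameters is to wrap a full train-and-validate cycle inside a GP-based optimizer. The sketch below pairs ultralytics with scikit-optimize; the dataset YAML name, epoch budget, tuned hyperparameters, and search ranges are all assumptions rather than the paper's settings, and each objective call is a complete (expensive) training run.

```python
from skopt import gp_minimize
from skopt.space import Real
from ultralytics import YOLO

def objective(params):
    lr0, momentum, weight_decay = params
    model = YOLO("yolov8s.pt")                       # restart from pretrained weights
    model.train(data="dfire.yaml", epochs=50, lr0=lr0,
                momentum=momentum, weight_decay=weight_decay, verbose=False)
    metrics = model.val()
    return -metrics.box.map50                        # maximize mAP50 -> minimize negative

res = gp_minimize(objective,
                  [Real(1e-4, 1e-1, prior="log-uniform"),   # lr0
                   Real(0.6, 0.98),                         # momentum
                   Real(1e-5, 1e-2, prior="log-uniform")],  # weight_decay
                  n_calls=25)
print("best hyperparameters:", res.x, "best mAP50:", -res.fun)
```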
Disaster mitigation necessitates scientific and accurate aftershock forecasting during the critical 2 h after an earthquake. However, this task faces immense challenges due to the lack of early postearthquake data and the unreliability of forecasts. To obtain foundational data on sequence parameters of the land-sea adjacent zone and establish a reliable and operational aftershock forecasting framework, we combined the initial sequence parameters extracted from envelope functions and incorporated small-earthquake information into our model to construct a Bayesian algorithm for the early postearthquake stage. We performed parameter fitting, early postearthquake aftershock occurrence rate forecasting, and effectiveness evaluation for 36 earthquake sequences with M ≥ 4.0 in the Bohai Rim region since 2010. According to the results, during the early stage after the mainshock, earthquake sequence parameters exhibited relatively drastic fluctuations with significant errors. The integration of prior information can mitigate the intensity of these changes and reduce errors. The initial and stable sequence parameters generally display advantageous distribution characteristics, with each parameter's distribution being relatively concentrated and showing good symmetry and remarkable consistency. The sequence parameter p-values were relatively small, which indicates the comparatively slow attenuation of significant earthquake events in the Bohai Rim region. A certain positive correlation was observed between the earthquake sequence parameters b and p. However, the sequence parameters are unrelated to the mainshock magnitude, which implies that their statistical characteristics and trends are universal. The Bayesian algorithm showed good forecasting capability for aftershocks in the early postearthquake period (2 h) in the Bohai Rim region, with an overall forecasting efficacy rate of 76.39%. The proportion of "too low" failures exceeded that of "too high" failures, and the number of forecasting failures for the next three days was greater than that for the next day. Funding: supported by the Natural Science Foundation of Tianjin (No. 22JCQNJC01070), the National Natural Science Foundation of China (No. 42404079), and the Key Project of Tianjin Earthquake Agency (No. Zd202402).
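Operational aftershock-rate forecasts of this kind build on the modified Omori (Omori-Utsu) law, λ(t) = K/(t + c)^p, whose integral over a forecast window gives the expected number of events. The snippet below is a generic illustration with made-up parameter values, not the paper's Bayesian envelope-based estimates.

```python
import numpy as np
from scipy.integrate import quad

def omori_rate(t, K, c, p):
    """Omori-Utsu aftershock rate (events per day) at time t days after the mainshock."""
    return K / (t + c) ** p

# Illustrative parameter values; a small p means slow decay of the sequence
K, c, p = 80.0, 0.05, 1.1

next_day, _ = quad(omori_rate, 1.0, 2.0, args=(K, c, p))
next_3days, _ = quad(omori_rate, 1.0, 4.0, args=(K, c, p))
print(f"expected aftershocks, day 2: {next_day:.1f}; days 2-4: {next_3days:.1f}")
# A forecast "fails too low/high" when the observed count leaves the predictive
# interval implied by this rate; the Bayesian step places priors on K, c, p.
```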
Edge computing (EC) combined with the Internet of Things (IoT) provides a scalable and efficient solution for smart homes. The rapid proliferation of IoT devices poses real-time data processing and security challenges. EC has become a transformative paradigm for addressing these challenges, particularly in intrusion detection and anomaly mitigation. The widespread connectivity of IoT edge networks has exposed them to various security threats, necessitating robust strategies to detect malicious activities. This research presents a privacy-preserving federated anomaly detection framework combining Bayesian game theory (BGT) and double deep Q-learning (DDQL). The framework integrates BGT to model attacker-defender interactions for dynamic adaptation to threat levels and resource availability, capturing the strategic interplay between attackers and defenders under uncertainty. DDQL is incorporated to optimize decision-making and to learn optimal defense policies at the edge. Federated learning (FL) enables decentralized anomaly detection without sharing sensitive data between devices. Data were collected from various sensors in a real-time EC-IoT network to identify irregularities caused by different attacks. The results reveal that the proposed model achieves a detection accuracy of up to 98% while maintaining low resource consumption. This study demonstrates the synergy between game theory and FL in strengthening anomaly detection in EC-IoT networks. Funding: supported by the Deanship of Research and Graduate Studies at King Khalid University through the Large Group Project (grant number RGP2/337/46), and by the Deanship of Graduate Studies and Scientific Research at Najran University under the Nama'a program (project code NU/GP/SERC/13/352-4).
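The DDQL component follows the standard double-DQN recipe: the online network selects the next action and the target network evaluates it, which curbs Q-value overestimation. Below is a minimal PyTorch rendering of that target computation; the 8-feature device state and the three defense actions are illustrative assumptions, not the paper's agent.

```python
import torch
import torch.nn as nn

def ddql_targets(online_net, target_net, rewards, next_states, dones, gamma=0.99):
    """Double-DQN target: r + gamma * Q_target(s', argmax_a Q_online(s', a))."""
    with torch.no_grad():
        a_star = online_net(next_states).argmax(dim=1, keepdim=True)    # select action
        q_next = target_net(next_states).gather(1, a_star).squeeze(1)   # evaluate it
        return rewards + gamma * (1.0 - dones) * q_next

# Demo: 8-dim device-state features, 3 defense actions (e.g. allow / throttle / isolate)
online = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 3))
target = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 3))
s_next = torch.randn(16, 8)
r, d = torch.randn(16), torch.zeros(16)
y = ddql_targets(online, target, r, s_next, d)
print(y.shape)
# A training step would minimize MSE between Q_online(s).gather(1, a) and y,
# periodically copying the online weights into the target network.
```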
To address the high experimental cost of ammunition, the scarcity of field test data, and the difficulty of applying classical statistical methods to hit probability estimation, this paper assumes that the projectile dispersion of ammunition follows a two-dimensional joint normal distribution and proposes a new Bayesian inference method for ammunition hit probability based on the normal-inverse Wishart distribution. First, the conjugate joint prior distribution of the projectile dispersion characteristic parameters is determined to be a normal-inverse Wishart distribution, and the hyperparameters of the prior are estimated from simulation data and historical measured data. Second, the field test data are integrated via Bayes' formula to obtain the joint posterior distribution of the projectile dispersion characteristic parameters, from which the hit probability of the ammunition is estimated. Finally, compared with the binomial distribution method, the proposed method can incorporate the dispersion information of ammunition projectiles, so the hit probability information is used more fully and the estimates are closer to the field shooting test samples. The method has strong applicability and yields more accurate hit probability estimates. Funding: supported by the National Natural Science Foundation of China (No. 71501183).
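The conjugacy exploited here is standard: with a normal-inverse-Wishart prior NIW(μ₀, κ₀, ν₀, Λ₀) on the mean and covariance of the 2-D impact distribution, the posterior after n impacts is again NIW with closed-form updated hyperparameters, and the hit probability follows by Monte Carlo over the posterior predictive. A sketch assuming a circular target centered at the aim point (prior values and target radius are illustrative):

```python
import numpy as np
from scipy.stats import invwishart

def niw_posterior(mu0, kappa0, nu0, Lambda0, X):
    """Conjugate NIW update from observed 2-D impact points X of shape (n, 2)."""
    n, xbar = len(X), X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar)                        # scatter matrix
    kappa_n, nu_n = kappa0 + n, nu0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    d = (xbar - mu0).reshape(-1, 1)
    Lambda_n = Lambda0 + S + (kappa0 * n / kappa_n) * (d @ d.T)
    return mu_n, kappa_n, nu_n, Lambda_n

def hit_probability(mu_n, kappa_n, nu_n, Lambda_n, radius, n_mc=5000, seed=0):
    """Posterior-predictive probability that one shot lands within `radius` of origin."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_mc):
        Sigma = invwishart.rvs(df=nu_n, scale=Lambda_n, random_state=rng)
        mu = rng.multivariate_normal(mu_n, Sigma / kappa_n)
        impact = rng.multivariate_normal(mu, Sigma)      # one simulated shot
        hits += np.hypot(*impact) <= radius
    return hits / n_mc

# Demo: vague prior plus 20 observed impacts, target radius 1 m (all illustrative)
rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.2, -0.1], [[0.25, 0.05], [0.05, 0.16]], size=20)
post = niw_posterior(np.zeros(2), 1.0, 5.0, np.eye(2), X)
print(hit_probability(*post, radius=1.0))
```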
Advanced programmable metamaterials with heterogeneous microstructures have become increasingly prevalent in scientific and engineering disciplines owing to their tunable properties. However, exploring the structure-property relationship in these materials, including forward prediction and inverse design, presents substantial challenges: the inhomogeneous microstructures significantly complicate traditional analytical or simulation-based approaches. Here, we establish a novel framework that integrates a machine learning (ML)-encoded multiscale computational method for forward prediction with Bayesian optimization for inverse design. Unlike prior end-to-end ML methods limited to specific problems, our framework is both load-independent and geometry-independent: a single training session for a constitutive model suffices to tackle various problems directly, eliminating the need for repeated data collection or training. We demonstrate the efficacy and efficiency of this framework using metamaterials with designable elliptical-hole or lattice-honeycomb microstructures. Leveraging accelerated forward prediction, we can precisely customize the stiffness and shape of metamaterials under diverse loading scenarios, and extend this capability seamlessly to multi-objective customization. Moreover, we achieve topology optimization for stress alleviation at the crack tip, reducing the von Mises stress by up to 41.2% and yielding a theoretically interpretable pattern. This framework offers a general, efficient, and precise tool for analyzing the structure-property relationships of novel metamaterials. Funding: supported by the National Natural Science Foundation of China (Grant Nos. 12102021, 12372105, 12172026, and 12225201), the Fundamental Research Funds for the Central Universities, and the Academic Excellence Foundation of BUAA for PhD Students.
Assessing the stability of slopes is one of the crucial tasks of geotechnical engineering for evaluating and managing risks related to natural hazards, directly affecting safety and sustainable development. This study focuses on developing robust and practical hybrid models to predict the slope stability status of the circular failure mode. For this purpose, three models were developed using a database of 627 case histories of slope stability status. The models were built with the random forest (RF), support vector machine (SVM), and extreme gradient boosting (XGB) techniques, employing a 5-fold cross-validation approach. To enhance performance, this study employs a Bayesian optimizer (BO) to fine-tune their hyperparameters. The results indicate that the performance order of the three developed models is RF-BO > SVM-BO > XGB-BO. Furthermore, comparison with previous models shows that the RF-BO model can determine the slope stability status with outstanding performance. This implies that the RF-BO model could serve as a dependable tool for project managers, assisting in the evaluation of slope stability during both the design and operational phases of projects, despite the inherent challenges in this domain. The parameter importance results indicate that cohesion, friction angle, and slope height exert the most significant influence on slope stability status. This suggests that concentrating on these parameters and employing the RF-BO model can effectively mitigate the severity of geohazards in the short term and contribute to long-term sustainable development objectives.
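The RF-BO hybrid amounts to Bayesian optimization over the random forest's hyperparameters with 5-fold cross-validated accuracy as the objective. A compact sketch with scikit-optimize follows; the data are a synthetic stand-in for the 627-case database and the search ranges are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from skopt import BayesSearchCV
from skopt.space import Integer

# Synthetic stand-in for the slope database (features such as cohesion,
# friction angle, slope height; labels stable / unstable)
X, y = make_classification(n_samples=627, n_features=6, random_state=0)

rf_bo = BayesSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": Integer(50, 500),
     "max_depth": Integer(2, 30),
     "min_samples_leaf": Integer(1, 10)},
    n_iter=30, cv=5, scoring="accuracy", random_state=0,
)
rf_bo.fit(X, y)
print(rf_bo.best_params_, rf_bo.best_score_)
# Feature importances of the tuned forest indicate which inputs dominate stability
print(rf_bo.best_estimator_.feature_importances_)
```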
This study investigated forest recovery in the Atlantic Rainforest and Rupestrian Grassland of Brazil using the diffusive-logistic growth (DLG) model. The model simulates vegetation growth in the two mountain biomes as a function of spatial location, time, and two key parameters: the diffusion rate and the growth rate. A Bayesian framework is employed to analyze the model's parameters and assess prediction uncertainties. Satellite imagery from 1992 and 2022 was used for model calibration and validation. By solving the DLG model with the finite difference method, we predicted a 6.6%–51.1% increase in vegetation density for the Atlantic Rainforest and a 5.3%–99.9% increase for the Rupestrian Grassland over 30 years, with the latter showing slower recovery but achieving a better model fit (lower RMSE) than the Atlantic Rainforest. The Bayesian approach revealed well-defined parameter distributions and lower parameter values for the Rupestrian Grassland, supporting the slower recovery prediction. Importantly, the model achieved good agreement with observed vegetation patterns in unseen validation data for both biomes. While there were minor spatial variations in accuracy, the overall distributions of predicted and observed vegetation density were comparable. Furthermore, this study highlights the importance of considering uncertainty in model predictions: Bayesian inference allowed us to quantify this uncertainty, demonstrating that the model's performance can vary across locations. Our approach provides valuable insights into the uncertainties of the forest regeneration process, enabling comparisons of modeled scenarios at different recovery stages for better decision-making in these critical mountain biomes. Funding: supported by the Brazilian National Council for Scientific and Technological Development (CNPq), the Federal University of Ouro Preto, and the Minas Gerais Research Foundation (FAPEMIG) under grant number APQ-06559-24.
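The DLG model couples Fickian diffusion with logistic growth, ∂u/∂t = D∇²u + r·u(1 − u/K). An explicit finite-difference step on a 2-D grid can be sketched as below, with illustrative parameters, periodic boundaries for simplicity, and a time step chosen to satisfy the usual stability bound D·Δt/Δx² ≤ 1/4; it illustrates the numerical scheme, not the paper's calibrated run.

```python
import numpy as np

def dlg_step(u, D, r, dx, dt, K=1.0):
    """One explicit finite-difference step of du/dt = D*lap(u) + r*u*(1 - u/K)."""
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u) / dx ** 2   # periodic BCs
    return u + dt * (D * lap + r * u * (1.0 - u / K))

# Illustrative run: 30 time units of recovery from sparse initial vegetation
rng = np.random.default_rng(0)
u = 0.05 * rng.random((100, 100))        # initial vegetation density in [0, 1]
D, r, dx, dt = 0.1, 0.3, 1.0, 0.1        # assumed rates; D*dt/dx^2 = 0.01 <= 1/4
for _ in range(300):                     # 300 steps * dt = 30 time units
    u = dlg_step(u, D, r, dx, dt)
print("mean vegetation density after 30 units:", u.mean().round(3))
```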
The Husimi function (Q-function) of a quantum state is the distribution function of the density operator in the coherent state representation. It is widely used in theoretical research, such as in quantum optics. The Wehrl entropy is the Shannon entropy of the Husimi function and is nonzero even for pure states. This entropy has been extensively studied in mathematical physics, and recent research also suggests a significant connection between the Wehrl entropy and many-body quantum entanglement in spin systems. We investigate the statistical interpretation of the Husimi function and the Wehrl entropy, taking a system of N spin-1/2 particles as an example. Owing to the completeness of coherent states, the Husimi function and Wehrl entropy can be explained via positive operator-valued measurement (POVM) theory, even though the coherent states do not form an orthonormal basis. Here, with the help of Bayes' theorem, we provide an alternative probabilistic interpretation of the Husimi function and the Wehrl entropy. This interpretation is based on direct measurements of the system and thus does not require introducing an ancillary system as in POVM theory. Moreover, under this interpretation the classical correspondences of the Husimi function and the Wehrl entropy are, respectively, the phase-space probability distribution function of N classical tops and its associated entropy. This explanation therefore contributes to a better understanding of the relationship between the Husimi function, the Wehrl entropy, and the classical-quantum correspondence. The generalization of this statistical interpretation to continuous-variable systems is also discussed. Funding: supported by the National Key Research and Development Program of China (Grant No. 2022YFA1405300) and the Innovation Program for Quantum Science and Technology (Grant No. 2023ZD0300700).
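For reference, the central objects have compact standard definitions: the Husimi function is the coherent-state diagonal of the density operator, and the Wehrl entropy is its differential Shannon entropy over phase space. Written for a single bosonic mode and, in the spin-coherent-state form relevant to N spin-1/2 particles (total spin J):

```latex
% Bosonic mode, coherent states |\alpha\rangle:
Q(\alpha) = \frac{1}{\pi}\,\langle\alpha|\hat{\rho}|\alpha\rangle,
\qquad
S_W = -\int Q(\alpha)\,\ln Q(\alpha)\,\mathrm{d}^2\alpha .
% Spin-J analogue, spin coherent states |\theta,\varphi\rangle
% (normalized so that \int Q\,\sin\theta\,\mathrm{d}\theta\,\mathrm{d}\varphi = 1):
Q(\theta,\varphi) = \frac{2J+1}{4\pi}\,
  \langle\theta,\varphi|\hat{\rho}|\theta,\varphi\rangle,
\qquad
S_W = -\int Q(\theta,\varphi)\,\ln Q(\theta,\varphi)\,
  \sin\theta\,\mathrm{d}\theta\,\mathrm{d}\varphi .
```

Because Q is a smoothed (coherent-state-averaged) distribution, S_W remains strictly positive even for pure states, which is the property highlighted above.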