Gastrointestinal cancers, including esophageal, gastric, colorectal, liver, gallbladder, cholangiocarcinoma, and pancreatic cancers, pose a significant global health challenge due to their high mortality rates and poor prognosis, particularly when diagnosed at advanced stages. These malignancies, characterized by diverse clinical presentations and etiologies, require innovative approaches for improved management. Bayesian networks (BN) have emerged as a powerful tool in this field, offering the ability to manage uncertainty, integrate heterogeneous data sources, and support clinical decision-making. This review explores the application of BN in addressing critical challenges in gastrointestinal cancers, including the identification of risk factors, early detection, treatment optimization, and prognosis prediction. By integrating genetic predispositions, lifestyle factors, and clinical data, BN hold the potential to enhance survival rates and improve quality of life through personalized treatment strategies. Despite their promise, the widespread adoption of BN is hindered by challenges such as data quality limitations, computational complexities, and the need for greater clinical acceptance. The review concludes with future research directions, emphasizing the development of advanced BN algorithms, the integration of multi-omics data, and strategies to ensure clinical applicability, aiming to fully realize the potential of BN in personalized medicine for gastrointestinal cancers.
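The review above surveys BN methodology rather than any single implementation. As a minimal sketch of the kind of inference a BN supports, the toy network below relates two risk-factor nodes to a cancer node and computes a marginal and a posterior by direct enumeration. The structure and every probability are invented purely for illustration; they are not taken from the review.

```python
# Toy Bayesian network: Smoking -> GastricCancer <- H.pylori.
# All probabilities below are hypothetical, chosen only to show
# the mechanics of marginalization and Bayes' rule.

# Prior probabilities of the two parent risk factors
p_smoking = 0.25
p_hpylori = 0.40

# Conditional probability table: P(cancer | smoking, hpylori)
cpt = {
    (True, True): 0.020,
    (True, False): 0.008,
    (False, True): 0.010,
    (False, False): 0.002,
}

def p_cancer():
    """Marginal P(cancer), summing over all parent configurations."""
    total = 0.0
    for s in (True, False):
        for h in (True, False):
            p_parents = (p_smoking if s else 1 - p_smoking) * \
                        (p_hpylori if h else 1 - p_hpylori)
            total += p_parents * cpt[(s, h)]
    return total

def p_smoking_given_cancer():
    """Posterior P(smoking | cancer) via Bayes' rule."""
    joint = sum((p_hpylori if h else 1 - p_hpylori) * cpt[(True, h)]
                for h in (True, False)) * p_smoking
    return joint / p_cancer()
```

Real clinical BN work uses dedicated libraries (e.g. pgmpy) with learned structures and CPTs; the enumeration above only illustrates why evidence on one node updates belief in its causes.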
Test case prioritization and ranking play a crucial role in software testing by improving fault detection efficiency and ensuring software reliability. While prioritization selects the most relevant test cases for optimal coverage, ranking further refines their execution order to detect critical faults earlier. This study investigates machine learning techniques to enhance both prioritization and ranking, contributing to more effective and efficient testing processes. We first employ advanced feature engineering alongside ensemble models, including Gradient Boosting, Support Vector Machine, Random Forest, and Naive Bayes classifiers, to optimize test case prioritization, achieving an accuracy score of 0.98847 and significantly improving the Average Percentage of Fault Detection (APFD). Subsequently, we introduce a deep Q-learning framework combined with a Genetic Algorithm (GA) to refine test case ranking within priority levels. This approach achieves a rank accuracy of 0.9172, demonstrating robust performance despite the increased computational demands of specialized variation operators. Our findings highlight the effectiveness of stacked ensemble learning and reinforcement learning in optimizing test case prioritization and ranking. This integrated approach improves testing efficiency, reduces late-stage defects, and enhances overall software stability. The study provides valuable insights for AI-driven testing frameworks, paving the way for more intelligent and adaptive software quality assurance methodologies.
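The APFD metric the abstract reports improving is a standard quantity with a closed form: for n tests and m faults, APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the 1-based position of the first test revealing fault i. A minimal sketch of its computation; the test identifiers and fault sets in the example are hypothetical:

```python
def apfd(ordering, fault_matrix):
    """Average Percentage of Fault Detection for a test ordering.

    ordering:     list of test-case ids, in execution order
    fault_matrix: dict mapping test id -> set of faults it detects
                  (assumes every fault is detected by some test)
    """
    n = len(ordering)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    # TF_i: 1-based position of the first test that reveals fault i
    first_pos = {}
    for pos, test in enumerate(ordering, start=1):
        for f in fault_matrix.get(test, set()):
            first_pos.setdefault(f, pos)
    return 1 - sum(first_pos[f] for f in faults) / (n * m) + 1 / (2 * n)
```

For the toy ordering ['t3', 't1', 't2'] where t3 detects {f1, f2}, t1 detects {f2}, and t2 detects {f3}, the metric is 1 − 5/9 + 1/6 = 11/18 ≈ 0.611; reorderings that expose faults earlier push the value toward 1.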
Accurate retrieval of atmospheric vertical profiles is critical for improving weather prediction and climate monitoring. However, the complexity of atmospheric processes in cloudy regions poses challenges compared with clear-sky scenarios. This study presents a novel framework that integrates Bayesian optimization and machine learning approaches to retrieve atmospheric vertical profiles, including temperature, humidity, ozone concentration, cloud fraction, ice water content (IWC), and liquid water content (LWC), from hyperspectral infrared observations. Specifically, a Bayesian method was used to refine ERA5 reanalysis data by minimizing brightness temperature (BT) discrepancies against FY-4B Geostationary Interferometric Infrared Sounder (GIIRS) observations, generating a high-quality profile database (~2.8 million profiles) across diverse weather systems. The optimized profiles improve radiative consistency, reducing BT biases from >40 K to <10 K in cloudy regions. To further overcome the limitations of the Bayesian method, we developed a Transformer-ResNet hybrid model (TERNet), which achieved superior performance with RMSE values of 1.61 K (temperature), 5.77% (humidity), and 2.25×10^(-6) / 6.09×10^(-6) kg kg^(-1) (IWC/LWC) across all vertical levels in all-sky conditions. TERNet outperforms both ERA5 in cloud parameter retrieval and the GIIRS L2 product in thermodynamic profiling. Independent verification with radiosonde and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) datasets confirms the framework's reliability across various meteorological regimes. This work demonstrates the capability of combining physics-informed Bayesian methods with data-driven machine learning to fully exploit hyperspectral IR data.
With the rapid development of artificial intelligence, the intelligence level of software is steadily improving. Intelligent software, which is widely applied in crucial fields such as autonomous driving, intelligent customer service, and medical diagnosis, is built on complex technologies like machine learning and deep learning. Its uncertain behavior and data dependence pose unprecedented challenges to software testing. However, existing software testing courses mainly focus on conventional content and are unable to meet the requirements of intelligent software testing. Therefore, this work deeply analyzes the relevant technologies of intelligent software testing, including a reliability evaluation indicator system, neuron coverage, and test case generation. It also systematically designs an intelligent software testing course, covering teaching objectives, teaching content, teaching methods, and a teaching case. Verified by practical teaching in four classes, the course has achieved remarkable results, providing practical experience for the reform of software testing courses.
The integrated nested Laplace approximation (INLA) algorithm provides a computationally efficient approach to approximate Bayesian inference, overcoming the limitations of traditional Markov chain Monte Carlo (MCMC) methods. This paper introduces the INLA method and algorithm, then provides a systematic review of six key books that explore its theoretical foundations, practical implementations, and diverse applications. These six books cover spatial and spatio-temporal modelling, general Bayesian inference, SPDE-based spatial analysis, geospatial health data, regression modelling, and dynamic time series. Together, they highlight the versatility of the INLA method in handling complex models while maintaining high computational efficiency.
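The Laplace approximation at the core of INLA can be illustrated in one dimension: replace an intractable integral with a Gaussian fitted at the posterior mode. The Beta-Bernoulli example below is not taken from the reviewed books; it is chosen because the exact answer is available via the log-gamma function, so the approximation error can be seen directly.

```python
import math

def laplace_log_evidence(a, b):
    """Laplace approximation to log of the integral over (0,1) of
    theta^a * (1-theta)^b (an unnormalized Beta posterior after
    a successes and b failures in Bernoulli trials)."""
    theta_hat = a / (a + b)                       # posterior mode
    log_f = a * math.log(theta_hat) + b * math.log(1 - theta_hat)
    # curvature: negative second derivative of the log-density at the mode
    h = a / theta_hat ** 2 + b / (1 - theta_hat) ** 2
    return log_f + 0.5 * math.log(2 * math.pi / h)

def exact_log_evidence(a, b):
    """Exact value: log Beta(a+1, b+1), via log-gamma."""
    return math.lgamma(a + 1) + math.lgamma(b + 1) - math.lgamma(a + b + 2)
```

With a = 30, b = 70 the two values agree to within about 0.007 in log space, which is the kind of accuracy that makes Laplace-based inference attractive when the posterior is well concentrated; INLA nests such approximations inside a latent Gaussian model rather than applying them to a single scalar integral.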
Objective To study drug test data protection systems in foreign countries, and to foster pharmaceutical innovation and increase drug accessibility in China. Methods The development history of drug test data protection was analyzed to examine and evaluate China's current drug test data protection system and to offer recommendations for its improvement, so that the system can ultimately be formally implemented in China. Results and Conclusion The drug test data protection system aims to promote innovation by protecting the trial data of innovative drugs. In a broad sense, this belongs to intellectual property protection, but it differs from patent protection. Although China established a drug test data protection system after joining the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), the relevant provisions and regulations have not yet been formally issued, and the system has not yet been implemented. Therefore, some suggestions for improving China's drug test data protection system are proposed to achieve good social benefits.
The reliable operation of power grid secondary equipment is an important guarantee for the safety and stability of the power system. However, various defects can develop in secondary equipment during long-term operation. The complex relationship between defect phenomena and multi-layer causes, together with their probabilistic influence, cannot be described through knowledge extraction and fusion technology in existing methods, which limits the timeliness and accuracy of defect identification. Therefore, a defect recognition method based on the fusion of a Bayesian network and a knowledge graph is proposed. Defect data of secondary equipment are transformed into a structured knowledge graph through knowledge extraction and fusion technology. The knowledge graph of power grid secondary equipment is mapped to a Bayesian network framework, combined with historical defect data, and Noisy-OR nodes are introduced. The prior and conditional probabilities of the Bayesian network are then reasonably assigned to build a model that reflects the probabilistic dependence between defect phenomena and potential causes in power grid secondary equipment. Defect identification is achieved by defect subgraph search based on the knowledge graph and defect inference based on the Bayesian network. Practical application cases prove the method's effectiveness in identifying the causes of secondary equipment defects, improving identification accuracy and efficiency.
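The Noisy-OR node mentioned above has a simple closed form: each active cause independently fails to trigger the defect with probability 1 − pᵢ, so the defect probability is one minus the product of those survival terms (times a leak term for unmodelled causes). A sketch of that computation; the cause names and probabilities below are hypothetical, not values from the paper's model:

```python
def noisy_or(link_probs, active, leak=0.0):
    """P(defect | active causes) under the Noisy-OR assumption.

    link_probs: dict cause -> probability that this cause alone
                produces the defect
    active:     set of causes currently present
    leak:       probability of the defect with no modelled cause active
    """
    p_no_defect = 1.0 - leak
    for cause in active:
        p_no_defect *= 1.0 - link_probs[cause]
    return 1.0 - p_no_defect

# Hypothetical example: two causes of a relay defect, both active.
links = {"insulation_aging": 0.6, "moisture_ingress": 0.3}
p = noisy_or(links, {"insulation_aging", "moisture_ingress"}, leak=0.05)
```

The practical appeal is parameter economy: a Noisy-OR node over k causes needs k link probabilities plus a leak instead of a full 2^k-row conditional probability table, which is why it is often used when CPTs must be assigned from sparse historical defect data.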
Recommendation systems have become indispensable for providing tailored suggestions and capturing evolving user preferences based on interaction histories. The collaborative filtering (CF) model, which depends exclusively on user-item interactions, commonly encounters challenges, including the cold-start problem and an inability to effectively capture the sequential and temporal characteristics of user behavior. This paper introduces a personalized recommendation system that combines deep learning techniques with Bayesian Personalized Ranking (BPR) optimization to address these limitations. We apply Long Short-Term Memory (LSTM) networks to identify sequential dependencies in user behavior and incorporate an attention mechanism to improve the prioritization of relevant items, thereby enhancing recommendations based on the user's hybrid feedback and interaction patterns. The proposed system is empirically evaluated on publicly available datasets from the movie and music domains against standard recommendation models, including Popularity, BPR, ItemKNN, FPMC, LightGCN, GRU4Rec, NARM, SASRec, and BERT4Rec. The results demonstrate that our framework consistently achieves strong results on HitRate, NDCG, MRR, and Precision at K=100, with scores of (0.6763, 0.1892, 0.0796, 0.0068) on MovieLens-100K, (0.6826, 0.1920, 0.0813, 0.0068) on MovieLens-1M, and (0.7937, 0.3701, 0.2756, 0.0078) on Last.fm. These correspond to an average improvement of around 15% across all metrics compared with existing sequence models, showing that our framework ranks and recommends items more accurately.
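BPR optimization, as used above, maximizes the posterior probability that observed (positive) items are ranked above unobserved (negative) ones, which reduces to a logistic loss on score margins plus L2 regularization. A minimal sketch of that loss, independent of whichever scoring model (LSTM, matrix factorization, etc.) produces the scores:

```python
import math

def bpr_loss(pos_scores, neg_scores, params=None, reg=0.0):
    """Bayesian Personalized Ranking loss over (user, pos, neg) triples:
    -sum ln sigma(s_pos - s_neg) + reg * ||params||^2."""
    loss = 0.0
    for sp, sn in zip(pos_scores, neg_scores):
        margin = sp - sn                    # predicted preference margin
        loss += -math.log(1.0 / (1.0 + math.exp(-margin)))
    if params:
        loss += reg * sum(w * w for w in params)
    return loss
```

A larger margin between a user's positive and sampled negative item drives the per-triple term toward zero, so gradient descent on this loss pushes the model toward correct pairwise orderings rather than toward reproducing absolute ratings.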
Members of the British Textile Machinery Association (BTMA) can look back on 2025 as a year marked by notable technological advances and continued progress in global trade, despite an uncertain and volatile market. “Our members have been very active over the past 12 months and this has resulted in new technologies for the production of technical fibres and fabrics, the introduction of AI and machine learning into process control systems and significant advances in materials testing,” says BTMA CEO Jason Kent. “There’s real excitement about what can be achieved in 2026 as we look ahead to upcoming exhibitions such as JEC Composites in Paris in March and Techtextil in Frankfurt in April.”
This paper investigates the reliability of marine internal combustion engines using an integrated approach that combines Fault Tree Analysis (FTA) and Bayesian Networks (BN). FTA provides a structured, top-down method for identifying critical failure modes and their root causes, while BN introduce flexibility in probabilistic reasoning, enabling dynamic updates based on new evidence. This dual methodology overcomes the limitations of static FTA models, offering a comprehensive framework for system reliability analysis. Critical failures, including External Leakage (ELU), Failure to Start (FTS), and Overheating (OHE), were identified as key risks. By incorporating redundancy into high-risk components such as pumps and batteries, the likelihood of these failures was significantly reduced. For instance, redundant pumps reduced the probability of ELU by 31.88%, while additional batteries decreased the occurrence of FTS by 36.45%. The results underscore the practical benefits of combining FTA and BN for enhancing system reliability, particularly in maritime applications where operational safety and efficiency are critical. This research provides valuable insights for maintenance planning and highlights the importance of redundancy in critical systems, especially as the industry transitions toward more autonomous vessels.
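The redundancy effect described above follows from basic fault-tree gate algebra: an OR gate over independent basic events, and an AND gate for redundant components (all must fail). The numbers below are toy values assuming independent failures; they are not the paper's ELU/FTS figures, which come from its full engine model.

```python
def or_gate(probs):
    """Top event occurs if any independent basic event occurs."""
    p_none = 1.0
    for q in probs:
        p_none *= 1.0 - q
    return 1.0 - p_none

def and_gate(probs):
    """Top event occurs only if all independent basic events occur."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical leaf: a single pump fails with probability 0.10.
# Adding a redundant pump turns the leaf into an AND gate over two pumps.
single = 0.10
redundant = and_gate([0.10, 0.10])          # both pumps must fail: 0.01
reduction = (single - redundant) / single   # 90% reduction in this toy case
```

Mapping such a tree onto a BN, as the paper does, keeps these gate probabilities as conditional probability tables while additionally allowing evidence (e.g. an observed overheating alarm) to update beliefs about root causes.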
With the deep integration of smart manufacturing and IoT technologies, higher demands are placed on the intelligence and real-time performance of industrial equipment fault detection. For industrial fans, base bolt loosening faults are difficult to identify through conventional spectrum analysis, and the extreme scarcity of fault data leads to limited training datasets, making traditional deep learning methods inaccurate in fault identification and incapable of detecting loosening severity. This paper employs Bayesian learning, training on a small fault dataset collected from the actual operation of axial-flow fans in a factory to obtain the posterior distribution. The method proposes specific data processing approaches and a Bayesian Convolutional Neural Network (BCNN) configuration, which effectively improve the model's generalization ability. Experimental results demonstrate high detection accuracy and alignment with real-world applications, offering practical significance and reference value for industrial fan bolt loosening detection under data-limited conditions.
Leveraging high-precision lattice QCD data on the equation of state and the baryon number susceptibility at vanishing chemical potential, we constructed a Bayesian holographic QCD model and systematically analyzed the thermodynamic properties of heavy quarkonium in QCD matter under varying temperatures and chemical potentials. We computed the quark-antiquark distance, potential energy, entropy, binding energy, and internal energy. We present detailed posterior distribution results for the thermodynamic quantities of heavy quarkonium, including maximum a posteriori (MAP) estimates and 95% confidence levels (CL). Through numerical simulations and theoretical analysis, we find that an increase in temperature and chemical potential reduces the quark distance, thereby facilitating the dissociation of heavy quarkonium and leading to a suppressed potential energy. The increase in temperature and chemical potential also raises the entropy and the entropy force, further accelerating the dissociation of heavy quarkonium. The calculated binding energy indicates that a higher temperature and chemical potential enhance the tendency of heavy quarkonium to dissociate into free quarks. The internal energy likewise increases with rising temperature and chemical potential. These findings provide significant theoretical insights into the properties of strongly interacting matter under extreme conditions and lay a solid foundation for the interpretation and validation of future experimental data. Finally, we also present results for the free energy, entropy, and internal energy of a single quark.
Inverse design of advanced materials represents a pivotal challenge in materials science. Leveraging the latent space of Variational Autoencoders (VAEs) for material optimization has emerged as a significant advancement in the field of material inverse design. However, VAEs are inherently prone to generating blurred images, posing challenges for precise inverse design and microstructure manufacturing. While increasing the dimensionality of the VAE latent space can mitigate reconstruction blurriness to some extent, it simultaneously imposes a substantial burden on target optimization due to an excessively large search space. To address these limitations, this study adopts a Variational Autoencoder guided Conditional Diffusion Generative Model (VAE-CDGM) framework integrated with Bayesian optimization to achieve the inverse design of composite materials with targeted mechanical properties. The VAE-CDGM model synergizes the strengths of VAEs and Denoising Diffusion Probabilistic Models (DDPM), enabling the generation of high-quality, sharp images while preserving a manipulable latent space. To accommodate varying dimensional requirements of the latent space, two optimization strategies are proposed. When the latent space dimensionality is excessively high, SHapley Additive exPlanations (SHAP) sensitivity analysis is employed to identify critical latent features for optimization within a reduced subspace. Conversely, direct optimization is performed in the low-dimensional latent space of VAE-CDGM when the dimensionality is modest. The results demonstrate that both strategies accurately achieve the targeted design of composite materials while circumventing the blurred reconstruction flaws of VAEs, offering a novel pathway for the precise design of advanced materials.
With the rapid development of Internet technology, REST APIs (Representational State Transfer Application Programming Interfaces) have become the primary communication standard in modern microservice architectures, raising increasing concerns about their security. Existing fuzz testing methods include random or dictionary-based input generation, which often fails to ensure both syntactic and semantic correctness, and OpenAPI-based approaches, which offer better accuracy but typically lack detailed descriptions of endpoints, parameters, or data formats. To address these issues, this paper proposes the APIDocX fuzz testing framework. It introduces a crawler tailored for dynamic web pages that automatically simulates user interactions to trigger APIs, capturing and extracting parameter information from communication packets. A multi-endpoint parameter adaptation method based on an improved Jaccard similarity is then used to generalize these parameters to other potential API endpoints, filling in gaps in OpenAPI specifications. Experimental results demonstrate that the extracted parameters can be generalized with 79.61% accuracy. Fuzz testing using the enriched OpenAPI documents leads to improvements in test coverage, the number of valid test cases generated, and fault detection capability. This approach offers an effective enhancement to automated REST API security testing.
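Plain Jaccard similarity, the baseline the framework improves upon, compares the parameter-name sets of two endpoints as |intersection| / |union|. A sketch with hypothetical endpoint parameters (the abstract does not specify the improved variant, so only the standard form is shown):

```python
def jaccard(a, b):
    """Jaccard similarity between two parameter-name sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0          # two empty sets: treat as identical
    return len(a & b) / len(a | b)

# Hypothetical scenario: parameters captured for one endpoint are
# matched against another endpoint's known parameters to decide
# whether the captured values should be generalized to it.
captured = {"user_id", "token", "page"}
candidate = {"user_id", "token", "limit"}
score = jaccard(captured, candidate)   # 2 shared / 4 total = 0.5
```

In a parameter-adaptation setting, a threshold on this score would decide whether captured parameter values are propagated into another endpoint's OpenAPI entry; the paper's improved variant presumably refines how the sets or weights are formed.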
Cloud services, favored by many enterprises due to their high flexibility and ease of operation, are widely used for data storage and processing. However, the high latency and transmission overheads of the cloud architecture make it difficult to respond quickly to the demands of IoT applications and local computation. To make up for these deficiencies, fog computing has emerged to play a critical role in IoT applications. It decentralizes computing power to lower-level nodes close to data sources, so as to achieve low latency and distributed processing. With data being frequently exchanged and shared between multiple nodes, it becomes a challenge to authorize data securely and efficiently while protecting user privacy. To address this challenge, proxy re-encryption (PRE) schemes provide a feasible way to allow an intermediary proxy node to re-encrypt ciphertext designated for different authorized data requesters without compromising any plaintext information. Since the proxy is viewed as a semi-trusted party, precautions should be taken to prevent malicious behavior and reduce the risk of data leakage when implementing PRE schemes. This paper proposes a new fog-assisted identity-based PRE scheme supporting anonymous key generation, equality test, and user revocation to fulfill various IoT application requirements. Specifically, in a traditional identity-based public key architecture, the key escrow problem and the necessity of a secure channel are major security concerns; we utilize an anonymous key generation technique to solve these problems. The equality test functionality further enables a cloud server to inspect whether two candidate trapdoors contain an identical keyword. In particular, the proposed scheme realizes fine-grained user-level authorization while maintaining strong key confidentiality. To revoke an invalid user identity, we add a revocation list to the system flows to restrict access privileges without additional computation cost. Our system is shown to meet the IND-PrID-CCA and OW-ID-CCA security notions under the Decisional Bilinear Diffie-Hellman (DBDH) assumption.
Lateral flow immunoassays (LFIAs) are low-cost, rapid, and easy to use for point-of-care testing (POCT), but the majority of available LFIA tests are indicative rather than quantitative, and their sensitivity in antigen tests is usually limited to the nanogram range, primarily due to the passive capillary fluidics through nitrocellulose membranes, often associated with non-specific binding and high background noise. To overcome this challenge, we report a Beads-on-a-Tip design that replaces nitrocellulose membranes with a pipette tip loaded with magnetic beads. The beads are pre-conjugated with capture antibodies that support a typical sandwich immunoassay. This design enriches low-abundance antigen proteins and allows an active washing process that significantly reduces non-specific binding. To further improve detection sensitivity, we employed upconversion nanoparticles (UCNPs) as luminescent reporters and the SARS-CoV-2 spike (S) antigen as a model analyte to benchmark the performance of this design against our previously reported methods. We found that the key to enhancing immunocomplex formation and the signal-to-noise ratio lies in optimizing the incubation time and the UCNP-to-bead ratio. We thereby demonstrated that the new method achieves a very large dynamic range, from 500 fg/mL to 10 μg/mL, spanning more than seven orders of magnitude, and a limit of detection of 706 fg/mL, nearly an order of magnitude lower than the best reported UCNP-based LFIA for COVID-19 spike antigen detection. Our system offers a promising solution for ultra-sensitive and quantitative POCT diagnostics.
The authors consider the issue of hypothesis testing in varying-coefficient regression models with high-dimensional data. Utilizing kernel smoothing techniques, the authors propose a locally concerned U-statistic method to assess the overall significance of the coefficients. The authors establish that the proposed test is asymptotically normal under both the null hypothesis and local alternatives. Based on the locally concerned U-statistic, the authors further develop a globally concerned U-statistic to test whether the coefficient function is zero. A stochastic perturbation method is employed to approximate the distribution of the globally concerned test statistic. Monte Carlo simulations demonstrate the validity of the proposed test in finite samples.
To address the insufficient prediction accuracy of multi-state parameters in electro-hydraulic servo material fatigue testing machines under complex loading and nonlinear coupling conditions, this paper proposes a multivariate sequence-to-sequence prediction model integrating a Long Short-Term Memory (LSTM) encoder, a Gated Recurrent Unit (GRU) decoder, and a multi-head attention mechanism. The approach enhances prediction accuracy and robustness across different control modes and load spectra by leveraging multi-channel inputs and cross-variable feature interactions, thereby capturing both short-term high-frequency dynamics and long-term slow-drift characteristics. Experiments using long-term data from real test benches demonstrate that the model achieves a stable MSE below 0.01 on the validation set, with MAE and RMSE of approximately 0.018 and 0.052, respectively, and a coefficient of determination reaching 0.98, significantly outperforming traditional identification methods and single RNN models. Sensitivity analysis indicates that a prediction stride of 10 achieves an optimal balance between accuracy and computational overhead. Ablation experiments validate the contribution of the multi-head attention and decoder architecture to cross-variable coupling modeling. The model can be applied to residual-driven early warning in health monitoring and to risk assessment with scheme optimization in test design, and is feasible for near-real-time deployment, providing a practical data-driven technical pathway for reliability assurance in advanced equipment.
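The MAE, RMSE, and coefficient-of-determination figures quoted above are standard regression metrics. For reference, a self-contained computation of all three (the toy series below is illustrative, not the paper's test-bench data):

```python
import math

def regression_metrics(y_true, y_pred):
    """Return (MAE, RMSE, R^2) for one prediction run."""
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)  # total variance * n
    r2 = 1.0 - (mse * n) / ss_tot                    # 1 - SS_res / SS_tot
    return mae, math.sqrt(mse), r2

mae, rmse, r2 = regression_metrics([1.0, 2.0, 3.0, 4.0],
                                   [2.0, 2.0, 3.0, 4.0])
```

Note that R² compares residual error against the variance of the target series, so a model tracking a strongly varying load spectrum can report a high R² even when absolute errors are non-trivial; reading MAE/RMSE alongside R², as the abstract does, guards against that.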
Patients affected by monogenic diseases impose a substantial burden on both themselves and their families. The primary preventive measure, i.e., invasive prenatal diagnosis, carries a risk of miscarriage and cannot be performed early in pregnancy. Hence, there is a need for non-invasive prenatal testing (NIPT) for monogenic diseases. By utilizing enriched cell-free fetal DNA (cffDNA) from maternal plasma, we refine the NIPT method, combining targeted region capture technology, haplotyping, and analysis of informative site frequency. We apply this method to 93 clinical families at genetic risk for thalassemia, encompassing various genetic variant types, to establish a workflow and evaluate its efficiency. Our approach requires only 3 ng of DNA input to generate 0.1 Gb of informative target genomic data and leverages a minimum of 3% cffDNA. The method has a 98.16% success rate and 100% concordance with conventional invasive methods. Furthermore, we demonstrate the ability to analyze fetal genotypes as early as eight weeks of gestation. This study establishes an optimized NIPT method for the early detection of various thalassemia disorders during pregnancy, demonstrating high accuracy and potential for clinical application in prenatal diagnosis.
Funding: Supported by Open Funds for the Shaanxi Provincial Key Laboratory of Infection and Immune Diseases, No. 2023-KFMS-1.
Funding: Supported by the National Natural Science Foundation of China under Grant U2442219; the Fengyun Satellite Application Pioneer Program (2023), Special Initiative on Numerical Weather Prediction (NWP) Applications; the Civil Aerospace Technology Pre-Research Project (D040405); and the Joint Funds of the Zhejiang Provincial Natural Science Foundation of China under Grant No. LZJMZ23D050003.
Abstract: Accurate retrieval of atmospheric vertical profiles is critical for improving weather prediction and climate monitoring. However, the complexity of atmospheric processes in cloudy regions poses challenges compared with clear-sky scenarios. This study presents a novel framework that integrates Bayesian optimization and machine learning approaches to retrieve atmospheric vertical profiles, including temperature, humidity, ozone concentration, cloud fraction, ice water content (IWC), and liquid water content (LWC), from hyperspectral infrared observations. Specifically, a Bayesian method was used to refine ERA5 reanalysis data by minimizing brightness temperature (BT) discrepancies against FY-4B Geostationary Interferometric Infrared Sounder (GIIRS) observations, generating a high-quality profile database (~2.8 million profiles) across diverse weather systems. The optimized profiles improve radiative consistency, reducing BT biases from >40 K to <10 K in cloudy regions. To further overcome the limitations of the Bayesian method, we developed a Transformer-ResNet hybrid model (TERNet), which achieved superior performance with RMSE values of 1.61 K (temperature), 5.77% (humidity), and 2.25×10⁻⁶/6.09×10⁻⁶ kg kg⁻¹ (IWC/LWC) across all vertical levels under all-sky conditions. TERNet outperforms both ERA5 in cloud parameter retrieval and the GIIRS L2 product in thermodynamic profiling. Independent verification with radiosonde and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) datasets confirms the framework's reliability across various meteorological regimes. This work demonstrates the capability of combining physics-informed Bayesian methods with data-driven machine learning to fully exploit hyperspectral IR data.
Funding: Computer Basic Education Teaching Research Project of the Association of Fundamental Computing Education in Chinese Universities (Nos. 2025-AFCEC-527 and 2024-AFCEC-088); Research on the Reform of Public Course Teaching at Nantong College of Science and Technology (No. 2024JGG015).
Abstract: With the rapid development of artificial intelligence, the intelligence level of software is steadily improving. Intelligent software, widely applied in crucial fields such as autonomous driving, intelligent customer service, and medical diagnosis, is built on complex technologies like machine learning and deep learning. Its uncertain behavior and data dependence pose unprecedented challenges to software testing. However, existing software testing courses mainly focus on conventional content and cannot meet the requirements of intelligent software testing. Therefore, this work analyzes the key technologies of intelligent software testing, including reliability evaluation indicator systems, neuron coverage, and test case generation. It also systematically designs an intelligent software testing course, covering teaching objectives, teaching content, teaching methods, and a teaching case. Validated through practical teaching in four classes, the course has achieved remarkable results, providing practical experience for the reform of software testing courses.
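Neuron coverage, one of the course topics above, measures the fraction of a network's neurons driven above an activation threshold by at least one test input. A minimal sketch of the basic metric (activation values are made up for illustration):

```python
def neuron_coverage(activations, threshold=0.5):
    """Basic neuron-coverage metric for a test suite.
    activations: list of per-input activation vectors (one float per neuron).
    A neuron counts as covered if it exceeds `threshold` on ANY input."""
    n_neurons = len(activations[0])
    covered = set()
    for vec in activations:
        for i, a in enumerate(vec):
            if a > threshold:
                covered.add(i)
    return len(covered) / n_neurons

acts = [
    [0.9, 0.1, 0.0, 0.7],   # input 1 activates neurons 0 and 3
    [0.2, 0.6, 0.0, 0.1],   # input 2 activates neuron 1
]
cov = neuron_coverage(acts)  # neuron 2 never fires above 0.5
```

Test generation for intelligent software then searches for inputs that raise this number, analogous to branch coverage in conventional testing.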
Funding: Supported by the National Natural Science Foundation of China [grant number 12001266]; the Humanities and Social Science Projects of the Ministry of Education of China [grant number 19YJCZH166]; and the National Natural Science Foundation of China [grant numbers 12271168 and 12531013].
Abstract: The integrated nested Laplace approximation (INLA) algorithm provides a computationally efficient approach to approximate Bayesian inference, overcoming the limitations of traditional Markov chain Monte Carlo (MCMC) methods. This paper introduces the INLA method and algorithm, then provides a systematic review of six key books that explore its theoretical foundations, practical implementations, and diverse applications. These books cover spatial and spatio-temporal modelling, general Bayesian inference, SPDE-based spatial analysis, geospatial health data, regression modelling, and dynamic time series, and together they highlight the versatility of the INLA method in handling complex models while maintaining high computational efficiency.
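INLA is built on nested Laplace approximations of posterior integrals. As a one-dimensional illustration of the underlying idea (not the INLA algorithm itself): the Laplace approximation replaces ∫exp(g(θ))dθ with exp(g(θ̂))·√(2π/−g″(θ̂)) at the mode θ̂, which is exact for a Gaussian log-density:

```python
import math

def laplace_integral(g, g2, mode):
    """Laplace approximation of ∫ exp(g(θ)) dθ around the mode:
    exp(g(θ̂)) * sqrt(2π / -g''(θ̂)).  g2 is the second derivative of g."""
    return math.exp(g(mode)) * math.sqrt(2 * math.pi / -g2(mode))

# For a Gaussian log-density g(θ) = -θ²/(2σ²) the approximation is exact:
sigma = 2.0
approx = laplace_integral(lambda t: -t * t / (2 * sigma**2),
                          lambda t: -1 / sigma**2,
                          mode=0.0)
exact = math.sqrt(2 * math.pi) * sigma
```

For non-Gaussian but unimodal posteriors the approximation remains accurate, which is what gives INLA its speed advantage over MCMC.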
Abstract: Objective To study drug test data protection systems in foreign countries, in order to foster pharmaceutical innovation and increase drug accessibility in China. Methods The development history of drug test data protection was analyzed to examine and evaluate China's current drug test data protection system and to offer recommendations for its improvement and eventual formal implementation. Results and Conclusion The drug test data protection system aims to promote innovation by protecting the trial data of innovative drugs. In a broad sense, this belongs to intellectual property protection, but it differs from patent protection. Although China established a drug test data protection system after joining the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), the relevant provisions and regulations have not yet been formally enacted, and the system has not yet been implemented. Therefore, suggestions for improving China's drug test data protection system are proposed to achieve good social benefits.
Funding: Supported by the State Grid Southwest Branch Project "Research on Defect Diagnosis and Early Warning Technology of Relay Protection and Safety Automation Devices Based on Multi-Source Heterogeneous Defect Data".
Abstract: The reliable operation of power grid secondary equipment is an important guarantee for the safety and stability of the power system. However, various defects can arise in secondary equipment during long-term operation. Existing methods based on knowledge extraction and fusion cannot describe the complex relationships between defect phenomena and their multi-layer causes, nor the probabilistic influences among them, which limits the timeliness and accuracy of defect identification. Therefore, a defect recognition method based on the fusion of Bayesian networks and knowledge graphs is proposed. Defect data of secondary equipment are transformed into a structured knowledge graph through knowledge extraction and fusion. The knowledge graph is then mapped onto a Bayesian network framework, combined with historical defect data, and Noisy-OR nodes are introduced. The prior and conditional probabilities of the Bayesian network are assigned to build a model that reflects the probabilistic dependence between defect phenomena and potential causes in power grid secondary equipment. Defect identification is achieved by defect subgraph search over the knowledge graph and defect inference on the Bayesian network. Practical application cases prove the method's effectiveness in identifying the causes of secondary equipment defects, improving identification accuracy and efficiency.
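The Noisy-OR node mentioned above keeps the conditional probability table tractable: each active cause independently triggers the effect with its own probability, plus an optional leak term for unmodelled causes. A minimal sketch with illustrative probabilities:

```python
def noisy_or(active_probs, leak=0.0):
    """P(effect | active causes) under the Noisy-OR model:
    each active cause i independently triggers the effect with probability p_i;
    `leak` covers unmodelled causes."""
    p_no_effect = 1 - leak
    for p in active_probs:
        p_no_effect *= (1 - p)
    return 1 - p_no_effect

# Hypothetical defect with two potential causes currently present
p_defect = noisy_or([0.8, 0.5], leak=0.1)
```

With cause strengths 0.8 and 0.5 and a 0.1 leak, the defect probability is 1 − 0.9·0.2·0.5 = 0.91; the node needs only one parameter per parent instead of an exponentially large table.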
Funding: Funded by Soonchunhyang University, Grant Number 20250029.
Abstract: Recommendation systems have become indispensable for providing tailored suggestions and capturing evolving user preferences from interaction histories. The collaborative filtering (CF) model, which depends exclusively on user-item interactions, commonly encounters challenges including the cold-start problem and an inability to capture the sequential and temporal characteristics of user behavior. This paper introduces a personalized recommendation system that combines deep learning techniques with Bayesian Personalized Ranking (BPR) optimization to address these limitations. We use Long Short-Term Memory (LSTM) networks to identify sequential dependencies in user behavior and incorporate an attention mechanism to improve the prioritization of relevant items, thereby enhancing recommendations based on the user's hybrid feedback and interaction patterns. The proposed system is empirically evaluated on publicly available movie and music datasets against standard recommendation models, including Popularity, BPR, ItemKNN, FPMC, LightGCN, GRU4Rec, NARM, SASRec, and BERT4Rec. Our framework consistently achieves high scores in HitRate, NDCG, MRR, and Precision at K=100: (0.6763, 0.1892, 0.0796, 0.0068) on MovieLens-100K, (0.6826, 0.1920, 0.0813, 0.0068) on MovieLens-1M, and (0.7937, 0.3701, 0.2756, 0.0078) on Last.fm. These results represent an average improvement of around 15% across all metrics over existing sequence models, demonstrating that our framework ranks and recommends items more accurately.
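BPR optimizes a pairwise objective: for each user, a positively interacted item should score above a sampled negative item, via the loss −ln σ(s_pos − s_neg). A minimal sketch of the loss itself (scores are hypothetical, not from the paper's model):

```python
import math

def bpr_loss(pos_scores, neg_scores, reg=0.0, params=()):
    """Bayesian Personalized Ranking loss: -mean ln σ(s_pos - s_neg),
    with optional L2 regularization over model parameters."""
    sigmoid = lambda x: 1 / (1 + math.exp(-x))
    loss = -sum(math.log(sigmoid(p - n))
                for p, n in zip(pos_scores, neg_scores)) / len(pos_scores)
    return loss + reg * sum(w * w for w in params)

# Scores for (user, positive item) vs (user, sampled negative item) pairs
loss_good = bpr_loss([3.0, 2.5], [0.5, 0.1])   # positives ranked well above negatives
loss_bad  = bpr_loss([0.5, 0.1], [3.0, 2.5])   # ranking inverted
```

Minimizing this loss pushes positive items above negatives in the ranking, which is why BPR pairs naturally with ranking metrics such as NDCG and MRR.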
Abstract: Members of the British Textile Machinery Association (BTMA) can look back on 2025 as a year marked by notable technological advances and continued progress in global trade, despite an uncertain and volatile market. "Our members have been very active over the past 12 months, and this has resulted in new technologies for the production of technical fibres and fabrics, the introduction of AI and machine learning into process control systems, and significant advances in materials testing," says BTMA CEO Jason Kent. "There's real excitement about what can be achieved in 2026 as we look ahead to upcoming exhibitions such as JEC Composites in Paris in March and Techtextil in Frankfurt in April."
Funding: Supported by Istanbul Technical University (Project No. 45698) and through the "Young Researchers' Career Development Project-training of doctoral students" of the Croatian Science Foundation.
Abstract: This paper investigates the reliability of marine internal combustion engines using an integrated approach that combines Fault Tree Analysis (FTA) and Bayesian Networks (BN). FTA provides a structured, top-down method for identifying critical failure modes and their root causes, while BN introduce flexibility in probabilistic reasoning, enabling dynamic updates based on new evidence. This dual methodology overcomes the limitations of static FTA models, offering a comprehensive framework for system reliability analysis. Critical failures, including External Leakage (ELU), Failure to Start (FTS), and Overheating (OHE), were identified as key risks. By incorporating redundancy into high-risk components such as pumps and batteries, the likelihood of these failures was significantly reduced. For instance, redundant pumps reduced the probability of ELU by 31.88%, while additional batteries decreased the occurrence of FTS by 36.45%. The results underscore the practical benefits of combining FTA and BN for enhancing system reliability, particularly in maritime applications where operational safety and efficiency are critical. This research provides valuable insights for maintenance planning and highlights the importance of redundancy in critical systems, especially as the industry transitions toward more autonomous vessels.
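The quantitative side of FTA reduces to evaluating AND/OR gates over independent basic events, and redundancy turns a single failure into an AND gate. A sketch with hypothetical basic-event probabilities (not the paper's figures):

```python
def and_gate(probs):
    """All inputs must fail (independent events): product of probabilities."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    """Any input failing triggers the event: 1 - ∏(1 - p_i)."""
    out = 1.0
    for p in probs:
        out *= (1 - p)
    return 1 - out

# Hypothetical pump failure probability
p_pump = 0.05
single = p_pump                          # one pump: top event = pump failure
redundant = and_gate([p_pump, p_pump])   # two pumps: both must fail
reduction = 1 - redundant / single       # relative risk reduction from redundancy
```

With these illustrative numbers, duplicating the pump cuts the failure probability from 0.05 to 0.0025, a 95% reduction; the BN layer then allows these probabilities to be updated as operational evidence arrives.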
Funding: Funded by the Zhejiang Provincial Key Science and Technology "LingYan" Project Foundation, grant number 2023C01145, and the Zhejiang Gongshang University Higher Education Research Projects, grant number Xgy22028.
Abstract: With the deep integration of smart manufacturing and IoT technologies, higher demands are placed on the intelligence and real-time performance of industrial equipment fault detection. For industrial fans, base bolt loosening faults are difficult to identify through conventional spectrum analysis, and the extreme scarcity of fault data leads to limited training datasets, making traditional deep learning methods inaccurate in fault identification and incapable of assessing loosening severity. This paper employs Bayesian learning, training on a small fault dataset collected from the actual operation of axial-flow fans in a factory to obtain posterior distributions over model parameters. We propose dedicated data processing approaches and a Bayesian Convolutional Neural Network (BCNN) configuration that effectively improve the model's generalization ability. Experimental results demonstrate high detection accuracy and alignment with real-world applications, offering practical significance and reference value for industrial fan bolt loosening detection under data-limited conditions.
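The appeal of Bayesian learning under scarce fault data can be seen in its simplest form, a conjugate Beta-Bernoulli update, where a handful of observations yields a full posterior with quantified uncertainty rather than a point estimate (counts below are hypothetical, and this is not the paper's BCNN):

```python
def beta_posterior(alpha, beta, faults, normals):
    """Conjugate update: Beta(α, β) prior on a fault probability,
    after observing `faults` fault events and `normals` normal events.
    Returns posterior mean and variance."""
    a, b = alpha + faults, beta + normals
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Scarce data: 2 loosening events in 20 inspections, weak uniform prior Beta(1, 1)
mean, var = beta_posterior(1, 1, faults=2, normals=18)
```

The posterior mean is 3/22 ≈ 0.136 with a nonzero variance that shrinks as data accumulate; a BCNN extends the same idea to posteriors over network weights.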
Funding: Supported in part by the National Key Research and Development Program of China (No. 2022YFA1604900); the National Natural Science Foundation of China (NSFC) (Nos. 12405154, 12235016, 12221005, 12435009, 12275104, 92570117); the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB34030000); the Fundamental Research Funds for the Central Universities; the Open Fund for Key Laboratories of the Ministry of Education (No. QLPL2024P01); the CUHK-Shenzhen University Development Fund (Nos. UDF01003041 and UDF03003041); the Shenzhen Peacock Fund (No. 2023TC0007); the Ministry of Science and Technology of China (No. 2024YFA1611004); and the European Union-Next Generation EU through the research project (No. P2022Z4P4B) "SOPHYA-Sustainable Optimized PHYsics Algorithms: fundamental physics to build an advanced society" under the program PRIN 2022 PNRR of the Italian Ministero dell'Università e Ricerca (MUR).
Abstract: Leveraging high-precision lattice QCD data on the equation of state and baryon number susceptibility at vanishing chemical potential, we constructed a Bayesian holographic QCD model and systematically analyzed the thermodynamic properties of heavy quarkonium in QCD matter under varying temperatures and chemical potentials. We computed the quark-antiquark separation distance, potential energy, entropy, binding energy, and internal energy. We present detailed posterior distribution results for the thermodynamic quantities of heavy quarkonium, including maximum a posteriori (MAP) estimates and 95% confidence levels (CL). Through numerical simulations and theoretical analysis, we find that an increase in the temperature and chemical potential reduces the quark separation, thereby facilitating the dissociation of heavy quarkonium and leading to a suppressed potential energy. The increase in temperature and chemical potential also raises the entropy and entropy force, further accelerating the dissociation of heavy quarkonium. The calculated binding energy indicates that higher temperature and chemical potential enhance the tendency of heavy quarkonium to dissociate into free quarks. The internal energy likewise increases with rising temperature and chemical potential. These findings provide significant theoretical insights into the properties of strongly interacting matter under extreme conditions and lay a solid foundation for the interpretation and validation of future experimental data. Finally, we also present the free energy, entropy, and internal energy of a single quark.
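The reported MAP estimates and 95% CLs are summaries of posterior samples. As an illustration of the interval part only, here is a minimal equal-tailed credible interval from posterior draws (the samples below are synthetic placeholders, not the paper's posteriors):

```python
def credible_interval(samples, level=0.95):
    """Equal-tailed credible interval from posterior samples."""
    s = sorted(samples)
    n = len(s)
    lo = s[round((1 - level) / 2 * (n - 1))]
    hi = s[round((1 + level) / 2 * (n - 1))]
    return lo, hi

# Synthetic posterior draws of some thermodynamic quantity, uniform on [0, 1]
samples = [i / 1000 for i in range(1001)]
lo, hi = credible_interval(samples)
```

For the uniform draws this returns (0.025, 0.975), i.e., the central 95% of the posterior mass; the MAP would additionally require a density estimate of the sampled posterior.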
Abstract: Inverse design of advanced materials represents a pivotal challenge in materials science. Leveraging the latent space of Variational Autoencoders (VAEs) for material optimization has emerged as a significant advance in material inverse design. However, VAEs are inherently prone to generating blurred images, posing challenges for precise inverse design and microstructure manufacturing. While increasing the dimensionality of the VAE latent space can mitigate reconstruction blurriness to some extent, it simultaneously imposes a substantial burden on target optimization due to an excessively large search space. To address these limitations, this study adopts a Variational Autoencoder guided Conditional Diffusion Generative Model (VAE-CDGM) framework integrated with Bayesian optimization to achieve the inverse design of composite materials with targeted mechanical properties. The VAE-CDGM model combines the strengths of VAEs and Denoising Diffusion Probabilistic Models (DDPM), enabling the generation of high-quality, sharp images while preserving a manipulable latent space. To accommodate varying dimensional requirements of the latent space, two optimization strategies are proposed. When the latent space dimensionality is excessively high, SHapley Additive exPlanations (SHAP) sensitivity analysis is employed to identify critical latent features for optimization within a reduced subspace. Conversely, direct optimization is performed in the low-dimensional latent space of VAE-CDGM when the dimensionality is modest. The results demonstrate that both strategies accurately achieve the targeted design of composite materials while circumventing the blurred-reconstruction flaw of VAEs, offering a novel pathway for the precise design of advanced materials.
Funding: Supported by the Open Foundation of the Key Laboratory of Cyberspace Security, Ministry of Education of China (KLCS20240211).
Abstract: With the rapid development of Internet technology, REST APIs (Representational State Transfer Application Programming Interfaces) have become the primary communication standard in modern microservice architectures, raising increasing concerns about their security. Existing fuzz testing methods include random or dictionary-based input generation, which often fails to ensure both syntactic and semantic correctness, and OpenAPI-based approaches, which offer better accuracy but typically lack detailed descriptions of endpoints, parameters, or data formats. To address these issues, this paper proposes the APIDocX fuzz testing framework. It introduces a crawler tailored for dynamic web pages that automatically simulates user interactions to trigger APIs, capturing and extracting parameter information from communication packets. A multi-endpoint parameter adaptation method based on improved Jaccard similarity is then used to generalize these parameters to other potential API endpoints, filling in gaps in OpenAPI specifications. Experimental results demonstrate that the extracted parameters can be generalized with 79.61% accuracy. Fuzz testing using the enriched OpenAPI documents leads to improvements in test coverage, the number of valid test cases generated, and fault detection capabilities. This approach offers an effective enhancement to automated REST API security testing.
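The adaptation step compares parameter sets between endpoints. The paper uses an improved variant of Jaccard similarity; as a baseline, the plain metric |A∩B|/|A∪B| over parameter names looks like this (the parameter names and the 0.5 threshold below are hypothetical):

```python
def jaccard(a, b):
    """Plain Jaccard similarity |A ∩ B| / |A ∪ B| between parameter-name sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Hypothetical parameter sets captured from two REST endpoints
seen = {"user_id", "token", "page", "limit"}
candidate = {"user_id", "token", "offset", "limit"}
sim = jaccard(seen, candidate)
propagate = sim >= 0.5   # illustrative adaptation threshold
```

Here the sets share three of five distinct names (similarity 0.6), so the captured parameter values would be propagated to the candidate endpoint to enrich its OpenAPI entry.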
Funding: Supported in part by the National Science and Technology Council of Taiwan under contract numbers NSTC 114-2221-E-019-055-MY2 and NSTC 114-2221-E-019-069.
Abstract: Cloud services, favored by many enterprises for their high flexibility and ease of operation, are widely used for data storage and processing. However, the high latency and transmission overheads of the cloud architecture make it difficult to respond quickly to the demands of IoT applications and local computation. To make up for these deficiencies, fog computing has emerged as a critical component of IoT applications. It decentralizes computing power to lower-level nodes close to data sources, achieving low latency and distributed processing. With data being frequently exchanged and shared between multiple nodes, it becomes a challenge to authorize data securely and efficiently while protecting user privacy. To address this challenge, proxy re-encryption (PRE) schemes provide a feasible way to allow an intermediary proxy node to re-encrypt ciphertext for different authorized data requesters without exposing any plaintext information. Since the proxy is viewed as a semi-trusted party, precautions should be taken to prevent malicious behavior and reduce the risk of data leakage when implementing PRE schemes. This paper proposes a new fog-assisted identity-based PRE scheme supporting anonymous key generation, equality testing, and user revocation to fulfill various IoT application requirements. Specifically, in a traditional identity-based public key architecture, the key escrow problem and the necessity of a secure channel are major security concerns; we utilize an anonymous key generation technique to solve these problems. The equality test functionality further enables a cloud server to inspect whether two candidate trapdoors contain an identical keyword. In particular, the proposed scheme realizes fine-grained user-level authorization while maintaining strong key confidentiality. To revoke an invalid user identity, we add a revocation list to the system flows to restrict access privileges without additional computation cost. To ensure security, we show that our system meets the security notions of IND-PrID-CCA and OW-ID-CCA under the Decisional Bilinear Diffie-Hellman (DBDH) assumption.
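To see what "re-encrypt without exposing plaintext" means, here is a toy BBS98-style proxy re-encryption over Z_p* with deliberately tiny parameters. This is NOT the paper's scheme, which is identity-based and pairing-based with equality test and revocation; it only illustrates the re-key idea rk = sk_b/sk_a:

```python
# Toy BBS98-style proxy re-encryption (illustrative parameters only).
p = 467                 # small prime; group is Z_p*
g = 2
q = p - 1               # exponent arithmetic is mod p-1 (Fermat's little theorem)

a, b = 157, 89          # Alice's and Bob's secret keys, both coprime to p-1
pk_a = pow(g, a, p)

def encrypt(m, pk, r=123):
    """Enc(m) = (m·g^r, pk^r); r is fixed here only for reproducibility."""
    return (m * pow(g, r, p)) % p, pow(pk, r, p)

def re_key(sk_from, sk_to):
    """rk = sk_to / sk_from mod (p-1): lets the proxy convert ciphertexts."""
    return (sk_to * pow(sk_from, -1, q)) % q

def re_encrypt(ct, rk):
    c1, c2 = ct
    return c1, pow(c2, rk, p)           # (g^{ar})^{b/a} = g^{br}

def decrypt(ct, sk):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)     # recover g^r
    return (c1 * pow(gr, -1, p)) % p

m = 42
ct_a = encrypt(m, pk_a)                 # encrypted to Alice
ct_b = re_encrypt(ct_a, re_key(a, b))   # proxy converts it for Bob
```

The proxy holds only rk, never a secret key or the plaintext, yet after re-encryption Bob can decrypt; real fog deployments use pairing-based constructions over much larger groups.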
Funding: Financially supported by ARC Linkage Project (LP210200642); the ARC Centre of Excellence for Quantum Biotechnology (grant no. CE230100021); a National Health and Medical Research Council Investigator Fellowship (grant no. APP2017499); and the Chan Zuckerberg Initiative Deep Tissue Imaging Phase 2 (grant no. DT12-0000000182).
Abstract: Lateral flow immunoassays (LFIAs) are low-cost, rapid, and easy to use for point-of-care testing (POCT), but the majority of available LFIA tests are indicative rather than quantitative, and their sensitivity in antigen tests is usually limited to the nanogram range, primarily due to the passive capillary fluidics through nitrocellulose membranes, often associated with non-specific binding and high background noise. To overcome this challenge, we report a Beads-on-a-Tip design that replaces nitrocellulose membranes with a pipette tip loaded with magnetic beads. The beads are pre-conjugated with capture antibodies that support a typical sandwich immunoassay. This design enriches low-abundance antigen proteins and allows an active washing process to significantly reduce non-specific binding. To further improve detection sensitivity, we employed upconversion nanoparticles (UCNPs) as luminescent reporters and the SARS-CoV-2 spike (S) antigen as a model analyte to benchmark the performance of this design against our previously reported methods. We found that the key to enhancing immunocomplex formation and signal-to-noise ratio lay in optimizing incubation time and the UCNP-to-bead ratio. We successfully demonstrated that the new method achieves a very large dynamic range from 500 fg/mL to 10 μg/mL, spanning more than seven orders of magnitude, and a limit of detection of 706 fg/mL, nearly another order of magnitude lower than the best reported LFIA using UCNPs for COVID-19 spike antigen detection. Our system offers a promising solution for ultra-sensitive and quantitative POCT diagnostics.
Funding: Supported by the National Social Science Foundation of China under Grant No. 23&ZD126; the National Science Foundation of China under Grant No. 12471256; the Natural Science Foundation of Shanxi Province under Grant No. 202203021221219; and the Scientific and Technological Innovation Programs of Higher Education Institutions in Shanxi under Grant No. 2023L164.
Abstract: The authors consider hypothesis testing in varying-coefficient regression models with high-dimensional data. Utilizing kernel smoothing techniques, they propose a locally concerned U-statistic method to assess the overall significance of the coefficients, and establish that the proposed test is asymptotically normal under both the null hypothesis and local alternatives. Based on the locally concerned U-statistic, they further develop a globally concerned U-statistic to test whether the coefficient function is zero. A stochastic perturbation method is employed to approximate the distribution of the globally concerned test statistic. Monte Carlo simulations demonstrate the validity of the proposed test in finite samples.
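For readers unfamiliar with the building block: a degree-2 U-statistic averages a symmetric kernel h over all unordered pairs of observations. A minimal sketch (this generic form, not the authors' locally or globally concerned statistics):

```python
from itertools import combinations

def u_statistic(data, h):
    """Degree-2 U-statistic: average of the symmetric kernel h
    over all unordered pairs of observations."""
    pairs = list(combinations(data, 2))
    return sum(h(x, y) for x, y in pairs) / len(pairs)

# With kernel h(x, y) = (x - y)²/2, the U-statistic is the unbiased sample variance
x = [1.0, 2.0, 3.0, 4.0]
u_var = u_statistic(x, lambda a, b: (a - b) ** 2 / 2)
```

For x = [1, 2, 3, 4] this gives 5/3, matching the unbiased sample variance; the paper's statistics replace this simple kernel with kernel-smoothed quantities tailored to varying coefficients.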
Funding: Supported by the Natural Science Foundation of China (NSFC), Grant number 5247052693.
Abstract: To address the insufficient prediction accuracy of multi-state parameters in electro-hydraulic servo material fatigue testing machines under complex loading and nonlinear coupling conditions, this paper proposes a multivariate sequence-to-sequence prediction model integrating a Long Short-Term Memory (LSTM) encoder, a Gated Recurrent Unit (GRU) decoder, and a multi-head attention mechanism. This approach enhances prediction accuracy and robustness across different control modes and load spectra by leveraging multi-channel inputs and cross-variable feature interactions, thereby capturing both short-term high-frequency dynamics and long-term slow drift characteristics. Experiments on long-term data from real test benches demonstrate that the model achieves a stable MSE below 0.01 on the validation set, with MAE and RMSE of approximately 0.018 and 0.052, respectively, and a coefficient of determination reaching 0.98. This significantly outperforms traditional identification methods and single RNN models. Sensitivity analysis indicates that a prediction stride of 10 achieves an optimal balance between accuracy and computational overhead. Ablation experiments validated the contribution of the multi-head attention and decoder architecture to cross-variable coupling modeling. The model can be applied to residual-driven early warning in health monitoring and to risk assessment and scheme optimization in test design. It enables near-real-time deployment, providing a practical data-driven technical pathway for reliability assurance in advanced equipment.
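The abstract reports MSE, MAE, RMSE, and the coefficient of determination R². For reference, these metrics are computed as follows (the values below are made-up toy data, not the paper's results):

```python
import math

def regression_metrics(y_true, y_pred):
    """MSE, MAE, RMSE and coefficient of determination R²."""
    n = len(y_true)
    errs = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errs) / n
    mae = sum(abs(e) for e in errs) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1 - (mse * n) / ss_tot          # 1 - SSE / SS_total
    return mse, mae, math.sqrt(mse), r2

y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
mse, mae, rmse, r2 = regression_metrics(y_true, y_pred)
```

Note that R² = 1 − SSE/SS_total compares the model against a constant-mean predictor, which is why a value near 1 (0.98 in the paper) indicates the predictions track the measured state parameters closely.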
Funding: Supported by the National Key R&D Program of China (2024YFA1802300); the Major Science and Technology Program of Hainan Province (ZDKJ2021037); the Regional Innovation and Development Joint Fund of the National Natural Science Foundation of China (U24A20677); the Hainan Province Science and Technology Special Fund (ZDYF2020117, ZDY2024SHFZ143); the Hainan Province Science and Technology Project (LCXY202102, LCYX202203, LCYX202301, LCYx202502); the Innovative Research Project for Postgraduate Students in Hainan Medical University (HYYB2021A05); the Hainan Province Clinical Medical Center; and the Specific Research Fund of the Innovation Platform for Academicians of Hainan Province (YSPTZX202310).
Abstract: Patients affected by monogenic diseases impose a substantial burden on both themselves and their families. The primary preventive measure, invasive prenatal diagnosis, carries a risk of miscarriage and cannot be performed early in pregnancy. Hence, there is a need for non-invasive prenatal testing (NIPT) for monogenic diseases. By utilizing enriched cell-free fetal DNA (cffDNA) from maternal plasma, we refine the NIPT method, combining targeted region capture technology, haplotyping, and analysis of informative-site frequency. We apply this method to 93 clinical families at genetic risk for thalassemia, encompassing various genetic variant types, to establish a workflow and evaluate its efficiency. Our approach requires only 3 ng of DNA input to generate 0.1 Gb of informative target genomic data and works with as little as 3% cffDNA. The method has a 98.16% success rate and 100% concordance with conventional invasive methods. Furthermore, we demonstrate the ability to analyze fetal genotypes as early as eight weeks of gestation. This study establishes an optimized NIPT method for the early detection of various thalassemia disorders during pregnancy. The technique demonstrates high accuracy and potential for clinical application in prenatal diagnosis.