It is known that correlation does not imply causality. Some relationships identified in the analysis of data are coincidental or unknown, and some are produced by real-world causality of the situation, which is problematic, since there is a need to differentiate between these two scenarios. Until recently, the proper (semantic) causality of a relationship could be determined only by human experts from the area of expertise of the studied data. This has changed with the advance of large language models, which are often utilized as surrogates for such human experts, making the process automated and readily available to all data analysts. This motivates the main objective of this work, which is to introduce the design and implementation of a large language model-based semantic causality evaluator built on correlation analysis, together with its visual analysis model, called the Causal heatmap. After the implementation itself, the model is evaluated from three points of view: the quality of the visual model, the quality of the causal evaluation based on large language models, and a comparative analysis. The results reached in the study highlight the usability of large language models in this task and the potential of the proposed approach in the analysis of unknown datasets. The results of the experimental evaluation demonstrate the usefulness of the Causal heatmap method, supported by the evident highlighting of interesting relationships while suppressing irrelevant ones.
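A minimal sketch of how such an evaluator might be wired, assuming a hypothetical `semantic_causality_score` placeholder stands in for the actual large language model query; the prompt design and scoring scale of the real system are not specified here:

```python
# Sketch: combine a correlation matrix with an LLM-derived semantic-causality
# score to build a Causal-heatmap-style weighted matrix.
import numpy as np
import pandas as pd

def semantic_causality_score(var_a: str, var_b: str) -> float:
    """Placeholder for the LLM query, e.g. 'How plausible (0-1) is a
    real-world causal link between {var_a} and {var_b}?'. A real
    implementation would call a model API and parse its answer."""
    plausible = {frozenset(["rainfall", "umbrella_sales"])}
    return 0.9 if frozenset([var_a, var_b]) in plausible else 0.1

df = pd.DataFrame(np.random.rand(100, 3),
                  columns=["rainfall", "umbrella_sales", "lottery_wins"])
corr = df.corr()

# Weight each correlation by the LLM's causal plausibility, suppressing
# strong-but-spurious cells while keeping semantically plausible ones.
weights = pd.DataFrame(
    [[semantic_causality_score(a, b) for b in corr.columns] for a in corr.index],
    index=corr.index, columns=corr.columns,
)
causal_heatmap = corr * weights
print(causal_heatmap.round(2))
```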
Detecting geomagnetic anomalies preceding earthquakes is a challenging yet promising area of research that has gained increasing attention in recent years. This study introduces a novel reconstruction-based modeling approach enhanced by negative learning, employing a Bidirectional Long Short-Term Memory (BiLSTM) network explicitly trained to accurately reconstruct non-seismic geomagnetic signals while intentionally amplifying reconstruction errors for seismic signals. By penalizing the model for accurately reconstructing seismic anomalies, the negative learning approach effectively magnifies the differences between normal and anomalous data. This strategic differentiation enhances the sensitivity of the BiLSTM network, enabling improved detection of subtle geomagnetic anomalies that may serve as earthquake precursors. Experimental validation demonstrated statistically significantly higher reconstruction errors for seismic signals compared to non-seismic signals, confirmed through the Mann-Whitney U test with a p-value of 0.0035 for Root Mean Square Error (RMSE). These results provide compelling evidence of the enhanced anomaly detection capability achieved through negative learning. Unlike traditional classification-based methods, negative learning explicitly encourages sensitivity to subtle precursor signals embedded within complex geomagnetic data, establishing a robust basis for further development of reliable earthquake prediction methods.
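The negative-learning idea can be read as a signed reconstruction objective: reward low error on non-seismic windows and penalize low error on seismic ones. A minimal PyTorch sketch under that reading; the hinged penalty form and its weights are assumptions, not the paper's exact objective:

```python
import torch
import torch.nn as nn

class BiLSTMReconstructor(nn.Module):
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_features)

    def forward(self, x):
        out, _ = self.lstm(x)          # (batch, time, 2*hidden)
        return self.head(out)          # reconstruct each time step

def negative_learning_loss(model, normal_x, seismic_x, lam=0.1, margin=1.0):
    mse = nn.MSELoss()
    loss_pos = mse(model(normal_x), normal_x)    # reconstruct normal data well
    loss_neg = mse(model(seismic_x), seismic_x)  # ...but seismic data badly
    # Hinge the negative term so it cannot dominate the objective:
    return loss_pos + lam * torch.clamp(margin - loss_neg, min=0.0)

model = BiLSTMReconstructor()
normal_x = torch.randn(8, 120, 3)    # (batch, time steps, channels)
seismic_x = torch.randn(8, 120, 3)
loss = negative_learning_loss(model, normal_x, seismic_x)
loss.backward()
```

At inference, windows whose reconstruction error exceeds a threshold calibrated on non-seismic data would be flagged as candidate precursors.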
Interaction between the converter and the grid may lead to harmonic oscillations. The impedance-based method is an effective way to deal with this stability issue. In this study, the impedance-based method is used to investigate the small-signal stability of a cascaded 12-pulse line-commutated converter-based high-voltage direct current (LCC-HVDC) transmission system. In the modeling part, the impedance models of the single rectifier and inverter are established respectively, taking into account the effect of frequency coupling, which improves the accuracy of the models. Based on these models, the AC impedance models of the cascaded LCC-HVDC transmission system are established on both the rectifier and inverter sides. In the stability analysis part, the stability of the system is analyzed under different working conditions. The simulation results reveal that the established impedance model can properly represent the stability of this system. The findings of this study can provide a theoretical reference for the stability design and oscillation suppression strategies of LCC-HVDC transmission systems and LCC interconnected systems.
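As a rough illustration of the impedance-based criterion, a frequency sweep can flag bands where the grid-to-converter impedance ratio approaches unity magnitude with near-180° phase, a classic oscillation-risk signature. The toy impedances below are assumptions for illustration, not the paper's LCC-HVDC models:

```python
import numpy as np

f = np.logspace(0, 3, 2000)            # 1 Hz .. 1 kHz sweep
s = 1j * 2 * np.pi * f

Zg = 0.5 + s * 5e-3                    # toy grid impedance: R + sL
Zc = 20.0 / (1 + s * 2e-3) - 2.0       # toy converter impedance with a
                                       # negative-resistance contribution

# Minor-loop-gain style screening: risk where |Zg/Zc| ~ 1 and the phase of
# Zg/Zc is close to 180 degrees (small phase margin).
ratio = Zg / Zc
mag, phase = np.abs(ratio), np.degrees(np.angle(ratio))
risky = (np.abs(mag - 1.0) < 0.05) & (np.abs(np.abs(phase) - 180.0) < 10.0)
for fk in f[risky]:
    print(f"potential oscillation risk near {fk:.1f} Hz")
```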
This paper introduces a novel fractional-order model based on the Caputo-Fabrizio (CF) derivative for analyzing computer virus propagation in networked environments. The model partitions the computer population into four compartments: susceptible, latently infected, breaking-out, and antivirus-capable systems. By employing the CF derivative, which uses a nonsingular exponential kernel, the framework effectively captures memory-dependent and nonlocal characteristics intrinsic to cyber systems, aspects inadequately represented by traditional integer-order models. Under Lipschitz continuity and boundedness assumptions, the existence and uniqueness of solutions are rigorously established via fixed-point theory. We develop a tailored two-step Adams-Bashforth numerical scheme for the CF framework and prove its second-order accuracy. Extensive numerical simulations across various fractional orders reveal that memory effects significantly influence virus transmission and control dynamics; smaller fractional orders produce more pronounced memory effects, delaying both infection spread and antivirus activation. Further theoretical analysis, including Hyers-Ulam stability and sensitivity assessments, reinforces the model's robustness and identifies key parameters governing virus dynamics. The study also extends the framework to incorporate stochastic effects through a stochastic CF formulation. These results underscore fractional-order modeling as a powerful analytical tool for developing robust and effective cybersecurity strategies.
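For readers unfamiliar with CF schemes, here is a minimal sketch of a two-step Adams-Bashforth integrator in the Atangana-Owolabi form with normalization M(α)=1; this is an assumed standard formulation, and the toy two-state right-hand side is illustrative rather than the paper's four-compartment model:

```python
import numpy as np

def cf_ab2(f, y0, t, alpha):
    """Two-step Adams-Bashforth scheme for the Caputo-Fabrizio system
    CF_D^alpha y = f(t, y), Atangana-Owolabi form with M(alpha) = 1."""
    h = t[1] - t[0]
    y = np.zeros((len(t), np.size(y0)))
    y[0] = y0
    y[1] = y[0] + h * np.asarray(f(t[0], y[0]))   # Euler bootstrap step
    c1 = (1 - alpha) + 3 * h * alpha / 2
    c2 = (1 - alpha) + h * alpha / 2
    for n in range(1, len(t) - 1):
        y[n + 1] = (y[n] + c1 * np.asarray(f(t[n], y[n]))
                         - c2 * np.asarray(f(t[n - 1], y[n - 1])))
    return y

# Toy susceptible/infected dynamics (not the paper's full model):
beta, gamma = 0.4, 0.1
f = lambda t, y: np.array([-beta * y[0] * y[1],
                           beta * y[0] * y[1] - gamma * y[1]])
t = np.linspace(0, 60, 601)
y = cf_ab2(f, np.array([0.99, 0.01]), t, alpha=0.9)
print(y[-1])   # smaller alpha delays the epidemic peak (memory effect)
```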
In this work, a computational modelling and analysis framework is developed to investigate the thermal buckling behavior of doubly-curved composite shells reinforced with graphene-origami (G-Ori) auxetic metamaterials. A semi-analytical formulation based on the First-Order Shear Deformation Theory (FSDT) and the principle of virtual displacements is established, and closed-form solutions are derived via Navier's method for simply supported boundary conditions. The G-Ori metamaterial reinforcements are treated as programmable constructs whose effective thermo-mechanical properties are obtained via micromechanical homogenization and incorporated into the shell model. A comprehensive parametric study examines the influence of folding geometry, dispersion arrangement, reinforcement weight fraction, curvature parameters, and elastic foundation support on the critical buckling temperature (CBT). The results reveal that, under optimal folding geometry and reinforcement alignment with principal stress trajectories, the CBT can increase by more than 150%. Furthermore, the combined effect of G-Ori reinforcement and elastic foundation substantially enhances thermal buckling resistance. These findings establish design guidelines for architected composite shells in applications such as aerospace thermal skins, morphing structures, and thermally-responsive systems, and illustrate the potential of auxetic graphene metamaterials for multifunctional, lightweight, and thermally robust structural components.
Objective: Sepsis exhibits remarkable heterogeneity in disease progression trajectories, and accurate identification of distinct trajectory-based phenotypes is critical for implementing personalized therapeutic strategies and prognostic assessment. However, trajectory clustering analysis of time-series clinical data poses substantial methodological challenges for researchers. This study provides a comprehensive tutorial framework demonstrating six trajectory modeling approaches integrated with proteomic analysis to guide researchers in identifying sepsis subtypes after laparoscopic surgery. Methods: This study employs simulated longitudinal data from 300 septic patients after laparoscopic surgery to demonstrate six trajectory modeling methods (group-based trajectory modeling, latent growth mixture modeling, latent transition analysis, time-varying effect modeling, K-means for longitudinal data, and agglomerative hierarchical clustering) for identifying associations between predefined sequential organ failure assessment trajectories and 25 proteomic biomarkers. Clustering performance was evaluated via multiple metrics, and a biomarker discovery pipeline integrating principal component analysis, random forests, feature selection, and receiver operating characteristic analysis was developed. Results: The six methods demonstrated varying performance in identifying trajectory structures, with each approach exhibiting distinct analytical characteristics. The performance metrics revealed differences across methods, which may inform context-specific method selection and interpretation strategies. Conclusion: This study illustrates practical implementations of trajectory modeling approaches under controlled conditions, facilitating informed method selection for clinical researchers. The inclusion of complete R code and integrated proteomics workflows offers a reproducible analytical framework connecting temporal pattern recognition to biomarker discovery. Beyond sepsis, this pipeline-oriented approach may be adapted to diverse clinical scenarios requiring longitudinal disease characterization and precision medicine applications. The comparative analysis reveals that each method has distinct strengths, providing a practical guide for clinical researchers in selecting appropriate methods based on their specific study goals and data characteristics.
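As a taste of the simplest of the six approaches, the K-means-for-longitudinal-data idea can be sketched in a few lines; this is a Python analogue of the KML method, and the simulated trajectories are illustrative rather than the paper's data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Simulated SOFA-score trajectories for 300 patients over 7 days,
# drawn from three latent patterns (improving, stable, worsening):
days = np.arange(7)
patterns = [8 - 0.8 * days, 6 + 0 * days, 5 + 1.0 * days]
X = np.vstack([p + rng.normal(0, 1.0, size=(100, 7)) for p in patterns])

# K-means on the raw trajectory vectors; model-based alternatives
# (GBTM, LGMM) would instead fit parametric growth mixtures.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("silhouette:", round(silhouette_score(X, km.labels_), 3))
print("cluster mean trajectories:\n", np.round(km.cluster_centers_, 1))
```

The recovered cluster means approximate the three generating patterns, and the silhouette score is one of the multiple clustering metrics the tutorial compares.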
This study investigates the uncertain dynamic characterization of hybrid composite plates by employing advanced machine-assisted finite element methodologies. Hybrid composites, widely used in aerospace, automotive, and structural applications, often face variability in material properties, geometric configurations, and manufacturing processes, leading to uncertainty in their dynamic response. To address this, three surrogate-based machine learning approaches, namely radial basis function (RBF), multivariate adaptive regression splines (MARS), and polynomial neural networks (PNN), are integrated with a finite element framework to efficiently capture the stochastic behavior of these plates. The research focuses on predicting the first three natural frequencies under material uncertainties, which are critical to ensuring structural reliability. Monte Carlo simulation (MCS) is used as a benchmark for generating probabilistic datasets, including mean values, standard deviations, and probability density functions. The surrogate models are then trained and validated against these datasets, enabling accurate representation of uncertainty with substantially fewer samples compared to conventional MCS. Among the methods studied, the RBF model demonstrates superior performance, closely approximating MCS results with a reduced sample size, thereby achieving significant computational savings. The proposed framework not only reduces computational time and costs but also maintains high predictive accuracy, making it well-suited for complex engineering systems. Beyond free vibration analysis, the methodology can be extended to more sophisticated scenarios, such as forced vibration, damping effects, and nonlinear structural responses. Overall, this work presents a computationally efficient and robust approach for surrogate-based uncertainty quantification, advancing the analysis and design of hybrid composite structures under uncertainty.
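The surrogate-versus-MCS workflow can be sketched with SciPy's RBF interpolator: train on a small design-of-experiments sample, then run the cheap surrogate inside a large Monte Carlo loop. The closed-form `natural_frequency` function below is an assumed stand-in for the finite element solver:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)

def natural_frequency(params):
    """Stand-in for the FE solver: maps uncertain inputs
    (E1, E2, density) to a toy 'first natural frequency'."""
    E1, E2, rho = params.T
    return np.sqrt((2.0 * E1 + E2) / rho)

# A small training sample replaces thousands of FE runs...
X_train = rng.uniform([120, 8, 1.4], [160, 12, 1.8], size=(80, 3))
rbf = RBFInterpolator(X_train, natural_frequency(X_train))

# ...and the surrogate then drives a full-size Monte Carlo study.
X_mc = rng.normal([140, 10, 1.6], [5.0, 0.5, 0.05], size=(100000, 3))
freq = rbf(X_mc)
print(f"mean={freq.mean():.3f}, std={freq.std():.3f}")
```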
China’s environmental governance strategy provides a distinctive pathway for integrating sustainable development into national policy. Understanding its policy trajectory is essential for assessing China’s contribution to global sustainable development and the United Nations Sustainable Development Goals (SDGs). This study constructs a comprehensive database of 425 national environmental governance policy documents issued between 1978 and 2022 and applies Latent Dirichlet Allocation (LDA) modeling to examine the evolution of policy themes and discourse. The results show that China’s environmental governance has undergone four stages (initial exploration, detailed development, transformative leap, and diverse prosperity), reflecting a progressive shift toward more integrated and coordinated governance. Policy priorities have evolved from a primary focus on pollution control and energy transition to an emphasis on institutional construction and organizational reform, thereby strengthening alignment with the SDGs. This transformation is characterized by recurring developmental themes and increasingly preventive, forward-looking, and system-oriented governance approaches. Moreover, the co-evolution of policy concepts and implementation has driven a transition from localized, end-of-pipe responses to comprehensive governance frameworks, alongside a shift from normative guidance towards effectiveness-oriented policy design. By employing a data-driven text analysis approach, this study offers a systematic framework for tracing long-term policy evolution and assessing its implications for sustainable development.
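A minimal sketch of the LDA step using scikit-learn, with four toy policy snippets standing in for the 425-document corpus:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "pollution control of industrial wastewater and air emissions",
    "energy transition and renewable power development targets",
    "institutional construction and organizational reform of governance",
    "comprehensive governance frameworks replacing end-of-pipe responses",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)                      # document-term counts
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Inspect the top words per topic to label the discovered themes:
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
```

Tracking the topic mixtures of documents over time, rather than just the topics themselves, is what turns this into the stage-wise evolution analysis described above.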
The probabilistic stability evolution analysis of reservoir bank slopes is a crucial aspect of risk assessment, with core challenges including the consideration of deformation mechanisms and the accurate determination of mechanical parameters. In this study, a novel time-varying reliability analysis framework based on sequential Bayesian updating of mechanical parameters is proposed. The inverted parameters account for time-dependent damage behavior, incorporating the effect of water and a strain-driven softening-hardening process that depends on sliding states. The likelihood function is enhanced to simultaneously consider observation error, surrogate model prediction error, and model structural error, with the introduction of a physical penalty. Exploration of the high-dimensional parameter space is achieved via the Hamiltonian Monte Carlo (HMC) method and a physics knowledge-based time-dependent deformation surrogate model. The time-varying reliability analysis of the slope is performed using the multi-grid method. Taking a reservoir bank slope as a case study, the sequential updating of 12 mechanical parameters is conducted based on deformation time series from 16 monitoring points, thereby validating the proposed framework. The results indicate that the proposed framework effectively captures the posterior distribution of the mechanical parameters, with the case slope remaining in a critically stable state after overall sliding and showing a high failure probability. Introducing model structural error can reduce parameter compensation, and a reasonable sequential updating step size can improve inversion accuracy.
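The enhanced likelihood can be sketched as a Gaussian whose variance pools the three error sources, plus a hard physical penalty. This is a simplified reading: the toy surrogate and the admissibility constraint below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def log_likelihood(theta, obs, surrogate, sig_obs, sig_sur, sig_mod):
    """Gaussian likelihood whose total variance pools observation error,
    surrogate prediction error, and model structural error."""
    resid = obs - surrogate(theta)
    var = sig_obs**2 + sig_sur**2 + sig_mod**2
    return -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))

def physical_penalty(theta):
    """Reject physically inadmissible parameters, e.g. a residual friction
    angle exceeding its peak value (illustrative constraint only)."""
    phi_peak, phi_res = theta
    return -1e6 if phi_res > phi_peak else 0.0

# One sequential-updating step: the unnormalized log posterior that an HMC
# sampler (hand-rolled leapfrog, PyMC, etc.) would explore at this stage.
surrogate = lambda th: th[0] * 0.8 + th[1] * 0.2   # toy deformation surrogate
obs = np.array([24.0])                             # monitored deformation (mm)
theta = np.array([28.0, 12.0])                     # candidate parameters
logp = log_likelihood(theta, obs, surrogate, 0.5, 0.3, 0.8) + physical_penalty(theta)
print(round(logp, 3))
```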
Objective: To systematically characterize the developmental trajectory and interdisciplinary integration of intelligent diagnosis in traditional Chinese medicine (TCM) through quantitative topic evolution analysis, we addressed the fragmentation of existing research and clarified the long-term research structure and evolutionary patterns of the field. Methods: A topic evolution analysis was performed on Chinese-language literature pertaining to intelligent diagnosis in TCM. Publications were retrieved from the China National Knowledge Infrastructure (CNKI), Wanfang Data, and the China Science and Technology Journal Database (VIP), covering the period from database inception to July 3, 2025. A hybrid segmentation approach, based on cumulative publication growth trends and inflection point detection, was applied to divide the research timeline into distinct stages. Subsequently, the latent Dirichlet allocation (LDA) model was used to extract research topics, followed by alignment and evolutionary analysis of topics across different stages. Results: A total of 3919 publications published between 2003 and 2025 were included, and the research trajectory was divided into five stages based on data-driven breakpoint detection. The field exhibited a clear evolutionary shift from early rule-based systems and tongue and pulse image and signal analysis (2006–2010), to machine-learning-based syndrome and prescription modeling (2011–2015), followed by deep-learning-driven pattern recognition and formula association (2016–2020). Since 2021, research has increasingly emphasized knowledge-graph construction, multimodal integration, and intelligent clinical decision-support systems, with recent studies (2024–2025) showing the emergence of large language models and agent-based diagnostic frameworks. Topic evolution analysis further revealed sustained cross-stage continuity in syndrome modeling and prescription association analysis, alongside the progressive consolidation of integrated intelligent diagnostic platforms. Conclusion: By identifying key technological transitions and persistent core research themes, our findings offer a structured reference framework for the design of intelligent diagnostic systems, the construction of knowledge-driven clinical decision-support tools, and the alignment of AI models with TCM diagnostic logic. Importantly, the stage-based evolutionary insights derived from this analysis can inform future methodological choices, improve model interpretability and clinical applicability, and support the translation of intelligent TCM diagnosis from experimental research to real-world clinical practice.
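A minimal sketch of the inflection-point idea behind the hybrid segmentation: look for the largest jump in annual output, i.e., the second difference of the cumulative publication curve. The synthetic counts below are illustrative, not the 3919-publication corpus:

```python
import numpy as np

# Toy annual publication counts with a sharp take-off in 2016:
years = np.arange(2003, 2026)
pubs = np.where(years < 2016, 5 + (years - 2003), 60 + 10 * (years - 2016))

# The second difference of the cumulative curve equals the year-over-year
# change in annual output; its largest jump marks a candidate stage boundary.
rate_change = np.diff(pubs)
knee = years[1:][np.argmax(rate_change)]
print("candidate stage boundary:", knee)   # -> 2016
```

In practice, several such breakpoints would be detected and then cross-checked against the LDA topic shifts before fixing the stage boundaries.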
Anti-jamming performance evaluation has recently received significant attention. For Link-16, the anti-jamming performance evaluation and the selection of optimal anti-jamming technologies are urgent problems to be solved. A comprehensive evaluation method is proposed, which combines grey relational analysis (GRA) and the cloud model, to evaluate the anti-jamming performance of Link-16. Firstly, on the basis of establishing the anti-jamming performance evaluation indicator system of Link-16, a linear combination of the analytic hierarchy process (AHP) and the entropy weight method (EWM) is used to calculate the combined weights. Secondly, the qualitative and quantitative concept transformation model, i.e., the cloud model, is introduced to evaluate the anti-jamming abilities of Link-16 under each jamming scheme. In addition, GRA calculates the correlation degree between the evaluation indicators and the anti-jamming performance of Link-16, and identifies the best anti-jamming technology. Finally, simulation results show that the proposed evaluation model achieves feasible and practical evaluation, opening up a novel way for research on anti-jamming performance evaluation of Link-16.
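The entropy weight method half of the combined weighting is easy to make concrete. A minimal sketch with an illustrative decision matrix and an assumed 50/50 linear combination with AHP weights (the paper's actual combination coefficient is not specified here):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method (EWM): rows are schemes, columns are indicators.
    Indicators with more dispersion across schemes carry more information
    and therefore receive larger weights."""
    P = X / X.sum(axis=0)                      # column-wise proportions
    P = np.where(P == 0, 1e-12, P)             # guard the logarithm
    e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])   # entropy per column
    d = 1 - e                                  # divergence (informativeness)
    return d / d.sum()

# Toy matrix: 4 jamming schemes x 3 anti-jamming indicators, all benefit-type:
X = np.array([[0.8, 0.6, 0.9],
              [0.7, 0.9, 0.4],
              [0.9, 0.5, 0.7],
              [0.6, 0.8, 0.8]])
w_ewm = entropy_weights(X)
w_ahp = np.array([0.5, 0.3, 0.2])              # illustrative subjective weights
w = 0.5 * w_ewm + 0.5 * w_ahp                  # assumed linear combination
print(np.round(w, 3))
```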
Shotcrete is one of the common solutions for shallow sliding. It works by forming a protective layer with high strength and cementing the loose soil particles on the slope surface to prevent shallow sliding. However, the solidification time of conventional cement paste is long when shotcrete is used to treat cohesionless soil landslides. The idea of reinforcing slopes with polyurethane-solidified soil (i.e., a mixture of polyurethane and sand) is proposed here. Model tests and finite element analysis were carried out to study the effectiveness of the proposed new method for the emergency treatment of cohesionless soil landslides. Surcharge loading on the crest of the slope was applied step by step until a landslide was triggered, so as to test and compare the stability and bearing capacity of slope models under different conditions. The simulated slope displacements were relatively close to the measured results, and the simulated slope deformation characteristics were in good agreement with the observed phenomena, which verifies the accuracy of the numerical method. Under surcharge loading on the crest of the slope, the unreinforced slope slid when the surcharge loading exceeded 30 kPa, presenting a failure mode of local instability and collapse in the shallow layer of the slope top. The reinforced slope remained stable even when the surcharge loading reached 48 kPa, and its displacement was reduced by more than 95%. Overall, this study verifies the effectiveness of polyurethane in the emergency treatment of cohesionless soil landslides, a method that should have broad application prospects in the field of geological disasters concerning the safety of people's lives.
Large-scale Language Models (LLMs) have achieved significant breakthroughs in Natural Language Processing (NLP), driven by the pre-training and fine-tuning paradigm. While this approach allows models to specialize in specific tasks with reduced training costs, the substantial memory requirements during fine-tuning present a barrier to broader deployment. Parameter-Efficient Fine-Tuning (PEFT) techniques, such as Low-Rank Adaptation (LoRA), and parameter quantization methods have emerged as solutions to address these challenges by optimizing memory usage and computational efficiency. Among these, QLoRA, which combines PEFT and quantization, has demonstrated notable success in reducing memory footprints during fine-tuning, prompting the development of various QLoRA variants. Despite these advancements, the quantitative impact of key variables on the fine-tuning performance of quantized LLMs remains underexplored. This study presents a comprehensive analysis of these key variables, focusing on their influence across different layer types and depths within LLM architectures. Our investigation uncovers several critical findings: (1) larger layers, such as MLP layers, can maintain performance despite reductions in adapter rank, while smaller layers, like self-attention layers, are more sensitive to such changes; (2) the effectiveness of balancing factors depends more on specific values than on layer type or depth; (3) in quantization-aware fine-tuning, larger layers can effectively utilize smaller adapters, whereas smaller layers struggle to do so. These insights suggest that layer type is a more significant determinant of fine-tuning success than layer depth when optimizing quantized LLMs. Moreover, for the same reduction in trainable parameters, cutting the trainable parameters in a larger layer preserves fine-tuning accuracy better than doing so in a smaller one. This study provides valuable guidance for more efficient fine-tuning strategies and opens avenues for further research into optimizing LLM fine-tuning in resource-constrained environments.
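One practical consequence of these findings is uneven rank allocation across layer types. A sketch using the peft library, assuming a recent release that supports `rank_pattern` and LLaMA-style module naming; the checkpoint and specific ranks are illustrative, not the paper's configuration:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

config = LoraConfig(
    r=16,                                   # default adapter rank (attention)
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    # Shrink the rank on the large MLP projections, which the findings above
    # suggest tolerate low-rank adapters better than attention layers do:
    rank_pattern={"gate_proj": 4, "up_proj": 4, "down_proj": 4},
    task_type="CAUSAL_LM",
)
model = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
model = get_peft_model(model, config)
model.print_trainable_parameters()   # check the parameter budget achieved
```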
DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge currently faced by this technology is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. In terms of data selection, 1176 genes from a laboratory mouse gene expression dataset under two experimental conditions were chosen, setting up two conditions, pneumococcal infection and no infection, and constructing a mixed-effects model. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed, with GSEA used to biologically validate the preliminary results. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, and the analysis categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
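A minimal sketch of a mixed-effects analysis of this kind with statsmodels, using synthetic long-format data: infection status enters as a fixed effect, while a per-gene random intercept absorbs baseline gene-to-gene variation instead of inflating the residual:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
# Synthetic long-format table: 60 genes x 2 conditions x 5 replicates.
genes = np.repeat([f"g{i}" for i in range(60)], 10)
condition = np.tile(np.repeat(["infected", "control"], 5), 60)
expr = rng.normal(5.0, 1.0, size=600) + (condition == "infected") * 0.6
df = pd.DataFrame({"gene": genes, "condition": condition, "expr": expr})

# Fixed effect: infection status; random intercept grouped by gene.
fit = smf.mixedlm("expr ~ condition", df, groups=df["gene"]).fit()
print(fit.summary())
```

A permutation test, as described above, would shuffle the condition labels and refit to obtain an empirical null distribution for the infection coefficient.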
With the rapid development of generative artificial intelligence technologies, represented by large language models, university-level computer science education is undergoing a critical transition from knowledge-based instruction to competency-oriented teaching. A postgraduate student competency evaluation model can serve as a framework to organize and guide both teaching and research activities at the postgraduate level, and a number of relevant research efforts have already been conducted in this area. Graduate education plays a vital role not only as a continuation and enhancement of undergraduate education but also as essential preparation for future research endeavors. An analysis of the acceptance of competency evaluation models refers to the assessment of how various stakeholders perceive the importance of different components within the model. Investigating the degree of acceptance among diverse groups, such as current undergraduate students, current postgraduate students, graduates with less than three years of work experience, and those with more than three years of work experience, can offer valuable insights for improving and optimizing postgraduate education and training practices.
The dynamic, heterogeneous nature of Edge computing in the Internet of Things (Edge-IoT) and Industrial IoT (IIoT) networks brings unique and evolving cybersecurity challenges. This study maps cyber threats in Edge-IoT/IIoT environments to the Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) framework by MITRE and introduces a lightweight, data-driven scoring model that enables rapid identification and prioritization of attacks. Inspired by the Factor Analysis of Information Risk model, our proposed scoring model integrates four key metrics: Common Vulnerability Scoring System (CVSS)-based severity scoring, Cyber Kill Chain-based difficulty estimation, Deep Neural Network-driven detection scoring, and frequency analysis based on dataset prevalence. By aggregating these indicators, the model generates comprehensive risk profiles, facilitating actionable prioritization of threats. The robustness and stability of the scoring model are validated through non-parametric correlation analysis using Spearman's and Kendall's rank correlation coefficients, demonstrating consistent performance across diverse scenarios. The approach culminates in a prioritized attack ranking that provides actionable guidance for risk mitigation and resource allocation in Edge-IoT/IIoT security operations. By leveraging real-world data to align MITRE ATT&CK techniques with CVSS metrics, the framework offers a standardized and practically applicable solution for consistent threat assessment in operational settings. The proposed lightweight scoring model delivers rapid and reliable results under dynamic cyber conditions, facilitating timely identification of attack scenarios and prioritization of response strategies. Our systematic integration of established taxonomies with data-driven indicators strengthens practical risk management and supports strategic planning in next-generation IoT deployments. Ultimately, this work advances adaptive threat modeling for Edge/IIoT ecosystems and establishes a robust foundation for evidence-based prioritization in emerging cyber-physical infrastructures.
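A minimal sketch of the aggregation and the rank-stability check; the metric values, weights, and normalization below are illustrative assumptions, not the paper's calibrated figures:

```python
import numpy as np
from scipy.stats import kendalltau, spearmanr

# Illustrative per-technique metrics, each assumed normalized to [0, 1]:
# [CVSS severity, kill-chain ease, DNN detection score, dataset frequency].
metrics = {
    "T1046 Network Service Discovery": [0.75, 0.60, 0.80, 0.90],
    "T1498 Network Denial of Service": [0.86, 0.40, 0.70, 0.95],
    "T1557 Adversary-in-the-Middle":   [0.92, 0.70, 0.55, 0.40],
}
w = np.array([0.35, 0.20, 0.25, 0.20])          # illustrative weights
scores = {t: float(np.dot(w, m)) for t, m in metrics.items()}
for t, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{s:.3f}  {t}")

# Rank-stability check in the spirit of the paper: perturb the weights and
# confirm the ranking barely moves, via non-parametric rank correlation.
w_alt = np.array([0.30, 0.25, 0.25, 0.20])
a = [np.dot(w, m) for m in metrics.values()]
b = [np.dot(w_alt, m) for m in metrics.values()]
print("spearman:", round(spearmanr(a, b)[0], 3),
      "kendall:", round(kendalltau(a, b)[0], 3))
```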
Background: With the rapid development of artificial intelligence (AI), large language models (LLMs) have emerged as a potent tool for invigorating ophthalmology across clinical, educational, and research fields, and their accuracy and reliability have been tested. This bibliometric analysis aims to provide an overview of research on LLMs in ophthalmology from both thematic and geographical perspectives. Methods: All existing and highly cited LLM-related ophthalmology research papers published in English up to 24th April 2025 were sourced from Scopus, PubMed, and Web of Science. The characteristics of these publications, including publication output, authors, journals, countries, institutions, citations, and research domains, were analyzed using Biblioshiny and VOSviewer software. Results: A total of 277 articles from 1,459 authors and 89 journals were included in this study. Although relevant publications began to appear in 2019, there was a significant increase starting from 2023. He M and Shi D are the most prolific authors, while Investigative Ophthalmology & Visual Science stands out as the most prominent journal. Most of the top-publishing countries are high-income economies, with the USA taking the lead, and the University of California is the leading institution. VOSviewer identified 5 clusters in the keyword co-occurrence analysis, indicating that current research focuses on the clinical applications of LLMs, particularly in diagnosis and patient education. Conclusions: While LLMs have demonstrated effectiveness in retaining knowledge, their accuracy in image-based diagnosis remains limited. Therefore, future research should investigate fine-tuning strategies and domain-specific adaptations to close this gap. Although research on the applications of LLMs in ophthalmology is still in its early stages, it holds significant potential for advancing the field.
Sentiment analysis, a cornerstone of natural language processing, has witnessed remarkable advancements driven by deep learning models that demonstrate impressive accuracy in discerning sentiment from text across various domains. However, the deployment of such models in resource-constrained environments presents a unique set of challenges that require innovative solutions. Resource-constrained environments encompass scenarios where computing resources, memory, and energy availability are restricted. To empower sentiment analysis in these environments, we address this crucial need by leveraging lightweight pre-trained models. These models, derived from popular architectures such as DistilBERT, MobileBERT, ALBERT, TinyBERT, ELECTRA, and SqueezeBERT, offer a promising solution to the resource limitations imposed by such environments. By distilling the knowledge from larger models into smaller ones and employing various optimization techniques, these lightweight models aim to strike a balance between performance and resource efficiency. This paper explores the performance of multiple lightweight pre-trained models on sentiment analysis tasks specific to such environments and provides insights into their viability for practical deployment.
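A minimal sketch of running one such lightweight model with the transformers pipeline API; the other compact architectures listed above can be swapped in via the `model` argument:

```python
from transformers import pipeline

# DistilBERT fine-tuned on SST-2 is a common lightweight sentiment baseline.
clf = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(clf(["The battery life is fantastic.",
           "The app crashes constantly."]))
```

In a resource-constrained evaluation, one would pair this with latency, peak-memory, and energy measurements on the target device, not just accuracy.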
In clinical research, subgroup analysis can help identify patient groups that respond better or worse to specific treatments, improving therapeutic effect and safety, and is of great significance in precision medicine. This article considers subgroup analysis methods for longitudinal data containing multiple covariates and biomarkers. We divide subgroups based on whether a linear combination of these biomarkers exceeds a predetermined threshold, and assess the heterogeneity of treatment effects across subgroups using the interaction between subgroups and exposure variables. Quantile regression is used to better characterize the global distribution of the response variable, and sparsity penalties are imposed to achieve variable selection over covariates and biomarkers. The effectiveness of the proposed methodology for both variable selection and parameter estimation is verified through simulation studies. Finally, we demonstrate the application of this method by analyzing data from the PA.3 trial, further illustrating its practicality.
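A minimal sketch of the core estimation idea, penalized median regression with treatment-by-biomarker interactions, using scikit-learn's L1-penalized QuantileRegressor on synthetic data; the paper's full threshold-search and longitudinal structure are not reproduced here:

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(3)
n = 400
biomarkers = rng.normal(size=(n, 5))
treat = rng.integers(0, 2, size=n).astype(float)
# Subgroup defined by a linear combination of biomarkers crossing a threshold;
# treatment only helps inside the subgroup (toy data-generating process).
subgroup = (biomarkers @ np.array([1.0, 0.5, 0, 0, 0]) > 0).astype(float)
y = 1 + 2 * treat * subgroup + rng.standard_t(df=3, size=n)   # heavy tails

# Median regression with an L1 penalty mirrors the combination of quantile
# loss and sparsity-inducing variable selection described above.
X = np.column_stack([treat, biomarkers, treat[:, None] * biomarkers])
qr = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs").fit(X, y)
print(np.round(qr.coef_, 2))   # nonzero interactions flag subgroup heterogeneity
```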
Wide-band oscillations have become a significant issue limiting the development of wind power. Both large-signal and small-signal analyses require extensive model derivation. Moreover, the large number and high order of wind turbines have driven the development of simplified models, whose applicability remains controversial. In this paper, a wide-band oscillation analysis method based on the average-value model (AVM) is proposed for wind farms (WFs). A novel linearization analysis framework is developed, leveraging the continuous-time characteristics of the AVM and MATLAB/Simulink's built-in linearization tools. This significantly reduces modeling complexity and computational costs while maintaining model fidelity. Additionally, an object-based initial value estimation method for state variables is introduced, which, when combined with steady-state point-solving tools, greatly reduces the computational effort required for equilibrium point solving in batch linearization analysis. The proposed method is validated in both doubly fed induction generator (DFIG)-based and permanent magnet synchronous generator (PMSG)-based WFs. Furthermore, a comprehensive analysis is conducted for the first time to examine the impact of the machine-side system on the stability of the non-fully controlled PMSG-based WF.
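The batch-linearization idea can be sketched outside Simulink as well: numerically differentiate the averaged model about an equilibrium and inspect the eigenvalues. The two-state system below is a toy stand-in, not a DFIG or PMSG model:

```python
import numpy as np

def linearize(f, x_eq, u_eq, eps=1e-6):
    """Central-difference linearization of dx/dt = f(x, u) about an
    equilibrium, mirroring what Simulink linearization tools do."""
    n, m = len(x_eq), len(u_eq)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x_eq + dx, u_eq) - f(x_eq - dx, u_eq)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x_eq, u_eq + du) - f(x_eq, u_eq - du)) / (2 * eps)
    return A, B

# Toy averaged converter model (two states, one input) standing in for an AVM:
f = lambda x, u: np.array([-2.0 * x[0] + 50.0 * x[1] + u[0],
                           -50.0 * x[0] - 0.5 * x[1]])
A, _ = linearize(f, x_eq=np.zeros(2), u_eq=np.zeros(1))
for lam in np.linalg.eigvals(A):
    # Oscillation frequency and damping ratio of each mode:
    print(f"{abs(lam.imag) / (2 * np.pi):6.2f} Hz, zeta={-lam.real / abs(lam):.3f}")
```

Sweeping `x_eq`/`u_eq` over operating points and repeating this call is the batch-linearization loop that the initial-value estimation method described above accelerates.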
基金supported by University Grant Agency of Matej Bel University in Banská Bystrica project number UGA-14-PDS-2025.
文摘It is known that correlation does not imply causality.Some relationships identified in the analysis of data are coincidental or unknown,and some are produced by real-world causality of the situation,which is problematic,since there is a need to differentiate between these two scenarios.Until recently,the proper−semantic−causality of the relationship could have been determined only by human experts from the area of expertise of the studied data.This has changed with the advance of large language models,which are often utilized as surrogates for such human experts,making the process automated and readily available to all data analysts.This motivates the main objective of this work,which is to introduce the design and implementation of a large language model-based semantic causality evaluator based on correlation analysis,together with its visual analysis model called Causal heatmap.After the implementation itself,the model is evaluated from the point of view of the quality of the visual model,from the point of view of the quality of causal evaluation based on large language models,and from the point of view of comparative analysis,while the results reached in the study highlight the usability of large language models in the task and the potential of the proposed approach in the analysis of unknown datasets.The results of the experimental evaluation demonstrate the usefulness of the Causal heatmap method,supported by the evident highlighting of interesting relationships,while suppressing irrelevant ones.
基金funded by the Ministry of Higher Education through Universiti Putra Malaysia(UPM)under Grant FRGS/1/2023/STG07/UPM/02/4.
文摘Detecting geomagnetic anomalies preceding earthquakes is a challenging yet promising area of research that has gained increasing attention in recent years.This study introduces a novel reconstruction-based modeling approach enhanced by negative learning,employing a Bidirectional Long Short-Term Memory(BiLSTM)network explicitly trained to accurately reconstruct non-seismic geomagnetic signals while intentionally amplifying reconstruction errors for seismic signals.By penalizing the model for accurately reconstructing seismic anomalies,the negative learning approach effectively magnifies the differences between normal and anomalous data.This strategic differentiation enhances the sensitivity of the BiLSTM network,enabling improved detection of subtle geomagnetic anomalies that may serve as earthquake precursors.Experimental validation clearly demonstrated statistically significant higher reconstruction errors for seismic signals compared to non-seismic signals,confirmed through the Mann-Whitney U test with a p-value of 0.0035 for Root Mean Square Error(RMSE).These results provide compelling evidence of the enhanced anomaly detection capability achieved through negative learning.Unlike traditional classification-based methods,negative learning explicitly encourages sensitivity to subtle precursor signals embedded within complex geomagnetic data,establishing a robust basis for further development of reliable earthquake prediction methods.
基金supported in part by the National Natural Science Foundation of China under 52125704 and 51937001.
文摘Interaction between the converter and the grid may lead to harmonic oscillations.The impedance-based method is an effective way to deal with the stability issue.In this study,the impedance-based method is used to investigate the small-signal stability of a cascaded 12-pulse line-commutated converter-based high-voltage direct current(LCC-HVDC)transmission system.In the modeling part,the impedance models of the single rectifier and inverter are established respectively with consideration to the effect of frequency coupling,which has improved the accuracy of the models.Based on the models,the AC impedance models of the cascaded LCC-HVDC transmission system are established both on the rectifier and inverter side.In the stability analysis part,the stability of the system is analyzed under different working conditions.The simulation results reveal that the established impedance model can properly represent the stability of this system.The findings of this study can provide a theoretical reference for the stability design and oscillation suppression strategy of LCC-HVDC transmission systems and LCC interconnected systems.
基金supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University(IMSIU)(grant number IMSIU-DDRSP2601).
文摘This paper introduces a novel fractional-order model based on the Caputo-Fabrizio(CF)derivative for analyzing computer virus propagation in networked environments.The model partitions the computer population into four compartments:susceptible,latently infected,breaking-out,and antivirus-capable systems.By employing the CF derivative—which uses a nonsingular exponential kernel—the framework effectively captures memory-dependent and nonlocal characteristics intrinsic to cyber systems,aspects inadequately represented by traditional integer-order models.Under Lipschitz continuity and boundedness assumptions,the existence and uniqueness of solutions are rigorously established via fixed-point theory.We develop a tailored two-step Adams-Bashforth numerical scheme for the CF framework and prove its second-order accuracy.Extensive numerical simulations across various fractional orders reveal that memory effects significantly influence virus transmission and control dynamics;smaller fractional orders produce more pronounced memory effects,delaying both infection spread and antivirus activation.Further theoretical analysis,including Hyers-Ulam stability and sensitivity assessments,reinforces the model’s robustness and identifies key parameters governing virus dynamics.The study also extends the framework to incorporate stochastic effects through a stochastic CF formulation.These results underscore fractional-order modeling as a powerful analytical tool for developing robust and effective cybersecurity strategies.
文摘In this work,a computational modelling and analysis framework is developed to investigate the thermal buckling behavior of doubly-curved composite shells reinforced with graphene-origami(G-Ori)auxetic metamaterials.A semi-analytical formulation based on the First-Order Shear Deformation Theory(FSDT)and the principle of virtual displacements is established,and closed-form solutions are derived via Navier’s method for simply supported boundary conditions.The G-Ori metamaterial reinforcements are treated as programmable constructs whose effective thermo-mechanical properties are obtained via micromechanical homogenization and incorporated into the shell model.A comprehensive parametric study examines the influence of folding geometry,dispersion arrangement,reinforcement weight fraction,curvature parameters,and elastic foundation support on the critical buckling temperature(CBT).The results reveal that,under optimal folding geometry and reinforcement alignment with principal stress trajectories,the CBT can increase by more than 150%.Furthermore,the combined effect of G-Ori reinforcement and elastic foundation substantially enhances thermal buckling resistance.These findings establish design guidelines for architected composite shells in applications such as aerospace thermal skins,morphing structures,and thermally-responsive systems,and illustrate the potential of auxetic graphene metamaterials for multifunctional,lightweight,and thermally robust structural components.
基金funding from the China National Key Research and Development Program(No.2023YFC3603104)the National Natural Science Foundation of China(Nos.82472243 and 82272180)+6 种基金the Fundamental Research Funds for the Central Universities(No.226-2025-00024)the Huadong Medicine Joint Funds of the Zhejiang Provincial Natural Science Foundation of China(No.LHDMD24H150001)the Key Research&Development Project of Zhejiang Province(No.2024C03240)a collaborative scientific project co-established by the Science and Technology Department of the National Administration of Traditional Chinese Medicine and the Zhejiang Provincial Administration of Traditional Chinese Medicine(No.GZY-ZJ-KJ-24082)he General Health Science and Technology Program of Zhejiang Province(No.2024KY1099)the Project of Zhejiang University Longquan Innovation Center(No.ZJDXLQCXZCJBGS2024016)Wu Jieping Medical Foundation Special Research Grant(No.320.6750.2024-23-07).
文摘Objective:Sepsis exhibits remarkable heterogeneity in disease progression trajectories,and accurate identificationof distinct trajectory-based phenotypes is critical for implementing personalized therapeutic strategies and prognostic assessment.However,trajectory clustering analysis of time-series clinical data poses substantial methodological challenges for researchers.This study provides a comprehensive tutorial framework demonstrating six trajectory modeling approaches integrated with proteomic analysis to guide researchers in identifying sepsis subtypes after laparoscopic surgery.Methods:This study employs simulated longitudinal data from 300 septic patients after laparoscopic surgery to demonstrate six trajectory modeling methods(group-based trajectory modeling,latent growth mixture modeling,latent transition analysis,time-varying effect modeling,K-means for longitudinal data,agglomerative hierarchical clustering)for identifying associations between predefinedsequential organ failure assessment trajectories and 25 proteomic biomarkers.Clustering performance was evaluated via multiple metrics,and a biomarker discovery pipeline integrating principal component analysis,random forests,feature selection,and receiver operating characteristic analysis was developed.Results:The six methods demonstrated varying performance in identifying trajectory structures,with each approach exhibiting distinct analytical characteristics.The performance metrics revealed differences across methods,which may inform context-specificmethod selection and interpretation strategies.Conclusion:This study illustrates practical implementations of trajectory modeling approaches under controlled conditions,facilitating informed method selection for clinical researchers.The inclusion of complete R code and integrated proteomics workflows offers a reproducible analytical framework connecting temporal pattern recognition to biomarker discovery.Beyond sepsis,this pipeline-oriented approach may be adapted to diverse clinical scenarios requiring longitudinal disease characterization and precision medicine applications.The comparative analysis reveals that each method has distinct strengths,providing a practical guide for clinical researchers in selecting appropriate methods based on their specificstudy goals and data characteristics.
文摘This study investigates the uncertain dynamic characterization of hybrid composite plates by employing advanced machine-assisted finite element methodologies.Hybrid composites,widely used in aerospace,automotive,and structural applications,often face variability in material properties,geometric configurations,and manufacturing processes,leading to uncertainty in their dynamic response.To address this,three surrogate-based machine learning approaches like radial basis function(RBF),multivariate adaptive regression splines(MARS),and polynomial neural networks(PNN)are integrated with a finite element framework to efficiently capture the stochastic behavior of these plates.The research focuses on predicting the first three natural frequencies under material uncertainties,which are critical to ensuring structural reliability.Monte Carlo simulation(MCS)is used as a benchmark for generating probabilistic datasets,including mean values,standard deviations,and probability density functions.The surrogate models are then trained and validated against these datasets,enabling accurate representation of uncertainty with substantially fewer samples compared to conventionalMCS.Among the methods studied,the RBFmodel demonstrates superior performance,closely approximating MCS results with a reduced sample size,thereby achieving significant computational savings.The proposed framework not only reduces computational time and costs but also maintains high predictive accuracy,making it well-suited for complex engineering systems.Beyond free vibration analysis,the methodology can be extended to more sophisticated scenarios,such as forced vibration,damping effects,and nonlinear structural responses.Overall,this work presents a computationally efficient and robust approach for surrogate-based uncertainty quantification,advancing the analysis and design of hybrid composite structures under uncertainty.
基金supported by the Key Project of Jiangsu Social Science Fund and the Key Project of Jiangsu Research Center for Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era(Grant No.26ZXZA017).
文摘China’s environmental governance strategy provides a distinctive pathway for integrating sustainable development into national policy.Understanding its policy trajectory is essential for assessing China’s contribution to global sustainable development and the United Nations Sustainable Development Goals(SDGs).This study constructs a comprehensive database of 425 national environmental governance policy documents issued between 1978 and 2022 and applies Latent Dirichlet Allocation(LDA)modeling to examine the evolution of policy themes and discourse.The results show that China’s environmental governance has undergone four stages-initial exploration,detailed development,transformative leap,and diverse prosperity-reflecting a progressive shift toward more integrated and coordinated governance.Policy priorities have evolved from a primary focus on pollution control and energy transition to an emphasis on institutional construction and organizational reform,thereby strengthening alignment with the SDGs.This transformation is characterized by recurring developmental themes and increasingly preventive,forward-looking,and system-oriented governance approaches.Moreover,the co-evolution of policy concepts and implementation has driven a transition from localized,end-of-pipe responses to comprehensive governance frameworks,alongside a shift from normative guidance towards effectiveness-oriented policy design.By employing a data-driven text analysis approach,this study offers a systematic framework for tracing long-term policy evolution and assessing its implications for sustainable development.
基金supported by the National Natural Science Foundation of China(Grant No.41961134032).
文摘The probabilistic stability evolution analysis of reservoir bank slopes is a crucial aspect of risk assessment,with core challenges including the consideration of deformation mechanisms and accurate determination of mechanical parameters.In this study,a novel time-varying reliability analysis framework based on sequential Bayesian updating of mechanical parameters is proposed.The inverse parameters account for damage time-dependent behavior,incorporating water effect and a strain-driven softening-hardening process that depends on sliding states.The likelihood function is enhanced to simultaneously consider observation error,surrogate model prediction error,and model structural error,with the introduction of physical penalty.Exploration of the high-dimensional parameter space is achieved via the Hamiltonian Monte Carlo(HMC)method and the physics knowledge-based time-dependent deformation surrogate model.The time-varying reliability analysis of the slope is performed using the multi-grid method.Taking a reservoir bank slope as a case study,the sequential updating of 12 mechanical parameters is conducted based on deformation time series from 16 monitoring points,thereby validating the proposed framework.The results indicate that the proposed framework effectively captures the posterior distribution of mechanical parameters,with the case slope remaining in a critically stable state after overall sliding,showing a high failure probability.Introducing model structural error can reduce parameter compensation,and a reasonable sequential updating step size can improve inversion accuracy.
基金Grants of National Natural Science Foundation of China(82274685).
文摘Objective To systematically characterize the developmental trajectory and interdisciplinary integration of intelligent diagnosis in traditional Chinese medicine(TCM)through quantitative topic evolution analysis,we addressed the fragmentation of existing research and clarified the long-term research structure and evolutionary patterns of the field.Methods A topic evolution analysis was performed on Chinese-language literature pertaining to intelligent diagnosis in TCM.Publications were retrieved from the China National Knowledge Infrastructure(CNKI),Wanfang Data,and China Science and Technology Journal Database(VIP),covering the period from database inception to July 3,2025.A hybrid segmentation approach,based on cumulative publication growth trends and inflection point detection,was applied to divide the research timeline into distinct stages.Subsequently,the latent Dirichlet allocation(LDA)model was used to extract research topics,followed by alignment and evolutionary analysis of topics across different stages.Results A total of 3919 publications published between 2003 and 2025 were included,and the research trajectory was divided into five stages based on data-driven breakpoint detection.The field exhibited a clear evolutionary shift from early rule-based systems and tonguepulse image and signal analysis(2006–2010),to machine-learning-based syndrome and prescription modeling(2011–2015),followed by deep-learning-driven pattern recognition and formula association(2016–2020).Since 2021,research has increasingly emphasized knowledge-graph construction,multimodal integration,and intelligent clinical decision-support systems,with recent studies(2024–2025)showing the emergence of large language models and agent-based diagnostic frameworks.Topic evolution analysis further revealed sustained cross-stage continuity in syndrome modeling and prescription association analysis,alongside the progressive consolidation of integrated intelligent diagnostic platforms.Conclusion By identifying key technological transitions and persistent core research themes,our findings offer a structured reference framework for the design of intelligent diagnostic systems,the construction of knowledge-driven clinical decision-support tools,and the alignment of AI models with TCM diagnostic logic.Importantly,the stage-based evolutionary insights derived from this analysis can inform future methodological choices,improve model interpretability and clinical applicability,and support the translation of intelligent TCM diagnosis from experimental research to real-world clinical practice.
基金Heilongjiang Provincial Natural Science Foundation of China (LH2021F009)。
文摘Anti-jamming performance evaluation has recently received significant attention. For Link-16, the anti-jamming performance evaluation and selection of the optimal anti-jamming technologies are urgent problems to be solved. A comprehensive evaluation method is proposed, which combines grey relational analysis (GRA) and cloud model, to evaluate the anti-jamming performances of Link-16. Firstly, on the basis of establishing the anti-jamming performance evaluation indicator system of Link-16, the linear combination of analytic hierarchy process(AHP) and entropy weight method (EWM) are used to calculate the combined weight. Secondly, the qualitative and quantitative concept transformation model, i.e., the cloud model, is introduced to evaluate the anti-jamming abilities of Link-16 under each jamming scheme. In addition, GRA calculates the correlation degree between evaluation indicators and the anti-jamming performance of Link-16, and assesses the best anti-jamming technology. Finally, simulation results prove that the proposed evaluation model can achieve the objective of feasible and practical evaluation, which opens up a novel way for the research of anti-jamming performance evaluations of Link-16.
基金the financial support from the Fujian Science Foundation for Outstanding Youth(2023J06039)the National Natural Science Foundation of China(Grant No.41977259,U2005205,41972268)the Independent Research Project of Technology Innovation Center for Monitoring and Restoration Engineering of Ecological Fragile Zone in Southeast China(KY-090000-04-2022-019)。
Abstract: Shotcrete is one of the common solutions for shallow sliding. It works by forming a high-strength protective layer that cements the loose soil particles on the slope surface to prevent shallow sliding. However, the solidification time of conventional cement paste is long when shotcrete is used to treat cohesionless soil landslides. The idea of reinforcing slopes with polyurethane-solidified soil (i.e., a mixture of polyurethane and sand) is therefore proposed. Model tests and finite element analysis were carried out to study the effectiveness of the proposed method for the emergency treatment of cohesionless soil landslides. Surcharge loading on the crest of the slope was applied step by step until a landslide was triggered, so as to test and compare the stability and bearing capacity of slope models under different conditions. The simulated slope displacements were relatively close to the measured results, and the simulated slope deformation characteristics were in good agreement with the observed phenomena, which verifies the accuracy of the numerical method. Under surcharge loading on the crest of the slope, the unreinforced slope slid when the loading exceeded 30 kPa, presenting a failure mode of local instability and collapse at the shallow layer of the slope top. The reinforced slope remained stable even when the surcharge loading reached 48 kPa, and its displacement was reduced by more than 95%. Overall, this study verifies the effectiveness of polyurethane in the emergency treatment of cohesionless soil landslides, an approach that should have broad application prospects in the field of geological disasters concerning the safety of people's lives.
Funding: Supported by the National Key R&D Program of China (No. 2021YFB0301200) and the National Natural Science Foundation of China (No. 62025208).
Abstract: Large Language Models (LLMs) have achieved significant breakthroughs in Natural Language Processing (NLP), driven by the pre-training and fine-tuning paradigm. While this approach allows models to specialize in specific tasks at reduced training cost, the substantial memory requirements during fine-tuning present a barrier to broader deployment. Parameter-Efficient Fine-Tuning (PEFT) techniques, such as Low-Rank Adaptation (LoRA), and parameter quantization methods have emerged as solutions that optimize memory usage and computational efficiency. Among these, QLoRA, which combines PEFT and quantization, has demonstrated notable success in reducing memory footprints during fine-tuning, prompting the development of various QLoRA variants. Despite these advancements, the quantitative impact of key variables on the fine-tuning performance of quantized LLMs remains underexplored. This study presents a comprehensive analysis of these key variables, focusing on their influence across different layer types and depths within LLM architectures. Our investigation uncovers several critical findings: (1) larger layers, such as MLP layers, can maintain performance despite reductions in adapter rank, while smaller layers, like self-attention layers, are more sensitive to such changes; (2) the effectiveness of balancing factors depends more on their specific values than on layer type or depth; (3) in quantization-aware fine-tuning, larger layers can effectively utilize smaller adapters, whereas smaller layers struggle to do so. These insights suggest that layer type is a more significant determinant of fine-tuning success than layer depth when optimizing quantized LLMs. Moreover, for the same reduction in trainable parameters, shrinking the adapter of a larger layer is more effective at preserving fine-tuning accuracy than shrinking that of a smaller one. This study provides valuable guidance for more efficient fine-tuning strategies and opens avenues for further research into optimizing LLM fine-tuning in resource-constrained environments.
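To make the layer-size argument concrete, the sketch below counts trainable LoRA parameters at several ranks for two projection shapes loosely modeled on a 7B-class transformer block; the dimensions are assumptions for illustration, not the paper's experimental configuration.

```python
# Minimal sketch (illustrative shapes, not the paper's setup): trainable-
# parameter counts for LoRA adapters of rank r, showing why a large MLP
# projection tolerates a smaller rank than a self-attention projection.
def lora_params(d_in: int, d_out: int, r: int) -> int:
    # LoRA learns low-rank factors A (d_in x r) and B (r x d_out); the
    # original weight stays frozen.
    return d_in * r + r * d_out

# Hypothetical dimensions loosely modeled on a 7B-class transformer block.
attn_shape = (4096, 4096)   # self-attention query projection
mlp_shape = (4096, 11008)   # MLP up-projection

for r in (64, 16, 4):
    a = lora_params(*attn_shape, r)
    m = lora_params(*mlp_shape, r)
    print(f"rank {r:>2}: attn adapter {a:,} params, mlp adapter {m:,} params")
```

The ratio between the two adapter sizes is fixed by the layer shapes, so cutting the rank of the larger MLP adapter frees far more parameters per unit of rank, which is consistent with the paper's finding that larger layers tolerate smaller adapters.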
Abstract: DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge it currently faces is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. For data selection, 1,176 genes from a white mouse gene expression dataset under two experimental conditions were chosen, with pneumococcal infection and no infection as the two conditions, and a mixed-effects model was constructed. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed, with GSEA used to biologically validate the preliminary results. The final analysis covers 20 groups of gene expression data from pneumococcal infection and categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
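As an illustration of the modeling step, here is a minimal sketch of fitting a mixed-effects model with statsmodels, using a fixed infection effect and a random intercept per array; the data layout, column names, and synthetic values are hypothetical, not the paper's dataset.

```python
# Minimal sketch (hypothetical layout, not the paper's pipeline): fit a
# mixed-effects model to long-format expression data with a fixed infection
# effect and a random intercept per chip/array, via statsmodels' MixedLM.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_arrays = 8
df = pd.DataFrame({
    "expr": rng.normal(size=n_arrays * 3),                   # log-expression
    "infected": np.tile([0, 1], n_arrays * 3 // 2),          # condition flag
    "array": np.repeat(np.arange(n_arrays), 3).astype(str),  # chip/array ID
})

# The random intercept per array absorbs chip-level technical variation, so
# the fixed 'infected' coefficient reflects the biological condition effect.
model = smf.mixedlm("expr ~ infected", data=df, groups=df["array"])
result = model.fit()
print(result.summary())
```

In a microarray analysis such a model would be fitted gene by gene, with the resulting statistics then fed into permutation testing and GSEA as the abstract describes.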
Abstract: With the rapid development of generative artificial intelligence technologies, represented by large language models, university-level computer science education is undergoing a critical transition from knowledge-based instruction to competency-oriented teaching. A postgraduate student competency evaluation model can serve as a framework to organize and guide both teaching and research activities at the postgraduate level, and a number of relevant research efforts have already been conducted in this area. Graduate education plays a vital role not only as a continuation and enhancement of undergraduate education but also as essential preparation for future research endeavors. Analyzing the acceptance of a competency evaluation model means assessing how various stakeholders perceive the importance of the different components within the model. Investigating the degree of acceptance among diverse groups, such as current undergraduate students, current postgraduate students, graduates with less than three years of work experience, and those with more than three years of work experience, can offer valuable insights for improving and optimizing postgraduate education and training practices.
Funding: Supported by the "Regional Innovation System & Education (RISE)" program through the Seoul RISE Center, funded by the Ministry of Education (MOE) and the Seoul Metropolitan Government (2025-RISE-01-018-05), and supported by Quad Miners Corp.
Abstract: The dynamic, heterogeneous nature of Edge computing in the Internet of Things (Edge-IoT) and Industrial IoT (IIoT) networks brings unique and evolving cybersecurity challenges. This study maps cyber threats in Edge-IoT/IIoT environments to MITRE's Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) framework and introduces a lightweight, data-driven scoring model that enables rapid identification and prioritization of attacks. Inspired by the Factor Analysis of Information Risk model, the proposed scoring model integrates four key metrics: Common Vulnerability Scoring System (CVSS)-based severity scoring, Cyber Kill Chain-based difficulty estimation, deep neural network-driven detection scoring, and frequency analysis based on dataset prevalence. By aggregating these indicators, the model generates comprehensive risk profiles, facilitating actionable prioritization of threats. The robustness and stability of the scoring model are validated through non-parametric correlation analysis using Spearman's and Kendall's rank correlation coefficients, demonstrating consistent performance across diverse scenarios. The approach culminates in a prioritized attack ranking that provides actionable guidance for risk mitigation and resource allocation in Edge-IoT/IIoT security operations. By leveraging real-world data to align MITRE ATT&CK techniques with CVSS metrics, the framework offers a standardized and practically applicable solution for consistent threat assessment in operational settings. The proposed lightweight scoring model delivers rapid and reliable results under dynamic cyber conditions, facilitating timely identification of attack scenarios and prioritization of response strategies. The systematic integration of established taxonomies with data-driven indicators strengthens practical risk management and supports strategic planning in next-generation IoT deployments. Ultimately, this work advances adaptive threat modeling for Edge/IIoT ecosystems and establishes a robust foundation for evidence-based prioritization in emerging cyber-physical infrastructures.
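The aggregation and validation steps might look like the following minimal sketch, which combines four normalized indicators under two hypothetical weightings and checks ranking stability with SciPy's Spearman and Kendall statistics; all values and weights are illustrative, not the study's.

```python
# Minimal sketch (hypothetical weights/data, not the paper's model): combine
# four normalized threat indicators into a composite score, then check that
# rankings from two weighting schemes agree via rank correlation.
import numpy as np
from scipy.stats import spearmanr, kendalltau

# Rows: ATT&CK techniques; columns: CVSS severity, kill-chain difficulty
# (inverted so higher = easier for the attacker), detection score (inverted
# so higher = harder to detect), and dataset frequency.
indicators = np.array([
    [0.9, 0.6, 0.7, 0.8],
    [0.4, 0.8, 0.5, 0.3],
    [0.7, 0.5, 0.9, 0.6],
    [0.2, 0.3, 0.4, 0.9],
])

w_base = np.array([0.4, 0.2, 0.2, 0.2])    # baseline weighting
w_alt = np.array([0.3, 0.25, 0.25, 0.2])   # perturbed weighting

score_base = indicators @ w_base
score_alt = indicators @ w_alt

rho, _ = spearmanr(score_base, score_alt)
tau, _ = kendalltau(score_base, score_alt)
print(f"Spearman rho={rho:.3f}, Kendall tau={tau:.3f}")  # near 1 => stable ranking
```

High rank correlation between the perturbed weightings is the kind of evidence the abstract cites for the robustness of the prioritization.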
Funding: Supported by the Health and Medical Research Fund, Hong Kong (11220386, 12230246).
Abstract: Background: With the rapid development of artificial intelligence (AI), large language models (LLMs) have emerged as a potent tool for invigorating ophthalmology across clinical, educational, and research fields, and their accuracy and reliability have undergone testing. This bibliometric analysis aims to provide an overview of research on LLMs in ophthalmology from both thematic and geographical perspectives. Methods: All existing and highly cited LLM-related ophthalmology research papers published in English up to April 24, 2025 were sourced from Scopus, PubMed, and Web of Science. The characteristics of these publications, including publication output, authors, journals, countries, institutions, citations, and research domains, were analyzed using Biblioshiny and VOSviewer software. Results: A total of 277 articles from 1,459 authors and 89 journals were included in this study. Although relevant publications began to appear in 2019, there was a significant increase starting from 2023. He M and Shi D are the most prolific authors, while Investigative Ophthalmology & Visual Science stands out as the most prominent journal. Most of the top-publishing countries are high-income economies, with the USA taking the lead, and the University of California is the leading institution. VOSviewer identified 5 clusters in the keyword co-occurrence analysis, indicating that current research focuses on the clinical applications of LLMs, particularly in diagnosis and patient education. Conclusions: While LLMs have demonstrated effectiveness in retaining knowledge, their accuracy in image-based diagnosis remains limited; future research should therefore investigate fine-tuning strategies and domain-specific adaptations to close this gap. Although research on the applications of LLMs in ophthalmology is still in its early stages, it holds significant potential for advancing the field.
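As context for the keyword co-occurrence analysis mentioned above, here is a toy sketch of the pairwise counting that underlies VOSviewer-style cluster maps; the keyword lists are invented for illustration and are not drawn from the study's corpus.

```python
# Minimal sketch (toy data, not the study's corpus): build the keyword
# co-occurrence counts that underlie VOSviewer-style cluster maps.
from itertools import combinations
from collections import Counter

# Hypothetical author-keyword lists, one per publication.
papers = [
    ["large language models", "ophthalmology", "diagnosis"],
    ["large language models", "patient education"],
    ["ophthalmology", "diagnosis", "patient education"],
]

cooc = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1  # each pair co-occurring in a paper adds one link

for (a, b), n in cooc.most_common(3):
    print(f"{a} <-> {b}: {n}")
```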
Abstract: Sentiment analysis, a cornerstone of natural language processing, has witnessed remarkable advancements driven by deep learning models, which have demonstrated impressive accuracy in discerning sentiment from text across various domains. However, deploying such models in resource-constrained environments, where computing resources, memory, and energy availability are restricted, presents a unique set of challenges that require innovative solutions. To empower sentiment analysis in these settings, we address this need by leveraging lightweight pre-trained models. These models, derived from popular architectures such as DistilBERT, MobileBERT, ALBERT, TinyBERT, ELECTRA, and SqueezeBERT, offer a promising answer to the resource limitations imposed by such environments. By distilling the knowledge of larger models into smaller ones and employing various optimization techniques, these lightweight models aim to strike a balance between performance and resource efficiency. This paper explores the performance of multiple lightweight pre-trained models on sentiment analysis tasks specific to such environments and provides insights into their viability for practical deployment.
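For a concrete sense of the deployment scenario, the sketch below runs one of the named lightweight architectures through the Hugging Face transformers pipeline; it assumes the transformers package and the publicly hosted distilbert-base-uncased-finetuned-sst-2-english checkpoint, and is not the paper's evaluation harness.

```python
# Minimal sketch (assumes the 'transformers' package and a publicly hosted
# DistilBERT sentiment checkpoint): the kind of lightweight inference
# evaluated for resource-constrained deployment.
from transformers import pipeline

# A widely used distilled sentiment model; any of the other candidate
# architectures (MobileBERT, ALBERT, TinyBERT, ...) could be swapped in.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The battery life on this device is outstanding."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```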
Funding: Supported by the Natural Science Foundation of Fujian Province (2022J011177, 2024J01903) and the Key Project of the Fujian Provincial Education Department (JZ230054).
Abstract: In clinical research, subgroup analysis can help identify patient groups that respond better or worse to specific treatments, improving therapeutic efficacy and safety, and it is of great significance in precision medicine. This article considers subgroup analysis methods for longitudinal data containing multiple covariates and biomarkers. We define subgroups based on whether a linear combination of these biomarkers exceeds a predetermined threshold, and assess the heterogeneity of treatment effects across subgroups through the interaction between subgroup membership and exposure variables. Quantile regression is used to better characterize the global distribution of the response variable, and sparsity penalties are imposed to achieve variable selection among covariates and biomarkers. The effectiveness of the proposed methodology for both variable selection and parameter estimation is verified through simulation studies. Finally, we demonstrate the application of the method by analyzing data from the PA.3 trial, further illustrating its practicality.
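As an illustration of penalized quantile regression for biomarker selection, here is a minimal sketch using scikit-learn's QuantileRegressor on synthetic data; it demonstrates only the sparsity mechanism on cross-sectional data and is not the paper's longitudinal estimator.

```python
# Minimal sketch (synthetic data, not the paper's method): L1-penalized
# median regression, illustrating how a sparsity penalty can select among
# many candidate biomarkers.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))
# Only the first two biomarkers truly matter in this toy model; the
# heavy-tailed noise is the setting where quantile loss shines.
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_t(df=3, size=n)

model = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs")
model.fit(X, y)

# The L1 penalty should shrink the irrelevant coefficients toward zero.
print(np.round(model.coef_, 2))
```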
Funding: Supported by the National Natural Science Foundation of China under Grant 52277072.
Abstract: Wide-band oscillations have become a significant issue limiting the development of wind power. Both large-signal and small-signal analyses require extensive model derivation; moreover, the large number and high order of wind turbines have driven the development of simplified models, whose applicability remains controversial. In this paper, a wide-band oscillation analysis method based on the average-value model (AVM) is proposed for wind farms (WFs). A novel linearization analysis framework is developed, leveraging the continuous-time characteristics of the AVM and MATLAB/Simulink's built-in linearization tools. This significantly reduces modeling complexity and computational cost while maintaining model fidelity. Additionally, an object-based initial-value estimation method for state variables is introduced, which, when combined with steady-state point-solving tools, greatly reduces the computational effort required for equilibrium point solving in batch linearization analysis. The proposed method is validated on both doubly fed induction generator (DFIG)-based and permanent magnet synchronous generator (PMSG)-based WFs. Furthermore, a comprehensive analysis is conducted for the first time to examine the impact of the machine-side system on the stability of the non-fully-controlled PMSG-based WF.
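Although the paper works in MATLAB/Simulink, the linearization workflow it describes can be sketched generically: solve for an equilibrium of a nonlinear state-space model, form the Jacobian numerically, and inspect eigenvalues for small-signal stability. The toy dynamics below are an assumption standing in for an averaged converter model, not the paper's AVM.

```python
# Minimal sketch (generic, not the paper's AVM): small-signal linearization
# of dx/dt = f(x, u) by numerical Jacobian at an equilibrium point, followed
# by an eigenvalue-based stability check.
import numpy as np
from scipy.optimize import fsolve

def f(x, u):
    # Toy 2-state nonlinear dynamics standing in for an averaged model.
    x1, x2 = x
    return np.array([-x1 + x2**2 + u, -2.0 * x2 + 0.5 * x1])

u0 = 0.1
x0 = fsolve(lambda x: f(x, u0), x0=np.zeros(2))  # solve for the equilibrium

# Central-difference Jacobian A = df/dx evaluated at (x0, u0).
eps = 1e-6
A = np.column_stack([
    (f(x0 + eps * e, u0) - f(x0 - eps * e, u0)) / (2 * eps)
    for e in np.eye(2)
])

eigs = np.linalg.eigvals(A)
print("eigenvalues:", eigs)            # all real parts < 0 => locally stable
print("stable:", np.all(eigs.real < 0))
```

The equilibrium-solving step is exactly where the paper's initial-value estimation method pays off: a good starting guess lets batch linearization over many operating points converge with far less effort.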