Funding: Financial support from the National Natural Science Foundation of China (No. 52270019, U24A20188, and 52370003).
Abstract: Transparent exopolymer particles (TEP) are abundant gel-like colloids pivotal in marine carbon cycling and water treatment processes. Their environmental roles are governed by hierarchical architectures, yet in-situ structural characterization remains challenging due to transparency, fragility, and polymorphism. To address this, we developed an integrated image analysis suite combining advanced processing with statistical modeling, enabling simultaneous 2D/3D quantification of TEP morphology and intra-particle heterogeneity. This framework generates multidimensional descriptors (e.g., fractal dimensions, density gradients) for individual aggregates and assemblies. Applied to cation-mediated aggregation, it revealed divergent bridging behaviors: Mg^(2+) induced moderate size changes (2.99-4.08 μm), while Ca^(2+) drove exponential growth (1.81-187.76 μm) as ionic strength increased from 1 to 5 mmol/L. Concurrent form factor reductions (Mg: 0.31 to 0.16; Ca: 0.44 to 0.19) quantitatively distinguish the aggregation pathways. The method deciphers ion-specific assembly mechanisms and resolves subtle colloidal interactions, establishing a paradigm for colloidal system analysis with possible applications extending beyond TEP research to other subjects such as microplastic aggregation.
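The abstract does not state which form-factor definition the suite uses; a common convention for 2D particle images is the circularity 4πA/P², which is near 1 for a compact disc and decreases as aggregates become more branched. The sketch below, assuming a binary particle mask and scikit-image's `regionprops` measurements, illustrates that convention only; names, thresholds, and the synthetic input are illustrative, not the paper's pipeline.

```python
# Hedged sketch: per-particle form factor (circularity) from a binary mask.
# Assumes form factor = 4*pi*area / perimeter**2, which may differ from the
# definition used in the paper; all names and inputs are illustrative.
import numpy as np
from skimage import measure

def form_factors(binary_mask: np.ndarray) -> list[float]:
    """Return one circularity value per connected particle in the mask."""
    labels = measure.label(binary_mask)          # label connected components
    values = []
    for region in measure.regionprops(labels):
        if region.perimeter > 0:
            values.append(4.0 * np.pi * region.area / region.perimeter ** 2)
    return values

# Example with a synthetic blob (a filled disc gives a value near 1).
yy, xx = np.mgrid[:200, :200]
disc = (yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2
print(form_factors(disc))
```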
Abstract: A three-dimensional wind field analysis software based on the Beijing-Gucheng dual-Doppler weather radar system has been built and evaluated using storm flow and hydrometeor fields produced by a numerical cloud model. The effects of observation noise and the spatial distribution of wind field analysis error are also investigated.
Funding: Supported by the National Natural Science Foundation of China (51075147).
Abstract: To study three-dimensional rock deformation under the rolling forces of a disc cutter, a circular-grooving test with a disc cutter rolling on rock was carried out. The rock mechanical behavior under the rolling disc cutter is studied, the mechanical model of the disc cutter rolling around the groove is established, and the theory of single-point and double-angle variables is proposed. Based on this theory, the physics equations and geometric equations of rock mechanical behavior under the disc cutters of a tunnel boring machine (TBM) are studied, and the balance equations of the interactive forces between disc cutter and rock are established. Accordingly, formulas for the normal force, rolling force, and side force of a disc cutter are derived, and their validity is verified by tests. The result is a new method and theory for studying the rock-breaking mechanism of disc cutters.
Abstract: A three-dimensional density field associated with mesoscale unstable waves generated by the 3-D, primitive-equation model (Wang and Ikeda, 1996) is provided to the quasi-geostrophic pressure tendency and ω-equations, and to the (ageostrophic) Q-vector equation. Diagnostic analyses, analogous to the approaches in meteorology (the ω-equation and Q-vector methods), are for the first time developed to examine the mesoscale dynamical processes and mechanisms of the unstable waves propagating in the mid-latitude ocean. The weaknesses and strengths of these two diagnostic approaches are evaluated and compared against the model results. The Q-vector method is then recommended for diagnosing the vertical motion associated with the mesoscale dynamics from a hydrographic CTD (conductivity-temperature-depth) array, while the quasi-geostrophic equations produce some small-scale features (errors) in the diagnosed fields.
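For readers unfamiliar with the diagnostic, the Q-vector form of the quasi-geostrophic ω-equation referred to above is commonly written as the elliptic equation below (constant static stability, β term neglected); this is the standard textbook statement and the paper's exact scaling may differ.

```latex
\left( \sigma \nabla_p^{2} + f_0^{2}\,\frac{\partial^{2}}{\partial p^{2}} \right)\omega
  = -2\,\nabla_p \cdot \mathbf{Q},
\qquad
\mathbf{Q} \equiv -\frac{R}{p}
  \begin{pmatrix}
    \dfrac{\partial \mathbf{v}_g}{\partial x} \cdot \nabla_p T \\[6pt]
    \dfrac{\partial \mathbf{v}_g}{\partial y} \cdot \nabla_p T
  \end{pmatrix}
```

Here σ is the static stability parameter, f_0 the Coriolis parameter, p pressure, v_g the geostrophic wind, and T temperature; upward motion (ω < 0) is diagnosed where the Q-vectors converge (∇·Q < 0).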
Funding: National Science Foundation of China (No. 81301292).
Abstract: Objective: To study the evaluation value of three-dimensional finite element model analysis for bone mineral density (BMD) and bone metabolism activity in patients with osteoporosis. Methods: A total of 218 patients who were diagnosed with osteoporosis in the hospital between February 2014 and January 2017 were collected as the observation group, and 100 healthy volunteers who received physical examination in the hospital during the same period were selected as the normal control group. The femoral head of both groups was analyzed with a three-dimensional finite element model, and femoral head BMD levels and serum bone metabolism index contents were measured. The Pearson test was used to assess the evaluation value of the femoral head three-dimensional finite element model for osteoporosis. Results: The cancellous bone and cortical bone Von Mises stress values of the observation group were lower than those of the normal control group, and the femoral neck BMD value of the observation group was lower than that of the normal control group; the serum bone metabolism index BGP content was lower than that of the normal control group, while NBAP, TRACP-5b, and CTX-1 contents were higher. The Pearson test showed that the cancellous bone and cortical bone Von Mises stress values of patients with osteoporosis were directly correlated with BMD value and bone metabolism index contents. Conclusion: Three-dimensional finite element model analysis of patients with osteoporosis can objectively reflect the femoral head BMD value and bone metabolism activity, and is a reliable way to evaluate the risk of long-term fractures.
Funding: Heilongjiang Provincial Natural Science Foundation of China (LH2021F009).
Abstract: Anti-jamming performance evaluation has recently received significant attention. For Link-16, the anti-jamming performance evaluation and the selection of optimal anti-jamming technologies are urgent problems to be solved. A comprehensive evaluation method is proposed, combining grey relational analysis (GRA) and the cloud model, to evaluate the anti-jamming performance of Link-16. Firstly, on the basis of establishing the anti-jamming performance evaluation indicator system of Link-16, a linear combination of the analytic hierarchy process (AHP) and the entropy weight method (EWM) is used to calculate the combined weight. Secondly, the qualitative-quantitative concept transformation model, i.e., the cloud model, is introduced to evaluate the anti-jamming abilities of Link-16 under each jamming scheme. In addition, GRA calculates the correlation degree between the evaluation indicators and the anti-jamming performance of Link-16, and identifies the best anti-jamming technology. Finally, simulation results show that the proposed evaluation model achieves feasible and practical evaluation, which opens up a novel way for research on anti-jamming performance evaluation of Link-16.
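As a rough illustration of two of the building blocks named above, the sketch below implements the entropy weight method and one grey relational analysis pass over a small decision matrix. It follows the textbook formulations (entropy-based objective weights; grey relational coefficients against an ideal reference with distinguishing coefficient ρ = 0.5) and is not taken from the paper; the matrix values are made up, and the AHP and cloud-model steps are omitted.

```python
# Hedged sketch: entropy weight method (EWM) + grey relational analysis (GRA)
# on a benefit-type decision matrix (rows = candidate schemes, cols = indicators).
# Textbook formulations only; not the paper's exact procedure or data.
import numpy as np

X = np.array([[0.82, 0.55, 0.91],     # illustrative indicator values
              [0.64, 0.72, 0.40],
              [0.90, 0.48, 0.77]])

# --- Entropy weight method --------------------------------------------------
P = X / X.sum(axis=0)                                        # column proportions
E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))    # entropy per column
w = (1 - E) / (1 - E).sum()                                  # objective weights

# --- Grey relational analysis against the ideal (column-wise max) -----------
ref = X.max(axis=0)
delta = np.abs(X - ref)
rho = 0.5                                                    # distinguishing coefficient
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grey_grade = xi @ w                                          # weighted relational grade

print("entropy weights:", np.round(w, 3))
print("grey relational grades:", np.round(grey_grade, 3))
```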
Funding: Financial support from the Fujian Science Foundation for Outstanding Youth (2023J06039), the National Natural Science Foundation of China (Grant Nos. 41977259, U2005205, and 41972268), and the Independent Research Project of the Technology Innovation Center for Monitoring and Restoration Engineering of Ecological Fragile Zone in Southeast China (KY-090000-04-2022-019).
Abstract: Shotcrete is one of the common solutions for shallow sliding. It works by forming a protective layer with high strength and cementing the loose soil particles on the slope surface to prevent shallow sliding. However, the solidification time of conventional cement paste is long when shotcrete is used to treat a cohesionless soil landslide. The idea of reinforcing the slope with polyurethane-solidified soil (i.e., a mixture of polyurethane and sand) was proposed. Model tests and finite element analysis were carried out to study the effectiveness of the proposed new method for the emergency treatment of cohesionless soil landslides. Surcharge loading on the crest of the slope was applied step by step until a landslide was triggered, so as to test and compare the stability and bearing capacity of slope models under different conditions. The simulated slope displacements were relatively close to the measured results, and the simulated slope deformation characteristics were in good agreement with the observed phenomena, which verifies the accuracy of the numerical method. Under surcharge loading on the crest of the slope, the unreinforced slope slid when the surcharge loading exceeded 30 kPa, presenting a failure mode of local instability and collapse at the shallow layer of the slope top. The reinforced slope remained stable even when the surcharge loading reached 48 kPa, and its displacement was reduced by more than 95%. Overall, this study verifies the effectiveness of polyurethane in the emergency treatment of cohesionless soil landslides and should have broad application prospects in the field of geological disasters concerning the safety of people's lives.
Funding: Supported by the National Key R&D Program of China (No. 2021YFB0301200) and the National Natural Science Foundation of China (No. 62025208).
Abstract: Large-scale Language Models (LLMs) have achieved significant breakthroughs in Natural Language Processing (NLP), driven by the pre-training and fine-tuning paradigm. While this approach allows models to specialize in specific tasks with reduced training costs, the substantial memory requirements during fine-tuning present a barrier to broader deployment. Parameter-Efficient Fine-Tuning (PEFT) techniques, such as Low-Rank Adaptation (LoRA), and parameter quantization methods have emerged as solutions to these challenges by optimizing memory usage and computational efficiency. Among these, QLoRA, which combines PEFT and quantization, has demonstrated notable success in reducing memory footprints during fine-tuning, prompting the development of various QLoRA variants. Despite these advancements, the quantitative impact of key variables on the fine-tuning performance of quantized LLMs remains underexplored. This study presents a comprehensive analysis of these key variables, focusing on their influence across different layer types and depths within LLM architectures. Our investigation uncovers several critical findings: (1) larger layers, such as MLP layers, can maintain performance despite reductions in adapter rank, while smaller layers, like self-attention layers, are more sensitive to such changes; (2) the effectiveness of balancing factors depends more on specific values than on layer type or depth; (3) in quantization-aware fine-tuning, larger layers can effectively utilize smaller adapters, whereas smaller layers struggle to do so. These insights suggest that layer type is a more significant determinant of fine-tuning success than layer depth when optimizing quantized LLMs. Moreover, for the same discount of trainable parameters, reducing the trainable parameters in a larger layer is more effective in preserving fine-tuning accuracy than in a smaller one. This study provides valuable guidance for more efficient fine-tuning strategies and opens avenues for further research into optimizing LLM fine-tuning in resource-constrained environments.
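To make the "adapter rank" variable discussed above concrete, the sketch below shows a minimal LoRA-style linear layer in PyTorch: the frozen base weight is augmented with a trainable low-rank update scaled by alpha/r, so the number of trainable parameters grows linearly with the rank r. This is a generic illustration of the technique, not the paper's QLoRA setup (no quantization is shown), and all names and sizes are illustrative.

```python
# Hedged sketch: a minimal LoRA-style linear layer (no quantization).
# y = x W^T + (alpha / r) * x A^T B^T, with W frozen and only A, B trained.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)           # freeze the base weight
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r                          # balancing factor alpha/r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)

layer = LoRALinear(4096, 4096, r=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters at r=8: {trainable}")        # 2 * 4096 * 8 = 65,536
```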
Abstract: DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge currently faced by this technology is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. For data selection, 1,176 genes from a white mouse gene expression dataset under two experimental conditions were chosen, setting up two conditions (pneumococcal infection and no infection) and constructing a mixed-effects model. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests together with GSEA were performed to biologically validate the preliminary results. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
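As a rough sketch of the kind of mixed-effects fit described above (a fixed effect for infection condition and a random effect per gene), the snippet below uses statsmodels' `mixedlm`; the column names, data layout, and simulated values are assumptions for illustration, not the paper's dataset or preprocessing.

```python
# Hedged sketch: mixed-effects model for gene expression, assuming a long-format
# table with columns "expression", "condition" (infected / control), and "gene".
# Illustrative only; column names and data are not from the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
genes = [f"g{i}" for i in range(50)]
df = pd.DataFrame({
    "gene": np.repeat(genes, 8),
    "condition": np.tile(["control"] * 4 + ["infected"] * 4, len(genes)),
})
df["expression"] = rng.normal(0, 1, len(df)) + (df["condition"] == "infected") * 0.5

# Fixed effect: condition; random intercept: gene.
model = smf.mixedlm("expression ~ condition", df, groups=df["gene"])
result = model.fit()
print(result.summary())
```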
Abstract: With the rapid development of generative artificial intelligence technologies, represented by large language models, university-level computer science education is undergoing a critical transition from knowledge-based instruction to competency-oriented teaching. A postgraduate student competency evaluation model can serve as a framework to organize and guide both teaching and research activities at the postgraduate level, and a number of relevant research efforts have already been conducted in this area. Graduate education plays a vital role not only as a continuation and enhancement of undergraduate education but also as essential preparation for future research endeavors. An analysis of the acceptance of competency evaluation models refers to the assessment of how various stakeholders perceive the importance of different components within the model. Investigating the degree of acceptance among diverse groups, such as current undergraduate students, current postgraduate students, graduates with less than three years of work experience, and those with more than three years of work experience, can offer valuable insights for improving and optimizing postgraduate education and training practices.
Funding: Supported by the "Regional Innovation System & Education (RISE)" through the Seoul RISE Center, funded by the Ministry of Education (MOE) and the Seoul Metropolitan Government (2025-RISE-01-018-05), and supported by Quad Miners Corp.
Abstract: The dynamic, heterogeneous nature of Edge computing in the Internet of Things (Edge-IoT) and Industrial IoT (IIoT) networks brings unique and evolving cybersecurity challenges. This study maps cyber threats in Edge-IoT/IIoT environments to the Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) framework by MITRE and introduces a lightweight, data-driven scoring model that enables rapid identification and prioritization of attacks. Inspired by the Factor Analysis of Information Risk model, our proposed scoring model integrates four key metrics: Common Vulnerability Scoring System (CVSS)-based severity scoring, Cyber Kill Chain-based difficulty estimation, deep-neural-network-driven detection scoring, and frequency analysis based on dataset prevalence. By aggregating these indicators, the model generates comprehensive risk profiles, facilitating actionable prioritization of threats. The robustness and stability of the scoring model are validated through non-parametric correlation analysis using Spearman's and Kendall's rank correlation coefficients, demonstrating consistent performance across diverse scenarios. The approach culminates in a prioritized attack ranking that provides actionable guidance for risk mitigation and resource allocation in Edge-IoT/IIoT security operations. By leveraging real-world data to align MITRE ATT&CK techniques with CVSS metrics, the framework offers a standardized and practically applicable solution for consistent threat assessment in operational settings. The proposed lightweight scoring model delivers rapid and reliable results under dynamic cyber conditions, facilitating timely identification of attack scenarios and prioritization of response strategies. Our systematic integration of established taxonomies with data-driven indicators strengthens practical risk management and supports strategic planning in next-generation IoT deployments. Ultimately, this work advances adaptive threat modeling for Edge/IIoT ecosystems and establishes a robust foundation for evidence-based prioritization in emerging cyber-physical infrastructures.
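The abstract describes aggregating four normalized indicators into a single risk score and then checking ranking stability with Spearman's and Kendall's coefficients. The sketch below illustrates that general pattern with a plain weighted sum and `scipy.stats`; the weights, indicator values, and linear aggregation rule are assumptions, not the paper's model.

```python
# Hedged sketch: aggregate four per-technique indicators into a risk score and
# check ranking stability under perturbed weights with Spearman / Kendall.
# Weights, values, and the linear aggregation rule are illustrative only.
import numpy as np
from scipy.stats import spearmanr, kendalltau

# rows = ATT&CK techniques; cols = [CVSS severity, kill-chain difficulty,
#                                   detection score, dataset frequency], all in [0, 1]
indicators = np.array([[0.9, 0.4, 0.7, 0.8],
                       [0.6, 0.8, 0.3, 0.5],
                       [0.8, 0.6, 0.9, 0.2],
                       [0.3, 0.2, 0.5, 0.9]])

w_base = np.array([0.4, 0.2, 0.2, 0.2])
w_alt = np.array([0.3, 0.3, 0.2, 0.2])           # perturbed weighting

score_base = indicators @ w_base
score_alt = indicators @ w_alt

rho, _ = spearmanr(score_base, score_alt)
tau, _ = kendalltau(score_base, score_alt)
print("ranking (base weights):", np.argsort(-score_base))
print(f"Spearman rho = {rho:.2f}, Kendall tau = {tau:.2f}")
```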
Funding: Supported by the Health and Medical Research Fund, Hong Kong (11220386, 12230246).
Abstract: Background: With the rapid development of artificial intelligence (AI), large language models (LLMs) have emerged as a potent tool for invigorating ophthalmology across clinical, educational, and research fields, and their accuracy and reliability have undergone testing. This bibliometric analysis aims to provide an overview of research on LLMs in ophthalmology from both thematic and geographical perspectives. Methods: All existing and highly cited LLM-related ophthalmology research papers published in English up to 24 April 2025 were sourced from Scopus, PubMed, and Web of Science. The characteristics of these publications, including publication output, authors, journals, countries, institutions, citations, and research domains, were analyzed using Biblioshiny and VOSviewer software. Results: A total of 277 articles from 1,459 authors and 89 journals were included in this study. Although relevant publications began to appear in 2019, there was a significant increase starting from 2023. He M and Shi D are the most prolific authors, while Investigative Ophthalmology & Visual Science stands out as the most prominent journal. Most of the top-publishing countries are high-income economies, with the USA taking the lead, and the University of California is the leading institution. VOSviewer identified 5 clusters in the keyword co-occurrence analysis, indicating that current research focuses on the clinical applications of LLMs, particularly in diagnosis and patient education. Conclusions: While LLMs have demonstrated effectiveness in retaining knowledge, their accuracy in image-based diagnosis remains limited; future research should therefore investigate fine-tuning strategies and domain-specific adaptations to close this gap. Although research on the applications of LLMs in ophthalmology is still in its early stages, it holds significant potential for advancing the field.
Abstract: Sentiment analysis, a cornerstone of natural language processing, has witnessed remarkable advancements driven by deep learning models, which have demonstrated impressive accuracy in discerning sentiment from text across various domains. However, the deployment of such models in resource-constrained environments presents a unique set of challenges that require innovative solutions. Resource-constrained environments encompass scenarios where computing resources, memory, and energy availability are restricted. To empower sentiment analysis in resource-constrained environments, we address this crucial need by leveraging lightweight pre-trained models. These models, derived from popular architectures such as DistilBERT, MobileBERT, ALBERT, TinyBERT, ELECTRA, and SqueezeBERT, offer a promising solution to the resource limitations imposed by these environments. By distilling the knowledge from larger models into smaller ones and employing various optimization techniques, these lightweight models aim to strike a balance between performance and resource efficiency. This paper explores the performance of multiple lightweight pre-trained models in sentiment analysis tasks specific to such environments and provides insights into their viability for practical deployment.
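For readers who want to try one of the lightweight architectures mentioned above, the snippet below loads a distilled sentiment classifier through the Hugging Face `transformers` pipeline API; the specific checkpoint name is a publicly available DistilBERT model fine-tuned on SST-2 and is used here only as an illustration, not as the models or data evaluated in the paper.

```python
# Hedged sketch: sentiment inference with a lightweight distilled model via the
# Hugging Face transformers pipeline. Checkpoint choice is illustrative only.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

reviews = [
    "The boarding process was quick and the crew were friendly.",
    "My flight was delayed for five hours with no updates.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:8s} ({result['score']:.3f})  {review}")
```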
Funding: Supported by the Natural Science Foundation of Fujian Province (2022J011177, 2024J01903) and the Key Project of the Fujian Provincial Education Department (JZ230054).
Abstract: In clinical research, subgroup analysis can help identify patient groups that respond better or worse to specific treatments, improving therapeutic effect and safety, and it is of great significance in precision medicine. This article considers subgroup analysis methods for longitudinal data containing multiple covariates and biomarkers. We divide subgroups based on whether a linear combination of these biomarkers exceeds a predetermined threshold, and assess the heterogeneity of treatment effects across subgroups using the interaction between subgroups and exposure variables. Quantile regression is used to better characterize the global distribution of the response variable, and sparsity penalties are imposed to achieve variable selection of covariates and biomarkers. The effectiveness of the proposed methodology for both variable selection and parameter estimation is verified through random simulations. Finally, we demonstrate the application of this method by analyzing data from the PA.3 trial, further illustrating the practicality of the method proposed in this paper.
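A schematic version of the kind of penalized quantile-regression objective the abstract describes, with a threshold-defined subgroup interacting with the exposure, can be written as below; this is a reconstruction for illustration only, and the paper's actual model, notation, and penalties may differ.

```latex
\min_{\beta,\,\gamma,\,\delta}\;
\sum_{i=1}^{n}\sum_{t=1}^{T}
  \rho_\tau\!\Big( y_{it} - \mathbf{x}_{it}^{\top}\beta
    - \delta\, d_{it}\,\mathbb{1}\{\mathbf{z}_{i}^{\top}\gamma > c\} \Big)
  + \lambda_1 \lVert \beta \rVert_1 + \lambda_2 \lVert \gamma \rVert_1,
\qquad
\rho_\tau(u) = u\big(\tau - \mathbb{1}\{u < 0\}\big)
```

Here y_it is the longitudinal response, x_it the covariates, d_it the exposure, z_i the biomarker vector whose linear combination defines the subgroup, δ the subgroup-specific treatment effect, and ρ_τ the quantile check loss at level τ; the L1 penalties perform the variable selection mentioned in the abstract.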
Funding: Supported by the National Natural Science Foundation of China under Grant 52277072.
Abstract: Wide-band oscillations have become a significant issue limiting the development of wind power. Both large-signal and small-signal analyses require extensive model derivation. Moreover, the large number and high order of wind turbines have driven the development of simplified models, whose applicability remains controversial. In this paper, a wide-band oscillation analysis method based on the average-value model (AVM) is proposed for wind farms (WFs). A novel linearization analysis framework is developed, leveraging the continuous-time characteristics of the AVM and MATLAB/Simulink's built-in linearization tools. This significantly reduces modeling complexity and computational costs while maintaining model fidelity. Additionally, an object-based initial value estimation method for state variables is introduced, which, when combined with steady-state point-solving tools, greatly reduces the computational effort required for equilibrium point solving in batch linearization analysis. The proposed method is validated in both doubly fed induction generator (DFIG)-based and permanent magnet synchronous generator (PMSG)-based WFs. Furthermore, a comprehensive analysis is conducted for the first time to examine the impact of the machine-side system on the system stability of a non-fully controlled PMSG-based WF.
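The core of any small-signal oscillation study like the one above is linearizing the nonlinear state equations dx/dt = f(x, u) around an equilibrium and inspecting the eigenvalues of the resulting state matrix. The Python sketch below illustrates that generic step with a finite-difference Jacobian on a toy two-state system; it stands in for, and is not, the MATLAB/Simulink-based workflow described in the paper.

```python
# Hedged sketch: numerical small-signal analysis of dx/dt = f(x, u) at an
# equilibrium point: finite-difference Jacobian, then eigenvalue inspection.
# The toy system and all numbers are illustrative, not a wind-farm model.
import numpy as np

def f(x, u):
    """Toy nonlinear two-state system standing in for a converter model."""
    x1, x2 = x
    return np.array([x2, -2.0 * x1 - 0.3 * x2 - 0.1 * x1 ** 3 + u])

def jacobian(func, x0, u0, eps=1e-6):
    n = len(x0)
    A = np.zeros((n, n))
    f0 = func(x0, u0)
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (func(x0 + dx, u0) - f0) / eps
    return A

x_eq = np.array([0.0, 0.0])        # assumed equilibrium for u = 0
A = jacobian(f, x_eq, 0.0)
eigvals = np.linalg.eigvals(A)
print("state matrix:\n", A)
print("eigenvalues:", eigvals)
print("small-signal stable:", np.all(eigvals.real < 0))
```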
Abstract: In the rapidly evolving landscape of natural language processing (NLP) and sentiment analysis, improving the accuracy and efficiency of sentiment classification models is crucial. This paper investigates the performance of two advanced models, the Large Language Model Meta AI (LLaMA) and the NLP model BERT, in the context of airline review sentiment analysis. Through fine-tuning, domain adaptation, and the application of few-shot learning, the study addresses the subtleties of sentiment expressions in airline-related text data. Employing predictive modeling and comparative analysis, the research evaluates the effectiveness of LLaMA and Bidirectional Encoder Representations from Transformers (BERT) in capturing sentiment intricacies. Fine-tuning, including domain adaptation, enhances the models' performance in sentiment classification tasks. Additionally, the study explores the potential of few-shot learning to improve model generalization using minimal annotated data for targeted sentiment analysis. By conducting experiments on a diverse airline review dataset, the research quantifies the impact of fine-tuning, domain adaptation, and few-shot learning on model performance, providing valuable insights for industries aiming to predict recommendations and enhance customer satisfaction through a deeper understanding of sentiment in user-generated content (UGC). This research contributes to refining sentiment analysis models, ultimately fostering improved customer satisfaction in the airline industry.
Funding: Supported by the project of the China Geological Survey (No. DD20220954), the Open Funding Project of the Key Laboratory of Groundwater Sciences and Engineering, Ministry of Natural Resources (No. SK202301-4), the Open Foundation of the Key Laboratory of Coupling Process and Effect of Natural Resources Elements (No. 2022KFKTC009), and the Yanzhao Shanshui Science and Innovation Fund of the Langfang Integrated Natural Resources Survey Center, China Geological Survey (No. YZSSJJ202401-001).
Abstract: Pingquan City, the origin of five rivers, serves as the core water conservation zone for the Beijing-Tianjin-Hebei region and exemplifies the characteristics of small watersheds in hilly areas. In recent years, excessive mining and intensified human activities have severely disrupted the local ecosystem, creating an urgent need for ecological vulnerability assessment to enhance water conservation functions. This study employed the sensitivity-resilience-pressure model, integrating various data sources, including regional background, hydro-meteorological data, field investigations, remote sensing analysis, and socio-economic data. The weights of the model indices were determined using an entropy weighting model that combines principal component analysis and the analytic hierarchy process. Using the ArcGIS platform, the spatial distribution and driving forces of ecological vulnerability in 2020 were analyzed, providing valuable insights for regional ecological restoration. The results indicated that the overall Ecological Vulnerability Index (EVI) was 0.389, signifying moderate ecological vulnerability, with significant variation between watersheds. The Daling River Basin had a high EVI, with ecological vulnerability primarily at levels IV and V, indicating high ecological pressure, whereas the Laoniu River Basin had a low EVI, reflecting minimal ecological pressure. Soil type was identified as the primary driving factor, followed by elevation, temperature, and soil erosion as secondary factors. It is recommended to focus on key regions and critical factors while conducting comprehensive monitoring and assessment to ensure the long-term success of ecological management efforts.
Abstract: We analyzed accident factors in a 2020 ship collision case that occurred off Kii Oshima Island using the SHELL model and examined corresponding collision prevention measures. The SHELL model is a framework for identifying accident factors related to human abilities and characteristics, hardware, software, and the environment. Beyond assessing the accident factors in each element, we also examined the interrelationship between humans and each element. This study highlights the importance of (1) training to enhance situational awareness, (2) improving decision-making skills, and (3) establishing structured decision-making procedures to prevent maritime collision accidents. Additionally, we considered safety measures through (4) hardware enhancements and (5) environmental measures. Furthermore, implementing measures grounded in (6) predictions is deemed effective for accident prevention. This study identified accident factors through prediction alongside the SHELL model analysis and proposed countermeasures based on the findings. By applying these predictions, more countermeasures can be derived, which, when combined strategically, can significantly aid in preventing maritime collision accidents.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 12475003 and 11705284) and by the Natural Science Foundation of Beijing Municipality (Grant Nos. 1232022 and 1212007).
Abstract: In this paper, the N-soliton solutions for the massive Thirring model (MTM) in laboratory coordinates are analyzed via the Riemann-Hilbert (RH) approach. The direct scattering, including the analyticity, symmetries, and asymptotic behaviors of the Jost solutions as |λ| → ∞ and λ → 0, is given. Considering that the scattering coefficients have simple zeros, the matrix RH problem, reconstruction formulas, and corresponding trace formulas are also derived. Further, the N-soliton solutions in the reflectionless case are obtained explicitly in the form of determinants. The propagation characteristics of one-soliton solutions and interaction properties of two-soliton solutions are discussed. In particular, the asymptotic expressions of two-soliton solutions as |t| → ∞ are obtained, which show that the velocities and amplitudes of the asymptotic solitons do not change before and after interaction except for position shifts. In addition, three types of bound states for two-soliton solutions are presented under certain parametric conditions.
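For context, the massive Thirring model in laboratory coordinates is often written in the normalization below for the two field components u and v; the paper may use a different scaling or sign convention, so this is only a reminder of the general form of the coupled system being solved.

```latex
\begin{aligned}
  \mathrm{i}\,(u_t + u_x) + v + |v|^{2} u &= 0,\\
  \mathrm{i}\,(v_t - v_x) + u + |u|^{2} v &= 0.
\end{aligned}
```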
Funding: Supported by the Basic Product Innovation Plan for Vehicle Power Scientific Research Project (Grant No. JCCPCX201704).
Abstract: This research presents an advanced study on the modeling and stability analysis of electro-hydraulic control modules used in intelligent chassis systems. First, a comprehensive nonlinear mathematical model of the electro-hydraulic power-shift system is developed, incorporating pipeline characteristics through impedance analysis and examining coupling effects between the pilot solenoid valve, main valve, and pipeline. The model's accuracy is then validated through experimental testing, demonstrating high precision and minimal model error. A comparative analysis between simulation data (both with and without pipeline characteristics) and experimental results reveals that the model considering pipeline parameters aligns more closely with the experimental data, highlighting its superior accuracy. The research further explores the influence of key factors on system stability, including the damping coefficient, feedback cavity orifice diameter, spring stiffness, pipeline length, and pipeline diameter. Significant findings include the critical impact of the damping coefficient, orifice diameter, and pipeline length on stability, while spring stiffness has a minimal effect. These findings provide valuable insights for optimizing electro-hydraulic control modules in intelligent chassis systems, with practical implications for automotive and construction machinery applications.