Active inflammation in “inactive” progressive multiple sclerosis: Traditionally, the distinction between relapsing-remitting multiple sclerosis and progressive multiple sclerosis (PMS) has been framed as an inflammatory versus degenerative dichotomy. This was based on a broad misconception regarding essentially all neurodegenerative conditions, depicting the degenerative process as passive and immune-independent, occurring as a late byproduct of active inflammation in the central nervous system (CNS), which is (solely) systemically driven.
Mitigating vortex-induced vibrations (VIV) in flexible risers represents a critical concern in offshore oil and gas production, considering its potential impact on operational safety and efficiency. The accurate prediction of displacement and position of VIV in flexible risers remains challenging under actual marine conditions. This study presents a data-driven model for riser displacement prediction that corresponds to field conditions. Experimental data analysis reveals that the XGBoost algorithm predicts the maximum displacement and position with superior accuracy compared with support vector regression (SVR), considering both computational efficiency and precision. Platform displacement in the Y-direction demonstrates a significant positive correlation with both axial depth and maximum displacement magnitude. The fourth point displacement exhibits the highest contribution to model prediction outcomes, showing a positive influence on maximum displacement while negatively affecting the axial depth of maximum displacement. Platform displacement in the X- and Y-directions exhibits competitive effects on both the riser's maximum displacement and its axial depth. Through the implementation of the XGBoost algorithm and SHapley Additive exPlanations (SHAP) analysis, the model effectively estimates the riser's maximum displacement and its precise location. This data-driven approach achieves predictions using minimal, readily available data points, enhancing its practical field applications and demonstrating clear relevance to academic and professional communities.
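The gradient-boosting-plus-attribution workflow described above can be sketched in a few lines. This is a minimal illustration only: the data are synthetic, the feature names (`platform_x`, `platform_y`, `point3_disp`, `point4_disp`) are hypothetical stand-ins for the study's measurements, scikit-learn's `GradientBoostingRegressor` stands in for XGBoost, and model-agnostic permutation importance stands in for SHAP attribution.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the experimental dataset: platform offsets and
# measured point displacements (feature names are illustrative only).
n = 500
X = rng.normal(size=(n, 4))  # [platform_x, platform_y, point3_disp, point4_disp]
# Assumed target: maximum riser displacement dominated by point4_disp.
y = 0.5 * X[:, 1] + 1.5 * X[:, 3] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

# Permutation importance as a model-agnostic proxy for SHAP attribution.
imp = permutation_importance(model, X_te, y_te, random_state=0)
names = ["platform_x", "platform_y", "point3_disp", "point4_disp"]
for name, score in sorted(zip(names, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```

On this synthetic setup the fourth-point displacement dominates the ranking, mirroring the attribution pattern the abstract reports for field data.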
In the context of the educational evaluation reform, this study explores the construction of a data-driven, evidence-based value-added evaluation system, aiming to overcome the limitations of traditional evaluation methods. The research combines theoretical analysis with practical application and designs an evidence-based value-added evaluation framework whose core elements include a multi-source heterogeneous data acquisition and processing system, a value-added evaluation agent based on a large model, and an evaluation implementation and application mechanism. Empirical research verifies that the evaluation system has remarkable effects in improving learning participation, promoting ability development, and supporting teaching decision-making, and it provides a theoretical reference and practical path for educational evaluation reform in the new era. The research shows that a data-driven, evidence-based value-added evaluation system can reflect students' actual progress more fairly and objectively by accurately measuring differences in students' starting points and development ranges, providing strong support for the realization of high-quality education development.
We propose an integrated method of data-driven and mechanism models for well logging formation evaluation, explicitly focusing on predicting reservoir parameters such as porosity and water saturation. Accurately interpreting these parameters is crucial for effectively exploring and developing oil and gas. However, with the increasing complexity of geological conditions in this industry, there is a growing demand for improved accuracy in reservoir parameter prediction, leading to higher costs associated with manual interpretation. Conventional logging interpretation methods rely on empirical relationships between logging data and reservoir parameters, and they suffer from low interpretation efficiency, strong subjectivity, and limited applicability beyond idealized conditions. The application of artificial intelligence to the interpretation of logging data provides a new solution to the problems of traditional methods and is expected to improve the accuracy and efficiency of interpretation. If large and high-quality datasets exist, data-driven models can reveal relationships of arbitrary complexity. Nevertheless, constructing sufficiently large logging datasets with reliable labels remains challenging, making it difficult to apply data-driven models effectively to logging data interpretation. Furthermore, data-driven models often act as "black boxes" that neither explain their predictions nor ensure compliance with primary physical constraints. This paper proposes a machine learning method with strong physical constraints by integrating mechanism and data-driven models. Prior knowledge of logging data interpretation is embedded into machine learning through the network structure, loss function, and optimization algorithm. We employ a Physically Informed Auto-Encoder (PIAE) to predict porosity and water saturation, which can be trained without labeled reservoir parameters using self-supervised learning techniques. This approach effectively achieves automated interpretation and facilitates generalization across diverse datasets.
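The self-supervised, physics-constrained training idea can be illustrated by the shape of the loss function alone. The sketch below is not the paper's PIAE; it merely assumes Archie's equation as an example of the kind of petrophysical constraint such a loss could embed, combining a log-reconstruction term (which needs no labels) with a physics-residual penalty. All parameter values (`rw`, `a`, `m`, `n`) are illustrative.

```python
import numpy as np

def archie_residual(phi, sw, rt, rw=0.1, a=1.0, m=2.0, n=2.0):
    """Physics residual from Archie's equation: Sw^n - a*Rw / (phi^m * Rt).
    Zero when predicted porosity/saturation are consistent with the
    measured true resistivity Rt."""
    return sw**n - (a * rw) / (phi**m * rt)

def physics_informed_loss(logs, recon, phi, sw, rt, lam=1.0):
    """Self-supervised loss: log reconstruction error + physics penalty.
    No labeled porosity/saturation is needed -- only the logs themselves."""
    recon_term = np.mean((np.asarray(logs) - np.asarray(recon)) ** 2)
    physics_term = np.mean(archie_residual(phi, sw, rt) ** 2)
    return recon_term + lam * physics_term
```

For example, with `phi = 0.25`, `rt = 6.4`, and the default Archie constants, `sw = 0.5` satisfies the equation exactly, so the physics term vanishes; predictions that drift from this consistency are penalized during training.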
This study examines the advent of agent interaction (AIx) as a transformative paradigm in human-computer interaction (HCI), signifying a notable evolution beyond traditional graphical interfaces and touchscreen interactions. Within the context of large models, AIx is characterized by its innovative interaction patterns and a plethora of promising application scenarios. The study underscores the pivotal role of AIx in shaping the future trajectory of the large model industry, emphasizing its adoption and necessity from a user-centric perspective. The fundamental drivers of AIx include the introduction of novel capabilities, replication of capabilities (both anthropomorphic and superhuman), migration of capabilities, aggregation of intelligence, and multiplication of capabilities. These elements are essential for propelling innovation, expanding the frontiers of capability, and realizing the exponential superposition of capabilities, thereby mitigating labor redundancy and addressing a spectrum of human needs. Furthermore, this study provides an in-depth analysis of the structural components and operational mechanisms of agents supported by large models. Such advancements significantly enhance the capacity of agents to tackle complex problems and provide intelligent services, thereby facilitating more intuitive, adaptive, and personalized engagement between humans and machines. The study further delineates four principal categories of interaction patterns encompassing eight distinct modalities of interaction and twenty-one specific scenarios, including applications in smart home systems, health assistance, and elderly care. This emphasizes the significance of the new paradigm in advancing HCI, fostering technological advancements, and redefining user experiences. However, the study also acknowledges the challenges and ethical considerations that accompany this paradigm shift, recognizing the need for a balanced approach to harness the full potential of AIx in modern society.
Against the backdrop of the national innovation strategy and the digital transformation of education, the traditional "extensive" training model for innovation and entrepreneurship talents struggles to meet the personalized development needs of students, making an urgent shift toward precision and intelligence necessary. This study constructs a four-dimensional integrated framework centered on data, "Goal-Data-Intervention-Evaluation", and proposes a data-driven training model for innovation and entrepreneurship talents in universities. By collecting multi-source data such as learning behaviors, competency assessments, and practical projects, the model conducts in-depth analysis of students' individual characteristics and development potential, enabling precise decision-making in goal setting, teaching intervention, and practical guidance. Based on data analysis, a supportive system for personalized teaching and practical activities is established. Combined with process-oriented and summative evaluations, a closed-loop feedback mechanism is formed to improve training effectiveness. This model provides a theoretical framework and practical path for the scientific, personalized, and intelligent development of innovation and entrepreneurship education in universities.
A data-driven model of multiple variable cutting (M-VCUT) level set-based substructures is proposed for the topology optimization of lattice structures. The M-VCUT level set method is used to represent substructures, enriching their diversity of configuration while ensuring connectivity. To construct the data-driven model of substructures, a database is prepared by sampling the space of substructures spanned by several substructure prototypes. Then, for each substructure in this database, the stiffness matrix is condensed so that its degrees of freedom are reduced. Thereafter, the data-driven model of substructures is constructed through interpolation with compactly supported radial basis functions (CS-RBF). The inputs of the data-driven model are the design variables of topology optimization, and the outputs are the condensed stiffness matrix and volume of the substructures. During the optimization, this data-driven model is used, thus avoiding the repeated static condensation that would require much computation time. Several numerical examples are provided to verify the proposed method.
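The static condensation step that the surrogate is built to avoid is standard Guyan reduction: interior degrees of freedom are eliminated so only boundary DOFs remain. A minimal sketch on a toy spring chain (not the paper's lattice substructures) shows the operation:

```python
import numpy as np

def condense_stiffness(K, boundary, interior):
    """Static (Guyan) condensation: eliminate interior DOFs so that only the
    substructure's boundary DOFs remain.  Kc = Kbb - Kbi @ Kii^-1 @ Kib."""
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, interior)]
    Kib = K[np.ix_(interior, boundary)]
    Kii = K[np.ix_(interior, interior)]
    return Kbb - Kbi @ np.linalg.solve(Kii, Kib)

# Toy substructure: a chain of three unit springs (4 DOFs);
# condense out the two interior DOFs.
k = 1.0
K = np.array([[ k,   -k,    0.0,  0.0],
              [-k,  2 * k, -k,    0.0],
              [ 0.0, -k,   2 * k, -k ],
              [ 0.0,  0.0, -k,    k ]])
Kc = condense_stiffness(K, boundary=[0, 3], interior=[1, 2])
print(Kc)  # 2x2 condensed stiffness between the two end DOFs
```

The result is the 2×2 matrix of an equivalent spring of stiffness k/3, as expected for three unit springs in series; the paper's data-driven model interpolates such condensed matrices instead of recomputing them for every design iterate.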
With the continuous improvement of the medical industry's requirements for the professional capabilities of nursing talents, traditional nursing teaching models can hardly meet the needs of complex nursing work in neurology. This paper focuses on nursing education for neurology nursing students and explores the construction of the "one-on-one" teaching model, aiming to achieve a paradigm shift in nursing education. By analyzing the current status of neurology nursing education, this paper identifies the problems in traditional teaching models. Combining the advantages of the "one-on-one" teaching model, it elaborates on the construction path of this model from aspects such as the selection and training of teaching instructors, the design of teaching content, the innovation of teaching methods, and the improvement of the teaching evaluation system. The research shows that the "one-on-one" teaching model can significantly enhance nursing students' mastery of professional knowledge, clinical operation skills, communication skills, and emergency response capabilities, as well as strengthen their professional identity and sense of responsibility. It provides an effective way to cultivate high-quality nursing talents who can meet the needs of neurology nursing work and promotes the innovative development of nursing education.
The outstanding comprehensive mechanical properties of newly developed hybrid lattice structures make them useful in engineering applications for bearing multiple mechanical loads. Additive-manufacturing technologies make it possible to fabricate these highly spatially programmable structures and greatly enhance the freedom in their design. However, traditional analytical methods do not sufficiently reflect the actual vibration-damping mechanism of lattice structures and are limited by their high computational cost. In this study, a hybrid lattice structure consisting of various cells was designed based on quasi-static and vibration experiments. Subsequently, a novel parametric design method based on a data-driven approach was developed for hybrid lattices with engineered properties. The response surface method was adopted to define the sensitive optimization target. A prediction model for the lattice geometric parameters and vibration properties was established using a backpropagation neural network. Then, it was integrated into a genetic algorithm to create the optimal hybrid lattice with varying geometric features and the required wide-band vibration-damping characteristics. Validation experiments were conducted, demonstrating that the optimized hybrid lattice can achieve the target properties. In addition, the data-driven parametric design method can reduce computation time and be widely applied to complex structural designs when analytical and empirical solutions are unavailable.
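The surrogate-plus-genetic-algorithm pattern behind this design method can be sketched compactly. In the sketch below a quadratic response surface (fitted by least squares) stands in for the paper's backpropagation neural network, the "transmissibility" function and its optimum at (0.6, 0.4) are invented for illustration, and the genetic algorithm is a bare-bones selection-and-mutation loop (crossover omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed "experiment": vibration transmissibility as a function of two
# normalized lattice geometry parameters, minimized at (0.6, 0.4).
def transmissibility(x):
    return (x[..., 0] - 0.6) ** 2 + (x[..., 1] - 0.4) ** 2

# Response surface: quadratic surrogate fitted to sampled designs.
samples = rng.uniform(0, 1, size=(60, 2))
A = np.column_stack([np.ones(60), samples[:, 0], samples[:, 1],
                     samples[:, 0] ** 2, samples[:, 1] ** 2,
                     samples[:, 0] * samples[:, 1]])
coef, *_ = np.linalg.lstsq(A, transmissibility(samples), rcond=None)

def surrogate(x):
    x0, x1 = x[..., 0], x[..., 1]
    basis = np.stack([np.ones_like(x0), x0, x1, x0 ** 2, x1 ** 2, x0 * x1],
                     axis=-1)
    return basis @ coef

# Minimal genetic algorithm over the cheap surrogate (no FE solves needed).
pop = rng.uniform(0, 1, size=(40, 2))
for _ in range(60):
    fitness = surrogate(pop)
    parents = pop[np.argsort(fitness)[:20]]          # selection
    children = parents[rng.integers(0, 20, 40)]      # cloning
    children += rng.normal(0, 0.05, children.shape)  # mutation
    pop = np.clip(children, 0, 1)
best = pop[np.argmin(surrogate(pop))]
print(best)  # should approach the assumed optimum near (0.6, 0.4)
```

The design choice mirrors the abstract: all expensive evaluations happen once when building the surrogate, after which the evolutionary search is essentially free.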
The Underwater Acoustic (UWA) channel is bandwidth-constrained and experiences doubly selective fading. It is challenging to acquire perfect channel knowledge for Orthogonal Frequency Division Multiplexing (OFDM) communications using a finite number of pilots. On the other hand, Deep Learning (DL) approaches have been very successful in wireless OFDM communications; however, whether they will work underwater remains an open question. For the first time, this paper compares two categories of DL-based UWA OFDM receivers: the Data-Driven (DD) method, which performs as an end-to-end black box, and the Model-Driven (MD) method, also known as the model-based data-driven method, which combines DL with expert OFDM receiver knowledge. An encoder-decoder framework with a Convolutional Neural Network (CNN) structure is employed to establish the DD receiver, while an unfolding-based Minimum Mean Square Error (MMSE) structure is adopted for the MD receiver. We analyze the characteristics of the different receivers by Monte Carlo simulations under diverse communication conditions and propose a strategy for selecting the proper receiver under different communication scenarios. Field trials in a pool and at sea are also conducted to verify the feasibility and advantages of the DL receivers. It is observed that the DL receivers outperform conventional receivers in terms of bit error rate.
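The classical structure that the model-driven receiver unfolds into trainable layers is the per-subcarrier MMSE equalizer. A self-contained sketch (synthetic QPSK data over a random flat-per-subcarrier channel, not the paper's UWA measurements) shows that baseline operation:

```python
import numpy as np

rng = np.random.default_rng(2)

# One OFDM symbol: QPSK on 64 subcarriers through a known frequency response H.
K = 64
bits = rng.integers(0, 2, size=(K, 2))
X = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)  # QPSK
H = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)      # channel
sigma2 = 0.01
noise = np.sqrt(sigma2 / 2) * (rng.normal(size=K) + 1j * rng.normal(size=K))
Y = H * X + noise

# Per-subcarrier MMSE equalizer: X_hat = H* Y / (|H|^2 + sigma^2).
X_hat = np.conj(H) * Y / (np.abs(H) ** 2 + sigma2)

# Hard decisions back to bits, then bit error rate.
bits_hat = np.stack([X_hat.real > 0, X_hat.imag > 0], axis=1).astype(int)
ber = np.mean(bits_hat != bits)
print(f"BER: {ber:.3f}")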
For control systems with unknown model parameters, this paper proposes a data-driven iterative learning method for fault estimation. First, input and output data from the system under fault-free conditions are collected. By applying orthogonal-triangular (QR) decomposition and singular value decomposition, a data-driven realization of the system's kernel representation is derived; based on this representation, a residual generator is constructed. Then, the actuator fault signal is estimated online by analyzing the system's dynamic residual, and an iterative learning algorithm is introduced to continuously optimize the residual-based performance function, thereby enhancing estimation accuracy. The proposed method achieves actuator fault estimation without requiring knowledge of model parameters, eliminating the time-consuming system modeling process and allowing operators to focus on system optimization and decision-making. Compared with existing fault estimation methods, the proposed method demonstrates superior transient performance, steady-state performance, and real-time capability, while reducing the need for manual intervention and lowering operational complexity. Finally, experimental results on a mobile robot verify the effectiveness and advantages of the method.
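The core idea of an SVD-based kernel representation and residual generator can be illustrated on a deliberately simplified plant. For brevity the sketch below uses a static 2×2 gain rather than a dynamic system (the paper's method would stack time-shifted I/O data into Hankel-type matrices); everything else — null-space extraction from fault-free data and residual evaluation on new data — follows the same pattern:

```python
import numpy as np

rng = np.random.default_rng(3)

# Unknown plant (a static 2x2 gain for brevity); only its I/O data are used.
G = np.array([[1.0, 0.5],
              [0.2, 1.5]])

# 1) Fault-free data collection.
U = rng.normal(size=(2, 200))
Y = G @ U
Z = np.vstack([U, Y])          # stacked I/O data matrix, one column per sample

# 2) Data-driven kernel representation: left null space of the data via SVD.
#    Rows of N satisfy N @ [u; y] ~ 0 for every consistent I/O pair.
_, s, Vt = np.linalg.svd(Z.T, full_matrices=True)
rank = int(np.sum(s > 1e-8))
N = Vt[rank:]                  # basis of the kernel (here 2 rows)

# 3) Residual generation on new data.
u_new = rng.normal(size=2)
r_ok = N @ np.concatenate([u_new, G @ u_new])            # fault-free case
fault = np.array([0.8, 0.0])                             # actuator offset fault
r_fault = N @ np.concatenate([u_new, G @ (u_new + fault)])
print(np.linalg.norm(r_ok), np.linalg.norm(r_fault))
```

The residual stays at numerical zero for consistent data and jumps when the applied input deviates from the recorded one, which is exactly the signal the iterative learning loop then drives to recover the fault magnitude.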
When assessing seismic liquefaction potential with data-driven models, addressing the uncertainties of model establishment, cone penetration test (CPT) data interpretation, and decision thresholds is crucial for avoiding biased data selection, ameliorating overconfident models, and remaining flexible to varying practical objectives, especially when the training and testing data are not identically distributed. A workflow leveraging Bayesian methodology was proposed to address these issues. Employing a Multi-Layer Perceptron (MLP) as the foundational model, this approach was benchmarked against empirical methods and advanced algorithms for its efficacy in simplicity, accuracy, and resistance to overfitting. The analysis revealed that, while MLP models optimized via the maximum a posteriori algorithm suffice for straightforward scenarios, Bayesian neural networks show great potential for preventing overfitting. Additionally, integrating decision thresholds through various evaluative principles offers insights for challenging decisions. Two case studies demonstrate the framework's capacity for nuanced interpretation of in situ data, employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics. Overall, the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation, showing improved robustness against overfitting and greater versatility in addressing practical challenges. This research contributes to the seismic liquefaction assessment field by providing a structured, adaptable methodology for accurate and reliable analysis.
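The "model committee plus decision threshold" idea can be sketched without a full Bayesian neural network. Below, a bootstrap ensemble of logistic regressions is a cheap stand-in for posterior sampling; the two features, the label-generating rule, and the conservative 0.3 threshold are all invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

# Synthetic CPT-style features: (normalized tip resistance, cyclic stress ratio).
n = 400
X = rng.normal(size=(n, 2))
# Assumed ground truth: liquefaction more likely at high stress, low resistance.
p_true = 1 / (1 + np.exp(-(1.5 * X[:, 1] - 2.0 * X[:, 0])))
y = (rng.uniform(size=n) < p_true).astype(int)

# Model committee via bootstrap resampling -- a simple surrogate for the
# predictive distribution a Bayesian neural network would provide.
committee = [LogisticRegression().fit(X[(idx := rng.integers(0, n, n))], y[idx])
             for _ in range(25)]

x_new = np.array([[0.0, 1.0]])
probs = np.array([m.predict_proba(x_new)[0, 1] for m in committee])
print(f"P(liquefaction): {probs.mean():.2f} +/- {probs.std():.2f}")

# The decision threshold encodes the evaluative principle at hand, e.g. a
# conservative 0.3 for safety-critical screening rather than the default 0.5.
print("liquefiable" if probs.mean() > 0.3 else "non-liquefiable")
```

The spread across committee members is the workflow's uncertainty estimate; sites whose probability interval straddles the chosen threshold are flagged as the "challenging decisions" the abstract refers to.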
In the rapidly evolving technological landscape, state-owned enterprises (SOEs) encounter significant challenges in sustaining their competitiveness through efficient R&D management. Integrated Product Development (IPD), with its emphasis on cross-functional teamwork, concurrent engineering, and data-driven decision-making, has been widely recognized for enhancing R&D efficiency and product quality. However, the unique characteristics of SOEs pose challenges to the effective implementation of IPD. The advancement of big data and artificial intelligence technologies offers new opportunities for optimizing IPD R&D management through data-driven decision-making models. This paper constructs and validates a data-driven decision-making model tailored to the IPD R&D management of SOEs. By integrating data mining, machine learning, and other advanced analytical techniques, the model serves as a scientific and efficient decision-making tool. It aids SOEs in optimizing R&D resource allocation, shortening product development cycles, reducing R&D costs, and improving product quality and innovation. Moreover, this study contributes to a deeper theoretical understanding of the value of data-driven decision-making in the context of IPD.
This paper explores the paradigm reconstruction of interpreting pedagogy driven by generative AI technology. With the breakthroughs of AI technologies such as ChatGPT in natural language processing, traditional interpreting education faces the dual challenges of technological substitution and pedagogical transformation. Based on Kuhn's paradigm theory, the study analyzes the limitations of three traditional interpreting teaching paradigms (language-centric, knowledge-based, and skill-acquisition-oriented) and proposes a novel "teacher-AI-learner" triadic collaborative paradigm. Through reconstructing teaching subjects, environments, and curriculum systems, the integration of real-time translation tools and intelligent terminology databases facilitates the transition from static skill training to dynamic human-machine collaboration. The research also highlights challenges in technological ethics and the pressures of curriculum design transformation, emphasizing the necessity of balancing technological empowerment with humanistic education.
Hydraulic fracturing technology has achieved remarkable results in improving the production of tight gas reservoirs, but its effectiveness is governed by the joint action of multiple complex factors. Traditional analysis methods have limitations in dealing with these complex, interrelated factors and can hardly reveal the actual contribution of each factor to production. Machine learning-based methods explore the complex mapping relationships within large amounts of data to provide data-driven insights into the key factors driving production. In this study, a data-driven PCA-RF-VIM (Principal Component Analysis-Random Forest-Variable Importance Measures) approach to analyzing feature importance is proposed to identify the key factors driving post-fracturing production. Four types of parameters, including log parameters, geological and reservoir physical parameters, hydraulic fracturing design parameters, and reservoir stimulation parameters, were input into the PCA-RF-VIM model. The model was trained using 6-fold cross-validation and grid search, and the relative importance ranking of each factor was obtained. To verify the validity of the PCA-RF-VIM model, a consolidated model using three other independent data-driven methods (Pearson correlation coefficient, RF feature importance analysis, and XGBoost feature importance analysis) was applied for comparison. The comparison shows that the two models share almost the same top-ten parameters, with only a minor difference in one parameter. In combination with the reservoir characteristics, the reasonableness of the PCA-RF-VIM model is verified, and its importance ranking of the parameters is more consistent with the reservoir characteristics of the study area. Ultimately, these ten parameters are selected as the controlling factors with the potential to influence post-fracturing gas production, as their combined importance in driving natural gas production is 91.95%. Identifying these ten controlling factors provides engineers with new insight into reservoir selection for fracturing stimulation and fracturing parameter optimization, improving fracturing efficiency and productivity.
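The PCA-then-RF importance pipeline can be sketched on synthetic data. This is an illustrative reading of the approach, not the authors' exact VIM formula: PCA decorrelates the inputs, a random forest ranks the components, and the absolute loadings map component importance back to the original variables. The feature names are hypothetical, and the first feature is given both a larger scale and the dominant effect on "production" so the expected ranking is unambiguous:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)

# Synthetic stand-in for the fracturing dataset (feature names illustrative).
names = ["porosity", "net_pay", "proppant_mass", "fluid_volume"]
X = rng.normal(size=(300, 4))
X[:, 0] *= 3.0                                  # dominant-variance feature
y = X[:, 0] + X[:, 2] + 0.1 * rng.normal(size=300)   # "production"

# 1) PCA decorrelates the inputs; 2) RF ranks the components;
# 3) |loadings| map component importance back to the original variables.
pca = PCA().fit(X)
Z = pca.transform(X)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(Z, y)
var_importance = np.abs(pca.components_.T) @ rf.feature_importances_
var_importance /= var_importance.sum()

for name, imp in sorted(zip(names, var_importance), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

Working in the decorrelated component space avoids the credit-splitting that random-forest importance suffers from when raw input parameters are strongly correlated, which is the motivation for combining PCA with RF-VIM in the first place.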
The integration of artificial intelligence (AI) is fundamentally reshaping scientific research, giving rise to a new era of discovery and innovation. This paper explores this transformative shift, introducing the innovative concept of the "AI-Driven Research Ecosystem", a dynamic and collaborative research environment. Within this ecosystem, we focus on the unification of human-AI collaboration models and emerging new research thinking paradigms. We analyze the multifaceted roles of AI within the research lifecycle, spanning from passive tool to active assistant to autonomous participant, and categorize these interactions into distinct human-AI collaboration models. Furthermore, we examine how the pervasive involvement of AI necessitates an evolution in human research thinking, emphasizing the significant roles of critical, creative, and computational thinking. Through a review of existing literature and illustrative case studies, this paper provides a comprehensive overview of the AI-driven research ecosystem, highlighting its potential for transforming scientific research. Our findings advance the current understanding of AI's multiple roles in research and underscore its capacity to revolutionize both knowledge discovery and collaborative innovation, paving the way for a more integrated and impactful research paradigm.
This paper focuses on the numerical solution of a tumor growth model under a data-driven approach. Based on the inherent laws of the data and reasonable assumptions, an ordinary differential equation model for tumor growth is established. Nonlinear fitting is employed to obtain the optimal parameter estimates of the mathematical model, and the numerical solution is carried out using MATLAB. By comparing the clinical data with the simulation results, good agreement is achieved, which verifies the rationality and feasibility of the model.
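The fit-an-ODE-to-data workflow can be sketched compactly. The sketch below uses Python/SciPy rather than the authors' MATLAB, assumes a logistic growth law dV/dt = rV(1 − V/K) as the ODE (its closed-form solution stands in for numerical integration), and generates synthetic "clinical" volumes in place of real measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic growth ODE dV/dt = r*V*(1 - V/K) has this closed-form solution.
def logistic(t, r, K, V0):
    return K / (1 + (K / V0 - 1) * np.exp(-r * t))

# Synthetic "clinical" volumes (mm^3) standing in for real measurements.
rng = np.random.default_rng(6)
t = np.linspace(0, 30, 16)
V_obs = logistic(t, r=0.25, K=1200.0, V0=50.0) * (1 + 0.02 * rng.normal(size=t.size))

# Nonlinear least-squares estimation of (r, K, V0).
(r, K, V0), _ = curve_fit(logistic, t, V_obs, p0=[0.1, 1000.0, 30.0])
print(f"r={r:.3f}, K={K:.1f}, V0={V0:.1f}")
```

Plotting `logistic(t, r, K, V0)` against `V_obs` then gives the data-versus-simulation comparison the abstract describes; with 2% measurement noise the fitted parameters land close to the true values.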
The impacts of lateral boundary conditions (LBCs) provided by numerical models and data-driven networks on convective-scale ensemble forecasts are investigated in this study. Four experiments are conducted on the Hangzhou RDP (19th Hangzhou Asian Games Research Development Project on Convective-scale Ensemble Prediction and Application) testbed, with the LBCs respectively sourced from National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) forecasts with 33 vertical levels (Exp_GFS), Pangu forecasts with 13 vertical levels (Exp_Pangu), Fuxi forecasts with 13 vertical levels (Exp_Fuxi), and NCEP GFS forecasts with the vertical levels reduced to 13, the same as those of Exp_Pangu and Exp_Fuxi (Exp_GFSRDV). In general, Exp_Pangu performs comparably to Exp_GFS, while Exp_Fuxi shows slightly inferior performance compared to Exp_Pangu, possibly due to its less accurate large-scale predictions. The ability of data-driven networks to efficiently provide LBCs for convective-scale ensemble forecasts is therefore demonstrated. Moreover, Exp_GFSRDV produces the worst convective-scale forecasts among the four experiments, which indicates that LBCs from data-driven networks could be further improved by increasing the networks' vertical levels. However, the ensemble spread of the four experiments barely increases with lead time. Thus, each experiment has insufficient ensemble spread to represent realistic forecast uncertainties, which will be investigated in a future study.
BACKGROUND: Research has consistently demonstrated that patients with major depressive disorder (MDD) exhibit attentional switching dysfunction, and the dual-task paradigm has emerged as a valuable tool for probing cognitive deficits. However, the neuroelectrophysiological mechanism underlying this deficit has not been clarified. AIM: To investigate the event-related potential (ERP) characteristics of attentional switching dysfunction and further explore the neuroelectrophysiological mechanism of the cognitive processing deficits underlying attentional switching dysfunction in MDD. METHODS: The participants included 29 MDD patients and 29 healthy controls (HCs). The ERPs of the participants were measured while they performed the dual-task paradigm. The behavioral data and the ERP N100, P200, P300, and late positive potential (LPP) data were analyzed. RESULTS: This study revealed greater accuracy in HCs and slower reaction times (RTs) in MDD patients. Angry facial pictures led to lower accuracy. The results also revealed shorter RTs for happy facial pictures and the longest RTs for the 500-ms stimulus onset asynchrony. With respect to ERP characteristics, happy and neutral facial pictures evoked higher amplitudes. The N100, P200, P300, and LPP amplitudes at Pz were the highest. MDD patients had lower mean P200 and LPP amplitudes than HCs did. CONCLUSION: MDD patients exhibited abnormal ERP characteristics evoked by the dual-task paradigm, which could be the neural correlates of the known abnormalities in attentional switching in patients with MDD. These results provide valuable insights into the neural mechanisms of attentional switching and may guide targeted interventions in patients with MDD.
With the rapid advancement of machine learning technology and its growing adoption in research and engineering applications, an increasing number of studies have embraced data-driven approaches for modeling wind turbine wakes. These models can capture the complex, high-dimensional characteristics of wind turbine wakes while offering significantly greater efficiency in the prediction process than physics-driven models. As a result, data-driven wind turbine wake models are regarded as powerful and effective tools for predicting wake behavior and turbine power output. This paper aims to provide a concise yet comprehensive review of existing studies on wind turbine wake modeling that employ data-driven approaches. It begins by defining and classifying machine learning methods to facilitate a clearer understanding of the reviewed literature. Subsequently, the related studies are categorized into four key areas: wind turbine power prediction, data-driven analytic wake models, wake field reconstruction, and the incorporation of explicit physical constraints. The accuracy of data-driven models is influenced by two primary factors: the quality of the training data and the performance of the model itself. Accordingly, both data accuracy and model structure are discussed in detail within the review.
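The simplest member of the "data-driven analytic wake model" category the review names is a classical analytic model whose free constant is calibrated from data. The sketch below uses the Jensen (Park) top-hat wake model as that analytic shape; the "measurements" are synthetic, and the rotor radius, thrust coefficient, and decay constant are illustrative values only:

```python
import numpy as np
from scipy.optimize import curve_fit

# Jensen (Park) wake model: centerline velocity deficit behind a turbine of
# rotor radius r0 and thrust coefficient Ct, with wake decay coefficient k.
def jensen_deficit(x, k, Ct=0.8, r0=40.0):
    return (1 - np.sqrt(1 - Ct)) / (1 + k * x / r0) ** 2

# Synthetic "measured" deficits standing in for LiDAR/SCADA data.
rng = np.random.default_rng(7)
x = np.linspace(80, 800, 25)                 # downstream distance [m]
obs = jensen_deficit(x, k=0.05) + 0.005 * rng.normal(size=x.size)

# Data-driven calibration of k: physics supplies the shape of the model,
# the data supply the constant.
(k_fit,), _ = curve_fit(jensen_deficit, x, obs, p0=[0.1])
print(f"fitted wake decay k = {k_fit:.3f}")
```

More sophisticated entries in this category replace the single constant with a learned function of inflow conditions, but the division of labor — analytic structure from physics, parameters from data — is the same.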
Abstract: Active inflammation in “inactive” progressive multiple sclerosis: Traditionally, the distinction between relapsing-remitting multiple sclerosis and progressive multiple sclerosis (PMS) has been framed as an inflammatory versus degenerative dichotomy. This was based on a broad misconception regarding essentially all neurodegenerative conditions, depicting the degenerative process as passive, immune-independent, and occurring as a late byproduct of active inflammation in the central nervous system (CNS), which is (solely) systemically driven.
Funding: The research work was financially supported by the National Natural Science Foundation of China (Grant Nos. 51979238 and 52301338) and the Sichuan Science and Technology Program (Grant Nos. 2023NSFSC1953 and 2023ZYD0140).
Abstract: Mitigating vortex-induced vibration (VIV) in flexible risers is a critical concern in offshore oil and gas production, given its potential impact on operational safety and efficiency. Accurate prediction of VIV displacement and position in flexible risers remains challenging under actual marine conditions. This study presents a data-driven model for riser displacement prediction that corresponds to field conditions. Experimental data analysis reveals that the XGBoost algorithm predicts the maximum displacement and its position with superior accuracy compared with support vector regression (SVR), considering both computational efficiency and precision. Platform displacement in the Y-direction demonstrates a significant positive correlation with both the axial depth and the magnitude of the maximum displacement. The displacement at the fourth measurement point contributes most to the model’s predictions, positively influencing the maximum displacement while negatively affecting the axial depth at which it occurs. Platform displacements in the X- and Y-directions exhibit competitive effects on both the riser’s maximum displacement and its axial depth. Through the XGBoost algorithm and SHapley Additive exPlanations (SHAP) analysis, the model effectively estimates the riser’s maximum displacement and its precise location. This data-driven approach makes predictions from minimal, readily available data points, enhancing its practical field applicability and demonstrating clear relevance to academic and professional communities.
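The feature-attribution idea behind the SHAP analysis above can be illustrated with a much simpler stand-in: permutation importance on a least-squares surrogate. The data below are synthetic and the linear model replaces the paper's XGBoost regressor; only the attribution concept carries over.

```python
import numpy as np

def permutation_importance(model, X, y, rng):
    """Rank features by the increase in MSE when each column is shuffled.

    A lightweight stand-in for SHAP-style attribution: both answer
    "how much does this input drive the prediction?"
    """
    base = np.mean((model(X) - y) ** 2)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        scores.append(np.mean((model(Xp) - y) ** 2) - base)
    return np.array(scores)

rng = np.random.default_rng(0)
# Hypothetical inputs: platform X/Y displacement plus two sensor readings.
X = rng.normal(size=(500, 4))
y = 3.0 * X[:, 1] + 0.1 * X[:, 0] + rng.normal(scale=0.01, size=500)

# Least-squares fit as a minimal surrogate model.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
model = lambda A: A @ w

imp = permutation_importance(model, X, y, rng)
print(imp.argmax())  # feature 1 dominates, mirroring its weight in y
```

Shuffling a column destroys only that feature's contribution, so the MSE increase isolates its influence without re-training the model.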
Funding: This paper is a research result of “Research on Innovation of Evidence-Based Teaching Paradigm in Vocational Education under the Background of New Quality Productivity” (2024JXQ176) and the Shandong Province Artificial Intelligence Education Research Project (SDDJ202501035), which explores the application of large artificial intelligence models in student value-added evaluation from an evidence-based perspective.
Abstract: Against the backdrop of educational evaluation reform, this study explores the construction of a data-driven, evidence-based value-added evaluation system, aiming to overcome the limitations of traditional evaluation methods. The research combines theoretical analysis with practical application and designs an evidence-based value-added evaluation framework whose core elements are a multi-source heterogeneous data acquisition and processing system, a value-added evaluation agent based on a large model, and an evaluation implementation and application mechanism. Empirical research verifies that the evaluation system has remarkable effects in improving learning participation, promoting ability development, and supporting teaching decision-making, and it provides a theoretical reference and practical path for educational evaluation reform in the new era. The research shows that a data-driven, evidence-based value-added evaluation system can reflect students’ actual progress more fairly and objectively by accurately measuring differences in students’ starting points and development ranges, providing strong support for high-quality education development.
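The core computation of a value-added evaluation, measuring progress beyond what a student's starting point predicts, can be sketched in a few lines. The cohort scores and the linear baseline below are illustrative; the paper's system uses a large-model agent rather than a regression.

```python
import numpy as np

# Toy cohort: pre-test and post-test scores (hypothetical data).
pre  = np.array([45., 60., 70., 85., 55.])
post = np.array([62., 70., 78., 90., 72.])

# Expected post-score given the starting point: a simple linear
# baseline fitted on the cohort itself.
A = np.vstack([pre, np.ones_like(pre)]).T
slope, intercept = np.linalg.lstsq(A, post, rcond=None)[0]
expected = slope * pre + intercept

# Value-added score: progress beyond what the starting point predicts.
value_added = post - expected
print(np.round(value_added, 2))
```

A positive score means the student progressed more than peers with the same starting point; by construction the cohort's value-added scores sum to zero.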
Funding: Supported by the National Key Research and Development Program (2019YFA0708301), the National Natural Science Foundation of China (51974337), the Strategic Cooperation Projects of CNPC and CUPB (ZLZX2020-03), the Science and Technology Innovation Fund of CNPC (2021DQ02-0403), and the Open Fund of the Petroleum Exploration and Development Research Institute of CNPC (2022-KFKT-09).
Abstract: We propose an integrated method of data-driven and mechanism models for well logging formation evaluation, explicitly focusing on predicting reservoir parameters such as porosity and water saturation. Accurately interpreting these parameters is crucial for effectively exploring and developing oil and gas. However, with the increasing complexity of geological conditions in this industry, there is a growing demand for improved accuracy in reservoir parameter prediction, leading to higher costs associated with manual interpretation. Conventional logging interpretation methods rely on empirical relationships between logging data and reservoir parameters, and they suffer from low interpretation efficiency, strong subjectivity, and suitability only for ideal conditions. The application of artificial intelligence to logging data interpretation provides a new solution to these problems and is expected to improve both the accuracy and the efficiency of interpretation. If large, high-quality datasets exist, data-driven models can reveal relationships of arbitrary complexity. Nevertheless, constructing sufficiently large logging datasets with reliable labels remains challenging, making it difficult to apply data-driven models effectively to logging data interpretation. Furthermore, data-driven models often act as “black boxes” that neither explain their predictions nor ensure compliance with basic physical constraints. This paper proposes a machine learning method with strong physical constraints that integrates mechanism and data-driven models. Prior knowledge of logging data interpretation is embedded into the machine learning pipeline through the network structure, loss function, and optimization algorithm. We employ a Physically Informed Auto-Encoder (PIAE) to predict porosity and water saturation, which can be trained without labeled reservoir parameters using self-supervised learning techniques. This approach effectively achieves automated interpretation and facilitates generalization across diverse datasets.
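One way such a physics constraint can enter the loss function is as a residual of a mechanism model added to the reconstruction term. The sketch below uses Archie's equation as the mechanism (a standard petrophysical relation); the loss weighting and toy values are illustrative, not the paper's PIAE implementation.

```python
import numpy as np

def archie_residual(phi, sw, rt, a=1.0, m=2.0, n=2.0, rw=0.05):
    """Physics residual from Archie's equation: Rt = a*Rw / (phi^m * Sw^n).
    Zero when predicted porosity/saturation obey the mechanism model."""
    denom = np.clip(phi, 1e-6, None) ** m * np.clip(sw, 1e-6, None) ** n
    return rt - a * rw / denom

def piae_style_loss(x, x_rec, phi_pred, sw_pred, rt_log, lam=1.0):
    """Composite self-supervised loss in the spirit of the PIAE idea:
    a data reconstruction term plus a physics-constraint term, so no
    labeled porosity/saturation is required.  Details are illustrative."""
    data_term = np.mean((x - x_rec) ** 2)
    phys_term = np.mean(archie_residual(phi_pred, sw_pred, rt_log) ** 2)
    return data_term + lam * phys_term

# Consistent toy values: phi=0.2, sw=0.5  ->  Rt = 0.05/(0.04*0.25) = 5.0
x = np.array([1.0, 2.0])
loss = piae_style_loss(x, x, phi_pred=0.2, sw_pred=0.5, rt_log=5.0)
print(loss)  # ~0 when reconstruction is exact and the physics is satisfied
```

Minimizing this loss pushes the network toward predictions that both reproduce the measured logs and satisfy the mechanism model, which is what makes training possible without labels.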
Abstract: This study examines the advent of agent interaction (AIx) as a transformative paradigm in human-computer interaction (HCI), signifying a notable evolution beyond traditional graphical interfaces and touchscreen interactions. Within the context of large models, AIx is characterized by innovative interaction patterns and a wealth of promising application scenarios. The study underscores the pivotal role of AIx in shaping the future trajectory of the large model industry, emphasizing the importance of its adoption and necessity from a user-centric perspective. The fundamental drivers of AIx include the introduction of novel capabilities, the replication of capabilities (both anthropomorphic and superhuman), the migration of capabilities, the aggregation of intelligence, and the multiplication of capabilities. These elements are essential for propelling innovation, expanding the frontiers of capability, and realizing the exponential superposition of capabilities, thereby mitigating labor redundancy and addressing a spectrum of human needs. Furthermore, this study provides an in-depth analysis of the structural components and operational mechanisms of agents supported by large models. Such advancements significantly enhance the capacity of agents to tackle complex problems and provide intelligent services, thereby facilitating more intuitive, adaptive, and personalized engagement between humans and machines. The study further delineates four principal categories of interaction patterns encompassing eight distinct modalities of interaction and twenty-one specific scenarios, including applications in smart home systems, health assistance, and elderly care. This underscores the significance of the new paradigm in advancing HCI, fostering technological advancement, and redefining user experiences. However, the study also acknowledges the challenges and ethical considerations that accompany this paradigm shift, recognizing the need for a balanced approach to harness the full potential of AIx in modern society.
Funding: Special Fund for the Teacher Development Research Program of the University of Shanghai for Science and Technology (Project No.: CFTD2025YB28).
Abstract: Against the backdrop of the national innovation strategy and the digital transformation of education, the traditional “extensive” training model for innovation and entrepreneurship talents struggles to meet students’ personalized development needs, making an urgent shift toward precision and intelligence necessary. This study constructs a data-centered, four-dimensional integrated framework, “Goal-Data-Intervention-Evaluation”, and proposes a data-driven training model for innovation and entrepreneurship talents in universities. By collecting multi-source data such as learning behaviors, competency assessments, and practical projects, the model conducts in-depth analysis of students’ individual characteristics and development potential, enabling precise decision-making in goal setting, teaching intervention, and practical guidance. Based on data analysis, a supportive system for personalized teaching and practical activities is established. Combined with process-oriented and summative evaluations, a closed-loop feedback mechanism is formed to improve training effectiveness. This model provides a theoretical framework and a practical path for the scientific, personalized, and intelligent development of innovation and entrepreneurship education in universities.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12272144).
Abstract: A data-driven model of multiple variable cutting (M-VCUT) level set-based substructures is proposed for the topology optimization of lattice structures. The M-VCUT level set method is used to represent substructures, enriching their diversity of configuration while ensuring connectivity. To construct the data-driven model of substructures, a database is prepared by sampling the space of substructures spanned by several substructure prototypes. Then, for each substructure in this database, the stiffness matrix is condensed so that its degrees of freedom are reduced. Thereafter, the data-driven model of substructures is constructed through interpolation with compactly supported radial basis functions (CS-RBF). The inputs of the data-driven model are the design variables of topology optimization, and the outputs are the condensed stiffness matrix and volume of the substructures. During the optimization, this data-driven model is used, thus avoiding the repeated static condensation that would otherwise require much computation time. Several numerical examples are provided to verify the proposed method.
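The CS-RBF interpolation at the heart of such a surrogate can be sketched in one dimension with the Wendland C2 kernel, a standard compactly supported RBF. The scalar "volume" response below is a toy stand-in for the condensed stiffness matrix and volume outputs described above.

```python
import numpy as np

def wendland_c2(r):
    """Compactly supported Wendland C2 RBF, zero for r >= 1."""
    return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

def csrbf_fit(centers, values, support=2.0):
    """Interpolate scattered samples (design variable -> response).
    A minimal 1-D sketch of a CS-RBF surrogate; names and the scalar
    response are illustrative."""
    r = np.abs(centers[:, None] - centers[None, :]) / support
    coeffs = np.linalg.solve(wendland_c2(r), values)
    def predict(x):
        rx = np.abs(np.atleast_1d(x)[:, None] - centers[None, :]) / support
        return wendland_c2(rx) @ coeffs
    return predict

# Sample the substructure design space and fit the surrogate.
xs = np.linspace(0.0, 1.0, 6)
vol = 1.0 + 0.5 * xs ** 2            # toy "volume" response
predict = csrbf_fit(xs, vol)
print(float(predict(0.5)[0]))        # smooth value between the samples
```

The compact support makes the kernel matrix sparse for large databases, which is precisely why compactly supported RBFs are attractive for surrogates queried repeatedly inside an optimization loop.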
Abstract: With the medical industry’s continuously rising requirements for the professional capabilities of nursing talents, traditional nursing teaching models can hardly meet the needs of complex nursing work in neurology. This paper focuses on nursing education for neurology nursing students and explores the construction of a “one-on-one” teaching model, aiming to achieve a paradigm shift in nursing education. By analyzing the current status of neurology nursing education, the paper identifies the problems of traditional teaching models. Drawing on the advantages of the “one-on-one” teaching model, it elaborates the construction path of this model in terms of the selection and training of teaching instructors, the design of teaching content, the innovation of teaching methods, and the improvement of the teaching evaluation system. The research shows that the “one-on-one” teaching model can significantly enhance nursing students’ mastery of professional knowledge, clinical operation skills, communication skills, and emergency response capabilities, as well as strengthen their professional identity and sense of responsibility. It provides an effective way to cultivate high-quality nursing talents who can meet the needs of neurology nursing work and promotes the innovative development of nursing education.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52375380), the National Key R&D Program of China (Grant No. 2022YFB3402200), and the Key Project of the National Natural Science Foundation of China (Grant No. 12032018).
Abstract: The outstanding comprehensive mechanical properties of newly developed hybrid lattice structures make them useful in engineering applications bearing multiple mechanical loads. Additive-manufacturing technologies make it possible to fabricate these highly spatially programmable structures and greatly enhance design freedom. However, traditional analytical methods do not sufficiently reflect the actual vibration-damping mechanism of lattice structures and are limited by their high computational cost. In this study, a hybrid lattice structure consisting of various cells was designed based on quasi-static and vibration experiments. Subsequently, a novel parametric design method based on a data-driven approach was developed for hybrid lattices with engineered properties. The response surface method was adopted to define the sensitive optimization target. A prediction model linking the lattice geometric parameters and vibration properties was established using a backpropagation neural network and then integrated into a genetic algorithm to create an optimal hybrid lattice with varying geometric features and the required wide-band vibration-damping characteristics. Validation experiments demonstrated that the optimized hybrid lattice achieves the target properties. In addition, the data-driven parametric design method reduces computation time and can be widely applied to complex structural designs when analytical and empirical solutions are unavailable.
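The surrogate-in-the-loop optimization described above can be sketched with a minimal real-coded genetic algorithm. The quadratic "surrogate" below is a hypothetical stand-in for the trained backpropagation network, and the parameter name is invented for illustration.

```python
import random

def genetic_minimize(fitness, bounds, pop_size=30, gens=60, seed=1):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation.  A sketch of how a trained
    surrogate model can be plugged into a GA."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            # Two size-3 tournaments pick the parents.
            a = min(rng.sample(pop, 3), key=fitness)
            b = min(rng.sample(pop, 3), key=fitness)
            # Blend crossover plus Gaussian mutation, clipped to bounds.
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.05 * (hi - lo))
            nxt.append(min(max(child, lo), hi))
        pop = nxt
    return min(pop, key=fitness)

# Stand-in surrogate: vibration transmissibility minimized at a
# (hypothetical) normalized strut parameter of 0.3.
surrogate = lambda x: (x - 0.3) ** 2 + 1.0
best = genetic_minimize(surrogate, (0.0, 1.0))
print(round(best, 2))
```

Because each fitness call hits the cheap surrogate rather than a finite-element solve, the GA can afford thousands of evaluations, which is the computational saving the data-driven method provides.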
Funding: Funded in part by the National Natural Science Foundation of China under Grants 62401167 and 62192712, in part by the Key Laboratory of Marine Environmental Survey Technology and Application, Ministry of Natural Resources, P.R. China under Grant MESTA-2023-B001, and in part by the Stable Supporting Fund of the National Key Laboratory of Underwater Acoustic Technology under Grant JCKYS2022604SSJS007.
Abstract: The Underwater Acoustic (UWA) channel is bandwidth-constrained and experiences doubly selective fading, so it is challenging to acquire perfect channel knowledge for Orthogonal Frequency Division Multiplexing (OFDM) communications using a finite number of pilots. Deep Learning (DL) approaches have been very successful in wireless OFDM communications, but whether they work underwater has remained an open question. For the first time, this paper compares two categories of DL-based UWA OFDM receivers: the Data-Driven (DD) method, which performs as an end-to-end black box, and the Model-Driven (MD) method, also known as the model-based data-driven method, which combines DL with expert OFDM receiver knowledge. An encoder-decoder framework with a Convolutional Neural Network (CNN) structure is employed to establish the DD receiver, while an unfolding-based Minimum Mean Square Error (MMSE) structure is adopted for the MD receiver. We analyze the characteristics of the different receivers through Monte Carlo simulations under diverse communication conditions and propose a strategy for selecting the proper receiver for different communication scenarios. Field trials in a pool and at sea verify the feasibility and advantages of the DL receivers, which outperform conventional receivers in terms of bit error rate.
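The classical module that an unfolding-based MD receiver builds on is the per-subcarrier MMSE equalizer. The sketch below shows that baseline on a synthetic QPSK/OFDM link with a known channel; the unfolded network would replace fixed quantities here with learned parameters.

```python
import numpy as np

def mmse_equalize(y, h, noise_var):
    """Per-subcarrier MMSE equalizer for OFDM with unit-power symbols:
    x_hat = conj(H) * Y / (|H|^2 + sigma^2)."""
    return np.conj(h) * y / (np.abs(h) ** 2 + noise_var)

rng = np.random.default_rng(3)
n = 64
# QPSK symbols on 64 subcarriers through a random flat-per-subcarrier channel.
bits = rng.integers(0, 2, size=(n, 2))
x = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)
h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
y = h * x + 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

x_hat = mmse_equalize(y, h, noise_var=0.005)
# Hard decisions recover the transmitted symbols on well-conditioned subcarriers.
err = np.mean(np.sign(x_hat.real) != np.sign(x.real))
print(err)
```

Unlike zero-forcing, the sigma² term in the denominator keeps the equalizer from amplifying noise on deeply faded subcarriers, and it is exactly this kind of regularizing quantity that deep unfolding typically makes trainable.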
Funding: Supported by the Shandong Provincial Taishan Scholar Program (Grant No. tsqn202312133), the Shandong Provincial Natural Science Foundation (Grant Nos. ZR2022YQ61 and ZR2023ZD32), and the National Natural Science Foundation of China (Grant Nos. 61772551 and 62111530052).
Abstract: For control systems with unknown model parameters, this paper proposes a data-driven iterative learning method for fault estimation. First, input and output data from the system under fault-free conditions are collected. By applying orthogonal triangular decomposition and singular value decomposition, a data-driven realization of the system’s kernel representation is derived; based on this representation, a residual generator is constructed. Then, the actuator fault signal is estimated online by analyzing the system’s dynamic residual, and an iterative learning algorithm is introduced to continuously optimize the residual-based performance function, thereby enhancing estimation accuracy. The proposed method achieves actuator fault estimation without requiring knowledge of model parameters, eliminating the time-consuming system modeling process and allowing operators to focus on system optimization and decision-making. Compared with existing fault estimation methods, it demonstrates superior transient performance, steady-state performance, and real-time capability, while reducing the need for manual intervention and lowering operational complexity. Finally, experimental results on a mobile robot verify the effectiveness and advantages of the method.
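The key step, recovering a kernel representation from fault-free input/output data via SVD and using it as a residual generator, can be shown on a toy FIR system. The first-order system and fault magnitude below are invented for illustration; the paper works with general state-space dynamics.

```python
import numpy as np

rng = np.random.default_rng(7)

# Unknown FIR system y_k = 0.8 u_k + 0.3 u_{k-1}; we only observe I/O data.
u = rng.normal(size=200)
y = 0.8 * u[1:] + 0.3 * u[:-1]

# Data matrix [u_k, u_{k-1}, y_k]; its null space encodes the model.
D = np.column_stack([u[1:], u[:-1], y])
_, s, Vt = np.linalg.svd(D, full_matrices=False)
parity = Vt[-1]                    # data-driven kernel representation

def residual(uk, ukm1, yk):
    """Residual generator: ~0 for fault-free data, nonzero under an
    additive actuator fault."""
    return parity @ np.array([uk, ukm1, yk])

r_ok = residual(1.0, -0.5, 0.8 * 1.0 + 0.3 * -0.5)
r_fault = residual(1.0, -0.5, 0.8 * (1.0 + 0.4) + 0.3 * -0.5)  # actuator bias 0.4
print(abs(r_ok), abs(r_fault))
```

The singular vector for the (numerically zero) smallest singular value annihilates every fault-free data tuple, so any significant residual flags a fault, and no model parameters were ever identified.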
Abstract: When assessing seismic liquefaction potential with data-driven models, addressing the uncertainties in model building, in interpreting cone penetration test (CPT) data, and in the decision threshold is crucial for avoiding biased data selection, ameliorating overconfident models, and remaining flexible to varying practical objectives, especially when the training and testing data are not identically distributed. A workflow leveraging Bayesian methodology is proposed to address these issues. Employing a Multi-Layer Perceptron (MLP) as the foundational model, the approach is benchmarked against empirical methods and advanced algorithms for its simplicity, accuracy, and resistance to overfitting. The analysis reveals that, while MLP models optimized via a maximum a posteriori algorithm suffice for straightforward scenarios, Bayesian neural networks show great potential for preventing overfitting. Additionally, integrating decision thresholds through various evaluative principles offers insights for challenging decisions. Two case studies demonstrate the framework’s capacity for nuanced interpretation of in situ data, employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics. Overall, the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation, showing improved robustness against overfitting and greater versatility in addressing practical challenges. This research contributes a structured, adaptable methodology for accurate and reliable seismic liquefaction assessment.
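The model-committee Monte Carlo idea can be sketched in a few lines. The committee members below are hypothetical threshold rules on a normalized CPT resistance, not fitted Bayesian networks; only the sampling scheme illustrates the paper's approach.

```python
import random

def committee_probability(committee, features, n_draws=2000, seed=42):
    """Monte Carlo estimate of liquefaction probability from a model
    committee: each draw picks a member at random, mimicking posterior
    sampling over model weights."""
    rng = random.Random(seed)
    hits = sum(rng.choice(committee)(features) for _ in range(n_draws))
    return hits / n_draws

# Hypothetical members: predict liquefaction when normalized cone
# resistance is low, each member drawing the line slightly differently.
committee = [lambda q, t=t: q < t for t in (0.45, 0.50, 0.55)]

p = committee_probability(committee, 0.48)
print(p)  # fraction of draws predicting liquefaction
```

The spread of member verdicts is what yields a calibrated probability instead of a single overconfident label, and the decision threshold applied to `p` can then be chosen per evaluative principle (cost-sensitive, balanced accuracy, and so on).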
Abstract: In a rapidly evolving technological landscape, state-owned enterprises (SOEs) face significant challenges in sustaining their competitiveness through efficient R&D management. Integrated Product Development (IPD), with its emphasis on cross-functional teamwork, concurrent engineering, and data-driven decision-making, has been widely recognized for enhancing R&D efficiency and product quality. However, the unique characteristics of SOEs pose challenges to effective IPD implementation. The advancement of big data and artificial intelligence technologies offers new opportunities for optimizing IPD R&D management through data-driven decision-making models. This paper constructs and validates a data-driven decision-making model tailored to the IPD R&D management of SOEs. By integrating data mining, machine learning, and other advanced analytical techniques, the model serves as a scientific and efficient decision-making tool. It helps SOEs optimize R&D resource allocation, shorten product development cycles, reduce R&D costs, and improve product quality and innovation. Moreover, the study deepens the theoretical understanding of the value of data-driven decision-making in the context of IPD.
Funding: Supported by the 2025 General Project of Humanities and Social Sciences Research in Henan Higher Education Institutions, “Research on the Dynamic Mechanisms and Paths of Innovative Development of Undergraduate Translation Programs Empowered by New Productive Forces” (Project No.: 2025-ZDJH-885); the 2024 College-Level Undergraduate Teaching Reform Project of the School of Foreign Languages, Henan University of Technology, “Research on Implementation Paths of New Models for Interpreter Training Based on AI Large Models” (Project No.: 2024YJWYJG06); the 2025 First-Class Undergraduate Program Construction Special Project of the School of Foreign Languages, Henan University of Technology, “Research on Development Paths for Innovative Development of Undergraduate Translation Programs Empowered by New Productive Forces” (Project No.: 2025WYZYJS30); and the 2025 Educational Reform Project of the School of International Education, Henan University of Technology, “A Study on the Language Competence Development Model for International Talents Based on the AI Large Model: Taking IELTS Reading and Writing Teaching Practice as an Example” (Project No.: GJXY202533).
Abstract: This paper explores the paradigm reconstruction of interpreting pedagogy driven by generative AI technology. With the breakthroughs of AI technologies such as ChatGPT in natural language processing, traditional interpreting education faces the dual challenges of technological substitution and pedagogical transformation. Based on Kuhn’s paradigm theory, the study analyzes the limitations of three traditional interpreting teaching paradigms (language-centric, knowledge-based, and skill-acquisition-oriented) and proposes a novel “teacher-AI-learner” triadic collaborative paradigm. Through the reconstruction of teaching subjects, environments, and curriculum systems, the integration of real-time translation tools and intelligent terminology databases facilitates the transition from static skill training to dynamic human-machine collaboration. The research also highlights challenges in technological ethics and pressures for curriculum design transformation, emphasizing the necessity of balancing technological empowerment with humanistic education.
Funding: Funded by the Key Research and Development Program of Shaanxi, China (No. 2024GX-YBXM-503) and the National Natural Science Foundation of China (No. 51974254).
Abstract: Hydraulic fracturing technology has achieved remarkable results in improving the production of tight gas reservoirs, but its effectiveness is governed by the joint action of multiple, complex factors. Traditional analysis methods have limitations in dealing with these complex, interrelated factors and can hardly reveal the actual contribution of each factor to production. Machine learning-based methods explore the complex mapping relationships within large amounts of data to provide data-driven insights into the key factors driving production. In this study, a data-driven PCA-RF-VIM (Principal Component Analysis-Random Forest-Variable Importance Measures) approach to feature importance analysis is proposed to identify the key factors driving post-fracturing production. Four types of parameters, including log parameters, geological and reservoir physical parameters, hydraulic fracturing design parameters, and reservoir stimulation parameters, were input into the PCA-RF-VIM model. The model was trained using 6-fold cross-validation and grid search, and the relative importance ranking of each factor was obtained. To verify the validity of the PCA-RF-VIM model, a consolidated model combining three other independent data-driven methods (the Pearson correlation coefficient, the RF feature importance analysis method, and the XGBoost feature importance analysis method) was used for comparison. The comparison shows that the two models contain almost the same parameters in the top ten, with only a minor difference in one parameter. In combination with the reservoir characteristics, the reasonableness of the PCA-RF-VIM model is verified, and its importance ranking of the parameters is more consistent with the reservoir characteristics of the study area. Ultimately, ten parameters are selected as the controlling factors with the potential to influence post-fracturing gas production, as the combined importance of these top ten parameters in driving natural gas production is 91.95%. Identifying these ten controlling factors provides engineers with new insight into reservoir selection for fracturing stimulation and fracturing parameter optimization to improve fracturing efficiency and productivity.
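The PCA stage of such a pipeline, decorrelating and compressing correlated input parameters before a tree model ranks them, can be sketched with an SVD on standardized features. The toy "well logs" below are synthetic; real pipelines would feed the scores into a random forest.

```python
import numpy as np

def pca(X, k):
    """PCA via SVD of the standardized feature matrix: returns the
    first k component scores and their explained-variance ratios."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    return Z @ Vt[:k].T, explained[:k]

rng = np.random.default_rng(5)
# Toy well data: three raw logs, two of them strongly correlated.
g = rng.normal(size=(100, 1))
X = np.hstack([g, g + 0.01 * rng.normal(size=(100, 1)),
               rng.normal(size=(100, 1))])

scores, explained = pca(X, 2)
print(np.round(explained, 2))  # the shared signal dominates component 1
```

With two of three standardized columns nearly identical, the first component captures roughly two-thirds of the variance, which is why PCA helps prevent a downstream importance measure from splitting credit arbitrarily between correlated inputs.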
Funding: Funded by the General Program of the National Natural Science Foundation of China, Grant Number 62277022.
Abstract: The integration of artificial intelligence (AI) is fundamentally reshaping scientific research, giving rise to a new era of discovery and innovation. This paper explores this transformative shift, introducing the concept of the “AI-Driven Research Ecosystem”, a dynamic and collaborative research environment. Within this ecosystem, we focus on the unification of human-AI collaboration models and the emerging new paradigms of research thinking. We analyze the multifaceted roles of AI within the research lifecycle, spanning from a passive tool to an active assistant and an autonomous participant, and categorize these interactions into distinct human-AI collaboration models. Furthermore, we examine how the pervasive involvement of AI necessitates an evolution in human research thinking, emphasizing the significant roles of critical, creative, and computational thinking. Through a review of existing literature and illustrative case studies, this paper provides a comprehensive overview of the AI-driven research ecosystem, highlighting its potential for transforming scientific research. Our findings advance the current understanding of AI’s multiple roles in research and underscore its capacity to revolutionize both knowledge discovery and collaborative innovation, paving the way for a more integrated and impactful research paradigm.
Funding: National Natural Science Foundation of China (Project No.: 12371428) and Projects of the Provincial College Students’ Innovation and Training Program in 2024 (Project Nos.: S202413023106 and S202413023110).
Abstract: This paper focuses on the numerical solution of a tumor growth model under a data-driven approach. Based on the inherent patterns of the data and reasonable assumptions, an ordinary differential equation model for tumor growth is established. Nonlinear fitting is employed to obtain the optimal parameter estimates of the mathematical model, and the numerical solution is carried out in MATLAB. The comparison of clinical data with the simulation results shows good agreement, which verifies the rationality and feasibility of the model.
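A common choice for such a tumor-growth ODE is the logistic model, which also has a closed form against which a numerical scheme can be checked. The growth rate and carrying capacity below are illustrative values, not the paper's fitted parameters, and the sketch uses Python rather than MATLAB.

```python
import math

def logistic_euler(v0, r, K, t_end, dt=0.01):
    """Forward-Euler integration of the logistic tumor-growth ODE
    dV/dt = r V (1 - V/K)."""
    v, t = v0, 0.0
    while t < t_end - 1e-12:
        v += dt * r * v * (1.0 - v / K)
        t += dt
    return v

def logistic_exact(v0, r, K, t):
    """Closed-form logistic solution, used to validate the scheme."""
    return K / (1.0 + (K / v0 - 1.0) * math.exp(-r * t))

# Illustrative parameters: initial volume 0.1, rate 0.5, capacity 10.
v_num = logistic_euler(v0=0.1, r=0.5, K=10.0, t_end=8.0)
v_ref = logistic_exact(0.1, 0.5, 10.0, 8.0)
print(round(v_num, 3), round(v_ref, 3))
```

In a data-driven fit, `r` and `K` would be the quantities estimated by nonlinear least squares against the clinical volume measurements, with the integrator above (or a higher-order one) inside the objective function.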
Funding: Supported by the Strategic Research and Consulting Project of the Chinese Academy of Engineering [grant number 2024-XBZD-14], the National Natural Science Foundation of China [grant numbers 42192553 and 41922036], and the Fundamental Research Funds for the Central Universities – Cemac “GeoX” Interdisciplinary Program [grant number 020714380207].
Abstract: The impacts of lateral boundary conditions (LBCs) provided by numerical models and by data-driven networks on convective-scale ensemble forecasts are investigated in this study. Four experiments are conducted on the Hangzhou RDP (19th Hangzhou Asian Games Research Development Project on Convective-scale Ensemble Prediction and Application) testbed, with the LBCs respectively sourced from National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) forecasts with 33 vertical levels (Exp_GFS), Pangu forecasts with 13 vertical levels (Exp_Pangu), Fuxi forecasts with 13 vertical levels (Exp_Fuxi), and NCEP GFS forecasts with the vertical levels reduced to 13, the same as those of Exp_Pangu and Exp_Fuxi (Exp_GFSRDV). In general, Exp_Pangu performs comparably to Exp_GFS, while Exp_Fuxi shows slightly inferior performance to Exp_Pangu, possibly because of its less accurate large-scale predictions. The ability of data-driven networks to efficiently provide LBCs for convective-scale ensemble forecasts is thereby demonstrated. Moreover, Exp_GFSRDV produces the worst convective-scale forecasts among the four experiments, which indicates the potential for improving data-driven LBCs by increasing the vertical levels of the networks. However, the ensemble spread of the four experiments barely increases with lead time, so each experiment has insufficient spread to represent realistic forecast uncertainties; this will be investigated in a future study.
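The under-dispersion diagnosis above rests on a simple quantity: ensemble spread, the standard deviation across members at each lead time. The toy ensemble below is synthetic, with member perturbations deliberately held constant across lead times to mimic the flat-spread behavior reported.

```python
import numpy as np

def ensemble_spread(forecasts):
    """Spread (sample std across members) at each lead time,
    for a (members x lead_times) forecast array."""
    return forecasts.std(axis=0, ddof=1)

rng = np.random.default_rng(11)
n_members, n_leads = 10, 5
# Toy temperature ensemble whose perturbations do not grow with lead time.
forecasts = 280.0 + rng.normal(scale=0.5, size=(n_members, n_leads))

spread = ensemble_spread(forecasts)
print(np.round(spread, 2))  # roughly flat across lead times
```

In a well-calibrated ensemble this curve should grow with lead time alongside the forecast error; a flat curve, as in the experiments above, signals that the ensemble underestimates its own uncertainty.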
Funding: Supported by the Wuxi Taihu Talent Project, No. WXTTP 2021, and the General Scientific Research Program of Wuxi Municipal Health Commission, No. M202447.
Abstract: BACKGROUND: Research has consistently demonstrated that patients with major depressive disorder (MDD) exhibit attentional switching dysfunction, and the dual-task paradigm has emerged as a valuable tool for probing cognitive deficits. However, the neuroelectrophysiological mechanism underlying this deficit has not been clarified. AIM: To investigate the event-related potential (ERP) characteristics of attentional switching dysfunction and further explore the neuroelectrophysiological mechanism of the cognitive processing deficits underlying attentional switching dysfunction in MDD. METHODS: The participants included 29 MDD patients and 29 healthy controls (HCs). The participants’ ERPs were measured while they performed the dual-task paradigm. The behavioral data and the ERP N100, P200, P300, and late positive potential (LPP) data were analyzed. RESULTS: This study revealed greater accuracy in HCs and slower reaction times (RTs) in MDD patients. Angry facial pictures led to lower accuracy. The results also revealed shorter RTs for happy facial pictures and the longest RTs for the 500-ms stimulus onset asynchrony. With respect to ERP characteristics, happy and neutral facial pictures evoked higher amplitudes. The N100, P200, P300, and LPP amplitudes were highest at Pz. MDD patients had lower P200 mean amplitudes and LPP amplitudes than HCs. CONCLUSION: MDD patients exhibited abnormal ERP characteristics evoked by the dual-task paradigm, which could be the neural correlates of the known abnormalities in attentional switching in MDD. These results provide valuable insights into the neural mechanisms of attentional switching and may guide targeted interventions for patients with MDD.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 52131102.
Abstract: With the rapid advancement of machine learning technology and its growing adoption in research and engineering applications, an increasing number of studies have embraced data-driven approaches for modeling wind turbine wakes. These models can capture the complex, high-dimensional characteristics of wind turbine wakes while offering significantly greater prediction efficiency than physics-driven models. As a result, data-driven wind turbine wake models are regarded as powerful and effective tools for predicting wake behavior and turbine power output. This paper provides a concise yet comprehensive review of existing studies on data-driven wind turbine wake modeling. It begins by defining and classifying machine learning methods to facilitate a clearer understanding of the reviewed literature. The related studies are then categorized into four key areas: wind turbine power prediction, data-driven analytic wake models, wake field reconstruction, and the incorporation of explicit physical constraints. Because the accuracy of data-driven models is governed by two primary factors, the quality of the training data and the performance of the model itself, both data accuracy and model structure are discussed in detail within the review.
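The physics-driven baseline that data-driven wake models are usually compared against is the classical Jensen (top-hat) wake model, which can be stated in a few lines. The thrust coefficient, wake-decay constant, and rotor diameter below are illustrative values.

```python
import numpy as np

def jensen_deficit(ct, k, x, d):
    """Jensen (top-hat) wake model: fractional velocity deficit at
    downstream distance x behind a rotor of diameter d, with thrust
    coefficient ct and wake-decay constant k."""
    return (1.0 - np.sqrt(1.0 - ct)) / (1.0 + 2.0 * k * x / d) ** 2

# Deficit decays with distance: 3D..10D downstream, Ct = 0.8, k = 0.05,
# rotor diameter 126 m (illustrative, roughly a 5-MW class turbine).
x = np.array([3.0, 5.0, 7.0, 10.0]) * 126.0
deficit = jensen_deficit(ct=0.8, k=0.05, x=x, d=126.0)
print(np.round(deficit, 3))
```

Data-driven analytic wake models in the reviewed literature typically learn corrections to, or replacements for, exactly this kind of one-line deficit formula while retaining its fast evaluation.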