This paper focuses on the numerical solution of a tumor growth model under a data-driven approach. Based on the inherent laws of the data and reasonable assumptions, an ordinary differential equation model for tumor growth is established. Nonlinear fitting is employed to obtain optimal parameter estimates for the model, and the numerical solution is computed in MATLAB. The simulation results agree well with the clinical data, which verifies the rationality and feasibility of the model.
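The abstract above combines an ODE growth model with nonlinear parameter fitting. As a minimal sketch of that workflow (in Python rather than the paper's MATLAB, with a logistic growth law and a grid search standing in for the paper's nonlinear fitting; the parameter values and "clinical" data below are synthetic, not from the paper):

```python
def logistic_growth(r, K, v0, dt, steps):
    """Forward-Euler solution of the logistic ODE dV/dt = r*V*(1 - V/K)."""
    v = v0
    traj = [v]
    for _ in range(steps):
        v += dt * r * v * (1 - v / K)
        traj.append(v)
    return traj

def fit_by_grid_search(observed, v0, dt):
    """Pick (r, K) minimizing squared error against observed volumes."""
    best = None
    for r in [x / 100 for x in range(5, 100, 5)]:
        for K in range(50, 201, 10):
            sim = logistic_growth(r, K, v0, dt, len(observed) - 1)
            sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
            if best is None or sse < best[0]:
                best = (sse, r, K)
    return best[1], best[2]

# Synthetic "clinical" data generated from r=0.3, K=100 for illustration.
truth = logistic_growth(0.3, 100, 5.0, 1.0, 20)
r_hat, K_hat = fit_by_grid_search(truth, 5.0, 1.0)
```

In practice a proper nonlinear least-squares routine would replace the grid search, but the structure (simulate the ODE, score against data, pick the best parameters) is the same.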
Permanent magnet synchronous motors (PMSMs) are widely used in alternating current servo systems because they provide high efficiency, high power density, and a wide speed regulation range, while servo systems are placing ever higher demands on control performance. The model predictive control (MPC) algorithm is emerging as a potential high-performance motor control algorithm due to its capability of handling multiple-input multiple-output variables and imposed constraints. When MPC is used in PMSM control, nonlinear disturbances caused by changes in electromagnetic parameters or load disturbances may lead to a mismatch between the nominal model and the controlled object, which causes prediction errors and thus affects the dynamic stability of the control system. This paper proposes a data-driven MPC strategy in which historical data over an appropriate range are utilized to eliminate the impact of parameter mismatch and further improve control performance. The stability of the proposed algorithm is proved, and simulations demonstrate its feasibility. Its superiority over the classical MPC strategy has also been verified.
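To illustrate the receding-horizon idea behind MPC (not the paper's data-driven PMSM controller), here is a toy sketch for a scalar plant x⁺ = a·x + b·u with an input constraint, solved by brute-force enumeration of candidate input sequences; the plant coefficients and cost weights are arbitrary choices for demonstration:

```python
import itertools

def mpc_step(x, a, b, ref, horizon=3, u_grid=None):
    """Brute-force finite-horizon MPC for x+ = a*x + b*u.
    Returns the first input of the best sequence (receding horizon)."""
    if u_grid is None:
        u_grid = [u / 10 for u in range(-10, 11)]  # enforces |u| <= 1
    best_u, best_cost = None, float("inf")
    for seq in itertools.product(u_grid, repeat=horizon):
        xx, cost = x, 0.0
        for u in seq:
            xx = a * xx + b * u
            cost += (xx - ref) ** 2 + 0.01 * u ** 2  # tracking + effort
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Drive the state toward ref = 0.5 from x = 0, reapplying only the
# first input at every step (the defining feature of MPC).
x, traj = 0.0, []
for _ in range(15):
    u = mpc_step(x, a=0.9, b=0.5, ref=0.5)
    x = 0.9 * x + 0.5 * u
    traj.append(x)
```

Real MPC replaces the enumeration with a quadratic program, and the data-driven variant in the paper additionally corrects the prediction model from historical data.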
This paper aims to develop machine learning algorithms to classify electronic articles related to COVID-19 through information retrieval and topic modelling. The methodology of this study comprises three phases: the Text Classification Approach (TCA), the Proposed Algorithms Interpretation (PAI), and finally the Information Retrieval Approach (IRA). The TCA covers the text preprocessing pipeline that produces a clean corpus. The Global Vectors for Word Representation (GloVe) pre-trained model, FastText, Term Frequency-Inverse Document Frequency (TF-IDF), and Bag-of-Words (BOW) feature extraction methods are examined in this research. The PAI applies Bidirectional Long Short-Term Memory (Bi-LSTM) and Convolutional Neural Network (CNN) models to classify COVID-19 news. The IRA then presents the mathematical interpretation of Latent Dirichlet Allocation (LDA), used for topic modelling in information retrieval (IR). In this study, 99% accuracy was obtained by performing K-fold cross-validation on Bi-LSTM with GloVe. A comparative analysis between deep learning and machine learning methods, covering feature extraction and computational complexity, has also been performed, and text analyses of the most influential aspects of each document are explored. We utilized Bidirectional Encoder Representations from Transformers (BERT) as a deep learning mechanism in our model training, but its results were not satisfactory. The proposed system can, however, be adapted to real-time classification of COVID-19 news.
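Of the feature extractors the abstract lists, TF-IDF is the easiest to show compactly. A minimal pure-Python sketch (smoothed IDF; the tiny corpus is invented for illustration and has nothing to do with the paper's dataset):

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TF-IDF vectors (tf * (log(N/df) + 1)) for token lists."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency counts each doc once
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * idf[t] for t, c in tf.items()})
    return vectors

corpus = [
    "covid cases rise in the city".split(),
    "vaccine rollout slows covid spread".split(),
    "stock market closes higher today".split(),
]
vecs = tfidf(corpus)
```

A term that appears in many documents ("covid", in two of three) gets a lower weight than an equally frequent term confined to one document ("market"), which is exactly the discriminative property the classifier exploits.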
Due to signal reflection and diffraction, site-specific unmodeled errors such as multipath effects and Non-Line-of-Sight reception are significant error sources in Global Navigation Satellite Systems, since they cannot be easily mitigated. However, how to characterize and model the internal mechanisms and external influences of these site-specific unmodeled errors remains to be investigated. Therefore, we propose a method for characterizing and modeling site-specific unmodeled errors under reflection and diffraction using a data-driven approach. Specifically, we first consider all the popular potential features that generate site-specific unmodeled errors. We then use random forest regression to comprehensively analyze the correlations between the site-specific unmodeled errors and the potential features, and finally characterize and model the errors. Two seven-day consecutive datasets, dominated by signal reflection and diffraction respectively, were collected. The results show significant differences in the correlations with potential features; they are highly related to the application scenarios, observation types, and satellite types. Notably, the innovation vector often shows a strong correlation with the code site-specific unmodeled errors. The phase site-specific unmodeled errors have high correlations with elevation, azimuth, number of visible satellites, and between-frequency differenced phase observations. In the reflection and diffraction environments, the sum of the correlations of the top six potential features reaches approximately 88.5% and 87.7%, respectively. Meanwhile, these correlations are stable across different observation types and satellite types. By integrating a transformer model with the random forest method, a high-precision unmodeled-error prediction model is established, demonstrating the necessity of including multiple features for accurate and efficient characterization and modeling of site-specific unmodeled errors.
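The core of the analysis above is scoring how strongly candidate features relate to the unmodeled errors. As a much simpler stand-in for the paper's random-forest importance analysis, the sketch below computes a Pearson correlation between a synthetic elevation feature and a toy multipath-like error (the 1/elevation error model and all numbers are invented for illustration only):

```python
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
elevation = [random.uniform(5, 90) for _ in range(500)]
# Toy model: low-elevation signals suffer larger multipath-like error.
error = [1.0 / e + random.gauss(0, 0.002) for e in elevation]
r = pearson(elevation, error)
```

Unlike this linear correlation, random forest regression can also rank features with nonlinear or interacting effects, which is why the paper uses it.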
Manufacturers are striving to achieve higher energy efficiency without compromising production performance and quality standards. Parallel-serial structures, commonly found in modern production systems, offer a unique balance of flexibility and efficiency by combining parallel processes with sequential workflows. However, their inherent complexity poses significant challenges, particularly in optimizing energy efficiency and ensuring consistent product quality. In data-driven manufacturing environments, it is not clear how to leverage production data to enhance the energy efficiency of production systems. This paper therefore studies a data-driven approach to improving energy efficiency in parallel-serial production lines with product quality issues. First, the authors developed a data-driven performance analysis method to evaluate the effects of disruption events, such as energy-saving control actions, machine breakdowns, and product quality failures, on system throughput and energy consumption. Second, a periodic energy-saving control method was developed to enhance system energy efficiency using a non-linear programming model. To reduce complexity and improve computational efficiency, the model was simplified by leveraging the intrinsic properties of parallel-serial production lines and solved using an adaptive genetic algorithm. Finally, the effectiveness of the proposed data-driven approach was validated through case studies, providing actionable insights into data-driven energy efficiency optimization in complex production systems.
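The abstract's solver is an adaptive genetic algorithm. A minimal real-coded GA sketch follows, with a mutation width that shrinks over generations as a crude stand-in for the paper's (unspecified) adaptive operator; the one-dimensional "energy-cost" objective is purely illustrative:

```python
import random

def adaptive_ga(fitness, bounds, pop_size=40, gens=60, seed=1):
    """Minimal real-coded GA minimizing `fitness`; elitist selection,
    midpoint crossover, and a generation-decaying Gaussian mutation."""
    rnd = random.Random(seed)
    lo, hi = bounds
    pop = [rnd.uniform(lo, hi) for _ in range(pop_size)]
    for g in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # keep the better half
        sigma = (hi - lo) * 0.1 * (1 - g / gens)  # adaptive mutation width
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rnd.sample(parents, 2)
            child = (a + b) / 2 + rnd.gauss(0, sigma)
            children.append(min(hi, max(lo, child)))
        pop = parents + children
    return min(pop, key=fitness)

# Toy convex energy-cost curve with a minimum at x = 3.
best = adaptive_ga(lambda x: (x - 3) ** 2 + 2, bounds=(-10, 10))
```

The real problem in the paper is a constrained non-linear program over control parameters of a production line; the GA skeleton (select, recombine, mutate, repeat) is the same, only the chromosome and fitness differ.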
Building integrated energy systems (BIESs) account for a significant proportion of global energy consumption and are therefore pivotal for enhancing energy efficiency. Two key barriers that reduce BIES operational efficiency are the uncertainty of renewable generation and the operational non-convexity of combined heat and power (CHP) units. To this end, this paper proposes a soft actor-critic (SAC) algorithm to solve the BIES scheduling problem, which overcomes the model non-convexity and shows advantages in robustness and generalization. The paper also adopts a temporal fusion transformer (TFT) to enhance the SAC solution by forecasting renewable generation and energy demand. The TFT can effectively capture complex temporal patterns and dependencies spanning multiple steps. Furthermore, its forecasts are interpretable thanks to a self-attention layer, supporting more trustworthy decision-making in the SAC algorithm. The proposed hybrid data-driven approach integrating the TFT and SAC algorithms, i.e., the TFT-SAC approach, is trained and tested on a real-world dataset to validate its superior performance in reducing energy cost and computational time compared with benchmark approaches. The generalization of the scheduling policy and a sensitivity analysis are examined in the case studies.
To address the complex coupling between aerodynamic characteristics and guidance control in morphing flight missiles, this study proposes a data-driven approach to integrated adaptive morphing and guidance. First, an aerodynamic surrogate model is constructed using a fully connected neural network (FCNN), mapping configuration parameters to aerodynamic parameters. Second, an adaptive physical parameters optimization network (PPON) is developed to optimize aerodynamic characteristics based on predictions from the surrogate model. Third, an integrated morphing and guidance model is derived by applying the proximal policy optimization (PPO) algorithm from deep reinforcement learning (DRL), embedded with the adaptive aerodynamic optimization model. Finally, the proposed integrated approach is applied to the guidance task of a morphing cruise missile with variable camber wings. Simulation results demonstrate that the integrated guidance model significantly enhances aerodynamic performance and generates more continuous guidance commands within approximately 4.3 s, outperforming the deep Q-network (DQN) algorithm under morphing flight conditions. Moreover, compared to PPO- and DQN-based guidance laws without morphing flight conditions, the integrated model improves both guidance accuracy and terminal kinetic energy. Furthermore, the integrated guidance model, trained on stationary targets, remains effective for engaging moving and maneuvering targets, showcasing its robust generalization capability.
BACKGROUND Due to the increasing rate of thyroid nodule diagnosis and the desire to avoid an unsightly cervical scar, remote thyroidectomies were invented and are increasingly performed. The transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach (TAA) are the two most commonly used remote approaches. No previous meta-analysis has compared postoperative infections and swallowing difficulties between the two procedures. AIM To compare these outcomes among patients undergoing lobectomy for unilateral thyroid carcinoma or a benign thyroid nodule. METHODS We searched PubMed MEDLINE, Google Scholar, and the Cochrane Library from the date of the first published article up to August 2025. The terms used were transoral thyroidectomy vestibular approach, trans-areolar thyroidectomy, scarless thyroidectomy, remote thyroidectomy, infections, postoperative, inflammation, dysphagia, and swallowing difficulties. We identified 130 studies, of which 30 full texts were screened and only six studies were included in the final meta-analysis. RESULTS Postoperative infections did not differ between the two approaches (odds ratio = 1.33, 95% confidence interval: 0.50-3.53; χ² = 1.92; P-value for overall effect = 0.57). Similarly, transient swallowing difficulty did not differ between the two forms of surgery (odds ratio = 0.91, 95% confidence interval: 0.35-2.40; χ² = 1.32; P-value for overall effect = 0.85). CONCLUSION No statistically significant differences were evident between the trans-oral endoscopic thyroidectomy vestibular approach and the trans-areolar approach regarding postoperative infection and transient swallowing difficulties. Further, longer randomized trials are needed. [Mirghani H. Infections and swallowing difficulty in scarless thyroidectomy. WJCC, https://www.wjgnet.com, January 6, 2026, Volume 14, Issue 1]
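The odds ratios and confidence intervals reported above come from standard 2×2-table arithmetic. A sketch of that computation (log-OR with a normal approximation; the counts below are hypothetical and are not taken from the meta-analysis):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI for a 2x2 table:
    a = events in group 1, b = non-events in group 1,
    c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf formula.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 4/100 infections vs 3/100 infections.
or_, lo, hi = odds_ratio_ci(4, 96, 3, 97)
```

When the interval straddles 1 (as here), the difference between groups is not statistically significant, which mirrors the paper's conclusions for both outcomes.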
This study integrates multiple sources of data (transaction data, policy texts, public opinion data) with visualization techniques (such as heat maps, time-series trend charts, and 3D building brochures) to construct an analysis framework for the Chengdu real estate market. By using an Adaptive Neuro-Fuzzy Inference System (ANFIS) prediction model, spatial GIS (Geographic Information System) analysis, and interactive dashboards, this study reveals market differentiation, policy impacts, and changes in demand structure, thereby providing decision support for the government, enterprises, and homebuyers.
To address instability or even imbalance in the orientation and attitude control of quadrotor unmanned aerial vehicles (QUAVs) under random disturbances, this paper proposes a distributed anti-disturbance data-driven event-triggered fusion control method, which achieves efficient fault diagnosis while suppressing random disturbances and mitigating communication conflicts within the QUAV swarm. First, the impact of random disturbances on the UAV swarm is analyzed, a model for QUAV orientation and attitude control under stochastic perturbations is established, and the disturbance gain threshold is determined. Second, a fault diagnosis system based on a high-gain observer is designed, constructing a fault gain criterion by integrating orientation and attitude information from the QUAVs. Subsequently, a model-free dynamic linearization-based data modeling (MFDLDM) framework is developed using model-free adaptive control, which efficiently fits the nonlinear control model of the QUAV swarm while reducing temporal constraints on control data. On this basis, a distributed data-driven event-triggered controller is constructed based on a staggered communication mechanism; it consists of an equivalent QUAV controller and an event-triggered controller, and is able to reduce communication conflicts while suppressing the influence of random interference. Finally, with random disturbances incorporated into the controller, comparative experiments and physical validations are conducted on QUAV platforms, fully demonstrating the strong adaptability and robustness of the proposed distributed event-triggered fault-tolerant control system.
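The model-free adaptive control underlying the MFDLDM framework is usually presented in its compact-form dynamic linearization (CFDL) version: estimate a scalar pseudo-partial-derivative from input/output increments, then use it in an integral-like control law. A generic single-loop sketch (a toy scalar plant stands in for an attitude channel; gains, plant, and setpoint are illustrative, not the paper's swarm design):

```python
import math

def mfac_track(plant, ref, steps=200, eta=0.5, mu=1.0, rho=0.6, lam=1.0):
    """Compact-form dynamic linearization MFAC.
    phi is the pseudo-partial-derivative estimate; eta/mu tune its
    update, rho/lam tune the control law."""
    y_prev, u_prev, du_prev, phi = 0.0, 0.0, 0.0, 1.0
    y = plant(0.0, 0.0)
    for _ in range(steps):
        dy = y - y_prev
        # Update phi from the latest I/O increment pair.
        if abs(du_prev) > 1e-8:
            phi += eta * du_prev / (mu + du_prev ** 2) * (dy - phi * du_prev)
        # Control update driven by the tracking error.
        u = u_prev + rho * phi / (lam + phi ** 2) * (ref - y)
        y_prev, du_prev, u_prev = y, u - u_prev, u
        y = plant(y, u)
    return y

# Unknown-to-the-controller nonlinear plant, used only through its I/O data.
out = mfac_track(lambda y, u: 0.6 * y + 0.8 * u + 0.05 * math.sin(y), ref=1.0)
```

The point of the scheme, and the reason the paper builds on it, is that the controller never uses the plant equation itself, only measured input/output data.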
Wetting deformation in earth-rockfill dams is a critical factor influencing dam safety. Although numerous mathematical models have been developed to describe this phenomenon, most rely on empirical formulations and lack prior knowledge of model parameters, which is essential for Bayesian parameter inversion to enhance accuracy and reduce uncertainty. This study introduces a data-driven approach to establishing prior knowledge for earth-rockfill dams. Driving factors are used to determine the potential range of model parameters, and settlement changes within this range are calculated. The results are iteratively compared with actual monitoring data until the calculated range encompasses the observed data, thereby providing prior knowledge of the model parameters. The proposed method is applied to the right-bank earth-rockfill dam of Danjiangkou. Employing a Gibbs sample size of 30,000, the method effectively calibrates the prior knowledge of the wetting model parameters, achieving a root mean square error (RMSE) of 5.18 mm for the settlement predictions. By comparison, non-informative priors with sample sizes of 30,000 and 50,000 yield significantly larger RMSE values of 11.97 mm and 16.07 mm, respectively. Furthermore, the computational efficiency of the proposed method is demonstrated by an inversion computation time of 902 s for 30,000 samples, notably shorter than the 1026 s and 1558 s required for non-informative priors with 30,000 and 50,000 samples, respectively. These findings underscore the superior performance of the proposed approach in terms of both prediction accuracy and computational efficiency, enabling optimal parameter identification with reduced computational effort and providing a robust and efficient framework for advancing dam safety assessments.
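The Bayesian inversion the abstract describes amounts to sampling a posterior over model parameters given monitoring data and a prior. A minimal sketch with a Metropolis sampler in place of the paper's Gibbs sampler (one scalar parameter, synthetic "settlement" data, and an invented informative prior standing in for the driving-factor-derived prior knowledge):

```python
import math
import random

def metropolis(logpost, x0, steps=20000, prop_sd=0.5, seed=2):
    """Random-walk Metropolis sampler; returns the second half of the chain."""
    rnd = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(steps):
        cand = x + rnd.gauss(0, prop_sd)
        lp_cand = logpost(cand)
        if math.log(rnd.random()) < lp_cand - lp:  # accept/reject step
            x, lp = cand, lp_cand
        samples.append(x)
    return samples[steps // 2:]  # discard burn-in

# Synthetic settlement-like observations around a true parameter of 2.0.
rnd = random.Random(0)
data = [2.0 + rnd.gauss(0, 0.3) for _ in range(50)]

def logpost(theta):
    # Informative prior N(2.2, 0.5) standing in for data-driven prior knowledge.
    lp = -0.5 * ((theta - 2.2) / 0.5) ** 2
    lp += sum(-0.5 * ((d - theta) / 0.3) ** 2 for d in data)
    return lp

post = metropolis(logpost, x0=0.0)
mean = sum(post) / len(post)
```

With an informative prior the chain concentrates quickly near the true value, which is the mechanism behind the accuracy and runtime gains the paper reports over non-informative priors.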
Photoacoustic imaging of lipids is intrinsically constrained by the feeble nature of endogenous lipid signals, posing a persistent sensitivity challenge that demands innovative solutions. Although adopting high-efficiency excitation and detection elements may improve imaging sensitivity to a certain extent, their application is inevitably subject to various practical limitations, particularly during in vivo and endoscopic imaging. In this study, we propose a multi-combinatorial approach to enhance the sensitivity of lipid photoacoustic imaging. The approach involves wavelet transform processing of one-dimensional A-line signals, gradient-based denoising of two-dimensional B-scan images, and finally three-dimensional spatial weighted averaging of the data processed by the previous two steps. This method not only improves the signal-to-noise ratio (SNR) in distinguished feature regions of the image by around 10 dB, but also efficiently extracts weak signals with no distinct features in the original image. After processing with this method, images acquired in a single scan were compared with those obtained from multiple scans. The results showed highly consistent image features, with the structural similarity index increasing from 0.2 to 0.8, confirming the accuracy and reliability of the multi-combinatorial approach.
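The first stage above, wavelet-domain processing of 1-D A-line signals, can be illustrated with a one-level Haar transform and soft thresholding (the paper does not specify its wavelet or threshold; the sine "A-line", noise level, and threshold below are invented for demonstration):

```python
import math
import random

def haar_denoise(signal, thresh):
    """One-level Haar wavelet soft-threshold denoising (even-length input)."""
    s2 = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    # Soft-threshold the detail band, where white noise concentrates.
    soft = [math.copysign(max(abs(d) - thresh, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, soft):          # inverse transform
        out += [(a + d) / s2, (a - d) / s2]
    return out

rnd = random.Random(3)
clean = [math.sin(2 * math.pi * i / 64) for i in range(256)]
noisy = [c + rnd.gauss(0, 0.2) for c in clean]
den = haar_denoise(noisy, thresh=0.3)

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
```

Because a slowly varying signal contributes little to the detail band while white noise splits evenly between bands, thresholding the details removes noise with little signal distortion; multi-level transforms with data-driven thresholds extend the same idea.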
With the rapid development of intelligent navigation technology, efficient and safe path planning for mobile robots has become a core requirement. To address the challenges of complex dynamic environments, this paper proposes an intelligent path planning framework based on grid map modeling. First, an improved Safe and Smooth A* (SSA*) algorithm is employed for global path planning. By incorporating obstacle expansion and corner-point optimization, the proposed SSA* enhances the safety and smoothness of the planned path. Then, a Partitioned Dynamic Window Approach (PDWA) is integrated for local planning; it is triggered when dynamic or sudden static obstacles appear, enabling real-time obstacle avoidance and path adjustment. A unified objective function is constructed that comprehensively considers path length, safety, and smoothness. Multiple simulation experiments are conducted on typical port grid maps. The results demonstrate that the improved SSA* significantly reduces the number of expanded nodes and the computation time in static environments while generating smoother and safer paths. Meanwhile, the PDWA exhibits strong real-time performance and robustness in dynamic scenarios, achieving shorter paths and lower planning times compared with other graph search algorithms. The proposed method maintains stable performance across maps of different scales and various port scenarios, verifying its practicality and potential for wider application.
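The SSA* improvements build on plain A* over an occupancy grid, which is compact enough to show in full (4-connected moves, Manhattan heuristic; the obstacle expansion and corner-point optimization steps of SSA* are not reproduced here, and the small map is invented for illustration):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 0/1 occupancy grid (4-connected, Manhattan heuristic).
    Returns the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came, gbest = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came:
            continue                           # already expanded
        came[cur] = parent
        if cur == goal:
            path = []
            while cur:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < gbest.get((nr, nc), float("inf")):
                    gbest[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],   # wall with a single gap at the right edge
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

SSA*-style variants typically inflate obstacle cells before the search (safety margin) and post-process sharp corners out of the returned path (smoothness), without changing this core loop.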
The 17 Sustainable Development Goals (SDGs) for 2030, adopted by all United Nations member states in 2015, are facing a range of challenges. Factors such as climate change, regional conflicts, and economic recession are having a significant impact, particularly on global poverty governance. As a platform for dialogue, exchange, and technical cooperation, the 2025 International Seminar on Global Poverty Reduction Partnerships was held in Beijing on 10 December 2025.
The rapid identification of γ-emitting radionuclides with low activity levels in public areas is crucial for nuclear safety. However, classical methods rely on full-energy peaks in the integral spectrum, requiring sufficient count accumulation for evaluation and thereby limiting response time. The sequential Bayesian approach, which utilizes prior information and considers both photon energies and inter-arrival times, can significantly enhance the performance of radionuclide identification. This study proposes a theoretical optimization of the traditional sequential Bayesian approach. Each photon is processed sequentially, and the corresponding posterior probability is updated in real time using a non-informative prior from Bayesian theory. By comparing the posterior probabilities of the background and radionuclides based on the energy variance and time interval, the type of each γ-ray can be identified (background characteristic γ-rays, Compton plateau γ-rays, or radionuclide-specific characteristic γ-rays). By integrating the information from these multiple characteristic γ-rays, the presence and type of radionuclides are determined from the final decision function and a set threshold. Building on the theoretical work, verification experiments were conducted using a LaBr_(3)(Ce) detector in both low- and natural-background radiation environments with typical radionuclides (^(137)Cs, ^(60)Co, and ^(133)Ba). The results show that this approach can identify ^(137)Cs in 7.9 s and 8.5 s (source dose rate contribution: approximately 6.5×10^(−3) μGy/h), ^(60)Co in 8.1 s and 9.8 s (approximately 4.8×10^(−2) μGy/h), and ^(133)Ba in 4.05 s and 5.99 s (approximately 3.4×10^(−2) μGy/h) under low and natural background radiation, respectively, with a miss rate below 0.01%. This demonstrates the effectiveness of the proposed approach for fast radionuclide identification, even at low activity levels, and highlights its potential for enhancing public safety in diverse radiation environments.
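The photon-by-photon posterior update at the heart of the method can be sketched as a sequential log-odds accumulation over per-photon energy likelihoods (the flat background and box-shaped "peak" densities below are deliberately crude illustrations; the real method also uses inter-arrival times and detector response):

```python
import math

def sequential_bayes(photons, lik_bg, lik_src, prior_src=0.5):
    """Photon-by-photon posterior for 'source present' vs 'background only'.
    Each photon multiplies the odds by its likelihood ratio."""
    log_odds = math.log(prior_src / (1 - prior_src))
    history = []
    for e in photons:
        log_odds += math.log(lik_src(e) / lik_bg(e))
        history.append(1 / (1 + math.exp(-log_odds)))  # posterior P(source)
    return history

# Toy energy densities over 0-3 MeV: flat background; source = half flat
# continuum + half in a 0.1-MeV-wide peak near 0.662 MeV (both integrate to 1).
lik_bg = lambda e: 1 / 3.0
lik_src = lambda e: 0.5 / 3.0 + (5.0 if abs(e - 0.662) < 0.05 else 0.0)

photons = [0.66, 1.2, 0.67, 0.65, 2.1, 0.64]
probs = sequential_bayes(photons, lik_bg, lik_src)
```

In-peak photons push the posterior up and off-peak photons pull it down, so a decision threshold on the running posterior yields an identification time that adapts to the source strength, which is how the approach achieves the few-second response times reported above.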
The world's increasing population requires the process industry to produce food, fuels, chemicals, and consumer products in a more efficient and sustainable way. Functional process materials lie at the heart of this challenge. Traditionally, new advanced materials are found empirically or through trial-and-error approaches. As theoretical methods and associated tools continue to improve and computer power has reached a high level, it is now efficient and popular to use computational methods to guide material selection and design. Because of the strong interaction between material selection and the operation of the process in which the material is used, it is essential to perform material and process design simultaneously. Despite this significant connection, solving the integrated material and process design problem is not easy, because multiple models at different scales are usually required. Hybrid modeling provides a promising option for tackling such complex design problems: the material properties, which are computationally expensive to obtain, are described by data-driven models, while the well-known process-related principles are represented by mechanistic models. This article highlights the significance of hybrid modeling in multiscale material and process design. The generic design methodology is first introduced. Six important application areas are then selected: four from the chemical engineering field and two from the energy systems engineering domain. For each selected area, state-of-the-art work using hybrid modeling for multiscale material and process design is discussed. Concluding remarks are provided at the end, and current limitations and future opportunities are pointed out.
A comprehensive and precise analysis of shale gas production performance is crucial for evaluating resource potential, designing a field development plan, and making investment decisions. However, quantitative analysis can be challenging because production performance is dominated by the complex interaction among a series of geological and engineering factors. In fact, each factor can be viewed as a player making cooperative contributions to the production payoff within the constraints of physical laws and models. Inspired by this idea, we propose a hybrid data-driven analysis framework in which the contributions of dominant factors are quantitatively evaluated, production is precisely forecasted, and development optimization suggestions are comprehensively generated. More specifically, game theory and machine learning models are coupled to determine the dominant geological and engineering factors. The Shapley value, with its definite physical meaning, is employed to quantitatively measure the effects of individual factors. A multi-model-fused stacked model is trained for production forecasting, providing the basis for derivative-free optimization algorithms to optimize the development plan. The complete workflow is validated with actual production data collected from the Fuling shale gas field, Sichuan Basin, China. The validation results show that the proposed procedure draws rigorous conclusions with quantified evidence and thereby provides specific and reliable suggestions for development plan optimization. Compared with traditional, experience-based approaches, the hybrid data-driven procedure is advanced in terms of both efficiency and accuracy.
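The Shapley value used above averages each factor's marginal contribution over all orderings of the players. For a handful of factors it can be computed exactly; the sketch below does so for a toy three-factor "production" function with an interaction term (the factor names and payoffs are invented for illustration, not field data):

```python
from itertools import permutations

def shapley(players, value):
    """Exact Shapley values: average marginal contribution of each player
    over every ordering in which the coalition can be assembled."""
    contrib = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            contrib[p] += value(with_p) - value(coalition)
            coalition = with_p
    return {p: c / len(perms) for p, c in contrib.items()}

# Toy payoff over three factors, with a synergy between the two
# engineering factors (hypothetical numbers).
def value(s):
    base = {"frac_stages": 4.0, "lateral_len": 3.0, "porosity": 1.0}
    v = sum(base[p] for p in s)
    if {"frac_stages", "lateral_len"} <= s:
        v += 2.0
    return v

phi = shapley(["frac_stages", "lateral_len", "porosity"], value)
```

The values sum exactly to the grand-coalition payoff (the efficiency axiom), and the synergy is split evenly between the two interacting factors; in the paper the "value function" is a trained production model and the factorial cost is handled with Shapley approximations such as those used in SHAP.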
Funding (tumor growth model paper): National Natural Science Foundation of China (Project No. 12371428); 2024 Provincial College Students' Innovation and Training Program (Project Nos. S202413023106 and S202413023110).
Funding (GNSS unmodeled-error paper): National Natural Science Foundation of China (Grant Nos. 42374014 and 42004014).
Funding: supported in part by the Major Project of the National Social Science Fund of China under Grant No. 23&ZD050; in part by the National Natural Science Foundation of China (NSFC) under Grant Nos. 72402031 and 71971052; in part by the Open Project Program of the State Key Laboratory of Massive Personalized Customization System and Technology under Grant No. H&C-MPC-2023-04-03; in part by the Fundamental Research Funds for the Central Universities under Grant No. N25ZJL015; and by the Joint Funds of the Natural Science Foundation of Liaoning under Grant No. 2023-BSBA-139.
Abstract: Manufacturers are striving to achieve higher energy efficiency without compromising production performance and quality standards. Parallel-serial structures, commonly found in modern production systems, offer a unique balance of flexibility and efficiency by combining parallel processes with sequential workflows. However, their inherent complexity poses significant challenges, particularly in optimizing energy efficiency and ensuring consistent product quality. In data-driven manufacturing environments, it is unclear how to leverage production data to enhance the energy efficiency of production systems. Therefore, this paper studies a data-driven approach to improving energy efficiency in parallel-serial production lines with product quality issues. First, the authors developed a data-driven performance analysis method to evaluate the effects of disruption events, such as energy-saving control actions, machine breakdowns, and product quality failures, on system throughput and energy consumption. Second, a periodic energy-saving control method was developed to enhance system energy efficiency using a non-linear programming model. To reduce complexity and improve computational efficiency, the model was simplified by leveraging the intrinsic properties of parallel-serial production lines and solved using an adaptive genetic algorithm. Finally, the effectiveness of the proposed data-driven approach was validated through case studies, providing actionable insights into achieving data-driven energy efficiency optimization in complex production systems.
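The genetic-algorithm solution step can be sketched in miniature. The toy below is a plain (non-adaptive) genetic algorithm, a simplified stand-in for the paper's adaptive variant, evolving a single continuous decision variable toward the minimum of a hypothetical energy-cost function rather than the production-line model.

```python
import random

def energy_cost(x):
    """Hypothetical non-linear objective with its minimum at x = 3."""
    return (x - 3.0) ** 2 + 1.0

def genetic_minimize(fitness, lo, hi, pop=30, gens=60, seed=1):
    """Elitist GA: keep the best half, breed children by averaging + Gaussian mutation."""
    rng = random.Random(seed)
    population = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)          # ascending cost: best first
        parents = population[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2 + rng.gauss(0, 0.1)   # crossover + mutation
            children.append(min(hi, max(lo, child)))  # clamp to feasible range
        population = parents + children
    return min(population, key=fitness)

best = genetic_minimize(energy_cost, 0.0, 10.0)
```

An adaptive variant, as used in the paper, would additionally tune the crossover and mutation rates from generation to generation.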
Abstract: Building integrated energy systems (BIESs) are pivotal for enhancing energy efficiency, as buildings account for a significant proportion of global energy consumption. Two key barriers that reduce BIES operational efficiency are renewable generation uncertainty and the operational non-convexity of combined heat and power (CHP) units. To this end, this paper proposes a soft actor-critic (SAC) algorithm to solve the BIES scheduling problem, which overcomes the model non-convexity and shows advantages in robustness and generalization. The paper also adopts a temporal fusion transformer (TFT) to enhance the optimal solution of the SAC algorithm by forecasting renewable generation and energy demand. The TFT can effectively capture complex temporal patterns and dependencies that span multiple steps. Furthermore, its forecasting results are interpretable owing to a self-attention layer, which supports more trustworthy decision-making in the SAC algorithm. The proposed hybrid data-driven approach integrating the TFT and SAC algorithms, i.e., the TFT-SAC approach, is trained and tested on a real-world dataset to validate its superior performance in reducing energy cost and computational time compared with benchmark approaches. The generalization performance of the scheduling policy, as well as a sensitivity analysis, is examined in the case studies.
Funding: supported by the National Natural Science Foundation of China under Grant Number U2341215 and the China Postdoctoral Science Foundation under Grant Number 2024M764224.
Abstract: To address the complex coupling between aerodynamic characteristics and guidance control for morphing flight missiles, this study proposes a data-driven approach to integrated adaptive morphing and guidance. First, an aerodynamic surrogate model is constructed using a fully connected neural network (FCNN), mapping the configuration parameters to aerodynamic parameters. Second, an adaptive physical parameters optimization network (PPON) is developed to optimize aerodynamic characteristics based on predictions from the aerodynamic surrogate model. Third, an integrated morphing and guidance model is derived by applying the proximal policy optimization (PPO) algorithm from deep reinforcement learning (DRL), embedded with the adaptive aerodynamic optimization model. Finally, the proposed integrated approach is applied to the guidance task of a morphing cruise missile with variable-camber wings. Simulation results demonstrate that the integrated guidance model significantly enhances aerodynamic performance and generates more continuous guidance commands within approximately 4.3 s, outperforming the deep Q-network (DQN) algorithm under morphing flight conditions. Moreover, compared with the PPO- and DQN-based guidance laws without morphing flight conditions, the integrated model improves both guidance accuracy and terminal kinetic energy. Furthermore, the integrated guidance model, trained on stationary targets, remains effective against moving and maneuvering targets, showcasing its robust generalization capability.
Abstract: BACKGROUND: Owing to the increasing rate of thyroid nodule diagnosis and the desire to avoid an unsightly cervical scar, remote thyroidectomies were invented and are increasingly performed. The transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach (TAA) are the two most commonly used remote approaches. No previous meta-analysis has compared postoperative infections and swallowing difficulties between the two procedures. AIM: To compare these outcomes among patients undergoing lobectomy for unilateral thyroid carcinoma or a benign thyroid nodule. METHODS: We searched PubMed MEDLINE, Google Scholar, and the Cochrane Library from the date of the first published article up to August 2025. The terms used were transoral thyroidectomy vestibular approach, trans-areolar thyroidectomy, scarless thyroidectomy, remote thyroidectomy, infections, postoperative, inflammation, dysphagia, and swallowing difficulties. We identified 130 studies; of these, 30 full texts were screened, and only six studies were included in the final meta-analysis. RESULTS: Postoperative infections did not differ between the two approaches (odds ratio = 1.33; 95% confidence interval: 0.50-3.53; χ2 = 1.92; P-value for the overall effect = 0.57). Similarly, transient swallowing difficulty did not differ between the two forms of surgery (odds ratio = 0.91; 95% confidence interval: 0.35-2.40; χ2 = 1.32; P-value for the overall effect = 0.85). CONCLUSION: No statistically significant differences were evident between the transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach regarding postoperative infection and transient swallowing difficulties. Longer randomized trials are needed.
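The pooled odds ratios reported above come from standard meta-analytic pooling. A minimal fixed-effect, inverse-variance sketch is shown below; the 2×2 tables are hypothetical placeholders for illustration, not the data of the included studies.

```python
import math

def pooled_odds_ratio(studies):
    """Fixed-effect inverse-variance pooling of 2x2 tables (a, b, c, d):
    a/b = events/non-events in arm 1, c/d = events/non-events in arm 2."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log odds ratio
        w = 1 / var                           # inverse-variance weight
        num += w * log_or
        den += w
    log_pooled = num / den
    se = math.sqrt(1 / den)
    ci = (math.exp(log_pooled - 1.96 * se), math.exp(log_pooled + 1.96 * se))
    return math.exp(log_pooled), ci

# hypothetical study tables, not the meta-analysis data
studies = [(3, 47, 2, 48), (4, 96, 3, 97), (2, 58, 2, 58)]
or_hat, (lo, hi) = pooled_odds_ratio(studies)
```

When the 95% confidence interval spans 1, as in both outcomes above, the pooled effect is not statistically significant.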
Funding: Chengdu City Philosophy and Social Sciences Research Center "Artificial Intelligence + Urban Communication" Theory and Application Research Center project "Chengdu real estate vertical market public opinion data visualization research" (Project No. RZCC2025017).
Abstract: This study integrates multiple sources of data (transaction data, policy texts, and public opinion data) with visualization techniques (such as heat maps, time-series trend charts, and 3D building brochures) to construct an analysis framework for the Chengdu real estate market. Using an Adaptive Neuro-Fuzzy Inference System (ANFIS) prediction model, spatial GIS (Geographic Information System) analysis, and interactive dashboards, the study reveals market differentiation, policy impacts, and changes in demand structure, thereby providing decision support for the government, enterprises, and homebuyers.
Funding: supported in part by the National Natural Science Foundation of China, Grant/Award Number: 62003267; the Key Research and Development Program of Shaanxi Province, Grant/Award Number: 2023-GHZD-33; and the Open Project of the State Key Laboratory of Intelligent Game, Grant/Award Number: ZBKF-23-05.
Abstract: To address the instability or even loss of balance in the orientation and attitude control of quadrotor unmanned aerial vehicles (QUAVs) under random disturbances, this paper proposes a distributed anti-disturbance data-driven event-triggered fusion control method, which achieves efficient fault diagnosis while suppressing random disturbances and mitigating communication conflicts within the QUAV swarm. First, the impact of random disturbances on the UAV swarm is analyzed, a model for the orientation and attitude control of QUAVs under stochastic perturbations is established, and the disturbance gain threshold is determined. Second, a fault diagnosis system based on a high-gain observer is designed, constructing a fault gain criterion by integrating orientation and attitude information from the QUAVs. Subsequently, a model-free dynamic linearization-based data modeling (MFDLDM) framework is developed using model-free adaptive control, which efficiently fits the nonlinear control model of the QUAV swarm while reducing temporal constraints on the control data. On this basis, a distributed data-driven event-triggered controller based on a staggered communication mechanism is constructed, consisting of an equivalent QUAV controller and an event-triggered controller, which reduces communication conflicts while suppressing the influence of random interference. Finally, by incorporating random disturbances into the controller, comparative experiments and physical validations are conducted on QUAV platforms, fully demonstrating the strong adaptability and robustness of the proposed distributed event-triggered fault-tolerant control system.
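The model-free adaptive control idea behind the MFDLDM framework can be illustrated for a single-input, single-output loop. The sketch below uses the standard compact-form dynamic linearization update, estimating a pseudo-partial derivative (PPD) online; the first-order plant and all gain values are assumptions for illustration, not the QUAV dynamics.

```python
def mfac_track(target, steps=200):
    """Compact-form dynamic linearization MFAC: y(k+1) ≈ y(k) + phi(k)*Δu(k)."""
    eta, mu = 0.5, 1.0     # PPD estimator step size and regularizer (assumed)
    rho, lam = 0.6, 0.1    # controller step size and input-change penalty (assumed)
    phi = 0.5              # initial PPD estimate
    y = u = y_old = u_old = 0.0
    for _ in range(steps):
        du, dy = u - u_old, y - y_old
        # update the PPD estimate from the latest input/output increments
        if abs(du) > 1e-8:
            phi += eta * du / (mu + du ** 2) * (dy - phi * du)
        # data-driven control update driven only by the tracking error
        u_new = u + rho * phi / (lam + phi ** 2) * (target - y)
        # plant dynamics, unknown to the controller (toy first-order system)
        y_new = 0.6 * y + 0.5 * u_new
        y_old, u_old = y, u
        y, u = y_new, u_new
    return y

y_final = mfac_track(1.0)
```

The controller uses no plant model at all; the PPD estimate plays the role of a locally linearized gain, which is the essence of the data-driven framework described above.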
Funding: supported by the National Key R&D Program of China (Grant No. 2023YFC3209504), the Natural Science Foundation of Wuhan (Grant No. 2024040801020271), and the Fundamental Research Funds for Central Public Welfare Research Institutes (Grant No. CKSF2025718/YT).
Abstract: Wetting deformation in earth-rockfill dams is a critical factor influencing dam safety. Although numerous mathematical models have been developed to describe this phenomenon, most rely on empirical formulations and lack prior knowledge of the model parameters, which is essential for Bayesian parameter inversion to enhance accuracy and reduce uncertainty. This study introduces a data-driven approach to establishing prior knowledge for earth-rockfill dams. Driving factors are used to determine the potential range of the model parameters, and settlement changes within this range are calculated. The results are iteratively compared with actual monitoring data until the calculated range encompasses the observed data, thereby providing prior knowledge of the model parameters. The proposed method is applied to the right-bank earth-rockfill dam of Danjiangkou. With a Gibbs sample size of 30,000, the method effectively calibrates the prior knowledge of the wetting model parameters, achieving a root mean square error (RMSE) of 5.18 mm for the settlement predictions. By comparison, using non-informative priors with sample sizes of 30,000 and 50,000 results in significantly larger RMSE values of 11.97 mm and 16.07 mm, respectively. Furthermore, the computational efficiency of the method is demonstrated by an inversion computation time of 902 s for 30,000 samples, notably shorter than the 1026 s and 1558 s required for non-informative priors with 30,000 and 50,000 samples, respectively. These findings show that the proposed method improves both predictive accuracy and computational efficiency, enabling optimal parameter identification with reduced computational effort and providing a robust and efficient framework for advancing dam safety assessments.
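The Bayesian inversion step can be illustrated with a toy sampler. The paper uses Gibbs sampling over wetting-model parameters; the sketch below substitutes a one-parameter Metropolis sampler and a linear stand-in settlement model. The model form, the informative prior, the noise level, and the data are all assumptions for illustration, not the Danjiangkou case.

```python
import math
import random

def rmse(pred, obs):
    """Root mean square error between predicted and observed settlements."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def metropolis(log_post, x0, steps=20000, scale=0.1, seed=2):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        cand = x + rng.gauss(0, scale)
        lp_c = log_post(cand)
        if rng.random() < math.exp(min(0.0, lp_c - lp)):
            x, lp = cand, lp_c
        samples.append(x)
    return samples

# toy settlement model s(t) = k * t with unknown rate k (truth near 2)
ts = [1, 2, 3, 4, 5]
obs = [2.1, 3.9, 6.2, 8.0, 9.9]

def log_post(k):
    prior = -0.5 * ((k - 2.5) / 1.0) ** 2   # informative prior from "driving factors"
    like = -0.5 * sum((k * t - o) ** 2 for t, o in zip(ts, obs)) / 0.1 ** 2
    return prior + like

samples = metropolis(log_post, 1.0)
k_hat = sum(samples[5000:]) / len(samples[5000:])   # posterior mean after burn-in
```

The calibrated informative prior narrows the search range, which is what drives the RMSE and runtime gains reported above over non-informative priors.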
Funding: supported by the National Key Research and Development Program of China (2022YFC2402400); the National Natural Science Foundation of China (82027803, 62275062); the Guangdong Provincial Key Laboratory of Biomedical Optical Imaging Technology (2020B121201010); the Shenzhen Science and Technology Innovation Committee (JCYJ20220818101417039); the Shenzhen Key Laboratory for Molecular Imaging (ZDSY20130401165820357); the Shenzhen Medical Research Fund (D2404002); the Project of the Shandong Innovation and Startup Community of High-end Medical Apparatus and Instruments (2023-SGTTXM-002 and 2024-SGTTXM-005); the Shandong Province Technology Innovation Guidance Plan (Central Leading Local Science and Technology Development Fund) (YDZX2023115); the Taishan Scholar Special Funding Project of Shandong Province; and the Shandong Laboratory of Advanced Biomaterials and Medical Devices in Weihai (ZL202402).
Abstract: Photoacoustic imaging of lipids is intrinsically constrained by the feeble nature of endogenous lipid signals, posing a persistent sensitivity challenge that demands innovative solutions. Although adopting high-efficiency excitation and detection elements may improve imaging sensitivity to a certain extent, the application of such elements is inevitably subject to various limitations in practice, particularly during in vivo and endoscopic imaging. In this study, we propose a multi-combinatorial approach to enhance the sensitivity of lipid photoacoustic imaging. The approach involves wavelet-transform processing of one-dimensional A-line signals, gradient-based denoising of two-dimensional B-scan images, and finally, three-dimensional spatial weighted averaging of the data processed by the previous two steps. This method not only improves the signal-to-noise ratio (SNR) in distinguished feature regions of the image by around 10 dB but also efficiently extracts weak signals with no distinct features in the original image. After processing with this method, images acquired under a single scan were compared with those obtained under multiple scans. The results showed highly consistent image features, with the structural similarity index increasing from 0.2 to 0.8, confirming the accuracy and reliability of the multi-combinatorial approach.
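The wavelet-denoising step for one-dimensional A-line signals can be sketched with a single-level Haar transform and soft thresholding. This is a simplified stand-in for the paper's processing chain; the sine-wave "A-line", noise level, and threshold are assumptions for illustration.

```python
import math
import random

def haar_denoise(x, thresh):
    """One-level Haar wavelet shrinkage: soft-threshold the detail coefficients."""
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]  # approximation
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]  # detail
    d = [math.copysign(max(abs(c) - thresh, 0.0), c) for c in d]         # soft threshold
    out = []
    for ai, di in zip(a, d):                                             # inverse transform
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out

def snr_db(signal, noisy):
    """SNR in dB of a noisy series against its clean reference."""
    p_sig = sum(s * s for s in signal)
    p_err = sum((s - n) ** 2 for s, n in zip(signal, noisy))
    return 10 * math.log10(p_sig / p_err)

rng = random.Random(4)
clean = [math.sin(2 * math.pi * i / 64) for i in range(256)]   # smooth A-line stand-in
noisy = [c + rng.gauss(0, 0.2) for c in clean]
denoised = haar_denoise(noisy, 0.4)
```

A multi-level decomposition with a smoother mother wavelet, as typically used in practice, would preserve the signal better; the single Haar level here only demonstrates the threshold-the-details principle.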
Abstract: With the rapid development of intelligent navigation technology, efficient and safe path planning for mobile robots has become a core requirement. To address the challenges of complex dynamic environments, this paper proposes an intelligent path planning framework based on grid map modeling. First, an improved Safe and Smooth A* (SSA*) algorithm is employed for global path planning. By incorporating obstacle expansion and corner-point optimization, the proposed SSA* enhances the safety and smoothness of the planned path. Then, a Partitioned Dynamic Window Approach (PDWA) is integrated for local planning, triggered when dynamic or sudden static obstacles appear, enabling real-time obstacle avoidance and path adjustment. A unified objective function is constructed that comprehensively considers path length, safety, and smoothness. Multiple simulation experiments are conducted on typical port grid maps. The results demonstrate that the improved SSA* significantly reduces the number of expanded nodes and the computation time in static environments while generating smoother and safer paths. Meanwhile, the PDWA exhibits strong real-time performance and robustness in dynamic scenarios, achieving shorter paths and lower planning times than other graph-search algorithms. The proposed method maintains stable performance across maps of different scales and various port scenarios, verifying its practicality and potential for wider application.
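The obstacle-expansion idea in SSA* can be illustrated on a small grid: inflate obstacle cells by a safety margin, then run plain A* on the inflated map. This is a minimal sketch of the expansion step only; the corner-point optimization and the PDWA local planner are not modeled, and the grid and wall are hypothetical.

```python
import heapq

def inflate(obstacles, rows, cols, margin=1):
    """Expand each obstacle cell by a safety margin (the obstacle-expansion idea)."""
    grown = set()
    for r, c in obstacles:
        for dr in range(-margin, margin + 1):
            for dc in range(-margin, margin + 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    grown.add((rr, cc))
    return grown

def astar(start, goal, obstacles, rows, cols):
    """Plain 4-connected grid A* with a Manhattan heuristic."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_q = [(h(start), 0, start, [start])]
    seen = set()
    while open_q:
        _, g, node, path = heapq.heappop(open_q)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and nxt not in obstacles:
                heapq.heappush(open_q, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None

rows = cols = 10
raw = {(5, c) for c in range(2, 8)}   # a hypothetical wall across the middle
grown = inflate(raw, rows, cols)
path = astar((0, 0), (9, 9), grown, rows, cols)
```

Planning against the inflated set keeps the path at least one cell away from every true obstacle, which is how the expansion step buys safety at a small cost in path length.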
Abstract: The 17 Sustainable Development Goals (SDGs) for 2030, adopted by all United Nations member states in 2015, are facing a range of challenges. Factors such as climate change, regional conflicts, and economic recession are having a significant impact, particularly on global poverty governance. As a platform for dialogue, exchange, and technical cooperation, the 2025 International Seminar on Global Poverty Reduction Partnerships was held in Beijing on 10 December 2025.
Funding: supported by the NIM Basic Research Business Expenses Key Field Program, China (No. AKYCX2315).
Abstract: The rapid identification of γ-emitting radionuclides with low activity levels in public areas is crucial for nuclear safety. However, classical methods rely on full-energy peaks in the integral spectrum, requiring sufficient count accumulation for evaluation, which limits the response time. The sequential Bayesian approach, which utilizes prior information and considers both photon energies and inter-arrival times, can significantly enhance radionuclide identification performance. This study proposes a theoretical optimization of the traditional sequential Bayesian approach. Each photon is processed sequentially, and the corresponding posterior probability is updated in real time using a non-informative prior from Bayesian theory. By comparing the posterior probabilities of the background and the radionuclides based on the energy variance and time interval, the type of each γ-ray can be identified (background characteristic γ-rays, Compton-plateau γ-rays, or radionuclide-specific characteristic γ-rays). By integrating the information from these multiple characteristic γ-rays, the presence and type of radionuclides are determined based on the final decision function and a set threshold. Building on the theoretical work, verification experiments were conducted using a LaBr_(3)(Ce) detector in both low- and natural-background radiation environments with typical radionuclides (^(137)Cs, ^(60)Co, and ^(133)Ba). The results show that this approach can identify ^(137)Cs in 7.9 s and 8.5 s (source dose rate contribution: approximately 6.5×10^(−3) μGy/h), ^(60)Co in 8.1 s and 9.8 s (approximately 4.8×10^(−2) μGy/h), and ^(133)Ba in 4.05 s and 5.99 s (approximately 3.4×10^(−2) μGy/h) under low and natural background radiation, respectively, with a miss rate below 0.01%. This demonstrates the effectiveness of the proposed approach for fast radionuclide identification, even at low activity levels, and highlights its potential for enhancing public safety in diverse radiation environments.
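The photon-by-photon posterior update can be sketched for the inter-arrival-time component alone (the energy term is omitted). With exponential arrivals, each waiting time contributes a log-likelihood ratio between "background only" and "background plus source". The rates, prior, and simulated photon stream below are assumptions for illustration, not the LaBr_(3)(Ce) measurements.

```python
import math
import random

def sequential_posterior(intervals, rate_bg, rate_src, prior_src=0.5):
    """Update P(source present) photon by photon from inter-arrival times,
    assuming exponential arrivals at rate_bg vs rate_bg + rate_src."""
    log_odds = math.log(prior_src / (1 - prior_src))
    rate_tot = rate_bg + rate_src
    history = []
    for dt in intervals:
        # exponential log-likelihood ratio contributed by this waiting time
        log_odds += (math.log(rate_tot) - rate_tot * dt) \
                  - (math.log(rate_bg) - rate_bg * dt)
        history.append(1 / (1 + math.exp(-log_odds)))
    return history

rng = random.Random(3)
bg, src = 5.0, 15.0                    # hypothetical count rates (counts per second)
# simulate photons with a source actually present
intervals = [rng.expovariate(bg + src) for _ in range(200)]
post = sequential_posterior(intervals, bg, src)
```

Thresholding this running posterior gives a decision after however many photons the evidence requires, which is why the sequential scheme responds faster than waiting for full-energy peaks to accumulate.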
Abstract: The world's increasing population requires the process industry to produce food, fuels, chemicals, and consumer products in a more efficient and sustainable way. Functional process materials lie at the heart of this challenge. Traditionally, new advanced materials have been found empirically or through trial-and-error approaches. As theoretical methods and associated tools are continuously improved and computer power has reached a high level, it is now efficient and popular to use computational methods to guide material selection and design. Owing to the strong interaction between material selection and the operation of the process in which the material is used, it is essential to perform material and process design simultaneously. Despite this significant connection, solving the integrated material and process design problem is not easy because multiple models at different scales are usually required. Hybrid modeling provides a promising option for tackling such complex design problems. In hybrid modeling, the material properties, which are computationally expensive to obtain, are described by data-driven models, while the well-known process-related principles are represented by mechanistic models. This article highlights the significance of hybrid modeling in multiscale material and process design. The generic design methodology is first introduced. Six important application areas are then selected: four from the chemical engineering field and two from the energy systems engineering domain. For each selected area, state-of-the-art work using hybrid modeling for multiscale material and process design is discussed. Concluding remarks are provided at the end, and current limitations and future opportunities are pointed out.
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 42050104) and the Science Foundation of SINOPEC Group (Grant No. P20030).
Abstract: A comprehensive and precise analysis of shale gas production performance is crucial for evaluating resource potential, designing a field development plan, and making investment decisions. However, quantitative analysis can be challenging because production performance is dominated by the complex interaction among a series of geological and engineering factors. In fact, each factor can be viewed as a player making cooperative contributions to the production payoff within the constraints of physical laws and models. Inspired by this idea, we propose a hybrid data-driven analysis framework in which the contributions of the dominant factors are quantitatively evaluated, the production is precisely forecasted, and development optimization suggestions are comprehensively generated. More specifically, game theory and machine learning models are coupled to determine the dominant geological and engineering factors. The Shapley value, which has a definite physical meaning, is employed to quantitatively measure the effect of each individual factor. A multi-model-fused stacked model is trained for production forecasting, providing the basis for derivative-free optimization algorithms to optimize the development plan. The complete workflow is validated with actual production data collected from the Fuling shale gas field, Sichuan Basin, China. The validation results show that the proposed procedure can draw rigorous conclusions with quantified evidence and thereby provide specific and reliable suggestions for development plan optimization. Compared with traditional, experience-based approaches, the hybrid data-driven procedure is advanced in terms of both efficiency and accuracy.
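The Shapley attribution used in the framework can be illustrated exactly for a small set of factors: with only three "players" we can average each factor's marginal contribution over all orderings. The characteristic (payoff) function and factor names below are hypothetical, not Fuling field data.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average marginal contributions over all orderings."""
    contrib = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            before = value(frozenset(coalition))
            coalition.append(p)
            contrib[p] += value(frozenset(coalition)) - before
    return {p: c / len(orders) for p, c in contrib.items()}

# hypothetical production payoff of each subset of factors
payoff = {
    frozenset(): 0, frozenset({"frac"}): 60, frozenset({"TOC"}): 40,
    frozenset({"pressure"}): 20, frozenset({"frac", "TOC"}): 110,
    frozenset({"frac", "pressure"}): 85, frozenset({"TOC", "pressure"}): 65,
    frozenset({"frac", "TOC", "pressure"}): 140,
}
phi = shapley_values(["frac", "TOC", "pressure"], payoff.get)
```

The efficiency property guarantees the attributions sum to the full-coalition payoff, which is what gives each factor's share its "definite physical meaning"; with many factors, sampling-based approximations replace the exact enumeration.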