Funding: National Natural Science Foundation of China Key Program (12431014); Key Project of Hunan Education Department (22A0126); Natural Science Foundation of Hunan Province (2022JJ30555); Postgraduate Scientific Research Innovation Project of Xiangtan University (XDCX2024Y172).
Abstract: Tropospheric zenith wet delay (ZWD) plays a vital role in the analysis of space geodetic observations. In recent years, machine learning methods have been increasingly applied to improve the accuracy of ZWD calculations. However, a single machine learning model has limited generalization capability. To address this limitation, this study introduces a novel machine learning fusion (MLF) algorithm with stronger generalization capability to enhance ZWD modeling and prediction accuracy. The MLF algorithm uses a two-layer structure that integrates extra trees (ET), a backpropagation neural network (BPNN), and linear regression models. By comparing the root mean square errors (RMSE) of these models, we found that both the ET-based and MLF-based models outperform the random forest (RF)-based and BPNN-based models in internal and external accuracy, for both the surface-meteorological-data-based and blind models. The improvement in external accuracy is particularly significant in the blind models. Our results show that the MLF (RMSE of 3.93 cm) and ET (3.99 cm) models outperform the traditional GPT3 model (4.07 cm), while the RF (4.21 cm) and BPNN (4.14 cm) models have worse external accuracy than the GPT3 model. Notably, the BPNN suffered from overfitting in the external accuracy tests, which the MLF avoided. In summary, regardless of the availability of surface meteorological data, the MLF-based empirical models demonstrate superior internal and external accuracy compared to the other models tested in this study.
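The two-layer fusion described above can be sketched with scikit-learn's stacking API. This is a minimal illustration only: the features, data, and hyperparameters are invented stand-ins, not the paper's ZWD dataset or tuned configuration.

```python
# Hedged sketch of a two-layer "machine learning fusion" (MLF) stack:
# base layer = extra trees + a backpropagation-style neural network
# (MLPRegressor), meta layer = linear regression. Synthetic data only.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor, StackingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))  # illustrative predictors (not real ZWD inputs)
y = X @ np.array([1.0, -0.5, 0.2, 0.8]) + rng.normal(scale=0.1, size=500)

mlf = StackingRegressor(
    estimators=[
        ("et", ExtraTreesRegressor(n_estimators=100, random_state=0)),
        ("bpnn", MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                              random_state=0)),
    ],
    final_estimator=LinearRegression(),  # second-layer fusion model
)
mlf.fit(X[:400], y[:400])
rmse = mean_squared_error(y[400:], mlf.predict(X[400:])) ** 0.5
print(f"hold-out RMSE: {rmse:.3f}")
```

The meta-learner is trained on cross-validated base-model predictions, which is what lets the fusion layer down-weight an overfitting base model, the failure mode the abstract reports for the standalone BPNN.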
Abstract: Diabetes mellitus, generally known as diabetes, is one of the most common diseases worldwide. It is a metabolic disease characterized by insulin deficiency, or glucose (blood sugar) levels that exceed 200 mg/dL (11.1 mmol/L) for prolonged periods, and it may lead to death if left uncontrolled by medication or insulin injections. Diabetes is categorized into two main types, type 1 and type 2, both of which feature glucose levels above "normal," defined as 140 mg/dL. Diabetes is triggered by malfunction of the pancreas, which releases insulin, a natural hormone responsible for controlling glucose levels in blood cells. Diagnosis and comprehensive analysis of this potentially fatal disease necessitate techniques with minimal rates of error. The primary purpose of this research study is to assess the potential role of machine learning in predicting a person's risk of developing diabetes. Historically, research has supported the use of various machine learning algorithms, such as naïve Bayes, decision trees, and artificial neural networks, for early diagnosis of diabetes. However, to achieve maximum accuracy and minimal error in diagnostic predictions, there remains an immense need for further research and innovation to improve the machine learning tools and techniques available to healthcare professionals. Therefore, in this paper, we propose a novel cloud-based machine learning fusion technique that synthesizes three machine learning algorithms and uses fuzzy systems to collectively generate highly accurate final decisions for early diagnosis of diabetes.
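The fusion-of-three-classifiers idea can be illustrated as follows. This is a sketch under stated assumptions: the paper's cloud deployment and its fuzzy inference layer are approximated here by a confidence-weighted combination of predicted probabilities, the weights are illustrative, and the dataset is synthetic rather than real patient data.

```python
# Minimal sketch: fuse naive Bayes, a decision tree, and a neural network
# for a binary risk decision by weighting their class-membership
# probabilities (a stand-in for the paper's fuzzy decision layer).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=8, random_state=1)
Xtr, ytr, Xte, yte = X[:500], y[:500], X[500:], y[500:]

models = [GaussianNB(),
          DecisionTreeClassifier(max_depth=5, random_state=1),
          MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=1)]
weights = np.array([0.3, 0.3, 0.4])  # illustrative confidence weights

# Stack each model's probability of the positive ("at risk") class.
probs = np.stack([m.fit(Xtr, ytr).predict_proba(Xte)[:, 1] for m in models])
fused = weights @ probs              # fused membership in the positive class
pred = (fused >= 0.5).astype(int)
accuracy = (pred == yte).mean()
print(f"fused accuracy: {accuracy:.2f}")
```

A full fuzzy system would replace the fixed weights with membership functions and inference rules; the weighted soft vote above only captures the aggregation step.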
Abstract: Heart disease, also known as cardiovascular disease, includes various conditions that affect the heart and has been a major cause of death over the past decades. Accurate and timely detection of heart disease is the key factor for appropriate investigation, treatment, and prescription of medication. Emerging technologies such as fog, cloud, and mobile computing provide substantial support for the diagnosis and prediction of fatal diseases such as diabetes, cancer, and cardiovascular disease. Cloud computing provides a cost-efficient infrastructure for data processing, storage, and retrieval, and much of the extant research recommends machine learning (ML) algorithms for generating models from sample data. ML is considered best suited to exploring hidden patterns, which is ultimately helpful for analysis and prediction. Accordingly, this study combines cloud computing with ML, collecting datasets from different geographical areas and applying fusion techniques to maintain data accuracy and consistency for the ML algorithms. Our recommended model considered three ML techniques: artificial neural network, decision tree, and naïve Bayes. Real-time patient data were extracted using the fuzzy-based model stored in the cloud.
Funding: Supported by the Center for Cyber-Physical Systems, Khalifa University, under Grant 8474000137-RC1-C2PS-T5.
Abstract: The software engineering field has long focused on creating high-quality software despite limited resources. Detecting defects before the testing stage of software development enables quality assurance engineers to concentrate on problematic modules rather than all modules. This approach can enhance the quality of the final product while lowering development costs. Identifying defective modules early allows for early corrections and ensures the timely delivery of a high-quality product that satisfies customers and instills greater confidence in the development team. This process is known as software defect prediction, and it can improve end-product quality while reducing the cost of testing and maintenance. This study proposes a software defect prediction system that utilizes data fusion, feature selection, and ensemble machine learning fusion techniques. A novel filter-based metric selection technique is proposed in the framework to select the optimal features. A three-step nested approach is presented for predicting defective modules with high accuracy. In the first step, three supervised machine learning techniques, Decision Tree, Support Vector Machines, and Naïve Bayes, are used to detect faulty modules. The second step integrates the predictions of these classifiers through three ensemble machine learning methods: Bagging, Voting, and Stacking. Finally, in the third step, a fuzzy logic technique is employed to integrate the predictions of the ensemble methods. The experiments are performed on a fused software defect dataset to ensure that the developed fused ensemble model can perform effectively on diverse datasets. Five NASA datasets are integrated to create the fused dataset: MW1, PC1, PC3, PC4, and CM1. According to the results, the proposed system outperformed other advanced techniques for predicting software defects, achieving an accuracy of 92.08%.
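The first two steps of the nested approach can be sketched with scikit-learn. This is a hedged illustration: the NASA datasets, the filter-based metric selection, and the fuzzy third step are not reproduced; a synthetic imbalanced dataset stands in for fused defect data, and the hyperparameters are invented.

```python
# Sketch of steps 1-2: base learners (decision tree, SVM, naive Bayes)
# combined by voting and stacking ensembles on defect-like (imbalanced)
# synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import VotingClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=800, n_features=20,
                           weights=[0.8, 0.2],  # most modules non-defective
                           random_state=7)
Xtr, ytr, Xte, yte = X[:600], y[:600], X[600:], y[600:]

base = [("dt", DecisionTreeClassifier(max_depth=6, random_state=7)),
        ("svm", SVC(probability=True, random_state=7)),
        ("nb", GaussianNB())]

voting = VotingClassifier(estimators=base, voting="soft").fit(Xtr, ytr)
stacking = StackingClassifier(estimators=base,
                              final_estimator=LogisticRegression()).fit(Xtr, ytr)

acc_vote = voting.score(Xte, yte)
acc_stack = stacking.score(Xte, yte)
print(f"voting: {acc_vote:.2f}, stacking: {acc_stack:.2f}")
```

In the paper's third step, the outputs of such ensembles would themselves be fused by fuzzy logic rather than reported separately as here.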
Funding: National Key Research and Development Program of China, No. 2022YFB3303303; Key Open Fund of the State Key Lab of Materials Processing and Die & Mould Technology of China, No. P2024-001; Zhejiang Provincial Research and Development Project of China, No. LGG22E050010.
Abstract: This study presents an energy consumption (EC) forecasting method for laser melting manufacturing of metal artifacts based on fusionable transfer learning (FTL). To predict the EC of manufactured products, particularly from scale-down to scale-up, a general paradigm was first developed by categorizing the overall process into three main sub-steps. The operating electrical power was then formulated as a combinatorial function, based on which an operator learning network was adopted to fit the nonlinear relations between the fabrication arguments and EC. Parallel-arranged networks were constructed to investigate the impacts of fabrication variables and devices on power. Considering the interconnections among these factors, the outputs of the neural networks were blended and fused to jointly predict the electrical power. Most innovatively, large artifacts can be decomposed into time-dependent laser-scanning trajectories, which can be further transformed into fusionable information via neural networks, inspired by large language models. Accordingly, transfer learning can handle either scale-down or scale-up forecasting, namely, FTL with scalability within artifact structures. The effectiveness of the proposed FTL was verified through physical fabrication experiments with laser powder bed fusion. The relative error of the average and overall EC predictions based on FTL remained below 0.83%. The melting fusion quality was examined using metallographic diagrams. The proposed FTL framework can forecast the EC of scaled structures, which is particularly helpful for price estimation and quotation of large metal products toward carbon peaking and carbon neutrality.
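The energy-accounting idea underlying the power formulation can be made concrete with a toy calculation: once operating electrical power is known as a function of time, the EC of a fabrication step is its time integral. The power profile below is entirely synthetic; the paper fits real profiles with operator-learning networks instead.

```python
# Illustrative only: energy consumption as the time integral of a
# (synthetic) operating-power profile, via the trapezoidal rule.
import numpy as np

t = np.linspace(0.0, 120.0, 1201)                        # seconds
power = 3000.0 + 500.0 * np.sin(2 * np.pi * t / 30.0)    # watts, toy profile

# Trapezoidal integration of power over time gives energy in joules.
energy_j = float(np.sum((power[:-1] + power[1:]) * 0.5 * np.diff(t)))
energy_kwh = energy_j / 3.6e6
print(f"EC over the step: {energy_kwh:.4f} kWh")
```

Because the sinusoidal term spans whole periods, the integral reduces to mean power times duration (3000 W x 120 s = 0.1 kWh), a useful sanity check on the numerics.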
Funding: National Natural Science Foundation of China (62350011, 62375014); National Key Research and Development Program of China (2022YFB3607700); Beijing Natural Science Foundation (1232031); Special Fund for Basic Scientific Research of Central Universities of China (2024CX11002).
Abstract: Due to the high-dimensional nature of photon orbital angular momentum (OAM), a beam can carry multiple OAM modes simultaneously, forming an OAM comb, which has been shown to hold significant potential in both classical and quantum photonics. Tailoring broadband OAM combs on demand in a fast and accurate manner is a crucial basis for their application in advanced scenarios. However, obtaining phase-only gratings that generate arbitrary desired OAM combs still poses challenges. In this paper, we propose a multi-scale fusion learning U-shaped neural network that encodes a phase-only hologram for tailoring broadband OAM combs on demand. Proof-of-principle experiments demonstrate that our scheme achieves fast computational speed, high modulation precision, and high manipulation dimensionality, with a mode range of −75 to +75, an average root mean square error of 0.0037, and a fidelity of 85.01%, all achieved in about 30 ms. Furthermore, we use the tailored broadband OAM combs to perform optical convolution, enabling vector convolution for arbitrary discrete functions and showcasing the extended capability of our proposal. This work provides, to our knowledge, new insight into the on-demand tailoring of broadband OAM combs, paving the way for further advancements in high-dimensional OAM-based applications.
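The difficulty the abstract names, that a phase-only grating cannot trivially encode an arbitrary OAM comb, can be shown numerically: take a target superposition of azimuthal modes, discard its amplitude, and check how much power leaks out of the target modes. The mode indices and weights below are illustrative, and this naive encoding is precisely what the paper's neural network is designed to improve upon.

```python
# Sketch: naive phase-only encoding of a 3-mode OAM comb and the
# resulting leakage of power into unwanted azimuthal modes.
import numpy as np

N = 4096
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
target_modes = {-2: 1.0, 0: 1.0, 3: 1.0}   # equal-weight comb (illustrative)

# Target field and its phase-only version (amplitude discarded).
field = sum(a * np.exp(1j * l * theta) for l, a in target_modes.items())
hologram = np.exp(1j * np.angle(field))

# Project the phase-only field onto azimuthal harmonics via the DFT:
# a component exp(i*l*theta) peaks at FFT bin l mod N.
spectrum = np.abs(np.fft.fft(hologram)) / N
power_in_targets = sum(spectrum[l % N] ** 2 for l in target_modes)
total_power = np.sum(spectrum ** 2)        # equals 1 since |hologram| = 1
print(f"fraction of power in target modes: {power_in_targets / total_power:.3f}")
```

The fraction printed is strictly below 1, i.e. some power is scattered into parasitic modes; learned hologram-design methods aim to push both the mode-weight error and this leakage down.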