CO_(2) flooding for enhanced oil recovery (EOR) not only enables underground carbon storage but also plays a critical role in tertiary oil recovery. However, its displacement efficiency is constrained by whether CO_(2) and crude oil achieve miscibility, necessitating precise prediction of the minimum miscibility pressure (MMP) for CO_(2)-oil systems. Traditional methods, such as experimental measurements and empirical correlations, face challenges including time-consuming procedures and limited applicability. In contrast, artificial intelligence (AI) algorithms have emerged as superior alternatives due to their efficiency, broad applicability, and high prediction accuracy. This study employs four AI algorithms to establish predictive models for CO_(2)-oil MMP: Random Forest Regression (RFR), Genetic Algorithm based Back-Propagation Artificial Neural Network (GA-BPNN), Support Vector Regression (SVR), and Gaussian Process Regression (GPR). A comprehensive database comprising 151 data entries was utilized for model development. The performance of these models was rigorously evaluated using five distinct statistical metrics and visualized comparisons, and validation results confirm their accuracy. Field applications demonstrate that all four models are effective for predicting MMP in ultra-deep reservoirs (burial depth > 5000 m) with complex crude oil compositions. Among them, the RFR and GA-BPNN models outperform SVR and GPR, achieving root mean square errors (RMSE) of 0.33% and 2.23%, and average absolute percentage relative errors (AAPRE) of 0.01% and 0.04%, respectively. Sensitivity analysis of MMP-influencing factors reveals that reservoir temperature (T_(R)) exerts the most significant impact on MMP, while Xint (the mole fraction of intermediate oil components, including C_(2)-C_(4), CO_(2), and H_(2)S) exhibits the least influence.
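The two error metrics quoted above follow standard definitions. A minimal sketch of both (generic formulas with invented toy values, not the study's code or data):

```python
import math

def rmse(y_true, y_pred):
    # Root mean square error: square root of the mean squared residual.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def aapre(y_true, y_pred):
    # Average absolute percentage relative error, in percent.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

measured = [10.0, 20.0]   # hypothetical MMP measurements
predicted = [11.0, 18.0]  # hypothetical model outputs
print(round(rmse(measured, predicted), 4))   # 1.5811
print(round(aapre(measured, predicted), 1))  # 10.0
```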
Satellite Precipitation Products (SPPs) face challenges in detecting Extreme Precipitation Events (EPEs). Hence, the primary objective of this research is to introduce a novel framework termed Machine-Learning Clustering-Merging Algorithms (ML-CMAs) to evaluate EPEs using SPPs and Auxiliary Data (AD). Daily precipitation measurements were utilized for training and evaluating EPE estimates over Iran, which comprises arid and semi-arid regions. Statistical analysis and evaluation of five SPPs demonstrated that during EPE occurrences all products face challenges in precipitation estimation, and using these products individually is not recommended. Among the SPPs, Multi-Source Weighted-Ensemble Precipitation (MSWEP) performed best for heavy (>20 mm d^(-1)) and extreme (>40 mm d^(-1)) precipitation events, followed by Global Satellite Mapping of Precipitation (GSMaP), Integrated Multi-Satellite Retrievals for Global Precipitation Measurement (IMERG), Climate Prediction Center morphing technique (CMORPH), and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks Dynamic Infrared-Rain Rate (PERSIANN-PDIR). The findings indicate that all proposed methods based on ML-CMAs could estimate precipitation rates more accurately than SPPs and improve statistical indices. The seasonal assessment and spatial analysis of statistical metrics of the overall daily precipitation results for all periods and climates revealed that all methods based on ML-CMAs performed well in all seasons and at nearly all measurement stations. Using unsupervised K-means++ classification for clustering EPEs and Deep Neural Network (DNN) and Convolutional Neural Network (CNN) methods for merging the ML-CMAs reduced the error rate of SPPs in EPE estimation by approximately 50%. Therefore, incorporating ML-CMAs along with precipitable water vapor (PWV) as AD can significantly improve the performance of SPPs in evaluating EPEs over the study region.
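The K-means++ seeding step mentioned above picks later cluster centers with probability proportional to their squared distance from the nearest already-chosen center. A minimal 1-D sketch (illustrative only; the rainfall values are invented, not the study's data):

```python
import random

def kmeanspp_init(points, k, rng):
    # First center: uniform random; later centers: chosen with probability
    # proportional to squared distance from the nearest existing center.
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min((p - c) ** 2 for c in centers) for p in points]
        total = sum(d2)
        r = rng.uniform(0, total)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centers.append(p)
                break
    return centers

# Toy 1-D "daily rainfall" sample (mm): two light days, two extreme days.
rain = [0.5, 1.0, 42.0, 45.0]
centers = kmeanspp_init(rain, 2, random.Random(0))
```

The D² weighting makes the second center very likely to land in the other rainfall regime, which is why K-means++ separates light and extreme events well.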
Since real-world communication channels are not error free, the coded data transmitted on them may be corrupted, and block-based image coding systems are vulnerable to transmission impairment. A best-neighborhood-match method using a genetic algorithm is therefore used to conceal the error blocks. Experimental results show that the search space can be greatly reduced by using the genetic algorithm compared with an exhaustive search, and good image quality is achieved. The peak signal-to-noise ratios (PSNRs) of the restored images are increased greatly.
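The PSNR quality metric cited above has a standard definition. A generic sketch on invented pixel sequences (not the paper's images):

```python
import math

def psnr(original, restored, peak=255.0):
    # Peak signal-to-noise ratio in dB for 8-bit pixel sequences.
    mse = sum((a - b) ** 2 for a, b in zip(original, restored)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

clean = [100, 110, 120, 130]
noisy = [101, 109, 120, 131]  # small per-pixel errors
print(round(psnr(clean, noisy), 2))
```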
To resist side-channel attacks on elliptic curve cryptography, a new fast and secure point multiplication algorithm is proposed. The algorithm is based on a particular kind of addition chain involving only additions, providing a natural protection against side-channel attacks. Moreover, new addition formulae that take into account the specific structure of those chains, making point multiplication very efficient, are proposed. The point multiplication algorithm needs only 1719 multiplications for the SAC260 of 160-bit integers. For chains of length from 280 to 260, the proposed method outperforms all previous methods, with a gain of 26% to 31% over double-and-add, 16% to 22% over NAF, 7% to 13% over 4-NAF, and 1% to 8% over the present best algorithm, the double-base chain.
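The NAF baseline quoted above saves work because the non-adjacent form recodes the scalar with fewer nonzero digits (about n/3 versus n/2 for plain binary), so fewer point additions are needed. A sketch of standard NAF recoding (a textbook technique, not the paper's chain construction):

```python
def naf(k):
    # Non-adjacent form: signed binary digits (LSB first) in {-1, 0, +1},
    # with no two adjacent nonzero digits.
    digits = []
    while k > 0:
        if k & 1:
            d = 2 - (k % 4)  # +1 or -1, chosen so the next bit is 0
            k -= d
        else:
            d = 0
        digits.append(d)
        k //= 2
    return digits

d = naf(7)            # 7 = -1 + 1*8, i.e. digits [-1, 0, 0, 1]
value = sum(di * 2 ** i for i, di in enumerate(d))
nonzero = sum(1 for di in d if di)  # 2 nonzeros vs 3 in binary 111
```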
The composition of base oils affects the performance of lubricants made from them. This paper proposes a hybrid model based on gradient-boosted decision trees (GBDT) to analyze the effect of different ratios of the KN4010, PAO40, and PriEco3000 components in a composite base oil system on the performance of lubricants. The study was conducted under small laboratory sample conditions, and a data expansion method using the Gaussian copula function was proposed to improve the prediction ability of the hybrid model. The study also compared four optimization algorithms, the slime mould algorithm (SMA), genetic algorithm (GA), whale optimization algorithm (WOA), and seagull optimization algorithm (SOA), to predict the kinematic viscosity at 40 ℃, kinematic viscosity at 100 ℃, viscosity index, and oxidation induction time of the lubricant. The results showed that the Gaussian copula data expansion method improved the prediction ability of the hybrid model in the case of small samples. The SOA-GBDT hybrid model had the fastest convergence speed and the best prediction effect, with determination coefficients (R^(2)) for the four lubricant indicators reaching 0.98, 0.99, 0.96 and 0.96, respectively. Thus, this model can significantly reduce prediction error and has good prediction ability.
In this paper, a comprehensive energy function is used to formulate the three most popular objective functions, Kapur's, Otsu's and Tsallis' functions, for performing effective multilevel color image thresholding. These new energy-based objective criteria are further combined with the proficient search capability of swarm-based algorithms to improve efficiency and robustness. The proposed multilevel thresholding approach accurately determines the optimal threshold values by using the generated energy curve, and acutely distinguishes different objects within multi-channel complex images. The performance evaluation indices and experiments on different test images illustrate that Kapur's entropy, aided with differential evolution and the bacterial foraging optimization algorithm, generates the most accurate and visually pleasing segmented images.
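Kapur's criterion picks the threshold that maximizes the summed entropies of the two resulting classes. A toy single-threshold sketch on an invented bimodal histogram (the paper extends this to multilevel, swarm-searched thresholds on energy curves):

```python
import math

def kapur_entropy(hist, t):
    # Sum of the Shannon entropies of the two classes split at threshold t.
    total = sum(hist)
    p = [h / total for h in hist]
    w0 = sum(p[:t]) or 1e-12  # class probabilities (guard against empty class)
    w1 = sum(p[t:]) or 1e-12
    h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[:t] if pi > 0)
    h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t:] if pi > 0)
    return h0 + h1

# Bimodal toy histogram: dark peak in bins 0-2, bright peak in bins 6-8.
hist = [5, 9, 5, 0, 0, 0, 4, 8, 4]
best_t = max(range(1, len(hist)), key=lambda t: kapur_entropy(hist, t))
```

The maximizing threshold falls in the empty gap between the two peaks, which is exactly where a segmentation boundary should sit.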
This paper introduces a new firefly algorithm based on opposition-based learning (OBFA) to enhance the global search ability of the original algorithm. The new algorithm employs the opposition-based learning concept both to generate the initial population and to update agents' positions. The proposed OBFA is applied to minimization of the factor of safety and the search for the critical failure surface in slope stability analysis. Numerical experiments demonstrate the effectiveness and robustness of the new algorithm.
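Opposition-based learning evaluates each candidate x together with its opposite a + b - x over the search bounds [a, b] and keeps the fitter points, which widens initial coverage at no extra sampling cost. A minimal 1-D sketch (the quadratic objective is an invented stand-in, not a slope-stability model):

```python
import random

def obl_population(lower, upper, size, fitness, rng):
    # Generate random candidates plus their opposites, keep the best half.
    cands = [rng.uniform(lower, upper) for _ in range(size)]
    cands += [lower + upper - x for x in cands]  # opposite points
    return sorted(cands, key=fitness)[:size]

f = lambda x: (x - 4.0) ** 2  # toy objective to minimize
pop = obl_population(0.0, 5.0, 10, f, random.Random(1))
```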
To improve picking efficiency and reduce picking time, this paper takes the manual order-picking operation of a distribution center with a double-area warehouse as its study object. It discusses the picking-task allocation and routing problems, establishes a TSP model of the order-picking system, and creates a heuristic algorithm based on the Genetic Algorithm (GA) that solves the task-allocation problem and obtains the associated order-picking routes. A simulation experiment implemented on the Visual C++ 6.0 platform verifies the rationality of the model and the effectiveness of the algorithm.
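A GA-based picking-route search like the one described above can be illustrated with a deliberately simplified sketch: permutation chromosomes, swap mutation only, and elitist survivor selection (far simpler than the paper's heuristic; the pick coordinates are invented):

```python
import random

def route_len(route, pts):
    # Length of the closed picking tour visiting pts in the given order.
    return sum(
        ((pts[route[i]][0] - pts[route[i - 1]][0]) ** 2 +
         (pts[route[i]][1] - pts[route[i - 1]][1]) ** 2) ** 0.5
        for i in range(len(route))
    )

def evolve(pts, pop_size=30, gens=200, rng=None):
    # Mutation-only GA: keep the best half, mutate each survivor once.
    rng = rng or random.Random(42)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: route_len(r, pts))
        survivors = pop[: pop_size // 2]  # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(n), 2)  # swap two pick positions
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda r: route_len(r, pts))

picks = [(0, 0), (0, 1), (1, 1), (1, 0)]  # four invented pick locations
best = evolve(picks)
```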
The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA, called the Archimedean copula estimation of distribution based artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses this information to help the artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster and with greater precision than the ABC algorithm, ACEDA and the global-best (gbest)-guided ABC (GABC) algorithm in most of the experiments.
For the dense macro-femto coexistence network scenario, a long-term-based handover (LTBH) algorithm is proposed. The handover decision is jointly determined by the angle of handover (AHO) and the time-to-stay (TTS) to reduce the number of unnecessary handovers. First, the proposed AHO parameter is used to decrease the computational complexity in the multiple candidate base stations (CBSs) scenario. Then, two types of TTS parameters are given for fixed and mobile base stations to make handover decisions among multiple CBSs. Simulation results show that the proposed LTBH algorithm can not only maintain the required transmission rate of users, but also effectively reduce the number of unnecessary handovers in dense macro-femto networks with coexisting mobile BSs.
The uncertain duration of each job on each machine in the flow shop problem was regarded as an independent random variable and described by its mathematical expectation. An immune-based partheno-genetic algorithm was then proposed, making use of concepts and principles introduced from the immune system and the genetic system in nature. In this method, the processing sequence of products is expressed by character encoding, and each antibody represents a feasible schedule. Affinity is used to measure the matching degree between antibody and antigen. Several antibody-producing operators, such as swapping, moving and inverting, were worked out. The algorithm combines the evolution function of the genetic algorithm with the density mechanism of the biological immune system. Promotion and inhibition of antibodies are realized through the expected propagation ratio of antibodies, and in this way premature convergence is alleviated. Simulation proved that this algorithm is effective.
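The three antibody-producing operators named above act on a character-encoded schedule. A minimal sketch of each (generic permutation operators; the sequence shown is invented):

```python
def swap(seq, i, j):
    # Exchange the jobs at positions i and j.
    s = list(seq)
    s[i], s[j] = s[j], s[i]
    return s

def move(seq, i, j):
    # Remove the job at position i and reinsert it at position j.
    s = list(seq)
    s.insert(j, s.pop(i))
    return s

def invert(seq, i, j):
    # Reverse the sub-sequence between positions i and j (inclusive).
    s = list(seq)
    s[i:j + 1] = reversed(s[i:j + 1])
    return s

antibody = ["A", "B", "C", "D", "E"]  # one feasible processing sequence
```

Each operator returns a new feasible schedule (a permutation of the same jobs), which is what makes them safe mutation moves for the antibody population.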
The fuzzy c-means (FCM) clustering algorithm is sensitive to noise points and outlier data, and the possibilistic fuzzy c-means (PFCM) clustering algorithm largely overcomes this problem, but PFCM still has weaknesses: it remains sensitive to the initial clustering centers, and its clustering results deteriorate when the tested datasets with noise are highly unbalanced. An improved kernel possibilistic fuzzy c-means algorithm based on invasive weed optimization (IWO-KPFCM) is proposed in this paper. The algorithm first uses the invasive weed optimization (IWO) algorithm to seek the optimal solution as the initial clustering centers, and introduces the kernel method to map the input data from the sample space into a high-dimensional feature space. Then, the sample variance is introduced into the objective function to measure the compactness of the data. Finally, the improved algorithm is used to cluster the data. Simulation results on University of California, Irvine (UCI) data sets and artificial data sets show that the proposed algorithm has stronger noise resistance, higher clustering accuracy and faster convergence than the PFCM algorithm.
In recent years, microarray technology has gained attention for the concurrent monitoring of numerous microarray images. It remains a major challenge to process, store and transmit such huge volumes of microarray images, so image compression techniques are used to reduce the number of bits so that images can be stored and shared easily. Various techniques have been proposed in the past, with applications in different domains. The current research paper presents a novel image compression technique, optimized Linde-Buzo-Gray (OLBG) with Lempel-Ziv-Markov chain Algorithm (LZMA) coding, called OLBG-LZMA, for compressing microarray images without any loss of quality. The LBG model is generally used to design a locally optimal codebook for image compression. Codebook construction is treated as an optimization issue and resolved with the help of the Grey Wolf Optimization (GWO) algorithm. Once the codebook is constructed by the LBG-GWO algorithm, LZMA is employed to compress the index table and raise compression efficiency further. Experiments were performed on a high-resolution Tissue Microarray (TMA) image dataset of 50 prostate tissue samples collected from prostate cancer patients. The compression performance of the proposed coding was compared with recently proposed techniques. The simulation results infer that OLBG-LZMA achieved significant compression performance compared to other techniques.
This paper investigates the problem of synchronization for offset quadrature amplitude modulation based orthogonal frequency division multiplexing (OFDM/OQAM) systems based on the genetic algorithm. To increase spectrum efficiency, an improved preamble structure without guard symbols is first derived. On this basis, instead of deriving the log-likelihood function of the power spectral density, joint estimation of the symbol timing offset and carrier frequency offset based on the proposed preamble is formulated as a bivariate optimization problem. An improved genetic algorithm is then used to find its global optimum. Simulation results show that the proposed method has advantages in the joint estimation of synchronization parameters.
The study aims to investigate the financial technology (FinTech) factors influencing Chinese banking performance. Financial expectations and global realities may be changed by FinTech's multidimensional scope, which the traditional financial sector lacks. The use of technology to automate financial services is becoming more important for economic organizations and industries because the digital age has brought a period of transition in consumers and personalization. The future of FinTech will be shaped by technologies like the Internet of Things, blockchain, and artificial intelligence. The involvement of these platforms in financial services is a major concern for global business growth, and such benefits are making FinTech more popular with customers. FinTech has driven a fundamental change within the financial services industry, placing the client at the center of everything. Protection has become a primary focus, since data are a component of FinTech transactions. The task of consolidating research reports for consensus is very manual, as there is no standardized format. Although existing research has proposed certain methods, they have drawbacks in FinTech payment systems (including cryptocurrencies), credit markets (including peer-to-peer lending), and insurance systems. This paper implements blockchain-based financial technology for the banking sector to overcome these transition issues. In this study, we propose an adaptive neuro-fuzzy-based K-nearest neighbors algorithm, optimized with a chaotic improved foraging optimization algorithm. A rolling-window autoregressive lag modeling approach analyzes FinTech growth. The proposed algorithm is compared with existing approaches to demonstrate its efficiency. The findings showed that it achieved 91% accuracy, 90% privacy, 96% robustness, and 25% cyber-risk performance. Compared with traditional approaches, the recommended strategy will be more convenient, safe, and effective in the transition period.
In this paper, we present a phase multiplication algorithm (PMA) to obtain scalable fringe precision in a laser self-mixing interferometer (SMI) under the weak feedback regime. Merely by applying the double-angle formula to the self-mixing signal multiple times, continuously improved fringe precision is obtained. Theoretical analysis shows that the fringe precision can be improved to λ/2^(n+1). The validity of the proposed method is demonstrated by means of simulated SMI signals and confirmed by experiments under different amplitudes. A fringe precision of λ/128 at a sampling rate of 500 kS/s has been achieved after applying the PMA six times. Finally, an amplitude of 50 nm has been shown to be measurable with an absolute error of 3.07 nm, which is within the theoretical error range. The proposed method for vibration measurement has the advantages of high accuracy and reliability without adding any optical elements to the optical path, so it can play an important role in the nanoscale measurement field.
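The PMA's core step is the double-angle identity cos(2φ) = 2cos²(φ) - 1 applied sample-wise to the normalized fringe signal, halving the fringe period on each pass. A sketch on an idealized cosine fringe (illustrative only, not the paper's signal processing chain):

```python
import math

def double_angle(signal):
    # cos(2*phi) = 2*cos(phi)**2 - 1, applied to each normalized sample.
    return [2.0 * s * s - 1.0 for s in signal]

phase = [i * 2.0 * math.pi / 1000.0 for i in range(1000)]
signal = [math.cos(p) for p in phase]  # idealized SMI fringe signal
doubled = double_angle(signal)         # fringe period halved

# After n applications the fringe spacing shrinks from lambda/2 to lambda/2**(n+1).
```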
The iterative closest point (ICP) algorithm has the advantages of high accuracy and fast speed for point set registration, but it performs poorly when the point set contains a large number of noisy outliers. To solve this problem, we propose a new affine registration algorithm based on correntropy, which works well in the affine registration of point sets with outliers. First, we substitute the traditional least-squares measure with a maximum correntropy criterion to build a new registration model, which avoids the influence of outliers. To maximize the objective function, we then propose a robust affine ICP algorithm. At each iteration, the algorithm sets up the index mapping of the two point sets according to the known transformation, and then computes the closed-form solution of the new transformation according to the known index mapping. Like the traditional ICP algorithm, our algorithm converges monotonically to a local maximum for any given initial value. Finally, the robustness and high efficiency of the correntropy-based affine ICP algorithm are demonstrated by 2D and 3D point set registration experiments.
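Why correntropy resists outliers: the Gaussian kernel gives each residual a weight exp(-e²/2σ²), so gross errors contribute almost nothing. A 1-D location-estimation sketch of this effect (not the paper's affine ICP; data and σ are invented):

```python
import math

def correntropy_mean(xs, sigma=1.0, iters=50):
    # Fixed-point iteration maximizing sum(exp(-(x-mu)**2 / (2*sigma**2))):
    # each step is a weighted mean where outliers get exponentially small weights.
    mu = sorted(xs)[len(xs) // 2]  # start from the median
    for _ in range(iters):
        w = [math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

data = [0.0, 0.1, -0.1, 10.0]  # one gross outlier
print(correntropy_mean(data))  # stays near 0
print(sum(data) / len(data))   # least-squares mean dragged to 2.5
```

The least-squares estimate is pulled toward the outlier, while the correntropy estimate essentially ignores it, mirroring the robustness claimed for the registration model.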
With the development of the global positioning system (GPS), wireless technology and location-aware services, it is possible to collect a large quantity of trajectory data. In the field of data mining for moving objects, anomaly detection is a hot topic. Building on the development of anomalous trajectory detection for moving objects, this paper introduces the classical trajectory outlier detection (TRAOD) algorithm, and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for the TRAOD algorithm's inability to detect anomalies when trajectories are local and dense. Results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are also presented, demonstrating the effectiveness of the algorithm.
Wireless node localization is one of the key technologies for wireless sensor networks. Outdoor localization can use GPS or AGPS (Assisted Global Positioning System) [6], but in buildings like supermarkets and underground parking garages, the accuracy of GPS and even AGPS is greatly reduced. Since indoor localization requires higher accuracy, using GPS or AGPS for indoor localization is not currently feasible. The RSSI-based trilateral localization algorithm, due to its low cost, lack of additional hardware requirements, and ease of understanding, has become the mainstream localization algorithm in wireless sensor networks. With the development of wireless sensor networks and smart devices, the number of Wi-Fi access points in such buildings is increasing; as long as a mobile smart device can detect three or more known Wi-Fi hotspots' positions, it is relatively easy to realize self-localization (usually Wi-Fi access point locations are fixed). The key problem is that the RSSI value is vulnerable to the influence of the physical environment, causing large calculation errors in RSSI-based localization algorithms. This paper proposes an improved RSSI-based algorithm; experimental results show that, compared with the original RSSI-based localization algorithms, it improves localization accuracy and reduces deviation.
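The trilateral localization pipeline described above can be sketched in two steps: convert RSSI to distance with a log-distance path-loss model, then solve the three-circle system by subtracting one circle equation from the others to get a linear 2x2 system. The path-loss parameters (`rssi_1m`, exponent `n`) and the AP coordinates below are invented assumptions, not values from the paper:

```python
def rssi_to_distance(rssi, rssi_1m=-40.0, n=2.0):
    # Log-distance path-loss model: rssi = rssi_1m - 10*n*log10(d).
    return 10 ** ((rssi_1m - rssi) / (10.0 * n))

def trilaterate(anchors, dists):
    # Subtract the third circle equation from the first two, yielding a
    # linear system in (x, y); solve it with Cramer's rule.
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a1, b1 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = d1 ** 2 - d3 ** 2 - x1 ** 2 + x3 ** 2 - y1 ** 2 + y3 ** 2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2 ** 2 - d3 ** 2 - x2 ** 2 + x3 ** 2 - y2 ** 2 + y3 ** 2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

aps = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]  # assumed known Wi-Fi AP positions
true = (2.0, 3.0)
d = [((true[0] - x) ** 2 + (true[1] - y) ** 2) ** 0.5 for x, y in aps]
print(trilaterate(aps, d))  # ≈ (2.0, 3.0)
```

With exact distances the position is recovered exactly; in practice the RSSI-induced distance errors feed directly into this solve, which is why the paper's RSSI corrections matter.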
Funding (elliptic curve point multiplication study): the National Natural Science Foundation of China (Nos. 60473029, 60673072).
Funding (lubricant base oil composition study): the Beijing Natural Science Foundation (Grant 2232066) and the Open Project Foundation of the State Key Laboratory of Solid Lubrication (Grant LSL-2212).
文摘In this paper, a comprehensive energy function is used to formulate the three most popular objective functions:Kapur's, Otsu and Tsalli's functions for performing effective multilevel color image thresholding. These new energy based objective criterions are further combined with the proficient search capability of swarm based algorithms to improve the efficiency and robustness. The proposed multilevel thresholding approach accurately determines the optimal threshold values by using generated energy curve, and acutely distinguishes different objects within the multi-channel complex images. The performance evaluation indices and experiments on different test images illustrate that Kapur's entropy aided with differential evolution and bacterial foraging optimization algorithm generates the most accurate and visually pleasing segmented images.
Abstract: This paper introduces a new variant of the firefly algorithm based on opposition-based learning (OBFA) to enhance the global search ability of the original algorithm. The new algorithm employs the opposition-based learning concept both to generate the initial population and to update agents' positions. The proposed OBFA is applied to minimization of the factor of safety and the search for the critical failure surface in slope stability analysis. Numerical experiments demonstrate the effectiveness and robustness of the new algorithm.
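The opposition-based learning idea is simple: for a candidate x in the box [lo, hi], its opposite is lo + hi - x, and evaluating both doubles the chance of starting near the optimum. A minimal sketch of OBL population initialization (helper names are ours, not the paper's; the firefly update itself is not shown):

```python
import random

def opposite(x, lo, hi):
    # OBL: the opposite of x in [lo_j, hi_j] is lo_j + hi_j - x_j per dimension.
    return [l + h - xi for xi, l, h in zip(x, lo, hi)]

def obl_init(pop_size, lo, hi, fitness, seed=0):
    # Generate a random population, add the opposite of each candidate,
    # and keep the pop_size fittest individuals from the combined pool.
    rng = random.Random(seed)
    pop = [[rng.uniform(l, h) for l, h in zip(lo, hi)] for _ in range(pop_size)]
    pool = pop + [opposite(x, lo, hi) for x in pop]
    pool.sort(key=fitness)
    return pool[:pop_size]
```

The same opposite-point construction can be applied during position updates, as the abstract describes.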
Abstract: To improve picking efficiency and reduce picking time, this paper takes the manual order-picking operation of a distribution center with a double-area warehouse as its study object. It discusses the picking-task allocation and routing problems, establishes a TSP model of the order-picking system, and creates a heuristic algorithm based on the Genetic Algorithm (GA) to solve the task-allocation problem and obtain the associated order-picking routes. A simulation experiment on the Visual C++ 6.0 platform proves the rationality of the model and the effectiveness of the algorithm.
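A GA for a TSP-style picking route can be sketched with order crossover and swap mutation. This is a generic sketch of the approach named in the abstract, not the paper's specific heuristic; all names and parameters are ours.

```python
import random

def route_len(route, dist):
    # Total length of the closed tour.
    return sum(dist[route[i]][route[(i + 1) % len(route)]] for i in range(len(route)))

def ox(p1, p2, rng):
    # Order crossover: copy a slice from p1, fill remaining cities in p2's order.
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def ga_tsp(dist, pop_size=40, gens=200, pm=0.2, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: route_len(r, dist))
        elite = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            c = ox(rng.choice(elite), rng.choice(elite), rng)
            if rng.random() < pm:               # swap mutation
                i, j = rng.sample(range(n), 2)
                c[i], c[j] = c[j], c[i]
            children.append(c)
        pop = elite + children
    return min(pop, key=lambda r: route_len(r, dist))
```

On a small instance (e.g. six pick locations on a 2 x 1 grid, optimal tour length 6) this finds the optimum quickly.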
Funding: Supported by the National Natural Science Foundation of China (61201370), the Special Funding Project for Independent Innovation Achievement Transform of Shandong Province (2012CX30202), and the Natural Science Foundation of Shandong Province (ZR2014FM039).
Abstract: The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic, multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA, called the Archimedean copula estimation of distribution based artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses that information to help the artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster and with greater precision than the ABC algorithm, ACEDA, and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.
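For context, the baseline ABC loop that ACABC builds on cycles through employed, onlooker, and scout phases, perturbing one dimension of a food source toward a random partner: v_ij = x_ij + phi * (x_ij - x_kj). A minimal sketch of plain ABC (not ACABC; function names and parameters are ours, and a non-negative cost function is assumed for the onlooker weights):

```python
import random

def abc_minimize(f, lo, hi, n_food=15, limit=20, iters=200, seed=0):
    rng = random.Random(seed)
    d = len(lo)
    foods = [[rng.uniform(lo[j], hi[j]) for j in range(d)] for _ in range(n_food)]
    fit = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbor(i):
        # v_ij = x_ij + phi * (x_ij - x_kj) for one random dimension j, partner k.
        k = rng.choice([m for m in range(n_food) if m != i])
        j = rng.randrange(d)
        v = foods[i][:]
        v[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        v[j] = min(max(v[j], lo[j]), hi[j])
        return v

    def greedy(i, v):
        # Greedy replacement; a failed trial counts toward abandonment.
        fv = f(v)
        if fv < fit[i]:
            foods[i], fit[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                      # employed bees
            greedy(i, neighbor(i))
        weights = [1.0 / (1.0 + c) for c in fit]     # onlooker selection probabilities
        for _ in range(n_food):                      # onlooker bees
            greedy(rng.choices(range(n_food), weights=weights)[0],
                   neighbor(rng.choices(range(n_food), weights=weights)[0]))
        for i in range(n_food):                      # scout bees
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo[j], hi[j]) for j in range(d)]
                fit[i], trials[i] = f(foods[i]), 0
    i = min(range(n_food), key=lambda i: fit[i])
    return foods[i], fit[i]
```

ACABC replaces the purely local neighbor move with samples from the copula-estimated distribution model.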
基金The National Natural Science Foundation of China(No.61471164)the Fundamental Research Funds for the Central Universitiesthe Scientific Innovation Research of College Graduates in Jiangsu Province(No.KYLX-0133)
Abstract: For the dense macro-femto coexistence network scenario, a long-term-based handover (LTBH) algorithm is proposed. The handover decision is jointly determined by the angle of handover (AHO) and the time-to-stay (TTS) to reduce the number of unnecessary handovers. First, the proposed AHO parameter is used to decrease the computational complexity in the multiple candidate base station (CBS) scenario. Then, two types of TTS parameters are given for fixed and mobile base stations to make handover decisions among multiple CBSs. Simulation results show that the proposed LTBH algorithm can not only maintain the required transmission rate of users but also effectively reduce the number of unnecessary handovers in dense macro-femto networks with coexisting mobile BSs.
Abstract: The uncertain duration of each job on each machine in the flow shop problem was regarded as an independent random variable and described by its mathematical expectation. An immune-based partheno-genetic algorithm was then proposed, making use of concepts and principles drawn from natural immune and genetic systems. In this method, the processing sequence of products is expressed by character encoding, and each antibody represents a feasible schedule. Affinity is used to measure the matching degree between antibody and antigen. Several antibody-producing operators, such as swapping, moving, and inverting, were worked out. The algorithm combines the evolution function of the genetic algorithm with the density mechanism of the biological immune system. Promotion and inhibition of antibodies are realized through the expected propagation ratio of antibodies, and in this way premature convergence is mitigated. Simulation proved that the algorithm is effective.
Abstract: The fuzzy c-means (FCM) clustering algorithm is sensitive to noise points and outlier data, and the possibilistic fuzzy c-means (PFCM) clustering algorithm overcomes this problem well. However, PFCM still has shortcomings: it remains sensitive to initial clustering centers, and clustering results are poor when the tested noisy datasets are highly imbalanced. An improved kernel possibilistic fuzzy c-means algorithm based on invasive weed optimization (IWO-KPFCM) is proposed in this paper. The algorithm first uses the invasive weed optimization (IWO) algorithm to seek the optimal solution as the initial clustering centers, and introduces a kernel method to map the input data from the sample space into a high-dimensional feature space. Then, the sample variance is introduced into the objective function to measure the compactness of the data. Finally, the improved algorithm is used to cluster the data. Simulation results on University of California, Irvine (UCI) data sets and artificial data sets show that the proposed algorithm has stronger noise resistance, higher clustering accuracy, and faster convergence speed than the PFCM algorithm.
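For orientation, the plain FCM baseline that PFCM and KPFCM extend alternates two updates: centers are membership-weighted means, and memberships are inverse-distance ratios. A minimal sketch of standard FCM only (no possibilistic term, kernel mapping, or IWO initialization; names are ours):

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    # Standard fuzzy c-means: X is (n, d), c clusters, fuzzifier m > 1.
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        # Distances from every center to every point, shape (c, n).
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        # u_ik = d_ik^{-p} / sum_j d_jk^{-p}
        U = (1.0 / d ** p) / (1.0 / d ** p).sum(axis=0)
    return centers, U
```

On well-separated clusters the centers converge close to the per-cluster sample means; PFCM adds a typicality term so that outliers pull on the centers far less.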
Abstract: In recent years, microarray technology has gained attention for the concurrent monitoring of numerous microarray images. Processing, storing, and transmitting such huge volumes of microarray images remains a major challenge, so image compression techniques are used to reduce the number of bits required for storage and sharing. Various techniques have been proposed in the past, with applications in different domains. The current research paper presents a novel image compression technique, optimized Linde-Buzo-Gray (OLBG) with Lempel-Ziv-Markov algorithm (LZMA) coding, called OLBG-LZMA, for compressing microarray images without any loss of quality. The LBG model is generally used to design a locally optimal codebook for image compression. Codebook construction is treated as an optimization issue and resolved with the Grey Wolf Optimization (GWO) algorithm. Once the codebook has been constructed by the LBG-GWO algorithm, LZMA is employed to compress the index table and further raise compression efficiency. Experiments were performed on a high-resolution Tissue Microarray (TMA) image dataset of 50 prostate tissue samples collected from prostate cancer patients. The compression performance of the proposed coding was compared with recently proposed techniques. The simulation results infer that OLBG-LZMA achieved significant compression performance compared to other techniques.
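The classical LBG codebook design that the paper optimizes works by splitting: start from the global centroid, perturb it into two code vectors, refine with Lloyd iterations, and repeat until the codebook reaches the target size. A minimal sketch of that baseline (no GWO refinement; names are ours):

```python
import numpy as np

def lbg_codebook(X, size, eps=0.01, iters=20):
    # X: (n, d) training vectors; returns a codebook of `size` code vectors.
    codebook = X.mean(axis=0, keepdims=True)      # start from the global centroid
    while len(codebook) < size:
        # Split every code vector into a slightly perturbed pair.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):                    # Lloyd refinement
            d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            assign = d.argmin(axis=1)
            for k in range(len(codebook)):
                pts = X[assign == k]
                if len(pts):
                    codebook[k] = pts.mean(axis=0)
    return codebook
```

In the paper's pipeline, GWO searches for a better codebook than this local optimum, and LZMA then compresses the resulting index table.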
基金supported by the National Natural Science Foundation of China(61671468)。
Abstract: This paper investigates the problem of synchronization for offset quadrature amplitude modulation based orthogonal frequency division multiplexing (OFDM/OQAM) systems using the genetic algorithm. To increase spectrum efficiency, an improved preamble structure without guard symbols is derived first. On this basis, instead of deriving the log-likelihood function of the power spectral density, joint estimation of the symbol timing offset and carrier frequency offset based on the proposed preamble is formulated as a bivariate optimization problem. An improved genetic algorithm is then used to find its global optimum. Simulation results show that the proposed method has advantages in the joint estimation of synchronization parameters.
基金from funding agencies in the public,commercial,or not-for-profit sectors.
Abstract: The study aims to investigate the financial technology (FinTech) factors influencing Chinese banking performance. Financial expectations and global realities may be changed by FinTech's multidimensional scope, which the traditional financial sector lacks. The use of technology to automate financial services is becoming more important for economic organizations and industries, because the digital age has brought a period of transition in terms of consumers and personalization. The future of FinTech will be shaped by technologies such as the Internet of Things, blockchain, and artificial intelligence. The involvement of these platforms in financial services is a major concern for global business growth, and FinTech is becoming more popular with customers because of such benefits. FinTech has driven a fundamental change within the financial services industry, placing the client at the center of everything. Protection has become a primary focus since data are a component of FinTech transactions. The task of consolidating research reports for consensus is very manual, as there is no standardized format. Although existing research has proposed certain methods, they have drawbacks in FinTech payment systems (including cryptocurrencies), credit markets (including peer-to-peer lending), and insurance systems. This paper implements blockchain-based financial technology for the banking sector to overcome these transition issues. In this study, we propose an adaptive neuro-fuzzy-based K-nearest neighbors algorithm, optimized with the chaotic improved foraging optimization algorithm. A rolling-window autoregressive lag modeling approach analyzes FinTech growth. The proposed algorithm is compared with existing approaches to demonstrate its efficiency. The findings showed that it achieved 91% accuracy, 90% privacy, 96% robustness, and 25% cyber-risk performance. Compared with traditional approaches, the recommended strategy will be more convenient, safe, and effective in the transition period.
基金supported by the Natural Science Foundation of Fujian Province(No.2020J01705)the School Foundation of Jimei University(No.C150345)。
Abstract: In this paper, we present a phase multiplication algorithm (PMA) to obtain scalable fringe precision in a laser self-mixing interferometer (SMI) under a weak feedback regime. Merely by applying the double-angle formula to the self-mixing signal multiple times, continuously improved fringe precision is obtained. Theoretical analysis shows that the fringe precision can be improved to λ/2^(n+1). The validity of the proposed method is demonstrated by means of simulated SMI signals and confirmed by experiments at different amplitudes. A fringe precision of λ/128 at a sampling rate of 500 kS/s was achieved after applying the PMA six times. Finally, an amplitude of 50 nm was shown to be measurable with an absolute error of 3.07 nm, which is within the theoretical error range. The proposed method for vibration measurement offers high accuracy and reliability without adding any optical elements to the optical path, so it can play an important role in the nanoscale measurement field.
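The core of the PMA is the cosine double-angle identity: if the weak-feedback fringe signal is s = cos(φ), then 2s² − 1 = cos(2φ), which doubles the fringe count and halves the displacement per fringe (hence λ/2^(n+1) after n applications). A sketch on a simulated cosine-fringe signal (an idealized signal model; function names are ours):

```python
import numpy as np

def double_angle(s):
    # One PMA step: if s = cos(phi), then 2*s**2 - 1 = cos(2*phi).
    return 2.0 * s ** 2 - 1.0

def count_fringes(s):
    # Zero crossings of the fringe signal, a simple fringe-counting proxy.
    return int(np.abs(np.diff(np.signbit(s).astype(int))).sum())

# Illustrative signal: sinusoidal vibration with amplitude lambda/4, so the
# interferometric phase is phi = (4*pi/lambda) * x(t) = pi * sin(2*pi*t).
t = np.linspace(0.0, 1.0, 20000)
phi = np.pi * np.sin(2.0 * np.pi * t)
s = np.cos(phi)
```

Each application of `double_angle` doubles the zero-crossing count of this signal, which is exactly the fringe-multiplication effect the paper exploits.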
基金supported in part by the National Natural Science Foundation of China(61627811,61573274,61673126,U1701261)
Abstract: The iterative closest point (ICP) algorithm has the advantages of high accuracy and fast speed for point set registration, but it performs poorly when the point set contains a large number of noisy outliers. To solve this problem, we propose a new affine registration algorithm based on correntropy, which works well in the affine registration of point sets with outliers. First, we substitute the traditional least-squares measure with a maximum correntropy criterion to build a new registration model, which avoids the influence of outliers. To maximize the objective function, we then propose a robust affine ICP algorithm. At each iteration of this new algorithm, we set up the index mapping of the two point sets according to the known transformation, and then compute the closed-form solution for the new transformation according to the known index mapping. As with the traditional ICP algorithm, our algorithm converges monotonically to a local maximum for any given initial value. Finally, the robustness and high efficiency of the correntropy-based affine ICP algorithm are demonstrated by 2D and 3D point set registration experiments.
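The robustness of the correntropy criterion comes from its Gaussian kernel: residuals enter the objective as exp(−r²/2σ²), so gross outliers contribute almost nothing. A minimal sketch of the transformation-estimation step only, posed as iteratively reweighted least squares with known correspondences (the full method alternates this with the closest-point matching; names are ours):

```python
import numpy as np

def affine_correntropy(X, Y, sigma=1.0, iters=30):
    # Fit Y ~ A @ x + b under a maximum correntropy criterion via IRLS.
    n, d = X.shape
    Xh = np.hstack([X, np.ones((n, 1))])        # homogeneous coordinates
    M = np.linalg.lstsq(Xh, Y, rcond=None)[0]   # initial ordinary least squares
    for _ in range(iters):
        r = np.linalg.norm(Y - Xh @ M, axis=1)
        w = np.exp(-r ** 2 / (2 * sigma ** 2))  # Gaussian correntropy weights
        Wh = Xh * w[:, None]
        # Weighted normal equations: (Xh^T W Xh) M = Xh^T W Y.
        M = np.linalg.solve(Xh.T @ Wh, Wh.T @ Y)
    return M[:d].T, M[d]                        # affine matrix A, translation b
```

Points with large residuals get exponentially small weights, so a fraction of grossly displaced correspondences barely perturbs the recovered affine transform.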
基金supported by the Aeronautical Science Foundation of China(20111052010)the Jiangsu Graduates Innovation Project (CXZZ120163)+1 种基金the "333" Project of Jiangsu Provincethe Qing Lan Project of Jiangsu Province
Abstract: With the development of the Global Positioning System (GPS), wireless technology, and location-aware services, it is possible to collect a large quantity of trajectory data. In the field of data mining for moving objects, anomaly detection is a hot topic. Building on the development of anomalous trajectory detection for moving objects, this paper introduces the classical trajectory outlier detection (TRAOD) algorithm and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for the TRAOD algorithm's inability to detect anomalies when trajectories are local and dense. Results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are also presented, showing the effectiveness of the algorithm.
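A simple density-based scoring scheme in the spirit of DBTOD (though not the paper's exact algorithm; names and the distance choice are ours) flags a trajectory whose k-nearest-neighbor distances are large relative to its peers:

```python
import numpy as np

def traj_distance(a, b):
    # Symmetric mean closest-point distance between two polyline trajectories
    # given as (n, 2) and (m, 2) arrays of sample points.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def knn_outlier_scores(D, k=3):
    # Density proxy from a pairwise distance matrix D: mean distance to the
    # k nearest neighbors (self excluded); higher scores mean more anomalous.
    idx = np.argsort(D, axis=1)[:, 1:k + 1]
    return D[np.arange(len(D))[:, None], idx].mean(axis=1)
```

A trajectory lying far from every dense bundle of similar trajectories receives the highest score.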
Abstract: Wireless node localization is one of the key technologies for wireless sensor networks. Outdoor localization can use GPS or the Assisted Global Positioning System (AGPS) [6], but in buildings such as supermarkets and underground parking garages, the accuracy of GPS and even AGPS is greatly reduced. Since indoor localization demands higher accuracy, using GPS or AGPS indoors is currently not feasible. The RSSI-based trilateral localization algorithm has become the mainstream localization algorithm in wireless sensor networks owing to its low cost, lack of additional hardware requirements, and ease of understanding. With the development of wireless sensor networks and smart devices, the number of Wi-Fi access points in these buildings is increasing; as long as a mobile smart device can detect the positions of three or more known Wi-Fi hotspots, it is relatively easy to realize self-localization (Wi-Fi access point locations are usually fixed). The key problem is that the RSSI value is relatively vulnerable to the influence of the physical environment, causing large calculation errors in RSSI-based localization. This paper proposes an improved RSSI-based algorithm; experimental results show that, compared with the original RSSI-based localization algorithms, it improves localization accuracy and reduces deviation.
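The baseline pipeline the paper improves on has two steps: convert each RSSI reading to a distance with a log-distance path-loss model, then intersect the three range circles. A minimal sketch (the reference power `rssi0` and path-loss exponent `n` are illustrative assumptions, as is the linearized two-equation solver):

```python
def rssi_to_distance(rssi, rssi0=-40.0, n=2.5):
    # Log-distance path-loss model: rssi = rssi0 - 10*n*log10(d),
    # where rssi0 is the received power at 1 m (both values assumed here).
    return 10 ** ((rssi0 - rssi) / (10 * n))

def trilaterate(anchors, dists):
    # Subtract the first circle equation from the other two to get a
    # linear 2x2 system in (x, y), then solve it by Cramer's rule.
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det
```

With exact distances this recovers the true position; in practice RSSI noise corrupts the distances, which is precisely the error source the paper's improved algorithm targets.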