Brain tumor segmentation from Magnetic Resonance Imaging (MRI) supports neurologists and radiologists in analyzing tumors and developing personalized treatment plans, making it a crucial yet challenging task. Supervised models such as 3D U-Net perform well in this domain, but their accuracy improves significantly with appropriate preprocessing. This paper demonstrates the effectiveness of preprocessing in brain tumor segmentation by applying a pre-segmentation step based on the Generalized Gaussian Mixture Model (GGMM) to T1 contrast-enhanced MRI scans from the BraTS 2020 dataset. The Expectation-Maximization (EM) algorithm is employed to estimate parameters for four tissue classes, generating a new pre-segmented channel that enhances the training and performance of the 3D U-Net model. The proposed GGMM + 3D U-Net framework achieved a Dice coefficient of 0.88 for whole-tumor segmentation, outperforming both the standard multiscale 3D U-Net (0.84) and MMU-Net (0.85). It also delivered higher Intersection over Union (IoU) scores compared with models trained without preprocessing or with simpler GMM-based segmentation. These results, supported by qualitative visualizations, suggest that GGMM-based preprocessing should be integrated into brain tumor segmentation pipelines to optimize performance.
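As a simplified illustration of the EM step described above, the sketch below fits a plain 1-D Gaussian mixture to intensity samples, not the generalized Gaussian variant the paper uses; the function name and the deterministic initialization are our own.

```python
import math

def em_gmm_1d(x, k, iters=50):
    # Deterministic init: means spread evenly over the data range
    lo, hi = min(x), max(x)
    mu = [lo + (j + 0.5) * (hi - lo) / k for j in range(k)]
    var = [1.0] * k          # component variances
    pi = [1.0 / k] * k       # mixing weights
    for _ in range(iters):
        # E-step: responsibilities resp[i][j] = P(class j | x_i)
        resp = []
        for xi in x:
            p = [pi[j] * math.exp(-(xi - mu[j]) ** 2 / (2 * var[j]))
                 / math.sqrt(2 * math.pi * var[j]) for j in range(k)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: re-estimate means, variances, and weights
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-12
            mu[j] = sum(r[j] * xi for r, xi in zip(resp, x)) / nj
            var[j] = max(sum(r[j] * (xi - mu[j]) ** 2
                             for r, xi in zip(resp, x)) / nj, 1e-6)
            pi[j] = nj / len(x)
    return mu, var, pi
```

A pre-segmented channel can then be built by assigning each voxel to its most responsible component.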
In the study of composite material performance, X-ray computed tomography (XCT) scanning has always been one of the important means of inspecting internal structures. CT image segmentation technology effectively improves the accuracy of the subsequent material feature extraction process, which is of great significance to the study of material performance. This study focuses on the low segmentation accuracy caused by fiber cross-section adhesion in composite CT images. In the core layer area, region validity is evaluated by a morphological indicator, and an iterative segmentation strategy based on the watershed algorithm is proposed. In the transition layer area, a U-net neural network model trained on artificial labels is applied to predict the segmentation result. Furthermore, a CT image segmentation method for fiber composite materials based on the improved watershed algorithm and the U-net model is proposed. Experiments verify that the method adapts well to the CT image segmentation problem of composite materials and that segmentation accuracy is significantly improved in comparison with the original method, which ensures the accuracy and robustness of the subsequent fiber feature extraction process.
The demand for cybersecurity is rising due to the rapid improvement of network technologies. As a primary defense mechanism, an intrusion detection system (IDS) is expected to adapt and secure computing infrastructures against a constantly evolving, sophisticated threat landscape. Various deep learning methods have recently been put forth; however, these methods struggle to recognize all forms of attack, especially infrequent ones, because of network traffic imbalance and a shortage of anomalous traffic samples for model training. This work introduces a deep learning (DL) based Attention-based Nested U-Net (ANU-Net) for intrusion detection to address these issues and enhance detection performance. For this IDS model, data preprocessing is first carried out in three stages: duplication elimination, label transformation, and data normalization. The features are then extracted and selected with the Improved Flower Pollination Algorithm (IFPA). The Improved Monarchy Butterfly Optimization Algorithm (IMBO), a new metaheuristic, is used to tune the hyper-parameters of ANU-Net, effectively increasing the learning rate for spatial-temporal information and resolving the imbalance problem. Through parallel programming, the MapReduce architecture reduces computational complexity while significantly accelerating processing. Three publicly available datasets were used to evaluate and test the approach. The experimental outcomes suggest that the proposed technique can more efficiently boost IDS performance under unbalanced data. The proposed method achieves above 98% accuracy and classifies various attacks significantly better than other classifiers.
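The three preprocessing stages named above can be sketched in a few lines; the helper name is ours, and the label map is a placeholder, not the paper's actual encoding.

```python
def preprocess(rows, labels, label_map):
    # 1. Duplicate elimination (keeps first occurrence, preserves order)
    seen, uniq_rows, uniq_labels = set(), [], []
    for r, y in zip(rows, labels):
        key = tuple(r)
        if key not in seen:
            seen.add(key)
            uniq_rows.append(list(r))
            uniq_labels.append(y)
    # 2. Label transformation: string labels -> integer codes
    enc = [label_map[y] for y in uniq_labels]
    # 3. Min-max normalization, column by column, into [0, 1]
    cols = list(zip(*uniq_rows))
    mins = [min(c) for c in cols]
    maxs = [max(c) for c in cols]
    norm = [[(v - lo) / (hi - lo) if hi > lo else 0.0
             for v, lo, hi in zip(r, mins, maxs)] for r in uniq_rows]
    return norm, enc
```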
Since real-world communication channels are not error free, the coded data transmitted on them may be corrupted, and block-based image coding systems are vulnerable to transmission impairment. The best-neighborhood-match method using a genetic algorithm is therefore used to conceal the error blocks. Experimental results show that the search space can be greatly reduced by using the genetic algorithm compared with exhaustive search, and good image quality is achieved. The peak signal-to-noise ratios (PSNRs) of the restored images are increased greatly.
To resist side-channel attacks on elliptic curve cryptography, a new fast and secure point multiplication algorithm is proposed. The algorithm is based on a particular kind of addition chain involving only additions, providing natural protection against side-channel attacks. Moreover, new addition formulae that take into account the specific structure of those chains, making point multiplication very efficient, are proposed. The point multiplication algorithm needs only 1 719 multiplications for the SAC260 of 160-bit integers. For chains of length from 280 to 260, the proposed method outperforms all previous methods, with a gain of 26% to 31% over double-and-add, 16% to 22% over NAF, 7% to 13% over 4-NAF, and 1% to 8% over the best existing algorithm, the double-base chain.
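For contrast with the addition-chain approach, here is the classic double-and-add scalar multiplication it is benchmarked against, on a deliberately tiny curve y² = x³ + 2x + 3 over F₉₇ (toy parameters of our choosing, not the 160-bit setting of the paper):

```python
P, A, B = 97, 2, 3   # toy curve y^2 = x^3 + 2x + 3 over F_97

def ec_add(p1, p2):
    # Affine point addition; None represents the point at infinity
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:   # doubling: lambda = (3x^2 + A) / 2y
        lam = (3 * x1 * x1 + A) * pow(2 * y1, P - 2, P) % P
    else:          # addition: lambda = (y2 - y1) / (x2 - x1)
        lam = (y2 - y1) * pow(x2 - x1, P - 2, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    # Right-to-left double-and-add scalar multiplication
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc
```

The conditional branch on each key bit is exactly the data-dependent behavior that side-channel attacks exploit, which motivates the addition-only chains of the paper.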
In this paper, a comprehensive energy function is used to formulate the three most popular objective functions, Kapur's, Otsu's, and Tsallis' functions, for effective multilevel color image thresholding. These new energy-based objective criteria are further combined with the proficient search capability of swarm-based algorithms to improve efficiency and robustness. The proposed multilevel thresholding approach accurately determines the optimal threshold values by using the generated energy curve, and acutely distinguishes different objects within multi-channel complex images. The performance evaluation indices and experiments on different test images illustrate that Kapur's entropy, aided by differential evolution and the bacterial foraging optimization algorithm, generates the most accurate and visually pleasing segmented images.
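As a concrete reference point for the objective functions mentioned, a minimal single-threshold Otsu criterion (maximizing between-class variance over a histogram) can be written as follows; extending it to several thresholds is the combinatorial search that the swarm algorithms in the paper accelerate:

```python
def otsu_threshold(pixels, levels=256):
    # Exhaustive single-threshold Otsu: maximize between-class variance
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total_mean = sum(i * h for i, h in enumerate(hist)) / n
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(levels - 1):
        w0 += hist[t]          # pixels with intensity <= t
        sum0 += t * hist[t]
        w1 = n - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = sum0 / w0
        mu1 = (total_mean * n - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```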
This paper introduces a new variant of the firefly algorithm based on opposition-based learning (OBFA) to enhance the global search ability of the original algorithm. The new algorithm employs the opposition-based learning concept both to generate the initial population and to update agents' positions. The proposed OBFA is applied to minimization of the factor of safety and the search for the critical failure surface in slope stability analysis. Numerical experiments demonstrate the effectiveness and robustness of the new algorithm.
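The opposition-based learning idea reduces to one line: for a candidate x in [lo, hi], its opposite is lo + hi − x. A hedged sketch of OBL initialization follows (function names are ours, with a sphere fitness purely for demonstration):

```python
import random

def obl_init(pop_size, dim, lo, hi, fitness, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    # Opposite of x in [lo, hi] is lo + hi - x, applied coordinate-wise
    opp = [[lo + hi - x for x in ind] for ind in pop]
    # Keep the best pop_size individuals from the union of both sets
    return sorted(pop + opp, key=fitness)[:pop_size]

def sphere(ind):
    # Simple demonstration fitness: sum of squares (minimization)
    return sum(x * x for x in ind)
```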
In order to improve picking efficiency and reduce picking time, this paper takes the manual picking operation of a distribution center with a double-area warehouse as its object of study. It discusses the picking task allocation and routing problems, establishes a TSP model of the order-picking system, and creates a heuristic algorithm based on the Genetic Algorithm (GA) that solves the task allocation problem and yields the associated order-picking routes. A simulation experiment on the Visual C++ 6.0 platform verifies the rationality of the model and the effectiveness of the algorithm.
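A minimal GA for the picking-route TSP can be sketched as below (swap-mutation only, Euclidean distances, truncation selection; all parameters and names are illustrative, not those of the paper):

```python
import random

def tour_length(order, pts):
    # Closed-tour Euclidean length over the visiting order
    return sum(((pts[a][0] - pts[b][0]) ** 2 +
                (pts[a][1] - pts[b][1]) ** 2) ** 0.5
               for a, b in zip(order, order[1:] + order[:1]))

def ga_tsp(pts, pop_size=30, gens=300, seed=7):
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: tour_length(o, pts))
        survivors = pop[:pop_size // 2]              # truncation selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.randrange(n), rng.randrange(n)
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: tour_length(o, pts))
```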
The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic, multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA, called the Archimedean copula estimation of distribution based artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses that information to help the artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster and with greater precision than the ABC algorithm, ACEDA, and the global-best-guided ABC (GABC) algorithm in most of the experiments.
Ore image segmentation is a key step in ore grain size analysis based on image processing. Traditional segmentation methods do not deal well with ore textures and shadows in ore images, and often suffer from under-segmentation and over-segmentation. To solve this problem, this article proposes an ore image segmentation method based on U-Net. We adjust the structure of U-Net to speed up processing, and modify the loss function to enhance the generalization of the model. After collecting the ore images, we design an annotation standard and train the network with the annotated images. Finally, the marker-based watershed algorithm is used to segment adhesion areas. Experimental results show that the proposed method is fast, robust, and highly precise, and it has great practical value for actual ore grain size statistics tasks.
For the dense macro-femto coexistence network scenario, a long-term-based handover (LTBH) algorithm is proposed. The handover decision is jointly determined by the angle of handover (AHO) and the time-to-stay (TTS) to reduce the number of unnecessary handovers. First, the proposed AHO parameter is used to decrease the computational complexity in the multiple candidate base stations (CBSs) scenario. Then, two types of TTS parameters are given, for fixed base stations and for mobile base stations, to make handover decisions among multiple CBSs. Simulation results show that the proposed LTBH algorithm not only maintains the required transmission rate of users, but also effectively reduces the number of unnecessary handovers in dense macro-femto networks with coexisting mobile BSs.
The fuzzy c-means (FCM) clustering algorithm is sensitive to noise points and outlier data, and the possibilistic fuzzy c-means (PFCM) clustering algorithm largely overcomes this problem, but PFCM still has shortcomings: it remains sensitive to the initial clustering centers, and its clustering results are poor when noisy datasets are highly unbalanced. An improved kernel possibilistic fuzzy c-means algorithm based on invasive weed optimization (IWO-KPFCM) is proposed in this paper. The algorithm first uses the invasive weed optimization (IWO) algorithm to seek an optimal solution as the initial clustering centers, and introduces a kernel method to map the input data from the sample space into a high-dimensional feature space. Then, the sample variance is introduced into the objective function to measure the compactness of the data. Finally, the improved algorithm is used to cluster the data. Simulation results on University of California, Irvine (UCI) datasets and artificial datasets show that the proposed algorithm has stronger noise resistance, higher clustering accuracy, and faster convergence than the PFCM algorithm.
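For reference, the standard FCM membership update that both PFCM and the proposed variant build on is u_ij = 1 / Σ_k (d_ij / d_ik)^(2/(m−1)); a 1-D sketch (names and the numerical floor are ours):

```python
def fcm_memberships(points, centers, m=2.0):
    # u[i][j]: degree to which point i belongs to cluster j
    U = []
    for p in points:
        d = [max(abs(p - c), 1e-12) for c in centers]  # avoid division by zero
        row = [1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                         for k in range(len(centers)))
               for j in range(len(centers))]
        U.append(row)
    return U
```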
This paper investigates the problem of synchronization for offset quadrature amplitude modulation based orthogonal frequency division multiplexing (OFDM/OQAM) systems using the genetic algorithm. To increase spectrum efficiency, an improved preamble structure without guard symbols is derived first. On this basis, instead of deriving the log-likelihood function of the power spectral density, joint estimation of the symbol timing offset and carrier frequency offset based on the proposed preamble is formulated as a bivariate optimization problem. An improved genetic algorithm is then used to find its global optimum. Simulation results show that the proposed method has advantages in the joint estimation of synchronization parameters.
In recent years, microarray technology has gained attention for the concurrent monitoring of numerous microarray images. Processing, storing, and transmitting such huge volumes of microarray images remains a major challenge, so image compression techniques are used to reduce the number of bits so that the images can be stored and shared easily. Various techniques with applications in different domains have been proposed in the past. The current research paper presents a novel image compression technique, optimized Linde–Buzo–Gray (OLBG) with Lempel–Ziv–Markov Algorithm (LZMA) coding, called OLBG-LZMA, for compressing microarray images without any loss of quality. The LBG model is generally used to design a locally optimal codebook for image compression. Codebook construction is treated as an optimization issue and resolved with the Grey Wolf Optimization (GWO) algorithm. Once the codebook is constructed by the LBG-GWO algorithm, LZMA is employed to compress the index table and raise compression efficiency further. Experiments were performed on a high-resolution Tissue Microarray (TMA) image dataset of 50 prostate tissue samples collected from prostate cancer patients. The compression performance of the proposed coding was compared with recently proposed techniques. The simulation results infer that OLBG-LZMA coding achieved significant compression performance compared to other techniques.
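The LBG codebook design mentioned above can be sketched for scalar samples as follows; this is the classic split-and-refine variant with plain Lloyd iterations standing in for the GWO optimization of the paper, and the names are ours:

```python
def lbg_codebook(samples, size, iters=20, eps=0.01):
    # Start from the global mean, then split and refine until `size` codewords
    codebook = [sum(samples) / len(samples)]
    while len(codebook) < size:
        # Split every codeword into a slightly perturbed pair
        codebook = [c * (1 + eps) for c in codebook] + \
                   [c * (1 - eps) for c in codebook]
        for _ in range(iters):  # Lloyd refinement
            cells = [[] for _ in codebook]
            for s in samples:
                j = min(range(len(codebook)), key=lambda i: abs(s - codebook[i]))
                cells[j].append(s)
            # Move each codeword to its cell centroid (keep it if cell is empty)
            codebook = [sum(c) / len(c) if c else codebook[j]
                        for j, c in enumerate(cells)]
    return sorted(codebook)
```

Encoding then replaces each sample by the index of its nearest codeword, and that index table is what LZMA compresses.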
The uncertain duration of each job on each machine in the flow shop problem was regarded as an independent random variable and described by its mathematical expectation. An immune-based partheno-genetic algorithm was then proposed, making use of concepts and principles drawn from the immune and genetic systems found in nature. In this method, the processing sequence of products is expressed by character encoding, and each antibody represents a feasible schedule. Affinity is used to measure the matching degree between antibody and antigen. Several antibody-producing operators, such as swapping, moving, and inverting, are worked out. The algorithm combines the evolution function of the genetic algorithm with the density mechanism of the organic immune system. Promotion and inhibition of antibodies are realized through the expected propagation ratio of antibodies, and in this way premature convergence is mitigated. Simulation proved that this algorithm is effective.
The study aims to investigate the financial technology (FinTech) factors influencing Chinese banking performance. Financial expectations and global realities may be changed by FinTech's multidimensional scope, which the traditional financial sector lacks. The use of technology to automate financial services is becoming more important for economic organizations and industries because the digital age has brought a period of transition in terms of consumers and personalization. The future of FinTech will be shaped by technologies like the Internet of Things, blockchain, and artificial intelligence. The involvement of these platforms in financial services is a major concern for global business growth, and such benefits are making FinTech more popular with customers. FinTech has driven a fundamental change within the financial services industry, placing the client at the center of everything, and protection has become a primary focus since data are a component of FinTech transactions. The task of consolidating research reports for consensus is still largely manual, as there is no standardized format. Although existing research has proposed certain methods, they have drawbacks in FinTech payment systems (including cryptocurrencies), credit markets (including peer-to-peer lending), and insurance systems. This paper implements blockchain-based financial technology for the banking sector to overcome these transition issues. In this study, we propose an adaptive neuro-fuzzy-based K-nearest neighbors algorithm, optimized with the chaotic improved foraging optimization algorithm. A rolling-window autoregressive lag modeling approach analyzes FinTech growth. The proposed algorithm is compared with existing approaches to demonstrate its efficiency. The findings showed that it achieved 91% accuracy, 90% privacy, 96% robustness, and 25% cyber-risk performance. Compared with traditional approaches, the recommended strategy will be more convenient, safe, and effective in the transition period.
In this paper, we present a phase multiplication algorithm (PMA) to obtain scalable fringe precision in a laser self-mixing interferometer under the weak feedback regime. Merely by applying the double-angle formula to the self-mixing signal multiple times, continuously improved fringe precision is obtained. Theoretical analysis shows that the fringe precision can be improved to λ/2^(n+1). The validity of the proposed method is demonstrated on simulated SMI signals and confirmed by experiments under different amplitudes. A fringe precision of λ/128 at a sampling rate of 500 kS/s was achieved after applying the PMA six times. Finally, an amplitude of 50 nm was shown to be measurable, with an absolute error of 3.07 nm, which is within the theoretical error range. The proposed method for vibration measurement has the advantages of high accuracy and reliability without adding any optical elements to the optical path, so it can play an important role in the nanoscale measurement field.
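At its core the PMA is the double-angle identity applied sample-wise to the normalized interference signal: cos 2φ = 2cos²φ − 1, which doubles the fringe count, and hence halves the fringe spacing, at each application. A minimal sketch with a crude zero-crossing fringe counter (assuming a unit-amplitude cosine-of-phase signal; names are ours):

```python
import math

def double_phase(signal):
    # One PMA step: cos(2x) = 2*cos(x)**2 - 1, applied sample-wise
    return [2.0 * s * s - 1.0 for s in signal]

def count_fringes(signal):
    # Crude fringe counter: number of zero crossings
    return sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
```

Applying double_phase n times raises the nominal precision from λ/2 per fringe toward λ/2^(n+1), matching the abstract's claim.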
The iterative closest point (ICP) algorithm has the advantages of high accuracy and fast speed for point set registration, but it performs poorly when the point set contains a large number of noisy outliers. To solve this problem, we propose a new affine registration algorithm based on correntropy, which works well in the affine registration of point sets with outliers. First, we substitute a maximum correntropy criterion for the traditional least-squares measure to build a new registration model that avoids the influence of outliers. To maximize the objective function, we then propose a robust affine ICP algorithm. At each iteration of this new algorithm, we set up the index mapping of the two point sets according to the known transformation, and then compute the closed-form solution of the new transformation according to the known index mapping. Like the traditional ICP algorithm, our algorithm converges monotonically to a local maximum for any given initial value. Finally, the robustness and high efficiency of the correntropy-based affine ICP algorithm are demonstrated in 2D and 3D point set registration experiments.
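The effect of the maximum correntropy criterion can be seen in a toy 1-D version: each residual is weighted by a Gaussian kernel exp(−e²/2σ²), so outliers contribute almost nothing. This sketch estimates a pure translation rather than the full affine transform of the paper, and the names are ours:

```python
import math

def correntropy_translation(src, dst, sigma=1.0, iters=10):
    # Iteratively re-weighted estimate of t in: dst ≈ src + t
    t = 0.0
    for _ in range(iters):
        # Gaussian kernel weights: outliers get weight ~0
        w = [math.exp(-((d - (s + t)) ** 2) / (2.0 * sigma ** 2))
             for s, d in zip(src, dst)]
        t = sum(wi * (d - s) for wi, s, d in zip(w, src, dst)) / sum(w)
    return t
```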
With the development of the Global Positioning System (GPS), wireless technology, and location-aware services, it is possible to collect a large quantity of trajectory data. In the field of data mining for moving objects, anomaly detection is a hot topic. Building on the development of anomalous trajectory detection of moving objects, this paper introduces the classical trajectory outlier detection (TRAOD) algorithm, and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for the TRAOD algorithm's inability to detect anomalies when the trajectories are local and dense. Results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are also presented, showing the effectiveness of the algorithm.
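The density idea behind DBTOD can be illustrated with a minimal neighbour-count rule; note this is a generic density outlier test on points, not the trajectory-segment version of the paper, and eps/min_pts are illustrative parameters:

```python
import math

def density_outliers(points, eps, min_pts):
    # Flag indices whose eps-neighbourhood contains fewer than min_pts points
    flagged = []
    for i, p in enumerate(points):
        neighbours = sum(1 for j, q in enumerate(points)
                         if i != j and math.dist(p, q) <= eps)
        if neighbours < min_pts:
            flagged.append(i)
    return flagged
```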
Wireless node localization is one of the key technologies for wireless sensor networks. Outdoor localization can use GPS or AGPS (Assisted Global Positioning System) [6], but in buildings such as supermarkets and underground parking garages, the accuracy of GPS and even AGPS is greatly reduced. Since indoor localization requires higher accuracy, using GPS or AGPS indoors is not currently feasible. The RSSI-based trilateral localization algorithm, thanks to its low cost, lack of additional hardware requirements, and ease of understanding, has become the mainstream localization algorithm in wireless sensor networks. With the development of wireless sensor networks and smart devices, the number of Wi-Fi access points in these buildings is increasing; as long as a mobile smart device can detect the positions of three or more known Wi-Fi hotspots, self-localization is relatively easy to realize (Wi-Fi access point locations are usually fixed). The key problem is that the RSSI value is relatively vulnerable to the physical environment, causing large calculation errors in RSSI-based localization algorithms. This paper proposes an improved RSSI-based algorithm; experimental results show that, compared with the original RSSI-based localization algorithms, it improves localization accuracy and reduces deviation.
Funding (brain tumor segmentation study): Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R826), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia; and Northern Border University, Saudi Arabia, through project number (NBU-CRP-2025-2933).
Funding (elliptic curve point multiplication study): The National Natural Science Foundation of China (No. 60473029, 60673072).
文摘To resist the side chaimel attacks of elliptic curve cryptography, a new fast and secure point multiplication algorithm is proposed. The algorithm is based on a particular kind of addition chains involving only additions, providing a natural protection against side channel attacks. Moreover, the new addition formulae that take into account the specific structure of those chains making point multiplication very efficient are proposed. The point multiplication algorithm only needs 1 719 multiplications for the SAC260 of 160-bit integers. For chains of length from 280 to 260, the proposed method outperforms all the previous methods with a gain of 26% to 31% over double-and add, 16% to22% over NAF, 7% to 13% over4-NAF and 1% to 8% over the present best algorithm--double-base chain.
文摘In this paper, a comprehensive energy function is used to formulate the three most popular objective functions:Kapur's, Otsu and Tsalli's functions for performing effective multilevel color image thresholding. These new energy based objective criterions are further combined with the proficient search capability of swarm based algorithms to improve the efficiency and robustness. The proposed multilevel thresholding approach accurately determines the optimal threshold values by using generated energy curve, and acutely distinguishes different objects within the multi-channel complex images. The performance evaluation indices and experiments on different test images illustrate that Kapur's entropy aided with differential evolution and bacterial foraging optimization algorithm generates the most accurate and visually pleasing segmented images.
Abstract: This paper introduces a new variant of the firefly algorithm based on opposition-based learning (OBFA) to enhance the global search ability of the original algorithm. The new algorithm employs the opposition-based learning concept both to generate the initial population and to update agents' positions. The proposed OBFA is applied to minimizing the factor of safety and searching for the critical failure surface in slope stability analysis. Numerical experiments demonstrate the effectiveness and robustness of the new algorithm.
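The opposition-based-learning idea is compact: for a candidate x in [lo, hi], its opposite is lo + hi - x, and keeping the fittest points from the union of a population and its opposites can only improve the best initial fitness. A minimal sketch of the initialization step (the firefly movement rule itself is omitted; function and parameter names are illustrative):

```python
import random

def obl_init(n, lo, hi, fitness, seed=0):
    """Opposition-based initialization: keep the n fittest (lowest-fitness)
    points from the union of a random population and its opposites."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(n)]
    opp = [lo + hi - x for x in pop]          # opposite points
    return sorted(pop + opp, key=fitness)[:n]
```

By construction the selected set is never worse than the plain random population it started from, which is the guarantee OBL-based variants rely on.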
Abstract: To improve picking efficiency and reduce picking time, this paper takes the manual order-picking operation of a distribution center with a double-area warehouse as its object of study, discussing the picking-task allocation and routing problems. A TSP model of the order-picking system is established, and a heuristic algorithm based on the Genetic Algorithm (GA) is created to solve the task-allocation problem and obtain the associated order-picking routes. A simulation experiment implemented on the Visual C++ 6.0 platform demonstrates the rationality of the model and the effectiveness of the algorithm.
Funding: Supported by the National Natural Science Foundation of China (61201370), the Special Funding Project for Independent Innovation Achievement Transformation of Shandong Province (2012CX30202), and the Natural Science Foundation of Shandong Province (ZR2014FM039).
Abstract: The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic, multivariate correlated EDA. This paper proposes a novel hybrid of the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based artificial bee colony (ACABC) algorithm. The hybrid algorithm uses ACEDA to estimate the distribution model and then uses that information to help the artificial bees search the space more efficiently. Six benchmark functions are used to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster and with greater precision than the ABC algorithm, ACEDA, and the global-best-guided ABC (GABC) algorithm in most of the experiments.
Funding: This work was supported by the National Natural Science Foundation of China (Grant 61801019).
Abstract: Ore image segmentation is a key step in ore grain-size analysis based on image processing. Traditional segmentation methods do not deal well with ore textures and shadows in ore images and often suffer from under-segmentation and over-segmentation. To solve this problem, this article proposes an ore image segmentation method based on U-Net. We adjust the structure of U-Net to speed up processing, and we modify the loss function to enhance the generalization of the model. After collecting the ore images, we design an annotation standard and train the network with the annotated images. Finally, the marker-based watershed algorithm is used to segment the adhesion areas. Experimental results show that the proposed method is fast, robust, and highly precise, and has great practical value for ore grain-size statistics.
Funding: The National Natural Science Foundation of China (No. 61471164), the Fundamental Research Funds for the Central Universities, and the Scientific Innovation Research of College Graduates in Jiangsu Province (No. KYLX-0133).
Abstract: For the dense macro-femto coexistence network scenario, a long-term-based handover (LTBH) algorithm is proposed. The handover decision is jointly determined by the angle of handover (AHO) and the time-to-stay (TTS) to reduce the number of unnecessary handovers. First, the proposed AHO parameter is used to decrease the computational complexity in scenarios with multiple candidate base stations (CBSs). Then, two types of TTS parameters are given, for fixed and mobile base stations respectively, to make handover decisions among multiple CBSs. Simulation results show that the proposed LTBH algorithm not only maintains the required transmission rate of users but also effectively reduces the number of unnecessary handovers in dense macro-femto networks with coexisting mobile BSs.
Abstract: The fuzzy c-means (FCM) clustering algorithm is sensitive to noise points and outlier data. The possibilistic fuzzy c-means (PFCM) clustering algorithm overcomes this problem well, but it still has shortcomings: it remains sensitive to the initial cluster centers, and its clustering results deteriorate when noisy test datasets have very unequal cluster sizes. This paper proposes an improved kernel possibilistic fuzzy c-means algorithm based on invasive weed optimization (IWO-KPFCM). The algorithm first uses the invasive weed optimization (IWO) algorithm to find an optimal solution to serve as the initial cluster centers, and introduces a kernel method to map the input data from the sample space into a high-dimensional feature space. The sample variance is then introduced into the objective function to measure the compactness of the data. Finally, the improved algorithm is used to cluster the data. Simulation results on University of California, Irvine (UCI) datasets and artificial datasets show that the proposed algorithm has stronger noise resistance, higher clustering accuracy, and faster convergence than the PFCM algorithm.
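For orientation, the classical fuzzy c-means iteration that FCM, PFCM, and the proposed IWO-KPFCM all build on alternates a membership update and a fuzzily weighted center update. A minimal two-cluster, 1-D sketch (the deterministic min/max initialization is an assumption for illustration; the kernel mapping, possibilistic terms, and IWO seeding from the paper are omitted):

```python
def fcm_1d(data, m=2.0, iters=30):
    """Two-cluster fuzzy c-means on 1-D data (min/max initialization)."""
    centers = [min(data), max(data)]
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centers]
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(2)) for i in range(2)])
        # center update: fuzzily weighted mean of the samples
        centers = [sum(U[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(U[k][i] ** m for k in range(len(data)))
                   for i in range(2)]
    return sorted(centers)
```

The sensitivity to initialization that IWO addresses is visible here: the quality of the final centers depends entirely on the starting pair.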
Funding: Supported by the National Natural Science Foundation of China (61671468).
Abstract: This paper investigates synchronization for offset quadrature amplitude modulation based orthogonal frequency division multiplexing (OFDM/OQAM) systems using the genetic algorithm. To increase spectrum efficiency, an improved preamble structure without guard symbols is first derived. On this basis, instead of deriving the log-likelihood function of the power spectral density, joint estimation of the symbol timing offset and the carrier frequency offset based on the proposed preamble is formulated as a bivariate optimization problem. An improved genetic algorithm is then used to find its global optimum. Simulation results show that the proposed method has advantages in joint synchronization estimation.
Abstract: In recent years, microarray technology has gained attention for the concurrent monitoring of numerous microarray images. Processing, storing, and transmitting such huge volumes of microarray images remains a major challenge, so image compression techniques are used to reduce the number of bits required for storage and sharing. Various techniques with applications in different domains have been proposed in the past. This paper presents a novel image compression technique, optimized Linde-Buzo-Gray (OLBG) with Lempel-Ziv-Markov algorithm (LZMA) coding, called OLBG-LZMA, for compressing microarray images without any loss of quality. The LBG model is generally used to design a locally optimal codebook for image compression. Codebook construction is treated as an optimization issue and resolved with the Grey Wolf Optimization (GWO) algorithm. Once the codebook is constructed by the LBG-GWO algorithm, LZMA is employed to compress the index table and further raise compression efficiency. Experiments were performed on a high-resolution Tissue Microarray (TMA) image dataset of 50 prostate tissue samples collected from prostate cancer patients. The compression performance of the proposed coding was compared with recently proposed techniques. The simulation results indicate that OLBG-LZMA achieves significant compression performance compared with the other techniques.
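The LBG codebook design mentioned above is essentially the generalized Lloyd algorithm with codeword splitting. A minimal scalar sketch (the GWO optimization and LZMA index-coding stages are omitted; parameters are illustrative):

```python
def lbg_codebook(data, size=2, eps=0.01, iters=20):
    """Generalized Lloyd (LBG) with codeword splitting, on scalar samples."""
    code = [sum(data) / len(data)]            # start from the global centroid
    while len(code) < size:
        # split every codeword by a small perturbation, then Lloyd-iterate
        code = [c * (1 + eps) for c in code] + [c * (1 - eps) for c in code]
        for _ in range(iters):
            cells = {i: [] for i in range(len(code))}
            for x in data:
                nearest = min(range(len(code)), key=lambda j: abs(x - code[j]))
                cells[nearest].append(x)
            # move each codeword to the mean of its cell (keep it if empty)
            code = [sum(v) / len(v) if v else code[i] for i, v in cells.items()]
    return sorted(code)
```

In the full pipeline each image block is a vector rather than a scalar, the codebook is what GWO optimizes, and the resulting index table is what LZMA then compresses.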
Abstract: The uncertain duration of each job on each machine in the flow-shop problem was regarded as an independent random variable and described by its mathematical expectation. An immune-based partheno-genetic algorithm was then proposed, drawing on concepts and principles from the natural immune and genetic systems. In this method, the processing sequence of products is expressed by character encoding, and each antibody represents a feasible schedule. Affinity is used to measure the matching degree between antibody and antigen. Several antibody-producing operators, such as swapping, moving, and inverting, were then worked out. The algorithm combines the evolution function of the genetic algorithm with the density mechanism of the biological immune system. Promotion and inhibition of antibodies are realized through the expected propagation ratio of antibodies, which mitigates premature convergence. Simulation shows that the algorithm is effective.
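The swap, move, and invert operators on a permutation-encoded antibody can be sketched as follows (a minimal illustration; selection, affinity, and density control are omitted, and names are ours):

```python
import random

def swap_op(seq, rng):
    """Exchange two randomly chosen positions."""
    s = seq[:]
    i, j = rng.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def move_op(seq, rng):
    """Remove one element and reinsert it at another position."""
    s = seq[:]
    i, j = rng.sample(range(len(s)), 2)
    s.insert(j, s.pop(i))
    return s

def invert_op(seq, rng):
    """Reverse a randomly chosen sub-sequence."""
    s = seq[:]
    i, j = sorted(rng.sample(range(len(s)), 2))
    s[i:j + 1] = s[i:j + 1][::-1]
    return s
```

Each operator returns a new permutation of the same jobs, so every offspring antibody is again a feasible schedule, which is the point of partheno-genetic (single-parent) operators.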
Funding: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
Abstract: This study investigates the financial technology (FinTech) factors influencing Chinese banking performance. Financial expectations and global realities may be changed by FinTech's multidimensional scope, which the traditional financial sector lacks. The use of technology to automate financial services is becoming more important for economic organizations and industries because the digital age has brought a period of transition in terms of consumers and personalization. The future of FinTech will be shaped by technologies such as the Internet of Things, blockchain, and artificial intelligence. The involvement of these platforms in financial services is a major concern for global business growth, and such benefits are making FinTech more popular with customers. FinTech has driven a fundamental change within the financial services industry, placing the client at the center of everything. Protection has become a primary focus, since data are a component of FinTech transactions. The task of consolidating research reports for consensus is very manual, as there is no standardized format. Although existing research has proposed certain methods, they have drawbacks in FinTech payment systems (including cryptocurrencies), credit markets (including peer-to-peer lending), and insurance systems. This paper implements blockchain-based financial technology for the banking sector to overcome these transition issues. We propose an adaptive neuro-fuzzy based K-nearest neighbors algorithm, optimized with the chaotic improved foraging optimization algorithm. A rolling-window autoregressive lag modeling approach is used to analyze FinTech growth. The proposed algorithm is compared with existing approaches to demonstrate its efficiency. The findings show that it achieved 91% accuracy, 90% privacy, 96% robustness, and 25% cyber-risk performance. Compared with traditional approaches, the recommended strategy will be more convenient, safe, and effective in the transition period.
Funding: Supported by the Natural Science Foundation of Fujian Province (No. 2020J01705) and the School Foundation of Jimei University (No. C150345).
Abstract: In this paper, we present a phase multiplication algorithm (PMA) to obtain scalable fringe precision in a laser self-mixing interferometer under a weak feedback regime. Merely by applying the double-angle formula to the self-mixing signal multiple times, continuously improved fringe precision is obtained. Theoretical analysis shows that the fringe precision can be improved to λ/2^(n+1). The validity of the proposed method is demonstrated on simulated SMI signals and confirmed by experiments at different amplitudes. A fringe precision of λ/128 at a sampling rate of 500 kS/s was achieved after applying the PMA six times. Finally, an amplitude of 50 nm was shown to be measurable, with an absolute error of 3.07 nm, which is within the theoretical error range. The proposed vibration measurement method offers high accuracy and reliability without adding any optical elements to the optical path, so it should play an important role in nanoscale measurement.
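The core trick is elementary: applying cos(2φ) = 2cos²(φ) − 1 sample-wise to a cosine-like fringe signal doubles the number of fringes for the same displacement, so n applications yield the λ/2^(n+1) precision quoted above. A minimal sketch on a synthetic signal (zero-crossing counting stands in for fringe counting; the real SMI waveform is only approximately sinusoidal):

```python
import math

def double_angle(signal):
    # cos(2*phi) = 2*cos(phi)**2 - 1, applied sample-wise
    return [2 * s * s - 1 for s in signal]

def count_fringes(signal):
    # zero crossings as a simple proxy for fringe count
    return sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)

# synthetic fringe signal: 4 cycles of phase
phase = [2 * math.pi * 4 * k / 1000 for k in range(1000)]
signal = [math.cos(p) for p in phase]
```

Each application of `double_angle` doubles the fringe density, which is exactly the scalable-precision behavior the PMA exploits.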
Funding: Supported in part by the National Natural Science Foundation of China (61627811, 61573274, 61673126, U1701261).
Abstract: The iterative closest point (ICP) algorithm offers high accuracy and fast speed for point set registration, but it performs poorly when the point set contains a large number of noisy outliers. To solve this problem, we propose a new affine registration algorithm based on correntropy, which works well in the affine registration of point sets with outliers. First, we replace the traditional least-squares measure with a maximum correntropy criterion to build a new registration model that avoids the influence of outliers. To maximize the objective function, we then propose a robust affine ICP algorithm. At each iteration, the algorithm establishes the index mapping between the two point sets according to the known transformation and then computes the closed-form solution for the new transformation from that mapping. Like the traditional ICP algorithm, our algorithm converges monotonically to a local maximum for any given initial value. Finally, the robustness and high efficiency of the correntropy-based affine ICP algorithm are demonstrated in 2D and 3D point set registration experiments.
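The effect of swapping least squares for a maximum correntropy criterion can be seen in a stripped-down 1-D translation estimate: the fixed-point update is just a Gaussian-reweighted mean of the residuals, so gross outliers receive exponentially small weight. This shows only the robust-weighting idea, not the paper's full affine ICP; names and parameters are illustrative:

```python
import math

def correntropy_shift(src, dst, sigma=1.0, iters=10):
    """Estimate translation t maximizing sum_i exp(-(dst_i - src_i - t)^2 / (2*sigma^2))."""
    t = 0.0
    for _ in range(iters):
        # Gaussian (correntropy) weights: large residuals -> weight ~ 0
        w = [math.exp(-((d - s - t) ** 2) / (2 * sigma ** 2))
             for s, d in zip(src, dst)]
        # fixed-point update: weighted mean of the residuals
        t = sum(wi * (d - s) for wi, s, d in zip(w, src, dst)) / sum(w)
    return t
```

A plain least-squares estimate of the same shift is the unweighted mean of the residuals, which a single gross outlier can drag arbitrarily far.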
Funding: Supported by the Aeronautical Science Foundation of China (20111052010), the Jiangsu Graduates Innovation Project (CXZZ120163), the "333" Project of Jiangsu Province, and the Qing Lan Project of Jiangsu Province.
Abstract: With the development of the Global Positioning System (GPS), wireless technology, and location-aware services, it has become possible to collect large quantities of trajectory data, and anomaly detection is a hot topic in data mining for moving objects. Building on developments in anomalous trajectory detection for moving objects, this paper reviews the classical trajectory outlier detection (TRAOD) algorithm and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for TRAOD's inability to detect anomalies when trajectories are local and dense. Results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are presented, demonstrating the effectiveness of the algorithm.
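The density idea behind such detectors can be illustrated at the level of individual points: a point is flagged when too few neighbors fall within a given radius. This is a deliberate simplification (TRAOD/DBTOD work on trajectory segments, and the radius and threshold values here are arbitrary assumptions):

```python
import math

def density_outliers(points, radius=1.0, min_neighbors=2):
    """Return indices of points whose neighborhood within `radius` is too sparse."""
    outliers = []
    for i, p in enumerate(points):
        n = sum(1 for j, q in enumerate(points)
                if i != j and math.dist(p, q) <= radius)
        if n < min_neighbors:
            outliers.append(i)
    return outliers
```

A distance-based detector with a fixed radius struggles when normal data is itself locally dense, which is the regime the density-based DBTOD variant targets.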
Abstract: Wireless node localization is one of the key technologies for wireless sensor networks. Outdoor localization can use GPS or the Assisted Global Positioning System (AGPS) [6], but in buildings such as supermarkets and underground parking garages the accuracy of GPS and even AGPS is greatly reduced. Since indoor localization requires higher accuracy, using GPS or AGPS indoors is currently not feasible. The RSSI-based trilateral localization algorithm, owing to its low cost, lack of additional hardware requirements, and ease of understanding, has become the mainstream localization algorithm in wireless sensor networks. With the development of wireless sensor networks and smart devices, the number of WiFi access points in these buildings is increasing; as long as a mobile smart device can detect the positions of three or more known WiFi hotspots, it is relatively easy for it to localize itself (WiFi access point locations are usually fixed). The key problem is that RSSI values are vulnerable to the physical environment, causing large calculation errors in RSSI-based localization algorithms. This paper proposes an improved RSSI-based algorithm; experimental results show that, compared with the original RSSI-based localization algorithms, it improves localization accuracy and reduces deviation.
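The RSSI trilateration pipeline described above has two steps: invert a log-distance path-loss model to turn RSSI readings into range estimates, then solve the linearized circle equations for position. A minimal sketch (the path-loss constants `p0` and `n` are illustrative assumptions, and the paper's proposed RSSI improvement is not included):

```python
import math

def rssi_to_distance(rssi, p0=-40.0, n=2.0, d0=1.0):
    """Invert the log-distance path-loss model rssi = p0 - 10*n*log10(d/d0)."""
    return d0 * 10 ** ((p0 - rssi) / (10 * n))

def trilaterate(anchors, dists):
    """Solve for (x, y) from three anchors and three range estimates
    by subtracting the first circle equation from the other two."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21        # nonzero when anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# synthetic check: generate ideal RSSI from three fixed access points
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
rssi = [-40.0 - 20.0 * math.log10(math.dist(true_pos, a)) for a in anchors]
est = trilaterate(anchors, [rssi_to_distance(r) for r in rssi])
```

With ideal RSSI the position is recovered exactly; in practice the environment-induced RSSI error propagates exponentially through the model inversion, which is why RSSI-correction schemes like the one proposed matter.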