Journal Articles
9 articles found
1. An Efficient Machine Learning Based Precoding Algorithm for Millimeter-Wave Massive MIMO
Authors: Waleed Shahjehan, Abid Ullah, Syed Waqar Shah, Ayman A. Aly, Bassem F. Felemban, Wonjong Noh. Computers, Materials & Continua (SCIE, EI), 2022, Issue 6, pp. 5399-5411 (13 pages)
Millimeter wave communication works in the 30–300 GHz frequency range and can obtain a very high bandwidth, which greatly improves the transmission rate of the communication system and makes it one of the key technologies of fifth-generation (5G) networks. The smaller wavelength of the millimeter wave makes it possible to assemble a large number of antennas in a small aperture. The resulting array gain can compensate for the path loss of the millimeter wave. Utilizing this feature, the millimeter wave massive multiple-input multiple-output (MIMO) system uses a large antenna array at the base station. It enables the transmission of multiple data streams, giving the system a higher data transmission rate. In the millimeter wave massive MIMO system, precoding technology uses channel state information to adjust the transmission strategy at the transmitting end, while the receiving end performs equalization, so that users can better obtain the antenna multiplexing gain and improve the system capacity. This paper proposes an efficient algorithm based on machine learning (ML) for effective system performance in mmWave massive MIMO systems. The main idea is to optimize the adaptive connection structure to maximize the received signal power of each user and to associate the RF chains with the base station antennas. Simulation results show that the proposed algorithm effectively improves system performance in terms of spectral efficiency and complexity compared with existing algorithms.
Keywords: MIMO; phased array; precoding scheme; machine learning; optimization
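The abstract's central idea is to adapt the connections between base-station antennas and RF chains so that each user's received signal power is maximized. A deliberately simplified greedy sketch of that connection step follows; the function name, the gain matrix, and the one-antenna-per-chain rule are illustrative assumptions, not the paper's actual algorithm:

```python
def greedy_antenna_assignment(gains, n_chains):
    """Connect each antenna to the RF chain (user stream) where its channel
    gain is largest -- a toy stand-in for the adaptive connection structure.

    gains[a][c] is assumed to be the channel gain magnitude |h| between
    antenna a and the user served by RF chain c.
    """
    assignment = {c: [] for c in range(n_chains)}
    for a, row in enumerate(gains):
        best_c = max(range(n_chains), key=lambda c: row[c])
        assignment[best_c].append(a)
    return assignment

# Three antennas, two RF chains: antennas 0 and 2 favor chain 0, antenna 1 chain 1.
example = greedy_antenna_assignment([[0.9, 0.1], [0.2, 0.8], [0.5, 0.4]], 2)
```

A real design would also enforce per-chain antenna budgets and evaluate the resulting spectral efficiency, which the greedy pass above ignores.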
2. Hybrid Gene Selection Methods for High-Dimensional Lung Cancer Data Using Improved Arithmetic Optimization Algorithm
Author: Mutasem K. Alsmadi. Computers, Materials & Continua (SCIE, EI), 2024, Issue 6, pp. 5175-5200 (26 pages)
Lung cancer is among the most frequent cancers in the world, with over one million deaths per year. Classification is required for lung cancer diagnosis and therapy to be effective, accurate, and reliable. Gene expression microarrays have made it possible to find genetic biomarkers for cancer diagnosis and prediction in a high-throughput manner. Machine Learning (ML) has been widely used to diagnose and classify lung cancer, where the performance of ML methods is evaluated to identify the most appropriate technique. Identifying and selecting gene expression patterns can help in lung cancer diagnosis and classification. Normally, microarrays include a large number of genes and may cause confusion or false prediction. Therefore, the Arithmetic Optimization Algorithm (AOA) is used to identify the optimal gene subset and reduce the number of selected genes, which allows the classifiers to yield the best performance for lung cancer classification. In addition, we propose a modified version of AOA that works effectively on high-dimensional datasets. In the modified AOA, the features are ranked by their weights, which are then used to initialize the AOA population. The exploitation process of AOA is then enhanced by a local search algorithm based on two neighborhood strategies. Finally, the efficiency of the proposed methods was evaluated on gene expression datasets related to lung cancer using stratified 4-fold cross-validation. The method's efficacy in selecting the optimal gene subset is underscored by its ability to keep feature proportions between 10% and 25%. Moreover, the approach significantly enhances lung cancer prediction accuracy. For instance, Lung_Harvard1 achieved an accuracy of 97.5%, Lung_Harvard2 and Lung_Michigan both achieved 100%, Lung_Adenocarcinoma obtained an accuracy of 88.2%, and Lung_Ontario achieved an accuracy of 87.5%. In conclusion, the results indicate the promise of the proposed modified AOA approach in classifying microarray cancer data.
Keywords: lung cancer; gene selection; improved arithmetic optimization algorithm; machine learning
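Two ideas from the abstract lend themselves to a short sketch: seeding the population from weight-ranked features, and a neighborhood-based local search for exploitation. The version below is a hedged illustration, not the paper's method: the bias probabilities, subset fraction, and the two flip/swap neighborhoods are assumptions, and the fitness function is a placeholder the caller supplies:

```python
import random

def weighted_init(weights, pop_size, subset_frac=0.2, seed=0):
    """Initialize a binary feature-selection population biased toward
    high-weight features (the abstract's ranked initialization idea)."""
    rng = random.Random(seed)
    n = len(weights)
    order = sorted(range(n), key=lambda i: -weights[i])   # best-ranked first
    k = max(1, int(subset_frac * n))
    pop = []
    for _ in range(pop_size):
        sol = [0] * n
        for rank, idx in enumerate(order):
            p = 0.8 if rank < k else 0.05   # illustrative bias probabilities
            if rng.random() < p:
                sol[idx] = 1
        if sum(sol) == 0:                   # never emit an empty subset
            sol[order[0]] = 1
        pop.append(sol)
    return pop

def local_search(sol, fitness):
    """One pass over two simple neighborhoods: single-bit flips, and swaps
    between a selected and an unselected feature."""
    best, best_fit = sol[:], fitness(sol)
    n = len(sol)
    for i in range(n):                      # neighborhood 1: flip one bit
        cand = sol[:]
        cand[i] = 1 - cand[i]
        f = fitness(cand)
        if f > best_fit:
            best, best_fit = cand, f
    ones = [i for i in range(n) if sol[i] == 1]
    zeros = [i for i in range(n) if sol[i] == 0]
    for i in ones:                          # neighborhood 2: swap in/out
        for j in zeros:
            cand = sol[:]
            cand[i], cand[j] = 0, 1
            f = fitness(cand)
            if f > best_fit:
                best, best_fit = cand, f
    return best, best_fit
```

In the paper the fitness would combine classifier accuracy with a penalty on subset size; here any callable scoring a 0/1 vector works.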
3. A novel state of health estimation model for lithium-ion batteries incorporating signal processing and optimized machine learning methods
Authors: Xing Zhang, Juqiang Feng, Feng Cai, Kaifeng Huang, Shunli Wang. Frontiers in Energy, 2025, Issue 3, pp. 348-364 (17 pages)
An accurate assessment of the state of health (SOH) is the cornerstone for guaranteeing the long-term stable operation of electrical equipment. However, the noise that the data carry during cyclic aging poses a severe challenge to the accuracy of SOH estimation and the generalization ability of the model. To this end, this paper proposes a novel SOH estimation model for lithium-ion batteries that incorporates advanced signal-processing techniques and optimized machine-learning strategies. The model employs a whale optimization algorithm (WOA) to seek the optimal parameter combination (K, α) for the variational modal decomposition (VMD) method, ensuring that the signals are accurately decomposed into different modes representing the SOH of batteries. Then, the excellent local feature extraction capability of the convolutional neural network (CNN) is utilized to obtain the critical features of each SOH mode. Finally, the support vector machine (SVM) is selected as the final SOH estimation regressor based on its generalization ability and efficient performance on small sample datasets. The proposed method was validated on two publicly available lithium-ion battery aging datasets covering different temperatures, discharge rates, and depths of discharge. The results show that the WOA-VMD-based data processing technique effectively solves the problem of cyclic aging data noise interfering with SOH estimation, and the CNN-SVM optimized machine learning method significantly improves the accuracy of SOH estimation. Compared with traditional techniques, the fused algorithm achieves significant results in suppressing data noise, improving the accuracy of SOH estimation, and enhancing generalization ability.
Keywords: state of health (SOH) estimation; optimized machine learning; signal processing; whale optimization algorithm-variational modal decomposition (WOA-VMD); convolutional neural network-support vector machine (CNN-SVM)
4. A novel hybrid estimation of distribution algorithm for solving hybrid flowshop scheduling problem with unrelated parallel machine (Cited by 10)
Authors: Sun Zewen, Gu Xingsheng. Journal of Central South University (SCIE, EI, CAS, CSCD), 2017, Issue 8, pp. 1779-1788 (10 pages)
The hybrid flow shop scheduling problem with unrelated parallel machines is a typical NP-hard combinatorial optimization problem, and it exists widely in the chemical, manufacturing and pharmaceutical industries. In this work, a novel mathematical model for the hybrid flow shop scheduling problem with unrelated parallel machines (HFSPUPM) was proposed. Additionally, an effective hybrid estimation of distribution algorithm (EDA) was proposed to solve the HFSPUPM, taking advantage of the features in the mathematical model. In the optimization algorithm, a new individual representation method was adopted. The EDA structure was used for global search while a teaching-learning-based optimization (TLBO) strategy was used for local search. Based on the structure of the HFSPUPM, this work presents a series of discrete operations. Simulation results show the effectiveness of the proposed hybrid algorithm compared with other algorithms.
Keywords: hybrid estimation of distribution algorithm; teaching-learning-based optimization strategy; hybrid flow shop; unrelated parallel machine scheduling
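The EDA half of the hybrid can be sketched with a generic position-job probability model for permutations: sample candidate job orders from the model, then shift the model toward the elite samples. This is a hedged illustration of the EDA mechanism only; the paper's individual representation, the TLBO local search, and its discrete operators are not reproduced, and the learning rate is an assumption:

```python
import random

def sample_permutation(prob, rng):
    """Sample a job permutation position by position; prob[pos][job] is the
    model's probability of placing `job` at `pos`."""
    n = len(prob)
    remaining = set(range(n))
    perm = []
    for pos in range(n):
        jobs = sorted(remaining)
        w = [prob[pos][j] for j in jobs]
        r = rng.random() * sum(w)
        acc, pick = 0.0, jobs[-1]          # fallback guards float rounding
        for j, wj in zip(jobs, w):
            acc += wj
            if r <= acc:
                pick = j
                break
        perm.append(pick)
        remaining.remove(pick)
    return perm

def update_model(prob, elites, lr=0.3):
    """Blend the model toward the empirical position-job frequencies of the
    elite permutations (the EDA learning step)."""
    n = len(prob)
    for pos in range(n):
        freq = [0.0] * n
        for perm in elites:
            freq[perm[pos]] += 1.0 / len(elites)
        for job in range(n):
            prob[pos][job] = (1 - lr) * prob[pos][job] + lr * freq[job]
    return prob
```

A full solver would alternate sampling, makespan evaluation on the unrelated machines, model update, and a local-search pass on the best individuals.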
5. Improved nonconvex optimization model for low-rank matrix recovery (Cited by 1)
Authors: Li Lingzhi, Zou Beiji, Zhu Chengzhang. Journal of Central South University (SCIE, EI, CAS, CSCD), 2015, Issue 3, pp. 984-991 (8 pages)
Low-rank matrix recovery is an important problem extensively studied in the machine learning, data mining and computer vision communities. A novel method is proposed for low-rank matrix recovery, targeting higher recovery accuracy and a stronger theoretical guarantee. Specifically, the proposed method is based on a nonconvex optimization model, solving for the low-rank matrix to be recovered from the noisy observation. To solve the model, an effective algorithm is derived by minimizing over the variables alternately. It is proved theoretically that this algorithm has a stronger guarantee than existing work. In natural image denoising experiments, the proposed method achieves lower recovery error than the two compared methods. The proposed low-rank matrix recovery method is also applied to two real-world problems, i.e., removing noise from verification codes and removing watermarks from images, in which the images recovered by the proposed method are less noisy than those of the two compared methods.
Keywords: machine learning; computer vision; matrix recovery; nonconvex optimization
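"Minimizing over the variables alternately" is the key algorithmic idea here. The rank-1 special case makes it concrete: fitting M ≈ u vᵀ by alternating closed-form least-squares updates of u and v. The paper's actual nonconvex model and its guarantees are not specified in this abstract; the sketch below only illustrates the alternating-minimization pattern:

```python
def rank1_recover(M, iters=50):
    """Alternately minimize ||M - u v^T||_F^2 over u and v.
    Fixing v, the optimal u is M v / (v.v); fixing u, symmetrically for v."""
    m, n = len(M), len(M[0])
    v = [1.0] * n
    u = [0.0] * m
    for _ in range(iters):
        vv = sum(x * x for x in v)
        u = [sum(M[i][j] * v[j] for j in range(n)) / vv for i in range(m)]
        uu = sum(x * x for x in u)
        v = [sum(M[i][j] * u[i] for i in range(m)) / uu for j in range(n)]
    return u, v
```

On a noiseless rank-1 matrix this converges in a couple of sweeps; the full method would handle general rank and a noise-aware objective.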
6. Optimized Phishing Detection with Recurrent Neural Network and Whale Optimizer Algorithm
Authors: Brij Bhooshan Gupta, Akshat Gaurav, Razaz Waheeb Attar, Varsha Arya, Ahmed Alhomoud, Kwok Tai Chui. Computers, Materials & Continua (SCIE, EI), 2024, Issue 9, pp. 4895-4916 (22 pages)
Phishing attacks present a persistent and evolving threat in the cybersecurity landscape, necessitating the development of more sophisticated detection methods. Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishing Uniform Resource Locators (URLs). Addressing these challenges, we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network (RNN) with the hyperparameter optimization prowess of the Whale Optimization Algorithm (WOA). Our model capitalizes on an extensive Kaggle dataset featuring over 11,000 URLs, each delineated by 30 attributes. The WOA's hyperparameter optimization enhances the RNN's performance, evidenced by a meticulous validation process. The results, encapsulated in precision, recall, and F1-score metrics, surpass baseline models, achieving an overall accuracy of 92%. This study not only demonstrates the RNN's proficiency in learning complex patterns but also underscores the WOA's effectiveness in refining machine learning models for the critical task of phishing detection.
Keywords: phishing detection; Recurrent Neural Network (RNN); Whale Optimization Algorithm (WOA); cybersecurity; machine learning; optimization
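The WOA used here for hyperparameter tuning is a standard metaheuristic built from three moves: encircling the best solution, exploring around a random whale, and a spiral "bubble-net" update. A minimal sketch follows, applied to a toy quadratic rather than RNN hyperparameters; the population size, iteration count, and bounds are illustrative choices, not the paper's settings:

```python
import math
import random

def woa_minimize(f, dim, bounds, n_whales=10, iters=100, seed=1):
    """Minimal Whale Optimization Algorithm for continuous minimization."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_whales)]
    best = min(X, key=f)[:]
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters            # decreases linearly 2 -> 0
        for i in range(n_whales):
            x = X[i]
            r1, r2 = rng.random(), rng.random()
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if abs(A) < 1:               # exploit: encircle current best
                    new = [best[d] - A * abs(C * best[d] - x[d]) for d in range(dim)]
                else:                        # explore: move around a random whale
                    xr = X[rng.randrange(n_whales)]
                    new = [xr[d] - A * abs(C * xr[d] - x[d]) for d in range(dim)]
            else:                            # spiral (bubble-net) move toward best
                l = rng.uniform(-1, 1)
                new = [abs(best[d] - x[d]) * math.exp(l) * math.cos(2 * math.pi * l)
                       + best[d] for d in range(dim)]
            X[i] = [min(hi, max(lo, v)) for v in new]
            if f(X[i]) < f(best):
                best = X[i][:]
    return best
```

For hyperparameter tuning, `f` would wrap a training-and-validation run of the RNN and the search space would be the hyperparameter bounds.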
7. Learning to optimize: A tutorial for continuous and mixed-integer optimization (Cited by 1)
Authors: Xiaohan Chen, Jialin Liu, Wotao Yin. Science China Mathematics (SCIE, CSCD), 2024, Issue 6, pp. 1191-1262 (72 pages)
Learning to optimize (L2O) stands at the intersection of traditional optimization and machine learning, utilizing the capabilities of machine learning to enhance conventional optimization techniques. As real-world optimization problems frequently share common structures, L2O provides a tool to exploit these structures for better or faster solutions. This tutorial dives deep into L2O techniques, introducing how to accelerate optimization algorithms, promptly estimate the solutions, or even reshape the optimization problem itself, making it more adaptive to real-world applications. By considering the prerequisites for successful applications of L2O and the structure of the optimization problems at hand, this tutorial provides a comprehensive guide for practitioners and researchers alike.
Keywords: AI for mathematics (AI4Math); learning to optimize; algorithm unrolling; plug-and-play methods; differentiable programming; machine learning for combinatorial optimization (ML4CO)
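One of the keywords, algorithm unrolling, means running a classical iterative method for a fixed number K of steps and treating per-step quantities (for example, step sizes) as trainable parameters. The skeleton is easy to show; in this sketch the step sizes are fixed constants purely for illustration, whereas in real L2O they would be learned offline by differentiating through the K unrolled steps:

```python
def unrolled_gd(grad, x0, step_sizes):
    """Run len(step_sizes) unrolled gradient steps, one step size per
    iteration, and keep the trajectory (the structure L2O would train)."""
    x = list(x0)
    trajectory = [x[:]]
    for alpha in step_sizes:
        g = grad(x)
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
        trajectory.append(x[:])
    return x, trajectory

# Minimize f(x) = x^2 (gradient 2x) from x0 = 4 with ten unrolled steps.
x_final, traj = unrolled_gd(lambda x: [2 * xi for xi in x], [4.0], [0.4] * 10)
```

Because every step is a differentiable function of `step_sizes`, a training loop could backpropagate a loss on `x_final` through all K steps, which is exactly the unrolling recipe the tutorial covers.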
8. Optimal Kernel-based Extreme Learning and Multi-objective Function-aided Task Scheduling for Solving Load Balancing Problems in Cloud Environment
Authors: Ravi Gugulothu, Vijaya Saradhi Thommandru, Suneetha Bulla. Journal of Systems Science and Systems Engineering, 2025, Issue 4, pp. 385-409 (25 pages)
Workload balancing in cloud computing is not yet resolved, particularly for Infrastructure as a Service (IaaS) in the cloud network. Servers or hosts accessing the cloud should be neither underloaded nor overloaded, as either condition may lead to system crashes. To resolve these problems, an efficient task scheduling algorithm is required for distributing tasks over all feasible resources, which is termed load balancing. The load balancing approach ensures that all Virtual Machines (VMs) are utilized appropriately. It is therefore highly desirable to develop a load-balancing model for the cloud environment based on machine learning and optimization strategies. Here, computing and networking data are analyzed to observe traffic and performance patterns. The acquired data feed a machine learning decision that selects the right server by predicting performance, using an Optimal Kernel-based Extreme Learning Machine (OK-ELM) whose parameters are tuned by the developed hybrid Population Size-based Mud Ring Tunicate Swarm Algorithm (PS-MRTSA). Effective scheduling is then performed with the developed model to resolve the load balancing issues. The approach addresses multi-objective constraints such as response time, resource cost, and energy consumption. Experimental analyses show that the recommended load balancing model achieves a higher performance rate than traditional approaches.
Keywords: cloud environment; load balancing problem; optimal kernel-based extreme learning machine; population size-based mud ring tunicate swarm algorithm; multi-objective function
9. Online payment fraud: from anomaly detection to risk management
Authors: Paolo Vanini, Sebastiano Rossi, Ermin Zvizdic, Thomas Domenig. Financial Innovation, 2023, Issue 1, pp. 1788-1812 (25 pages)
Online banking fraud occurs whenever a criminal can seize accounts and transfer funds from an individual's online bank account. Successfully preventing this requires the detection of as many fraudsters as possible without producing too many false alarms. This is a challenge for machine learning owing to the extremely imbalanced data and the complexity of fraud. In addition, classical machine learning methods must be extended to minimize expected financial losses. Finally, fraud can only be combated systematically and economically if the risks and costs in payment channels are known. We define three models that overcome these challenges: machine learning-based fraud detection, economic optimization of machine learning results, and a risk model to predict the risk of fraud while considering countermeasures. The models were tested on real data. Our machine learning model alone reduces the expected and unexpected losses in the three aggregated payment channels by 15% compared with a benchmark consisting of static if-then rules. Optimizing the machine-learning model further reduces the expected losses by 52%. These results hold with a low false positive rate of 0.4%. Thus, the risk framework of the three models is viable from a business and risk perspective.
Keywords: payment fraud risk management; anomaly detection; ensemble models; integration of machine learning and statistical risk modelling; economic optimization of machine learning outputs
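"Economic optimization of machine learning results" typically means choosing the operating point of a scorer by expected financial loss rather than by accuracy. A minimal sketch of that idea follows, under an assumed cost model (a missed fraud costs its transaction amount, each false alarm costs a flat investigation fee); the paper's actual loss model and channel aggregation are richer than this:

```python
def best_threshold(scores, labels, amounts, fp_cost):
    """Pick the score threshold minimizing expected financial loss.

    scores  -- model fraud scores per transaction
    labels  -- 1 for fraud, 0 for legitimate
    amounts -- transaction amounts (lost if a fraud goes unflagged)
    fp_cost -- assumed flat cost of investigating a false alarm
    """
    candidates = sorted(set(scores)) + [float("inf")]
    best_t, best_loss = None, float("inf")
    for t in candidates:
        loss = 0.0
        for s, y, amt in zip(scores, labels, amounts):
            flagged = s >= t
            if y == 1 and not flagged:   # missed fraud: lose the amount
                loss += amt
            elif y == 0 and flagged:     # false alarm: investigation cost
                loss += fp_cost
        if loss < best_loss:
            best_t, best_loss = t, loss
    return best_t, best_loss
```

The same scorer can thus yield very different business outcomes depending on the threshold, which is why the abstract separates fraud detection from its economic optimization.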