9,544 journal articles found
1. Energy Aware Task Scheduling of IoT Application Using a Hybrid Metaheuristic Algorithm in Cloud Computing
Authors: Ahmed Awad Mohamed, Eslam Abdelhakim Seyam, Ahmed R. Elsaeed, Laith Abualigah, Aseel Smerat, Ahmed M. AbdelMouty, Hosam E. Refaat. Computers, Materials & Continua, 2026, No. 3, pp. 1786-1803 (18 pages).
In recent years, fog computing has become an important environment for dealing with the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes in cloud computing. With the large amount of data and user requests, achieving the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy consumption across nodes in fog computing when users execute tasks through the least-cost paths. Task scheduling is developed using a modified artificial ecosystem optimization (AEO) combined with swarm operators from the Salp Swarm Algorithm (SSA), in order to competitively optimize their capabilities during the exploitation phase of the optimal search process. The proposed strategy, Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), attempts to find the most suitable solution by combining cost and energy in a multi-objective task scheduling optimization problem. The knapsack (backpack) problem is also added to improve both cost and energy in the iFogSim implementation. A comparison was made between the proposed strategy and other strategies in terms of time, cost, energy, and productivity. Experimental results showed that the proposed strategy improved energy consumption, cost, and time over other algorithms. Simulation results demonstrate that the proposed algorithm reduces the average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
Keywords: energy-efficient tasks; Internet of Things (IoT); cloud-fog computing; artificial ecosystem-based optimization; salp swarm algorithm; cloud computing
2. Research on the Optimal Scheduling Model of Energy Storage Plant Based on Edge Computing and Improved Whale Optimization Algorithm
Authors: Zhaoyu Zeng, Fuyin Ni. Energy Engineering, 2025, No. 3, pp. 1153-1174 (22 pages).
Energy storage power plants are critical in balancing power supply and demand. However, the scheduling of these plants faces significant challenges, including high network transmission costs and inefficient inter-device energy utilization. To tackle these challenges, this study proposes an optimal scheduling model for energy storage power plants based on edge computing and the improved whale optimization algorithm (IWOA). The proposed model designs an edge computing framework, transferring a large share of data processing and storage tasks to the network edge. This architecture effectively reduces transmission costs by minimizing data travel time. In addition, the model considers demand response strategies and builds an objective function based on minimizing the sum of electricity purchase cost and operation cost. The IWOA enhances the optimization process by utilizing adaptive weight adjustments and an optimal neighborhood perturbation strategy, preventing the algorithm from converging to suboptimal solutions. Experimental results demonstrate that the proposed scheduling model maximizes the flexibility of the energy storage plant, facilitating efficient charging and discharging. It successfully achieves peak shaving and valley filling for both electrical and heat loads, promoting the effective utilization of renewable energy sources. The edge computing framework significantly reduces transmission delays between energy devices. Furthermore, IWOA outperforms traditional algorithms in optimizing the objective function.
Keywords: energy storage plant; edge computing; optimal energy scheduling; improved whale optimization algorithm
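The adaptive-weight idea behind an improved whale optimization algorithm can be sketched as a single position update: a weight that decays over iterations scales the pull toward the best whale, so early iterations explore and later ones exploit. The decay schedule and parameter choices below are illustrative assumptions, not the paper's exact formulas.

```python
import math
import random

def iwoa_step(pos, best, t, t_max, b=1.0):
    """One whale position update with an adaptive inertia weight.

    w decays nonlinearly over iterations (assumed schedule), damping the
    pull toward the current best solution as the search matures.
    """
    w = 1 - math.sin(math.pi / 2 * t / t_max)  # assumed adaptive weight
    a = 2 * (1 - t / t_max)                    # standard WOA control parameter
    A = 2 * a * random.random() - a
    C = 2 * random.random()
    if random.random() < 0.5:
        if abs(A) < 1:  # encircling prey (exploitation)
            return [w * xb - A * abs(C * xb - x) for x, xb in zip(pos, best)]
        rand = [random.uniform(-1, 1) for _ in pos]  # explore near a random point
        return [w * xr - A * abs(C * xr - x) for x, xr in zip(pos, rand)]
    l = random.uniform(-1, 1)  # logarithmic spiral toward the best whale
    return [abs(xb - x) * math.exp(b * l) * math.cos(2 * math.pi * l) + w * xb
            for x, xb in zip(pos, best)]
```

In a full optimizer this step would run for every whale each iteration, followed by fitness evaluation and best-solution bookkeeping.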
3. An Adaptive Firefly Algorithm for Dependent Task Scheduling in IoT-Fog Computing
Author: Adil Yousif. Computer Modeling in Engineering & Sciences, 2025, No. 3, pp. 2869-2892 (24 pages).
The Internet of Things (IoT) has emerged as an important future technology. IoT-Fog is a new computing paradigm that processes IoT data on servers close to the source of the data. In IoT-Fog computing, resource allocation and task scheduling aim to deliver the short response time services demanded by IoT devices and performed by fog servers. The heterogeneity of IoT-Fog resources and the huge amount of data that needs to be processed by IoT-Fog tasks make scheduling fog computing tasks a challenging problem. This study proposes an Adaptive Firefly Algorithm (AFA) for dependent task scheduling in IoT-Fog computing. The proposed AFA is a modified version of the standard Firefly Algorithm (FA) that considers the execution times of submitted tasks, the impact of synchronization requirements, and the communication time between dependent tasks. As IoT-Fog computing depends mainly on distributed fog node servers that receive tasks dynamically, tackling the communication and synchronization issues between dependent tasks is a challenging problem. The proposed AFA aims to address the dynamic nature of IoT-Fog computing environments and uses a dynamic light absorption coefficient to control the decrease in attractiveness over iterations. Its performance was benchmarked against the standard Firefly Algorithm (FA), Puma Optimizer (PO), Genetic Algorithm (GA), and Ant Colony Optimization (ACO) through simulations under light, typical, and heavy workload scenarios. In heavy workloads, the proposed AFA obtained the shortest average execution time, 968.98 ms, compared to 970.96, 1352.87, 1247.28, and 1773.62 ms for FA, PO, GA, and ACO, respectively. The simulation results demonstrate the proposed AFA's ability to rapidly converge to optimal solutions, emphasizing its adaptability and efficiency in typical and heavy workloads.
Keywords: fog computing; scheduling; resource management; firefly algorithm; genetic algorithm; ant colony optimization
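The firefly mechanic the AFA modifies is the attractiveness function beta(r) = beta0 * exp(-gamma * r^2): raising the light absorption coefficient gamma over iterations shrinks attraction, and hence step sizes. A minimal sketch follows; the linear gamma schedule is an assumption, since the paper's exact update is not given here.

```python
import math
import random

def attractiveness(beta0, gamma, r):
    """Attractiveness decays with distance: beta(r) = beta0 * exp(-gamma * r^2)."""
    return beta0 * math.exp(-gamma * r * r)

def dynamic_gamma(gamma0, t, t_max):
    """Illustrative dynamic light absorption coefficient: gamma grows with
    the iteration count, shrinking attraction (and step sizes) over time.
    This linear ramp is an assumption, not the paper's exact schedule."""
    return gamma0 * (1 + t / t_max)

def move(xi, xj, beta0, gamma, alpha):
    """Move firefly xi toward brighter firefly xj, plus a small random step
    scaled by alpha."""
    r = math.dist(xi, xj)
    b = attractiveness(beta0, gamma, r)
    return [x + b * (y - x) + alpha * (random.random() - 0.5)
            for x, y in zip(xi, xj)]
```

With alpha set to 0 the move is deterministic, which makes the contraction of steps under a growing gamma easy to observe.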
4. Hybrid Spotted Hyena and Whale Optimization Algorithm-Based Dynamic Load Balancing Technique for Cloud Computing Environment
Authors: N Jagadish Kumar, R Praveen, D Selvaraj, D Dhinakaran. China Communications, 2025, No. 8, pp. 206-227 (22 pages).
The uncertain nature of mapping user tasks to Virtual Machines (VMs) causes system failure or execution delay in cloud computing. To maximize cloud resource throughput and decrease user response time, load balancing is needed to overcome user task execution delay and system failure. Most swarm-intelligent dynamic load balancing solutions that used hybrid metaheuristic algorithms failed to balance exploitation and exploration, and most load balancing methods were insufficient to handle the growing uncertainty in job distribution to VMs. Thus, the Hybrid Spotted Hyena and Whale Optimization Algorithm-based Dynamic Load Balancing Mechanism (HSHWOA) partitions traffic among numerous VMs or servers to guarantee that user tasks are completed quickly. This load balancing approach improves performance by considering average network latency, dependability, and throughput. The hybridization of SHOA and WOA aims to improve the trade-off between exploration and exploitation, assign jobs to VMs with more solution diversity, and prevent the solution from reaching a local optimum. Pysim-based experimental verification and testing of the proposed HSHWOA showed a 12.38% improvement in minimized makespan, a 16.21% increase in mean throughput, and a 14.84% increase in network stability compared to baseline load balancing strategies such as the Fractional Improved Whale Social Optimization based VM migration strategy (FIWSOA), HDWOA, and Binary Bird Swap.
Keywords: cloud computing; load balancing; Spotted Hyena Optimization Algorithm (SHOA); throughput; Virtual Machines (VMs); Whale Optimization Algorithm (WOA)
5. VHO Algorithm for Heterogeneous Networks of UAV-Hangar Cluster Based on GA Optimization and Edge Computing
Authors: Siliang Chen, Dongri Shan, Yansheng Niu. Computers, Materials & Continua, 2025, No. 12, pp. 5263-5286 (24 pages).
With the increasing deployment of Unmanned Aerial Vehicle-Hangar (UAV-H) clusters in dynamic environments such as disaster response and precision agriculture, existing networking schemes often struggle with adaptability to complex scenarios, while traditional Vertical Handoff (VHO) algorithms fail to fully address the unique challenges of UAV-H systems, including high-speed mobility and limited computational resources. To bridge this gap, this paper proposes a heterogeneous network architecture integrating 5th Generation Mobile Communication Technology (5G) cellular networks and self-organizing mesh networks for UAV-H clusters, accompanied by a novel VHO algorithm. The proposed algorithm leverages Multi-Attribute Decision-Making (MADM) theory combined with Genetic Algorithm (GA) optimization, incorporating edge computing to enable real-time decision-making and offload computational tasks efficiently. By constructing a utility function through attribute and weight matrices, the algorithm ensures that UAV-H clusters dynamically select the optimal network access with the highest utility value. Simulation results demonstrate that the proposed method reduces network handoff times by 26.13% compared to the Decision Tree VHO (DT-VHO), effectively mitigating the ping-pong effect, and enhances total system throughput by 19.99% under the same conditions. In terms of handoff delay, it outperforms the Artificial Neural Network VHO (ANN-VHO), significantly improving the Quality of Service (QoS). Finally, real-world hardware platform experiments validate the algorithm's feasibility and superior performance in practical UAV-H cluster operations. This work provides a robust solution for seamless network connectivity in high-mobility UAV clusters, offering critical support for emerging applications requiring reliable and efficient wireless communication.
Keywords: vertical handoff; heterogeneous networks; genetic algorithm; multiple-attribute decision-making; unmanned aerial vehicle; edge computing
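A MADM utility of the kind described, built from an attribute matrix and a weight vector, can be sketched as a weighted sum of min-max normalized attributes, with the candidate network of highest utility selected. The attribute names, values, and normalization choice below are illustrative assumptions.

```python
def network_utility(attributes, weights, benefit):
    """Score candidate networks by a weighted sum of min-max normalized
    attributes. benefit[j] is True when larger is better (e.g. bandwidth)
    and False when smaller is better (e.g. delay)."""
    cols = list(zip(*attributes))          # one column per attribute
    norm_cols = []
    for j, col in enumerate(cols):
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0            # avoid division by zero
        if benefit[j]:
            norm_cols.append([(v - lo) / span for v in col])
        else:
            norm_cols.append([(hi - v) / span for v in col])
    rows = list(zip(*norm_cols))           # back to one row per candidate
    return [sum(w * v for w, v in zip(weights, row)) for row in rows]

# pick the network with the highest utility (hypothetical candidates)
nets = [[50, 20], [30, 5]]                 # [bandwidth Mbps, delay ms]
scores = network_utility(nets, [0.6, 0.4], [True, False])
best = scores.index(max(scores))
```

In the paper's setting the weight vector would itself be tuned by the GA rather than fixed by hand as here.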
6. Innovative Approaches to Task Scheduling in Cloud Computing Environments Using an Advanced Willow Catkin Optimization Algorithm
Authors: Jeng-Shyang Pan, Na Yu, Shu-Chuan Chu, An-Ning Zhang, Bin Yan, Junzo Watada. Computers, Materials & Continua, 2025, No. 2, pp. 2495-2520 (26 pages).
The widespread adoption of cloud computing has underscored the critical importance of efficient resource allocation and management, particularly in task scheduling, which involves assigning tasks to computing resources for optimized resource utilization. Several meta-heuristic algorithms have shown effectiveness in task scheduling, among which the relatively recent Willow Catkin Optimization (WCO) algorithm has demonstrated potential, albeit with apparent needs for enhanced global search capability and convergence speed. To address these limitations of WCO in cloud computing task scheduling, this paper introduces an improved version termed the Advanced Willow Catkin Optimization (AWCO) algorithm. AWCO enhances the algorithm's performance by augmenting its global search capability through a quasi-opposition-based learning strategy and accelerating its convergence speed via sinusoidal mapping. A comprehensive evaluation utilizing the CEC2014 benchmark suite, comprising 30 test functions, demonstrates that AWCO achieves superior optimization outcomes, surpassing conventional WCO and a range of established meta-heuristics. The proposed algorithm also considers trade-offs among the cost, makespan, and load balancing objectives. Experimental results of AWCO are compared with those obtained using other meta-heuristics, illustrating that the proposed algorithm provides superior performance in task scheduling. The method offers a robust foundation for enhancing the utilization of cloud computing resources in the domain of task scheduling within a cloud computing environment.
Keywords: willow catkin optimization algorithm; cloud computing; task scheduling; opposition-based learning strategy
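Quasi-opposition-based learning, the global-search booster named in the abstract, evaluates for each candidate a point sampled between the interval center and the exact opposite point. A minimal dimension-wise sketch, with uniform sampling assumed:

```python
import random

def quasi_opposite(x, lo, hi):
    """Quasi-opposition-based learning: for each dimension of candidate x,
    sample a point between the interval center (lo + hi) / 2 and the
    opposite point lo + hi - x. Evaluating both x and its quasi-opposite
    widens the initial search at little extra cost."""
    qo = []
    for xi, l, h in zip(x, lo, hi):
        center = (l + h) / 2
        opposite = l + h - xi
        a, b = sorted((center, opposite))
        qo.append(random.uniform(a, b))
    return qo
```

A typical use is to generate the quasi-opposite of every initial candidate and keep whichever of the pair has the better fitness.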
7. The Application of Hybrid Krill Herd Artificial Hummingbird Algorithm for Scientific Workflow Scheduling in Fog Computing (cited: 1)
Authors: Aveen Othman Abdalrahman, Daniel Pilevarzadeh, Shafi Ghafouri, Ali Ghaffari. Journal of Bionic Engineering (SCIE, EI, CSCD), 2023, No. 5, pp. 2443-2464 (22 pages).
Fog Computing (FC) provides processing and storage resources at the edge of the Internet of Things (IoT). By doing so, FC can help reduce latency and improve the reliability of IoT networks. The energy consumption of servers and computing resources is one of the factors that directly affect conservation costs in fog environments. Energy consumption can be reduced by efficacious scheduling methods so that tasks are offloaded to the best possible resources. To deal with this problem, a binary model based on the combination of the Krill Herd Algorithm (KHA) and the Artificial Hummingbird Algorithm (AHA) is introduced as Binary KHA-AHA (BAHA-KHA). KHA is used to improve AHA. Also, the BAHA-KHA local optimum problem for task scheduling in FC environments is solved using the dynamic voltage and frequency scaling (DVFS) method. The Heterogeneous Earliest Finish Time (HEFT) method is used to discover the order of task flow execution. The goal of the BAHA-KHA model is to minimize the number of resources, the communication between dependent tasks, and energy consumption. In this paper, the FC environment is considered to address the workflow scheduling issue, reducing energy consumption and minimizing makespan on fog resources. The results were tested on five different workflows (Montage, CyberShake, LIGO, SIPHT, and Epigenomics). The evaluations show that the BAHA-KHA model has the best performance in comparison with the AHA, KHA, PSO, and GA algorithms, reducing the makespan by about 18% and energy consumption by about 24% compared with GA.
Keywords: workflow scheduling; fog computing; Internet of Things; hummingbird algorithm; krill algorithm
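HEFT, which BAHA-KHA uses to order task-flow execution, ranks each task by its "upward rank": the task's average computation cost plus the maximum, over its successors, of communication cost plus the successor's rank, then schedules tasks in decreasing rank order. A small self-contained sketch on a four-task DAG (costs are made-up numbers):

```python
def upward_rank_order(tasks, succ, cost, comm):
    """HEFT task ordering: rank(t) = cost[t] + max over successors s of
    (comm[(t, s)] + rank(s)); tasks are scheduled in decreasing rank."""
    memo = {}
    def rank(t):
        if t not in memo:
            memo[t] = cost[t] + max(
                (comm.get((t, s), 0) + rank(s) for s in succ.get(t, [])),
                default=0)
        return memo[t]
    return sorted(tasks, key=rank, reverse=True)

# tiny diamond DAG: a -> b -> d and a -> c -> d
order = upward_rank_order(
    ["a", "b", "c", "d"],
    {"a": ["b", "c"], "b": ["d"], "c": ["d"]},
    {"a": 2, "b": 3, "c": 1, "d": 2},
    {("a", "b"): 1, ("a", "c"): 1, ("b", "d"): 1, ("c", "d"): 1})
# ranks: d=2, c=4, b=6, a=9, so the schedule order is a, b, c, d
```

In the full HEFT heuristic this ordering step is followed by assigning each task to the processor that minimizes its earliest finish time, which is omitted here.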
8. The Objective Function Value Optimization of Cloud Computing Resources Security Allocation of Artificial Firefly Algorithm
Author: Xiaoxi Hu. Open Journal of Optimization, 2015, No. 2, pp. 40-46 (7 pages).
To address the problems of the current cloud computing resources security distribution model, namely poor optimization effect and poor convergence, this paper puts forward a cloud computing resources security distribution model based on an improved artificial firefly algorithm. First, according to the characteristics of the artificial firefly swarm algorithm and the complex method, it incorporates the ideas of the complex method into the artificial firefly algorithm, using the complex method to guide the search of artificial fireflies in the population, and then introduces a local search operator into the firefly movement mechanism, in order to improve the search efficiency and convergence precision of the algorithm. Simulation results show that the cloud computing resources security distribution model based on the improved artificial firefly algorithm proposed in this paper has good convergence and optimization efficiency.
Keywords: cloud computing resources security distribution; improved artificial firefly algorithm; complex method; local search operator
9. Study on the Distributed Routing Algorithm and Its Security for Peer-to-Peer Computing
Author: ZHOU Shi-jie. Journal of Electronic Science and Technology of China, 2005, No. 2, pp. 187-188 (2 pages).
Keywords: peer-to-peer computing; P2P; distributed computing; information security; distributed routing algorithm; bidding-electing algorithm; one-way accumulator
10. Research of the DBN Algorithm Based on Multi-innovation Theory and Application of Social Computing
Authors: Pinle Qin, Meng Li, Qiguang Miao, Chuanpeng Li. 国际计算机前沿大会会议论文集, 2016, No. 1, pp. 147-149 (3 pages).
To address the problems of small gradients, low learning rate, and slow error convergence when a DBN uses the back-propagation process to fix the network connection weights and biases, this paper proposes a new algorithm that combines multi-innovation theory with the standard DBN algorithm: the multi-innovation DBN (MI-DBN). It sets up a new model of the back-propagation process in the DBN algorithm, extending the use of a single innovation in the previous algorithm to the innovations of multiple preceding periods, thus largely increasing the convergence rate of the error. The application of the algorithm in social computing is then studied to recognize meaningful information about handwritten numbers in social networking images. This paper compares the MI-DBN algorithm with other representative classifiers through experiments. The results show that the MI-DBN algorithm, compared with other representative classifiers, has a faster convergence rate and a smaller error for MNIST dataset recognition, and handwritten numbers in images are also recognized with precision.
Keywords: DBN algorithm; convergence error; multi-innovation theory; MI-DBN algorithm; social computing
11. A Spark-Based Intelligent Detection Model for Aging Faults of Sealing Components in Coal Mine Equipment
Authors: Xu Hua, Shao Hua, Su Guorui, Wang Ze, Gao Wenzhong. 粘接 (Adhesion), 2026, No. 1, pp. 61-64 (4 pages).
The safe and efficient operation of coal mine equipment is vital to ensuring the energy supply. Given the harsh service environment of coal mine equipment, and in particular the tendency of components containing organic materials to age and fail, an intelligent fault-diagnosis model based on XGBoost deep learning on the Spark computing framework is constructed. The model uses Spark to process the massive operating data of coal mine equipment and feeds the data to the XGBoost model as training and test data for deep learning, achieving intelligent diagnosis of equipment faults. The proposed Spark-based XGBoost diagnosis model was compared with a random forest model and with the Hadoop computing framework; the results show that the proposed model identifies equipment fault types with high accuracy and runs in less than 1/40 of the time required by the Hadoop framework. This offers a useful reference for intelligent fault diagnosis of coal mine equipment and for ensuring its safe, stable operation.
Keywords: Spark computing framework; XGBoost deep learning algorithm; coal mine equipment; fault diagnosis
12. Random State Approach to Quantum Computation of Electronic-Structure Properties
Authors: Yiran Bai, Feng Xiong, Xueheng Kuang. Chinese Physics Letters, 2026, No. 1, pp. 89-104 (16 pages).
Classical computation of electronic properties in large-scale materials remains challenging. Quantum computation has the potential to offer advantages in memory footprint and computational scaling. However, general and viable quantum algorithms for simulating large-scale materials are still limited. We propose and implement random-state quantum algorithms to calculate electronic-structure properties of real materials. Using a random state circuit on a small number of qubits, we employ real-time evolution with first-order Trotter decomposition and the Hadamard test to obtain the electronic density of states, and we develop a modified quantum phase estimation algorithm to calculate the real-space local density of states via direct quantum measurements. Furthermore, we validate these algorithms by numerically computing the density of states and spatial distributions of electronic states in graphene, twisted bilayer graphene quasicrystals, and fractal lattices, covering system sizes from hundreds to thousands of atoms. Our results manifest that the random-state quantum algorithms provide a general and qubit-efficient route to scalable simulations of electronic properties in large-scale periodic and aperiodic materials.
Keywords: periodic materials; random state circuit; random-state quantum algorithms; electronic-structure properties; density of states; aperiodic materials; quantum algorithms; quantum computation
13. An Algorithm for Cloud-based Web Service Combination Optimization Through Plant Growth Simulation
Authors: Li Qiang, Qin Huawei, Qiao Bingqin, Wu Ruifang. Journal of System Simulation (系统仿真学报), 2025, No. 2, pp. 462-473 (12 pages).
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. This model first used mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm was established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Keywords: cloud-based service; scheduling algorithm; resource constraint; load optimization; cloud computing; plant growth simulation algorithm
14. Numbering and Generating Quantum Algorithms
Author: Mohamed A. El-Dosuky. Journal of Computer and Communications, 2025, No. 2, pp. 126-141 (16 pages).
Quantum computing offers unprecedented computational power, enabling simultaneous computations beyond traditional computers. Quantum computers differ significantly from classical computers, necessitating a distinct approach to algorithm design, which involves taming quantum mechanical phenomena. This paper extends the numbering of computable programs to the quantum computing context. Numbering computable programs is a theoretical computer science concept that assigns unique numbers to individual programs or algorithms; common methods include Gödel numbering, which encodes programs as strings of symbols or characters and is often used in formal systems and mathematical logic. Based on the proposed numbering approach, this paper presents a mechanism to explore the set of possible quantum algorithms. The proposed approach is able to construct useful circuits such as the Quantum Key Distribution BB84 protocol, which enables a sender and receiver to establish a secure cryptographic key via a quantum channel. The proposed approach facilitates the process of exploring and constructing quantum algorithms.
Keywords: quantum algorithms; numbering; computable programs; quantum key distribution
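The numbering idea, assigning every program a unique natural number and decoding numbers back into programs, can be illustrated with gate sequences. The sketch below uses bijective base-k numeration over a three-gate alphabet, so every natural number maps to exactly one sequence and back; the gate set and encoding details are assumptions, not the paper's exact scheme.

```python
def number_to_circuit(n, gates=("H", "X", "CNOT")):
    """Decode a natural number into a gate sequence via bijective base-k
    numeration (a Godel-style numbering): 0 is the empty circuit, and
    every n >= 1 decodes to a unique non-empty sequence."""
    base, seq = len(gates), []
    while n > 0:
        n, d = divmod(n - 1, base)  # bijective digits run 1..base
        seq.append(gates[d])
    return seq

def circuit_to_number(seq, gates=("H", "X", "CNOT")):
    """Inverse mapping: encode a gate sequence back into its number."""
    n = 0
    for g in reversed(seq):
        n = n * len(gates) + gates.index(g) + 1
    return n
```

Enumerating n = 0, 1, 2, ... then walks through all finite circuits over the alphabet, which is the exploration mechanism the abstract alludes to.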
15. Dynamic Multi-Objective Gannet Optimization (DMGO): An Adaptive Algorithm for Efficient Data Replication in Cloud Systems
Authors: P. William, Ved Prakash Mishra, Osamah Ibrahim Khalaf, Arvind Mukundan, Yogeesh N, Riya Karmakar. Computers, Materials & Continua, 2025, No. 9, pp. 5133-5156 (24 pages).
Cloud computing has become an essential technology for the management and processing of large datasets, offering scalability, high availability, and fault tolerance. However, optimizing data replication across multiple data centers poses a significant challenge, especially when balancing competing goals such as latency, storage costs, energy consumption, and network efficiency. This study introduces a novel dynamic optimization algorithm called Dynamic Multi-Objective Gannet Optimization (DMGO), designed to enhance data replication efficiency in cloud environments. Unlike traditional static replication systems, DMGO adapts dynamically to variations in network conditions, system demand, and resource availability. The approach utilizes multi-objective optimization techniques to efficiently balance data access latency, storage efficiency, and operational costs. DMGO continuously evaluates data center performance and adjusts replication strategies in real time to guarantee optimal system efficiency. Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms, achieving faster data access, lower storage overhead, reduced energy consumption, and improved scalability. The proposed methodology offers a robust and adaptable solution for modern cloud systems, ensuring efficient resource consumption while maintaining high performance.
Keywords: cloud computing; data replication; dynamic optimization; multi-objective optimization; gannet optimization algorithm; adaptive algorithms; resource efficiency; scalability; latency reduction; energy-efficient computing
16. Priority-Based Scheduling and Orchestration in Edge-Cloud Computing: A Deep Reinforcement Learning-Enhanced Concurrency Control Approach
Authors: Mohammad A Al Khaldy, Ahmad Nabot, Ahmad Al-Qerem, Mohammad Alauthman, Amina Salhi, Suhaila Abuowaida, Naceur Chihaoui. Computer Modeling in Engineering & Sciences, 2025, No. 10, pp. 673-697 (25 pages).
The exponential growth of Internet of Things (IoT) devices has created unprecedented challenges in data processing and resource management for time-critical applications. Traditional cloud computing paradigms cannot meet the stringent latency requirements of modern IoT systems, while pure edge computing faces resource constraints that limit processing capabilities. This paper addresses these challenges by proposing a novel Deep Reinforcement Learning (DRL)-enhanced priority-based scheduling framework for hybrid edge-cloud computing environments. Our approach integrates adaptive priority assignment with a two-level concurrency control protocol that ensures both optimal performance and data consistency. The framework introduces three key innovations: (1) a DRL-based dynamic priority assignment mechanism that learns from system behavior, (2) a hybrid concurrency control protocol combining local edge validation with global cloud coordination, and (3) an integrated mathematical model that formalizes sensor-driven transactions across edge-cloud architectures. Extensive simulations across diverse workload scenarios demonstrate significant quantitative improvements: 40% latency reduction, 25% throughput increase, 85% resource utilization (compared to 60% for heuristic methods), 40% reduction in energy consumption (300 vs. 500 J per task), and a 50% improvement in scalability factor (1.8 vs. 1.2 for EDF) compared to state-of-the-art heuristic and meta-heuristic approaches. These results establish the framework as a robust solution for large-scale IoT and autonomous applications requiring real-time processing with consistency guarantees.
Keywords: edge computing; cloud computing; scheduling algorithms; orchestration strategies; deep reinforcement learning; concurrency control; real-time systems; IoT
17. DNA Encoding Optimisation Based on Thermodynamics
Authors: Xianhang Luo, Kai Zhang, Enqiang Zhu, Jin Xu. CAAI Transactions on Intelligence Technology, 2025, No. 6, pp. 1829-1843 (15 pages).
Due to their exceptional programmability, DNA molecules are widely employed in the design of molecular circuits for applications such as DNA computing, DNA storage, and cancer diagnosis and treatment. The quality of DNA sequences directly determines the reliability of these molecular circuits. However, existing DNA encoding algorithms suffer from limitations such as reliance on Hamming distance and conflicts among multiple objectives, resulting in insufficient stability of the generated sequences. To address these issues, this paper proposes a thermodynamics-based multi-objective evolutionary optimisation algorithm (TEMOA). The core innovations of the proposed algorithm are as follows. First, a thermodynamics-based DNA encoding modelling strategy (TDEMS) is introduced, which simplifies the encoding process and significantly improves sequence quality by incorporating thermodynamic stability constraints. Second, two diversity optimisation strategies, the diversity assessment strategy (DAS) and the front equalisation nondominated sorting (FENS) strategy, are designed to enhance the algorithm's global search capability. Finally, a flexible fitness function design is incorporated to accommodate diverse user requirements. Experimental results demonstrate that TEMOA is more effective than state-of-the-art methods on challenging multi-objective optimisation problems, and the DNA sequences generated by TEMOA exhibit greater reliability than those produced by traditional DNA encoding algorithms.
Keywords: biology computing; genetic algorithms; minimisation
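The thermodynamic stability constraints TDEMS imposes can be approximated at the simplest level by melting-temperature and GC-content checks on candidate sequences. The Wallace rule below is a standard coarse estimate for short oligos; TEMOA itself uses a fuller thermodynamic model not reproduced here.

```python
def wallace_tm(seq):
    """Wallace-rule melting temperature estimate (deg C) for short oligos:
    Tm = 2*(A+T) + 4*(G+C). A coarse proxy for thermodynamic stability."""
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    return 2 * at + 4 * gc

def gc_content(seq):
    """Fraction of G/C bases, a common DNA-encoding stability constraint."""
    s = seq.upper()
    return (s.count("G") + s.count("C")) / len(s)
```

An encoding algorithm would reject or penalize candidates whose Tm or GC content falls outside a target window, keeping a generated sequence set thermodynamically uniform.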
18. Modified Neural Network Used for Host Utilization Prediction in Cloud Computing Environment
Authors: Arif Ullah, Siti Fatimah Abdul Razak, Sumendra Yogarayan, Md Shohel Sayeed. Computers, Materials & Continua, 2025, No. 3, pp. 5185-5204 (20 pages).
Networking, storage, and hardware are just a few of the virtual computing resources that the infrastructure service model offers, depending on what the client needs. One essential aspect of cloud computing that improves resource allocation techniques is host load prediction. Without it, hardware resource allocation in cloud computing still suffers from host initialization issues, which add several minutes to response times. To solve this issue and accurately predict cloud capacity, cloud data centers use prediction algorithms, permitting dynamic cloud scalability while maintaining superior service quality. For host prediction, we therefore present a hybrid convolutional neural network with long short-term memory (CNN-LSTM) model in this work. First, the input to the suggested hybrid model is subjected to the vector autoregression technique, which filters the multivariate data to eliminate linear interdependencies prior to analysis. The residual data are then processed and sent into the convolutional neural network layer, which extracts intricate details about the utilization of each virtual machine and central processing unit. The next step involves the long short-term memory layer, which is suitable for representing the temporal information of irregular trends in time-series components. A key element of the entire process is that we used the activation function most appropriate for this type of model, the scaled polynomial constant unit. Cloud systems require accurate prediction due to the increasing degree of unpredictability in data centers; two real load traces were therefore used to assess performance, one of them from a typical distributed system. In comparison to CNN, VAR-GRU, VAR-MLP, ARIMA-LSTM, and other models, the experimental results demonstrate that our suggested approach offers state-of-the-art performance with higher accuracy on both datasets.
Keywords: cloud computing, data center, virtual machine (VM), prediction algorithm
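The preprocessing step the abstract above describes, filtering multivariate load data with vector autoregression so that only the residuals (with linear interdependencies removed) are fed to the CNN-LSTM, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a VAR(1) model fit by ordinary least squares, and the coefficient matrix and noise scale below are made up for the demonstration.

```python
import numpy as np

def var1_residuals(series):
    """Fit a VAR(1) model x_t = c + A @ x_{t-1} + e_t by least squares
    and return the residuals e_t, i.e. what remains of the series after
    linear interdependencies between the variables are removed."""
    X = series[:-1]                            # lagged values, shape (T-1, k)
    Y = series[1:]                             # targets,       shape (T-1, k)
    Z = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return Y - Z @ coef                        # residuals, shape (T-1, k)

# Synthetic 3-variable utilization trace with known linear coupling.
rng = np.random.default_rng(0)
T, k = 200, 3
A = np.array([[0.5, 0.1, 0.0],
              [0.0, 0.4, 0.2],
              [0.1, 0.0, 0.3]])
x = np.zeros((T, k))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.normal(scale=0.1, size=k)

resid = var1_residuals(x)
print(resid.shape)
```

In the described pipeline these residuals, rather than the raw trace, would be windowed and passed to the convolutional layer.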
Stochastic Fractal Search: A Decade Comprehensive Review on Its Theory, Variants, and Applications
19
Authors: Mohammed A. El-Shorbagy, Anas Bouaouda, Laith Abualigah, Fatma A. Hashim; Computer Modeling in Engineering & Sciences, 2025, No. 3, pp. 2339-2404 (66 pages)
With the rapid advancements in technology and science, optimization theory and algorithms have become increasingly important. A wide range of real-world problems is classified as optimization challenges, and meta-heuristic algorithms have shown remarkable effectiveness in solving them across diverse domains, such as machine learning, process control, and engineering design, showcasing their capability to address complex optimization problems. The Stochastic Fractal Search (SFS) algorithm is one of the most popular meta-heuristic optimization methods, inspired by the fractal growth patterns of natural materials. Since its introduction by Hamid Salimi in 2015, SFS has garnered significant attention from researchers and has been applied to diverse optimization problems across multiple disciplines. Its popularity can be attributed to several factors, including its simplicity, practical computational efficiency, ease of implementation, rapid convergence, high effectiveness, and ability to address single- and multi-objective optimization problems, often outperforming other established algorithms. This review paper offers a comprehensive and detailed analysis of the SFS algorithm, covering its standard version, modifications, hybridizations, and multi-objective implementations. The paper also examines several SFS applications across diverse domains, including power and energy systems, image processing, machine learning, wireless sensor networks, environmental modeling, economics and finance, and numerous engineering challenges. Furthermore, the paper critically evaluates the SFS algorithm's performance, benchmarking its effectiveness against recently published meta-heuristic algorithms. In conclusion, the review highlights key findings and suggests potential directions for future developments and modifications of the SFS algorithm.
Keywords: meta-heuristic algorithms, stochastic fractal search, evolutionary computation, engineering applications, swarm intelligence, optimization
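The diffusion-then-update scheme at the core of SFS, as surveyed in the review above, can be illustrated with a deliberately simplified sketch. This is not any one paper's exact formulation: the step-size schedule follows the commonly cited log(g)/g shrinkage, the Gaussian walks are centred on the current best point, the update phase uses a single rank-based rule, and all function and parameter names are illustrative.

```python
import numpy as np

def sphere(x):
    """Toy objective: minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def sfs_sketch(obj, dim=5, pop=20, iters=100, walks=3, seed=1):
    """Simplified Stochastic Fractal Search: each particle performs a few
    Gaussian diffusion walks around the best point (step size shrinking as
    log(g)/g), then low-ranked particles are pulled toward random peers."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    P = rng.uniform(lo, hi, (pop, dim))
    fit = np.array([obj(p) for p in P])
    for it in range(iters):
        best = P[np.argmin(fit)]
        # Diffusion phase: walk standard deviation scales with distance
        # from the best point and decays over iterations.
        sigma = (np.log(it + 2) / (it + 2)) * np.abs(P - best)
        for i in range(pop):
            for _ in range(walks):
                cand = np.clip(rng.normal(best, sigma[i] + 1e-12), lo, hi)
                f = obj(cand)
                if f < fit[i]:          # greedy acceptance
                    P[i], fit[i] = cand, f
        # Update phase: the worse a particle's rank, the more likely it is
        # to be redrawn toward a randomly chosen peer.
        order = np.argsort(fit)
        for rank, i in enumerate(order):
            pa = (pop - rank) / pop
            if rng.random() > pa:
                j = rng.integers(pop)
                cand = np.clip(P[i] + rng.random() * (P[j] - P[i]), lo, hi)
                f = obj(cand)
                if f < fit[i]:
                    P[i], fit[i] = cand, f
    return P[np.argmin(fit)], float(fit.min())

best_x, best_f = sfs_sketch(sphere)
print(best_f)
```

Even this stripped-down version converges quickly on a smooth unimodal test function, which hints at why the full algorithm's rapid convergence is repeatedly highlighted in the surveyed literature.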
Introduction to the Special Issue on Emerging Artificial Intelligence Technologies and Applications
20
Authors: Wenfeng Zheng, Chao Liu, Lirong Yin; Computer Modeling in Engineering & Sciences, 2025, No. 9, pp. 2705-2707 (3 pages)
Artificial intelligence (AI) has evolved at an unprecedented pace in recent years. This rapid advancement includes algorithmic breakthroughs, cross-disciplinary integration, and diverse applications, driven by growing computational power, massive datasets, and collaborative global research. This special issue on Emerging Artificial Intelligence Technologies and Applications was conceived to provide a platform for communicating cutting-edge AI research, developing novel methodologies, cross-domain applications, and critical advancements in addressing real-world challenges. Over the past months, we have witnessed a remarkable diversity of submissions, reflecting the global trend of AI innovation. Below, we synthesize the key insights from these works, highlighting their collective contribution to advancing AI's theoretical frontiers and practical applications.
Keywords: cross-disciplinary integration, algorithmic breakthroughs, novel methodologies, cross-domain applications, artificial intelligence (AI), collaborative global research, computational power