Multi-label learning generalizes supervised single-label learning by assuming that each sample in a dataset may belong to more than one class simultaneously. The main objective of this work is to create a novel framework for learning and classifying imbalanced multi-label data. This work proposes a two-phase framework. The imbalanced distribution of the multi-label dataset is addressed through the proposed Borderline MLSMOTE resampling method in phase 1. Later, an adaptive weighted l21-norm regularized (Elastic-net) multi-label logistic regression is used to predict unseen samples in phase 2. The proposed Borderline MLSMOTE resampling method focuses on samples with high label concurrence, in contrast to conventional MLSMOTE. The minority labels in these samples are called difficult minority labels and are more prone to penalize classification performance. The concurrence measure is considered borderline, and labels associated with such samples are regarded as borderline labels near the decision boundary. In phase 2, a novel adaptive l21-norm regularized weighted multi-label logistic regression is used to handle the balanced data with differently weighted synthetic samples. Experimentation on various benchmark datasets shows that the proposed method outperforms existing conventional state-of-the-art multi-label methods and delivers strong predictive performance.
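The SMOTE-style interpolation that MLSMOTE-family resamplers build on can be sketched as follows. This is a minimal illustrative version of synthetic-sample generation, not the paper's Borderline MLSMOTE; the function name `smote_sample` and the fixed seed are assumptions for the example.

```python
import random

def smote_sample(x, neighbor, rng=random.Random(0)):
    """Interpolate a synthetic sample between a minority instance and
    one of its nearest neighbours, as in SMOTE-style oversampling."""
    gap = rng.random()  # uniform in [0, 1)
    return [xi + gap * (ni - xi) for xi, ni in zip(x, neighbor)]

x = [1.0, 2.0]
n = [3.0, 4.0]
s = smote_sample(x, n)
# every coordinate of the synthetic sample lies between the two originals
assert all(min(a, b) <= c <= max(a, b) for a, b, c in zip(x, n, s))
```

Borderline variants apply this only to minority samples near the decision boundary rather than to all minority samples.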
In recent times, sixth generation (6G) communication technologies have become a hot research topic because of the maximum-throughput and low-delay services they offer mobile users. 6G encompasses several heterogeneous resources and communication standards to ensure incessant availability of service. At the same time, the development of 6G enables Unmanned Aerial Vehicles (UAVs) to offer cost- and time-efficient solutions to several applications like healthcare, surveillance, disaster management, etc. In UAV networks, energy efficiency and data collection are considered the major processes for high-quality network communication. But these procedures are challenging because of high mobility, unstable links, dynamic topology, and the energy-restricted nature of UAVs. These issues can be addressed by the use of artificial intelligence (AI) and energy-efficient clustering techniques for UAVs in the 6G environment. With this inspiration, this work designs an artificial intelligence enabled cooperative cluster-based data collection technique for unmanned aerial vehicles (AECCDC-UAV) in the 6G environment. The proposed AECCDC-UAV technique aims to divide the UAV network into different clusters and allocate a cluster head (CH) to each cluster in such a way that the energy consumption (ECM) is minimized. The presented AECCDC-UAV technique involves a quasi-oppositional shuffled shepherd optimization (QOSSO) algorithm for selecting the CHs and constructing clusters. The QOSSO algorithm derives a fitness function involving three input parameters: residual energy of UAVs, distance to neighboring UAVs, and degree of UAVs. The performance of the AECCDC-UAV technique is validated in many aspects, and the obtained experimental values demonstrate promising results over recent state-of-the-art methods.
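A cluster-head fitness combining the three parameters named above (residual energy, distance to neighbours, node degree) is often expressed as a normalized weighted sum. The sketch below is an assumed form for illustration only; the weights and the exact combination in the paper's QOSSO fitness are not specified here.

```python
def ch_fitness(residual_energy, avg_neighbor_dist, degree,
               e_max, d_max, deg_max, w=(0.5, 0.3, 0.2)):
    """Hypothetical weighted fitness for cluster-head selection:
    favour high residual energy and degree, low neighbour distance.
    The weights w are illustrative, not taken from the paper."""
    w1, w2, w3 = w
    return (w1 * (residual_energy / e_max)
            + w2 * (1.0 - avg_neighbor_dist / d_max)
            + w3 * (degree / deg_max))

# a high-energy, well-connected, nearby node should score higher
good = ch_fitness(0.9, 20.0, 8, e_max=1.0, d_max=100.0, deg_max=10)
poor = ch_fitness(0.2, 90.0, 2, e_max=1.0, d_max=100.0, deg_max=10)
assert good > poor
```

An optimizer such as QOSSO would then search for the CH assignment maximizing this score network-wide.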
Recently, a massive quantity of data is being produced from a number of distinct sources, and the size of the data created daily on the Internet has crossed two exabytes. At the same time, clustering is one of the efficient techniques for mining big data to extract the useful and hidden patterns that exist in it. Density-based clustering techniques have gained significant attention owing to the fact that they help to effectively recognize complex patterns in spatial datasets. Big data clustering is a non-trivial process owing to the increasing quantity of data, which can be handled by the use of the MapReduce tool. With this motivation, this paper presents an efficient MapReduce-based hybrid density-based clustering and classification algorithm for big data analytics (MR-HDBCC). The proposed MR-HDBCC technique is executed on the MapReduce tool for handling big data. In addition, the MR-HDBCC technique involves three distinct processes, namely pre-processing, clustering, and classification. The proposed model utilizes the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) technique, which is capable of detecting arbitrary shapes and diverse clusters in noisy data. For improving the performance of the DBSCAN technique, a hybrid model using the cockroach swarm optimization (CSO) algorithm is developed for exploring the search space and determining the optimal parameters for density-based clustering. Finally, a bidirectional gated recurrent neural network (BGRNN) is employed for the classification of big data. The experimental validation of the proposed MR-HDBCC technique takes place using benchmark datasets, and the simulation outcomes demonstrate the promising performance of the proposed model in terms of different measures.
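The MapReduce pattern underlying such pipelines can be sketched in a few lines: map each record to key-value pairs, shuffle by key, then reduce each group. This toy in-process version (not Hadoop, and not the paper's MR-HDBCC) groups 1-D points into unit-width cells and counts them, the kind of density summary a distributed clustering step would aggregate.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Toy in-process MapReduce: map each record to (key, value) pairs,
    shuffle by key, then reduce each group of values."""
    groups = defaultdict(list)
    for rec in records:
        for key, val in mapper(rec):
            groups[key].append(val)
    return {key: reducer(key, vals) for key, vals in groups.items()}

# map 1-D points into unit-width density cells and count per cell
points = [0.1, 0.2, 0.9, 1.5, 1.6, 3.2]
cells = map_reduce(points,
                   mapper=lambda p: [(int(p), 1)],
                   reducer=lambda k, vs: sum(vs))
assert cells == {0: 3, 1: 2, 3: 1}
```

In a real deployment the mapper and reducer run on separate workers; the shuffle is what the framework provides.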
The growing global requirement for food and the need for sustainable farming in an era of a changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models effectively deal with such challenges. This research paper comprehensively analyses recent advancements in crop yield prediction from January 2016 to March 2024. In addition, it analyses the effectiveness of various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield. The total number of articles reviewed for crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL), and DL-based prediction models and input parameter selection is 125. We conducted the research by setting up five objectives and discussing them after analyzing the selected research papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used for estimating model performance. We also discuss the ethical and social impacts of AI on agriculture. Although the various approaches presented in the scientific literature have delivered impressive predictions, they are complicated by the intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Therefore, thorough research is required to deal with the challenges in predicting agricultural output.
Detecting brain tumours is complex due to the natural variation in their location, shape, and intensity in images. While accurate detection and segmentation of brain tumours would be beneficial, current methods still fall short despite the numerous available approaches. Precise analysis of Magnetic Resonance Imaging (MRI) is crucial for detecting, segmenting, and classifying brain tumours in medical diagnostics. MRI is a vital component of medical diagnosis, and it requires precise, efficient, careful, and reliable image analysis techniques. The authors developed a Deep Learning (DL) fusion model to classify brain tumours reliably. Deep Learning models require large amounts of training data to achieve good results, so the researchers utilised data augmentation techniques to increase the dataset size for training the models. VGG16, ResNet50, and convolutional deep belief networks extracted deep features from MRI images. Softmax was used as the classifier, and the training set was supplemented with intentionally created MRI images of brain tumours in addition to the genuine ones. The features of two DL models were combined in the proposed model to generate a fusion model, which significantly increased classification accuracy. An openly accessible dataset from the internet was used to test the model's performance, and the experimental results showed that the proposed fusion model achieved a classification accuracy of 98.98%. Finally, the results were compared with existing methods, and the proposed model outperformed them significantly.
Tourism is a popular activity that allows individuals to escape their daily routines and explore new destinations for various reasons, including leisure, pleasure, or business. A recent study has proposed a unique mathematical concept called a q-rung orthopair fuzzy hypersoft set (q-ROFHS) to enhance the formal representation of human thought processes and evaluate tourism carrying capacity. This approach can capture the imprecision and ambiguity often present in human perception. With the advanced mathematical tools in this field, the study has also incorporated the Einstein aggregation operator and score function into the q-ROFHS values to support multi-attribute decision-making algorithms. By implementing this technique, effective plans can be developed for social and economic development while avoiding detrimental effects such as overcrowding or environmental damage caused by tourism. A case study of selected tourism carrying capacity demonstrates the proposed methodology.
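For orientation, the standard score function of a q-rung orthopair fuzzy value (membership mu, non-membership nu) is s = mu**q - nu**q, defined under the constraint mu**q + nu**q <= 1. The sketch below illustrates that function only, not the hypersoft-set extension or the Einstein operators used in the study.

```python
def qrofs_score(mu, nu, q):
    """Score function of a q-rung orthopair fuzzy value (mu, nu):
    s = mu**q - nu**q, defined when mu**q + nu**q <= 1."""
    assert 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0
    assert mu ** q + nu ** q <= 1.0, "not a valid q-rung orthopair pair"
    return mu ** q - nu ** q

# (0.9, 0.6) violates mu + nu <= 1 for q = 1, but is admissible for q = 3;
# raising q enlarges the space of expressible evaluations
assert qrofs_score(0.9, 0.6, 3) > 0 > qrofs_score(0.6, 0.9, 3)
```

Alternatives with higher scores are ranked better; ties are usually broken by an accuracy function such as mu**q + nu**q.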
In this article, multiple attribute decision-making problems are solved using the vague normal set (VNS). It is possible to generalize the vague set (VS) and q-rung fuzzy set (FS) into the q-rung vague set. A log q-rung normal vague weighted averaging (log q-rung NVWA), a log q-rung normal vague weighted geometric (log q-rung NVWG), a log generalized q-rung normal vague weighted averaging (log Gq-rung NVWA), and a log generalized q-rung normal vague weighted geometric (log Gq-rung NVWG) operator are discussed in this article. A description is provided of the score function, accuracy function, and operational laws of the log q-rung VS. The algorithms underlying these functions are also described. A numerical example is provided to extend the Euclidean distance and the Hamming distance. Additionally, idempotency, boundedness, commutativity, and monotonicity of the log q-rung VS are examined, as they facilitate recognizing the optimal alternative more quickly and help clarify conceptualization. We chose five anemia patients with four types of symptoms, including seizures, emotional shock or hysteria, brain cause, and high fever, who had either retrograde amnesia, anterograde amnesia, transient global amnesia, post-traumatic amnesia, or infantile amnesia. Natural numbers q are used to express the results of the models. To demonstrate the effectiveness and accuracy of the models we are investigating, we compare several existing models with those that have been developed.
In the present work, we have employed machine learning (ML) techniques to evaluate the ductile-brittle (DB) behavior of intermetallic compounds (IMCs) which can form in magnesium (Mg) alloys. This procedure was mainly conducted by a proxy-based method, where the ratio of shear (G) to bulk (B) moduli was used as a proxy to identify whether a compound is ductile or brittle. Starting from compound information (composition and crystal structure) and moduli found in open databases (AFLOW), ML-based models were built, and those models were used to predict the moduli of other compounds and, accordingly, to foresee the ductile-brittle behavior of these new compounds. The results reached in the present work showed that the built models can effectively capture the elastic moduli of new compounds. This was confirmed through moduli calculations done by density functional theory (DFT) on some compounds, where the DFT calculations were consistent with the ML predictions. A further confirmation of the reliability of the built ML models was obtained by relating the DB behavior in MgBe_(13) and MgPd_(2), as evaluated by the ML-predicted moduli, to the nature of chemical bonding in these two compounds, which in turn was investigated by the charge density distribution (CDD) and electron localization function (ELF) obtained by the DFT methodology. The ML-evaluated DB behaviors of the two compounds were also consistent with the DFT calculations of CDD and ELF. These findings and confirmations gave legitimacy to the built model to be employed in further prediction processes. Indeed, as examples, the DB characteristics were investigated in IMCs that might form in three Mg alloy series, involving AZ, ZX, and WE.
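The G/B proxy referred to here is the classical Pugh ratio: materials with G/B below roughly 0.57 are empirically classed as ductile, above it as brittle. A minimal sketch of that classification rule follows; the moduli values in the example are illustrative numbers, not data from the paper.

```python
def pugh_label(G, B, threshold=0.57):
    """Classify ductile vs brittle via the Pugh ratio G/B.
    The common empirical cutoff is about 0.57: below it, ductile."""
    return "ductile" if G / B < threshold else "brittle"

# illustrative shear/bulk moduli in GPa (not values from the paper)
assert pugh_label(G=17.0, B=45.0) == "ductile"   # G/B ~ 0.38
assert pugh_label(G=48.0, B=60.0) == "brittle"   # G/B = 0.80
```

In the ML workflow described above, G and B come from the regression models rather than from direct DFT calculations, which is what makes the screening cheap.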
Human Activity Recognition (HAR) has been made simple in recent years, thanks to advancements made in Artificial Intelligence (AI) techniques. These techniques are applied in several areas like security, surveillance, healthcare, human-robot interaction, and entertainment. Since wearable sensor-based HAR systems include in-built sensors, human activities can be categorized based on sensor values. Further, HAR can also be employed in other applications such as gait diagnosis, observation of children's/adults' cognitive nature, stroke-patient hospital direction, epilepsy and Parkinson's disease examination, etc. Recently developed Artificial Intelligence (AI) techniques, especially Deep Learning (DL) models, can be deployed to accomplish effective outcomes in the HAR process. With this motivation, the current research paper focuses on designing an Intelligent Hyperparameter Tuned Deep Learning-based HAR (IHPTDL-HAR) technique for the healthcare environment. The proposed IHPTDL-HAR technique aims at recognizing human actions in a healthcare environment and helps patients in managing their healthcare service. In addition, the presented model makes use of a Hierarchical Clustering (HC)-based outlier detection technique to remove outliers. The IHPTDL-HAR technique incorporates a DL-based Deep Belief Network (DBN) model to recognize the activities of users. Moreover, the Harris Hawks Optimization (HHO) algorithm is used for hyperparameter tuning of the DBN model. Finally, a comprehensive experimental analysis was conducted on a benchmark dataset and the results were examined under different aspects. The experimental results demonstrate that the proposed IHPTDL-HAR technique is a superior performer compared to other recent techniques under different measures.
Agriculture is an important research area in the field of visual recognition by computers. Plant diseases affect the quality and yields of agriculture. Early-stage identification of crop disease decreases financial losses and positively impacts crop quality. The manual identification of crop diseases, which are mostly visible on leaves, is a very time-consuming and costly process. In this work, we propose a new framework for the recognition of cucumber leaf diseases. The proposed framework is based on deep learning and involves the fusion and selection of the best features. In the feature extraction phase, VGG (Visual Geometry Group) and Inception V3 deep learning models are considered and fine-tuned. Both fine-tuned models are trained using deep transfer learning. Features are extracted in a later step and fused using a parallel maximum fusion approach. Then, the best features are selected using the Whale Optimization Algorithm. The best-selected features are classified using supervised learning algorithms for the final classification process. The experimental process was conducted on a privately collected dataset that consists of five types of cucumber disease and achieved an accuracy of 96.5%. A comparison with recent techniques shows the significance of the proposed method.
Speech emotion recognition (SER) is an important research problem in human-computer interaction systems. The representation and extraction of features are significant challenges in SER systems. Despite the promising results of recent studies, they generally do not leverage progressive fusion techniques for effective feature representation and increased receptive fields. To mitigate this problem, this article proposes DeepCNN, which fuses spectral and temporal features of emotional speech by parallelising convolutional neural networks (CNNs) and a convolution layer-based transformer. Two parallel CNNs are applied to extract the spectral (2D-CNN) and temporal (1D-CNN) feature representations. A 2D-convolution layer-based transformer module extracts spectro-temporal features and concatenates them with the features from the parallel CNNs. The learnt low-level concatenated features are then applied to a deep framework of convolutional blocks, which retrieves a high-level feature representation and subsequently categorises the emotional states using an attention gated recurrent unit and a classification layer. This fusion technique results in a deeper hierarchical feature representation at a lower computational cost while simultaneously expanding the filter depth and reducing the feature map. The Berlin Database of Emotional Speech (EMO-DB) and Interactive Emotional Dyadic Motion Capture (IEMOCAP) datasets are used in experiments to recognise distinct speech emotions. With efficient spectral and temporal feature representation, the proposed SER model achieves 94.2% accuracy on the EMO-DB dataset and 81.1% accuracy on the IEMOCAP dataset. The proposed SER system, DeepCNN, outperforms the baseline SER systems in terms of emotion recognition accuracy on the EMO-DB and IEMOCAP datasets.
Recently, computation offloading has become an effective method for overcoming the constraints of a mobile device (MD) running computation-intensive mobile applications, by offloading delay-sensitive application tasks to a remote cloud-based data center. Smart cities benefit from offloading to edge points. Consider a mobile edge computing (MEC) network spanning multiple regions. It comprises N MDs and many access points, in which every MD has M independent real-time tasks. This study designs a new Task Offloading and Resource Allocation in IoT-based MEC using Deep Learning with Seagull Optimization (TORA-DLSGO) algorithm. The proposed TORA-DLSGO technique addresses the resource management issue in the MEC server, which enables an optimum offloading decision to minimize the system cost. In addition, an objective function is derived based on minimizing energy consumption subject to latency requirements and restricted resources. The TORA-DLSGO technique uses the deep belief network (DBN) model for optimum offloading decision-making. Finally, the SGO algorithm is used for the parameter tuning of the DBN model. The simulation results exemplify that the TORA-DLSGO technique outperforms the existing models in reducing client overhead in MEC systems, with a maximum reward of 0.8967.
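The energy-versus-latency trade-off behind such an objective can be sketched with a toy binary offloading rule: compute local and offload costs with the textbook kappa*f^2-per-cycle energy model, discard options that miss the latency budget, and pick the cheaper survivor. All parameter values and the model itself are illustrative assumptions, not the TORA-DLSGO formulation.

```python
def offload_decision(cycles, data_bits, f_local, f_edge, bandwidth,
                     p_tx, kappa=1e-27, latency_budget=0.5):
    """Toy binary offloading decision: compare local execution with
    offloading (uplink transfer + edge compute); return the lower-energy
    option that meets the latency budget, or 'drop' if none does."""
    t_loc = cycles / f_local                    # local execution time (s)
    e_loc = kappa * cycles * f_local ** 2       # local dynamic energy (J)
    t_off = data_bits / bandwidth + cycles / f_edge
    e_off = p_tx * (data_bits / bandwidth)      # device pays only tx energy
    options = [(e, t, name) for e, t, name in
               [(e_loc, t_loc, "local"), (e_off, t_off, "offload")]
               if t <= latency_budget]
    return min(options)[2] if options else "drop"

# heavy task, slow device: only offloading meets the 0.5 s budget
assert offload_decision(1e9, 1e6, 1e9, 1e10, 1e7, 0.1) == "offload"
# light task, large payload: uplink is too slow, run locally
assert offload_decision(1e8, 1e8, 1e9, 1e10, 1e7, 0.1) == "local"
```

A learned policy (the paper's DBN) effectively replaces this hand-written rule with one trained to minimize the same kind of system cost.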
Precision agriculture includes the optimum and adequate use of resources depending on several variables that govern crop yield. Precision agriculture offers a novel solution, utilizing a systematic technique, for current agricultural problems like balancing production and environmental concerns. Weed control has become one of the significant problems in the agricultural sector. In traditional weed control, the entire field is treated uniformly by spraying the soil, with a single herbicide dose applied to weeds and crops in the same way. For more precise farming, robots could accomplish targeted weed treatment if they could find the specific location of the dispensable plant and identify the weed type. This may reduce by a large margin the utilization of agrochemicals on agricultural fields and favour sustainable agriculture. This study presents a Harris Hawks Optimizer with Graph Convolutional Network based Weed Detection (HHOGCN-WD) technique for precision agriculture. The HHOGCN-WD technique mainly focuses on identifying and classifying weeds for precision agriculture. For image pre-processing, the HHOGCN-WD model utilizes a bilateral normal filter (BNF) for noise removal. In addition, a coupled convolutional neural network (CCNet) model is utilized to derive a set of feature vectors. To detect and classify weeds, the GCN model is utilized with the HHO algorithm as a hyperparameter optimizer to improve the detection performance. The experimental results of the HHOGCN-WD technique are investigated on a benchmark dataset. The results indicate the promising performance of the presented HHOGCN-WD model over other recent approaches, with an increased accuracy of 99.13%.
Wireless sensor networks (WSN) comprise a set of numerous cheap sensors placed in a target region. A primary function of the WSN is to provide the location details of event occurrences or nodes. A major challenge in WSN is node localization, which plays an important role in data gathering applications. Since GPS is expensive and inaccurate in indoor regions, effective node localization techniques are needed. The major intention of localization is to determine the position of a node in a short period with minimum computation. To achieve this, bio-inspired algorithms are used, and node localization is treated as an optimization problem in a multidimensional space. This paper introduces a new Sparrow Search Algorithm with Doppler Effect (SSA-DE) for node localization in wireless networks. The SSA is generally stimulated by the group wisdom, foraging, and anti-predation behaviors of sparrows. Besides, the Doppler Effect is incorporated into the SSA to further improve the node localization performance. In addition, the SSA-DE model defines the position of a node in an iterative manner using the Euclidean distance as the fitness function. The presented SSA-DE model is implemented in MATLAB R2014. An extensive set of experiments is carried out and the results are examined under a varying number of anchor nodes and ranging errors. The attained experimental outcomes ensured the superior efficiency of the SSA-DE technique over the existing techniques.
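The Euclidean-distance fitness used in such localization schemes can be sketched as the mean squared mismatch between measured anchor ranges and the ranges implied by a candidate position; the optimizer (here, SSA-DE in the paper) then searches for the position minimizing it. The sketch below is a generic version of that fitness, not the paper's code.

```python
import math

def localization_fitness(candidate, anchors, measured):
    """Mean squared error between measured anchor ranges and the
    distances implied by a candidate 2-D node position."""
    err = 0.0
    for (ax, ay), d in zip(anchors, measured):
        est = math.hypot(candidate[0] - ax, candidate[1] - ay)
        err += (est - d) ** 2
    return err / len(anchors)

anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
true_pos = (1.0, 1.0)
ranges = [math.hypot(1 - ax, 1 - ay) for ax, ay in anchors]
# the true position has (near-)zero fitness; a wrong guess does not
assert localization_fitness(true_pos, anchors, ranges) < 1e-12
assert localization_fitness((3.0, 2.0), anchors, ranges) > 0.1
```

With noisy ranging, the minimum no longer reaches zero, which is why an iterative metaheuristic search is used instead of exact trilateration.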
Sustainable forest management heavily relies on the accurate estimation of tree parameters. Among others, the diameter at breast height (DBH) is important for extracting the volume and mass of an individual tree. For systematically estimating the volume of entire plots, airborne laser scanning (ALS) data are used. The estimation model is frequently calibrated using manual DBH measurements or static terrestrial laser scans (STLS) of sample plots. Although reliable, this method is time-consuming, which greatly hampers its use. Here, handheld mobile terrestrial laser scanning (HMTLS) was demonstrated to be a useful alternative technique to precisely and efficiently calculate DBH. Different data acquisition techniques were applied at a sample plot, then the resulting parameters were comparatively analysed. The calculated DBH values were comparable to the manual measurements for the HMTLS, STLS, and ALS data sets. Given the comparability of the extracted parameters, despite the reduced point density of HMTLS compared to STLS data, and the reasonable gain in efficiency, with acquisition time reduced by a factor of 5 compared to conventional STLS techniques and a factor of 3 compared to manual measurements, HMTLS is considered a useful alternative technique.
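A common way to derive DBH from any of these point clouds is to take a horizontal stem slice at 1.3 m and fit a circle to it; DBH is twice the fitted radius. The sketch below uses the algebraic (Kasa) circle fit as an assumed, generic method, not the processing chain of the study.

```python
import math

def fit_dbh(points):
    """Estimate DBH as 2r from an algebraic (Kasa) circle fit to a
    horizontal stem slice: least squares on x^2+y^2 + D*x + E*y + F = 0."""
    sx = sy = sxx = syy = sxy = sz = szx = szy = 0.0
    n = float(len(points))
    for x, y in points:
        z = x * x + y * y
        sx += x; sy += y; sxx += x * x; syy += y * y; sxy += x * y
        sz += z; szx += z * x; szy += z * y
    # normal equations A [D, E, F]^T = rhs, solved via Cramer's rule
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [-szx, -szy, -sz]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(A)
    D, E, F = (det3([[rhs[r] if c == i else A[r][c] for c in range(3)]
                     for r in range(3)]) / d for i in range(3))
    cx, cy = -D / 2.0, -E / 2.0
    return 2.0 * math.sqrt(cx * cx + cy * cy - F)

# synthetic stem slice: circle of radius 0.2 m centred at (5, 7)
pts = [(5 + 0.2 * math.cos(t), 7 + 0.2 * math.sin(t))
       for t in (i * math.pi / 6 for i in range(12))]
assert abs(fit_dbh(pts) - 0.4) < 1e-6
```

Real slices contain occlusion and noise, so robust variants (RANSAC, iterative reweighting) are typically layered on top of this fit.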
Worldwide, cotton is the most profitable cash crop. Each year the production of this crop suffers because of several diseases. At an early stage, computerized methods are used for disease detection and may reduce the loss in cotton production. Although several methods have been proposed for the detection of cotton diseases, limitations remain because of low-quality images, size, shape, variations in orientation, and complex backgrounds. Due to these factors, there is a need for novel methods of feature extraction/selection for accurate cotton disease classification. Therefore, in this research, an optimized feature-fusion-based model is proposed, in which two pre-trained architectures called EfficientNet-b0 and Inception-v3 are utilized to extract features; each model extracts a feature vector of length N×1000. After that, the extracted features are serially concatenated, giving a feature vector of length N×2000. The most prominent features are selected using the Emperor Penguin Optimizer (EPO) method. The method is evaluated on two publicly available datasets, Kaggle cotton disease dataset-I and Kaggle cotton-leaf-infection-II. The EPO method returns feature vectors of length 1×755 and 1×824 using dataset-I and dataset-II, respectively. The classification is performed using 5-, 7-, and 10-fold cross-validation. The Quadratic Discriminant Analysis (QDA) classifier provides an accuracy of 98.9% on 5 folds, 98.96% on 7 folds, and 99.07% on 10 folds using Kaggle cotton disease dataset-I, while the Ensemble Subspace K Nearest Neighbor (KNN) provides 99.16% on 5 folds, 98.99% on 7 folds, and 99.27% on 10 folds using Kaggle cotton-leaf-infection dataset-II.
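The serial concatenation step described above (two N×1000 matrices fused into one N×2000 matrix) can be sketched as a row-wise join. The function name and the stand-in feature values are illustrative; the paper's actual features come from the fine-tuned EfficientNet-b0 and Inception-v3 networks.

```python
def serial_fuse(f1, f2):
    """Serially concatenate two per-sample feature matrices row-wise:
    two N x 1000 matrices become one N x 2000 matrix."""
    assert len(f1) == len(f2), "both extractors must see the same N samples"
    return [row1 + row2 for row1, row2 in zip(f1, f2)]

a = [[0.1] * 1000 for _ in range(3)]   # stand-in EfficientNet-b0 features
b = [[0.2] * 1000 for _ in range(3)]   # stand-in Inception-v3 features
fused = serial_fuse(a, b)
assert len(fused) == 3 and len(fused[0]) == 2000
```

A selector such as EPO then keeps only a subset of the 2000 columns (755 or 824 in the reported experiments) before classification.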
The Internet of Things (IoT) is transforming the technical setting of conventional systems and finds applicability in smart cities, smart healthcare, smart industry, etc. In addition, the application areas relating to IoT-enabled models are resource-limited and necessitate crisp responses, low latencies, and high bandwidth, which are beyond their abilities. Cloud computing (CC) is treated as a resource-rich solution to the above-mentioned challenges. But the intrinsic high latency of CC makes it nonviable. The longer latency degrades the outcome of IoT-based smart systems. CC is an emergent dispersed, inexpensive computing pattern with a massive assembly of heterogeneous autonomous systems. The effective use of task scheduling minimizes the energy utilization of the cloud infrastructure and raises the income of service providers by minimizing the processing time of user jobs. With this motivation, this paper presents an intelligent Chaotic Artificial Immune Optimization Algorithm for Task Scheduling (CAIOA-RS) in an IoT-enabled cloud environment. The proposed CAIOA-RS algorithm solves the issue of resource allocation in the IoT-enabled cloud environment. It also satisfies the makespan by carrying out the optimum task scheduling process with distinct strategies for incoming tasks. The design of the CAIOA-RS technique incorporates the concept of chaotic maps into the conventional AIOA to enhance its performance. A series of experiments was carried out on the CloudSim platform. The simulation results demonstrate that the CAIOA-RS technique outperforms the original version, as well as other heuristics and metaheuristics.
With the incorporation of distributed energy systems in the electric grid, the transactive energy market (TEM) has become popular in adaptively balancing demand and supply over the grid. The classical grid can be upgraded to the smart grid by integrating Information and Communication Technology (ICT) over the grid. The TEM allows Peer-to-Peer (P2P) energy trading in the grid, which effectually connects consumers and prosumers to trade energy among themselves. At the same time, there is a need to predict the load for effectual P2P energy trading, which can be accomplished by the use of machine learning models. Though some short-term load prediction techniques exist in the literature, it is still essential to take intrinsic features, parameter optimization, etc. into account. In this aspect, this study devises a new deep learning enabled short-term load forecasting model for P2P energy trading (DLSTLF-P2P) in TEM. The proposed model involves the design of an oppositional coyote optimization algorithm (OCOA) based feature selection technique, in which OCOA is derived by integrating the oppositional-based learning (OBL) concept with COA for an improved convergence rate. Moreover, deep belief networks (DBN) are employed for the prediction of load in the P2P energy trading systems. In order to further improve the predictive performance of the DBN model, a hyperparameter optimizer using the chicken swarm optimization (CSO) algorithm is applied for the optimal choice of DBN parameters. The simulation analysis of the proposed DLSTLF-P2P is validated using the UK Smart Meter dataset, and the obtained outcomes demonstrate the superiority of the DLSTLF-P2P technique with maximum training, testing, and validation accuracy of 90.17%, 87.39%, and 87.86%.
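The oppositional-based learning (OBL) idea mentioned above has a one-line core: for a candidate x in a search box [lower, upper], its opposite is lower + upper - x, and evaluating both candidates improves the odds of starting near the optimum. A minimal sketch:

```python
def opposite(x, lower, upper):
    """Oppositional-based learning: component-wise opposite of a
    candidate x within the search bounds [lower, upper]."""
    return [lo + hi - xi for xi, lo, hi in zip(x, lower, upper)]

x = [2.0, 7.0]
assert opposite(x, [0.0, 0.0], [10.0, 10.0]) == [8.0, 3.0]
```

In an oppositional metaheuristic such as OCOA, each generation keeps whichever of x and opposite(x) scores better under the fitness function.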
In recent times, the Internet of Things (IoT) has become a hot research topic; it aims at interlinking several sensor-enabled devices, mainly for data gathering and tracking applications. The Wireless Sensor Network (WSN) has been an important component of the IoT paradigm since its inception and has become the most preferred platform to deploy several smart city application areas like home automation, smart buildings, intelligent transportation, disaster management, and other such IoT-based applications. Clustering methods are widely employed energy-efficient techniques whose primary purpose is to balance the energy among sensor nodes. Clustering and routing are considered Non-Polynomial (NP) hard problems, and bio-inspired techniques have long been employed to resolve such problems. The current research paper designs an Energy Efficient Two-Tier Clustering with Multi-hop Routing Protocol (EETTC-MRP) for IoT networks. The presented EETTC-MRP technique operates in different stages, namely tentative Cluster Head (CH) selection, final CH selection, and routing. In the first stage of the proposed EETTC-MRP technique, a type II fuzzy logic-based tentative CH (T2FL-TCH) selection is used. Subsequently, a Quantum Group Teaching Optimization Algorithm-based Final CH selection (QGTOA-FCH) technique is deployed to derive an optimum group of CHs in the network. Besides, a Political Optimizer based Multi-hop Routing (PO-MHR) technique is employed to derive an optimal selection of routes between CHs in the network. In order to validate the efficacy of the EETTC-MRP method, a series of experiments was conducted and the outcomes were examined under distinct measures. The experimental analysis infers that the proposed EETTC-MRP technique is superior to other methods under different measures.
Funding: Partly supported by the Technology Development Program of MSS (No. S3033853) and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A4A1031509).
Abstract: Multi-label learning generalizes supervised single-label learning under the assumption that each sample in a dataset may belong to more than one class simultaneously. The main objective of this work is to create a novel framework for learning and classifying imbalanced multi-label data. This work proposes a two-phase framework. The imbalanced distribution of the multi-label dataset is addressed through the proposed Borderline MLSMOTE resampling method in phase 1. Later, an adaptive weighted l21-norm regularized (Elastic-net) multi-label logistic regression is used to predict unseen samples in phase 2. In contrast to conventional MLSMOTE, the proposed Borderline MLSMOTE resampling method focuses on samples with high label concurrence. The minority labels in these samples are called difficult minority labels and are more likely to degrade classification performance. The concurrence measure is used to identify borderline samples, and the labels associated with samples near the decision boundary are regarded as borderline labels. In phase 2, a novel adaptive l21-norm regularized weighted multi-label logistic regression is used to handle the balanced data with differently weighted synthetic samples. Experimentation on various benchmark datasets shows that the proposed method outperforms existing conventional state-of-the-art multi-label methods and delivers strong predictive performance.
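The Borderline MLSMOTE step oversamples difficult minority labels by SMOTE-style interpolation between a minority seed sample and one of its nearest minority neighbours. A minimal sketch of that interpolation rule (the function name and feature values are illustrative, not the authors' code):

```python
import random

def synthetic_sample(seed, neighbor, rng=random.Random(0)):
    """Interpolate a synthetic sample on the segment between a minority
    seed instance and one of its nearest minority neighbours (SMOTE rule)."""
    gap = rng.random()  # position along the segment, in [0, 1)
    return [s + gap * (n - s) for s, n in zip(seed, neighbor)]

new = synthetic_sample([1.0, 2.0], [3.0, 4.0])
# each coordinate lies between the seed and the neighbour
assert 1.0 <= new[0] <= 3.0 and 2.0 <= new[1] <= 4.0
```

Borderline variants restrict the seeds to samples near the decision boundary rather than interpolating from every minority instance.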
Funding: This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1F1A1063319).
Abstract: In recent times, sixth generation (6G) communication technologies have become a hot research topic because they promise maximum throughput and low-delay services for mobile users. 6G encompasses several heterogeneous resources and communication standards to ensure incessant service availability. At the same time, the development of 6G enables Unmanned Aerial Vehicles (UAVs) to offer cost- and time-efficient solutions to several applications like healthcare, surveillance, disaster management, etc. In UAV networks, energy efficiency and data collection are considered the major processes for high-quality network communication. However, these procedures are challenging because of high mobility, unstable links, dynamic topology, and energy-restricted UAVs. These issues can be addressed by the use of artificial intelligence (AI) and energy-efficient clustering techniques for UAVs in the 6G environment. With this inspiration, this work designs an artificial intelligence enabled cooperative cluster-based data collection technique for unmanned aerial vehicles (AECCDC-UAV) in the 6G environment. The proposed AECCDC-UAV technique aims to divide the UAV network into different clusters and allocate a cluster head (CH) to each cluster in such a way that the energy consumption (ECM) is minimized. The presented AECCDC-UAV technique involves a quasi-oppositional shuffled shepherd optimization (QOSSO) algorithm to select the CHs and construct clusters. The QOSSO algorithm derives a fitness function involving three input parameters: residual energy of UAVs, distance to neighboring UAVs, and degree of UAVs. The performance of the AECCDC-UAV technique is validated in many aspects, and the obtained experimental values demonstrate promising results over recent state-of-the-art methods.
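The QOSSO fitness combines residual energy, distance to neighbouring UAVs, and node degree. A hedged sketch of such a weighted fitness; the weights and the exact combination are assumptions, since the paper's formulation is not reproduced here:

```python
def ch_fitness(residual_energy, dist_to_neighbors, degree,
               w1=0.5, w2=0.3, w3=0.2):
    """Weighted fitness for cluster-head selection: prefer UAVs with high
    residual energy, short average distance to neighbours, and high degree.
    The weights w1..w3 are illustrative, not the paper's values."""
    avg_dist = sum(dist_to_neighbors) / len(dist_to_neighbors)
    return w1 * residual_energy + w2 * (1.0 / (1.0 + avg_dist)) + w3 * degree

# a UAV with more energy and closer neighbours scores higher
a = ch_fitness(0.9, [10.0, 12.0], degree=3)
b = ch_fitness(0.4, [40.0, 55.0], degree=3)
assert a > b
```

The optimizer would evaluate this fitness for every candidate CH set and keep the best-scoring configuration.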
Funding: Supported by a grant of the Korea Health Technology R&D Project through the Korea Health Industry Development Institute (KHIDI), funded by the Ministry of Health & Welfare, Republic of Korea (Grant Number: HI21C1831), and the Soonchunhyang University Research Fund.
Abstract: Recently, a massive quantity of data is being produced from a distinct number of sources, and the volume of data created daily on the Internet has crossed two exabytes. At the same time, clustering is one of the efficient techniques for mining big data to extract the useful and hidden patterns that exist in it. Density-based clustering techniques have gained significant attention because they help to effectively recognize complex patterns in spatial datasets. Big data clustering is a non-trivial process owing to the increasing quantity of data, which can be handled by the use of the MapReduce tool. With this motivation, this paper presents an efficient MapReduce-based hybrid density-based clustering and classification algorithm for big data analytics (MR-HDBCC). The proposed MR-HDBCC technique is executed on the MapReduce tool for handling big data. In addition, the MR-HDBCC technique involves three distinct processes, namely pre-processing, clustering, and classification. The proposed model utilizes the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) technique, which is capable of detecting arbitrary shapes and diverse clusters in noisy data. For improving the performance of the DBSCAN technique, a hybrid model using the cockroach swarm optimization (CSO) algorithm is developed for exploring the search space and determining the optimal parameters for density-based clustering. Finally, a bidirectional gated recurrent neural network (BGRNN) is employed for the classification of big data. The experimental validation of the proposed MR-HDBCC technique takes place using benchmark datasets, and the simulation outcomes demonstrate the promising performance of the proposed model in terms of different measures.
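DBSCAN, the density-based core of MR-HDBCC, declares a point a core point when at least `min_pts` neighbours lie within radius `eps`, and grows clusters outward from core points; everything unreachable is noise. A compact, self-contained sketch of plain DBSCAN (not the distributed MapReduce version, and the parameter values are illustrative):

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 means noise)."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
        if len(nbrs) < min_pts:          # not a core point -> provisionally noise
            labels[i] = -1
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(nbrs)
        while queue:                      # expand the cluster from core points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster       # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = [k for k in range(len(points)) if dist(points[j], points[k]) <= eps]
            if len(j_nbrs) >= min_pts:    # only core points keep expanding
                queue.extend(j_nbrs)
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (50, 50)]
labels = dbscan(pts, eps=2.0, min_pts=2)
assert labels[0] == labels[1] == labels[2]   # first dense group
assert labels[3] == labels[4]                # second dense group
assert labels[5] == -1                       # isolated point is noise
```

In the paper's hybrid, CSO would search for good `eps`/`min_pts` values instead of fixing them by hand.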
Abstract: The growing global requirement for food and the need for sustainable farming in an era of a changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models effectively deal with such challenges. This research paper comprehensively analyses recent advancements in crop yield prediction from January 2016 to March 2024. In addition, it analyses the effectiveness of various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield. The total number of articles reviewed for crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL), and DL-based prediction models and input parameter selection is 125. We conducted the research by setting up five objectives and discussing them after analyzing the selected research papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used for estimating model performance. We also discuss the ethical and social impacts of AI on agriculture. Although various approaches presented in the scientific literature have delivered impressive predictions, they are complicated due to the intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Therefore, thorough research is required to deal with the challenges in predicting agricultural output.
Funding: Ministry of Education, Youth and Sports of the Czech Republic, Grant/Award Numbers: SP2023/039, SP2023/042; the European Union under REFRESH, Grant/Award Number: CZ.10.03.01/00/22_003/0000048.
Abstract: Detecting brain tumours is complex due to the natural variation in their location, shape, and intensity in images. Although accurate detection and segmentation of brain tumours would be beneficial, current methods have yet to solve this problem despite the numerous available approaches. Precise analysis of Magnetic Resonance Imaging (MRI) is crucial for detecting, segmenting, and classifying brain tumours in medical diagnostics. MRI is a vital component of medical diagnosis, and it requires precise, efficient, careful, and reliable image analysis techniques. The authors developed a Deep Learning (DL) fusion model to classify brain tumours reliably. DL models require large amounts of training data to achieve good results, so the researchers utilised data augmentation techniques to increase the dataset size for training the models. VGG16, ResNet50, and convolutional deep belief networks extracted deep features from MRI images. Softmax was used as the classifier, and the training set was supplemented with intentionally created MRI images of brain tumours in addition to the genuine ones. The features of two DL models were combined in the proposed model to generate a fusion model, which significantly increased classification accuracy. An openly accessible dataset from the internet was used to test the model's performance, and the experimental results showed that the proposed fusion model achieved a classification accuracy of 98.98%. Finally, the results were compared with existing methods, and the proposed model outperformed them significantly.
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A4A1031509).
Abstract: Tourism is a popular activity that allows individuals to escape their daily routines and explore new destinations for various reasons, including leisure, pleasure, or business. A recent study has proposed a unique mathematical concept called a q-rung orthopair fuzzy hypersoft set (q-ROFHS) to enhance the formal representation of human thought processes and evaluate tourism carrying capacity. This approach can capture the imprecision and ambiguity often present in human perception. With the advanced mathematical tools in this field, the study has also incorporated the Einstein aggregation operator and score function into the q-ROFHS values to support multi-attribute decision-making algorithms. By implementing this technique, effective plans can be developed for social and economic development while avoiding detrimental effects such as overcrowding or environmental damage caused by tourism. A case study on selected tourism carrying capacity demonstrates the proposed methodology.
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. RS-2023-00218176), the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist), and the Soonchunhyang University Research Fund.
Abstract: In this article, multiple attribute decision-making problems are solved using the vague normal set (VNS). It is possible to generalize the vague set (VS) and q-rung fuzzy set (FS) into the q-rung vague set (VS). A log q-rung normal vague weighted averaging (log q-rung NVWA), a log q-rung normal vague weighted geometric (log q-rung NVWG), a log generalized q-rung normal vague weighted averaging (log Gq-rung NVWA), and a log generalized q-rung normal vague weighted geometric (log Gq-rung NVWG) operator are discussed in this article. A description is provided of the scoring function, accuracy function, and operational laws of the log q-rung VS. The algorithms underlying these functions are also described. A numerical example is provided to extend the Euclidean distance and the Hamming distance. Additionally, idempotency, boundedness, commutativity, and monotonicity of the log q-rung VS are examined, as they facilitate recognizing the optimal alternative more quickly and help clarify the conceptualization. We chose five anemia patients with four types of symptoms, including seizures, emotional shock or hysteria, brain cause, and high fever, who had either retrograde amnesia, anterograde amnesia, transient global amnesia, post-traumatic amnesia, or infantile amnesia. Natural numbers q are used to express the results of the models. To demonstrate the effectiveness and accuracy of the developed models, we compare them with several existing models.
Funding: Supported by the National Research Foundation (NRF) of South Korea (2020R1A2C1004720).
Abstract: In the present work, we have employed machine learning (ML) techniques to evaluate ductile-brittle (DB) behaviors in intermetallic compounds (IMCs) which can form in magnesium (Mg) alloys. This procedure was mainly conducted by a proxy-based method, where the ratio of shear (G) to bulk (B) moduli was used as a proxy to identify whether a compound is ductile or brittle. Starting from compound information (composition and crystal structure) and moduli found in open databases (AFLOW), ML-based models were built, and those models were used to predict the moduli of other compounds and, accordingly, to foresee the ductile-brittle behaviors of these new compounds. The results reached in the present work showed that the built models can effectively capture the elastic moduli of new compounds. This was confirmed through moduli calculations done by density functional theory (DFT) on some compounds, where the DFT calculations were consistent with the ML predictions. A further confirmation of the reliability of the built ML models was obtained by relating the DB behavior in MgBe_(13) and MgPd_(2), as evaluated by the ML-predicted moduli, to the nature of chemical bonding in these two compounds, which in turn was investigated by the charge density distribution (CDD) and electron localization function (ELF) obtained by the DFT methodology. The ML-evaluated DB behaviors of the two compounds were also consistent with the DFT calculations of CDD and ELF. These findings and confirmations gave legitimacy to the built model to be employed in further prediction processes. Indeed, as examples, the DB characteristics were investigated in IMCs that might form in three Mg alloy series, namely AZ, ZX, and WE.
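The G/B proxy follows Pugh's criterion: compounds with a high shear-to-bulk modulus ratio tend to be brittle, and those with a low ratio ductile. A sketch of that decision rule; the 0.571 cut-off is the commonly quoted Pugh threshold and the example moduli are approximate textbook values, not figures taken from the paper:

```python
def ductile_or_brittle(G, B, threshold=0.571):
    """Pugh's proxy: a high shear-to-bulk modulus ratio (G/B) indicates
    brittleness; a low ratio indicates ductility. The 0.571 threshold is
    the widely used Pugh criterion, not necessarily the paper's value."""
    return "brittle" if G / B > threshold else "ductile"

# pure Mg (G ~ 17 GPa, B ~ 35 GPa) comes out ductile under this proxy
assert ductile_or_brittle(17, 35) == "ductile"
# a compound with G comparable to B is flagged brittle
assert ductile_or_brittle(48, 45) == "brittle"
```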
Funding: Supported by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and the Soonchunhyang University Research Fund.
Abstract: Human Activity Recognition (HAR) has been made simple in recent years, thanks to recent advancements made in Artificial Intelligence (AI) techniques. These techniques are applied in several areas like security, surveillance, healthcare, human-robot interaction, and entertainment. Since wearable sensor-based HAR systems include in-built sensors, human activities can be categorized based on sensor values. Further, HAR can also be employed in other applications such as gait diagnosis, observation of children's/adults' cognitive nature, stroke-patient hospital direction, and epilepsy and Parkinson's disease examination. Recently developed AI techniques, especially Deep Learning (DL) models, can be deployed to accomplish effective outcomes in the HAR process. With this motivation, the current research paper focuses on designing an Intelligent Hyperparameter Tuned Deep Learning-based HAR (IHPTDL-HAR) technique for the healthcare environment. The proposed IHPTDL-HAR technique aims at recognizing human actions in the healthcare environment and helps patients in managing their healthcare service. In addition, the presented model makes use of a Hierarchical Clustering (HC)-based outlier detection technique to remove the outliers. The IHPTDL-HAR technique incorporates a DL-based Deep Belief Network (DBN) model to recognize the activities of users. Moreover, the Harris Hawks Optimization (HHO) algorithm is used for hyperparameter tuning of the DBN model. Finally, a comprehensive experimental analysis was conducted upon a benchmark dataset and the results were examined under different aspects. The experimental results demonstrate that the proposed IHPTDL-HAR technique is a superior performer compared to other recent techniques under different measures.
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Saud University for funding this work through research group number RG-1441-425.
Abstract: Agriculture is an important research area in the field of visual recognition by computers. Plant diseases affect the quality and yields of agriculture. Early-stage identification of crop disease decreases financial losses and positively impacts crop quality. The manual identification of crop diseases, which are mostly visible on leaves, is a very time-consuming and costly process. In this work, we propose a new framework for the recognition of cucumber leaf diseases. The proposed framework is based on deep learning and involves the fusion and selection of the best features. In the feature extraction phase, VGG (Visual Geometry Group) and Inception V3 deep learning models are considered and fine-tuned. Both fine-tuned models are trained using deep transfer learning. Features are then extracted and fused using a parallel maximum fusion approach. Next, the best features are selected using the Whale Optimization Algorithm. The best-selected features are classified using supervised learning algorithms for the final classification process. The experimental process was conducted on a privately collected dataset that consists of five types of cucumber disease and achieved an accuracy of 96.5%. A comparison with recent techniques shows the significance of the proposed method.
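Parallel maximum fusion combines two equal-length deep-feature vectors by taking the element-wise maximum, so the stronger activation from either network survives. A minimal illustration with toy feature values (the real vectors would come from the fine-tuned VGG and Inception V3 models):

```python
def max_fusion(feat_a, feat_b):
    """Parallel maximum fusion: element-wise maximum of two equal-length
    deep-feature vectors (an illustrative sketch, toy values below)."""
    assert len(feat_a) == len(feat_b), "feature vectors must align"
    return [max(a, b) for a, b in zip(feat_a, feat_b)]

fused = max_fusion([0.1, 0.9, 0.3], [0.4, 0.2, 0.8])
assert fused == [0.4, 0.9, 0.8]
```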
Funding: Biotechnology and Biological Sciences Research Council, Grant/Award Number: RM32G0178B8; MRC, Grant/Award Number: MC_PC_17171; Royal Society, Grant/Award Number: RP202G0230; BHF, Grant/Award Number: AA/18/3/34220; Hope Foundation for Cancer Research, Grant/Award Number: RM60G0680; GCRF, Grant/Award Number: P202PF11; Sino-UK Industrial Fund, Grant/Award Number: RP202G0289; LIAS, Grant/Award Numbers: P202ED10, P202RE969; Data Science Enhancement Fund, Grant/Award Number: P202RE237; Fight for Sight, Grant/Award Number: 24NN201; Sino-UK Education Fund, Grant/Award Number: OP202006.
Abstract: Speech emotion recognition (SER) is an important research problem in human-computer interaction systems. The representation and extraction of features are significant challenges in SER systems. Despite the promising results of recent studies, they generally do not leverage progressive fusion techniques for effective feature representation and increasing receptive fields. To mitigate this problem, this article proposes DeepCNN, which fuses spectral and temporal features of emotional speech by parallelising convolutional neural networks (CNNs) and a convolution layer-based transformer. Two parallel CNNs are applied to extract the spectral feature (2D-CNN) and temporal feature (1D-CNN) representations. A 2D-convolution layer-based transformer module extracts spectro-temporal features and concatenates them with the features from the parallel CNNs. The learnt low-level concatenated features are then applied to a deep framework of convolutional blocks, which retrieves a high-level feature representation and subsequently categorises the emotional states using an attention gated recurrent unit and a classification layer. This fusion technique results in a deeper hierarchical feature representation at a lower computational cost while simultaneously expanding the filter depth and reducing the feature map. The Berlin Database of Emotional Speech (EMO-BD) and Interactive Emotional Dyadic Motion Capture (IEMOCAP) datasets are used in experiments to recognise distinct speech emotions. With efficient spectral and temporal feature representation, the proposed SER model achieves 94.2% accuracy on the EMO-BD dataset and 81.1% accuracy on the IEMOCAP dataset. The proposed SER system, DeepCNN, outperforms the baseline SER systems in terms of emotion recognition accuracy on both datasets.
Funding: Supported by the Technology Development Program of MSS (No. S3033853).
Abstract: Recently, computation offloading has become an effective method for overcoming the constraints of a mobile device (MD) by offloading computation-intensive, delay-sensitive application tasks to a remote cloud-based data center. Smart cities benefit from offloading to edge points. Consider a mobile edge computing (MEC) network spanning multiple regions, comprising N MDs and many access points, in which every MD has M independent real-time tasks. This study designs a new Task Offloading and Resource Allocation in IoT-based MEC using Deep Learning with Seagull Optimization (TORA-DLSGO) algorithm. The proposed TORA-DLSGO technique addresses the resource management issue in the MEC server, enabling an optimum offloading decision that minimizes the system cost. In addition, an objective function is derived based on minimizing energy consumption subject to the latency requirements and restricted resources. The TORA-DLSGO technique uses the deep belief network (DBN) model for optimum offloading decision-making. Finally, the SGO algorithm is used for the parameter tuning of the DBN model. The simulation results exemplify that the TORA-DLSGO technique outperforms the existing models in reducing client overhead in MEC systems, with a maximum reward of 0.8967.
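At its simplest, an offloading decision of the kind described compares a weighted energy/latency cost of local execution against the cost of transmitting the task and waiting for the remote result. A toy sketch under assumed weights and a simplified cost model (not the paper's DBN-based decision-maker):

```python
def offload_decision(local_energy, local_delay, tx_energy, remote_delay,
                     w_energy=0.5, w_delay=0.5):
    """Pick local execution or offloading by comparing weighted
    energy/latency costs. Weights and the additive cost model are
    illustrative assumptions, not the paper's objective."""
    local_cost = w_energy * local_energy + w_delay * local_delay
    offload_cost = w_energy * tx_energy + w_delay * remote_delay
    return "offload" if offload_cost < local_cost else "local"

# a heavy task that is cheap to transmit should be offloaded
assert offload_decision(local_energy=8.0, local_delay=6.0,
                        tx_energy=1.0, remote_delay=2.0) == "offload"
# a light task stays on the device
assert offload_decision(local_energy=1.0, local_delay=1.0,
                        tx_energy=5.0, remote_delay=9.0) == "local"
```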
Funding: This research was partly supported by the Technology Development Program of MSS [No. S3033853] and by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2020R1I1A3069700).
Abstract: Precision agriculture includes the optimum and adequate use of resources depending on several variables that govern crop yield. Precision agriculture offers a novel solution, utilizing a systematic technique for current agricultural problems like balancing production and environmental concerns. Weed control has become one of the significant problems in the agricultural sector. In traditional weed control, the entire field is treated uniformly, spraying the soil, weeds, and crops with a single herbicide dose in the same way. For more precise farming, robots could accomplish targeted weed treatment if they could specifically find the location of the dispensable plant and identify the weed type. This could reduce the use of agrochemicals on agricultural fields by a large margin and favour sustainable agriculture. This study presents a Harris Hawks Optimizer with Graph Convolutional Network based Weed Detection (HHOGCN-WD) technique for precision agriculture. The HHOGCN-WD technique mainly focuses on identifying and classifying weeds for precision agriculture. For image pre-processing, the HHOGCN-WD model utilizes a bilateral normal filter (BNF) for noise removal. In addition, a coupled convolutional neural network (CCNet) model is utilized to derive a set of feature vectors. To detect and classify weeds, the GCN model is utilized with the HHO algorithm as a hyperparameter optimizer to improve the detection performance. The experimental results of the HHOGCN-WD technique are investigated under a benchmark dataset. The results indicate the promising performance of the presented HHOGCN-WD model over other recent approaches, with an increased accuracy of 99.13%.
Funding: This research was supported by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and the Soonchunhyang University Research Fund.
Abstract: Wireless sensor networks (WSN) comprise a set of numerous cheap sensors placed in a target region. A primary function of the WSN is to provide the location details of an event occurrence or a node. A major challenge in WSN is node localization, which plays an important role in data gathering applications. Since GPS is expensive and inaccurate in indoor regions, effective node localization techniques are needed. The major intention of localization is to determine the position of a node in a short period with minimum computation. To achieve this, bio-inspired algorithms are used, and node localization is treated as an optimization problem in a multidimensional space. This paper introduces a new Sparrow Search Algorithm with Doppler Effect (SSA-DE) for node localization in wireless networks. The SSA is generally inspired by the group wisdom, foraging, and anti-predation behaviors of sparrows. Besides, the Doppler Effect is incorporated into the SSA to further improve the node localization performance. In addition, the SSA-DE model determines the position of a node in an iterative manner using the Euclidean distance as the fitness function. The presented SSA-DE model is implemented in MATLAB R2014. An extensive set of experiments is carried out, and the results are examined under a varying number of anchor nodes and ranging errors. The attained experimental outcomes ensured the superior efficiency of the SSA-DE technique over the existing techniques.
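Treating localization as optimization means scoring a candidate position by how well its Euclidean distances to the anchor nodes match the measured ranges; the optimizer then searches for the position with minimal error. A sketch of such a distance-based fitness (the exact fitness used in SSA-DE may differ in detail):

```python
from math import dist

def localization_fitness(candidate, anchors, measured):
    """Sum of squared errors between a candidate node position's distances
    to the anchors and the measured (ranged) distances. SSA-DE minimises a
    fitness of this general form; this sketch is illustrative."""
    return sum((dist(candidate, a) - m) ** 2 for a, m in zip(anchors, measured))

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
measured = [dist(true_pos, a) for a in anchors]  # noise-free ranges

# the true position has (near-)zero fitness; a wrong guess scores worse
assert localization_fitness(true_pos, anchors, measured) < 1e-9
assert localization_fitness((8.0, 8.0), anchors, measured) > 1.0
```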
Funding: Funded by University College Ghent and Ghent University.
Abstract: Sustainable forest management heavily relies on the accurate estimation of tree parameters. Among others, the diameter at breast height (DBH) is important for extracting the volume and mass of an individual tree. For systematically estimating the volume of entire plots, airborne laser scanning (ALS) data are used. The estimation model is frequently calibrated using manual DBH measurements or static terrestrial laser scans (STLS) of sample plots. Although reliable, this method is time-consuming, which greatly hampers its use. Here, handheld mobile terrestrial laser scanning (HMTLS) was demonstrated to be a useful alternative technique to precisely and efficiently calculate DBH. Different data acquisition techniques were applied at a sample plot, and the resulting parameters were comparatively analysed. The calculated DBH values were comparable to the manual measurements for the HMTLS, STLS, and ALS data sets. Given the comparability of the extracted parameters despite the reduced point density of HMTLS relative to STLS data, and the reasonable gain in performance, with acquisition time reduced by a factor of 5 compared to conventional STLS techniques and a factor of 3 compared to manual measurements, HMTLS is considered a useful alternative technique.
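Extracting DBH from a laser-scanned stem typically means fitting a circle to the cross-section points at breast height (1.3 m); DBH is then twice the fitted radius. A self-contained sketch using the algebraic (Kåsa) least-squares circle fit, which is one common choice rather than necessarily the authors' method:

```python
import math

def fit_circle(points):
    """Kasa fit: solve the linearised circle equation
    x^2 + y^2 = u*x + v*y + w by least squares (3x3 normal equations),
    then recover centre (u/2, v/2) and radius sqrt(w + a^2 + b^2)."""
    n = len(points)
    Sx = sum(x for x, _ in points); Sy = sum(y for _, y in points)
    Sxx = sum(x * x for x, _ in points); Syy = sum(y * y for _, y in points)
    Sxy = sum(x * y for x, y in points)
    Sxz = sum(x * (x * x + y * y) for x, y in points)
    Syz = sum(y * (x * x + y * y) for x, y in points)
    Sz = Sxx + Syy
    A = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    rhs = [Sxz, Syz, Sz]
    for i in range(3):                     # tiny Gaussian elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * b for a, b in zip(A[j], A[i])]
            rhs[j] -= f * rhs[i]
    sol = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                    # back-substitution
        sol[i] = (rhs[i] - sum(A[i][j] * sol[j] for j in range(i + 1, 3))) / A[i][i]
    a, b = sol[0] / 2, sol[1] / 2
    r = math.sqrt(sol[2] + a * a + b * b)
    return (a, b), r

# synthetic slice: 20 points on a 0.15 m-radius stem at breast height
pts = [(0.15 * math.cos(t), 0.15 * math.sin(t))
       for t in [k * 2 * math.pi / 20 for k in range(20)]]
center, r = fit_circle(pts)
assert abs(2 * r - 0.30) < 1e-6   # recovered DBH is 0.30 m
```

On real scans the slice is first extracted from the point cloud and outliers (branches, noise) are filtered before fitting.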
Funding: Supported by the Technology Development Program of MSS [No. S3033853] and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A4A1031509).
Abstract: Worldwide, cotton is the most profitable cash crop. Each year the production of this crop suffers because of several diseases. Computerized methods for early-stage disease detection may reduce the loss in cotton production. Although several methods have been proposed for the detection of cotton diseases, limitations remain because of low-quality images, size, shape, variations in orientation, and complex backgrounds. Due to these factors, there is a need for novel methods of feature extraction/selection for accurate cotton disease classification. Therefore, in this research, an optimized feature-fusion-based model is proposed, in which two pre-trained architectures called EfficientNet-b0 and Inception-v3 are utilized to extract features; each model extracts a feature vector of length N×1000. After that, the extracted features are serially concatenated, giving a feature vector of length N×2000. The most prominent features are selected using the Emperor Penguin Optimizer (EPO) method. The method is evaluated on two publicly available datasets, the Kaggle cotton disease dataset-I and the Kaggle cotton-leaf-infection dataset-II. The EPO method returns feature vectors of length 1×755 and 1×824 using dataset-I and dataset-II, respectively. The classification is performed using 5-, 7-, and 10-fold cross-validation. The Quadratic Discriminant Analysis (QDA) classifier provides an accuracy of 98.9% on 5 folds, 98.96% on 7 folds, and 99.07% on 10 folds using the Kaggle cotton disease dataset-I, while the Ensemble Subspace K Nearest Neighbor (KNN) provides 99.16% on 5 folds, 98.99% on 7 folds, and 99.27% on 10 folds using the Kaggle cotton-leaf-infection dataset-II.
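Serial concatenation followed by index selection can be sketched in a few lines. The vector sizes follow the abstract (1000 + 1000 features per sample), while the selected indices below are arbitrary placeholders for what the EPO search would actually return:

```python
def serial_fusion(feat_a, feat_b):
    """Serially concatenate two per-sample feature vectors
    (e.g. 1000 + 1000 -> 2000 features per sample)."""
    return feat_a + feat_b

def select_features(fused, selected_idx):
    """Keep only the feature positions chosen by the optimiser; in the
    paper EPO supplies selected_idx, here the indices are illustrative."""
    return [fused[i] for i in selected_idx]

a = [0.1] * 1000   # stand-in for an EfficientNet-b0 feature vector
b = [0.2] * 1000   # stand-in for an Inception-v3 feature vector
fused = serial_fusion(a, b)
assert len(fused) == 2000
picked = select_features(fused, list(range(0, 2000, 3)))
assert len(picked) == 667
```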
Funding: This research was supported by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and the Soonchunhyang University Research Fund.
Abstract: The Internet of Things (IoT) is transforming the technical setting of conventional systems and finds applicability in smart cities, smart healthcare, smart industry, etc. In addition, the application areas relating to IoT-enabled models are resource-limited and necessitate crisp responses, low latencies, and high bandwidth, which are beyond their abilities. Cloud computing (CC) is treated as a resource-rich solution to the above-mentioned challenges, but its intrinsically high latency makes it nonviable, as the longer latency degrades the outcome of IoT-based smart systems. CC is an emergent, dispersed, inexpensive computing pattern with a massive assembly of heterogeneous autonomous systems. The effective use of task scheduling minimizes the energy utilization of the cloud infrastructure and raises the income of service providers by minimizing the processing time of user jobs. With this motivation, this paper presents an intelligent Chaotic Artificial Immune Optimization Algorithm for Task Scheduling (CAIOA-RS) in the IoT-enabled cloud environment. The proposed CAIOA-RS algorithm solves the issue of resource allocation in the IoT-enabled cloud environment. It also satisfies the makespan by carrying out the optimum task scheduling process with distinct strategies for incoming tasks. The design of the CAIOA-RS technique incorporates the concept of chaotic maps into the conventional AIOA to enhance its performance. A series of experiments were carried out on the CloudSim platform. The simulation results demonstrate that the CAIOA-RS technique outperforms the original version, as well as other heuristics and metaheuristics.
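Scheduling quality here is judged by the makespan, the finish time of the busiest machine. A simple longest-processing-time (LPT) greedy baseline illustrates the objective that CAIOA-RS optimizes; this is a standard heuristic for comparison, not the proposed algorithm:

```python
def greedy_schedule(task_lengths, n_vms):
    """LPT heuristic: assign each task (longest first) to the currently
    least-loaded VM and return the makespan, i.e. the finish time of the
    busiest VM. A baseline for comparison, not the CAIOA-RS scheduler."""
    loads = [0.0] * n_vms
    for t in sorted(task_lengths, reverse=True):
        i = loads.index(min(loads))   # least-loaded VM so far
        loads[i] += t
    return max(loads)

# 2 VMs, tasks summing to 18: LPT packs them into a perfect makespan of 9
assert greedy_schedule([7, 5, 3, 2, 1], 2) == 9
```

A metaheuristic such as CAIOA-RS searches over task-to-VM assignments to push the makespan (and energy cost) below what such greedy rules achieve.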
Funding: This research was supported by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korea Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and the Soonchunhyang University Research Fund.
Abstract: With the incorporation of distributed energy systems into the electric grid, the transactive energy market (TEM) has become popular for adaptively balancing demand and supply over the grid. The classical grid can be upgraded to the smart grid by integrating Information and Communication Technology (ICT) over the grid. The TEM allows Peer-to-Peer (P2P) energy trading in the grid, which effectively connects consumers and prosumers to trade energy among themselves. At the same time, there is a need to predict the load for effective P2P energy trading, which can be accomplished by the use of machine learning (ML) models. Though some short-term load prediction techniques exist in the literature, it is still essential to take intrinsic features, parameter optimization, etc. into account. In this aspect, this study devises a new deep learning enabled short-term load forecasting model for P2P energy trading (DLSTLF-P2P) in TEM. The proposed model involves the design of an oppositional coyote optimization algorithm (OCOA) based feature selection technique, in which OCOA is derived by integrating the oppositional-based learning (OBL) concept with COA for an improved convergence rate. Moreover, deep belief networks (DBN) are employed for the prediction of load in P2P energy trading systems. To further improve the predictive performance of the DBN model, a hyperparameter optimizer based on the chicken swarm optimization (CSO) algorithm is applied for the optimal choice of DBN parameters. The simulation analysis of the proposed DLSTLF-P2P model is validated using the UK Smart Meter dataset, and the obtained outcomes demonstrate the superiority of the DLSTLF-P2P technique, with maximum training, testing, and validation accuracies of 90.17%, 87.39%, and 87.86%.
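Oppositional-based learning evaluates each candidate together with its opposite point in the search interval, which often accelerates convergence because one of the pair tends to be closer to the optimum. A minimal sketch of the opposition rule used to derive OCOA-style candidates (the bounds and values are illustrative):

```python
def opposite(x, lo, hi):
    """Opposition-based learning: the opposite of a candidate value x in
    the search interval [lo, hi] is lo + hi - x. OCOA evaluates both a
    candidate and its opposite to speed up convergence."""
    return lo + hi - x

def oppose_candidate(candidate, bounds):
    """Apply the opposition rule dimension-wise to a candidate solution."""
    return [opposite(x, lo, hi) for x, (lo, hi) in zip(candidate, bounds)]

assert opposite(2.0, 0.0, 10.0) == 8.0
assert oppose_candidate([1.0, 4.0], [(0.0, 5.0), (2.0, 6.0)]) == [4.0, 4.0]
```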
Funding: Shabnam Mohamed Aslam would like to thank the Deanship of Scientific Research at Majmaah University for supporting this work under Project No. R-2021-242.
Abstract: In recent times, the Internet of Things (IoT) has become a hot research topic; it aims at interlinking several sensor-enabled devices, mainly for data gathering and tracking applications. The Wireless Sensor Network (WSN) has been an important component of the IoT paradigm since its inception and has become the most preferred platform for deploying several smart city application areas like home automation, smart buildings, intelligent transportation, disaster management, and other such IoT-based applications. Clustering methods are widely employed energy-efficient techniques whose primary purpose is to balance the energy among sensor nodes. Clustering and routing processes are considered Non-Polynomial (NP) hard problems, and bio-inspired techniques have long been employed to resolve such problems. The current research paper designs an Energy Efficient Two-Tier Clustering with Multi-hop Routing Protocol (EETTC-MRP) for IoT networks. The presented EETTC-MRP technique operates in different stages, namely tentative Cluster Head (CH) selection, final CH selection, and routing. In the first stage of the proposed EETTC-MRP technique, a type II fuzzy logic-based tentative CH (T2FL-TCH) selection is used. Subsequently, a Quantum Group Teaching Optimization Algorithm-based Final CH selection (QGTOA-FCH) technique is deployed to derive an optimum group of CHs in the network. Besides, a Political Optimizer based Multihop Routing (PO-MHR) technique is employed to derive an optimal selection of routes between CHs in the network. To validate the efficacy of the EETTC-MRP method, a series of experiments was conducted and the outcomes were examined under distinct measures. The experimental analysis infers that the proposed EETTC-MRP technique is superior to other methods under different measures.