Over the last decade, mobile ad hoc networks have expanded dramatically in popularity, and their impact on the communication sector at many levels is enormous. Their uses have expanded in lockstep with this growth. Because usage is unstable and numerous nodes communicate data concurrently, adequate channel and forwarder selection is essential. In this proposed design for a Cognitive Radio Cognitive Network (CRCN), we establish the confidence of each forwarding node by contacting one-hop and second-level nodes, obtaining reports from them, and selecting the forwarder accordingly with the help of an optimization technique. We then concentrate on channel selection and, lastly, on the transmission of data packets via the designated forwarder. The simulation work is validated using MATLAB. Additionally, the steps show how a node acts as a confident forwarder and shares the channel in a compatible manner, allowing more packet bits to be transmitted by conveniently selecting the channel between nodes. We calculate the confidence of a node at the start of the network by combining the reliability report for the first hop with the reliability report for the second hop, and we then treat that node as the confident node so that it can operate as a forwarder. As a result, the residual energy in the output increases, and the percentage of data packets delivered also improves.
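As an illustration of the confidence step described above, the following minimal Python sketch combines a node's one-hop reliability report with the reports gathered from second-level nodes and picks the highest-scoring neighbour as the forwarder. The weighting factor, report format, and function names are assumptions for illustration, not the paper's formulation.

```python
# Hypothetical sketch: blend first-hop and second-hop reliability reports into a
# confidence score and pick the most confident neighbour as forwarder.
# The weighting (alpha) and the report format are assumptions.

def node_confidence(first_hop_report, second_hop_reports, alpha=0.6):
    """Weighted blend of the direct (one-hop) report and the average of the
    reports gathered from second-level nodes."""
    second_hop_avg = sum(second_hop_reports) / len(second_hop_reports)
    return alpha * first_hop_report + (1.0 - alpha) * second_hop_avg

def select_forwarder(neighbours):
    """neighbours: dict mapping node id -> (first_hop_report, [second_hop_reports])."""
    return max(
        neighbours,
        key=lambda n: node_confidence(neighbours[n][0], neighbours[n][1]),
    )

# Example: node 'B' has the strongest combined reliability, so it is chosen.
reports = {
    "A": (0.70, [0.60, 0.65]),
    "B": (0.85, [0.80, 0.75]),
    "C": (0.55, [0.90, 0.40]),
}
print(select_forwarder(reports))  # -> 'B'
```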
Soil is the major source of life on Earth, and soil quality plays a significant role in agricultural practice everywhere. Hence, the evaluation of soil quality is very important for determining the amount of nutrients that the soil requires for a proper yield. In the present decade, the application of deep learning models in many fields of research has had a great impact. The increasing availability of soil data and the growing demand for remotely available open-source models motivate the use of deep learning to predict soil quality. With that concern, this paper proposes a novel model called the Improved Soil Quality Prediction Model using Deep Learning (ISQP-DL). The work considers the chemical, physical, and biological factors of soil in a particular area to estimate soil quality. Firstly, pH ratings of soil samples are collected from a soil-testing laboratory, from which the acidic range is categorized through soil tests, and these data are given as the first input to a Deep Neural Network Regression (DNNR) model. Secondly, soil nutrient data are given as the second input to the DNNR model. Using this dataset, the DNNR method evaluates the fertility rate, from which soil quality is estimated. The model uses the DNNR for training and testing on the dataset. The results show that the proposed model is effective for soil quality prediction (SQP), with good fitting and generality that improve with the input features and a high classification accuracy. The proposed model achieves an accuracy rate of 96.7%, outperforming existing models.
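A minimal sketch of the regression stage follows, assuming the pH category and the nutrient readings are concatenated into a single feature vector per sample and using scikit-learn's MLPRegressor as a stand-in for the DNNR; the feature names, synthetic data, and network sizes are illustrative assumptions.

```python
# Sketch of the DNNR-style regression stage under assumed inputs [pH, N, P, K];
# the target is a synthetic "fertility rate" used only to make the example run.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic placeholder data: one [pH, N, P, K] vector per sample.
X = rng.uniform([4.5, 10, 5, 50], [8.5, 120, 60, 300], size=(500, 4))
y = 0.3 * X[:, 1] + 0.2 * X[:, 2] + 0.1 * X[:, 3] - 5.0 * np.abs(X[:, 0] - 6.8)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

dnnr = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
dnnr.fit(X_train, y_train)
print("R^2 on held-out samples:", dnnr.score(X_test, y_test))
```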
Automated biomedical signal processing has become an essential process for determining indicators of diseased states. At the same time, the latest developments in artificial intelligence (AI) techniques make it possible to manage and analyze massive amounts of biomedical data, supporting clinical decisions and real-time applications. These techniques can be employed for medical imaging; however, the recognition of 1D biomedical signals still needs to be improved. The electrocardiogram (ECG) is one of the most widely used 1-dimensional biomedical signals and is used to diagnose cardiovascular diseases. Computer-assisted diagnostic models find it difficult to automatically classify 1D ECG signals owing to their time-varying dynamics and diverse profiles. To resolve these issues, this study designs an automated deep learning based 1D biomedical ECG signal recognition model for cardiovascular disease diagnosis (DLECG-CVD). The DLECG-CVD model involves different stages of operation, namely pre-processing, feature extraction, hyperparameter tuning, and classification. At the initial stage, data pre-processing takes place to convert the ECG report into valuable data and transform it into a format suitable for further processing. In addition, a deep belief network (DBN) model is applied to derive a set of feature vectors. Besides, an improved swallow swarm optimization (ISSO) algorithm is used for hyperparameter tuning of the DBN model. Lastly, an extreme gradient boosting (XGBoost) classifier is employed to allocate proper class labels to the test ECG signals. To verify the improved diagnostic performance of the DLECG-CVD model, a set of simulations is carried out on the benchmark PTB-XL dataset. A detailed comparative study highlights the superiority of the DLECG-CVD model in terms of accuracy, sensitivity, specificity, kappa, Matthews correlation coefficient, and Hamming loss.
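The following sketch covers only the final classification stage: an XGBoost classifier assigning labels to feature vectors, with random vectors standing in for the DBN-derived ECG features; the feature dimension and class count are assumptions.

```python
# Sketch of the final stage only: XGBoost assigns class labels to feature vectors.
# Random data stands in for DBN-derived ECG features; 5 classes is an assumption.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 64))       # placeholder "DBN feature vectors"
y = rng.integers(0, 5, size=1000)     # placeholder diagnostic labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1,
                    objective="multi:softprob", eval_metric="mlogloss")
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```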
Owing to massive technological developments in the Internet of Things (IoT) and the cloud environment, cloud computing (CC) offers a highly flexible, heterogeneous resource pool over the network, and clients can exploit various resources on demand. IoT-enabled models are resource constrained yet require crisp responses, minimum latency, and maximum bandwidth, which are beyond their own capabilities; CC is treated as a resource-rich solution to this challenge. As high delay reduces the performance of an IoT-enabled cloud platform, efficient task scheduling (TS) reduces the energy usage of the cloud infrastructure and increases the income of the service provider by minimizing the processing time of user jobs. Therefore, this article concentrates on the design of an oppositional red fox optimization based task scheduling scheme (ORFO-TSS) for the IoT-enabled cloud environment. The presented ORFO-TSS model resolves the problem of allocating resources on the IoT-based cloud platform. It minimizes the makespan by performing optimal TS procedures over various aspects of incoming tasks. The design of the ORFO-TSS method incorporates the idea of opposition-based learning (OBL) into the traditional RFO approach to enhance its efficiency. A wide-ranging experimental analysis was performed on the CloudSim platform. The experimental outcomes highlight the efficacy of the ORFO-TSS technique over existing approaches.
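The opposition-based learning idea can be sketched directly: for a candidate solution x bounded by [lb, ub], its opposite is lb + ub - x, and the better of the pair is retained. The toy makespan objective and the task/VM figures below are assumptions used only to make the step runnable; they are not the paper's ORFO-TSS formulation.

```python
# Sketch of the OBL step that ORFO-TSS adds to the fox-style search: evaluate each
# candidate schedule and its "opposite", keep whichever has the smaller makespan.
import numpy as np

task_len = np.array([400, 250, 600, 150, 500])   # MI per task (assumed)
vm_speed = np.array([1000, 1500])                # MIPS per VM (assumed)

def makespan(assign):
    """assign[i] = index of the VM that runs task i."""
    loads = np.zeros(len(vm_speed))
    for t, v in enumerate(assign):
        loads[v] += task_len[t] / vm_speed[v]
    return loads.max()

def obl_step(population, n_vms):
    """Replace each candidate by its opposite if the opposite is better."""
    improved = []
    for assign in population:
        opposite = (n_vms - 1) - assign          # OBL: x_opp = lb + ub - x
        improved.append(min(assign, opposite, key=makespan))
    return improved

rng = np.random.default_rng(2)
pop = [rng.integers(0, 2, size=5) for _ in range(4)]
pop = obl_step(pop, n_vms=2)
print([makespan(a) for a in pop])
```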
Hyperspectral (HS) image classification is a hot research area owing to challenging issues such as high dimensionality and restricted training data. Precise recognition of features from HS images is important for effective classification outcomes. The recent advancements of deep learning (DL) models make this possible in several application areas, but the performance of DL models depends heavily on the hyperparameter setting, which can be resolved by the design of metaheuristics. In this view, this article develops an automated red deer algorithm with deep learning enabled hyperspectral image (HSI) classification (RDADL-HIC) technique. The proposed RDADL-HIC technique aims to effectively classify HSI images. The technique comprises a NASNetLarge model with an Adagrad optimizer, and the RDA with a gated recurrent unit (GRU) approach is used for the identification and classification of HSIs. The design of the Adagrad optimizer and the RDA helps to optimally tune the hyperparameters of the NASNetLarge and GRU models, respectively. The experimental results state the supremacy of the RDADL-HIC model, and the results are inspected in terms of different measures. The comparison study demonstrates the enhanced performance of the RDADL-HIC model over recent state-of-the-art approaches.
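A minimal Keras sketch of the GRU classification head alone is given below, treating each pixel's spectral bands as a sequence and compiling with the Adagrad optimizer; the NASNetLarge backbone and the RDA hyperparameter search are omitted, and the band and class counts are assumptions rather than values from the paper.

```python
# Sketch: GRU head over a spectral sequence, trained with Adagrad.
# Band count (200) and class count (16) are illustrative assumptions.
import numpy as np
import tensorflow as tf

n_bands, n_classes = 200, 16

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_bands, 1)),   # spectral bands as a sequence
    tf.keras.layers.GRU(64),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.01),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder spectra and labels, only to show the call pattern.
X = np.random.rand(32, n_bands, 1).astype("float32")
y = np.random.randint(0, n_classes, size=32)
model.fit(X, y, epochs=1, batch_size=8, verbose=0)
print(model.predict(X[:2]).shape)   # (2, 16) class probabilities
```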
In today's world, Cloud Computing (CC) enables users to access computing resources and services over the cloud without any need to own the infrastructure. Cloud Computing is a concept in which a network of devices, located in remote locations, is integrated to perform operations such as data collection, processing, data profiling, and data storage. In this context, resource allocation and task scheduling are important processes that must be managed based on the requirements of a user. In order to allocate resources effectively, a hybrid cloud is employed, since it is a capable solution for processing large-scale consumer applications in a pay-per-use manner. Hence, the model is designed as a profit-driven framework to reduce cost and makespan. With this motivation, the current research work develops a Cost-Effective Optimal Task Scheduling Model (CEOTS). A novel algorithm called the Target-based Cost Derivation (TCD) model is used in the proposed work for hybrid clouds. Moreover, the algorithm works on the basis of a multi-intentional task completion process with optimal resource allocation. The model was successfully simulated to validate its effectiveness based on factors such as processing time, makespan, and efficient utilization of virtual machines. The results infer that the proposed model outperforms existing works and can be relied upon in the future for real-time applications.
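A hypothetical scoring sketch of how such a profit-driven schedule might be evaluated is shown below: given a task-to-VM assignment across private and public (hybrid) resources, compute the makespan and the pay-per-use cost and blend them into one objective. The prices, speeds, and weighting factor are illustrative assumptions, not the paper's TCD model.

```python
# Sketch of a CEOTS-style objective: makespan plus monetary cost of an assignment
# across a hybrid (private + public) VM pool. All figures are assumed.
import numpy as np

task_len = np.array([800, 300, 1200, 450])        # task sizes in MI (assumed)
vm_speed = np.array([1000, 2000, 4000])           # MIPS: private, private, public
vm_price = np.array([0.0, 0.0, 0.12])             # $/hour; private VMs free here

def evaluate(assign, weight=0.5):
    """assign[i] = VM index for task i; returns (blended score, makespan, cost)."""
    exec_time = task_len / vm_speed[assign]                       # seconds per task
    loads = np.bincount(assign, weights=exec_time, minlength=len(vm_speed))
    makespan = loads.max()
    cost = float(np.sum(vm_price[assign] * exec_time / 3600.0))   # pay-per-use cost
    return weight * makespan + (1 - weight) * cost, makespan, cost

score, ms, cost = evaluate(np.array([0, 1, 2, 1]))
print(f"score={score:.3f}  makespan={ms:.2f}s  cost=${cost:.5f}")
```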