Funding: Supported by the National Natural Science Foundation of China under Grant No. 60902044; the Program for New Century Excellent Talents in University of the Ministry of Education of China (NCET-11-0510); the Hunan Provincial Innovation Foundation for Postgraduate under Grant No. CX2011B087; the State Scholarship Fund organized by the China Scholarship Council under Grant No. 2011637096; the Excellent Doctoral Dissertation Fund of Central South University under Grant No. 2011ybjz030; WCU R32-2010-000-20014-0 (Korea); FR 2010-0020942 (Korea); and MEST 2012-002521 (NRF Korea).
Abstract: Three kinds of quantum relay communication models are proposed: the quantum single-relay model, the quantum serial multi-relay model, and the quantum parallel multi-relay model. The channel capacities of these three systems are analyzed with the theory of the quantum Markov trace-preserving process and the generalized theory of the simple multi-hop channel in quantum systems. Motivated by the quantum Fano inequality, lower bounds on these channel capacities are derived. Illustrations and simulations present the trends of the lower bounds on the channel capacities of the different quantum relay systems over the depolarizing noisy channel.
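The abstract does not reproduce the bound itself, but the general shape of such an estimate can be sketched. The following is a minimal illustration, not the paper's derivation: it composes serial depolarizing hops into one effective depolarizing parameter and evaluates the standard hashing (coherent-information) lower bound on quantum capacity. The function names and the q = 3p/4 parameterization are assumptions of this sketch.

```python
import math

def h2(x):
    """Binary entropy in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

def serial_depolarizing(p, n_hops):
    """Effective depolarizing parameter after n_hops serial relays,
    for E(rho) = (1 - p) rho + p I/2: the input survives all hops
    with probability (1 - p)**n_hops."""
    return 1.0 - (1.0 - p) ** n_hops

def hashing_lower_bound(p):
    """Hashing (coherent-information) lower bound on the quantum
    capacity of a depolarizing channel with parameter p; rewriting
    the channel in Pauli form gives total error probability q = 3p/4,
    so the bound is 1 - H2(q) - q*log2(3), clipped at zero."""
    q = 0.75 * p
    return max(0.0, 1.0 - h2(q) - q * math.log2(3.0))
```

For example, three serial hops with p = 0.1 behave like a single depolarizing channel with parameter 1 - 0.9**3 = 0.271, whose hashing bound can then be evaluated; the parallel-relay case would combine hops differently and is not covered by this sketch.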
Funding: This work was supported by the Deanship of Scientific Research, King Khalid University, Kingdom of Saudi Arabia, under Research Grant Number R.G.P.2/100/41.
Abstract: Human Adaptive Mechatronics (HAM) includes the human and the computer system in a closed loop. Elderly persons with disabilities normally carry out their daily routines with some assistance to move their limbs. With the shortfall of human caretakers, mechatronic devices such as exoskeletons and exosuits are used to assist them. Rehabilitation and occupational therapy equipment utilizes electromyography (EMG) signals to measure muscle activity potential. This paper focuses on optimizing the HAM model to predict the intended motion of the upper limb with high accuracy and to improve the response time of the system. Extraction of limb characteristics from the EMG signal and prediction of optimal controller parameters are modeled. Time- and frequency-based approaches to the EMG signal are considered for feature extraction. The models used for estimating motion and muscle parameters from the EMG signal for limb-movement prediction are validated. Based on the extracted features, optimal parameters are selected by Modified Lion Optimization (MLO) for controlling the HAM system. Finally, supervised machine learning makes predictions at different points in time for individual sensing using a Support Vector Neural Network (SVNN). The model is also evaluated on the optimal parameters of motion estimation and on accuracy, alongside different optimization models, for various upper-limb movements. The proposed human adaptive controller predicts limb movement with 96% accuracy.
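The time- and frequency-domain features are not enumerated in the abstract; a minimal sketch of commonly used EMG features (mean absolute value, RMS, zero crossings, mean frequency) follows. The feature set and the function name are assumptions of this illustration, not necessarily the paper's exact feature pipeline.

```python
import numpy as np

def emg_features(x, fs):
    """Common time- and frequency-domain EMG features for one
    analysis window. x: 1-D signal, fs: sampling rate in Hz."""
    x = np.asarray(x, dtype=float)
    mav = float(np.mean(np.abs(x)))          # mean absolute value
    rms = float(np.sqrt(np.mean(x ** 2)))    # root mean square
    s = np.signbit(x)
    zc = int(np.sum(s[:-1] != s[1:]))        # zero-crossing count
    spec = np.abs(np.fft.rfft(x)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    mnf = float(np.sum(freqs * spec) / np.sum(spec))  # mean frequency
    return {"MAV": mav, "RMS": rms, "ZC": zc, "MNF": mnf}
```

A 50 Hz test sine sampled at 1 kHz yields RMS ≈ 0.707, about 100 zero crossings per second, and a mean frequency near 50 Hz, which is a quick sanity check before feeding such features to a classifier.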
Funding: This work was supported in part by the National Natural Science Foundation of China under Grant 61773006.
Abstract: A vector control based on an extended equivalent circuit and virtual circuits is proposed for the single-phase inverter. Through the extended circuit, the other two phase voltages are constructed from the output voltage of the single-phase inverter so as to form the voltage vector. The voltage outer loop controls the voltage vector in the dq coordinate system, so the output voltage tracks the target value without steady-state deviation. By designing the virtual circuit, the voltage inner loop achieves approximate decoupling and improves the dynamic response under changing load. Compared with the traditional dual closed-loop control, the proposed dual closed-loop scheme only needs to detect and control the voltage, without current sensing. It not only achieves good control performance but also reduces the complexity of the hardware. Finally, simulation and experimental results show that the single-phase inverter has good static and dynamic characteristics under both stable and changing loads.
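The core idea of building a voltage vector from a single measured phase can be illustrated generically. In the sketch below, a virtual quadrature component, assumed here to come from a quarter-period delay, is combined with the measured voltage in a Park transform so that the amplitude appears as a DC d-axis quantity that a PI outer loop can regulate without steady-state error. This is a sketch of the general single-phase dq technique, not the paper's specific extended equivalent circuit.

```python
import math

def virtual_quadrature_dq(v_alpha, v_beta, theta):
    """Park transform of the measured output voltage v_alpha and a
    virtual quadrature component v_beta (e.g. the output delayed by a
    quarter of the fundamental period), at electrical angle theta."""
    vd = v_alpha * math.cos(theta) + v_beta * math.sin(theta)
    vq = -v_alpha * math.sin(theta) + v_beta * math.cos(theta)
    return vd, vq

# toy steady-state demo: 311 V peak, 50 Hz
V, f = 311.0, 50.0
for t in (0.0, 0.004, 0.013):
    theta = 2 * math.pi * f * t
    va = V * math.cos(theta)                 # measured output voltage
    vb = V * math.cos(theta - math.pi / 2)   # quarter-period-delayed copy
    vd, vq = virtual_quadrature_dq(va, vb, theta)
    # in steady state vd stays at the amplitude V and vq stays near zero
```

Because vd and vq are DC in steady state, ordinary PI regulators in the dq frame achieve the zero-deviation tracking the abstract describes.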
Funding: This research was supported by a Korea Institute for Advancement of Technology (KIAT) grant funded by the Korean Government (MOTIE) (P0012724, The Competency Development Program for Industry Specialist) and by the Soonchunhyang University Research Fund.
Abstract: The Internet of Things (IoT) is transforming the technical setting of conventional systems and finds applicability in smart cities, smart healthcare, smart industry, etc. However, IoT-enabled models are resource-limited and require crisp responses, low latencies, and high bandwidth, which are beyond their own abilities. Cloud computing (CC) is treated as a resource-rich solution to the above-mentioned challenges, but its intrinsic high latency can make it nonviable, since longer latency degrades the outcome of IoT-based smart systems. CC is an emergent, distributed, inexpensive computing paradigm with a massive assembly of heterogeneous autonomous systems. Effective task scheduling minimizes the energy utilization of the cloud infrastructure and raises the income of service providers by minimizing the processing time of user jobs. With this motivation, this paper presents an intelligent Chaotic Artificial Immune Optimization Algorithm for Task Scheduling (CAIOA-RS) in the IoT-enabled cloud environment. The proposed CAIOA-RS algorithm solves the resource allocation problem in the IoT-enabled cloud environment. It also minimizes the makespan by carrying out the optimum task-scheduling process under distinct strategies for incoming tasks. The design of the CAIOA-RS technique incorporates chaotic maps into the conventional AIOA to enhance its performance. A series of experiments was carried out on the CloudSim platform. The simulation results demonstrate that the CAIOA-RS technique outperforms the original AIOA as well as other heuristics and metaheuristics.
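The abstract states that chaotic maps are folded into the AIOA but gives no details; a hedged sketch of the usual ingredients follows: a logistic chaotic sequence standing in for uniform random draws when seeding candidate schedules, plus a makespan evaluation. The function names and the logistic-map choice are assumptions of this illustration, not the paper's exact design.

```python
def logistic_map_sequence(n, x0=0.7, r=4.0):
    """Logistic chaotic map x_{k+1} = r*x_k*(1 - x_k); with r = 4 the
    iterates densely cover (0, 1), so they can seed a population of
    candidate schedules in place of uniform random numbers."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

def decode_schedule(chaotic_vals, n_vms):
    """Map each value in (0, 1) to a VM index: task i -> VM."""
    return [min(int(v * n_vms), n_vms - 1) for v in chaotic_vals]

def makespan(assignment, task_len, vm_speed):
    """Completion time of the busiest VM for a given assignment."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)
```

An immune-style search would then clone and mutate the best decoded schedules by affinity; the chaotic seeding shown here only addresses the population-diversity part of that loop.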
Funding: This research work was funded by the Institutional Fund Projects under Grant No. IFPIP:959-611-1443. The authors gratefully acknowledge the technical and financial support provided by the Ministry of Education and King Abdulaziz University, DSR, Jeddah, Saudi Arabia.
Abstract: Data mining and analytics involve inspecting and modeling large pre-existing datasets to discover decision-making information. Precision agriculture uses data mining to advance agricultural development. Many farmers do not get the most out of their land because they do not use precision agriculture; they harvest crops without a well-planned recommendation system. Future crop production is calculated by combining environmental conditions and management behavior, yielding numerical and categorical data. Most existing research still needs to address data preprocessing and crop categorization/classification. Furthermore, statistical analysis receives less attention, despite producing more accurate and valid results. The study was conducted on a dataset about Karnataka state, India, with eight crop parameters taken into account: the minimum amounts of fertilizers required, such as nitrogen, phosphorus, and potassium, together with pH values, rainfall, season, soil type, and temperature, in order to provide precise cultivation recommendations for high productivity. The presented algorithm first converts discrete numerals to factors and then reduces levels. Second, the algorithm generates six datasets: two from Case-1 (a dataset with many numeric variables), two from Case-2 (a dataset with many categorical variables), and one from Case-3 (a dataset with reduced factor variables). Finally, the algorithm outputs a class-membership allocation based on an extended version of the K-means partitioning method with lambda estimation. The presented work produces mixed-type datasets with precisely categorized crops by organizing data based on environmental conditions, soil nutrients, and geo-location. Finally, the prepared dataset solves the classification problem, leading to a model evaluation that selects the best dataset for precise crop prediction.
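The extended K-means with lambda estimation is not specified in the abstract; the sketch below shows the standard k-prototypes-style mixed dissimilarity that such an extension typically builds on, with `lam` standing in for the paper's lambda weight on categorical mismatches. The names and the exact dissimilarity are assumptions of this sketch.

```python
def mixed_distance(x, y, numeric_idx, categorical_idx, lam):
    """Dissimilarity between two mixed-type records: squared Euclidean
    distance on numeric attributes plus lam times the number of
    categorical mismatches (k-prototypes style)."""
    num = sum((x[i] - y[i]) ** 2 for i in numeric_idx)
    cat = sum(1 for i in categorical_idx if x[i] != y[i])
    return num + lam * cat

def assign_cluster(record, prototypes, numeric_idx, categorical_idx, lam):
    """Index of the nearest prototype under the mixed distance;
    the assignment step of a K-means-style partitioning loop."""
    dists = [mixed_distance(record, p, numeric_idx, categorical_idx, lam)
             for p in prototypes]
    return dists.index(min(dists))
```

Larger `lam` makes categorical attributes such as soil type and season dominate the partition; estimating it from the data, as the abstract's "lambda estimation" suggests, balances the two attribute types instead of fixing the weight by hand.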
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work under grant number RGP 2/209/42. This research was also funded by the Deanship of Scientific Research at Princess Nourah bint Abdulrahman University through the Fast-Track Path of the Research Funding Program.
Abstract: The Internet of Things (IoT) has grown significantly owing to the development of broadband access networks, machine learning (ML), big data analytics (BDA), cloud computing (CC), and so on. The development of IoT technologies has resulted in a massive quantity of data, because many people are linked through distinct physical components that report the status of the CC environment. In the IoT, load scheduling is a practical technique in distributed data centers to guarantee network suitability by reducing hardware and software failures and making proper use of resources. An ideal load balancer improves many Quality of Service (QoS) factors, such as resource performance, scalability, response time, fault tolerance, and efficiency. Load scheduling is regarded as a vital problem in the IoT environment, and many techniques are available for it. With this motivation, this paper presents an improved deer hunting optimization algorithm with type-II fuzzy logic (IDHOA-T2F) model for load scheduling in the IoT environment. The goal of the IDHOA-T2F is to diminish the energy utilization of the integrated circuits of IoT nodes and to enhance load scheduling in IoT environments. The IDHOA technique is derived by integrating the concepts of Nelder-Mead (NM) with the DHOA. The proposed model also synthesizes type-II fuzzy logic (FL) systems to counterbalance the load distribution, and it proves useful for improving the efficiency of the IoT system. To validate the enhanced load-scheduling performance of the IDHOA-T2F technique, a series of simulations was run to highlight the improvement. The experimental outcomes demonstrate the superiority of the IDHOA-T2F technique over recent techniques.
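The type-II fuzzy component can be illustrated in isolation. The sketch below is an assumption-laden simplification rather than the paper's IDHOA-T2F: it returns an interval type-2 membership band for a normalized node load in a "high load" set, using lower and upper triangular membership functions, so a scheduler can reason about uncertainty in the load measurement. The breakpoints are made up for illustration, and the IDHOA itself (DHOA refined with Nelder-Mead steps) is not reproduced.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def it2_membership(load, lower=(0.3, 0.6, 0.9), upper=(0.2, 0.6, 1.0)):
    """Interval type-2 membership of a normalized load in a 'high load'
    set: a [lower, upper] grade band instead of a single grade, so the
    scheduling decision carries the measurement uncertainty along."""
    lo = tri(load, *lower)
    hi = tri(load, *upper)
    return min(lo, hi), max(lo, hi)
```

A type-2 rule base would aggregate such bands across nodes and then type-reduce (e.g. by averaging the interval endpoints) before deciding where to place the next task.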
Abstract: Internet of Things (IoT) technologies have gained significant interest in the design of smart grids (SGs). The increasing amount of distributed generation, the maturity of existing grid infrastructures, and the demand for network transformation have received maximum attention. Electrical energy storage models are essential, and diagnosing their operation is becoming ever more compelling. Dynamic electrical energy storage using Electric Vehicles (EVs) is comparatively standard because of its excellent electrical properties and flexibility; however, the battery risks damage from overcharging or deep discharging, and mass EV penetration deeply influences the grid. This paper offers a new Hybridization of Bacterial Foraging Optimization with a Sparse Autoencoder (HBFOA-SAE) model for IoT-enabled energy systems. The proposed HBFOA-SAE model mainly intends to effectively estimate state-of-charge (SOC) values in the IoT-based energy system. To accomplish this, the SAE technique is executed for proper determination of the SOC values in the energy system. Next, to improve the SOC estimation, the HBFOA is employed. In addition, the HBFOA technique is derived by integrating hill climbing (HC) concepts with the BFOA to improve overall efficiency. To ensure better outcomes for the HBFOA-SAE model, a comprehensive set of simulations was performed and the outcomes were inspected under several aspects. The experimental results report the supremacy of the HBFOA-SAE model over recent state-of-the-art approaches.
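The hill-climbing-inside-BFOA idea can be sketched for a single bacterium: tumble in a random direction and keep the move only when the objective, standing in here for the SOC-estimation error, improves. This greedy acceptance is the hill-climbing ingredient; the full HBFOA (swarming, reproduction, elimination-dispersal) and the autoencoder are not reproduced, and all names here are illustrative.

```python
import random

def chemotaxis_with_hill_climb(objective, x, step=0.1, n_steps=50, seed=1):
    """One-bacterium sketch of the BFOA chemotaxis loop with a
    hill-climbing acceptance rule: tumble in a random direction and
    keep the move only if the objective value decreases."""
    rng = random.Random(seed)
    best = objective(x)
    for _ in range(n_steps):
        direction = rng.uniform(-1.0, 1.0)   # tumble
        candidate = x + step * direction     # swim one step
        cand_val = objective(candidate)
        if cand_val < best:                  # hill-climbing acceptance
            x, best = candidate, cand_val
    return x, best

# toy 1-D objective standing in for the SOC-estimation error surface,
# minimized at x = 2 (e.g. one SAE hyperparameter being tuned)
f = lambda x: (x - 2.0) ** 2
x_opt, f_opt = chemotaxis_with_hill_climb(f, x=0.0)
```

In the hybrid model, each coordinate of `x` would be an SAE weight or hyperparameter and `objective` the SOC prediction error on held-out data; the greedy acceptance keeps chemotaxis from drifting away from good regions.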