Abstract: Load balancing is a technique for identifying overloaded and underloaded nodes and balancing the load between them. To maximize various performance parameters in cloud computing, researchers have suggested various load balancing approaches. Cloud computing is one of the latest technology systems for both end-users and service providers, allowing data and services offered by different providers to be stored and accessed over the network across different regions. The volume of data is increasing due to the pandemic, and a significant increase in internet usage has also been experienced. Cloud users are looking for services that are intelligent and can balance the traffic load on the service providers' side, resulting in seamless and uninterrupted service. Different algorithms and techniques are available for managing load balancing in cloud services. This paper introduces a newly proposed method for load balancing in cloud computing at the database level. Database cloud services are frequently employed by companies of all sizes for application development and business processes. Load balancing for distributed applications can be used to maintain an efficient task-scheduling process that meets user requirements and improves resource utilization. Load balancing is the process of distributing the load across various nodes to ensure that no single node is overloaded. To prevent nodes from becoming overloaded, the load balancer allocates an equal amount of computing time to all nodes. Results from two different scenarios showed cross-region traffic management and significant growth in restaurant revenue when load-balancer decisions were applied at application traffic gateways.
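As a rough illustration of the distribution idea described above (not the paper's database-level method), the following sketch dispatches each incoming task to the currently least-loaded node; the node names and cost model are hypothetical.

```python
import heapq

class LeastLoadedBalancer:
    """Dispatch incoming tasks to the node with the smallest current load.

    A generic illustration of the load-balancing idea in the abstract;
    node names and the cost model are hypothetical.
    """

    def __init__(self, node_names):
        # Heap of (current_load, node_name); all nodes start idle.
        self._heap = [(0.0, name) for name in node_names]
        heapq.heapify(self._heap)

    def dispatch(self, task_cost):
        # Pick the least-loaded node, charge it the task cost, re-insert it.
        load, node = heapq.heappop(self._heap)
        heapq.heappush(self._heap, (load + task_cost, node))
        return node

if __name__ == "__main__":
    lb = LeastLoadedBalancer(["db-east", "db-west", "db-central"])
    for cost in [3.0, 1.5, 2.0, 4.0, 0.5]:
        print(cost, "->", lb.dispatch(cost))
```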
Abstract: The application of Information and Communication Technologies has transformed traditional Teaching and Learning over the past decade into a computerized era. This evolution has resulted from the emergence of digital systems and has greatly impacted global education and socio-cultural development. Multimedia has been absorbed into the education sector to produce a new learning concept that combines educational and entertainment approaches. This research concerns the application of Windows Speech Recognition and the Microsoft Visual Basic 2008 Integrated/Interactive Development Environment in developing a Multimedia-Assisted Courseware prototype for Primary School Mathematics content, namely single digits and addition. The Teaching and Learning techniques of Explain, Instruct and Facilitate are proposed; these can be viewed as an instructor-centered strategy, instructor-learner dual communication, and learners' active participation. The prototype, called M-EIF, is operated using only users' voices; hence Windows Speech Recognition must be activated prior to a test run.
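For a runnable flavor of a voice-driven single-digit addition drill, the sketch below uses the third-party Python SpeechRecognition package as an illustrative stand-in; the actual M-EIF prototype is built on Windows Speech Recognition with Visual Basic 2008, and the question flow here is an assumption.

```python
# Illustrative parallel to a voice-driven single-digit addition drill.
# Requires the third-party SpeechRecognition package and a microphone;
# M-EIF itself uses Windows Speech Recognition with Visual Basic 2008.
import random
import speech_recognition as sr

def ask_addition_question(recognizer, microphone):
    a, b = random.randint(0, 9), random.randint(0, 9)
    print(f"What is {a} + {b}? Say your answer.")
    with microphone as source:
        audio = recognizer.listen(source)
    try:
        # Online recognizer, assumed reachable; any recognizer backend would do.
        answer = recognizer.recognize_google(audio)
    except sr.UnknownValueError:
        print("Could not understand the answer.")
        return
    print("Correct!" if answer.strip() == str(a + b) else f"The answer was {a + b}.")

if __name__ == "__main__":
    ask_addition_question(sr.Recognizer(), sr.Microphone())
```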
Funding: Supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2019R1A2C1006159 and NRF-2021R1A6A1A03039493), and by the 2021 Yeungnam University Research Grant.
Abstract: Networks are fundamental to our modern world and appear throughout science and society. Access to massive amounts of data presents a unique opportunity for the research community. As networks grow in size, their complexity increases, and our ability to analyze them with the current state of the art is at severe risk of failing to keep pace. Therefore, this paper initiates a discussion on graph signal processing for large-scale data analysis. We first provide a comprehensive overview of the core ideas of graph signal processing (GSP) and their connection to conventional digital signal processing (DSP). We then summarize recent developments in basic GSP tools, including methods for graph filtering, graph learning, graph signals, the graph Fourier transform (GFT), the spectrum, and graph frequency. Graph filtering is a basic task that isolates the contribution of individual frequencies and therefore enables the removal of noise. We then consider a graph filter as a model that helps extend the application of GSP methods to large datasets. To show its suitability and effectiveness, we first created a noisy graph signal and then applied the filter to it. After several rounds of simulation, the filtered signal appears smoother and closer to the original noise-free distance-based signal. With this example application, we demonstrate that graph filtering is efficient for big-data analytics.
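A minimal sketch of the filtering experiment described above, assuming a small random geometric graph, a distance-based signal with synthetic noise, and an ideal low-pass filter in the graph Fourier domain; the graph, signal, and cutoff are illustrative rather than the paper's data.

```python
import numpy as np
import networkx as nx

# Build a small random geometric graph and a distance-based signal with additive noise.
G = nx.random_geometric_graph(100, 0.2, seed=1)
pos = nx.get_node_attributes(G, "pos")
signal = np.array([np.hypot(x - 0.5, y - 0.5) for x, y in pos.values()])
noisy = signal + 0.1 * np.random.default_rng(0).standard_normal(len(signal))

# Graph Fourier transform: eigendecomposition of the combinatorial Laplacian.
L = nx.laplacian_matrix(G).toarray().astype(float)
eigvals, eigvecs = np.linalg.eigh(L)

# Ideal low-pass graph filter: keep only the lowest graph frequencies.
cutoff = 20  # number of low-frequency components retained (illustrative)
gft_coeffs = eigvecs.T @ noisy
gft_coeffs[cutoff:] = 0.0
filtered = eigvecs @ gft_coeffs

print("error before filtering:", np.linalg.norm(noisy - signal))
print("error after filtering: ", np.linalg.norm(filtered - signal))
```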
Abstract: Cloud computing is an emerging domain that is capturing global users from all walks of life: the corporate sector, the government sector, and the social arena as well. Various cloud providers offer multiple services and facilities to this audience, and the number of providers is increasing very swiftly. This enormous pace generates the requirement for a comprehensive ecosystem that provides a seamless and customized user environment, not only to enhance the user experience but also to improve security, availability, accessibility, and latency. Emerging technology provides robust solutions to many of our problems, and the cloud platform is one of them. It is worth mentioning that these solutions also amplify the complexity of, and the need to sustain, such rapidly evolving systems. With cloud computing, new entrants such as cloud service providers, resellers, tech support, hardware manufacturers, and software developers appear on a daily basis, and these actors play their roles in the growth and sustenance of the cloud ecosystem. Our objective is to use convergence for cloud services, software-defined networks and network function virtualization for infrastructure, and cognition for pattern development and the knowledge repository. To gear up these processes, machine learning is used to induce intelligence that maintains ecosystem growth, monitors performance, and makes decisions for the sustenance of the ecosystem. Workloads may be programmed to "superficially" imitate most business applications and create large volumes using lightweight workload generators that merely stress the storage. In today's IT environment, where many enterprises use the cloud to serve some of their application demands, a performance-testing technique that assesses more than the storage is necessary. With hyper-converged infrastructure (HCI), compute and storage are merged into a single building block, resulting in a huge pool of compute and storage resources when clustered with other building blocks. The novelty of this work is to design and test cloud storage using measurements of availability, downtime, and outage parameters. Results showed that storage reliability in a hyper-converged system is above 92%.
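As a small illustration of the availability measurement mentioned above, the sketch below applies the common definition availability = uptime / (uptime + downtime) to a set of hypothetical outage records; it is not the paper's test harness.

```python
from datetime import timedelta

def availability(observation_period: timedelta, outages: list) -> float:
    """Fraction of the observation period during which the storage was available.

    Uses the common definition availability = uptime / (uptime + downtime);
    the outage records below are hypothetical, not the paper's measurements.
    """
    downtime = sum(outages, timedelta())
    uptime = observation_period - downtime
    return uptime / observation_period

if __name__ == "__main__":
    period = timedelta(days=30)
    outages = [timedelta(hours=2), timedelta(minutes=45), timedelta(hours=1, minutes=30)]
    print(f"availability over 30 days: {availability(period, outages):.4%}")
```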
Abstract: Cloud systems are tools and software for cloud computing that are deployed on the Internet or a cloud computing network, and users can use them at any time. After assessing and choosing cloud providers, however, customers confront the variety and difficulty of quality of service (QoS). To increase customer retention and engagement success rates, it is critical to research and develop an accurate and objective evaluation model. The cloud is the emerging environment for distributed services at various layers. Due to the benefits of this environment, the cloud is globally taken as a standard environment for individuals as well as the corporate sector, as it reduces capital expenditure and provides secure, accessible, and manageable services to all stakeholders. However, cloud computing has security challenges, including vulnerability for clients and association acknowledgment, that delay the rapid adoption of computing models. Allocation of resources in the cloud is difficult because resources provide numerous measures of quality of service. In this paper, the proposed resource allocation approach is based on attribute QoS scoring that takes into account parameters such as the reputation of the asset, task completion time, task completion ratio, and resource loading. This article focuses on cloud service security, cloud reliability, and cloud performance. The neuro-fuzzy machine learning algorithm (ANFIS) is used to address cloud security issues by measuring the security, privacy, and trust parameters. The findings reveal that the ANFIS-based parameters are primarily designed to discern anomalies in cloud security, and the resulting feature output normally yields better results and guarantees data consistency and computational power.
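A minimal sketch of attribute-based QoS scoring over the four parameters named above (reputation, task completion time, task completion ratio, and resource loading); the weights and normalization are assumptions rather than the paper's formula, and the ANFIS component is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    reputation: float        # 0..1, higher is better
    completion_time: float   # seconds, lower is better
    completion_ratio: float  # 0..1, higher is better
    loading: float           # 0..1 current utilization, lower is better

def qos_score(r: Resource, weights=(0.3, 0.2, 0.3, 0.2)) -> float:
    """Weighted score over the four QoS attributes; weights are illustrative."""
    w_rep, w_time, w_ratio, w_load = weights
    time_term = 1.0 / (1.0 + r.completion_time)  # map completion time into (0, 1]
    return (w_rep * r.reputation + w_time * time_term
            + w_ratio * r.completion_ratio + w_load * (1.0 - r.loading))

if __name__ == "__main__":
    pool = [
        Resource("vm-a", reputation=0.9, completion_time=4.0, completion_ratio=0.97, loading=0.6),
        Resource("vm-b", reputation=0.7, completion_time=2.0, completion_ratio=0.92, loading=0.3),
    ]
    best = max(pool, key=qos_score)
    print("allocate task to:", best.name)
```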
Abstract: The past two decades witnessed a broad increase in web technology and online gaming. Improvements in broadband capacity are viewed as one of the most significant variables that prompted new gaming technology. The immense use of web applications and games additionally prompted growth in handheld devices and moved the limited gaming experience from user devices to online cloud servers. As internet capabilities are enhanced, new ways of gaming are being used to improve the gaming experience. In cloud-based video gaming, game engines are hosted in cloud gaming data centers, and compressed gaming scenes are rendered to the players over the internet with updated controls. In such systems, the tasks of transferring games and compressing video impose huge computational complexity on cloud servers. The basic problems in cloud gaming in particular are high encoding time, latency, and low frame rates, which require a new methodology for a better solution. To improve the bandwidth issue in cloud games, the compression of video sequences requires an alternative mechanism that improves gaming adaptation without input delay. In this paper, the proposed improved methodology is used for automatic detection and removal of unnecessary scenes and for bit-rate reduction using an adaptive algorithm for object detection in a game scene. Simulations showed that, without much impact on the players' quality of experience, the selective object encoding method and object adaptation technique reduce network latency and decrease the game streaming bitrate at a remarkable scale across different games. The proposed algorithm was evaluated on three video game scenes, achieving a 14.6% decrease in encoding time and a 45.6% decrease in bit rate for the first video game scene.
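As a hedged illustration of detecting moving objects in a game scene so that only regions of interest receive full encoding quality, the sketch below uses OpenCV background subtraction; the thresholds, input clip, and encoder hook are assumptions and do not reproduce the paper's adaptive algorithm.

```python
import cv2

# Hedged illustration: mark moving-object regions in a game clip so an encoder
# could spend bits on them and coarsen the static background. Thresholds and the
# input file name are hypothetical; the paper's adaptive algorithm is not reproduced.
back_sub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

cap = cv2.VideoCapture("game_scene.mp4")  # hypothetical input clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = back_sub.apply(frame)
    # Keep only sizeable moving regions as candidates for high-quality encoding.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    rois = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
    # An encoder hook would raise quality inside `rois` and lower it elsewhere.
    print(f"{len(rois)} regions of interest in this frame")
cap.release()
```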