Recently, there has been a notable surge of interest in scientific research regarding spectral images. The potential of these images to revolutionize the digital photography industry, like aerial photography through Unmanned Aerial Vehicles (UAVs), has captured considerable attention. One encouraging aspect is their combination with machine learning and deep learning algorithms, which have demonstrated remarkable outcomes in image classification. As a result of this powerful amalgamation, the adoption of spectral images has experienced exponential growth across various domains, with agriculture being one of the prominent beneficiaries. This paper presents an extensive survey encompassing multispectral and hyperspectral images, focusing on their applications for classification challenges in diverse agricultural areas, including plants, grains, fruits, and vegetables. By meticulously examining primary studies, we delve into the specific agricultural domains where multispectral and hyperspectral images have found practical use. Additionally, our attention is directed towards utilizing machine learning techniques for effectively classifying hyperspectral images within the agricultural context. The findings of our investigation reveal that deep learning and support vector machines have emerged as widely employed methods for hyperspectral image classification in agriculture. Nevertheless, we also shed light on the various issues and limitations of working with spectral images. This comprehensive analysis aims to provide valuable insights into the current state of spectral imaging in agriculture and its potential for future advancements.
Software cost estimation is a crucial aspect of software project management, significantly impacting productivity and planning. This research investigates the impact of various feature selection techniques on software cost estimation accuracy using the CoCoMo NASA dataset, which comprises data from 93 unique software projects with 24 attributes. By applying multiple machine learning algorithms alongside three feature selection methods, this study aims to reduce data redundancy and enhance model accuracy. Our findings reveal that the principal component analysis (PCA)-based feature selection technique achieved the highest performance, underscoring the importance of optimal feature selection in improving software cost estimation accuracy. It is demonstrated that our proposed method outperforms the existing method while achieving the highest precision, accuracy, and recall rates.
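The PCA-based reduction the study found most effective can be sketched with plain NumPy (a generic illustration, not the authors' pipeline; the 93×24 shape mirrors the CoCoMo NASA dataset, but the data here is random):

```python
import numpy as np

def pca_reduce(X, k):
    """Project feature matrix X (n_samples x n_features) onto its
    top-k principal components via SVD of the centered data."""
    Xc = X - X.mean(axis=0)               # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                  # scores in the reduced space

# Toy stand-in for a cost-estimation feature matrix (values are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(93, 24))             # 93 projects x 24 attributes
Z = pca_reduce(X, 5)
print(Z.shape)                            # (93, 5)
```

The reduced matrix `Z` would then be fed to the downstream regressors; the component count (5 here) is an arbitrary choice for the sketch.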
COVID-19 pandemic restrictions limited all social activities to curtail the spread of the virus. Among the sectors most affected were schools, colleges, and universities, and the education systems of entire nations shifted to online education during this time. Many shortcomings of Learning Management Systems (LMSs) in supporting fully online education were detected, which spawned research into Artificial Intelligence (AI)-based tools being developed by the research community to improve the effectiveness of LMSs. This paper presents a detailed survey of the different enhancements to LMSs, led by key advances in the area of AI, to enhance the real-time and non-real-time user experience. The AI-based enhancements proposed for LMSs start from the Application and Presentation layers, in the form of flipped classroom models for an efficient learning environment and appropriately designed UI/UX for efficient utilization of LMS utilities and resources, including AI-based chatbots. Session layer enhancements are also required, such as AI-based online proctoring and user authentication using biometrics. These extend to the Transport layer to support real-time, rate-adaptive encrypted video transmission for user security/privacy and the satisfactory working of AI algorithms. Support is also needed from the Network layer for IP-based geolocation features, the Virtual Private Network (VPN) feature, and Software-Defined Networks (SDN) for optimum Quality of Service (QoS). Finally, in addition to these, the non-real-time user experience is enhanced by other AI-based enhancements such as plagiarism detection algorithms and data analytics.
In the contemporary era, the death rate due to lung cancer is increasing. However, technology is continuously enhancing the quality of well-being. To improve the survival rate, radiologists rely on Computed Tomography (CT) scans for early detection and diagnosis of lung nodules. This paper presents a detailed, systematic review of several identification and categorization techniques for lung nodules. The analysis explores the challenges, advancements, and future perspectives of computer-aided diagnosis (CAD) systems for detecting and classifying lung nodules employing deep learning (DL) algorithms. The findings also highlight the usefulness of DL networks, especially convolutional neural networks (CNNs), in elevating sensitivity, accuracy, and specificity, as well as overcoming false positives in the initial stages of lung cancer detection. This paper further presents the integral nodule classification stage, stressing the importance of differentiating between benign and malignant nodules for initial cancer diagnosis. Moreover, the findings present a comprehensive analysis of multiple techniques and studies for nodule classification, highlighting the evolution of methodologies from conventional machine learning (ML) classifiers to transfer learning and integrated CNNs. While acknowledging the strides made by CAD systems, the review also addresses persistent challenges.
In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions have instructed their employees to work from home as a precautionary measure to reduce the risk of contagion. Employees, however, have been exposed to different security risks because of working from home. Moreover, the rapid global spread of COVID-19 has increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees to efficiently accomplish their tasks. The cloud computing environment (CCE) is an unsung hero in the COVID-19 pandemic crisis. It consists of the fast-paced practices for services that reflect the trend of rapidly deployable applications for maintaining data. Despite the increase in the use of CC applications, there is an ongoing research challenge in the domains of CCE concerning data, guaranteeing security, and the availability of CC applications. This paper, to the best of our knowledge, is the first paper that thoroughly explains the impact of the COVID-19 pandemic on CCE. Additionally, this paper also highlights the security risks of working from home during the COVID-19 pandemic.
In recent years, there has been rapid growth in Underwater Wireless Sensor Networks (UWSNs). The focus of research in this area is now on solving the problems associated with large-scale UWSNs. One of the major issues in such a network is the localization of underwater nodes. Localization is required for tracking objects and detecting targets. It also serves as a form of data tagging: sensed data is of little use to an application until the position at which it was sensed is confirmed. This article's major goal is to review and analyze underwater node localization to solve the localization issues in UWSNs. The paper describes various existing localization schemes and broadly categorizes them as centralized and distributed underwater localization schemes, along with a detailed subdivision of these schemes. Further, these localization schemes are compared from different perspectives, and a detailed analysis of the schemes in terms of certain performance metrics is discussed. At the end, the paper addresses several future directions for potential research in improving the localization problems of UWSNs.
This study was undertaken to examine the options and feasibility of deploying new technologies for transforming the aquaculture sector, with the objective of increasing production efficiency. The selection of technologies to obtain the expected outcome should, obviously, be consistent with the criteria of sustainable development. There is a range of technologies being suggested for driving change in aquaculture to enhance its contribution to food security. It is necessary to highlight the complexity of the issues and the need for a systems approach that can shape the course of aquaculture development so that it can live up to the expected fish demand by 2030, in addition to the current quantity of 82.1 million tons. Some of the Fourth Industrial Revolution (IR4.0) technologies suggested to achieve this target envisage the use of real-time monitoring, integration of a constant stream of data from connected production systems, and intelligent automation in controls. This requires the application of mobile devices, the Internet of Things (IoT), smart sensors, artificial intelligence (AI), big data analytics, and robotics, as well as augmented, virtual, and mixed reality. AI is receiving more attention for many reasons. Its use in aquaculture can take many forms, for example, in detecting and mitigating stress in captive fish, which is considered critical for the success of aquaculture. While technology intensification in aquaculture holds great potential, there are constraints in deploying IR4.0 tools in aquaculture. Possible solutions and practical options, especially with respect to future food choices, are highlighted in this paper.
In situations where the precise position of a machine is unknown, localization becomes crucial. This research focuses on improving position prediction accuracy over a long-range (LoRa) network using an optimized machine learning (ML)-based technique. To increase the prediction accuracy of the reference point position on data collected using the fingerprinting method over LoRa technology, this study proposes an optimized ML-based algorithm. Received signal strength indicator (RSSI) data from sensors at different positions was first gathered via an experiment through the LoRa network in a multistory round-layout building. The noise factor is also taken into account, and the signal-to-noise ratio (SNR) value is recorded for every RSSI measurement. The study concludes its examination of reference-point accuracy with the modified KNN method (MKNN), which was created to more precisely predict the position of the reference point. The findings show that MKNN outperformed other algorithms in terms of accuracy and complexity.
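The paper's MKNN variant is not detailed here, but the baseline fingerprinting idea it modifies can be sketched as a weighted KNN over RSSI vectors (all readings, positions, and the gateway count below are made up for illustration):

```python
import numpy as np

# Hypothetical fingerprint database: RSSI readings (dBm) from 3 gateways
# recorded at known reference positions (x, y) in metres.
fingerprints = np.array([
    [-60.0, -75.0, -80.0],
    [-72.0, -61.0, -78.0],
    [-81.0, -77.0, -59.0],
    [-65.0, -66.0, -70.0],
])
positions = np.array([
    [0.0, 0.0],
    [10.0, 0.0],
    [0.0, 10.0],
    [5.0, 5.0],
])

def knn_locate(rssi, k=3):
    """Weighted-KNN fingerprinting baseline: average the k nearest
    reference positions, weighted by inverse RSSI-space distance
    (a simple baseline, not the paper's MKNN)."""
    d = np.linalg.norm(fingerprints - rssi, axis=1)   # distance in RSSI space
    idx = np.argsort(d)[:k]                           # k nearest fingerprints
    w = 1.0 / (d[idx] + 1e-9)                         # inverse-distance weights
    return (positions[idx] * w[:, None]).sum(axis=0) / w.sum()

est = knn_locate(np.array([-61.0, -74.0, -79.0]))     # query close to fingerprint 0
print(est)
```

A query whose RSSI vector nearly matches a stored fingerprint is pulled toward that fingerprint's position; the paper's modification presumably refines this weighting or neighbor selection.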
This study focuses on the testing, quality measurement, and analysis of VoIPv6 performance. Client and server codes were developed using FreeBSD. This is a step before analyzing the architectures of VoIPv6 in the current Internet, in order for it to cope with IPv6 traffic transmission requirements in general, and voice traffic specifically, which is currently attracting the efforts of research bodies. These tests were conducted at the application level, without looking into the network level. VoIPv6 performance tests were conducted over both tunneled and native IPv6, aiming for better end-to-end VoIPv6 performance. The results obtained in this study are shown for different codecs at different bit rates in kilobits per second, and act as an indicator of the better performance of G.711 compared with the rest of the tested codecs.
Visible light communication (VLC) has a paramount role in industrial implementations, especially for better energy efficiency, high-speed data rates, and low susceptibility to interference. However, since studies on VLC for industrial implementations are scarce, areas concerning illumination optimisation and communication performance demand further investigation. As such, this paper presents a new model of light fixture distribution for a warehouse to provide acceptable illumination and communication performance. The proposed model was evaluated for various semi-angles at half power (SAAHP) and different height levels against several parameters, including received power, signal-to-noise ratio (SNR), and bit error rate (BER). The results revealed improvements in received power and SNR at a 30 Mbps data rate. Various modulations were studied to improve the link quality, whereby better average BER values of 5.55×10^(−15) and 1.06×10^(−10) were achieved with 4-PAM and 8-PPM, respectively. The simulation outcomes are indeed viable for a practical warehouse model.
It is estimated that only 15 percent of Kenyans have made plans for retirement, and many people fall into poverty once they retire. A 2018 survey by the Unclaimed Property Asset Register found that insurance companies hold 25 percent of unclaimed funds, with 10 percent belonging to pensioners. This was attributed to a lack of effective information flow between insurance companies and their customers, and also between various departments within the insurance companies. Further, there were numerous cases of loss of documents and files, and certain files were untraceable in the departments. This paper investigates ways in which mobile technology influences the dissemination of information for processing pension claims in the insurance industry. An improvement in the dissemination of information for processing pension claims can play a key role in increasing the percentage of Kenyans making plans for retirement. The study deployed a descriptive study design. The target population was 561 pensioners in Jubilee Insurance and 8 heads of the pensions business, finance, legal services, internal audit, operations, information and communication technology, actuary, and business development and strategy departments. The sample size was obtained using the Krejcie and Morgan formula for determining sample size. Because of the small number of heads of departments, they were not sampled. Through systematic sampling, a sample of 288 pensioners was selected from the list of pensioners in Jubilee Insurance. The findings led to the conclusion that the mobile application has a positive and significant association with the dissemination of information for pension claims processing in Jubilee Insurance. It was further revealed that text messages have a positive and significant influence on the dissemination of information. Concerning unstructured supplementary service data (USSD), it was concluded that it also has a positive and significant influence on the dissemination of information. The study findings also revealed that voice calls have a positive and significant influence on the dissemination of information for pension claims processing in Jubilee Insurance.
The Operating System (OS) is a critical piece of software that manages a computer's hardware and resources, acting as the intermediary between the computer and the user. Existing OSs are not designed for Big Data and Cloud Computing, resulting in inefficient data processing and management. This paper proposes a simplified and improved kernel on an x86 system designed for Big Data and Cloud Computing purposes. The proposed design draws its performance benefits from improved Input/Output (I/O) performance: performance engineering applies data-oriented design to traditional data management to improve data processing speed by reducing memory access overheads. The OS incorporates a data-oriented design to "modernize" various Data Science and management aspects. The resulting OS contains a basic input/output system (BIOS) bootloader that boots into Intel 32-bit protected mode, a text display terminal, 4 GB of paged memory, a 4096 heap block size, a Hard Disk Drive (HDD) I/O Advanced Technology Attachment (ATA) driver, and more. There are also I/O scheduling algorithm prototypes that demonstrate how a simple sweeping algorithm is superior to more conventionally known I/O scheduling algorithms. A MapReduce prototype is implemented using the Message Passing Interface (MPI) for big data purposes. An attempt was made to optimize binary search using modern performance engineering and data-oriented design.
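The claim that a simple sweeping pass outperforms naive request ordering can be illustrated with a toy seek-distance comparison (a generic LOOK-style sweep over made-up cylinder numbers, not the paper's kernel implementation):

```python
def fcfs_seek(start, requests):
    """Total head movement when requests are served in arrival order."""
    total, pos = 0, start
    for r in requests:
        total += abs(r - pos)
        pos = r
    return total

def sweep_seek(start, requests):
    """Total head movement for a simple sweeping (elevator/LOOK) pass:
    serve all requests above the head moving up, then sweep back down."""
    up   = sorted(r for r in requests if r >= start)
    down = sorted((r for r in requests if r < start), reverse=True)
    total, pos = 0, start
    for r in up + down:
        total += abs(r - pos)
        pos = r
    return total

reqs = [98, 183, 37, 122, 14, 124, 65, 67]   # illustrative cylinder requests
print(fcfs_seek(53, reqs), sweep_seek(53, reqs))   # 640 299
```

For this request trace the sweep covers 299 cylinders versus 640 for arrival order, which is the kind of gap the prototypes are said to demonstrate.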
Traditional rule-based Intrusion Detection Systems (IDS) are commonly employed owing to their simple design and ability to detect known threats. Nevertheless, given the dynamic network traffic and new classes of threats that exist in IoT environments, these systems do not perform well and have elevated false positive rates, consequently decreasing detection accuracy. In this study, we try to overcome these restrictions by employing fuzzy logic and machine learning to develop an Enhanced Rule-Based Model (ERBM) that classifies packets better and identifies intrusions. The ERBM improves data preprocessing and feature selection by utilizing fuzzy logic: three membership functions are created to classify every network traffic feature as low, medium, or high, keeping the model situationally aware of the environment. These fuzzy sets produce adaptive detection rules by reducing data uncertainty. For further classification, machine learning classifiers such as Decision Tree (DT), Random Forest (RF), and Neural Networks (NN) learn complex attack patterns and make the detection process more precise. A thorough performance evaluation using different metrics, including accuracy, precision, recall, F1 score, detection rate, and false-positive rate, verifies the superiority of the ERBM over classical IDS. In extensive experiments, the ERBM achieves a remarkable detection rate of 99% with considerably fewer false positives than conventional models. By integrating uncertain reasoning via fuzzy logic with an adaptable component via machine learning, the ERBM provides a unique, scalable, data-driven approach to IoT intrusion detection. This research presents a major enhancement to rule-based IDS, introducing improvements in detection accuracy against evolving IoT threats.
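The low/medium/high fuzzification step described above can be sketched with triangular membership functions (the breakpoints below are illustrative assumptions, not the ERBM's actual rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(value, lo=0.0, hi=1.0):
    """Map a normalized traffic feature to (low, medium, high) memberships.
    Assumes the feature is scaled to [lo, hi]; breakpoints are illustrative."""
    mid = (lo + hi) / 2
    low    = tri(value, lo - 1e-9, lo, mid)   # peaks at the low end
    medium = tri(value, lo, mid, hi)          # peaks at the midpoint
    high   = tri(value, mid, hi, hi + 1e-9)   # peaks at the high end
    return low, medium, high

print(fuzzify(0.25))   # -> (0.5, 0.5, 0.0)
```

A downstream rule can then fire in proportion to these memberships instead of a hard threshold, which is how fuzzy sets reduce the brittleness of crisp rules.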
In software-defined networks (SDNs), controller placement is a critical factor in the design and planning of future Internet of Things (IoT), telecommunication, and satellite communication systems. Existing research has concentrated largely on factors such as reliability, latency, controller capacity, propagation delay, and energy consumption. However, SDNs are vulnerable to distributed denial of service (DDoS) attacks that interfere with legitimate use of the network. The ever-increasing frequency of DDoS attacks has made it necessary to consider them in network design, especially in critical applications such as military, health care, and financial services networks requiring high availability. We propose a mathematical model for planning the deployment of SDN smart backup controllers (SBCs) to preserve service in the presence of DDoS attacks. Given a number of input parameters, our model has two distinct capabilities. First, it determines the optimal number of primary controllers to place at specific locations or nodes under normal operating conditions. Second, it recommends an optimal number of smart backup controllers for use under different levels of DDoS attack. The goal of the model is to improve resistance to DDoS attacks while optimizing the overall cost based on the parameters. Our simulation results demonstrate that the model is useful in planning for SDN reliability in the presence of DDoS attacks while managing the overall cost.
Wrist cracks are the most common sort of cracks, with an excessive occurrence rate. For the routine detection of wrist cracks, conventional radiography (X-ray medical imaging) is used, but issues with crack depiction are periodically presented. Wrist cracks often appear in the bones of the human wrist due to accidental injuries such as slipping. Indeed, many hospitals lack experienced clinicians to diagnose wrist cracks. Therefore, an automated system is required to reduce the burden on clinicians and identify cracks. In this study, we have designed a novel residual network-based convolutional neural network (CNN) for wrist crack detection. For the classification of wrist crack medical imaging, the diagnostic accuracy of the RN-21CNN model is compared with four well-known transfer learning (TL) models, Inception V3, Vgg16, ResNet-50, and Vgg19, to assist the medical imaging technologist in identifying the cracks that occur due to wrist fractures. The RN-21CNN model achieved an accuracy of 0.97, which is much better than its competitor approaches. The results reveal that a correctly generalizing computer-aided recognition system, precisely designed for the assistance of clinicians, would limit the number of incorrect diagnoses and also save a lot of time.
With the help of computer-aided diagnostic systems, cardiovascular diseases can be identified in a timely manner to minimize the mortality rate of patients suffering from cardiac disease. However, the early diagnosis of cardiac arrhythmia is one of the most challenging tasks, and the manual analysis of electrocardiogram (ECG) data with the help of a Holter monitor is difficult. Currently, the Convolutional Neural Network (CNN) is receiving considerable attention from researchers for automatically identifying ECG signals. This paper proposes a 9-layer CNN model to classify ECG signals into five primary categories according to the standards of the American National Standards Institute (ANSI) and the Association for the Advancement of Medical Instrumentation (AAMI). The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia dataset is used for the experiment. The proposed model outperformed previous models in terms of accuracy, achieving a sensitivity of 99.0% and a positive predictivity of 99.2% in the detection of ventricular ectopic beats (VEB). Moreover, it also attained a sensitivity of 99.0% and a positive predictivity of 99.2% for the detection of supraventricular ectopic beats (SVEB). The overall accuracy of the proposed model is 99.68%.
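The two figures reported for VEB and SVEB detection, sensitivity and positive predictivity, follow directly from the true-positive, false-negative, and false-positive counts; the counts below are illustrative, not the paper's MIT-BIH confusion matrix:

```python
def sensitivity_ppv(tp, fn, fp):
    """Sensitivity (recall) and positive predictivity (precision),
    the standard beat-detection figures of merit."""
    se  = tp / (tp + fn)   # fraction of true beats that were detected
    ppv = tp / (tp + fp)   # fraction of detections that were true beats
    return se, ppv

# Illustrative counts chosen to reproduce the reported 99.0% / 99.2%.
se, ppv = sensitivity_ppv(tp=990, fn=10, fp=8)
print(f"Se={se:.1%}  +P={ppv:.1%}")   # prints Se=99.0%  +P=99.2%
```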
Visible light communication (VLC), a prominent emerging solution that complements radio frequency (RF) technology, exhibits the potential to meet the demands of fifth-generation (5G) and beyond technologies. The random movement of mobile terminals in the indoor environment is a challenge in VLC systems. The model of optical attocells plays a critical role in the uniform distribution and quality of communication links in terms of received power and signal-to-noise ratio (SNR). As such, the optical attocell positions were optimized in this study with a developed try-and-error (TE) algorithm. The optimized optical attocells were examined and compared with previous models. This novel approach successfully increased the minimum received power from −1.29 to −0.225 dBm, along with enhancing SNR performance by 2.06 dB. The bit error rate (BER) was reduced to 4.42×10^(−8) and 6.63×10^(−14) by utilizing OOK-NRZ and BPSK modulation techniques, respectively. The optimized attocell positions displayed better uniformity of distribution, as both received power and SNR performance improved by 0.45 and 0.026, respectively. As the results of the proposed model are optimal, it is suitable for standard office and room model applications.
The computational complexity of resource allocation processes in cognitive radio networks (CRNs) is a major issue to be managed. Furthermore, the complicated solution of the optimal algorithm for handling resource allocation in CRNs makes it unsuitable for adoption in real-world applications where both cognitive users (CRs) and primary users (PUs) exist in the same geographical area. Hence, this work offers a primarily price-based power algorithm to reduce computational complexity in uplink scenarios while limiting interference to PUs to an allowable threshold. Compared to other frameworks proposed in the literature, this paper proposes a two-step approach to reduce the complexity of the proposed mathematical model. In the first step, the subcarriers are assigned to the users of the CRN; in the second stage, the cost function includes a pricing scheme to provide a better power control algorithm with improved reliability. The main contribution of this paper is to lessen the complexity of the proposed algorithm and to offer flexibility in controlling the interference produced to the users of the primary networks, which has been achieved by including a pricing function in the proposed cost function. Finally, the performance of the proposed power and subcarrier algorithm is confirmed for orthogonal frequency-division multiplexing (OFDM). Simulation results prove that the performance of the proposed algorithm is better than that of other algorithms, albeit with a lower complexity of O(NM)+O(N log(N)).
Because network space is becoming more limited, the implementation of ultra-dense networks (UDNs) has the potential to enhance not only network coverage but also network throughput. Unmanned Aerial Vehicle (UAV) communications have recently garnered a lot of attention because they are extremely versatile and may be applied to a wide variety of contexts and purposes. A cognitive UAV is proposed in this article as a solution for the wireless nodes of Internet of Things (IoT) ground terminals. In the IoT system, the UAV is utilised not only to determine how resources should be distributed but also to provide power to the wireless nodes. The quality of service (QoS) offered by the cognitive node is interpreted as a price-based utility function, formulated as a non-cooperative game in order to maximise customers' net utility functions. An energy-efficient non-cooperative game-theoretic power allocation with a pricing strategy, abbreviated as EE-NGPAP, is implemented in this study with two trajectories, spiral and sigmoidal, in order to facilitate effective power management in IoT wireless nodes. It has also been demonstrated, theoretically and through simulations, that the Nash equilibrium exists and is unique. The proposed energy harvesting approach was shown, through simulations, to significantly reduce the typical amount of transmitted power, which is considered to agree with the objectives of 5G networks. To converge to the Nash equilibrium (NE), the recommended method needs only roughly 4 iterations, which makes it easier to use in the real world, where conditions are not always the same.
文摘Recently,there has been a notable surge of interest in scientific research regarding spectral images.The potential of these images to revolutionize the digital photography industry,like aerial photography through Unmanned Aerial Vehicles(UAVs),has captured considerable attention.One encouraging aspect is their combination with machine learning and deep learning algorithms,which have demonstrated remarkable outcomes in image classification.As a result of this powerful amalgamation,the adoption of spectral images has experienced exponential growth across various domains,with agriculture being one of the prominent beneficiaries.This paper presents an extensive survey encompassing multispectral and hyperspectral images,focusing on their applications for classification challenges in diverse agricultural areas,including plants,grains,fruits,and vegetables.By meticulously examining primary studies,we delve into the specific agricultural domains where multispectral and hyperspectral images have found practical use.Additionally,our attention is directed towards utilizing machine learning techniques for effectively classifying hyperspectral images within the agricultural context.The findings of our investigation reveal that deep learning and support vector machines have emerged as widely employed methods for hyperspectral image classification in agriculture.Nevertheless,we also shed light on the various issues and limitations of working with spectral images.This comprehensive analysis aims to provide valuable insights into the current state of spectral imaging in agriculture and its potential for future advancements.
Abstract: Software cost estimation is a crucial aspect of software project management, significantly impacting productivity and planning. This research investigates the impact of various feature selection techniques on software cost estimation accuracy using the CoCoMo NASA dataset, which comprises data from 93 unique software projects with 24 attributes. By applying multiple machine learning algorithms alongside three feature selection methods, this study aims to reduce data redundancy and enhance model accuracy. Our findings reveal that the principal component analysis (PCA)-based feature selection technique achieved the highest performance, underscoring the importance of optimal feature selection in improving software cost estimation accuracy. Our proposed method is demonstrated to outperform the existing method while achieving the highest precision, accuracy, and recall rates.
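The abstract does not reproduce the estimation pipeline. As an illustrative sketch only (not the paper's code), PCA-based feature reduction over a synthetic stand-in for the 93×24 CoCoMo table, followed by a least-squares effort model, might look like this; the component count of 8 and all data below are assumptions:

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)                 # center the features
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                    # scores on the leading components

rng = np.random.default_rng(0)
X = rng.normal(size=(93, 24))               # synthetic stand-in for the dataset
y = X[:, 0] + 0.5 * X[:, 1]                 # synthetic "effort" target

Z = pca_reduce(X, 8)                        # keep 8 principal components
# Least-squares effort model on the reduced features (plus an intercept).
A = np.c_[Z, np.ones(len(Z))]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print(Z.shape)                              # (93, 8)
```

In the study itself, the retained components would feed whichever ML estimator is being evaluated; the choice of 8 components is purely illustrative and would normally be tuned.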
Abstract: COVID-19 pandemic restrictions limited all social activities to curtail the spread of the virus. Among the sectors affected, the foremost were schools, colleges, and universities. Entire national education systems shifted to online education during this time. Many shortcomings of Learning Management Systems (LMSs) in supporting education in an online mode were detected, which spawned research into Artificial Intelligence (AI)-based tools being developed by the research community to improve the effectiveness of LMSs. This paper presents a detailed survey of the different enhancements to LMSs, led by key advances in AI, to enhance the real-time and non-real-time user experience. The AI-based enhancements proposed for LMSs start from the Application and Presentation layers, in the form of flipped classroom models for an efficient learning environment and appropriately designed UI/UX for efficient utilization of LMS utilities and resources, including AI-based chatbots. Session-layer enhancements are also required, such as AI-based online proctoring and user authentication using biometrics. These extend to the Transport layer to support real-time and rate-adaptive encrypted video transmission for user security/privacy and the satisfactory working of AI algorithms. Support is also needed at the Network layer for IP-based geolocation features, the Virtual Private Network (VPN) feature, and Software-Defined Networks (SDN) for optimum Quality of Service (QoS). Finally, the non-real-time user experience is enhanced by other AI-based enhancements such as plagiarism detection algorithms and data analytics.
Abstract: In the contemporary era, the death rate due to lung cancer is increasing. However, technology is continuously enhancing the quality of well-being. To improve the survival rate, radiologists rely on Computed Tomography (CT) scans for early detection and diagnosis of lung nodules. This paper presents a detailed, systematic review of several identification and categorization techniques for lung nodules. The analysis explores the challenges, advancements, and future perspectives of computer-aided diagnosis (CAD) systems for detecting and classifying lung nodules employing deep learning (DL) algorithms. The findings also highlight the usefulness of DL networks, especially convolutional neural networks (CNNs), in elevating sensitivity, accuracy, and specificity, as well as overcoming false positives in the initial stages of lung cancer detection. The paper further presents the integral nodule classification stage, stressing the importance of differentiating between benign and malignant nodules for initial cancer diagnosis. Moreover, it provides a comprehensive analysis of multiple techniques and studies for nodule classification, highlighting the evolution of methodologies from conventional machine learning (ML) classifiers to transfer learning and integrated CNNs. Interestingly, while acknowledging the strides made by CAD systems, the review addresses persistent challenges.
Abstract: In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions have instructed their employees to work from home as a precautionary measure to reduce the risk of contagion. Employees, however, have been exposed to different security risks because of working from home. Moreover, the rapid global spread of COVID-19 has increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees to accomplish their tasks efficiently. The cloud computing environment (CCE) is an unsung hero of the COVID-19 pandemic crisis: it consists of fast-paced practices for services that reflect the trend of rapidly deployable applications for maintaining data. Despite the increase in the use of CC applications, there is an ongoing research challenge in the domains of the CCE concerning data, guaranteeing security, and the availability of CC applications. This paper, to the best of our knowledge, is the first to thoroughly explain the impact of the COVID-19 pandemic on the CCE. Additionally, it highlights the security risks of working from home during the COVID-19 pandemic.
Abstract: In recent years, there has been rapid growth in Underwater Wireless Sensor Networks (UWSNs). Research in this area now focuses on solving the problems associated with large-scale UWSNs. One of the major issues in such a network is the localization of underwater nodes. Localization is required for tracking objects and detecting targets. It also serves as a form of data tagging: sensed data are of little use to an application until the position at which they were sensed is confirmed. The major goal of this article is to review and analyze underwater node localization to solve the localization issues in UWSNs. The paper describes various existing localization schemes and broadly categorizes them as centralized and distributed underwater localization schemes, along with a detailed subdivision of each. Further, these localization schemes are compared from different perspectives, and a detailed analysis in terms of certain performance metrics is discussed. At the end, the paper addresses several future directions for potential research in improving the localization problems of UWSNs.
Funding: Aquaculture Flagship program of Universiti Malaysia Sabah.
Abstract: This study was undertaken to examine the options for and feasibility of deploying new technologies to transform the aquaculture sector, with the objective of increasing production efficiency. Selection of technologies to obtain the expected outcome should, obviously, be consistent with the criteria of sustainable development. A range of technologies is being suggested for driving change in aquaculture to enhance its contribution to food security. It is necessary to highlight the complexity of issues for a systems approach that can shape the course of development of aquaculture, so that it can live up to the expected fish demand by 2030 in addition to the current quantity of 82.1 million tons. Some of the Fourth Industrial Revolution (IR4.0) technologies suggested to achieve this target envisage the use of real-time monitoring, integration of a constant stream of data from connected production systems, and intelligent automation in controls. This requires the application of mobile devices, the Internet of Things (IoT), smart sensors, artificial intelligence (AI), big data analytics, and robotics, as well as augmented, virtual, and mixed reality. AI is receiving more attention for many reasons. Its use in aquaculture can happen in many ways, for example in detecting and mitigating stress on captive fish, which is considered critical for the success of aquaculture. While technology intensification in aquaculture holds great potential, there are constraints to deploying IR4.0 tools. Possible solutions and practical options, especially with respect to future food choices, are highlighted in this paper.
Funding: This research is funded by Multimedia University, Department of Information Technology, Persiaran Multimedia, 63100 Cyberjaya, Selangor, Malaysia.
Abstract: In situations where the precise position of a machine is unknown, localization becomes crucial. This research focuses on improving position prediction accuracy over a long-range (LoRa) network using an optimized machine learning (ML) based technique. To increase the prediction accuracy of the reference-point position on data collected using the fingerprinting method over LoRa technology, this study proposed an optimized ML-based algorithm. Received signal strength indicator (RSSI) data from sensors at different positions was first gathered via an experiment through the LoRa network in a multi-story building with a round layout. The noise factor is also taken into account, and the signal-to-noise ratio (SNR) value is recorded for every RSSI measurement. This study concludes its examination of reference-point accuracy with the modified KNN method (MKNN), which was created to predict the position of the reference point more precisely. The findings showed that MKNN outperformed other algorithms in terms of accuracy and complexity.
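The exact MKNN modification is not described in the abstract. A common baseline it likely builds on is distance-weighted k-nearest-neighbour matching against an RSSI fingerprint database, sketched here with an invented fingerprint table (three gateways, four reference points) rather than the paper's data:

```python
import numpy as np

def wknn_locate(fingerprints, positions, rssi, k=3):
    """Estimate a position as the inverse-distance-weighted mean of the
    k reference points whose stored RSSI vectors best match the reading."""
    d = np.linalg.norm(fingerprints - rssi, axis=1)   # RSSI-space distances
    idx = np.argsort(d)[:k]                           # k best-matching points
    w = 1.0 / (d[idx] + 1e-9)                         # closer match, more weight
    return (w[:, None] * positions[idx]).sum(axis=0) / w.sum()

# Toy fingerprint database: RSSI (dBm) from 3 gateways at 4 reference points.
fp = np.array([[-60., -70., -80.],
               [-70., -60., -75.],
               [-80., -75., -60.],
               [-65., -65., -70.]])
pos = np.array([[0., 0.], [10., 0.], [10., 10.], [5., 5.]])  # x, y in metres

est = wknn_locate(fp, pos, np.array([-61., -69., -79.]))
print(est)   # lands near the first reference point
```

A "modified" KNN along the lines of the paper could then, for instance, fold the recorded SNR into the weighting; that detail is not given in the abstract.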
Abstract: This study focuses on testing, quality measurement, and analysis of VoIPv6 performance. Client and server code was developed using FreeBSD. This is a step before analyzing the architectures of VoIPv6 in the current Internet, so that it can cope with IPv6 traffic transmission requirements in general, and specifically with voice traffic, which is currently attracting the efforts of research bodies. These tests were conducted at the application level, without looking into the network level. VoIPv6 performance tests were conducted over both tunneled and native IPv6, aiming for better end-to-end VoIPv6 performance. The results obtained in this study are shown for different codecs at different bit rates in kilobits per second, and act as an indicator of the better performance of G.711 compared with the rest of the tested codecs.
Funding: Supported by the Professional Development Research University Grant (UTM Vot No. 06E59).
Abstract: Visible light communication (VLC) has a paramount role in industrial implementations, especially for better energy efficiency, high-speed data rates, and low susceptibility to interference. However, since studies on VLC for industrial implementations are scarce, areas concerning illumination optimisation and communication performance demand further investigation. As such, this paper presents a new model of light-fixture distribution for a warehouse, providing acceptable illumination and communication performance. The proposed model was evaluated based on various semi-angles at half power (SAAHP) and different height levels for several parameters, including received power, signal-to-noise ratio (SNR), and bit error rate (BER). The results revealed improvements in terms of received power and SNR with a 30 Mbps data rate. Various modulations were studied to improve the link quality, whereby better average BER values of 5.55×10^(−15) and 1.06×10^(−10) were achieved with 4 PAM and 8 PPM, respectively. The simulation outcomes are indeed viable for the practical warehouse model.
Abstract: It is estimated that only 15 percent of Kenyans have made plans for retirement, and many people fall into poverty once they retire. A 2018 survey by the Unclaimed Property Asset Register found that insurance companies hold 25 percent of unclaimed funds, with 10 percent belonging to pensioners. This was attributed to a lack of effective information flow between insurance companies and their customers, and also between various departments within the insurance companies. Further, there were numerous cases of loss of documents and files, and certain files were untraceable in the departments. This paper investigates ways in which mobile technology influences the dissemination of information for processing pension claims in the insurance industry. An improvement in the dissemination of information for processing pension claims can play a key role in increasing the percentage of Kenyans making plans for retirement. The study deployed a descriptive study design. The target population was 561 pensioners in Jubilee Insurance and 8 heads of the pensions business, finance, legal services, internal audit, operations, information and communication technology, actuary, and business development and strategy departments. The sample size was obtained using the Krejcie and Morgan formula for determining sample size. As a result of their small number, the heads of departments were not sampled. Through systematic sampling, a sample of 288 pensioners was selected from the list of pensioners in Jubilee Insurance. The findings led to the conclusion that the mobile application has a positive and significant association with the dissemination of information for pension claims processing in Jubilee Insurance. It was further revealed that text messages have a positive and significant influence on the dissemination of information.
Concerning unstructured supplementary service data (USSD), it was concluded that it has a positive and significant influence on the dissemination of information. The study findings also revealed that voice calls have a positive and significant influence on the dissemination of information for pension claims processing in Jubilee Insurance.
Abstract: An Operating System (OS) is a critical piece of software that manages a computer's hardware and resources, acting as the intermediary between the computer and the user. Existing OSs are not designed for Big Data and Cloud Computing, resulting in data processing and management inefficiency. This paper proposes a simplified and improved kernel on an x86 system designed for Big Data and Cloud Computing purposes. The proposed algorithm utilizes the performance benefits of improved Input/Output (I/O) performance. The performance engineering applies data-oriented design to traditional data management to improve data processing speed by reducing memory-access overheads. The OS incorporates a data-oriented design to "modernize" various Data Science and management aspects. The resulting OS contains a basic input/output system (BIOS) bootloader that boots into Intel 32-bit protected mode, a text display terminal, 4 GB paging memory, a 4096 heap block size, a Hard Disk Drive (HDD) I/O Advanced Technology Attachment (ATA) driver, and more. There are also I/O scheduling algorithm prototypes that demonstrate how a simple sweeping algorithm is superior to more conventionally known I/O scheduling algorithms. A MapReduce prototype is implemented using the Message Passing Interface (MPI) for big data purposes. An attempt was made to optimize binary search using modern performance engineering and data-oriented design.
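The sweeping I/O scheduler the abstract mentions is not listed here. One plausible reading is the classic elevator-style sweep: service all pending requests at or above the current head position in ascending order, then the remainder in descending order. The sketch below (with made-up request tracks) compares that sweep against first-come-first-served on total head movement:

```python
def sweep_order(head, requests):
    """Elevator-style sweep: ascending pass from the head position,
    then a descending pass over the remaining requests."""
    up = sorted(r for r in requests if r >= head)
    down = sorted((r for r in requests if r < head), reverse=True)
    return up + down

def total_seek(head, order):
    """Total head movement when servicing requests in the given order."""
    dist, pos = 0, head
    for r in order:
        dist += abs(r - pos)
        pos = r
    return dist

# Hypothetical pending request queue (track numbers), head at track 53.
reqs = [98, 183, 37, 122, 14, 124, 65, 67]
print(sweep_order(53, reqs))           # [65, 67, 98, 122, 124, 183, 37, 14]
print(total_seek(53, sweep_order(53, reqs)))   # 299 tracks for the sweep
print(total_seek(53, reqs))                    # 640 tracks for FCFS
```

The gap widens with queue depth, which is consistent with the abstract's claim that a simple sweep beats naive schedulers; the paper's in-kernel variant may of course differ in detail.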
Funding: This work is supported by a research grant from Multimedia University, Malaysia.
Abstract: Traditional rule-based Intrusion Detection Systems (IDS) are commonly employed owing to their simple design and ability to detect known threats. Nevertheless, with the dynamic network traffic and new degree of threats that exist in IoT environments, these systems do not perform well and have elevated false-positive rates, consequently decreasing detection accuracy. In this study, we try to overcome these restrictions by employing fuzzy logic and machine learning to develop an Enhanced Rule-Based Model (ERBM) that classifies packets better and identifies intrusions. The ERBM improves data preprocessing and feature selection by utilizing fuzzy logic: three membership functions are created to classify each network traffic feature as low, medium, or high, so as to remain situationally aware of the environment. Such fuzzy sets produce adaptive detection rules by reducing data uncertainty. For further classification, machine learning classifiers such as Decision Tree (DT), Random Forest (RF), and Neural Networks (NN) learn complex attack patterns and make the detection process more precise. A thorough performance evaluation using different metrics, including accuracy, precision, recall, F1 score, detection rate, and false-positive rate, verifies the supremacy of the ERBM over classical IDS. In extensive experiments, the ERBM achieves a remarkable detection rate of 99% with considerably fewer false positives than the conventional models. By integrating the ability for uncertain reasoning via fuzzy logic with an adaptable component via machine learning, the ERBM provides a unique, scalable, data-driven approach to IoT intrusion detection. This research presents a major enhancement in the context of rule-based IDS, introducing improvements in accuracy against evolving IoT threats.
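The paper's three membership functions are not specified in the abstract. A conventional choice, assumed here purely for illustration, is shoulder sets for "low" and "high" and a triangular set for "medium" over a feature normalized to [0, 1]:

```python
def triangular(x, a, b, c):
    """Triangular membership: 0 outside [a, c], rising to a peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    """Grade a feature value (normalized to [0, 1]) against low/medium/high
    sets and return the strongest label plus all membership grades."""
    grades = {
        "low": max(0.0, min(1.0, (0.5 - x) / 0.5)),     # left shoulder set
        "medium": triangular(x, 0.0, 0.5, 1.0),
        "high": max(0.0, min(1.0, (x - 0.5) / 0.5)),    # right shoulder set
    }
    return max(grades, key=grades.get), grades

for v in (0.1, 0.5, 0.9):
    print(v, fuzzify(v)[0])   # 0.1 -> low, 0.5 -> medium, 0.9 -> high
```

In an ERBM-style pipeline, these labels (or the raw grades) would replace or augment the crisp feature values before the DT/RF/NN classifiers are trained; the breakpoints 0.0/0.5/1.0 are assumptions, not the paper's tuned values.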
Funding: This research work was funded by TMR&D Sdn Bhd under project code RDTC160902.
Abstract: In software-defined networks (SDNs), controller placement is a critical factor in the design and planning of future Internet of Things (IoT), telecommunication, and satellite communication systems. Existing research has concentrated largely on factors such as reliability, latency, controller capacity, propagation delay, and energy consumption. However, SDNs are vulnerable to distributed denial-of-service (DDoS) attacks that interfere with legitimate use of the network. The ever-increasing frequency of DDoS attacks has made it necessary to consider them in network design, especially in critical applications such as military, health care, and financial services networks requiring high availability. We propose a mathematical model for planning the deployment of SDN smart backup controllers (SBCs) to preserve service in the presence of DDoS attacks. Given a number of input parameters, our model has two distinct capabilities. First, it determines the optimal number of primary controllers to place at specific locations or nodes under normal operating conditions. Second, it recommends an optimal number of smart backup controllers for use under different levels of DDoS attack. The goal of the model is to improve resistance to DDoS attacks while optimizing the overall cost based on the parameters. Our simulation results demonstrate that the model is useful in planning for SDN reliability in the presence of DDoS attacks while managing the overall cost.
Abstract: Wrist cracks are the most common sort of cracks, with an excessive occurrence rate. For the routine detection of wrist cracks, conventional radiography (X-ray medical imaging) is used, but crack depiction periodically presents issues. Wrist cracks often appear in the bones of the wrist due to accidental injuries such as slipping. Indeed, many hospitals lack experienced clinicians to diagnose wrist cracks. Therefore, an automated system is required to reduce the burden on clinicians and identify cracks. In this study, we have designed a novel residual network-based convolutional neural network (CNN) for wrist crack detection. For the classification of wrist crack medical imaging, the diagnostic accuracy of the RN-21CNN model is compared with four well-known transfer learning (TL) models, Inception V3, Vgg16, ResNet-50, and Vgg19, to assist the medical imaging technologist in identifying cracks that occur due to wrist fractures. The RN-21CNN model achieved an accuracy of 0.97, which is much better than the competing approaches. The results reveal that a correctly generalized computer-aided recognition system, precisely designed to assist clinicians, would limit the number of incorrect diagnoses and also save a great deal of time.
Funding: Supported by the Faculty of Computing and Informatics, Universiti Malaysia Sabah, Jalan UMS, Kota Kinabalu, Sabah 88400, Malaysia.
Abstract: With the help of computer-aided diagnostic systems, cardiovascular diseases can be identified in a timely manner to minimize the mortality rate of patients suffering from cardiac disease. However, the early diagnosis of cardiac arrhythmia is one of the most challenging tasks, and manual analysis of electrocardiogram (ECG) data with the help of a Holter monitor is difficult. Currently, the Convolutional Neural Network (CNN) is receiving considerable attention from researchers for automatically identifying ECG signals. This paper proposes a 9-layer CNN model to classify ECG signals into five primary categories according to the standards of the American National Standards Institute (ANSI) and the Association for the Advancement of Medical Instrumentation (AAMI). The Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia dataset is used for the experiment. The proposed model outperformed the previous model in terms of accuracy, achieving a sensitivity of 99.0% and a positive predictivity of 99.2% in the detection of Ventricular Ectopic Beats (VEB). Moreover, it also gained a sensitivity of 99.0% and a positive predictivity of 99.2% for the detection of Supraventricular Ectopic Beats (SVEB). The overall accuracy of the proposed model is 99.68%.
Funding: Supported by the Professional Development Research University Grant (UTM Vot No. 05E69) and the TDR Grant (Vot No. 05G27).
Abstract: Visible light communication (VLC), a prominent emerging solution that complements radio frequency (RF) technology, exhibits the potential to meet the demands of fifth-generation (5G) and beyond technologies. The random movement of mobile terminals in the indoor environment is a challenge in VLC systems. The model of optical attocells plays a critical role in the uniform distribution and the quality of communication links in terms of received power and signal-to-noise ratio (SNR). As such, the optical attocell positions were optimized in this study with a developed try and error (TE) algorithm. The optimized optical attocells were examined and compared with previous models. This novel approach successfully increased the minimum received power from −1.29 to −0.225 dBm, along with an SNR performance enhanced by 2.06 dB. The bit error rate (BER) was reduced to 4.42×10^(−8) and 6.63×10^(−14) by utilizing OOK-NRZ and BPSK modulation techniques, respectively. The optimized attocell positions displayed a better uniform distribution, as both received power and SNR performances improved by 0.45 and 0.026, respectively. As the results of the proposed model are optimal, it is suitable for standard office and room model applications.
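The received-power figures above come from a VLC link budget. The standard Lambertian line-of-sight channel gain used in most VLC studies, assumed here with invented fixture parameters and with the optical filter and concentrator gains taken as 1, can be sketched as:

```python
import math

def lambertian_gain(d, phi, psi, half_angle_deg, area, psi_c_deg=90.0):
    """Line-of-sight channel gain of a Lambertian LED:
    H(0) = (m+1)*A / (2*pi*d^2) * cos(phi)^m * cos(psi), psi within the FOV,
    where m = -ln(2)/ln(cos(SAAHP))."""
    if math.degrees(psi) > psi_c_deg:
        return 0.0                       # receiver outside the field of view
    m = -math.log(2) / math.log(math.cos(math.radians(half_angle_deg)))
    return (m + 1) * area / (2 * math.pi * d * d) * math.cos(phi) ** m * math.cos(psi)

# Receiver directly under a fixture: 2.5 m link, 60-degree SAAHP LED,
# 1 cm^2 photodetector, 20 W transmitted optical power (all assumed values).
pt = 20.0
gain = lambertian_gain(d=2.5, phi=0.0, psi=0.0, half_angle_deg=60, area=1e-4)
pr = pt * gain                           # received optical power, W
print(10 * math.log10(pr / 1e-3))        # received power in dBm
```

Sweeping the fixture coordinates through such a model and keeping the layout that maximizes the minimum received power is one way a TE-style placement search could be driven; the abstract does not give the paper's exact objective.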
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work through the Large Groups Project under Grant Number RGP.2/111/43; supported in part by the Agencia Estatal de Investigación, Ministerio de Ciencia e Innovación (MCIN/AEI/10.13039/501100011033), in part by the R+D+i Project under Grant PID2020-115323RB-C31, and in part by a grant from the Spanish Ministry of Economic Affairs and Digital Transformation and the European Union-NextGenerationEU under Grant UNICO-5G I+D/AROMA3D-Hybrid TSI-063000-2021-71.
Abstract: The computational complexity of resource allocation processes in cognitive radio networks (CRNs) is a major issue to be managed. Furthermore, the complicated solution of the optimal algorithm for handling resource allocation in CRNs makes it unsuitable for real-world applications where both cognitive users (CRs) and primary users (PUs) exist in the same geographical area. Hence, this work offers a primarily price-based power algorithm to reduce computational complexity in uplink scenarios while limiting interference to PUs to an allowable threshold. Compared to other frameworks proposed in the literature, this paper proposes a two-step approach to reduce the complexity of the proposed mathematical model. In the first step, the subcarriers are assigned to the users of the CRN; in the second, the cost function includes a pricing scheme to provide a better power control algorithm with improved reliability. The main contribution of this paper is to lessen the complexity of the proposed algorithm and to offer flexibility in controlling the interference produced to the users of the primary networks, which has been achieved by including a pricing function in the proposed cost function. Finally, the performance of the proposed power and subcarrier algorithm is confirmed for orthogonal frequency-division multiplexing (OFDM). Simulation results prove that the performance of the proposed algorithm is better than that of other algorithms, albeit with a lower complexity of O(NM) + O(N log N).
Funding: The authors are grateful to the Taif University Researchers Supporting Project number (TURSP-2020/36), Taif University, Taif, Saudi Arabia.
Abstract: Because network space is becoming more limited, the implementation of ultra-dense networks (UDNs) has the potential to enhance not only network coverage but also network throughput. Unmanned Aerial Vehicle (UAV) communications have recently garnered a lot of attention because they are extremely versatile and may be applied to a wide variety of contexts and purposes. In this article, a cognitive UAV is proposed as a solution for the wireless nodes of Internet of Things (IoT) ground terminals. In the IoT system, the UAV is utilised not only to determine how resources should be distributed but also to provide power to the wireless nodes. The quality of service (QoS) offered by the cognitive node is interpreted as a price-based utility function, formulated as a non-cooperative game in order to maximise customers' net utility functions. An energy-efficient non-cooperative game-theoretic power allocation with pricing strategy, abbreviated EE-NGPAP, is implemented in this study with two trajectories, spiral and sigmoidal, to facilitate effective power management in IoT wireless nodes. It is also demonstrated, theoretically and through simulations, that the Nash equilibrium exists and is unique. The proposed energy harvesting approach was shown, through simulations, to significantly reduce the average transmitted power, which is considered to agree with the objectives of 5G networks. To converge to the Nash Equilibrium (NE), the advised method needs only roughly 4 iterations, which makes it easier to use in real-world settings, where conditions are not always the same.
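EE-NGPAP itself is not reproduced in the abstract. To illustrate how a priced power-allocation game converges to a Nash equilibrium in a handful of best-response iterations, here is a generic sketch (not the paper's algorithm) using an assumed 3-node link-gain matrix and a log-rate utility minus a linear price on transmit power:

```python
import numpy as np

def best_response(p, i, h, price, noise=1e-3, p_max=1.0):
    """Closed-form maximizer of u_i = ln(1 + SINR_i) - price * p_i:
    p_i* = 1/price - (interference + noise)/h_ii, clipped to [0, p_max]."""
    interference = sum(h[i][j] * p[j] for j in range(len(p)) if j != i)
    return float(np.clip(1.0 / price - (interference + noise) / h[i][i],
                         0.0, p_max))

# Assumed link gains: strong direct links, weak symmetric cross-interference.
h = np.array([[1.0, 0.1, 0.1],
              [0.1, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
p = np.zeros(3)                          # start from zero transmit power
for it in range(1, 51):                  # iterate simultaneous best responses
    new = np.array([best_response(p, i, h, price=2.0) for i in range(3)])
    if np.max(np.abs(new - p)) < 1e-6:   # fixed point = Nash equilibrium
        p = new
        break
    p = new
print(it, p)                             # converges to the symmetric NE power
```

With weak cross-interference the best-response map is a contraction, so the iteration converges geometrically to the unique NE; the raised price term is what pulls the equilibrium power down, mirroring the abstract's energy-efficiency argument.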