The Internet of Things (IoT) and mobile technology have significantly transformed healthcare by enabling real-time monitoring and diagnosis of patients. Recognizing Medical-Related Human Activities (MRHA) is pivotal for healthcare systems, particularly for identifying actions critical to patient well-being. However, challenges such as high computational demands, low accuracy, and limited adaptability persist in Human Motion Recognition (HMR). While some studies have integrated HMR with IoT for real-time healthcare applications, limited research has focused on recognizing MRHA as essential for effective patient monitoring. This study proposes a novel HMR method tailored for MRHA detection, leveraging multi-stage deep learning techniques integrated with IoT. The approach employs EfficientNet to extract optimized spatial features from skeleton frame sequences using seven Mobile Inverted Bottleneck Convolution (MBConv) blocks, followed by Convolutional Long Short-Term Memory (ConvLSTM) to capture spatio-temporal patterns. A classification module with global average pooling, a fully connected layer, and a dropout layer generates the final predictions. The model is evaluated on the NTU RGB+D 120 and HMDB51 datasets, focusing on MRHA such as sneezing, falling, walking, and sitting. It achieves 94.85% accuracy for cross-subject evaluations and 96.45% for cross-view evaluations on NTU RGB+D 120, along with 89.22% accuracy on HMDB51. Additionally, the system integrates IoT capabilities using a Raspberry Pi and a GSM module, delivering real-time alerts via Twilio's SMS service to caregivers and patients. This scalable and efficient solution bridges the gap between HMR and IoT, advancing patient monitoring, improving healthcare outcomes, and reducing costs.
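A minimal Keras sketch of the kind of pipeline this abstract describes: per-frame EfficientNet feature extraction, a ConvLSTM layer for spatio-temporal fusion, then global average pooling, a fully connected layer, and dropout. The input size, sequence length, layer widths, and number of classes are illustrative assumptions, and EfficientNetB0 stands in for the seven-MBConv-block variant used by the authors.

```python
# Hedged sketch: EfficientNet spatial features + ConvLSTM temporal modelling.
# Frame size, sequence length, and layer widths are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, H, W, NUM_CLASSES = 16, 112, 112, 12   # assumed values

# Per-frame spatial feature extractor (EfficientNetB0 stands in for the
# seven-MBConv-block variant described in the paper).
backbone = tf.keras.applications.EfficientNetB0(
    include_top=False, weights=None, input_shape=(H, W, 3))

inputs = layers.Input(shape=(SEQ_LEN, H, W, 3))
x = layers.TimeDistributed(backbone)(inputs)          # (T, h', w', C) features
x = layers.ConvLSTM2D(64, (3, 3), padding="same")(x)  # spatio-temporal fusion
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dense(128, activation="relu")(x)           # fully connected layer
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```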
The explosive expansion of Internet of Things (IoT) systems has increased the need for strong and robust cybersecurity solutions, especially to curtail Distributed Denial of Service (DDoS) attacks, which can cripple critical infrastructure. The framework presented in this paper is a new hybrid scheme that combines deep learning-based traffic classification with blockchain-enabled mitigation to provide intelligent, decentralized, real-time DDoS countermeasures in an IoT network. The proposed model extracts deep features with a Convolutional Neural Network (CNN) architecture, fuses them with statistical features, and trains traditional machine-learning algorithms on the combined set, which yields more accurate detection than statistical features alone. A permissioned blockchain records threat cases immutably and automatically executes mitigation measures through smart contracts, providing transparency and resilience. When tested on two benchmark datasets, BoT-IoT and IoT-23, the framework obtains a maximum F1-score of 97.5% with only a 1.8% false positive rate, comparing favorably with other solutions in both effectiveness and response time. Our findings support the feasibility of the method as an extensible and secure paradigm for next-generation IoT security, with particular utility in mission-critical or resource-constrained settings. The work is a substantial milestone toward autonomous and trustworthy DDoS mitigation through intelligent learning and decentralized enforcement.
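A hedged sketch of the feature-fusion step described above: deep features (for example, from a CNN's penultimate layer) are concatenated with statistical flow features and passed to a traditional classifier. The arrays and the RandomForest classifier are placeholders, not the paper's exact configuration.

```python
# Hedged sketch of the feature-fusion stage. Synthetic arrays stand in for
# CNN embeddings and per-flow statistics; RandomForest is a stand-in classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n_flows = 2000
deep_feats = rng.normal(size=(n_flows, 128))   # CNN embeddings (placeholder)
stat_feats = rng.normal(size=(n_flows, 10))    # packet-rate, size stats, etc.
labels = rng.integers(0, 2, size=n_flows)      # 0 = benign, 1 = DDoS (toy labels)

fused = np.hstack([deep_feats, stat_feats])    # feature fusion
X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, test_size=0.3,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("F1:", f1_score(y_te, clf.predict(X_te)))
```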
Internet of Things networks often suffer from early node failures and short lifespans due to energy limits, and traditional routing methods are not sufficient. This work proposes a new hybrid algorithm called ACOGA, which combines Ant Colony Optimization (ACO) with a Greedy Algorithm (GA): ACO discovers efficient paths while the greedy component makes quick local decisions, improving energy use and performance. ACOGA outperforms the Hybrid Energy-Efficient (HEE) and Adaptive Lossless Data Compression (ALDC) algorithms. After 500 rounds, only 5% of ACOGA's nodes are dead, compared to 15% for HEE and 20% for ALDC. The network using ACOGA runs for 1200 rounds before the first nodes fail, whereas HEE lasts 900 rounds and ALDC only 850. ACOGA saves at least 15% more energy by better distributing the load and achieves a 98% packet delivery rate. The method works well in heterogeneous IoT networks such as Smart Water Management Systems (SWMS), whose devices have different power levels and communication ranges. The proposed model was simulated in MATLAB, and the results show that it outperforms the existing models.
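A toy sketch of an ACO routing loop with a greedy next-hop rule, in the spirit of the ACO-plus-greedy hybrid described above. The five-node topology, link energy costs, and all parameters are invented purely for illustration and are not the ACOGA formulation.

```python
# Hedged toy sketch: pheromone-guided path construction with a greedy
# exploitation rule. Topology, costs, and parameters are illustrative only.
import random

GRAPH = {                      # node -> {neighbour: link energy cost}
    "A": {"B": 2.0, "C": 4.0},
    "B": {"C": 1.5, "D": 3.0},
    "C": {"D": 2.0, "E": 3.5},
    "D": {"E": 1.0},
    "E": {},
}
SOURCE, SINK = "A", "E"
pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}
ALPHA, BETA, RHO, Q0 = 1.0, 2.0, 0.1, 0.7   # Q0: probability of greedy choice

def build_path():
    path, node = [SOURCE], SOURCE
    while node != SINK and GRAPH[node]:
        nbrs = list(GRAPH[node])
        scores = [pheromone[(node, n)] ** ALPHA * (1.0 / GRAPH[node][n]) ** BETA
                  for n in nbrs]
        if random.random() < Q0:                       # greedy decision
            node = nbrs[scores.index(max(scores))]
        else:                                          # ACO roulette wheel
            node = random.choices(nbrs, weights=scores)[0]
        path.append(node)
    return path

def path_cost(path):
    return sum(GRAPH[u][v] for u, v in zip(path, path[1:]))

best = None
for _ in range(100):
    p = build_path()
    if p[-1] != SINK:
        continue
    if best is None or path_cost(p) < path_cost(best):
        best = p
    for edge in pheromone:                             # evaporation
        pheromone[edge] *= (1 - RHO)
    for u, v in zip(best, best[1:]):                   # reinforce best route
        pheromone[(u, v)] += 1.0 / path_cost(best)

print("best route:", best, "cost:", path_cost(best))
```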
Cloud data centres face a growing energy-management problem due to their constantly increasing size and complexity and their enormous energy consumption. Energy management is a critical challenge in cloud data centres and an important research concern. In this paper, we propose a cuckoo search (CS)-based optimisation technique for virtual machine (VM) selection and a novel placement algorithm that considers different constraints. An energy consumption model and a simulation model have been implemented for efficient VM selection. The proposed CSOA-VM model not only lessens service level agreement (SLA) violations but also minimises VM migrations. It also saves energy: the performance analysis shows an energy consumption of 1.35 kWh, an SLA violation metric of 9.2, and about 268 VM migrations. Thus, energy consumption improves by about 1.8% and SLA violations are reduced by 2.1% in comparison with existing techniques.
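A hedged sketch of a basic cuckoo-search loop with Lévy flights, minimising a toy continuous energy surrogate. Real VM selection and placement is a discrete problem with SLA constraints; the objective function and all parameters here are illustrative only, not the CSOA-VM model.

```python
# Hedged sketch of cuckoo search (Levy flights + nest abandonment) on a toy
# energy surrogate. The objective and parameters are illustrative assumptions.
import numpy as np

def energy(x):                       # stand-in for the energy-consumption model
    return np.sum((x - 0.3) ** 2)

def levy_step(dim, beta=1.5):        # Mantegna's algorithm for Levy flights
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0, sigma, dim)
    v = np.random.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

rng = np.random.default_rng(1)
n_nests, dim, pa, iters = 15, 10, 0.25, 200
nests = rng.random((n_nests, dim))
fitness = np.array([energy(n) for n in nests])

for _ in range(iters):
    best = nests[fitness.argmin()]
    for i in range(n_nests):                        # Levy-flight moves
        cand = nests[i] + 0.01 * levy_step(dim) * (nests[i] - best)
        f = energy(cand)
        if f < fitness[i]:
            nests[i], fitness[i] = cand, f
    worst = fitness.argsort()[-int(pa * n_nests):]  # abandon the worst nests
    nests[worst] = rng.random((len(worst), dim))
    fitness[worst] = [energy(n) for n in nests[worst]]

print("best energy:", fitness.min())
```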
Reliable Cluster Head (CH) selection-based routing protocols are necessary for increasing packet transmission efficiency with optimal path discovery that does not degrade transmission reliability. In this paper, the Hybrid Golden Jackal and Improved Whale Optimization Algorithm (HGJIWOA) is proposed as an effective and optimal routing protocol that guarantees efficient routing of data packets along the routes established between the CHs and the movable sink. HGJIWOA includes a Dynamic Lens-Imaging Learning Strategy phase and novel update rules for determining the reliable route needed for data packet broadcasting, attained through fitness-measure-estimation-based CH selection. CH selection, achieved using the Golden Jackal Optimization Algorithm (GJOA), depends on the factors of maintainability, consistency, trust, delay, and energy. The adopted GJOA plays a dominant role in determining the optimal routing path based on reduced delay and minimal distance. The Improved Whale Optimization Algorithm (IWOA) is then used to forward data from the chosen CHs to the base station (BS) via an optimized route based on energy and distance. A reliable route maintenance process further aids in deciding whether data should be transmitted over the selected route or re-routed. Simulations of the proposed HGJIWOA mechanism with different numbers of sensor nodes confirmed a mean throughput improvement of 18.21%, 19.64% higher sustained residual energy, and a 21.82% reduction in end-to-end delay relative to competing CH selection approaches.
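A hedged sketch of a weighted cluster-head fitness measure of the general kind the protocol relies on (energy, distance, trust, delay). The node table, weights, and scoring rule are invented for illustration and are not the HGJIWOA fitness formulation.

```python
# Hedged sketch of fitness-based CH selection. Weights, ranges, and the node
# table are illustrative assumptions, not the paper's exact fitness measure.
import numpy as np

rng = np.random.default_rng(2)
n_nodes = 20
nodes = {
    "residual_energy": rng.uniform(0.2, 1.0, n_nodes),  # normalised 0..1
    "dist_to_sink":    rng.uniform(10, 100, n_nodes),   # metres
    "trust":           rng.uniform(0.5, 1.0, n_nodes),
    "delay":           rng.uniform(1, 20, n_nodes),     # ms
}

def fitness(i, w=(0.4, 0.25, 0.2, 0.15)):
    # Higher energy/trust and lower distance/delay give a better (higher) score.
    return (w[0] * nodes["residual_energy"][i]
            + w[1] * (1 - nodes["dist_to_sink"][i] / 100)
            + w[2] * nodes["trust"][i]
            + w[3] * (1 - nodes["delay"][i] / 20))

scores = np.array([fitness(i) for i in range(n_nodes)])
cluster_heads = scores.argsort()[::-1][:4]   # pick the 4 best-scoring nodes
print("selected CHs:", cluster_heads, "scores:", scores[cluster_heads].round(3))
```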
During its growth stages, a plant is exposed to various diseases, and early detection of crop diseases is a major challenge in the horticulture industry. Crop infections can harm total crop yield and reduce farmers' income if not identified early. Today's accepted method involves a professional plant pathologist diagnosing the disease by visual inspection of the afflicted plant leaves. This is an excellent use case for Community Assessment and Treatment Services (CATS), because the manual disease diagnosis process is lengthy and the accuracy of identification is directly proportional to the skills of the pathologist. An alternative to conventional Machine Learning (ML) methods, which require manual identification of parameters for exact results, is to develop a prototype that can perform classification without pre-processing. To automatically diagnose tomato leaf disease, this research proposes a hybrid model using a Convolutional Auto-Encoder (CAE) network and the CNN-based deep learning architecture of DenseNet. To date, none of the modern systems described in this paper have a combined model based on DenseNet, CAE, and a Convolutional Neural Network (CNN) to diagnose the ailments of tomato leaves automatically. The models were trained on a dataset obtained from the Plant Village repository. The dataset consisted of 9920 tomato leaves, and the model-to-model accuracy ratio was 98.35%. Unlike other approaches discussed in this paper, this hybrid strategy requires fewer training components. Therefore, the training time to classify plant diseases with the trained algorithm, as well as the training time to automatically detect the ailments of tomato leaves, is significantly reduced.
The integration of IoT and Deep Learning (DL) has significantly advanced real-time health monitoring and predictive maintenance in prognostics and health management (PHM). Electrocardiograms (ECGs) are widely used for cardiovascular disease (CVD) diagnosis, but fluctuating signal patterns make classification challenging. Computer-assisted automated diagnostic tools that enhance ECG signal categorization using sophisticated algorithms and machine learning are helping healthcare practitioners manage larger patient populations. With this motivation, the study proposes a DL framework leveraging the PTB-XL ECG dataset to improve CVD diagnosis. Deep Transfer Learning (DTL) techniques extract features, followed by feature fusion to eliminate redundancy and retain the most informative features. Using the African Vulture Optimization Algorithm (AVOA) for feature selection is more effective than standard methods, as it offers an ideal balance between exploration and exploitation that results in an optimal set of features, improving classification performance while reducing redundancy. Various machine learning classifiers, including Support Vector Machine (SVM), eXtreme Gradient Boosting (XGBoost), Adaptive Boosting (AdaBoost), and Extreme Learning Machine (ELM), are used for further classification. Additionally, an ensemble model is developed to further improve accuracy. Experimental results demonstrate that the proposed model achieves the highest accuracy of 96.31%, highlighting its effectiveness in enhancing CVD diagnosis.
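A hedged sketch of the final classification stage only: fused and selected ECG features feeding several classifiers combined in a voting ensemble. Synthetic features stand in for the DTL-extracted, AVOA-selected feature set, and GradientBoosting replaces XGBoost/ELM so the example needs only scikit-learn.

```python
# Hedged sketch of an ensemble over fused/selected features. Features, labels,
# and the choice of base classifiers are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              VotingClassifier)
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
X = rng.normal(size=(1500, 40))                  # selected feature vectors
y = (X[:, :5].sum(axis=1) > 0).astype(int)       # toy binary CVD label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ensemble = VotingClassifier(
    estimators=[("svm", SVC(probability=True)),
                ("ada", AdaBoostClassifier()),
                ("gb", GradientBoostingClassifier())],
    voting="soft")                               # soft voting over class scores
ensemble.fit(X_tr, y_tr)
print("ensemble accuracy:", accuracy_score(y_te, ensemble.predict(X_te)))
```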
The Sine and Wormhole Energy Whale Optimization Algorithm (SWEWOA) is an advanced method for solving Optimal Power Flow (OPF) problems in power systems equipped with Flexible AC Transmission System (FACTS) devices, including the Thyristor-Controlled Series Compensator (TCSC), Thyristor-Controlled Phase Shifter (TCPS), and Static Var Compensator (SVC). SWEWOA extends the Whale Optimization Algorithm (WOA) with sine and wormhole energy features, improving exploration and exploitation for efficient convergence in complex non-linear OPF problems. SWEWOA is evaluated on the IEEE 30-bus test system under static and dynamic loading scenarios, where it outperforms five contemporary algorithms: Adaptive Chaotic WOA (ACWOA), WOA, Chaotic WOA (CWOA), Sine Cosine Algorithm Differential Evolution (SCADE), and Hybrid Grey Wolf Optimization (HGWO). The results show that SWEWOA delivers superior generation cost reduction, performing at least 0.9% better than the other algorithms. SWEWOA also achieves the lowest minimum power loss (P_(loss,min)) of all tested algorithms, leading to better system energy efficiency. Under dynamic loading, SWEWOA achieves a 4.38% reduction in gross costs, demonstrating its capability to handle different operating conditions. The algorithm ranks first in Friedman Rank Test (FRT) assessments across multiple performance metrics, confirming its consistent reliability and stability under changing power demands. Repeated simulations show that SWEWOA produces mean generation costs (C_(gen,min)) and mean power loss values (P_(loss,min)) with small deviations, indicating its capability to deliver cost-effective solutions in every run. These results demonstrate SWEWOA's potential as an advanced optimization solution for power system operations.
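A hedged sketch of a plain Whale Optimization Algorithm loop on a toy cost function. The sine/wormhole-energy enhancements and the actual OPF model (generation cost, power-flow constraints, FACTS device settings) are not reproduced here; the bounds and parameters are assumptions.

```python
# Hedged sketch of plain WOA (encircling, spiral, and exploration updates) on a
# toy objective. This is not SWEWOA and not an OPF model.
import numpy as np

def cost(x):                         # stand-in for the OPF objective
    return np.sum(x ** 2) + 10 * np.sum(1 - np.cos(x))

rng = np.random.default_rng(4)
n_whales, dim, iters, b = 20, 8, 300, 1.0
pos = rng.uniform(-5, 5, (n_whales, dim))
best = pos[np.argmin([cost(p) for p in pos])].copy()

for t in range(iters):
    a = 2 - 2 * t / iters                        # linearly decreasing control
    for i in range(n_whales):
        r1, r2 = rng.random(dim), rng.random(dim)
        A, C, p = 2 * a * r1 - a, 2 * r2, rng.random()
        if p < 0.5:
            if np.all(np.abs(A) < 1):            # exploit: encircle the best
                pos[i] = best - A * np.abs(C * best - pos[i])
            else:                                # explore: follow a random whale
                rand = pos[rng.integers(n_whales)]
                pos[i] = rand - A * np.abs(C * rand - pos[i])
        else:                                    # spiral bubble-net update
            l = rng.uniform(-1, 1, dim)
            pos[i] = (np.abs(best - pos[i]) * np.exp(b * l)
                      * np.cos(2 * np.pi * l) + best)
        pos[i] = np.clip(pos[i], -5, 5)
        if cost(pos[i]) < cost(best):
            best = pos[i].copy()

print("best cost:", round(cost(best), 4))
```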
As quantum computing continues to advance, traditional cryptographic methods are increasingly challenged, particularly when it comes to securing critical systems like Supervisory Control and Data Acquisition (SCADA) systems. These systems are essential for monitoring and controlling industrial operations, making their security paramount. A key threat arises from Shor's algorithm, a powerful quantum computing tool that can compromise current hash functions, leading to significant concerns about data integrity and confidentiality. To tackle these issues, this article introduces a novel Quantum-Resistant Hash Algorithm (QRHA) known as the Modular Hash Learning Algorithm (MHLA). This algorithm is meticulously crafted to withstand potential quantum attacks by incorporating advanced mathematical and algorithmic techniques, enhancing its overall security framework. Our research delves into the effectiveness of MHLA in defending against both traditional and quantum-based threats, with a particular emphasis on its resilience to Shor's algorithm. The findings from our study demonstrate that MHLA significantly enhances the security of SCADA systems in the context of quantum technology. By ensuring that sensitive data remains protected and confidential, MHLA not only fortifies individual systems but also contributes to the broader efforts of safeguarding industrial and infrastructure control systems against future quantum threats. Our evaluation demonstrates that MHLA improves security by 38% against quantum attack simulations compared to traditional hash functions while maintaining a computational efficiency of O(m⋅n⋅k + v + n). The algorithm achieved a 98% success rate in detecting data tampering during integrity testing. These findings underline MHLA's effectiveness in enhancing SCADA system security amidst evolving quantum technologies. This research represents a crucial step toward developing more secure cryptographic systems that can adapt to the rapidly changing technological landscape, ultimately ensuring the reliability and integrity of critical infrastructure in an era where quantum computing poses a growing risk.
The integration of the Internet of Things (IoT) into healthcare systems improves patient care, boosts operational efficiency, and contributes to cost-effective healthcare delivery. However, overcoming several associated challenges, such as data security, interoperability, and ethical concerns, is crucial to realizing the full potential of IoT in healthcare. Real-time anomaly detection plays a key role in protecting patient data and maintaining device integrity amidst the additional security risks posed by interconnected systems. In this context, this paper presents a novel method for healthcare data privacy analysis. The technique is based on the identification of anomalies in cloud-based IoT networks and is optimized using explainable artificial intelligence. For anomaly detection, the Radial Boltzmann Gaussian Temporal Fuzzy Network (RBGTFN) performs the information privacy analysis for healthcare data, and Remora Colony Swarm Optimization is then used to optimize the network. An experimental study evaluates the model's performance in identifying anomalies across a variety of healthcare data, measuring accuracy, precision, latency, Quality of Service (QoS), and scalability. The findings show that the suggested model obtains a remarkable 95% precision, 93% latency, 89% quality of service, 98% detection accuracy, and 96% scalability.
Cardiovascular diseases (CVDs) remain one of the foremost causes of death globally; hence the need for advanced automated diagnostic solutions for early detection and intervention. Traditional auscultation of cardiovascular sounds is heavily reliant on clinical expertise and subject to high variability. To counter this limitation, this study proposes an AI-driven classification system for cardiovascular sounds in which deep learning techniques automate the detection of abnormal heartbeats. We employ FastAI vision-learner-based convolutional neural networks (CNNs), including ResNet, DenseNet, VGG, ConvNeXt, SqueezeNet, and AlexNet, to classify heart sound recordings. Instead of raw waveform analysis, the proposed approach transforms preprocessed cardiovascular audio signals into spectrograms, which are well suited to capturing temporal and frequency-wise patterns. The models are trained on the PASCAL Cardiovascular Challenge dataset while taking into consideration recording variations, noise levels, and acoustic distortions. To demonstrate generalization, external validation was performed on Google's AudioSet Heartbeat Sound dataset, which is rich in cardiovascular sounds. Comparative analysis revealed that DenseNet-201, ConvNeXt Large, and ResNet-152 delivered superior performance to the other architectures, achieving an accuracy of 81.50%, a precision of 85.50%, and an F1-score of 84.50%. We also performed statistical significance testing, such as the Wilcoxon signed-rank test, to validate performance improvements over traditional classification methods. Beyond the technical contributions, the research underscores clinical integration, outlining a pathway in which the proposed system can augment conventional electronic stethoscopes and telemedicine platforms in AI-assisted diagnostic workflows. We also discuss in detail issues of computational efficiency, model interpretability, and ethical considerations, particularly algorithmic bias stemming from imbalanced datasets and the need for real-time processing in clinical settings. The study describes a scalable, automated system combining deep learning, spectrogram-based feature extraction, and external validation that can assist healthcare providers in the early and accurate detection of cardiovascular disease. Such AI-driven solutions can improve access, reduce delays in diagnosis, and ultimately help reduce the continued global burden of heart disease.
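A hedged sketch of the spectrogram front end: a synthetic heart-sound signal is converted into a log-spectrogram image, which a CNN then consumes. The sampling rate is an assumption, and a tiny CNN stands in for the ResNet/DenseNet/ConvNeXt backbones used in the study.

```python
# Hedged sketch: audio -> log-spectrogram -> CNN classifier (normal/abnormal).
# Signal, sampling rate, and network size are illustrative placeholders.
import numpy as np
from scipy.signal import spectrogram
import tensorflow as tf
from tensorflow.keras import layers, models

fs = 2000                                                 # assumed sampling rate (Hz)
signal = np.random.default_rng(5).normal(size=fs * 3)     # 3 s placeholder audio
f, t, Sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=128)
spec = np.log1p(Sxx)[None, ..., None]                     # (1, freq, time, 1) image

cnn = models.Sequential([
    layers.Input(shape=spec.shape[1:]),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation="softmax"),                # normal vs. abnormal
])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
print(cnn.predict(spec).shape)                            # untrained forward pass: (1, 2)
```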
Protecting Supervisory Control and Data Acquisition-Industrial Internet of Things (SCADA-IIoT) systems against intruders has become essential, since industrial control systems now oversee critical infrastructure and cyber attackers increasingly target them. Because they connect physical assets with digital networks, SCADA-IIoT systems face substantial risks from multiple attack types, including Distributed Denial of Service (DDoS), spoofing, and more advanced intrusion methods. Previous research in this field offers insufficient solutions, as current intrusion detection systems lack the accuracy, scalability, and adaptability needed for IIoT environments. This paper introduces CyberFortis, a novel cybersecurity framework aimed at detecting and preventing cyber threats in SCADA-IIoT systems. CyberFortis presents two key innovations: first, the Siamese Double Deep Q-Network with Autoencoders (Siamdqn-AE) FusionNet, which enhances intrusion detection by combining deep Q-Networks with autoencoders for improved attack detection and feature extraction; and second, the PopHydra Optimiser, an innovative approach to computing reinforcement learning discount factors for better model performance and convergence. Combining Siamese deep Q-Networks with autoencoders yields a system that detects different types of attacks more effectively and adapts to new challenges. CyberFortis outperforms current state-of-the-art attack detection systems, achieving higher accuracy, precision, recall, and F1-score on the CICIoT 2023, UNSW-NB 15, and WUSTL-IIoT datasets. The proposed framework achieves a 97.5% accuracy rate, indicating its potential as an effective solution for SCADA-IIoT cybersecurity against emerging threats. The research confirms that the proposed security and resilience methods successfully protect vital industrial control systems within their operational environments.
This article presents an innovative approach to automatic rule discovery for data transformation tasks leveraging XGBoost, a machine learning algorithm renowned for its efficiency and performance. The framework proposed herein utilizes the fusion of diversified feature formats, specifically metadata, textual, and pattern features. The goal is to enhance the system's ability to discern and generalize transformation rules from source to destination formats in varied contexts. Firstly, the article delves into the methodology for extracting these distinct features from raw data and the pre-processing steps undertaken to prepare the data for the model. Subsequent sections expound on the mechanism of feature optimization using Recursive Feature Elimination (RFE) with linear regression, aiming to retain the most contributive features and eliminate redundant or less significant ones. The core of the research revolves around the deployment of the XGBoost model for training, using the prepared and optimized feature sets. The article presents a detailed overview of the mathematical model and algorithmic steps behind this procedure. Finally, the process of rule discovery (prediction phase) by the trained XGBoost model is explained, underscoring its role in real-time, automated data transformations. By employing machine learning, and particularly the XGBoost model, in the context of Business Rule Engine (BRE) data transformation, the article underscores a paradigm shift towards more scalable, efficient, and less human-dependent data transformation systems. This research opens doors for further exploration into automated rule discovery systems and their applications in various sectors.
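A hedged sketch of the two core steps described above: RFE with a linear-regression estimator to prune the feature set, followed by an XGBoost classifier that learns a source-to-destination transformation rule. The features and labels are synthetic, and the xgboost package is assumed to be installed.

```python
# Hedged sketch: RFE(LinearRegression) feature pruning + XGBoost rule learner.
# Synthetic data; not the article's actual feature pipeline.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 30))                 # metadata/textual/pattern features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # toy "transformation rule" label

selector = RFE(estimator=LinearRegression(), n_features_to_select=10)
X_sel = selector.fit_transform(X, y)            # keep the 10 most contributive

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.25,
                                          random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X_tr, y_tr)
print("rule-prediction accuracy:", model.score(X_te, y_te))
```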
One of the most dangerous safety hazards in underground coal mines is roof falls during retreat mining. Roof falls may cause life-threatening and non-fatal injuries to miners and impede mining and transportation operations. As a result, a reliable roof fall prediction model is essential to tackle such challenges. The parameters that substantially impact roof falls are ill-defined and intangible, making this an uncertain and challenging research issue. The National Institute for Occupational Safety and Health assembled a national database of roof performance from 37 coal mines to explore the factors contributing to roof falls. The data acquired for these 37 mines are limited due to several restrictions, which increases the likelihood of incompleteness. Fuzzy logic is a technique for coping with ambiguity, incompleteness, and uncertainty. Therefore, this paper presents a fuzzy inference method that employs a genetic algorithm to create fuzzy rules from 109 records of roof fall data and pattern search to refine the membership functions of the parameters. The performance of the deployed model is evaluated using statistical measures such as the Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and coefficient of determination (R²). Based on these criteria, the suggested model outperforms existing models, precisely predicting roof fall rates with fewer fuzzy rules.
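A hedged sketch of the evaluation step only: RMSE, MAE, and R² computed for predicted versus observed roof fall rates. The values are placeholders; the fuzzy inference system itself is not reproduced here.

```python
# Hedged sketch of the reported evaluation metrics on placeholder values.
import numpy as np

observed  = np.array([0.8, 1.2, 0.5, 2.0, 1.1, 0.3])   # roof falls (toy values)
predicted = np.array([0.9, 1.0, 0.6, 1.8, 1.3, 0.4])

err  = predicted - observed
rmse = np.sqrt(np.mean(err ** 2))                        # Root Mean Square Error
mae  = np.mean(np.abs(err))                              # Mean Absolute Error
r2   = 1 - np.sum(err ** 2) / np.sum((observed - observed.mean()) ** 2)
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  R^2={r2:.3f}")
```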
Cloud computing is an emerging technology in the rapidly growing IT world. Its adoption is spreading quickly, from very large business organizations to small institutions, because of advanced features such as the SaaS, PaaS, and IaaS service models. Consequently, many organizations are now trying to implement cloud-based ERP systems to enjoy the benefits of cloud computing. Implementing any ERP system usually presents many challenges for an organization, so this research shows how such a cloud system can be implemented with relative ease. By using this ERP system, an organization can benefit in many ways; Small and Medium Enterprises (SMEs) in particular can gain the greatest possible benefits from it.
Analyzing colon cancer data is essential for improving early detection, treatment outcomes, public health initiatives, research efforts, and overall patient care, ultimately leading to better outcomes and a reduced burden associated with this disease. The quality of a prediction depends on the quality of the available dataset, so its characteristics must be analyzed before a prediction algorithm is applied. This research presents a comprehensive framework for addressing data imbalance in colon cancer datasets, which, together with high dimensionality, has been a significant challenge for colon cancer prediction in previous studies. Both characteristics are important preprocessing concerns: balancing adjusts the proportion of data points in each class label, and feature selection chooses the strongest features from the available data space. This study aims to improve the performance of popular tree, rule, and lazy (K-nearest neighbor, KNN) classifiers and the support vector machine (SVM) algorithm after addressing the imbalance issue and applying various feature selection methods such as chi-square, symmetrical uncertainty, correlation-based feature selection (CFS) subset, and classifier subset evaluators. The proposed framework shows that, after balancing the dataset, all the algorithms performed better with all of the applied feature selection methods. Among all methods, JRip records 85.71% accuracy with classifier subset evaluators, Ridor 84.52% with CFS, J48 83.33% with both CFS and classifier subset evaluators, SimpleCart 84.52% with classifier subset evaluators, KNN 91.66% with chi-square and CFS, and SVM 92.85% with symmetrical uncertainty.
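A hedged sketch of the preprocessing the study emphasises: upsample the minority class to balance the data, apply chi-square feature selection, and then train a KNN classifier. Synthetic non-negative features stand in for the colon cancer data, and simple resampling stands in for dedicated balancing techniques.

```python
# Hedged sketch: class balancing + chi-square feature selection + KNN.
# Data shapes and the choice of k are illustrative assumptions.
import numpy as np
from sklearn.utils import resample
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X_maj, X_min = rng.random((180, 50)), rng.random((40, 50))   # imbalanced classes

X_min_up = resample(X_min, n_samples=len(X_maj), random_state=0)  # balance classes
X = np.vstack([X_maj, X_min_up])
y = np.array([0] * len(X_maj) + [1] * len(X_min_up))

X_sel = SelectKBest(chi2, k=15).fit_transform(X, y)   # chi-square selection
knn = KNeighborsClassifier(n_neighbors=5)
print("CV accuracy:", cross_val_score(knn, X_sel, y, cv=5).mean().round(3))
```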
In today's world, image processing techniques play a crucial role in the prognosis and diagnosis of various diseases due to the development of several precise and accurate methods for medical images. Automated analysis of medical images is essential for doctors, as manual investigation often leads to inter-observer variability. This research aims to enhance healthcare by enabling the early detection of diabetic retinopathy through an efficient image processing framework. The proposed hybridized method combines the Modified Inertia Weight Particle Swarm Optimization (MIWPSO) and Fuzzy C-Means clustering (FCM) algorithms. Traditional FCM does not incorporate spatial neighborhood features, making it highly sensitive to noise, which significantly affects segmentation output. Our method incorporates a modified FCM that includes spatial functions in the fuzzy membership matrix to eliminate noise. The results demonstrate that the proposed FCM-MIWPSO method achieves highly precise and accurate medical image segmentation. Furthermore, segmented images are classified as benign or malignant using the Decision Tree-Based Temporal Association Rule (DT-TAR) algorithm. Comparative analysis with existing state-of-the-art models indicates that the proposed FCM-MIWPSO segmentation technique achieves a remarkable accuracy of 98.42% on the dataset, highlighting its significant impact on improving diagnostic capabilities in medical imaging.
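A hedged sketch of plain fuzzy C-means on one-dimensional pixel intensities; the spatial membership term and the MIWPSO initialisation described in the abstract are omitted, and the toy data and parameters are assumptions.

```python
# Hedged sketch of standard FCM (membership and centre updates) on toy data.
import numpy as np

rng = np.random.default_rng(8)
pixels = np.concatenate([rng.normal(0.2, 0.05, 300),
                         rng.normal(0.7, 0.05, 300)])[:, None]  # toy intensities
c, m, iters = 2, 2.0, 50                      # clusters, fuzzifier, iterations
centers = rng.random((c, 1))

for _ in range(iters):
    # dist[n, i]: distance of pixel n to centre i (small epsilon avoids /0)
    dist = np.abs(pixels[:, None, :] - centers[None, :, :]).sum(-1) + 1e-9
    ratio = (dist[:, :, None] / dist[:, None, :]) ** (2 / (m - 1))
    u = 1.0 / ratio.sum(axis=2)               # fuzzy memberships, rows sum to 1
    um = u ** m
    centers = (um.T @ pixels) / um.sum(axis=0)[:, None]   # update cluster centres

labels = u.argmax(axis=1)
print("cluster centres:", centers.ravel().round(3),
      "cluster sizes:", np.bincount(labels))
```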
In recent decades, intelligent transportation systems (ITS) have improved drivers' safety and shared information (such as traffic congestion and accidents) very efficiently. However, the privacy of vehicles and the security of event information remain major concerns. Securely sharing event information without compromising the trusted third party (TTP), together with secure data storage, is the main issue in ITS, and blockchain technologies can resolve this problem. A blockchain-based protocol for secure event sharing and vehicle authentication has previously been published; it addresses the safe storage of event information, but vehicle authentication depends solely on a cloud server, so the scheme follows a partially decentralized architecture. This paper proposes a novel decentralized architecture for the vehicular ad-hoc network (VANET) without a cloud server, together with a protocol for securing event information and authenticating vehicles using the blockchain mechanism. In this protocol, a registered user accesses the event information securely from the InterPlanetary File System (IPFS); we incorporate IPFS along with blockchain to store the information in a fully distributed manner. The proposed protocol is compared with the state of the art, and the comparison shows desirable security at a reasonable cost. An evaluation of the proposed smart contract in terms of cost (gas) is also discussed.
Supporting a mobile sink is a challenging task for wireless sensor networks (WSNs). In this paper, we design an efficient routing protocol for data gathering in WSNs with a single mobile sink or multiple mobile sinks. In this process, a biased random walk method is used to determine the next position of the sink. Then, rendezvous point selection with a splitting-tree technique is used to find the optimal data transmission path. If the sink moves within range of a rendezvous point, it receives the gathered data; if it moves out of range, a relay node is selected from the neighbours to relay packets from the rendezvous point to the sink. The proposed algorithm reduces signalling overhead and mitigates the triangular routing problem. Here the sink acts as a vehicle and collects data from the sensors. The results show that the proposed model effectively supports sink mobility with lower overhead and delay than the Intelligent Agent-based Routing (IAR) protocol, and it also increases reliability and delivery ratio as the number of sources increases.
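A hedged toy sketch of a biased random walk for choosing the sink's next position: candidate moves are weighted toward cells with more uncollected data. The grid, bias term, and density map are invented for illustration and are not the paper's exact mobility rule.

```python
# Hedged toy sketch of a biased (non-uniform) random walk for sink mobility.
import random

GRID = 10                                   # 10 x 10 field
sink = (5, 5)
data_density = {(x, y): random.random() for x in range(GRID) for y in range(GRID)}

def next_position(pos, bias=2.0):
    x, y = pos
    candidates = [(x + dx, y + dy) for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                  if 0 <= x + dx < GRID and 0 <= y + dy < GRID]
    # Bias the walk toward neighbouring cells with more uncollected data.
    weights = [data_density[c] ** bias + 1e-6 for c in candidates]
    return random.choices(candidates, weights=weights)[0]

for step in range(5):
    sink = next_position(sink)
    data_density[sink] = 0.0                # data at the visited cell is collected
    print("step", step + 1, "sink at", sink)
```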
BACKGROUND: Liver disease refers to any pathology that can harm or destroy the liver or prevent it from functioning normally. The global community has recently witnessed an increase in the mortality rate due to liver disease, which could be attributed to many factors, among them human habits, awareness issues, poor healthcare, and late detection. To curb the growing threat from liver disease, early detection is critical to help reduce the risks and improve treatment outcomes. Emerging technologies such as machine learning, as shown in this study, can be deployed to assist in enhancing its prediction and treatment. AIM: To present a more efficient system for timely prediction of liver disease using a hybrid eXtreme Gradient Boosting model with hyperparameter tuning, with a view to assisting early detection and diagnosis and reducing the risks and mortality associated with the disease. METHODS: The dataset used in this study consisted of 416 people with liver problems and 167 with no such history. The data were collected from the state of Andhra Pradesh, India, through https://www.kaggle.com/datasets/uciml/indian-liver-patientrecords. The population was divided into two sets depending on the disease state of the patient, and this binary information was recorded in the attribute "is_patient". RESULTS: The chi-square automated interaction detection and classification and regression trees models achieved accuracy levels of 71.36% and 73.24%, respectively, which was much better than the conventional method. The proposed solution would assist patients and physicians in tackling the problem of liver disease and ensuring that cases are detected early, preventing progression to cirrhosis (scarring) and enhancing patient survival. The study showed the potential of machine learning in healthcare, especially as it concerns disease prediction and monitoring. CONCLUSION: This study contributed to knowledge of machine learning applications in health and to efforts toward combating the problem of liver disease. However, relevant authorities have to invest more in machine learning research and other health technologies to maximize their potential.
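A hedged sketch of hyperparameter tuning for an XGBoost liver-disease classifier with grid search. The column names, parameter grid, and synthetic data frame are assumptions made for illustration; the study itself uses the Indian Liver Patient Records data with "is_patient" as the target, and the xgboost package is assumed to be installed.

```python
# Hedged sketch: grid-searched XGBoost on a synthetic stand-in for the
# liver-patient data. Columns, grid, and labels are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(9)
df = pd.DataFrame({
    "age": rng.integers(20, 80, 583),
    "total_bilirubin": rng.gamma(2.0, 1.0, 583),
    "alkaline_phosphotase": rng.normal(290, 80, 583),
    "albumin": rng.normal(3.1, 0.5, 583),
})
df["is_patient"] = (df["total_bilirubin"] > 2.5).astype(int)   # toy label

X_tr, X_te, y_tr, y_te = train_test_split(
    df.drop(columns="is_patient"), df["is_patient"], test_size=0.2, random_state=0)

grid = GridSearchCV(
    XGBClassifier(eval_metric="logloss"),
    param_grid={"n_estimators": [100, 300], "max_depth": [3, 5],
                "learning_rate": [0.05, 0.1]},
    cv=5, scoring="accuracy")
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_,
      "test accuracy:", grid.score(X_te, y_te))
```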
基金funded by the ICT Division of theMinistry of Posts,Telecommunications,and Information Technology of Bangladesh under Grant Number 56.00.0000.052.33.005.21-7(Tracking No.22FS15306)support from the University of Rajshahi.
文摘The Internet of Things(IoT)and mobile technology have significantly transformed healthcare by enabling real-time monitoring and diagnosis of patients.Recognizing Medical-Related Human Activities(MRHA)is pivotal for healthcare systems,particularly for identifying actions critical to patient well-being.However,challenges such as high computational demands,low accuracy,and limited adaptability persist in Human Motion Recognition(HMR).While some studies have integrated HMR with IoT for real-time healthcare applications,limited research has focused on recognizing MRHA as essential for effective patient monitoring.This study proposes a novel HMR method tailored for MRHA detection,leveraging multi-stage deep learning techniques integrated with IoT.The approach employs EfficientNet to extract optimized spatial features from skeleton frame sequences using seven Mobile Inverted Bottleneck Convolutions(MBConv)blocks,followed by Convolutional Long Short Term Memory(ConvLSTM)to capture spatio-temporal patterns.A classification module with global average pooling,a fully connected layer,and a dropout layer generates the final predictions.The model is evaluated on the NTU RGB+D 120 and HMDB51 datasets,focusing on MRHA such as sneezing,falling,walking,sitting,etc.It achieves 94.85%accuracy for cross-subject evaluations and 96.45%for cross-view evaluations on NTU RGB+D 120,along with 89.22%accuracy on HMDB51.Additionally,the system integrates IoT capabilities using a Raspberry Pi and GSM module,delivering real-time alerts via Twilios SMS service to caregivers and patients.This scalable and efficient solution bridges the gap between HMR and IoT,advancing patient monitoring,improving healthcare outcomes,and reducing costs.
文摘The explosive expansion of the Internet of Things(IoT)systems has increased the imperative to have strong and robust solutions to cyber Security,especially to curtail Distributed Denial of Service(DDoS)attacks,which can cripple critical infrastructure.The proposed framework presented in the current paper is a new hybrid scheme that induces deep learning-based traffic classification and blockchain-enabledmitigation tomake intelligent,decentralized,and real-time DDoS countermeasures in an IoT network.The proposed model fuses the extracted deep features with statistical features and trains them by using traditional machine-learning algorithms,which makes them more accurate in detection than statistical features alone,based on the Convolutional Neural Network(CNN)architecture,which can extract deep features.A permissioned blockchain will be included to record the threat cases immutably and automatically execute mitigation measures through smart contracts to provide transparency and resilience.When tested on two test sets,BoT-IoT and IoT-23,the framework obtains a maximum F1-score at 97.5 percent and only a 1.8 percent false positive rate,which compares favorably to other solutions regarding effectiveness and the amount of time required to respond.Our findings support the feasibility of our method as an extensible and secure paradigm of nextgeneration IoT security,which has constrictive utility in mission-critical or resource-constrained settings.The work is a substantial milestone in autonomous and trustful mitigation against DDoS attacks through intelligent learning and decentralized enforcement.
文摘Internet of things networks often suffer from early node failures and short lifespan due to energy limits.Traditional routing methods are not enough.This work proposes a new hybrid algorithm called ACOGA.It combines Ant Colony Optimization(ACO)and the Greedy Algorithm(GA).ACO finds smart paths while Greedy makes quick decisions.This improves energy use and performance.ACOGA outperforms Hybrid Energy-Efficient(HEE)and Adaptive Lossless Data Compression(ALDC)algorithms.After 500 rounds,only 5%of ACOGA’s nodes are dead,compared to 15%for HEE and 20%for ALDC.The network using ACOGA runs for 1200 rounds before the first nodes fail.HEE lasts 900 rounds and ALDC only 850.ACOGA saves at least 15%more energy by better distributing the load.It also achieves a 98%packet delivery rate.The method works well in mixed IoT networks like Smart Water Management Systems(SWMS).These systems have different power levels and communication ranges.The simulation of proposed model has been done in MATLAB simulator.The results show that that the proposed model outperform then the existing models.
文摘The cloud data centres evolved with an issue of energy management due to the constant increase in size,complexity and enormous consumption of energy.Energy management is a challenging issue that is critical in cloud data centres and an important concern of research for many researchers.In this paper,we proposed a cuckoo search(CS)-based optimisation technique for the virtual machine(VM)selection and a novel placement algorithm considering the different constraints.The energy consumption model and the simulation model have been implemented for the efficient selection of VM.The proposed model CSOA-VM not only lessens the violations at the service level agreement(SLA)level but also minimises the VM migrations.The proposed model also saves energy and the performance analysis shows that energy consumption obtained is 1.35 kWh,SLA violation is 9.2 and VM migration is about 268.Thus,there is an improvement in energy consumption of about 1.8%and a 2.1%improvement(reduction)in violations of SLA in comparison to existing techniques.
文摘Reliable Cluster Head(CH)selectionbased routing protocols are necessary for increasing the packet transmission efficiency with optimal path discovery that never introduces degradation over the transmission reliability.In this paper,Hybrid Golden Jackal,and Improved Whale Optimization Algorithm(HGJIWOA)is proposed as an effective and optimal routing protocol that guarantees efficient routing of data packets in the established between the CHs and the movable sink.This HGJIWOA included the phases of Dynamic Lens-Imaging Learning Strategy and Novel Update Rules for determining the reliable route essential for data packets broadcasting attained through fitness measure estimation-based CH selection.The process of CH selection achieved using Golden Jackal Optimization Algorithm(GJOA)completely depends on the factors of maintainability,consistency,trust,delay,and energy.The adopted GJOA algorithm play a dominant role in determining the optimal path of routing depending on the parameter of reduced delay and minimal distance.It further utilized Improved Whale Optimisation Algorithm(IWOA)for forwarding the data from chosen CHs to the BS via optimized route depending on the parameters of energy and distance.It also included a reliable route maintenance process that aids in deciding the selected route through which data need to be transmitted or re-routed.The simulation outcomes of the proposed HGJIWOA mechanism with different sensor nodes confirmed an improved mean throughput of 18.21%,sustained residual energy of 19.64%with minimized end-to-end delay of 21.82%,better than the competitive CH selection approaches.
基金funded by UKRI EPSRC Grant EP/W020408/1 Project SPRITE+2:The Security,Privacy,Identity,and Trust Engagement Network plus(phase 2)for this studyfunded by PhD project RS718 on Explainable AI through the UKRI EPSRC Grant-funded Doctoral Training Centre at Swansea University.
文摘During its growth stage,the plant is exposed to various diseases.Detection and early detection of crop diseases is amajor challenge in the horticulture industry.Crop infections can harmtotal crop yield and reduce farmers’income if not identified early.Today’s approved method involves a professional plant pathologist to diagnose the disease by visual inspection of the afflicted plant leaves.This is an excellent use case for Community Assessment and Treatment Services(CATS)due to the lengthy manual disease diagnosis process and the accuracy of identification is directly proportional to the skills of pathologists.An alternative to conventional Machine Learning(ML)methods,which require manual identification of parameters for exact results,is to develop a prototype that can be classified without pre-processing.To automatically diagnose tomato leaf disease,this research proposes a hybrid model using the Convolutional Auto-Encoders(CAE)network and the CNN-based deep learning architecture of DenseNet.To date,none of the modern systems described in this paper have a combined model based on DenseNet,CAE,and ConvolutionalNeuralNetwork(CNN)todiagnose the ailments of tomato leaves automatically.Themodelswere trained on a dataset obtained from the Plant Village repository.The dataset consisted of 9920 tomato leaves,and the model-tomodel accuracy ratio was 98.35%.Unlike other approaches discussed in this paper,this hybrid strategy requires fewer training components.Therefore,the training time to classify plant diseases with the trained algorithm,as well as the training time to automatically detect the ailments of tomato leaves,is significantly reduced.
基金funded by Researchers Supporting ProjectNumber(RSPD2025R947),King Saud University,Riyadh,Saudi Arabia.
文摘The integration of IoT and Deep Learning(DL)has significantly advanced real-time health monitoring and predictive maintenance in prognostic and health management(PHM).Electrocardiograms(ECGs)are widely used for cardiovascular disease(CVD)diagnosis,but fluctuating signal patterns make classification challenging.Computer-assisted automated diagnostic tools that enhance ECG signal categorization using sophisticated algorithms and machine learning are helping healthcare practitioners manage greater patient populations.With this motivation,the study proposes a DL framework leveraging the PTB-XL ECG dataset to improve CVD diagnosis.Deep Transfer Learning(DTL)techniques extract features,followed by feature fusion to eliminate redundancy and retain the most informative features.Utilizing the African Vulture Optimization Algorithm(AVOA)for feature selection is more effective than the standard methods,as it offers an ideal balance between exploration and exploitation that results in an optimal set of features,improving classification performance while reducing redundancy.Various machine learning classifiers,including Support Vector Machine(SVM),eXtreme Gradient Boosting(XGBoost),Adaptive Boosting(AdaBoost),and Extreme Learning Machine(ELM),are used for further classification.Additionally,an ensemble model is developed to further improve accuracy.Experimental results demonstrate that the proposed model achieves the highest accuracy of 96.31%,highlighting its effectiveness in enhancing CVD diagnosis.
文摘The Sine and Wormhole Energy Whale Optimization Algorithm(SWEWOA)represents an advanced solution method for resolving Optimal Power Flow(OPF)problems in power systems equipped with Flexible AC Transmission System(FACTS)devices which include Thyristor-Controlled Series Compensator(TCSC),Thyristor-Controlled Phase Shifter(TCPS),and Static Var Compensator(SVC).SWEWOA expands Whale Optimization Algorithm(WOA)through the integration of sine and wormhole energy features thus improving exploration and exploitation capabilities for efficient convergence in complex non-linear OPF problems.A performance evaluation of SWEWOA takes place on the IEEE-30 bus test system through static and dynamic loading scenarios where it demonstrates better results than five contemporary algorithms:Adaptive Chaotic WOA(ACWOA),WOA,Chaotic WOA(CWOA),Sine Cosine Algorithm Differential Evolution(SCADE),and Hybrid Grey Wolf Optimization(HGWO).The research shows that SWEWOA delivers superior generation cost reduction than other algorithms by reaching a minimum of 0.9%better performance.SWEWOA demonstrates superior power loss performance by achieving(P_(loss,min))at the lowest level compared to all other tested algorithms which leads to better system energy efficiency.The dynamic loading performance of SWEWOA leads to a 4.38%reduction in gross costs which proves its capability to handle different operating conditions.The algorithm achieves top performance in Friedman Rank Test(FRT)assessments through multiple performance metrics which verifies its consistent reliability and strong stability during changing power demands.The repeated simulations show that SWEWOA generates mean costs(C_(gen,min))and mean power loss values(P_(loss,min))with small deviations which indicate its capability to maintain cost-effective solutions in each simulation run.SWEWOA demonstrates great potential as an advanced optimization solution for power system operations through the results presented in this study.
基金Princess Nourah bint Abdulrahman University Researchers Supporting Project number(PNURSP2025R343),Princess Nourah bint Abdulrahman University,Riyadh,Saudi Arabiathe Deanship of Scientific Research at Northern Border University,Arar,Saudi Arabia for funding this research work through the project number NBU-FFR-2025-1092-10.
文摘As quantum computing continues to advance,traditional cryptographic methods are increasingly challenged,particularly when it comes to securing critical systems like Supervisory Control andData Acquisition(SCADA)systems.These systems are essential for monitoring and controlling industrial operations,making their security paramount.A key threat arises from Shor’s algorithm,a powerful quantum computing tool that can compromise current hash functions,leading to significant concerns about data integrity and confidentiality.To tackle these issues,this article introduces a novel Quantum-Resistant Hash Algorithm(QRHA)known as the Modular Hash Learning Algorithm(MHLA).This algorithm is meticulously crafted to withstand potential quantum attacks by incorporating advanced mathematical and algorithmic techniques,enhancing its overall security framework.Our research delves into the effectiveness ofMHLA in defending against both traditional and quantum-based threats,with a particular emphasis on its resilience to Shor’s algorithm.The findings from our study demonstrate that MHLA significantly enhances the security of SCADA systems in the context of quantum technology.By ensuring that sensitive data remains protected and confidential,MHLA not only fortifies individual systems but also contributes to the broader efforts of safeguarding industrial and infrastructure control systems against future quantumthreats.Our evaluation demonstrates that MHLA improves security by 38%against quantumattack simulations compared to traditional hash functionswhilemaintaining a computational efficiency ofO(m⋅n⋅k+v+n).The algorithm achieved a 98%success rate in detecting data tampering during integrity testing.These findings underline MHLA’s effectiveness in enhancing SCADA system security amidst evolving quantum technologies.This research represents a crucial step toward developing more secure cryptographic systems that can adapt to the rapidly changing technological landscape,ultimately ensuring the reliability and integrity of critical infrastructure in an era where quantum computing poses a growing risk.
基金funded by Deanship of Scientific Research(DSR)at King Abdulaziz University,Jeddah under grant No.(RG-6-611-43)the authors,therefore,acknowledge with thanks DSR technical and financial support.
文摘The integration of the Internet of Things(IoT)into healthcare systems improves patient care,boosts operational efficiency,and contributes to cost-effective healthcare delivery.However,overcoming several associated challenges,such as data security,interoperability,and ethical concerns,is crucial to realizing the full potential of IoT in healthcare.Real-time anomaly detection plays a key role in protecting patient data and maintaining device integrity amidst the additional security risks posed by interconnected systems.In this context,this paper presents a novelmethod for healthcare data privacy analysis.The technique is based on the identification of anomalies in cloud-based Internet of Things(IoT)networks,and it is optimized using explainable artificial intelligence.For anomaly detection,the Radial Boltzmann Gaussian Temporal Fuzzy Network(RBGTFN)is used in the process of doing information privacy analysis for healthcare data.Remora Colony SwarmOptimization is then used to carry out the optimization of the network.The performance of the model in identifying anomalies across a variety of healthcare data is evaluated by an experimental study.This evaluation suggested that themodel measures the accuracy,precision,latency,Quality of Service(QoS),and scalability of themodel.A remarkable 95%precision,93%latency,89%quality of service,98%detection accuracy,and 96%scalability were obtained by the suggested model,as shown by the subsequent findings.
基金funded by the deanship of scientific research(DSR),King Abdulaziz University,Jeddah,under grant No.(G-1436-611-309).
文摘Cardiovascular diseases(CVDs)remain one of the foremost causes of death globally;hence,the need for several must-have,advanced automated diagnostic solutions towards early detection and intervention.Traditional auscultation of cardiovascular sounds is heavily reliant on clinical expertise and subject to high variability.To counter this limitation,this study proposes an AI-driven classification system for cardiovascular sounds whereby deep learning techniques are engaged to automate the detection of an abnormal heartbeat.We employ FastAI vision-learner-based convolutional neural networks(CNNs)that include ResNet,DenseNet,VGG,ConvNeXt,SqueezeNet,and AlexNet to classify heart sound recordings.Instead of raw waveform analysis,the proposed approach transforms preprocessed cardiovascular audio signals into spectrograms,which are suited for capturing temporal and frequency-wise patterns.The models are trained on the PASCAL Cardiovascular Challenge dataset while taking into consideration the recording variations,noise levels,and acoustic distortions.To demonstrate generalization,external validation using Google’s Audio set Heartbeat Sound dataset was performed using a dataset rich in cardiovascular sounds.Comparative analysis revealed that DenseNet-201,ConvNext Large,and ResNet-152 could deliver superior performance to the other architectures,achieving an accuracy of 81.50%,a precision of 85.50%,and an F1-score of 84.50%.In the process,we performed statistical significance testing,such as the Wilcoxon signed-rank test,to validate performance improvements over traditional classification methods.Beyond the technical contributions,the research underscores clinical integration,outlining a pathway in which the proposed system can augment conventional electronic stethoscopes and telemedicine platforms in the AI-assisted diagnostic workflows.We also discuss in detail issues of computational efficiency,model interpretability,and ethical considerations,particularly concerning algorithmic bias stemming from imbalanced datasets and the need for real-time processing in clinical settings.The study describes a scalable,automated system combining deep learning,feature extraction using spectrograms,and external validation that can assist healthcare providers in the early and accurate detection of cardiovascular disease.AI-driven solutions can be viable in improving access,reducing delays in diagnosis,and ultimately even the continued global burden of heart disease.
基金financially supported by the Ongoing Research Funding Program(ORF-2025-846),King Saud University,Riyadh,Saudi Arabia.
文摘Protecting Supervisory Control and Data Acquisition-Industrial Internet of Things(SCADA-IIoT)systems against intruders has become essential since industrial control systems now oversee critical infrastructure,and cyber attackers more frequently target these systems.Due to their connection of physical assets with digital networks,SCADA-IIoT systems face substantial risks from multiple attack types,including Distributed Denial of Service(DDoS),spoofing,and more advanced intrusion methods.Previous research in this field faces challenges due to insufficient solutions,as current intrusion detection systems lack the necessary accuracy,scalability,and adaptability needed for IIoT environments.This paper introduces CyberFortis,a novel cybersecurity framework aimed at detecting and preventing cyber threats in SCADA-IIoT systems.CyberFortis presents two key innovations:Firstly,Siamese Double Deep Q-Network with Autoencoders(Siamdqn-AE)FusionNet,which enhances intrusion detection by combining deep Q-Networks with autoencoders for improved attack detection and feature extraction;and secondly,the PopHydra Optimiser,an innovative solution to compute reinforcement learning discount factors for better model performance and convergence.This method combines Siamese deep Q-Networks with autoencoders to create a system that can detect different types of attacks more effectively and adapt to new challenges.CyberFortis is better than current top attack detection systems,showing higher scores in important areas like accuracy,precision,recall,and F1-score,based on data from CICIoT 2023,UNSW-NB 15,and WUSTL-IIoT datasets.Results from the proposed framework show a 97.5%accuracy rate,indicating its potential as an effective solution for SCADA-IIoT cybersecurity against emerging threats.The research confirms that the proposed security and resilience methods are successful in protecting vital industrial control systems within their operational environments.
Abstract: This article presents an innovative approach to automatic rule discovery for data transformation tasks leveraging XGBoost, a machine learning algorithm renowned for its efficiency and performance. The proposed framework fuses diversified feature formats, specifically metadata, textual, and pattern features, with the goal of enhancing the system's ability to discern and generalize transformation rules from source to destination formats in varied contexts. First, the article delves into the methodology for extracting these distinct features from raw data and the preprocessing steps undertaken to prepare the data for the model. Subsequent sections expound on the mechanism of feature optimization using Recursive Feature Elimination (RFE) with linear regression, aiming to retain the most contributive features and eliminate redundant or less significant ones. The core of the research revolves around the deployment of the XGBoost model for training, using the prepared and optimized feature sets, and the article presents a detailed overview of the mathematical model and algorithmic steps behind this procedure. Finally, the process of rule discovery (the prediction phase) by the trained XGBoost model is explained, underscoring its role in real-time, automated data transformations. By employing machine learning, and particularly the XGBoost model, in the context of Business Rule Engine (BRE) data transformation, the article underscores a paradigm shift towards more scalable, efficient, and less human-dependent data transformation systems. This research opens doors for further exploration into automated rule discovery systems and their applications in various sectors.
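The RFE-then-XGBoost pipeline described above can be sketched in a few lines; the feature matrix, label encoding, number of retained features, and hyperparameters below are placeholders, not the article's actual configuration.

```python
# Hedged sketch: prune features with RFE over a linear model, then train XGBoost
# to map records to transformation-rule classes.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))        # placeholder: encoded metadata/textual/pattern features
y = rng.integers(0, 4, size=500)      # placeholder: rule-class labels

# Step 1: keep the most contributive features via RFE with linear regression.
selector = RFE(estimator=LinearRegression(), n_features_to_select=12, step=2)
X_sel = selector.fit_transform(X, y)

# Step 2: train XGBoost on the optimized feature set.
model = XGBClassifier(n_estimators=200, max_depth=5, learning_rate=0.1,
                      eval_metric="mlogloss")
model.fit(X_sel, y)

# Prediction phase: map new records to transformation rules.
print(model.predict(selector.transform(X[:5])))
```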
Abstract: One of the most dangerous safety hazards in underground coal mines is roof falls during retreat mining. Roof falls may cause life-threatening and non-fatal injuries to miners and impede mining and transportation operations. As a result, a reliable roof fall prediction model is essential to tackle such challenges. Different parameters that substantially impact roof falls are ill-defined and intangible, making this an uncertain and challenging research issue. The National Institute for Occupational Safety and Health assembled a national database of roof performance from 37 coal mines to explore the factors contributing to roof falls. The data acquired for these 37 mines are limited due to several restrictions, which increases the likelihood of incompleteness. Fuzzy logic is a technique for coping with ambiguity, incompleteness, and uncertainty. Therefore, this paper presents a fuzzy inference method that employs a genetic algorithm to create fuzzy rules based on 109 records of roof fall data, and pattern search to refine the membership functions of the parameters. The performance of the deployed model is evaluated using statistical measures such as the root-mean-square error, mean absolute error, and coefficient of determination (R²). Based on these criteria, the suggested model outperforms existing models in precisely predicting roof fall rates while using fewer fuzzy rules.
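For readers unfamiliar with fuzzy inference, the sketch below builds a tiny Mamdani system with scikit-fuzzy. The input variables, universes, membership shapes, and the two hand-written rules are illustrative assumptions; the paper instead evolves its rule base with a genetic algorithm and refines memberships by pattern search, which is not reproduced here.

```python
# Hedged sketch of a two-rule fuzzy inference system for roof fall rate.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Assumed inputs: a roof-quality rating (0-100) and depth of cover (m).
roof_rating = ctrl.Antecedent(np.arange(0, 101, 1), "roof_rating")
depth = ctrl.Antecedent(np.arange(0, 601, 1), "depth")
fall_rate = ctrl.Consequent(np.arange(0, 11, 0.1), "fall_rate")

roof_rating["weak"] = fuzz.trimf(roof_rating.universe, [0, 0, 50])
roof_rating["strong"] = fuzz.trimf(roof_rating.universe, [40, 100, 100])
depth["shallow"] = fuzz.trimf(depth.universe, [0, 0, 300])
depth["deep"] = fuzz.trimf(depth.universe, [250, 600, 600])
fall_rate["low"] = fuzz.trimf(fall_rate.universe, [0, 0, 4])
fall_rate["high"] = fuzz.trimf(fall_rate.universe, [3, 10, 10])

rules = [
    ctrl.Rule(roof_rating["weak"] & depth["deep"], fall_rate["high"]),
    ctrl.Rule(roof_rating["strong"], fall_rate["low"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["roof_rating"] = 35
sim.input["depth"] = 450
sim.compute()
print(round(sim.output["fall_rate"], 2))   # crisp predicted roof fall rate
```

In the paper's setting, the rule list and the triangular membership parameters would be the objects optimized by the genetic algorithm and pattern search, respectively.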
Abstract: Cloud computing is an emerging technology in the rapidly growing IT world. Its adoption is spreading quickly, from very large business organizations to small institutions, thanks to advanced features such as the SaaS, PaaS, and IaaS service models. Consequently, many organizations are now trying to implement cloud-based ERP systems to enjoy the benefits of cloud computing. Implementing any ERP system usually confronts an organization with many challenges, so this research shows how such a cloud-based system can be implemented in an organization with relative ease. By using this ERP system, an organization can benefit in many ways; in particular, Small and Medium Enterprises (SMEs) can derive the greatest possible benefit from it.
Abstract: Analyzing colon cancer data is essential for improving early detection, treatment outcomes, public health initiatives, research efforts, and overall patient care, ultimately leading to better outcomes and a reduced burden associated with this disease. The prediction of any disease depends on the quality of the available dataset, so it is important to analyze its characteristics before applying a prediction algorithm. This research presents a comprehensive framework for addressing class imbalance in colon cancer datasets, which, together with high dimensionality, has been a significant challenge for colon cancer prediction in previous studies. Both characteristics are important preprocessing concerns: balancing adjusts the distribution of data points across class labels, while feature selection picks the most informative features from the available feature space. This study aims to improve the performance of popular tree, rule, and lazy (K-nearest neighbour, KNN) classifiers and the support vector machine (SVM) algorithm after addressing the imbalance issue and applying various feature selection methods, namely chi-square, symmetrical uncertainty, correlation-based feature selection (CFS) subset, and classifier subset evaluators. The proposed framework shows that, after balancing the dataset, all algorithms performed better with all applied feature selection methods. Among all methods, JRip records 85.71% accuracy with classifier subset evaluators, Ridor achieves 84.52% accuracy with CFS, J48 produces 83.33% accuracy with both CFS and classifier subset evaluators, Simple CART reaches 84.52% with classifier subset evaluators, KNN records 91.66% accuracy with chi-square and CFS, and SVM produces 92.85% with symmetrical uncertainty.
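A minimal version of the balance-then-select-then-classify workflow can be sketched with scikit-learn and imbalanced-learn. SMOTE stands in for whichever balancing method the study used, chi-square is one of the listed selectors, and the placeholder matrix, k, and classifier settings are assumptions.

```python
# Hedged sketch: rebalance, select features by chi-square, then compare KNN and SVM.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(200, 2000)))      # placeholder gene-expression-like matrix
y = np.r_[np.zeros(160), np.ones(40)]         # imbalanced class labels

# 1) Rebalance the minority class.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

# 2) Chi-square feature selection (requires non-negative features).
X_scaled = MinMaxScaler().fit_transform(X_bal)
X_sel = SelectKBest(chi2, k=50).fit_transform(X_scaled, y_bal)

# 3) Evaluate two of the classifiers used in the study.
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    acc = cross_val_score(clf, X_sel, y_bal, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.3f}")
```

In practice the oversampling and selection steps should sit inside the cross-validation folds (e.g. via an imblearn Pipeline) to avoid leakage; the sketch keeps them outside only for brevity.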
Funding: This project was funded by the Scientific Research Deanship at the University of Ha’il, Saudi Arabia, through project number RG-21104.
Abstract: In today’s world, image processing techniques play a crucial role in the prognosis and diagnosis of various diseases, owing to the development of several precise and accurate methods for medical images. Automated analysis of medical images is essential for doctors, as manual investigation often leads to inter-observer variability. This research aims to enhance healthcare by enabling the early detection of diabetic retinopathy through an efficient image processing framework. The proposed hybridized method combines the Modified Inertia Weight Particle Swarm Optimization (MIWPSO) and Fuzzy C-Means clustering (FCM) algorithms. Traditional FCM does not incorporate spatial neighbourhood features, making it highly sensitive to noise, which significantly affects segmentation output. Our method uses a modified FCM that includes spatial functions in the fuzzy membership matrix to eliminate noise. The results demonstrate that the proposed FCM-MIWPSO method achieves highly precise and accurate medical image segmentation. Furthermore, segmented images are classified as benign or malignant using the Decision Tree-Based Temporal Association Rule (DT-TAR) algorithm. Comparative analysis with existing state-of-the-art models indicates that the proposed FCM-MIWPSO segmentation technique achieves a remarkable accuracy of 98.42% on the dataset, highlighting its significant impact on improving diagnostic capabilities in medical imaging.
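As a baseline for comparison with the modified method above, the sketch below runs plain fuzzy C-means on a sample retinal image with scikit-fuzzy. The spatial-neighbourhood membership term and the MIWPSO tuning are not reproduced; the cluster count and the use of scikit-image's bundled retina sample are assumptions.

```python
# Hedged sketch: standard FCM segmentation of a retinal image (no spatial term, no PSO).
import numpy as np
import skfuzzy as fuzz
from skimage import color, data

img = color.rgb2gray(data.retina())            # sample retinal image shipped with skimage
pixels = img.reshape(1, -1)                    # skfuzzy expects (features, samples)

n_clusters = 3                                 # e.g. background, vessels, lesions (assumed)
cntr, u, *_ = fuzz.cluster.cmeans(pixels, c=n_clusters, m=2.0,
                                  error=1e-4, maxiter=150, seed=0)

labels = np.argmax(u, axis=0).reshape(img.shape)   # hard segmentation map
print(labels.shape, np.bincount(labels.ravel()))
```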
Abstract: In recent decades, intelligent transportation systems (ITS) have improved drivers' safety and have shared information (such as traffic congestion and accidents) very efficiently. However, the privacy of vehicles and the security of event information are major concerns. Secure sharing of event information without relying on a trusted third party (TTP) or centralized data storage is the main open issue in ITS, and blockchain technologies can resolve it. A previously published blockchain-based protocol for secure event sharing and vehicle authentication addresses the issue of safely storing event information; however, vehicle authentication in that scheme depends solely on a cloud server, so its architecture is only partially decentralized. This paper proposes a novel decentralized architecture for the vehicular ad-hoc network (VANET) without a cloud server, together with a protocol for securing event information and authenticating vehicles using the blockchain mechanism. In this protocol, a registered user securely accesses event information from the InterPlanetary File System (IPFS); IPFS is combined with the blockchain to store the information in a fully distributed manner. The proposed protocol is compared with the state of the art and provides the desired security at a reasonable cost. An evaluation of the proposed smart contract in terms of cost (gas) is also presented.
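To illustrate the data flow only, the self-contained sketch below content-addresses an event (standing in for IPFS) and appends its identifier plus a vehicle's signature to a hash-chained ledger (standing in for the blockchain and smart contract). A real deployment would use an IPFS client and an on-chain contract call with proper cryptographic signatures; every name, structure, and the placeholder "signature" here are illustrative assumptions, not the paper's protocol.

```python
# Hedged mock of the store-and-record flow: event -> content address -> chained record.
import hashlib, json, time

ipfs_store = {}      # mock IPFS: CID -> event payload
ledger = []          # mock blockchain: list of hash-chained blocks

def put_event(event: dict) -> str:
    """Content-address the event, as IPFS would, and return its 'CID'."""
    blob = json.dumps(event, sort_keys=True).encode()
    cid = hashlib.sha256(blob).hexdigest()
    ipfs_store[cid] = event
    return cid

def record_on_chain(vehicle_pseudonym: str, cid: str, signature: str) -> dict:
    """Append an immutable record linking a vehicle pseudonym to the event CID."""
    prev_hash = ledger[-1]["block_hash"] if ledger else "0" * 64
    body = {"vehicle": vehicle_pseudonym, "cid": cid,
            "sig": signature, "ts": time.time(), "prev": prev_hash}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    block = {**body, "block_hash": block_hash}
    ledger.append(block)
    return block

# A vehicle reports a congestion event.
event = {"type": "congestion", "road": "NH-44", "severity": "high"}
cid = put_event(event)
sig = hashlib.sha256((cid + "vehicle-secret").encode()).hexdigest()   # placeholder, not a real signature
record_on_chain("pseud-7F3A", cid, sig)

# A registered user later retrieves and checks the event by its CID.
assert ledger[-1]["cid"] == cid and ipfs_store[cid] == event
print(ipfs_store[cid])
```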
Abstract: Supporting a mobile sink is a challenging task in wireless sensor networks (WSNs). In this paper we design an efficient routing protocol for data gathering with both single and multiple mobile sinks. In this protocol, a biased random walk method is used to determine the next position of the sink. A rendezvous point is then selected with a splitting tree technique to find the optimal data transmission path. If the sink moves within the range of a rendezvous point, it receives the gathered data directly; if it moves out of range, a relay node is selected from its neighbours to forward packets from the rendezvous point to the sink. The proposed algorithm reduces signalling overhead and mitigates the triangular routing problem. Here the sink acts as a vehicle and collects the data from the sensors. The results show that the proposed model effectively supports sink mobility with low overhead and delay compared with the Intelligent Agent-based Routing (IAR) protocol, and it also increases reliability and delivery ratio as the number of sources increases.
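The sketch below shows one plausible reading of the biased random walk: candidate moves are weighted toward grid cells holding more buffered sensor data, so the sink drifts toward data-rich regions. The grid model, the bias function, and the omission of the rendezvous-point and relay logic are all illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch: biased random walk for the mobile sink's next position.
import random

GRID = 10                                    # 10 x 10 deployment area (assumed)
buffered = [[random.randint(0, 20) for _ in range(GRID)] for _ in range(GRID)]

def next_sink_position(pos, bias=2.0):
    """Choose the next cell; each neighbour is weighted by (buffered data)^bias."""
    x, y = pos
    candidates = [(x + dx, y + dy) for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                  if 0 <= x + dx < GRID and 0 <= y + dy < GRID]
    weights = [(buffered[cx][cy] + 1) ** bias for cx, cy in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

sink = (5, 5)
for _ in range(10):                          # simulate ten sink movements
    sink = next_sink_position(sink)
    buffered[sink[0]][sink[1]] = 0           # data at the visited cell is collected
print("final sink position:", sink)
```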
Abstract: BACKGROUND Liver disease refers to any pathology that can harm or destroy the liver or prevent it from functioning normally. The global community has recently witnessed an increase in the mortality rate due to liver disease, which could be attributed to many factors, among them human habits, lack of awareness, poor healthcare, and late detection. To curb the growing threat of liver disease, early detection is critical to help reduce risks and improve treatment outcomes. Emerging technologies such as machine learning, as shown in this study, can be deployed to assist in enhancing its prediction and treatment. AIM To present a more efficient system for the timely prediction of liver disease using a hybrid eXtreme Gradient Boosting model with hyperparameter tuning, with a view to assisting early detection and diagnosis and reducing the risks and mortality associated with the disease. METHODS The dataset used in this study consisted of 416 people with liver problems and 167 with no such history. The data were collected from the state of Andhra Pradesh, India, through https://www.kaggle.com/datasets/uciml/indian-liver-patientrecords. The population was divided into two sets depending on the disease state of the patient, and this binary information was recorded in the attribute "is_patient". RESULTS The results indicated that the chi-square automatic interaction detection and classification and regression tree models achieved accuracy levels of 71.36% and 73.24%, respectively, which was much better than the conventional method. The proposed solution would assist patients and physicians in tackling the problem of liver disease, ensuring that cases are detected early to prevent progression to cirrhosis (scarring) and to enhance patient survival. The study showed the potential of machine learning in healthcare, especially for disease prediction and monitoring. CONCLUSION This study contributes to the knowledge of machine learning applications in health and to efforts toward combating liver disease. However, relevant authorities need to invest more in machine learning research and other health technologies to maximize their potential.
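A minimal sketch of the XGBoost-with-hyperparameter-tuning setup follows. Column names assume the public Indian Liver Patient Records layout (with "Dataset" as the raw target that the study recodes as "is_patient"), and the local file name, grid values, and split are assumptions rather than the study's exact procedure.

```python
# Hedged sketch: grid-searched XGBoost for the binary liver-disease target.
import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

df = pd.read_csv("indian_liver_patient.csv")            # assumed local copy of the dataset
df["Gender"] = df["Gender"].map({"Male": 1, "Female": 0})
y = (df["Dataset"] == 1).astype(int)                     # 1 = liver patient ("is_patient")
X = df.drop(columns=["Dataset"]).fillna(df.median(numeric_only=True))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          stratify=y, random_state=42)
grid = {"n_estimators": [100, 300], "max_depth": [3, 5], "learning_rate": [0.05, 0.1]}
search = GridSearchCV(XGBClassifier(eval_metric="logloss"), grid,
                      cv=5, scoring="accuracy")
search.fit(X_tr, y_tr)
print(search.best_params_, round(search.score(X_te, y_te), 3))
```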