The emergence of different computing methods such as cloud-, fog-, and edge-based Internet of Things (IoT) systems has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with large volumes of data than shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms, along with their hybrid and optimized variants, for IoT-based disease detection, drawing on the most recent papers on IoT-based disease detection systems that include computing approaches such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also identifies the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
The widespread and growing interest in the Internet of Things (IoT) may be attributed to its usefulness in many different fields. Physical settings are probed for data, which is then transferred via linked networks. There are several hurdles to overcome when putting IoT into practice, from managing server infrastructure to coordinating the use of tiny sensors. When it comes to deploying IoT, everyone agrees that security is the biggest issue. This is because a large number of IoT devices exist in the physical world, and many of them have constrained resources such as power, memory, processing capability, and physical area. This research intends to analyse resource-constrained IoT devices, including RFID tags, sensors, and smart cards, and the issues involved with protecting them in such restricted circumstances. Using lightweight cryptography, the information sent between these devices may be secured. To provide a holistic picture, this research evaluates and contrasts well-known algorithms based on their implementation cost, hardware/software efficiency, and attack-resistance features. We also emphasise how essential lightweight encryption is for striking a good cost-to-performance-to-security ratio.
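Lightweight ciphers of the kind surveyed above are typically built from add-rotate-xor (ARX) rounds, in the style of Speck. The following is a minimal, hedged sketch of that construction with toy parameters (16-bit words, three illustrative round keys); it is not the official specification of any surveyed algorithm, only a demonstration of why ARX rounds are cheap on constrained hardware: each round needs only modular addition, rotation, and XOR.

```python
MASK = 0xFFFF  # 16-bit words for this toy example


def ror(v, r):
    """Rotate right within 16 bits."""
    return ((v >> r) | (v << (16 - r))) & MASK


def rol(v, r):
    """Rotate left within 16 bits."""
    return ((v << r) | (v >> (16 - r))) & MASK


def round_enc(x, y, k):
    """One Speck-style ARX encryption round (illustrative rotation amounts)."""
    x = ((ror(x, 7) + y) & MASK) ^ k
    y = rol(y, 2) ^ x
    return x, y


def round_dec(x, y, k):
    """Exact inverse of round_enc."""
    y = ror(y ^ x, 2)
    x = rol(((x ^ k) - y) & MASK, 7)
    return x, y


def encrypt(block, keys):
    x, y = block
    for k in keys:
        x, y = round_enc(x, y, k)
    return x, y


def decrypt(block, keys):
    x, y = block
    for k in reversed(keys):
        x, y = round_dec(x, y, k)
    return x, y
```

A real deployment would use a standardized parameter set and key schedule; the round-trip property (decryption inverts encryption) is the part this sketch demonstrates.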
In recent years, Speech Emotion Recognition (SER) has developed into an essential instrument for interpreting human emotions from auditory data. The proposed research focuses on the development of an SER system employing deep learning and multiple datasets containing samples of emotive speech. The primary objective of this research is to investigate the use of Convolutional Neural Networks (CNNs) for sound feature extraction. Stretching, pitch manipulation, and noise injection are among the techniques used in this study to improve data quality. Feature extraction methods including Zero Crossing Rate, Chroma_stft, Mel-scale Frequency Cepstral Coefficients (MFCC), Root Mean Square (RMS), and Mel-Spectrogram are used to train a model. With these techniques, audio signals can be transformed into recognizable features that can be used to train the model. Ultimately, the study produces a thorough evaluation of the model's performance: the model achieved an impressive accuracy of 94.57% on the test dataset. The proposed work was also validated on the EMO-DB and IEMOCAP datasets. Future development paths include further data augmentation, feature engineering, and hyperparameter optimization. By following these paths, SER systems can be deployed in real-world scenarios with greater accuracy and resilience.
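Two of the named features, Zero Crossing Rate and RMS energy, are simple enough to compute directly. The sketch below shows both on a raw sample frame; real pipelines typically use a library such as librosa and frame-wise windowing, which this illustration omits.

```python
import math


def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)


def rms(frame):
    """Root-mean-square energy of the frame."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))
```

A fully alternating frame has ZCR 1.0; a frame of constant magnitude has RMS equal to that magnitude, which makes both easy to sanity-check.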
We assessed nutrient characteristics, distributions, and fractions within the disturbed and undisturbed sediments at four sampling sites within the mainstream of the Haihe River. The river sediments contained mostly sand (60%), while the fraction of clay was 3%. Total nitrogen (TN) and total phosphorus (TP) concentrations ranged from 729 to 1922 mg/kg and from 692 to 1388 mg/kg, respectively. Nutrient concentrations within the sediments usually decreased with increasing depth. The TN and TP concentrations within the fine sand were higher than those within silt. Sediment phosphorus fractions were between 2.99% and 3.37% Ex-P (exchangeable phosphorus), 7.89% and 13.71% Fe/Al-P (Fe and Al oxide-bound phosphorus), 61.32% and 70.14% Ca-P (calcium-bound phosphorus), and 17.03% and 22.04% Org-P (organic phosphorus). Nitrogen and phosphorus release from the sediment could lead to concentrations of 21.02 mg N/L and 3.10 mg P/L within the water column. A river restoration project should therefore address the sediment nutrient stock.
To the Editor, We commend Chen et al [1] for their insightful exploration of the challenges associated with the evaluation of large language models (LLMs) and agents in clinical applications. Their work contributes significantly to the discourse on integrating artificial intelligence into healthcare, highlighting critical issues such as data privacy, model interpretability, and the risk of misinformation.
In recent years, network traffic data have become larger and more complex, leading to higher possibilities of network intrusion. Traditional intrusion detection methods have difficulty processing high-speed network data and cannot detect previously unknown attacks. Therefore, this paper proposes a network attack detection method combining flow calculation and deep learning. The method consists of two parts: a real-time detection algorithm based on flow calculations and frequent patterns, and a classification algorithm based on the deep belief network and support vector machine (DBN-SVM). Sliding window (SW) stream data processing enables real-time detection, and the DBN-SVM algorithm improves classification accuracy. Finally, to verify the proposed method, a system is implemented. Based on the open-source CICIDS2017 dataset, a series of comparative experiments is conducted. The method's real-time detection efficiency is higher than that of traditional machine learning algorithms. Its attack classification accuracy is 0.7 percentage points higher than that of a DBN alone and 2 percentage points higher than that of the ensemble boosting and bagging methods. Hence, it is suitable for the real-time detection of high-speed network intrusions.
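The sliding-window primitive behind SW stream processing can be sketched compactly: maintain counts over only the most recent events so that pattern frequencies update in constant time per arrival. The event labels and window size below are illustrative, not taken from the paper's system.

```python
from collections import Counter, deque


def sliding_window_counts(stream, window_size):
    """For each arriving event, yield the event counts over the last
    `window_size` events - the basic primitive behind sliding-window
    frequent-pattern detection on a stream."""
    window = deque(maxlen=window_size)
    counts = Counter()
    for event in stream:
        if len(window) == window.maxlen:
            oldest = window[0]          # about to be evicted by append
            counts[oldest] -= 1
            if counts[oldest] == 0:
                del counts[oldest]
        window.append(event)
        counts[event] += 1
        yield dict(counts)
```

In a detector, each yielded snapshot would be compared against learned frequent patterns to flag anomalous windows in real time.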
Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and steal the victims' sensitive data. E-mails, instant messages, and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims. Against this background, a drastic increase has been observed in the number of phishing emails sent to potential targets. This scenario necessitates the design of an effective classification model. Though numerous conventional models are available in the literature for the classification of phishing emails, both Machine Learning (ML) techniques and Deep Learning (DL) models have been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps: email cleaning, tokenization, and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. Moreover, the CS algorithm is employed with the Gated Recurrent Unit (GRU) model to detect and classify phishing emails, and it is also used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed under several dimensions. Extensive comparative studies were conducted, and the results confirmed the superior performance of the proposed ICSOA-DLPEC model over other existing approaches, with a maximum accuracy of 99.72%.
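The N-gram feature extraction step mentioned above is straightforward to illustrate. The sketch below shows word-level and character-level n-grams over already-tokenized, cleaned text; the example tokens are invented, and real feature vectors would then be built by counting or hashing these n-grams.

```python
def word_ngrams(tokens, n=2):
    """All contiguous word n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def char_ngrams(text, n=3):
    """All character n-grams of a lowercased string with whitespace removed."""
    text = "".join(text.lower().split())
    return [text[i:i + n] for i in range(len(text) - n + 1)]
```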
In Wireless Body Area Networks (WBANs) for health care, sensors are positioned inside the body of an individual to transfer sensed data to a central station periodically. Great challenges posed to healthcare WBANs are black hole and sink hole attacks, in which data from deployed sensor nodes are attracted by sink hole or black hole nodes that advertise the shortest path. Identifying this issue is quite challenging, as a small variation in medicine intake may result in severe illness. This work proposes a hybrid detection framework for such attacks by applying a Proportional Coinciding Score (PCS) and an MK-Means algorithm, a well-known machine learning technique, to raise attack detection accuracy and decrease computational difficulty while supporting treatment for heart and respiratory conditions. First, the feature count of the gathered training data is reduced through data pre-processing with the PCS. Second, the pre-processed features are sent to the MK-Means algorithm for training and classification. Third, certain attack detection measures provided by the intrusion detection system, such as the number of data packets transmitted and received, are identified by the MK-Means algorithm. This study demonstrates that the MK-Means framework yields high detection accuracy with a low packet loss rate, low communication overhead, and reduced end-to-end delay in the network, and improves the accuracy of biomedical data.
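MK-Means is a modified k-means variant; for orientation, the textbook Lloyd's k-means baseline it builds on can be sketched in a few lines. The deterministic seeding and 1-D values here are purely illustrative, not the paper's method.

```python
def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means on 1-D values. MK-Means in the paper is a
    modified variant; this is only the textbook baseline it extends."""
    centroids = points[:k]  # deterministic seeding for the sketch
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # Update step: centroids move to cluster means (kept if empty).
        centroids = [
            sum(c) / len(c) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids
```

In the detection setting, cluster membership of traffic features (packet counts, delays) would separate normal nodes from suspected black hole or sink hole behaviour.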
Healthcare monitoring in a clinical setting involves much implicit communication between the patient and the caretakers, and any misinterpretation can lead to adverse effects. A simple wearable system can precisely convey this implicit communication to the caretakers or to an automated support device; simple, unambiguous hand movements can be used for this purpose. The proposed system suggests a novel methodology, simpler than existing sign language interpretation, for such implicit communication. The experimental results show a well-distinguished recognition of different hand movement activities using a wearable sensor medium, and the interpretation results consistently show significant thresholds.
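A threshold-based interpretation of the kind the abstract describes can be sketched as follows. The labels and threshold values are assumptions for illustration; the paper's actual thresholds are learned from its wearable sensor data.

```python
def classify_movement(accel_magnitudes, still_thresh=0.5, wave_thresh=2.0):
    """Map a window of acceleration magnitudes to a coarse gesture label
    using simple thresholds on the window's range (illustrative values)."""
    spread = max(accel_magnitudes) - min(accel_magnitudes)
    if spread < still_thresh:
        return "still"
    if spread < wave_thresh:
        return "slow-gesture"
    return "fast-gesture"
```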
In the present scenario of rapid growth in cloud computing models, several companies and users have started to share their data on cloud servers. However, when the model is not completely trusted, data owners face several security-related problems during data outsourcing, such as user privacy breaches, data disclosure, and data corruption. Several models have been proposed to address and handle security-related issues in the cloud. With that concern, this paper develops a Privacy-Preserved Data Security Approach (PP-DSA) to provide data security and data integrity for outsourced data in a cloud environment. Privacy preservation is ensured in this work with an Efficient Authentication Technique (EAT) using the group signature method applied with a Third-Party Auditor (TPA). The role of the auditor is to secure the data and guarantee shared-data integrity. Additionally, the Cloud Service Provider (CSP) and Data User (DU) can also be attackers, which are handled by the EAT. The major objective of the work is to enhance cloud security and thereby increase Quality of Service (QoS). The results are evaluated based on model effectiveness, security, and reliability and show that the proposed model provides better results than existing works.
Optimizing the performance of composite structures is a real-world application with significant benefits. In this paper, a high-fidelity finite element method (FEM) is combined with the iterative improvement capability of metaheuristic optimization algorithms to obtain optimized composite plates. The FEM module comprises a nine-node isoparametric plate bending element in conjunction with the first-order shear deformation theory (FSDT). A recently proposed memetic version of particle swarm optimization called RPSOLC is modified in the current research to carry out multi-objective Pareto optimization. The performance of the MO-RPSOLC is found to be comparable with that of the NSGA-III. This work successfully highlights the use of FEM-MO-RPSOLC in obtaining high-fidelity Pareto solutions that simultaneously maximize the fundamental frequency and the frequency separation in laminated composites by optimizing the stacking sequence.
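The core relation any multi-objective Pareto optimizer such as MO-RPSOLC relies on is Pareto dominance. The sketch below implements the standard definition for two maximized objectives (here standing in for fundamental frequency and frequency separation); the sample points are invented.

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (both objectives maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(
        x > y for x, y in zip(a, b)
    )


def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

A metaheuristic maintains and refines exactly this non-dominated set across iterations.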
Data offloading with less time and reduced energy consumption is highly important for every network technology, as smart applications must process data quickly with low power consumption. As technology moves towards the 5G communication architecture, identifying a solution for QoS in 5G through energy-efficient computing is important. In the proposed model, we perform data offloading in 5G using the fuzzification concept. Mobile IoT devices create tasks in the network, which are offloaded to the cloud or to mobile edge nodes based on energy consumption. Two types of base station, small (SB) and macro (MB), are initialized, and the first tasks are computed randomly. The tasks are then processed using a fuzzification algorithm to select SB or MB in the central server. Optimization is performed using a grasshopper algorithm to improve the QoS of the 5G network. The result is compared with existing algorithms and indicates that the proposed system improves performance, with a cost of 44.64 J for computing 250 benchmark tasks.
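The fuzzification step can be illustrated with triangular membership functions over a task's estimated energy cost, followed by a simple defuzzified choice between the two station types. All breakpoints below are assumed for illustration and are not the paper's tuned values.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)


def choose_station(energy_cost):
    """Fuzzify a task's estimated energy cost and pick the station type
    with the stronger membership (breakpoints assumed, not from the paper)."""
    low = tri_membership(energy_cost, -1.0, 0.0, 5.0)    # favours small base station
    high = tri_membership(energy_cost, 2.0, 10.0, 21.0)  # favours macro station
    return "SB" if low >= high else "MB"
```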
The Mobile Ad-hoc Network (MANET) is a dynamic-topology network used in a variety of disciplines. MANET security has been one of the most pressing topics in this field, as MANETs are vulnerable to various threats that affect their usability and accessibility. The black hole attack is considered one of the most widespread active attacks; it deteriorates network performance and reliability as the malicious node drops all incoming packets. The black hole node aims to deceive any node in the network that wishes to connect to another node by pretending to have the freshest route to the target node. Ad-hoc On-demand Distance Vector (AODV) is a reactive routing protocol with no built-in techniques to locate and eliminate black hole nodes. We improved AODV by incorporating a novel, compact method for detecting and isolating single and collaborative black hole attacks using timers and baits. The recommended method allows MANET nodes to discover and segregate black hole nodes as the network topology changes dynamically. We evaluate the suggested method's performance with the help of Network Simulator (NS)-3 simulation models. The proposed approach comes exceptionally close to the original AODV, absent black holes, in terms of bandwidth, end-to-end latency, error rate, and delivery ratio.
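One classic symptom of a black hole node is a route reply (RREP) advertising an implausibly fresh destination sequence number. The sketch below flags such replies; the jump limit and the reply tuples are assumptions for illustration, not the paper's timer-and-bait mechanism itself.

```python
def flag_black_hole(replies, seq_jump_limit=100):
    """Flag RREP senders whose destination sequence number jumps
    implausibly far ahead of the highest legitimate value seen so far,
    the classic symptom of a node advertising the 'freshest' route.
    The jump limit is an assumed tuning parameter."""
    suspects = []
    highest = 0
    for node, seq in replies:
        if seq > highest + seq_jump_limit:
            suspects.append(node)   # suspicious reply: do not trust its seq
        else:
            highest = max(highest, seq)
    return suspects
```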
Major fields such as military applications, medicine, weather forecasting, and environmental applications use wireless sensor networks for major computing processes, and sensors play a vital role in these emerging technologies. Localizing sensors at the needed locations is a serious problem. The environment is home to every living being in the world, and the growth of industries after the industrial revolution increased pollution across the environment. Owing to recent uncontrolled growth and development, sensors to measure pollution levels across industries and their surroundings are needed, and choosing where to place the sensors is an interesting and challenging task. Many meta-heuristic techniques have been introduced for node localization, and swarm intelligence algorithms have proven their efficiency in many studies on localization problems. In this article, we introduce an industry-centric approach to solve the problem of node localization in the sensor network. First, our work aims at selecting industrial areas in the sensed location; we use random forest regression to select the polluted area. Then, the elephant herding algorithm is used for sensor node localization. These two algorithms are combined to produce the best standard result in localizing the sensor nodes. To evaluate the proposed approach, experiments are conducted with data from the KDD Cup 2018, which contain the names of 35 stations with concentrations of air pollutants such as PM, SO_(2), CO, NO_(2), and O_(3). These data are normalized and tested with the algorithms. The results are comparatively analyzed against other swarm intelligence algorithms, such as the elephant herding algorithm and particle swarm optimization, and machine learning algorithms, such as decision tree regression and multi-layer perceptron. The results indicate that our proposed algorithm suggests more meaningful locations for the sensors in the topology, achieving lower root mean square values of 0.06 to 0.08 for Stations 1 to 5.
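The root mean square metric used to compare localizers above is standard; a minimal sketch over 2-D positions (invented sample coordinates) is:

```python
import math


def localization_rmse(estimated, actual):
    """Root mean square error between estimated and true 2-D positions."""
    sq = [
        (ex - ax) ** 2 + (ey - ay) ** 2
        for (ex, ey), (ax, ay) in zip(estimated, actual)
    ]
    return math.sqrt(sum(sq) / len(sq))
```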
Introducing a thin InGaN interlayer with a relatively lower indium content between the quantum well (QW) and the barrier results in a step-like InxGa1-xN/GaN potential barrier on one side of the QW. This change in the active region leads to a significant shift of the photoluminescence (PL) and electroluminescence (EL) emissions to longer wavelengths compared with conventional QW-based light-emitting diodes. More importantly, improved resistance to efficiency droop and an enhancement in light output power at high injection currents are observed in the modified light-emitting diode structures. The role of the inserted layer in these improvements is investigated in detail by simulation, which shows that the creation of more sublevels in the valence band and the increase in hole concentration inside the QWs are the main reasons for these improvements.
Digital watermarking is a technology that facilitates the authentication, copyright protection, and security of digital media. The objective of developing a robust watermarking technique is to incorporate the maximum possible robustness without compromising transparency. Singular Value Decomposition (SVD) using the Firefly Algorithm achieves this objective of an optimal robust watermarking technique. Multiple scaling factors are used to embed the watermark image into the host by multiplying these scaling factors with the Singular Values (SVs) of the host audio. The Firefly Algorithm is used to optimise the modified host audio to achieve the highest possible robustness and transparency. This approach can significantly increase the quality of the watermarked audio and provide more robustness for the embedded watermark against various attacks such as noise, resampling, and filtering.
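The multiple-scaling-factor embedding can be sketched directly on the singular values (which in the full scheme come from an SVD of the host audio). The values and scaling factors below are invented; in the paper, the Firefly Algorithm would search for the scaling factors that best trade robustness against transparency.

```python
def embed(host_svs, watermark_svs, alphas):
    """Embed watermark singular values into host singular values using a
    separate scaling factor per component (the quantities an optimizer
    such as the Firefly Algorithm would tune)."""
    return [h + a * w for h, w, a in zip(host_svs, watermark_svs, alphas)]


def extract(marked_svs, host_svs, alphas):
    """Recover the watermark singular values given the originals."""
    return [(m - h) / a for m, h, a in zip(marked_svs, host_svs, alphas)]
```

Larger scaling factors make the watermark more robust but less transparent, which is exactly the trade-off the optimizer balances.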
The use of Explainable Artificial Intelligence (XAI) models is becoming increasingly important for decision-making in smart healthcare environments, to ensure that decisions are based on trustworthy algorithms and that healthcare workers understand the decisions these algorithms make. Such models can potentially enhance interpretability and explainability in decision-making processes that rely on artificial intelligence. Nevertheless, the intricate nature of the healthcare field necessitates sophisticated models to classify cancer images. This research presents an advanced investigation of XAI models for cancer image classification. It describes the different levels of explainability and interpretability associated with XAI models and the challenges faced in deploying them in healthcare applications. In addition, this study proposes a novel framework for cancer image classification that combines XAI models with deep learning and advanced medical imaging techniques. The proposed model integrates several techniques, including end-to-end explainable evaluation, rule-based explanation, and user-adaptive explanation. The proposed XAI framework reaches 97.72% accuracy, 90.72% precision, 93.72% recall, 96.72% F1-score, 9.55% FDR, 9.66% FOR, and 91.18% DOR. We also discuss the potential applications of the proposed XAI models in the smart healthcare environment, which will help ensure trust and accountability in AI-based decisions, essential for a safe and reliable smart healthcare environment.
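The reported metrics (including the less common FDR, FOR, and DOR) all derive from the binary confusion matrix; a minimal sketch with invented counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard classifier metrics computed from a binary confusion matrix."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall),
        "FDR": fp / (fp + tp),        # false discovery rate
        "FOR": fn / (fn + tn),        # false omission rate
        "DOR": (tp * tn) / (fp * fn)  # diagnostic odds ratio
    }
```

Note that DOR is an odds ratio, not a percentage, so it is reported on a different scale from the other metrics.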
To the Editor, We read with keen interest the article by Singh and Mantri titled "A clinical decision support system using rough set theory and machine learning for disease prediction," published in Intelligent Medicine in 2024 [1]. The paper presents an important contribution to the field of clinical decision support by leveraging rough set theory for feature reduction and hybridizing it with machine learning models to predict disease conditions with improved interpretability. The fusion of rule-based systems with algorithmic learning aligns well with current efforts to enhance transparency in medical artificial intelligence.
To the Editor, We read with great interest the recent study by Islam et al [1]. The article offers a timely and robust exploration into hypertension prediction using machine learning (ML) and explainable AI tools. The combination of XGBoost and Recursive Feature Elimination (RFE), supported by interpretability methods such as SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME), resulted in 91.5% accuracy and an area under the curve of 0.95, while also uncovering key predictors such as genetic pedigree coefficients and hemoglobin levels [1].
Wireless Sensor Networks (WSNs) are large-scale, high-density networks that typically have overlapping coverage areas. In addition, a random deployment of sensor nodes cannot fully guarantee coverage of the sensing area, which leads to coverage holes in WSNs. Thus, coverage control plays an important role in WSNs. To alleviate unnecessary energy wastage and improve network performance, we consider both energy efficiency and coverage rate for WSNs. In this paper, we present a novel coverage control algorithm based on Particle Swarm Optimization (PSO). First, the sensor nodes are randomly deployed in a target area and remain static after deployment. Then, the whole network is partitioned into grids, and we calculate each grid's coverage rate and energy consumption. Finally, each sensor node's sensing radius is adjusted according to the coverage rate and energy consumption of its grid. Simulation results show that our algorithm can effectively improve the coverage rate and reduce energy consumption.
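The per-grid coverage rate that the algorithm adjusts radii against can be sketched as the fraction of sampled grid points lying within the sensing radius of at least one node. The area dimensions, step, and sensor positions below are illustrative assumptions.

```python
def coverage_rate(sensors, radius, width, height, step=1.0):
    """Fraction of sampled grid points within sensing radius of at least
    one sensor (binary disc sensing model, illustrative parameters)."""
    covered = total = 0
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            total += 1
            if any((x - sx) ** 2 + (y - sy) ** 2 <= radius ** 2
                   for sx, sy in sensors):
                covered += 1
            x += step
        y += step
    return covered / total
```

A PSO-based controller would evaluate this quantity per grid and per candidate radius assignment, trading coverage against the energy cost of larger sensing radii.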
文摘We assessed nutrient characteristics, distributions and fractions within the disturbed and undisturbed sediments at four sampling sites within the mainstream of Haihe River. The river sediments contained mostly sand ( 60%). The fraction of clay was 3%. Total nitrogen (TN) and total phosphorus (TP) concentrations ranged from 729 to 1922 mg/kg and from 692 to 1388 mg/kg, respectively. Nutrient concentrations within the sediments usually decreased with increasing depth. The TN and TP concentrations within the fine sand were higher than for that within silt. Sediment phosphorus fractions were between 2.99% and 3.37% Ex-P (exchangeable phosphorus), 7.89% and 13.71% Fe/Al-P (Fe, Al oxides bound phosphorus), 61.32% and 70.14% Ca-P (calcium-bound phosphorus), and 17.03% and 22.04% Org-P (organic phosphorus). Nitrogen and phosphorus release from sediment could lead to the presence of 21.02 mg N/L and 3.10 mg P/L within the water column. A river restoration project should address the sediment nutrient stock.
Abstract: To the Editor, We commend Chen et al [1] for their insightful exploration of the challenges associated with the evaluation of large language models (LLMs) and agents in clinical applications. Their work contributes significantly to the discourse on integrating artificial intelligence into healthcare, highlighting critical issues such as data privacy, model interpretability, and the risk of misinformation.
Funding: supported by the National Key Research and Development Program of China (2017YFB1401300, 2017YFB1401304), the National Natural Science Foundation of China (61702211, L1724007, 61902203), the Hubei Provincial Science and Technology Program of China (2017AKA191), the Self-Determined Research Funds of Central China Normal University (CCNU) from the Colleges' Basic Research (CCNU17QD0004, CCNU17GF0002), the Natural Science Foundation of Shandong Province (ZR2017QF015), and the Key Research and Development Plan (Major Scientific and Technological Innovation Projects) of Shandong Province (2019JZZY020101).
Abstract: In recent years, network traffic data have become larger and more complex, leading to a higher risk of network intrusion. Traditional intrusion detection methods have difficulty processing high-speed network data and cannot detect previously unknown attacks. Therefore, this paper proposes a network attack detection method combining flow calculation and deep learning. The method consists of two parts: a real-time detection algorithm based on flow calculations and frequent patterns, and a classification algorithm based on the deep belief network and support vector machine (DBN-SVM). Sliding window (SW) stream data processing enables real-time detection, and the DBN-SVM algorithm improves classification accuracy. Finally, a system is implemented to verify the proposed method. Based on the CICIDS2017 open-source dataset, a series of comparative experiments are conducted. The method's real-time detection efficiency is higher than that of traditional machine learning algorithms. Its attack classification accuracy is 0.7 percentage points higher than that of a DBN and 2 percentage points higher than that of the ensemble boosting and bagging methods. Hence, it is suitable for the real-time detection of high-speed network intrusions.
Funding: This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2021R1A6A1A03039493), and in part by the NRF grant funded by the Korea government (MSIT) (NRF-2022R1A2C1004401).
Abstract: Phishing is a type of cybercrime in which cyber-attackers pose as authorized persons or entities and steal the victims' sensitive data. E-mails, instant messages and phone calls are some of the common modes used in cyberattacks. Though security models are continuously upgraded to prevent cyberattacks, hackers find innovative ways to target victims, and a drastic increase has been observed in the number of phishing emails sent to potential targets. This scenario necessitates the design of an effective classification model. In addition to the numerous conventional models available in the literature for the classification of phishing emails, Machine Learning (ML) techniques and Deep Learning (DL) models have also been employed. The current study presents an Intelligent Cuckoo Search (CS) Optimization Algorithm with a Deep Learning-based Phishing Email Detection and Classification (ICSOA-DLPEC) model. The aim of the proposed ICSOA-DLPEC model is to effectually distinguish emails as either legitimate or phishing. At the initial stage, pre-processing is performed in three steps: email cleaning, tokenization and stop-word elimination. Then, the N-gram approach is applied to extract useful feature vectors. The Gated Recurrent Unit (GRU) model is employed to detect and classify phishing emails, and the CS algorithm is used to fine-tune the parameters involved in the GRU model. The performance of the proposed ICSOA-DLPEC model was experimentally validated using a benchmark dataset, and the results were assessed along several dimensions. Extensive comparative studies were conducted, and the results confirmed the superior performance of the proposed ICSOA-DLPEC model over existing approaches, with a maximum accuracy of 99.72%.
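The three pre-processing stages plus the N-gram step can be sketched in a few lines. The stop-word list and cleaning regular expression below are toy assumptions, not the paper's exact configuration:

```python
import re

STOP_WORDS = {"the", "a", "an", "is", "to", "and", "of", "your"}  # toy list

def preprocess(email_text, n=2):
    """Email cleaning -> tokenization -> stop-word elimination -> N-grams."""
    cleaned = re.sub(r"[^a-z0-9\s]", " ", email_text.lower())     # cleaning
    tokens = [t for t in cleaned.split() if t not in STOP_WORDS]  # tokenize + stop words
    return list(zip(*(tokens[i:] for i in range(n))))             # N-grams

print(preprocess("Verify your account NOW to avoid suspension!"))
# → [('verify', 'account'), ('account', 'now'), ('now', 'avoid'), ('avoid', 'suspension')]
```

The resulting N-grams would then be vectorized (e.g. as counts) before being fed to the GRU classifier.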
Funding: funded by Stefan cel Mare University of Suceava, Romania.
Abstract: In Wireless Body Area Networks (WBANs) for health care, sensors positioned inside the body of an individual periodically transfer sensed data to a central station. Great challenges posed to healthcare WBANs are black hole and sink hole attacks, in which sink hole or black hole nodes attract data from the deployed sensor nodes by advertising the shortest path. Identifying this issue is quite a challenging task, as a small variation in medicine intake may result in severe illness. This work proposes a hybrid framework for detecting such attacks by applying a Proportional Coinciding Score (PCS) and an MK-Means algorithm, a well-known machine learning technique used to raise attack detection accuracy and reduce computational difficulty while supporting treatment of heart and respiratory conditions. First, the feature count of the gathered training data is reduced through data pre-processing in the PCS. Second, the pre-processed features are sent to the MK-Means algorithm for training and classification. Third, certain attack detection measures provided by the intrusion detection system, such as the number of data packages transmitted and received, are identified by the MK-Means algorithm. This study demonstrates that the MK-Means framework yields high detection accuracy with a low packet loss rate, low communication overhead, and reduced end-to-end delay in the network, and improves the accuracy of biomedical data.
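MK-Means is described as a modified k-means; since the abstract does not detail the modification, the sketch below uses plain one-dimensional k-means as a stand-in, separating normal per-node packet counts from the inflated counts a sink hole node would attract (all numbers are hypothetical):

```python
def kmeans_1d(points, k=2, iters=20):
    """Plain k-means on a single feature (a stand-in for the MK-Means step)."""
    # Spread the initial centers across the sorted value range.
    centers = sorted(points)[::max(1, len(points) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[idx].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Packet counts per node: normal nodes vs. a sink-hole-like node.
counts = [10, 12, 11, 9, 13, 95, 101, 98]
centers, clusters = kmeans_1d(counts)
print(centers)  # → [11.0, 98.0]: normal cluster vs. suspicious cluster
```

A detection rule would then flag nodes falling in the high-traffic cluster for further inspection.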
Abstract: Healthcare monitoring in a clinical setting involves much implicit communication between the patient and the caretakers, and any misinterpretation can lead to adverse effects. A simple wearable system can precisely interpret this implicit communication for the caretakers or for an automated support device; simple and obvious hand movements can be used for this purpose. The proposed system suggests a novel methodology, simpler than existing sign language interpretation, for such implicit communication. The experimental results show a well-distinguished realization of different hand movement activities using a wearable sensor medium, and the interpretation results consistently show significant thresholds.
Abstract: In the present scenario of rapid growth in cloud computing models, several companies and users have started to share their data on cloud servers. However, when the model is not completely trusted, data owners face several security-related problems during data outsourcing, such as user privacy breaches, data disclosure, and data corruption. Several models have been proposed to address and handle these security issues in the cloud. With that concern, this paper develops a Privacy-Preserved Data Security Approach (PP-DSA) to provide data security and data integrity for outsourced data in the cloud environment. Privacy preservation is ensured in this work with an Efficient Authentication Technique (EAT) using the group signature method applied with a Third-Party Auditor (TPA). The role of the auditor is to secure the data and guarantee shared data integrity. Additionally, the Cloud Service Provider (CSP) and Data User (DU) can also be attackers, which the EAT is designed to handle. The major objective of the work is to enhance cloud security and thereby increase Quality of Service (QoS). The results are evaluated for model effectiveness, security, and reliability and show that the proposed model provides better results than existing works.
Abstract: Optimizing the performance of composite structures is a real-world application with significant benefits. In this paper, a high-fidelity finite element method (FEM) is combined with the iterative improvement capability of metaheuristic optimization algorithms to obtain optimized composite plates. The FEM module comprises a nine-node isoparametric plate bending element in conjunction with the first-order shear deformation theory (FSDT). A recently proposed memetic version of particle swarm optimization, called RPSOLC, is modified in the current research to carry out multi-objective Pareto optimization. The performance of the MO-RPSOLC is found to be comparable with that of the NSGA-III. This work successfully highlights the use of FEM-MO-RPSOLC in obtaining high-fidelity Pareto solutions that simultaneously maximize the fundamental frequency and frequency separation in laminated composites by optimizing the stacking sequence.
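The Pareto aspect of the optimization can be illustrated independently of the FEM solver: a candidate design survives if no other design is at least as good in both objectives and strictly better in at least one. The objective values below are hypothetical, not results from the paper:

```python
def pareto_front(designs):
    """Keep the designs not dominated under simultaneous maximization
    of (fundamental_frequency, frequency_separation)."""
    def dominates(a, b):
        return (all(x >= y for x, y in zip(a, b))
                and any(x > y for x, y in zip(a, b)))
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o is not d)]

# Hypothetical (f1, separation) pairs for candidate stacking sequences.
designs = [(120.0, 30.0), (110.0, 45.0), (100.0, 20.0), (105.0, 40.0)]
print(pareto_front(designs))  # → [(120.0, 30.0), (110.0, 45.0)]
```

In MO-RPSOLC this filtering step would be applied to the swarm each generation, with the FEM supplying the two objective values per stacking sequence.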
Abstract: Data offloading with low latency and reduced energy consumption is highly important for every technology. Smart applications process data very quickly with low power consumption. As technology moves toward the 5G communication architecture, identifying a solution for QoS in 5G through energy-efficient computing is important. In the proposed model, we perform data offloading at 5G using the fuzzification concept. Mobile IoT devices create tasks in the network, which are offloaded to the cloud or to mobile edge nodes based on energy consumption. Two types of base stations, small (SB) and macro (MB), are initialized, and the first tasks are computed randomly. Then, the tasks are processed using a fuzzification algorithm to select SB or MB in the central server. Optimization is performed using a grasshopper algorithm to improve the QoS of the 5G network. The results are compared with existing algorithms and indicate that the proposed system improves performance, at a cost of 44.64 J for computing 250 benchmark tasks.
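The fuzzification step can be sketched with triangular membership functions over a task's estimated energy cost; the membership breakpoints and the SB/MB decision rule below are illustrative assumptions, not the paper's tuned values:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def choose_station(energy_cost_j):
    """Fuzzify a task's estimated energy cost (joules) and pick a station:
    low-cost tasks go to the small base station (SB), high-cost to macro (MB)."""
    low  = tri(energy_cost_j, -1.0, 0.0, 5.0)  # membership in 'low energy'
    high = tri(energy_cost_j,  2.0, 8.0, 9.0)  # membership in 'high energy'
    return "SB" if low >= high else "MB"

print(choose_station(1.0), choose_station(7.0))  # → SB MB
```

A production system would defuzzify over more input variables (latency, load) and then let the grasshopper optimizer tune the membership parameters.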
Abstract: The Mobile Ad-hoc Network (MANET) is a dynamic topology that supports applications in various disciplines. Security has been the most difficult topic in this field: MANET is vulnerable to various threats that affect its usability and accessibility. The black hole attack is considered one of the most widespread active attacks; it degrades network performance and reliability, as the malicious node drops all incoming packets. The black hole node aims to deceive any node in the network that wishes to connect to another node by pretending to hold the shortest route to the target node. Ad-hoc On-demand Distance Vector (AODV) is a reactive routing protocol with no built-in mechanism to locate and eliminate black hole nodes. We improve AODV by incorporating a novel compact method, based on timers and baits, for detecting and isolating both single and collaborative black hole attacks. The proposed method allows MANET nodes to discover and isolate black hole nodes despite dynamic changes in the network topology. We evaluate the method's performance with Network Simulator (NS)-3 models. The proposed approach comes exceptionally close to the original AODV without black holes in terms of bandwidth, end-to-end latency, error rate, and delivery ratio.
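The bait idea can be sketched abstractly: issue a route request for a destination that cannot exist, so only a black hole node, which claims a route to everything, replies. The node model below is a deliberate simplification; the timer logic and the NS-3 implementation are not reproduced:

```python
def detect_black_holes(nodes, bait_address="10.255.255.254"):
    """Send a bait route request for a non-existent destination; any node
    replying that it has a route is flagged as a black hole.
    `nodes` maps node id -> predicate answering 'do you have a route?'."""
    return [nid for nid, has_route in nodes.items() if has_route(bait_address)]

honest = lambda addr: False     # honest nodes know no route to the bait
malicious = lambda addr: True   # a black hole claims a route to everything
nodes = {"n1": honest, "n2": malicious, "n3": honest}
print(detect_black_holes(nodes))  # → ['n2']
```

In the actual protocol, flagged nodes would be blacklisted and excluded from subsequent AODV route discovery.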
Abstract: Major fields such as military applications, medical fields, weather forecasting, and environmental applications use wireless sensor networks for major computing processes, and sensors play a vital role in these emerging technologies. Localizing sensors at the needed locations is a very serious problem. The environment is home to every living being in the world, and the growth of industries after the industrial revolution has increased pollution across it. Owing to recent uncontrolled growth and development, sensors are needed to measure pollution levels across industries and their surroundings, and choosing where to fit the sensors is an interesting and challenging task. Many meta-heuristic techniques have been introduced for node localization, and swarm intelligence algorithms have proven their efficiency in many studies on localization problems. In this article, we introduce an industry-centric approach to solve the node localization problem in sensor networks. First, our work selects industrial areas in the sensed location, using random forest regression to identify the polluted areas. Then, the elephant herding algorithm is used for sensor node localization. These two algorithms are combined to produce the best standard result in localizing the sensor nodes. To evaluate the proposed approach, experiments are conducted with data from the KDD Cup 2018, which contains 35 stations with concentrations of air pollutants such as PM, SO2, CO, NO2, and O3. These data are normalized and tested with the algorithms. The results are comparatively analyzed against other swarm intelligence algorithms, such as the elephant herding algorithm and particle swarm optimization, and machine learning algorithms, such as decision tree regression and the multi-layer perceptron. The results indicate that our proposed algorithm can suggest more meaningful locations for the sensors in the topology, achieving lower root mean square error values of 0.06 to 0.08 when localizing with Stations 1 to 5.
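The reported root-mean-square figure is a standard error measure over estimated versus true positions. A minimal sketch of the computation (the station coordinates below are made up, not KDD Cup 2018 data):

```python
import math

def rmse(estimated, actual):
    """Root mean square error between estimated and true 2-D positions."""
    se = [(ex - ax) ** 2 + (ey - ay) ** 2
          for (ex, ey), (ax, ay) in zip(estimated, actual)]
    return math.sqrt(sum(se) / len(se))

# Hypothetical normalized station coordinates, each estimate off by 0.05.
actual    = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)]
estimated = [(0.05, 0.0), (1.0, 1.05), (1.95, 0.5)]
print(rmse(estimated, actual))  # → 0.05 (approximately, up to float rounding)
```

Lower RMSE means the optimizer's proposed sensor placements sit closer to the reference locations.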
Funding: Supported by the National Natural Science Foundation of China under Grant Nos 61334005, 51272008 and 60990314, the Beijing Municipal Science and Technology Project under Grant No H030430020000, and the National Basic Research Program of China under Grant Nos 2012CB619304 and 2012CB619306.
Abstract: Introducing a thin InGaN interlayer with a relatively lower indium content between the quantum well (QW) and the barrier results in a step-like InxGa1-xN/GaN potential barrier on one side of the QW. This change in the active region leads to a significant shift of the photoluminescence (PL) and electroluminescence (EL) emissions to a longer wavelength compared with conventional QW-based light-emitting diodes. More importantly, an improvement against efficiency droop and an enhancement in light output power at high-current injection are observed in the modified light-emitting diode structures. The role of the inserted layer in these improvements is investigated in detail by simulation, which shows that the creation of more sublevels in the valence band and the increase of hole concentration inside the QWs are the main reasons for these improvements.
Abstract: Digital watermarking is a technology that facilitates the authentication, copyright protection and security of digital media. The objective of developing a robust watermarking technique is to incorporate the maximum possible robustness without compromising transparency. Singular Value Decomposition (SVD) using the Firefly Algorithm meets this objective for an optimal robust watermarking technique. Multiple scaling factors are used to embed the watermark image into the host by multiplying these scaling factors with the Singular Values (SV) of the host audio. The Firefly Algorithm is used to optimize the modified host audio to achieve the highest possible robustness and transparency. This approach can significantly increase the quality of the watermarked audio and provide more robustness for the embedded watermark against various attacks such as noise, resampling and filtering.
Funding: supported by CONAHCYT (Consejo Nacional de Humanidades, Ciencias y Tecnologias).
Abstract: The use of Explainable Artificial Intelligence (XAI) models is becoming increasingly important for decision-making in smart healthcare environments, to ensure that decisions are based on trustworthy algorithms and that healthcare workers understand the decisions those algorithms make. Such models can enhance the interpretability and explainability of decision-making processes that rely on artificial intelligence. Nevertheless, the intricate nature of the healthcare field necessitates sophisticated models for classifying cancer images. This research presents an advanced investigation of XAI models for cancer image classification. It describes the different levels of explainability and interpretability associated with XAI models and the challenges faced in deploying them in healthcare applications. In addition, this study proposes a novel framework for cancer image classification that combines XAI models with deep learning and advanced medical imaging techniques. The proposed model integrates several techniques, including end-to-end explainable evaluation, rule-based explanation, and user-adaptive explanation. The proposed XAI reaches 97.72% accuracy, 90.72% precision, 93.72% recall, 96.72% F1-score, 9.55% FDR, 9.66% FOR, and 91.18% DOR. The study also discusses potential applications of the proposed XAI models in the smart healthcare environment, helping to ensure trust and accountability in AI-based decisions, which is essential for a safe and reliable smart healthcare environment.
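All of the reported figures derive from a confusion matrix. A minimal sketch of the standard definitions, including the less common FDR (false discovery rate), FOR (false omission rate), and DOR (diagnostic odds ratio); the counts below are illustrative, not the study's data:

```python
def metrics(tp, fp, fn, tn):
    """Classification metrics computed from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall    = tp / (tp + fn)
    return {
        "accuracy":  (tp + tn) / (tp + fp + fn + tn),
        "precision": precision,
        "recall":    recall,
        "f1":        2 * precision * recall / (precision + recall),
        "fdr":       fp / (fp + tp),         # false discovery rate
        "for":       fn / (fn + tn),         # false omission rate
        "dor":       (tp * tn) / (fp * fn),  # diagnostic odds ratio
    }

m = metrics(tp=90, fp=10, fn=5, tn=95)
print(m["accuracy"], m["dor"])  # → 0.925 171.0
```

Note that precision and FDR are complements (precision + FDR = 1), which is a quick consistency check when reading such tables.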
Abstract: To the Editor, We read with keen interest the article by Singh and Mantri titled "A clinical decision support system using rough set theory and machine learning for disease prediction," published in Intelligent Medicine in 2024 [1]. The paper presents an important contribution to the field of clinical decision support by leveraging rough set theory for feature reduction and hybridizing it with machine learning models to predict disease conditions with improved interpretability. The fusion of rule-based systems with algorithmic learning aligns well with current efforts to enhance transparency in medical artificial intelligence.
Abstract: To the Editor, We read with great interest the recent study by Islam et al [1]. The article offers a timely and robust exploration of hypertension prediction using machine learning (ML) and explainable AI tools. The combination of XGBoost and Recursive Feature Elimination (RFE), supported by interpretability methods such as SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME), resulted in 91.5% accuracy and an area under the curve of 0.95, while also uncovering key predictors such as genetic pedigree coefficients and hemoglobin levels [1].
Funding: This research work was supported by the National Natural Science Foundation of China (61772454, 61811530332). Professor Gwang-jun Kim is the corresponding author.
Abstract: Wireless Sensor Networks (WSNs) are large-scale, high-density networks whose coverage areas typically overlap. In addition, a random deployment of sensor nodes cannot fully guarantee coverage of the sensing area, which leads to coverage holes in WSNs. Thus, coverage control plays an important role in WSNs. To alleviate unnecessary energy wastage and improve network performance, we consider both energy efficiency and coverage rate. In this paper, we present a novel coverage control algorithm based on Particle Swarm Optimization (PSO). Firstly, the sensor nodes are randomly deployed in a target area and remain static after deployment. Then, the whole network is partitioned into grids, and we calculate each grid's coverage rate and energy consumption. Finally, each sensor node's sensing radius is adjusted according to the coverage rate and energy consumption of its grid. Simulation results show that our algorithm can effectively improve the coverage rate and reduce energy consumption.
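The grid-based coverage calculation can be sketched as follows: partition the area into cells and count the cell centres that fall inside at least one sensor's sensing disk. Area size, grid resolution, and sensor placements below are illustrative assumptions:

```python
def coverage_rate(sensors, area=10.0, cells=20):
    """Fraction of grid-cell centres covered by at least one sensor disk.
    `sensors` is a list of (x, y, sensing_radius) tuples."""
    step = area / cells
    covered = 0
    for i in range(cells):
        for j in range(cells):
            cx, cy = (i + 0.5) * step, (j + 0.5) * step  # cell centre
            if any((cx - x) ** 2 + (cy - y) ** 2 <= r * r
                   for x, y, r in sensors):
                covered += 1
    return covered / (cells * cells)

# Two non-overlapping sensors of radius 2 in a 10 x 10 area:
# each disk covers ~pi*4 of 100 units, so the rate is roughly 0.25.
sensors = [(2.5, 2.5, 2.0), (7.5, 7.5, 2.0)]
rate = coverage_rate(sensors)
```

In the PSO loop, each particle would encode candidate sensing radii, and this coverage rate (minus an energy penalty) would serve as the fitness.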