Journal Articles
14,668 articles found
1. DEEP NEURAL NETWORKS COMBINING MULTI-TASK LEARNING FOR SOLVING DELAY INTEGRO-DIFFERENTIAL EQUATIONS (Cited by: 1)
Authors: WANG Chen-yao, SHI Feng. 《数学杂志》, 2025, No. 1, pp. 13-38 (26 pages).
Deep neural networks (DNNs) are effective in solving both forward and inverse problems for nonlinear partial differential equations (PDEs). However, conventional DNNs are not effective in handling problems such as delay differential equations (DDEs) and delay integro-differential equations (DIDEs) with constant delays, primarily due to their low regularity at delay-induced breaking points. In this paper, a DNN method that combines multi-task learning (MTL) is proposed to solve both the forward and inverse problems of DIDEs. The core idea of this approach is to divide the original equation into multiple tasks based on the delay, using auxiliary outputs to represent the integral terms, followed by the use of MTL to seamlessly incorporate the properties at the breaking points into the loss function. Furthermore, given the increased training difficulty associated with multiple tasks and outputs, we employ a sequential training scheme to reduce training complexity and provide reference solutions for subsequent tasks. This approach significantly enhances the approximation accuracy of solving DIDEs with DNNs, as demonstrated by comparisons with traditional DNN methods. We validate the effectiveness of this method through several numerical experiments, test various parameter sharing structures in MTL, and compare the testing results of these structures. Finally, this method is implemented to solve the inverse problem of a nonlinear DIDE, and the results show that the unknown parameters of the DIDE can be discovered with sparse or noisy data.
Keywords: Delay integro-differential equation; Multi-task learning; parameter sharing structure; deep neural network; sequential training scheme
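As a rough illustration of the idea in the abstract above (not the authors' code), the sketch below trains a small network on one delay interval while penalizing both the equation residual and a breaking-point continuity term, in the spirit of treating them as separate tasks in one loss. The toy DDE u'(t) = -u(t - tau), the constant history, the network size, and the loss weights are all assumptions.

```python
import torch

tau = 1.0
net = torch.nn.Sequential(torch.nn.Linear(1, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def history(t):                     # assumed known history: u(t) = 1 for t <= 0
    return torch.ones_like(t)

t = torch.linspace(0.0, tau, 200).reshape(-1, 1).requires_grad_(True)
t_junction = torch.zeros(1, 1)      # junction between history and solution

for step in range(2000):
    u = net(t)
    du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    residual = du + history(t - tau)                      # toy DDE: u'(t) = -u(t - tau)
    loss_pde = (residual ** 2).mean()                     # task 1: equation residual
    loss_junction = ((net(t_junction) - history(t_junction)) ** 2).mean()  # task 2: continuity
    loss = loss_pde + 10.0 * loss_junction                # multi-task weighting (assumed)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In a sequential scheme of the kind described, the solution learned on [0, tau] would then serve as the reference history for the next task on [tau, 2*tau].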
2. Addressing Modern Cybersecurity Challenges: A Hybrid Machine Learning and Deep Learning Approach for Network Intrusion Detection
Authors: Khadija Bouzaachane, El Mahdi El Guarmah, Abdullah M. Alnajim, Sheroz Khan. 《Computers, Materials & Continua》, 2025, No. 8, pp. 2391-2410 (20 pages).
The rapid increase in the number of Internet of Things (IoT) devices, coupled with a rise in sophisticated cyberattacks, demands robust intrusion detection systems. This study presents a holistic, intelligent intrusion detection system. It uses a combined method that integrates machine learning (ML) and deep learning (DL) techniques to improve the protection of contemporary information technology (IT) systems. Unlike traditional signature-based or single-model methods, this system integrates the strengths of ensemble learning for binary classification and deep learning for multi-class classification. This combination provides a more nuanced and adaptable defense. The research utilizes the NF-UQ-NIDS-v2 dataset, a recent, comprehensive benchmark for evaluating network intrusion detection systems (NIDS). Our methodological framework employs advanced artificial intelligence techniques. Specifically, we use ensemble learning algorithms (Random Forest, Gradient Boosting, AdaBoost, and XGBoost) for binary classification. Deep learning architectures are also employed to address the complexities of multi-class classification, allowing for fine-grained identification of intrusion types. To mitigate class imbalance, a common problem in multi-class intrusion detection that biases model performance, we use oversampling and data augmentation. These techniques ensure equitable class representation. The results demonstrate the efficacy of the proposed hybrid ML-DL system. It achieves significant improvements in intrusion detection accuracy and reliability. This research contributes substantively to cybersecurity by providing a more robust and adaptable intrusion detection solution.
Keywords: network intrusion detection systems (NIDS); NF-UQ-NIDS-v2 dataset; ensemble learning; decision tree; K-means; SMOTE; deep learning
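A minimal sketch of the binary ensemble stage plus SMOTE oversampling mentioned above, using scikit-learn and imbalanced-learn; the synthetic data stands in for NF-UQ-NIDS-v2 features, the hyperparameters are placeholders, and XGBoost is omitted to keep the example dependency-light.

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, GradientBoostingClassifier,
                              AdaBoostClassifier, VotingClassifier)
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# placeholder for NetFlow-style features: y = 0 (benign) / 1 (attack), heavily imbalanced
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

# rebalance the training split only, so the test set keeps its natural distribution
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_train, y_train)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200)),
                ("gb", GradientBoostingClassifier()),
                ("ada", AdaBoostClassifier())],
    voting="soft",
)
ensemble.fit(X_bal, y_bal)
print(classification_report(y_test, ensemble.predict(X_test)))
```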
3. Handling class imbalance of radio frequency interference in deep learning-based fast radio burst search pipelines using a deep convolutional generative adversarial network
Authors: Wenlong Du, Yanling Liu, Maozheng Chen. 《Astronomical Techniques and Instruments》, 2025, No. 1, pp. 10-15 (6 pages).
This paper addresses the performance degradation issue in a fast radio burst search pipeline based on deep learning. This issue is caused by the class imbalance of the radio frequency interference samples in the training dataset, and one solution is applied to improve the distribution of the training data by augmenting minority class samples using a deep convolutional generative adversarial network. Experimental results demonstrate that retraining the deep learning model with the newly generated dataset leads to a new fast radio burst classifier, which effectively reduces false positives caused by periodic wide-band impulsive radio frequency interference, thereby enhancing the performance of the search pipeline.
Keywords: Fast radio burst; deep convolutional generative adversarial network; Class imbalance; Radio frequency interference; deep learning
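To illustrate the kind of model used for minority-class augmentation, here is a compact DCGAN-style generator/discriminator pair in PyTorch; the single-channel 64x64 image size, latent dimension, and layer widths are assumptions, not the pipeline's actual configuration.

```python
import torch
import torch.nn as nn

latent_dim = 100  # assumed noise dimension

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(                      # latent vector -> 1x64x64 image
            nn.ConvTranspose2d(latent_dim, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1), nn.Tanh())

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(                      # 1x64x64 image -> real/fake score
            nn.Conv2d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, True),
            nn.Conv2d(128, 256, 4, 2, 1), nn.BatchNorm2d(256), nn.LeakyReLU(0.2, True),
            nn.Conv2d(256, 1, 8, 1, 0), nn.Sigmoid())

    def forward(self, x):
        return self.net(x).view(-1)

G, D = Generator(), Discriminator()
fake = G(torch.randn(16, latent_dim, 1, 1))   # 16 synthetic minority-class candidates
print(fake.shape, D(fake).shape)              # torch.Size([16, 1, 64, 64]) torch.Size([16])
```

After adversarial training, samples drawn from the generator would be added to the minority class before retraining the classifier.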
4. The Blockchain Neural Network Superior to Deep Learning for Improving the Trust of Supply Chain
Authors: Hsiao-Chun Han, Der-Chen Huang. 《Computer Modeling in Engineering & Sciences》, 2025, No. 6, pp. 3921-3941 (21 pages).
With the increasing importance of supply chain transparency, blockchain-based data has emerged as a valuable and verifiable source for analyzing procurement transaction risks. This study extends the mathematical model and proof of 'the Overall Performance Characteristics of the Supply Chain' to encompass multiple variables within blockchain data. Utilizing graph theory, the model is further developed into a single-layer neural network, which serves as the foundation for constructing two multi-layer deep learning neural network models, the Feedforward Neural Network (abbreviated as FNN) and the Deep Clustering Network (abbreviated as DCN). Furthermore, this study retrieves corporate data from the Chunghwa Yellow Pages online resource and the Taiwan Economic Journal database (abbreviated as TEJ). These data are then virtualized using 'the Metaverse Algorithm', and the selected virtualized blockchain variables are utilized to train a neural network model for classification. The results demonstrate that a single-layer neural network model, leveraging blockchain data and employing the Proof of Relation algorithm (abbreviated as PoR) as the activation function, effectively identifies anomalous enterprises, which constitute 7.2% of the total sample, aligning with expectations. In contrast, the multi-layer neural network models, DCN and FNN, classify an excessively large proportion of enterprises as anomalous (ranging from one-fourth to one-third), which deviates from expectations. This indicates that deep learning may still be inadequate in effectively capturing or identifying malicious corporate behaviors associated with distortions in procurement transaction data. In other words, procurement transaction blockchain data possesses intrinsic value that cannot be replaced by artificial intelligence (abbreviated as AI).
Keywords: Blockchain neural network; deep learning; consensus algorithm; supply chain management; information security management
5. A Deep Learning Framework for Arabic Cyberbullying Detection in Social Networks
Authors: Yahya Tashtoush, Areen Banysalim, Majdi Maabreh, Shorouq Al-Eidi, Ola Karajeh, Plamen Zahariev. 《Computers, Materials & Continua》, 2025, No. 5, pp. 3113-3134 (22 pages).
Social media has emerged as one of the most transformative developments on the internet, revolutionizing the way people communicate and interact. However, alongside its benefits, social media has also given rise to significant challenges, one of the most pressing being cyberbullying. This issue has become a major concern in modern society, particularly due to its profound negative impacts on the mental health and well-being of its victims. In the Arab world, where social media usage is exceptionally high, cyberbullying has become increasingly prevalent, necessitating urgent attention. Early detection of harmful online behavior is critical to fostering safer digital environments and mitigating the adverse effects of cyberbullying. This underscores the importance of developing advanced tools and systems to identify and address such behavior effectively. This paper investigates the development of a robust cyberbullying detection and classification system tailored for Arabic comments on YouTube. The study explores the effectiveness of various deep learning models, including Bi-LSTM (Bidirectional Long Short-Term Memory), LSTM (Long Short-Term Memory), CNN (Convolutional Neural Networks), and a hybrid CNN-LSTM, in classifying Arabic comments into binary classes (bullying or not) and multiclass categories. A comprehensive dataset of 20,000 Arabic YouTube comments was collected, preprocessed, and labeled to support these tasks. The results revealed that the CNN and hybrid CNN-LSTM models achieved the highest accuracy in binary classification, reaching an impressive 91.9%. For multiclass classification, the LSTM and Bi-LSTM models outperformed others, achieving an accuracy of 89.5%. These findings highlight the effectiveness of deep learning approaches in the mitigation of cyberbullying within Arabic online communities.
Keywords: Arabic text classification; Arabic text mining; cyberbullying detection; neural networks; deep learning; CNN; LSTM; YouTube; Bi-LSTM
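A minimal Keras sketch of a hybrid CNN-LSTM binary text classifier of the kind compared above; the vocabulary size, sequence length, and layer sizes are illustrative assumptions, and the random placeholder batch stands in for padded, integer-encoded Arabic comments.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, max_len = 20000, 100              # assumed tokenizer settings

model = keras.Sequential([
    layers.Embedding(vocab_size, 128),
    layers.Conv1D(64, 5, activation="relu"),  # local n-gram features
    layers.MaxPooling1D(4),
    layers.LSTM(64),                          # sequence modelling over pooled features
    layers.Dense(1, activation="sigmoid"),    # bullying / not bullying
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

x = np.random.randint(0, vocab_size, size=(32, max_len))   # placeholder token IDs
y = np.random.randint(0, 2, size=(32, 1))
model.fit(x, y, epochs=1, verbose=0)
```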
6. Deep reinforcement learning based latency-energy minimization in smart healthcare network
Authors: Xin Su, Xin Fang, Zhen Cheng, Ziyang Gong, Chang Choi. 《Digital Communications and Networks》, 2025, No. 3, pp. 795-805 (11 pages).
Significant breakthroughs in the Internet of Things (IoT) and 5G technologies have driven several smart healthcare activities, leading to a flood of computationally intensive applications in smart healthcare networks. Mobile Edge Computing (MEC) is considered an efficient solution to provide powerful computing capabilities to latency- or energy-sensitive nodes. The low-latency and high-reliability requirements of healthcare application services can be met through optimal offloading and resource allocation for the computational tasks of the nodes. In this study, we established a system model consisting of two types of nodes by considering non-divisible computational tasks with a trade-off between latency and energy consumption. To minimize the processing cost of the system tasks, a Mixed-Integer Nonlinear Programming (MINLP) task offloading problem is proposed. Furthermore, this problem is decomposed into a task offloading decision problem and a resource allocation problem. The resource allocation problem is solved using traditional optimization algorithms, and the offloading decision problem is solved using a deep reinforcement learning algorithm. We propose an Online Offloading based on Deep Reinforcement Learning (OO-DRL) algorithm with parallel deep neural networks and a weight-sensitive experience replay mechanism. Simulation results show that, compared with several existing methods, our proposed algorithm can perform real-time task offloading in a smart healthcare network in dynamically varying environments and reduce the system task processing cost.
Keywords: Smart healthcare network; Mobile edge computing; Resource allocation; Computation offloading; deep reinforcement learning
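A bare-bones deep Q-network sketch for a binary offload-or-compute-locally decision, in the spirit of the DRL offloading stage described above; the state layout, reward, network sizes, and batch handling are assumptions, and the parallel-network and weight-sensitive replay details of OO-DRL are not reproduced.

```python
import random
import torch
import torch.nn as nn

state_dim, n_actions = 4, 2    # e.g., [task size, channel gain, queue length, battery]; 0 = local, 1 = offload
gamma, eps = 0.95, 0.1

q_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def act(state):
    """Epsilon-greedy offloading decision for one node's current state."""
    if random.random() < eps:
        return random.randrange(n_actions)
    with torch.no_grad():
        return int(q_net(torch.tensor(state)).argmax())

def update(batch):
    """One TD update on a (states, actions, rewards, next_states) batch of transitions."""
    s, a, r, s2 = (torch.tensor(x, dtype=torch.float32) for x in batch)
    q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    target = r + gamma * q_net(s2).max(1).values.detach()   # reward = negative processing cost
    loss = nn.functional.mse_loss(q, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```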
7. Dynamic Clustering Method for Underwater Wireless Sensor Networks based on Deep Reinforcement Learning
Authors: Kohyar Bolvary Zadeh Dashtestani, Reza Javidan, Reza Akbari. 《哈尔滨工程大学学报(英文版)》, 2025, No. 4, pp. 864-876 (13 pages).
Underwater wireless sensor networks (UWSNs) have emerged as a new paradigm of real-time organized systems, which are utilized in a diverse array of scenarios to manage the underwater environment surrounding them. One of the major challenges that these systems confront is topology control via clustering, which reduces the overload of wireless communications within a network and ensures low energy consumption and good scalability. This study aimed to present a clustering technique in which the clustering process and cluster head (CH) selection are performed based on the Markov decision process and deep reinforcement learning (DRL). The DRL algorithm selects the CH by maximizing the defined reward function. Subsequently, the sensed data are collected by the CHs and then sent to the autonomous underwater vehicles. In the final phase, the energy consumed by each sensor is calculated, and its residual energy is updated. Then, the autonomous underwater vehicle performs all clustering and CH selection operations. This procedure persists until the point of cessation, when the sensors' power has been reduced to such an extent that no node can become a CH. Through analysis of the findings from this investigation and their comparison with alternative frameworks, the implementation of this method can be used to control the cluster size and the number of CHs, which ultimately improves the energy usage of nodes and prolongs the lifespan of the network. Our simulation results illustrate that the suggested methodology surpasses the conventional low-energy adaptive clustering hierarchy, the distance- and energy-constrained K-means clustering scheme, and the vector-based forward protocol, and is viable for deployment in an actual operational environment.
Keywords: Underwater wireless sensor network; CLUSTERING; Cluster head selection; deep reinforcement learning
8. In silico prediction of pK_(a) values using explainable deep learning methods (Cited by: 1)
Authors: Chen Yang, Changda Gong, Zhixing Zhang, Jiaojiao Fang, Weihua Li, Guixia Liu, Yun Tang. 《Journal of Pharmaceutical Analysis》, 2025, No. 6, pp. 1264-1276 (13 pages).
The negative logarithm of the acid dissociation constant (pK_(a)) significantly influences the absorption, distribution, metabolism, excretion, and toxicity (ADMET) properties of molecules and is a crucial indicator in drug research. Given the rapid and accurate characteristics of computational methods, their role in predicting drug properties is increasingly important. Although many pK_(a) prediction models currently exist, they often focus on enhancing model precision while neglecting interpretability. In this study, we present GraFpKa, a pK_(a) prediction model using graph neural networks (GNNs) and molecular fingerprints. The results show that our acidic and basic models achieved mean absolute errors (MAEs) of 0.621 and 0.402, respectively, on the test set, demonstrating good predictive performance. Notably, to improve interpretability, GraFpKa also incorporates Integrated Gradients (IGs), providing a clearer visual description of the atoms significantly affecting the pK_(a) values. The high reliability and interpretability of GraFpKa ensure accurate pK_(a) predictions while also facilitating a deeper understanding of the relationship between molecular structure and pK_(a) values, making it a valuable tool in the field of pK_(a) prediction.
Keywords: pK_(a); deep learning; Graph neural networks; AttentiveFP; Integrated gradients; In silico prediction
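To show how Integrated Gradients can attribute a prediction back to input features, here is a small sketch using the Captum library on a toy regressor; Captum, the toy model, and the random fingerprint are assumptions, since the paper applies IGs to its own graph/fingerprint model rather than to this network.

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# toy stand-in for a pKa regressor over a molecular fingerprint
model = nn.Sequential(nn.Linear(2048, 256), nn.ReLU(), nn.Linear(256, 1))
model.eval()

fingerprint = torch.rand(1, 2048)            # placeholder Morgan-style fingerprint bits
baseline = torch.zeros_like(fingerprint)     # "all features absent" reference point

ig = IntegratedGradients(model)
attributions = ig.attribute(fingerprint, baselines=baseline, target=0, n_steps=50)
top_bits = attributions.abs().squeeze().topk(10).indices  # bits that most affect the prediction
print(top_bits.tolist())
```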
9. Numerical Study of Dynamical System Using Deep Learning Approach
Authors: Manana Chumburidze, Miranda Mnatsakaniani, David Lekveishvili, Nana Julakidze. 《Open Journal of Applied Sciences》, 2025, No. 2, pp. 425-432 (8 pages).
This article is devoted to developing a deep learning method for the numerical solution of partial differential equations (PDEs). A graph kernel neural network (GKNN) approach to embedding graphs into a computationally numerical format has been used. In particular, mathematical models of the dynamical system of cancer cell invasion in inhomogeneous areas of human tissues have been considered for investigation. Neural operators were initially proposed to model the differential operator of PDEs. The GKNN mapping features between input data to the PDEs and their solutions have been constructed. The boundary integral method, in combination with Green's functions for a large number of boundary conditions, is used. The tools applied in this development are based on Fourier neural operators (FNOs), graph theory, the theory of elasticity, and singular integral equations.
Keywords: deep learning; Graph Kernel network; Green's Tensor
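The Fourier neural operator idea mentioned above can be sketched as a spectral convolution: transform to Fourier space, linearly mix a few low-frequency modes with learned weights, and transform back. The sketch below is a generic 1-D version with assumed channel and mode counts, not the authors' GKNN implementation.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Mix the lowest `modes` Fourier modes with learned complex weights."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weight = nn.Parameter(scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                          # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)                   # to Fourier space
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))   # back to physical space

layer = SpectralConv1d(channels=8, modes=16)
u = torch.randn(4, 8, 128)                          # 4 samples, 8 channels, 128 grid points
print(layer(u).shape)                               # torch.Size([4, 8, 128])
```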
10. Secure Malicious Node Detection in Decentralized Healthcare Networks Using Cloud and Edge Computing with Blockchain-Enabled Federated Learning
Authors: Raj Sonani, Reham Alhejaili, Pushpalika Chatterjee, Khalid Hamad Alnafisah, Jehad Ali. 《Computer Modeling in Engineering & Sciences》, 2025, No. 9, pp. 3169-3189 (21 pages).
Healthcare networks are transitioning from manual records to electronic health records, but this shift introduces vulnerabilities such as secure communication issues, privacy concerns, and the presence of malicious nodes. Existing machine and deep learning-based anomaly detection methods often rely on centralized training, leading to reduced accuracy and potential privacy breaches. Therefore, this study proposes a Blockchain-based Federated Learning architecture for Malicious Node Detection (BFL-MND) model. It trains models locally within healthcare clusters, sharing only model updates instead of patient data, preserving privacy and improving accuracy. Cloud and edge computing enhance the model's scalability, while blockchain ensures secure, tamper-proof access to health data. Using the PhysioNet dataset, the proposed model achieves an accuracy of 0.95, an F1 score of 0.93, a precision of 0.94, and a recall of 0.96, outperforming baseline models like random forest (0.88), adaptive boosting (0.90), logistic regression (0.86), perceptron (0.83), and deep neural networks (0.92).
Keywords: Authentication; blockchain; deep learning; federated learning; healthcare network; machine learning; wearable sensor nodes
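The "share model updates, not patient data" step can be illustrated with a minimal FedAvg-style aggregation of PyTorch state dicts; the two-cluster setup, tiny linear model, and equal weighting are assumptions, and the blockchain and malicious-node-detection logic of BFL-MND are not shown.

```python
import copy
import torch
import torch.nn as nn

def local_train(model, data, targets, epochs=1):
    """One healthcare cluster trains on its own records and returns only the weights."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(epochs):
        loss = nn.functional.binary_cross_entropy_with_logits(model(data), targets)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model.state_dict()

def fed_avg(state_dicts):
    """Server averages client weights; raw patient data never leaves the clusters."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key] for sd in state_dicts]).mean(dim=0)
    return avg

global_model = nn.Linear(10, 1)
clusters = [(torch.randn(64, 10), torch.rand(64, 1).round()) for _ in range(2)]  # placeholder records
updates = [local_train(global_model, x, y) for x, y in clusters]
global_model.load_state_dict(fed_avg(updates))
```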
11. Deep learning applications advance plant genomics research
Authors: Wenyuan Fan, Zhongwei Guo, Xiang Wang, Lingkui Zhang, Yuanhang Liu, Chengcheng Cai, Kang Zhang, Feng Cheng. 《Horticultural Plant Journal》, 2025, No. 5, pp. 1791-1806 (16 pages).
With the rapid development of high-throughput sequencing technologies and the accumulation of large-scale multi-omics data, deep learning (DL) has emerged as a powerful tool to solve complex biological problems, with particular promise in plant genomics. This review systematically examines the progress of DL applications in DNA, RNA, and protein sequence analysis, covering key tasks such as gene regulatory element identification, gene function annotation, and protein structure prediction, and highlighting how these DL applications illuminate research on plants, including horticultural plants. We evaluate the advantages of different neural network architectures and their applications in different biological studies, as well as the development of large language models (LLMs) in genomic modelling, such as the plant-specific models PDLLMs and AgroNT. We also briefly introduce the general workflow of a basic DL model for plant genomics studies. While DL has significantly improved prediction accuracy in plant genomics, its broader application remains constrained by several challenges, including the limited availability of well-annotated data, computational capacity, innovative model architectures adapted to plant genomes, and model interpretability. Future advances will require interdisciplinary collaborations to develop DL applications for intelligent plant genomic research frameworks with broader applicability.
Keywords: deep learning; GENOMICS; Transfer learning; Language model; Multi-omics; Neural network architecture
12. Deep Learning and Artificial Intelligence-Driven Advanced Methods for Acute Lymphoblastic Leukemia Identification and Classification: A Systematic Review
Authors: Syed Ijaz Ur Rahman, Naveed Abbas, Sikandar Ali, Muhammad Salman, Ahmed Alkhayat, Jawad Khan, Dildar Hussain, Yeong Hyeon Gu. 《Computer Modeling in Engineering & Sciences》, 2025, No. 2, pp. 1199-1231 (33 pages).
Automatic detection of leukemia or blood cancer is one of the most challenging tasks that need to be addressed in the healthcare system. Analysis of white blood cells (WBCs) in the blood or bone marrow microscopic slide images plays a crucial part in early identification to facilitate medical experts. For Acute Lymphocytic Leukemia (ALL), the most relevant part of the blood or marrow has to be analyzed by the experts before the disease spreads through the whole body and the condition becomes worse. Researchers have done a lot of work in this field; to provide a comprehensive analysis, a few literature reviews have been published focusing on various artificial intelligence-based techniques, such as machine and deep learning detection of ALL. The systematic review in this article has been done under the PRISMA guidelines and presents the most recent advancements in this field. Different image segmentation techniques were broadly studied and categorized from various online databases like Google Scholar, Science Direct, and PubMed, and image processing-based, traditional machine and deep learning-based, and advanced deep learning-based models were presented. Traditional Convolutional Neural Network (CNN) models, and then the recent advancements in CNNs used for the classification of ALL into its subtypes, are covered. A critical analysis of the existing methods is provided to offer clarity on the current state of the field. Finally, the paper concludes with insights and suggestions for future research, aiming to guide new researchers in the development of advanced automated systems for detecting life-threatening diseases.
Keywords: Acute lymphoblastic; bone marrow; SEGMENTATION; CLASSIFICATION; machine learning; deep learning; convolutional neural network
13. An Efficient Deep Learning-Based Hybrid Framework for Personality Trait Prediction through Behavioral Analysis
Authors: Nareshkumar Raveendhran, Nimala Krishnan. 《Computers, Materials & Continua》, 2025, No. 11, pp. 3253-3265 (13 pages).
Social media outlets deliver customers a medium for communication, exchange, and expression of their thoughts with others. The advent of social networks and the fast escalation of the quantity of data have created opportunities for textual evaluation. Utilising the user corpus, characteristics of social platform users, and other data, academic research may accurately discern the personality traits of users. This research examines the traits of consumer personalities. Usually, personality tests administered by psychological experts via interviews or self-report questionnaires are costly, time-consuming, complex, and labour-intensive. Currently, academics in computational linguistics are increasingly focused on predicting personality traits from social media data. An individual's personality comprises their traits and behavioral habits. To address this distinction, we propose a novel LSTM approach (BERT-LIWC-LSTM) that simultaneously incorporates users' enduring and immediate personality characteristics for textual personality recognition. Long-term Personality Encoding in the proposed paradigm captures and represents persisting personality traits. Short-term Personality Capturing records changing personality states. Experimental results demonstrate that the designed BERT-LIWC-LSTM model achieves an average improvement in accuracy of 3.41% on the Big Five dataset compared to current methods, thereby justifying the efficacy of encoding both stable and dynamic personality traits simultaneously through long- and short-term feature interaction.
Keywords: PERSONALITY; deep learning; online social network; LSTM; big five
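One way such a fusion could be wired is sketched below: contextual BERT token states pass through an LSTM (short-term signal), and the final hidden state is concatenated with a static LIWC-style feature vector (long-term signal) before a Big Five head. The Hugging Face checkpoint, the 64-dimensional LIWC vector, and the fusion-by-concatenation design are all assumptions; the paper's actual BERT-LIWC-LSTM wiring may differ.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")   # assumed checkpoint
bert = AutoModel.from_pretrained("bert-base-uncased")

class PersonalityClassifier(nn.Module):
    def __init__(self, liwc_dim=64, n_traits=5):                 # Big Five outputs
        super().__init__()
        self.lstm = nn.LSTM(768, 128, batch_first=True)          # over BERT token states
        self.head = nn.Linear(128 + liwc_dim, n_traits)

    def forward(self, token_states, liwc_feats):
        _, (h, _) = self.lstm(token_states)                      # short-term, contextual signal
        fused = torch.cat([h[-1], liwc_feats], dim=-1)           # add long-term, lexicon-based signal
        return self.head(fused)

texts = ["I love planning every detail of my week."]
enc = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
with torch.no_grad():
    states = bert(**enc).last_hidden_state                       # (batch, tokens, 768)
liwc = torch.rand(1, 64)                                         # placeholder LIWC category counts
print(PersonalityClassifier()(states, liwc).shape)               # torch.Size([1, 5])
```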
14. Research on SQL Injection Detection Technology Based on Content Matching and Deep Learning
Authors: Yuqi Chen, Guangjun Liang, Qun Wang. 《Computers, Materials & Continua》, 2025, No. 7, pp. 1145-1167 (23 pages).
Structured Query Language (SQL) injection attacks have become the most common means of attacking Web applications due to their simple implementation and high degree of harm. Traditional injection attack detection techniques struggle to accurately identify various types of SQL injection attacks. This paper presents an enhanced SQL injection detection method that utilizes content matching technology to improve the accuracy and efficiency of detection. Features are extracted through content matching, effectively avoiding the loss of valid information, and an improved deep learning model is employed to enhance the detection of SQL injections. Considering that grammar parsing and word embedding may conceal key features and introduce noise, we propose training the transformed data vectors by preprocessing the data in the dataset and post-processing the word segmentation based on content matching. We optimized and adjusted the traditional Convolutional Neural Network (CNN) model, trained it on normal data, SQL injection data, and XSS data, and used these three deep learning models for attack detection. The experimental results show that the accuracy rate reaches 98.35%, achieving excellent detection results.
Keywords: SQL injection; network security; deep learning; convolution neural network
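A small sketch of the overall shape of such a detector: regex-style content matching extracts a fixed-length keyword-index sequence, which a 1-D CNN then classifies; the keyword list, feature layout, labels, and network sizes are assumptions rather than the paper's configuration.

```python
import re
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

KEYWORDS = ["select", "union", "or", "and", "--", "'", "sleep", "<script>"]  # assumed signature set

def content_match_vector(query, max_len=40):
    """Map a request string to a fixed-length sequence of matched-keyword indices (0 = no match)."""
    tokens = re.findall(r"[\w'<>-]+", query.lower())
    ids = [KEYWORDS.index(t) + 1 if t in KEYWORDS else 0 for t in tokens][:max_len]
    return np.pad(ids, (0, max_len - len(ids)))

model = keras.Sequential([
    layers.Embedding(len(KEYWORDS) + 1, 16),
    layers.Conv1D(32, 3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(3, activation="softmax"),        # normal / SQL injection / XSS
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

x = np.stack([content_match_vector("id=1' OR '1'='1"), content_match_vector("id=42")])
y = np.array([1, 0])                              # toy labels for the two requests
model.fit(x, y, epochs=1, verbose=0)
```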
15. Applications of Deep Learning in Mineral Discrimination: A Case Study of Quartz, Biotite and K-Feldspar from Granite
Authors: Wei Lou, Dexian Zhang. 《Journal of Earth Science》, 2025, No. 1, pp. 29-45 (17 pages).
Mineral identification and discrimination play a significant role in geological study. Intelligent mineral discrimination based on deep learning has the advantages of automation, low cost, reduced time consumption, and a low error rate. In this article, characteristics of quartz, biotite, and K-feldspar from granite thin sections under cross-polarized light were studied for intelligent classification of mineral images using the Inception-v3 deep learning convolutional neural network (CNN) and a transfer learning method. Dynamic images from multiple angles were employed to enhance the accuracy and reproducibility of the mineral discrimination process. Test results show that the average discrimination accuracies of quartz, biotite, and K-feldspar are 100.00%, 96.88%, and 90.63%, respectively. The results of this study prove the feasibility and reliability of applying convolutional neural networks to mineral image classification. This study could have a significant impact on the exploration of complicated intelligent mineral discrimination using deep learning methods, and it will provide a new perspective for the development of more professional and practical intelligent mineral discrimination tools.
Keywords: deep learning; mineral discrimination; Inception-v3; CNN; transfer learning; convolutional neural network
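A minimal Keras transfer-learning sketch of the kind described above: a frozen ImageNet Inception-v3 backbone with a new three-class head for quartz, biotite, and K-feldspar. The input size, head layout, and training settings are assumed, and the photomicrograph loading is represented by a random placeholder batch.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

base = keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                      input_shape=(299, 299, 3))
base.trainable = False                              # transfer learning: freeze the backbone

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(3, activation="softmax"),          # quartz / biotite / K-feldspar
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

x = np.random.rand(8, 299, 299, 3).astype("float32")   # placeholder thin-section crops
y = np.random.randint(0, 3, size=(8,))
model.fit(x, y, epochs=1, verbose=0)
```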
16. Role of deep learning in cognitive healthcare: Wearable signal analysis, algorithms, benefits, and challenges
Authors: Md. Sakib Bin Alam, Aiman Lameesa, Senzuti Sharmin, Shaila Afrin, Shams Forruque Ahmed, Mohammad Reza Nikoo, Amir H. Gandomi. 《Digital Communications and Networks》, 2025, No. 3, pp. 642-670 (29 pages).
Deep Learning (DL) offers promising solutions for analyzing wearable signals and gaining valuable insights into cognitive disorders. While previous review studies have explored various aspects of DL in cognitive healthcare, there remains a lack of comprehensive analysis that integrates wearable signals, data processing techniques, and the broader applications, benefits, and challenges of DL methods. Addressing this limitation, our study provides an extensive review of DL's role in cognitive healthcare, with a particular emphasis on wearables, data processing, and the inherent challenges in this field. This review also highlights the considerable promise of DL approaches in addressing a broad spectrum of cognitive issues. By enhancing the understanding and analysis of wearable signal modalities, DL models can achieve remarkable accuracy in cognitive healthcare. Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and Long Short-Term Memory (LSTM) networks have demonstrated improved performance and effectiveness in the early diagnosis and progression monitoring of neurological disorders. Beyond cognitive impairment detection, DL has been applied to emotion recognition, sleep analysis, stress monitoring, and neurofeedback. These applications lead to advanced diagnosis, personalized treatment, early intervention, assistive technologies, remote monitoring, and reduced healthcare costs. Nevertheless, the integration of DL and wearable technologies presents several challenges, such as data quality, privacy, interpretability, model generalizability, ethical concerns, and clinical adoption. These challenges emphasize the importance of conducting future research in areas such as multimodal signal analysis and explainable AI. The findings of this review aim to benefit clinicians, healthcare professionals, and society by facilitating better patient outcomes in cognitive healthcare.
Keywords: Cognitive healthcare; deep learning; Wearable sensor; Convolutional neural network; Recurrent neural network
17. Container cluster placement in edge computing based on reinforcement learning incorporating graph convolutional networks scheme
Authors: Zhuo Chen, Bowen Zhu, Chuan Zhou. 《Digital Communications and Networks》, 2025, No. 1, pp. 60-70 (11 pages).
Container-based virtualization technology has been more widely used in edge computing environments recently due to its advantages of lighter resource occupation, faster startup capability, and better resource utilization efficiency. To meet the diverse needs of tasks, it is usually necessary to instantiate multiple network functions in the form of containers and interconnect the generated containers to build a Container Cluster (CC). CCs are then deployed on edge service nodes with relatively limited resources. However, the increasingly complex and time-varying nature of tasks brings great challenges to the optimal placement of CCs. This paper regards the charges for the various resources occupied by providing services as revenue and the service efficiency and energy consumption as cost, and thus formulates a Mixed Integer Programming (MIP) model to describe the optimal placement of a CC on edge service nodes. Furthermore, an Actor-Critic based Deep Reinforcement Learning (DRL) framework incorporating Graph Convolutional Networks (GCN), named RL-GCN, is proposed to solve the optimization problem. The framework obtains an optimal placement strategy through self-learning according to the requirements and objectives of CC placement. In particular, through the introduction of GCN, the features of the association relationships between multiple containers in CCs can be effectively extracted to improve the quality of placement. The experimental results show that under different scales of service nodes and task requests, the proposed method obtains improved system performance in terms of placement error ratio, time efficiency of solution output, and cumulative system revenue compared with other representative baseline methods.
Keywords: Edge computing; network virtualization; Container cluster; deep reinforcement learning; Graph convolutional network
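The GCN feature-extraction idea, propagating per-container features over the cluster's interconnection graph with a normalized adjacency, can be written in a few lines; the sketch below is a generic single GCN layer with assumed sizes, not the RL-GCN actor-critic pipeline.

```python
import torch
import torch.nn as nn

def gcn_layer(features, adjacency, weight):
    """One graph-convolution step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    a_hat = adjacency + torch.eye(adjacency.size(0))      # add self-loops
    d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return torch.relu(a_norm @ features @ weight)

# 5 containers in a cluster, 8 resource/demand features each (assumed layout)
features = torch.rand(5, 8)
adjacency = torch.tensor([[0, 1, 1, 0, 0],
                          [1, 0, 1, 0, 0],
                          [1, 1, 0, 1, 0],
                          [0, 0, 1, 0, 1],
                          [0, 0, 0, 1, 0]], dtype=torch.float32)
weight = nn.Parameter(torch.randn(8, 16))
embeddings = gcn_layer(features, adjacency, weight)       # per-container embeddings for the policy
print(embeddings.shape)                                   # torch.Size([5, 16])
```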
18. APFed: Adaptive personalized federated learning for intrusion detection in maritime meteorological sensor networks
Authors: Xin Su, Guifu Zhang. 《Digital Communications and Networks》, 2025, No. 2, pp. 401-411 (11 pages).
With the rapid development of advanced networking and computing technologies such as the Internet of Things, network function virtualization, and 5G infrastructure, new development opportunities are emerging for Maritime Meteorological Sensor Networks (MMSNs). However, the increasing number of intelligent devices joining the MMSN poses a growing threat to network security. Current Artificial Intelligence (AI) intrusion detection techniques turn intrusion detection into a classification problem, where AI excels. These techniques assume sufficient high-quality instances for model construction, which is often unsatisfactory for real-world operation with limited attack instances and constantly evolving characteristics. This paper proposes an Adaptive Personalized Federated learning (APFed) framework that allows multiple MMSN owners to engage in collaborative training. By employing an adaptive personalized update and a shared global classifier, the adverse effects of imbalanced, Non-Independent and Identically Distributed (Non-IID) data are mitigated, enabling the intrusion detection model to possess personalized capabilities and good global generalization. In addition, a lightweight intrusion detection model is proposed to detect various attacks with effective adaptation to the MMSN environment. Finally, extensive experiments on a classical network dataset show that the attack classification accuracy is improved by about 5% compared to most baselines in the global scenarios.
Keywords: Intrusion detection; Maritime meteorological sensor network; Federated learning; Personalized model; deep learning
19. Video-Based Human Activity Recognition Using Hybrid Deep Learning Model
Authors: Jungpil Shin, Md. Al Mehedi Hasan, Md. Maniruzzaman, Satoshi Nishimura, Sultan Alfarhood. 《Computer Modeling in Engineering & Sciences》, 2025, No. 6, pp. 3615-3638 (24 pages).
Activity recognition is a challenging topic in the field of computer vision that has various applications, including surveillance systems, industrial automation, and human-computer interaction. Today, the demand for automation has greatly increased across industries worldwide. Real-time detection requires edge devices with limited computational time. This study proposes a novel hybrid deep learning system for human activity recognition (HAR), aiming to enhance the recognition accuracy and reduce the computational time. The proposed system combines a pretrained image classification model with a sequence analysis model. First, the dataset was divided into a training set (70%), validation set (10%), and test set (20%). Second, all the videos were converted into frames, and deep-based features were extracted from each frame using convolutional neural networks (CNNs) with a vision transformer. Following that, bidirectional long short-term memory (BiLSTM)- and temporal convolutional network (TCN)-based models were trained using the training set, and their performances were evaluated using the validation set and test set. Four benchmark datasets (UCF11, UCF50, UCF101, and JHMDB) were used to evaluate the performance of the proposed HAR-based system. The experimental results showed that the combination of ConvNeXt and the TCN-based model achieved recognition accuracies of 97.73% for UCF11, 98.81% for UCF50, 98.46% for UCF101, and 83.38% for JHMDB, respectively. This represents improvements in recognition accuracy of 4%, 2.67%, 3.67%, and 7.08% for the UCF11, UCF50, UCF101, and JHMDB datasets, respectively, over existing models. Moreover, the proposed HAR-based system obtained superior recognition accuracy, shorter computational times, and minimal memory usage compared to the existing models.
Keywords: Human activity recognition; BiLSTM; ConvNeXt; temporal convolutional network; deep learning
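The TCN half of such a hybrid can be sketched as a stack of dilated causal 1-D convolutions over per-frame feature vectors (for example, those produced by a pretrained ConvNeXt backbone); the feature dimension, number of frames, dilation schedule, and class count below are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_frames, feat_dim, n_classes = 30, 768, 11        # assumed: 30 frames, ConvNeXt-sized features, UCF11 classes

inputs = keras.Input(shape=(n_frames, feat_dim))
x = inputs
for dilation in (1, 2, 4, 8):                       # growing receptive field over time
    x = layers.Conv1D(128, 3, padding="causal", dilation_rate=dilation, activation="relu")(x)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(n_classes, activation="softmax")(x)

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

frame_features = np.random.rand(4, n_frames, feat_dim).astype("float32")  # placeholder CNN features
labels = np.random.randint(0, n_classes, size=(4,))
model.fit(frame_features, labels, epochs=1, verbose=0)
```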
20. Deep learning based identification of rock minerals from un-processed digital microscopic images of undisturbed broken-surfaces
Authors: M. A. Dalhat, Sami A. Osman. 《Artificial Intelligence in Geosciences》, 2025, No. 1, pp. 250-270 (21 pages).
This study employed convolutional neural networks (CNNs) for the classification of rock minerals based on 3179 RGB-scale original microstructural images of undisturbed broken surfaces. The image dataset covers 40 distinct rock mineral types. Three CNN architectures (a Simple model, SqueezeNet, and Xception) were evaluated to compare their performance and feature extraction capabilities. Gradient-weighted Class Activation Mapping (Grad-CAM) was employed to visualize the features influencing model predictions, providing insights into how each model distinguishes between mineral classes. Key discriminative attributes included texture, grain size, pattern, and color variations. Texture and grain boundaries were identified as the most critical features, as they were the regions most strongly activated by the best model. Patterns such as banding and chromatic contrasts further enhanced classification accuracy. Performance analysis revealed that the Simple model had limited ability to isolate fine-grained details, producing broad and less specific activations (0.84 test accuracy). SqueezeNet demonstrated improved localization of discriminative features but occasionally missed finer textural details (0.95 test accuracy). The Xception model outperformed the others, achieving the highest classification accuracy (0.98 test accuracy) by exhibiting precise and tightly focused activations, capturing intricate textures and subtle chromatic variations. Its superior performance can be attributed to its deep architecture and efficient depth-wise separable convolutions, which enabled hierarchical and detailed feature extraction. These results underscore the importance of texture, pattern, and chromatic features in accurate mineral classification and highlight the suitability of deep, efficient architectures like Xception for such tasks. These findings demonstrate the potential of CNNs in geoscience research, offering a framework for automated mineral identification in industrial and scientific applications.
Keywords: deep learning; Rock minerals; Convolutional neural networks; GEOSCIENCE; Artificial intelligence
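A compact Grad-CAM sketch in TensorFlow showing how class-activation heatmaps like those described above can be computed for a CNN classifier; the ImageNet Xception backbone, the chosen layer name, and the random placeholder image are assumptions standing in for the trained mineral model and real photomicrographs.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

model = keras.applications.Xception(weights="imagenet")     # stand-in for the trained mineral classifier
last_conv = model.get_layer("block14_sepconv2_act")         # last convolutional feature map in Xception
grad_model = keras.Model(model.input, [last_conv.output, model.output])

image = np.random.rand(1, 299, 299, 3).astype("float32")    # placeholder microstructural image

with tf.GradientTape() as tape:
    conv_out, preds = grad_model(image)
    top_class = int(tf.argmax(preds[0]))
    score = preds[:, top_class]

grads = tape.gradient(score, conv_out)                       # d(class score) / d(feature map)
weights = tf.reduce_mean(grads, axis=(0, 1, 2))              # global-average-pooled gradients
cam = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
cam /= (tf.reduce_max(cam) + 1e-8)                           # normalized heatmap over texture regions
print(cam.shape)                                             # (10, 10) spatial map for a 299x299 input
```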