Journal Articles
9 articles found
1. Cardiovascular Sound Classification Using Neural Architectures and Deep Learning for Advancing Cardiac Wellness
Authors: Deepak Mahto, Sudhakar Kumar, Sunil K. Singh, Amit Chhabra, Irfan Ahmad Khan, Varsha Arya, Wadee Alhalabi, Brij B. Gupta, Bassma Saleh Alsulami
Computer Modeling in Engineering & Sciences, 2025, No. 6, pp. 3743-3767 (25 pages)
Cardiovascular diseases (CVDs) remain one of the foremost causes of death globally, driving the need for advanced automated diagnostic solutions for early detection and intervention. Traditional auscultation of cardiovascular sounds is heavily reliant on clinical expertise and subject to high variability. To counter this limitation, this study proposes an AI-driven classification system for cardiovascular sounds in which deep learning techniques are engaged to automate the detection of abnormal heartbeats. We employ FastAI vision-learner-based convolutional neural networks (CNNs), including ResNet, DenseNet, VGG, ConvNeXt, SqueezeNet, and AlexNet, to classify heart sound recordings. Instead of raw waveform analysis, the proposed approach transforms preprocessed cardiovascular audio signals into spectrograms, which are suited for capturing temporal and frequency-wise patterns. The models are trained on the PASCAL Cardiovascular Challenge dataset while taking into consideration recording variations, noise levels, and acoustic distortions. To demonstrate generalization, external validation was performed using Google's AudioSet Heartbeat Sound dataset, which is rich in cardiovascular sounds. Comparative analysis revealed that DenseNet-201, ConvNeXt-Large, and ResNet-152 delivered superior performance to the other architectures, achieving an accuracy of 81.50%, a precision of 85.50%, and an F1-score of 84.50%. We also performed statistical significance testing, such as the Wilcoxon signed-rank test, to validate performance improvements over traditional classification methods. Beyond the technical contributions, the research underscores clinical integration, outlining a pathway by which the proposed system can augment conventional electronic stethoscopes and telemedicine platforms in AI-assisted diagnostic workflows. We also discuss in detail issues of computational efficiency, model interpretability, and ethical considerations, particularly algorithmic bias stemming from imbalanced datasets and the need for real-time processing in clinical settings. The study describes a scalable, automated system combining deep learning, spectrogram-based feature extraction, and external validation that can assist healthcare providers in the early and accurate detection of cardiovascular disease. AI-driven solutions can be viable in improving access, reducing delays in diagnosis, and ultimately easing the continued global burden of heart disease.
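The spectrogram preprocessing step this abstract describes can be sketched in a few lines; a minimal NumPy illustration, where the frame length, hop size, and synthetic test tone are assumptions for illustration, not parameters from the paper:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Convert a 1-D audio signal into a log-magnitude spectrogram
    (frequency bins x time frames) suitable as a CNN image input."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i*hop : i*hop + frame_len] * window
                       for i in range(n_frames)])
    # Real FFT of each windowed frame; keep magnitudes on a log scale.
    mags = np.abs(np.fft.rfft(frames, axis=1))
    return np.log1p(mags).T  # shape: (frame_len//2 + 1, n_frames)

# Synthetic stand-in for a heart-sound recording: a 50 Hz tone at 2 kHz sampling.
t = np.arange(0, 1.0, 1 / 2000)
spec = spectrogram(np.sin(2 * np.pi * 50 * t))
print(spec.shape)  # (129, 14)
```

The 2-D array can then be rendered or saved as an image and fed to any of the FastAI vision learners mentioned in the abstract.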
Keywords: healthy society; cardiovascular system; spectrogram; FastAI; audio signals; computer vision; neural network
2. Parallelized Jaccard-Based Learning Method and MapReduce Implementation for Mobile Devices Recognition from Massive Network Data (Cited by: 2)
Authors: Jun Liu, Yinzhou Li, Felix Cuadrado, Steve Uhlig, Zhenming Lei
China Communications (SCIE, CSCD), 2013, No. 7, pp. 71-84 (14 pages)
Accurate and scalable mobile device recognition is critically important for mobile network operators and ISPs seeking to understand their customers' behaviours and enhance user experience. In this paper, we propose a novel method for mobile device model recognition that uses statistical information derived from large amounts of mobile network traffic data. Specifically, we create a Jaccard-based coefficient measure to identify a proper keyword representing each mobile device model from massive unstructured textual HTTP access logs. To handle the large amount of traffic data generated by large mobile networks, the method is designed as a set of parallel algorithms and implemented through the MapReduce framework, a distributed parallel programming model with proven low-cost and high-efficiency features. Evaluations using real data sets show that our method can accurately recognise mobile client models while meeting the scalability and producer-independency requirements of large mobile network operators. Results show that a 91.5% accuracy rate is achieved for recognising mobile client models from 2 billion records, which is dramatically higher than existing solutions.
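The Jaccard coefficient at the core of the keyword-scoring step is simple to state; a small sketch (the record IDs below are invented examples, and the single-machine scoring is a simplified stand-in for the paper's parallel MapReduce computation):

```python
def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| between two sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a or b else 0.0

# A candidate keyword for a device model scores highly when the set of
# HTTP log records containing the keyword overlaps strongly with the
# set of records known to come from that model.
records_with_token = {"r1", "r2", "r3", "r5"}
records_of_model = {"r2", "r3", "r4", "r5"}
print(jaccard(records_with_token, records_of_model))  # 0.6
```

In the paper's setting this score would be computed per (keyword, model) pair across the full log corpus, which is what motivates the MapReduce parallelization.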
Keywords: mobile device recognition; data mining; Jaccard coefficient measurement; distributed computing; MapReduce
3. Enhancing the clinical relevance of haemorrhage prediction models in trauma
Authors: Sankalp Tandle, Jared M. Wohlgemut, Max E. R. Marsden, Erhan Pisirir, Evangelia Kyrimi, Rebecca S. Stoner, William Marsh, Zane B. Perkins, Nigel R. M. Tai
Military Medical Research (SCIE, CAS, CSCD), 2024, No. 3, pp. 467-468 (2 pages)
We read with interest the recent systematic review "Artificial intelligence and machine learning for hemorrhagic trauma care" by Peng et al. [1], which evaluated the literature on machine learning (ML) in the management of traumatic haemorrhage. We thank the authors for their contribution on the role of ML in trauma.
Keywords: trauma; injury; blood transfusion; massive transfusion; prediction; artificial intelligence; machine learning
4. Optimized Phishing Detection with Recurrent Neural Network and Whale Optimizer Algorithm
Authors: Brij Bhooshan Gupta, Akshat Gaurav, Razaz Waheeb Attar, Varsha Arya, Ahmed Alhomoud, Kwok Tai Chui
Computers, Materials & Continua (SCIE, EI), 2024, No. 9, pp. 4895-4916 (22 pages)
Phishing attacks present a persistent and evolving threat in the cybersecurity landscape, necessitating the development of more sophisticated detection methods. Traditional machine learning approaches to phishing detection have relied heavily on feature engineering and have often fallen short in adapting to the dynamically changing patterns of phishing Uniform Resource Locators (URLs). Addressing these challenges, we introduce a framework that integrates the sequential data processing strengths of a Recurrent Neural Network (RNN) with the hyperparameter optimization prowess of the Whale Optimization Algorithm (WOA). Our model capitalizes on an extensive Kaggle dataset featuring over 11,000 URLs, each delineated by 30 attributes. The WOA's hyperparameter optimization enhances the RNN's performance, evidenced by a meticulous validation process. The results, encapsulated in precision, recall, and F1-score metrics, surpass baseline models, achieving an overall accuracy of 92%. This study not only demonstrates the RNN's proficiency in learning complex patterns but also underscores the WOA's effectiveness in refining machine learning models for the critical task of phishing detection.
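To make the WOA step concrete, here is a condensed sketch of the algorithm's encircling, random-search, and spiral updates, minimizing a toy quadratic instead of an RNN validation loss; the population size, iteration count, search bounds, and objective are assumptions for illustration, not values from the paper:

```python
import numpy as np

def woa_minimize(f, dim=2, n_whales=20, iters=200, seed=0):
    """Minimal Whale Optimization Algorithm: whales either encircle the
    best solution, explore around a random whale, or spiral toward the best."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n_whales, dim))
    best = min(X, key=f).copy()
    for t in range(iters):
        a = 2 * (1 - t / iters)  # control parameter, decreases 2 -> 0
        for i in range(n_whales):
            r, p, l = rng.random(dim), rng.random(), rng.uniform(-1, 1)
            A, C = 2 * a * r - a, 2 * rng.random(dim)
            if p < 0.5:
                # Encircle the best whale if |A| < 1, else explore a random one.
                ref = best if np.all(np.abs(A) < 1) else X[rng.integers(n_whales)]
                X[i] = ref - A * np.abs(C * ref - X[i])
            else:
                # Logarithmic spiral update around the best solution.
                D = np.abs(best - X[i])
                X[i] = D * np.exp(l) * np.cos(2 * np.pi * l) + best
            if f(X[i]) < f(best):
                best = X[i].copy()
    return best

best = woa_minimize(lambda x: np.sum((x - 3) ** 2))
print(best)  # close to [3, 3]
```

For hyperparameter tuning, `f` would instead train the RNN with the decoded position (e.g., learning rate, hidden size) and return the validation error.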
Keywords: phishing detection; Recurrent Neural Network (RNN); Whale Optimization Algorithm (WOA); cybersecurity; machine learning; optimization
5. A Hybrid CNN-Brown-Bear Optimization Framework for Enhanced Detection of URL Phishing Attacks
Authors: Brij B. Gupta, Akshat Gaurav, Razaz Waheeb Attar, Varsha Arya, Shavi Bansal, Ahmed Alhomoud, Kwok Tai Chui
Computers, Materials & Continua (SCIE, EI), 2024, No. 12, pp. 4853-4874 (22 pages)
Phishing attacks are more than two decades old and are used by attackers to steal passwords related to financial services. Since the first reported incident in 1995, their impact has kept increasing, and during COVID-19 the rise in digitization led to an exponential increase in the number of phishing victims. Many deep learning and machine learning techniques are available to detect phishing attacks, but most do not use efficient optimization techniques. In this context, our proposed model uses random-forest-based techniques to select the best features, after which the Brown-Bear Optimization Algorithm (BBOA) fine-tunes the hyper-parameters of the convolutional neural network (CNN) model. To test our model, we used a Kaggle dataset comprising over 11,000 websites, with 30 features extracted from each website's uniform resource locator (URL). The target variable has two classes: "Safe" and "Phishing." Owing to the use of BBOA, our proposed model detects malicious URLs with an accuracy of 93% and a precision of 92%. Comparison with standard techniques, such as GRU (Gated Recurrent Unit), LSTM (Long Short-Term Memory), RNN (Recurrent Neural Network), ANN (Artificial Neural Network), SVM (Support Vector Machine), and LR (Logistic Regression), demonstrates the effectiveness of our proposed model, and comparison with past literature showcases its contribution and novelty.
Keywords: phishing attack; CNN; brown-bear optimization
6. Multi-Agent Reinforcement Learning for Resource Allocation in IoT Networks with Edge Computing (Cited by: 12)
Authors: Xiaolan Liu, Jiadong Yu, Zhiyong Feng, Yue Gao
China Communications (SCIE, CSCD), 2020, No. 9, pp. 220-236 (17 pages)
To support popular Internet of Things (IoT) applications such as virtual reality and mobile games, edge computing provides a distributed front-end complement to centralized cloud computing, with low latency and distributed data processing. However, it is challenging for multiple users to offload their computation tasks because they compete for spectrum, computation, and Radio Access Technology (RAT) resources. In this paper, we investigate the computation offloading mechanism of multiple selfish users with resource allocation in IoT edge computing networks by formulating it as a stochastic game. Each user is a learning agent that observes its local network environment to learn optimal decisions on either local or edge computing, with the goal of minimizing long-term system cost by choosing its transmit power level, RAT, and sub-channel without knowing any information about the other users. Since users' decisions are coupled at the gateway, we define each user's reward function to account for the aggregated effect of the other users. A multi-agent reinforcement learning framework is then developed to solve the game with the proposed Independent Learners based Multi-Agent Q-learning (IL-based MA-Q) algorithm. Simulations demonstrate that the proposed IL-based MA-Q algorithm is feasible for solving the formulated problem and is more energy efficient, without extra channel-estimation cost at the centralized gateway. Finally, compared with three benchmark algorithms, it achieves better system cost performance and distributed computation offloading.
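The independent-learners scheme described above reduces to the standard tabular Q-learning update applied per agent, each using only its local observations. A minimal sketch, in which the two states, two actions, and reward table are toy stand-ins for the paper's offloading decisions and system-cost model:

```python
import random
from collections import defaultdict

ACTIONS = ["local", "edge"]  # toy offloading decision per agent
random.seed(0)

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step, using only this agent's own table."""
    best_next = max(Q[(s_next, a2)] for a2 in ACTIONS)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

def sample_state():
    return "congested" if random.random() < 0.5 else "idle"

# Toy reward (negated system cost): offloading to the edge pays off
# only when the channel is idle.
REWARD = {("idle", "edge"): 1.0, ("idle", "local"): 0.2,
          ("congested", "edge"): -1.0, ("congested", "local"): 0.5}

agents = [defaultdict(float) for _ in range(3)]  # independent learners
for _ in range(2000):
    for Q in agents:
        s = sample_state()
        if random.random() < 0.3:  # epsilon-greedy exploration
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        q_update(Q, s, a, REWARD[(s, a)], sample_state())

print(round(agents[0][("idle", "edge")], 2),
      round(agents[0][("idle", "local")], 2))
```

Each agent learns to offload to the edge only when the channel is idle, without observing the other agents, which is the essence of the IL-based approach.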
Keywords: edge computing; multi-agent reinforcement learning; Internet of Things
7. Human Physical Activity Measurement Method Based on Electrostatic Induction
Authors: Koichi Kurita
Journal of Sensor Technology, 2014, No. 3, pp. 139-147 (9 pages)
In this study, an effective noncontact and nonattached technique based on the electrostatic induction current generated during walking motion is proposed for the detection and assessment of human physical activity. In addition, a theoretical model is proposed for the electrostatic induction current generated by variation in the electric potential of the human body. The measured electrostatic induction current is compared with this theoretical model, which is shown to effectively explain the behavior of the current waveform. Normal walking motions of daily living are recorded with a portable sensor located in a regular house. The obtained results show that detailed information about physical activity, such as the gait cycle, can be estimated using the proposed technique. Additionally, the walking signal was measured while the subject walked with the ankle and knee fastened to a splint with bandages to simulate a limp. The proposed technique, based on the detection of the signal generated during walking, can therefore be successfully employed to assess human physical activity.
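The physical intuition behind such models is that the induced current follows the time derivative of the body's electric potential through a small coupling capacitance. A toy numeric sketch of that relation, where the sinusoidal potential, gait frequency, and capacitance value are invented for illustration and are not the paper's model:

```python
import math

# Induced current i(t) ~ C * dV/dt for coupling capacitance C and a body
# potential V(t) oscillating with the gait cycle (all values assumed).
C = 10e-12    # coupling capacitance [F]
f_gait = 1.0  # one gait cycle per second
dt = 1e-3     # sampling interval [s]

V = [0.5 * math.sin(2 * math.pi * f_gait * k * dt) for k in range(2000)]
i = [C * (V[k + 1] - V[k]) / dt for k in range(len(V) - 1)]

peak = max(abs(x) for x in i)
print(f"peak induced current ~ {peak:.2e} A")
```

The tiny picoampere-scale peak explains why such measurements demand a high-impedance, high-gain front end, consistent with the noncontact sensing the abstract describes.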
Keywords: human walking; electrostatic induction; human physical activity; walking with a limp
8. Finding susceptible and protective interaction patterns in large-scale genetic association study
Authors: Yuan Li, Yuhai Zhao, Guoren Wang, Xiaofeng Zhu, Xiang Zhang, Zhanghui Wang, Jun Pang
Frontiers of Computer Science (SCIE, EI, CSCD), 2017, No. 3, pp. 541-554 (14 pages)
Interaction detection in large-scale genetic association studies has attracted intensive research interest, since many diseases have complex traits. Various approaches have been developed for finding significant genetic interactions. In this article, we propose a novel framework, SRMiner, to detect interacting susceptible and protective genotype patterns. SRMiner can discover not only probable combinations of single nucleotide polymorphisms (SNPs) causing diseases but also the corresponding SNPs suppressing their pathogenic functions, which provides a better prospect for uncovering the underlying relevance between genetic variants and complex diseases. We have performed extensive experiments on several real Wellcome Trust Case Control Consortium (WTCCC) datasets, using pathway-based and protein-protein interaction (PPI) network-based evaluation methods to verify the discovered patterns. The results show that SRMiner successfully identifies many disease-related genes verified by existing work. Furthermore, SRMiner can also infer some unconfirmed but highly probable disease-related genes.
Keywords: genetic association studies; genotype pattern mining; data mining; bioinformatics
9. Flight trajectory grafting: Leveraging historical trajectories for more efficient arrival air traffic management
Authors: Richard Louie, Tak Shing Tai, Rhea P. Liem
Journal of the Air Transport Research Society, 2025, No. 1, pp. 167-183 (17 pages)
Inside the terminal maneuvering area (TMA), flight trajectories need to be determined to maintain safe and efficient arrival operations. Air traffic control officers (ATCOs) devise trajectories and provide instructions to pilots. The subjectivity involved in this decision-making exposes operational efficiency to factors such as workload, experience, and TMA complexity. Suboptimal trajectory solutions can increase arrival transit times, i.e., the time spent from entering the TMA to landing, leading to congestion and flight delays. These adverse effects are particularly critical during peak hours. While existing methods provide efficient trajectory solutions, they often overlook critical embedded features that constitute trajectory-solution feasibility in real operations. To address these challenges, we propose a trajectory grafting method to generate high-fidelity, feature-embedded trajectories compatible with existing air traffic management systems. Trajectory grafting uses historical trajectory segments as components to construct situational flight trajectories that conform to given traffic dynamics and constraints. Collectively, these trajectory segments constitute a feasible design space, eliminating the need to explicitly model operational constraints, flight physics, and ATCOs' workload. Our results demonstrate the benefits of this method, which reduces the average arrival transit time by 3% during peak hours. The benefits are further amplified by its compound effect, with up to 24% reductions in accumulated arrival transit times.
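The core grafting idea, stitching a new trajectory from historical segments whose states match at the joins, can be caricatured in a few lines; the 2-D waypoints, greedy chaining rule, and matching tolerance below are invented for illustration and are far simpler than the paper's method:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def graft(start, goal, segments, tol=1.0):
    """Greedily chain historical segments: each next segment must begin
    near the current endpoint; stop once the endpoint is near the goal."""
    path, cur = [start], start
    pool = list(segments)
    while dist(cur, goal) > tol and pool:
        seg = min(pool, key=lambda s: dist(s[0], cur))  # best-matching entry
        if dist(seg[0], cur) > tol:
            break  # no historical segment joins feasibly here
        pool.remove(seg)
        path.extend(seg[1:])
        cur = seg[-1]
    return path

# Historical segments recorded inside a toy TMA (each a list of waypoints).
history = [[(0, 0), (2, 1), (4, 2)],
           [(4, 2), (6, 2), (8, 1)],
           [(8, 1), (9, 0), (10, 0)]]
print(graft((0, 0), (10, 0), history))
# [(0, 0), (2, 1), (4, 2), (6, 2), (8, 1), (9, 0), (10, 0)]
```

Because every piece of the grafted path was actually flown, feasibility with respect to operational constraints and ATCO practice is inherited from the data rather than modeled explicitly, which is the argument the abstract makes.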
Keywords: flight trajectory; arrival efficiency; separation deviation; arrival transit time; Terminal Maneuvering Area