Journal Articles
4,416 articles found
1. Computer Modeling Approaches for Blockchain-Driven Supply Chain Intelligence: A Review on Enhancing Transparency, Security, and Efficiency
Authors: Puranam Revanth Kumar, Gouse Baig Mohammad, Pallati Narsimhulu, Dharnisha Narasappa, Lakshmana Phaneendra Maguluri, Subhav Singh, Shitharth Selvarajan. Computer Modeling in Engineering & Sciences, 2025, No. 9, pp. 2779-2818 (40 pages).
Blockchain Technology (BT) has emerged as a transformative solution for improving the efficacy, security, and transparency of supply chain intelligence. Traditional Supply Chain Management (SCM) systems frequently suffer from data silos, a lack of real-time visibility, fraudulent activity, and inefficiencies in tracking and traceability. Blockchain's decentralized and irreversible ledger offers a solid foundation for dealing with these issues; it facilitates trust, security, and real-time data sharing among all parties involved. Through an examination of critical technologies, methodologies, and applications, this paper delves deeply into computer-modeling-based blockchain frameworks for supply chain intelligence. The effect of BT on SCM is evaluated by reviewing current research and practical applications in the field. As part of the process, we surveyed the research on blockchain-based supply chain models, smart contracts, Decentralized Applications (DApps), and how they connect to other cutting-edge innovations such as Artificial Intelligence (AI) and the Internet of Things (IoT). To quantify blockchain's performance, the study introduces analytical models for efficiency improvement, security enhancement, and scalability, enabling computational assessment and simulation of supply chain scenarios. These models provide a structured approach to predicting system performance under varying parameters. According to the results, BT increases efficiency by automating transactions with smart contracts, strengthens security through cryptographic techniques, and improves supply chain transparency by providing immutable records. Regulatory concerns, interoperability challenges, and scalability all work against broad adoption. To fully automate and intelligently integrate blockchain with AI and the IoT, additional research is needed to address blockchain's current limitations and realize its potential for supply chain intelligence.
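The review's analytical models are not reproduced in this listing, but the immutability property it credits to blockchain is easy to illustrate. Below is a minimal, hypothetical hash-chained ledger sketch in Python (an assumption for illustration, not the paper's framework): each block commits to the digest of its predecessor, so tampering with any past record invalidates the chain.

```python
# Minimal sketch (assumption, not the paper's framework): a hash-chained
# append-only ledger illustrating how immutability supports traceability.
import hashlib
import json
import time

class MiniLedger:
    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64, "data": "genesis", "ts": 0.0}]

    @staticmethod
    def _digest(block):
        # Hash the canonical JSON form of a block.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append(self, data):
        prev_hash = self._digest(self.chain[-1])
        self.chain.append({"index": len(self.chain), "prev": prev_hash,
                           "data": data, "ts": time.time()})

    def verify(self):
        # Tampering with an earlier block breaks every later "prev" link.
        return all(self.chain[i]["prev"] == self._digest(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = MiniLedger()
ledger.append({"shipment": "SKU-42", "event": "dispatched"})
ledger.append({"shipment": "SKU-42", "event": "received"})
assert ledger.verify()
ledger.chain[1]["data"] = {"shipment": "SKU-42", "event": "lost"}  # tamper
assert not ledger.verify()
```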
Keywords: blockchain; supply chain management; transparency; security; smart contracts; decentralization; efficiency
2. Digital Twins and Cyber-Physical Systems: A New Frontier in Computer Modeling
Authors: Vidyalakshmi G, S Gopikrishnan, Wadii Boulila, Anis Koubaa, Gautam Srivastava. Computer Modeling in Engineering & Sciences, 2025, No. 4, pp. 51-113 (63 pages).
Cyber-Physical Systems (CPS) represent an integration of computational and physical elements, revolutionizing industries by enabling real-time monitoring, control, and optimization. A complementary technology, the Digital Twin (DT), acts as a virtual replica of physical assets or processes, facilitating better decision making through simulations and predictive analytics. CPS and DT underpin the evolution of Industry 4.0 by bridging the physical and digital domains. This survey explores their synergy, highlighting how DT enriches CPS with dynamic modeling, real-time data integration, and advanced simulation capabilities. The layered architecture of DTs within CPS is examined, showcasing the enabling technologies and tools vital for seamless integration. The study addresses key challenges in CPS modeling, such as concurrency and communication, and underscores the importance of DT in overcoming these obstacles. Applications in various sectors are analyzed, including smart manufacturing, healthcare, and urban planning, emphasizing the transformative potential of CPS-DT integration. In addition, the review identifies gaps in existing methodologies and proposes future research directions to develop comprehensive, scalable, and secure CPS-DT systems. By synthesizing insights from the current literature and presenting a taxonomy of CPS and DT, this survey serves as a foundational reference for academics and practitioners. The findings stress the need for unified frameworks that align CPS and DT with emerging technologies, fostering innovation and efficiency in the digital transformation era.
Keywords: cyber-physical systems; digital twin; efficiency; Industry 4.0; robustness and intelligence
3. Type-I Heavy-Tailed Burr XII Distribution with Applications to Quality Control, Skewed Reliability Engineering Systems and Lifetime Data
Authors: Okechukwu J. Obulezi, Hatem E. Semary, Sadia Nadir, Chinyere P. Igbokwe, Gabriel O. Orji, A. S. Al-Moisheer, Mohammed Elgarhy. Computer Modeling in Engineering & Sciences, 2025, No. 9, pp. 2991-3027 (37 pages).
This study introduces the type-I heavy-tailed Burr XII (TIHTBXII) distribution, a highly flexible and robust statistical model designed to address the limitations of conventional distributions in analyzing data characterized by skewness, heavy tails, and diverse hazard behaviors. We meticulously develop the TIHTBXII's mathematical foundations, including its probability density function (PDF), cumulative distribution function (CDF), and essential statistical properties, which are crucial for theoretical understanding and practical application. A comprehensive Monte Carlo simulation evaluates four parameter estimation methods: maximum likelihood (MLE), maximum product spacing (MPS), least squares (LS), and weighted least squares (WLS). The simulation results consistently show that as sample sizes increase, the bias and RMSE of all estimators decrease, with WLS and LS often demonstrating superior and more stable performance. Beyond theoretical development, we present a practical application of the TIHTBXII distribution in constructing a group acceptance sampling plan (GASP) for truncated life tests. This application highlights how the TIHTBXII model can optimize quality control decisions by minimizing the average sample number (ASN) while effectively managing consumer and producer risks. Empirical validation using real-world datasets, including "Active Repair Duration," "Groundwater Contaminant Measurements," and "Dominica COVID-19 Mortality," further demonstrates the TIHTBXII's superior fit compared to existing models. Our findings confirm the TIHTBXII distribution as a powerful and reliable alternative for accurately modeling complex data in fields such as reliability engineering and quality assessment, leading to more informed and robust decision-making.
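The abstract's simulation design (Monte Carlo replications, with bias and RMSE shrinking as sample size grows) can be sketched for the plain Burr XII distribution that ships with SciPy. This is a stand-in under stated assumptions: the TIHTBXII distribution itself and the MPS/LS/WLS estimators are not implemented here, only MLE on `scipy.stats.burr12` with assumed shape parameters.

```python
# Sketch of the Bias/RMSE-vs-sample-size evaluation pattern, using MLE on
# the ordinary Burr XII (not the TIHTBXII extension from the paper).
import numpy as np
from scipy.stats import burr12

rng = np.random.default_rng(1)
c_true, d_true = 2.0, 3.0  # assumed "true" shape parameters

for n in (50, 200, 800):
    estimates = []
    for _ in range(100):  # Monte Carlo replications
        sample = burr12.rvs(c_true, d_true, size=n, random_state=rng)
        # Fix location/scale so only the two shapes are estimated.
        c_hat, d_hat, _, _ = burr12.fit(sample, floc=0, fscale=1)
        estimates.append(c_hat)
    estimates = np.asarray(estimates)
    bias = estimates.mean() - c_true
    rmse = np.sqrt(np.mean((estimates - c_true) ** 2))
    print(f"n={n:4d}  bias(c)={bias:+.4f}  RMSE(c)={rmse:.4f}")
```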
Keywords: acceptance sampling; heavy-tailed models; parameter estimation; reliability engineering
4. Bat algorithm based on kinetic adaptation and elite communication for engineering problems
Authors: Chong Yuan, Dong Zhao, Ali Asghar Heidari, Lei Liu, Shuihua Wang, Huiling Chen, Yudong Zhang. CAAI Transactions on Intelligence Technology, 2025, No. 4, pp. 1174-1200 (27 pages).
The Bat algorithm, a metaheuristic optimization technique inspired by the foraging behaviour of bats, has been widely employed to tackle optimization problems. Known for its ease of implementation, parameter tunability, and strong global search capabilities, the algorithm finds application across diverse optimization problem domains. However, in the face of increasingly complex optimization challenges, the Bat algorithm encounters certain limitations, such as slow convergence and sensitivity to initial solutions. To tackle these challenges, the present study incorporates a range of optimization components into the Bat algorithm, proposing a variant called PKEBA. A projection screening strategy is implemented to mitigate sensitivity to initial solutions, thereby enhancing the quality of the initial solution set. A kinetic adaptation strategy reforms exploration patterns, while an elite communication strategy enhances group interaction to keep the algorithm from stagnating in local optima. The effectiveness of the proposed PKEBA is then rigorously evaluated on 30 benchmark functions from IEEE CEC2014, featuring ablation experiments and comparative assessments against classical algorithms and their variants, with real-world engineering problems employed as further validation. The results conclusively demonstrate that PKEBA exhibits superior convergence and precision compared to existing algorithms.
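PKEBA's projection screening, kinetic adaptation, and elite communication strategies are not detailed in this listing, but its baseline is the classical Bat algorithm, which the sketch below implements: frequency-tuned velocities, a local random walk around the current best bat, and loudness/pulse-rate updates. All parameter values here are illustrative assumptions.

```python
# A compact sketch of the classical Bat algorithm (PKEBA's baseline; the
# paper's three added strategies are not reproduced here).
import numpy as np

def bat_algorithm(obj, dim=10, n_bats=30, iters=500, lb=-5.0, ub=5.0,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9):
    rng = np.random.default_rng(0)
    x = rng.uniform(lb, ub, (n_bats, dim))   # bat positions
    v = np.zeros((n_bats, dim))              # velocities
    loud = np.ones(n_bats)                   # loudness A_i
    rate = np.zeros(n_bats)                  # pulse emission rate r_i
    fit = np.apply_along_axis(obj, 1, x)
    best = x[fit.argmin()].copy()

    for t in range(1, iters + 1):
        freq = f_min + (f_max - f_min) * rng.random(n_bats)
        v += (x - best) * freq[:, None]
        cand = np.clip(x + v, lb, ub)
        # Local random walk around the best solution when rand > r_i.
        walk = rate < rng.random(n_bats)
        cand[walk] = np.clip(
            best + 0.01 * loud.mean() * rng.standard_normal((walk.sum(), dim)),
            lb, ub)
        cand_fit = np.apply_along_axis(obj, 1, cand)
        # Accept improvements probabilistically (rand < A_i).
        accept = (cand_fit < fit) & (rng.random(n_bats) < loud)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        loud[accept] *= alpha                    # grow quieter after success
        rate[accept] = 1.0 - np.exp(-gamma * t)  # emit pulses more often
        best = x[fit.argmin()].copy()
    return best, fit.min()

sol, val = bat_algorithm(lambda z: np.sum(z ** 2))  # sphere benchmark
print(val)
```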
Keywords: Bat algorithm; engineering optimization; global optimization; metaheuristic algorithms
5. A Study on Outlier Detection and Feature Engineering Strategies in Machine Learning for Heart Disease Prediction (cited by 2)
Authors: Varada Rajkumar Kukkala, Surapaneni Phani Praveen, Naga Satya Koti Mani Kumar Tirumanadham, Parvathaneni Naga Srinivasu. Computer Systems Science & Engineering, 2024, No. 5, pp. 1085-1112 (28 pages).
This paper investigates the application of machine learning to cardiovascular risk prediction using AdaBoost combined with two outlier detection and feature selection pipelines: Z-Score with Grey Wolf Optimization (GWO), and Interquartile Range (IQR) with Ant Colony Optimization (ACO). Using a performance index, it is shown that IQR with ACO and AdaBoost is less accurate (86.0% vs. 89.0%) and less discriminative (Area Under the Curve (AUC) of 91.0% vs. 93.0%) than Z-Score with GWO and AdaBoost. The Z-Score and GWO method also outperformed the others in precision, scoring 89.0%, with a satisfactory recall of 90.0%. The paper thus reveals specific benefits and drawbacks of different outlier detection and feature selection techniques, which are important considerations for further improving diagnostics in cardiovascular health. Collectively, these findings can enhance heart disease prediction and patient treatment through enhanced and innovative machine learning (ML) techniques. This work lays the groundwork for more precise diagnostic models by highlighting the benefits of combining multiple optimization methodologies. Future studies should focus on maximizing patient outcomes and model efficacy through research on these combinations.
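The two outlier rules named in the abstract are standard and can be sketched directly; the GWO and ACO feature-selection stages and the AdaBoost classifier are omitted here, so this shows only the detection step under assumed thresholds.

```python
# Minimal sketch of the Z-Score and IQR outlier rules (detection step only;
# the paper's GWO/ACO feature selection and AdaBoost model are not included).
import numpy as np

def zscore_outliers(x, thresh=3.0):
    # Flag points more than `thresh` sample standard deviations from the mean.
    z = (x - x.mean()) / x.std(ddof=1)
    return np.abs(z) > thresh

def iqr_outliers(x, k=1.5):
    # Flag points outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences).
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

# Synthetic "blood pressure" readings with two injected anomalies.
x = np.append(np.random.default_rng(7).normal(120, 15, 500), [310.0, -40.0])
print("Z-score flags:", x[zscore_outliers(x)])
print("IQR flags:    ", x[iqr_outliers(x)])
```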
Keywords: Grey Wolf Optimization; Ant Colony Optimization; Z-score; interquartile range (IQR); AdaBoost; outlier
6. Early identification of stroke through deep learning with multi-modal human speech and movement data (cited by 4)
Authors: Zijun Ou, Haitao Wang, Bin Zhang, Haobang Liang, Bei Hu, Longlong Ren, Yanjuan Liu, Yuhu Zhang, Chengbo Dai, Hejun Wu, Weifeng Li, Xin Li. Neural Regeneration Research (SCIE, CAS), 2025, No. 1, pp. 234-241 (8 pages).
Early identification and treatment of stroke can greatly improve patient outcomes and quality of life. Although clinical tests such as the Cincinnati Pre-hospital Stroke Scale (CPSS) and the Face Arm Speech Test (FAST) are commonly used for stroke screening, accurate administration depends on specialized training. In this study, we propose a novel multi-modal deep learning approach, based on the FAST, for assessing suspected stroke patients exhibiting symptoms such as limb weakness, facial paresis, and speech disorders in acute settings. We collected a dataset comprising videos and audio recordings of emergency room patients performing designated limb movements, facial expressions, and speech tests based on the FAST. We compared the constructed deep learning model, designed to process multi-modal datasets, with six prior models that achieved good action classification performance: I3D, SlowFast, X3D, TPN, TimeSformer, and MViT. We found that the findings of our deep learning model had higher clinical value than the other approaches. Moreover, the multi-modal model outperformed its single-module variants, highlighting the benefit of utilizing multiple types of patient data, such as action videos and speech audio. These results indicate that a multi-modal deep learning model combined with the FAST could greatly improve the accuracy and sensitivity of early stroke identification, providing a practical and powerful tool for assessing stroke patients in an emergency clinical setting.
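The paper's exact architecture is not given in the abstract, so the following is a generic late-fusion sketch (an assumption, not the authors' network): per-frame video features and per-window speech features are pooled, concatenated, and classified, which is a common way a multi-modal model exploits both movement and speech cues.

```python
# Illustrative late-fusion head (assumed design; backbones that produce the
# video/audio embeddings are out of scope here).
import torch
import torch.nn as nn

class LateFusionFAST(nn.Module):
    def __init__(self, video_dim=512, audio_dim=128, n_classes=2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(video_dim + audio_dim, 256), nn.ReLU(),
            nn.Dropout(0.3), nn.Linear(256, n_classes))

    def forward(self, video_feats, audio_feats):
        # video_feats: (B, T, video_dim) per-frame features from any backbone;
        # audio_feats: (B, S, audio_dim) per-window speech features.
        v = video_feats.mean(dim=1)   # temporal average pooling
        a = audio_feats.mean(dim=1)
        return self.head(torch.cat([v, a], dim=-1))

model = LateFusionFAST()
logits = model(torch.randn(4, 16, 512), torch.randn(4, 50, 128))
print(logits.shape)  # torch.Size([4, 2])
```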
Keywords: artificial intelligence; deep learning; diagnosis; early detection; FAST; screening; stroke
7. SEFormer: A Lightweight CNN-Transformer Based on Separable Multiscale Depthwise Convolution and Efficient Self-Attention for Rotating Machinery Fault Diagnosis (cited by 1)
Authors: Hongxing Wang, Xilai Ju, Hua Zhu, Huafeng Li. Computers, Materials & Continua (SCIE, EI), 2025, No. 1, pp. 1417-1437 (21 pages).
Traditional data-driven fault diagnosis methods depend on expert experience to manually extract effective fault features from signals, which has certain limitations. Conversely, deep learning techniques have gained prominence as a central focus of fault diagnosis research, owing to their strong fault feature extraction ability and end-to-end diagnostic efficiency. Recently, by exploiting the respective advantages of the convolutional neural network (CNN) and the Transformer in local and global feature extraction, research combining the two has demonstrated promise in the field of fault diagnosis. However, the cross-channel convolution mechanism in CNNs and the self-attention calculations in the Transformer make the cooperative model excessively complex, resulting in high computational costs and limited industrial applicability. To tackle these challenges, this paper proposes a lightweight CNN-Transformer named SEFormer for rotating machinery fault diagnosis. First, a separable multiscale depthwise convolution block is designed to extract and integrate multiscale feature information from different channel dimensions of vibration signals. Then, an efficient self-attention block is developed to capture critical fine-grained features of the signal from a global perspective. Finally, experimental results on a planetary gearbox dataset and a motor roller bearing dataset show that the proposed framework balances robustness, generalization, and light weight better than recent state-of-the-art CNN- and Transformer-based fault diagnosis models. This study presents a feasible strategy for developing a lightweight rotating machinery fault diagnosis framework aimed at economical deployment.
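A rough sketch of the separable multiscale depthwise idea follows, under assumed kernel sizes and 1-D vibration input; it is not the published SEFormer block. Each depthwise branch filters every channel independently at one scale, and a pointwise convolution then fuses the scales across channels, which is where the "lightweight" cost profile comes from.

```python
# Sketch of a separable multiscale depthwise block (assumed kernel sizes;
# not the exact SEFormer design).
import torch
import torch.nn as nn

class SeparableMultiscaleConv(nn.Module):
    def __init__(self, channels=64, kernel_sizes=(3, 7, 15)):
        super().__init__()
        # One depthwise branch per scale (groups=channels => per-channel conv).
        self.branches = nn.ModuleList([
            nn.Conv1d(channels, channels, k, padding=k // 2, groups=channels)
            for k in kernel_sizes])
        # Pointwise conv fuses the concatenated scales across channels.
        self.pointwise = nn.Conv1d(channels * len(kernel_sizes), channels, 1)

    def forward(self, x):           # x: (B, C, L) vibration features
        multi = torch.cat([b(x) for b in self.branches], dim=1)
        return self.pointwise(multi)

block = SeparableMultiscaleConv()
print(block(torch.randn(2, 64, 1024)).shape)  # torch.Size([2, 64, 1024])
```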
Keywords: CNN-Transformer; separable multiscale depthwise convolution; efficient self-attention; fault diagnosis
8. Exploring Deep Learning Methods for Computer Vision Applications across Multiple Sectors: Challenges and Future Trends
Authors: Narayanan Ganesh, Rajendran Shankar, Miroslav Mahdal, Janakiraman SenthilMurugan, Jasgurpreet Singh Chohan, Kanak Kalita. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 4, pp. 103-141 (39 pages).
Computer vision (CV) was developed so that computers and other systems can act or make recommendations based on visual inputs, such as digital photos, movies, and other media. Deep learning (DL) methods are more successful than other traditional machine learning (ML) methods in CV. DL techniques can produce state-of-the-art results for difficult CV problems such as image categorization, object detection, and face recognition. In this review, a structured discussion of the history, methods, and applications of DL methods for CV problems is presented. The sector-wise presentation of applications in this paper may be particularly useful for researchers in niche fields who have limited or introductory knowledge of DL methods and CV. This review provides readers with context and examples of how these techniques can be applied to specific areas. A curated list of popular datasets, with brief descriptions, is also included for the benefit of readers.
Keywords: neural network; machine vision; classification; object detection; deep learning
9. IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare (cited by 1)
Authors: Subrata Kumer Paul, Abu Saleh Musa Miah, Rakhi Rani Paul, Md. Ekramul Hamid, Jungpil Shin, Md Abdur Rahim. Computers, Materials & Continua, 2025, No. 8, pp. 2513-2530 (18 pages).
The Internet of Things (IoT) and mobile technology have significantly transformed healthcare by enabling real-time monitoring and diagnosis of patients. Recognizing Medical-Related Human Activities (MRHA) is pivotal for healthcare systems, particularly for identifying actions critical to patient well-being. However, challenges such as high computational demands, low accuracy, and limited adaptability persist in Human Motion Recognition (HMR). While some studies have integrated HMR with IoT for real-time healthcare applications, limited research has focused on recognizing MRHA, which is essential for effective patient monitoring. This study proposes a novel HMR method tailored for MRHA detection, leveraging multi-stage deep learning techniques integrated with IoT. The approach employs EfficientNet to extract optimized spatial features from skeleton frame sequences using seven Mobile Inverted Bottleneck Convolution (MBConv) blocks, followed by Convolutional Long Short-Term Memory (ConvLSTM) to capture spatio-temporal patterns. A classification module with global average pooling, a fully connected layer, and a dropout layer generates the final predictions. The model is evaluated on the NTU RGB+D 120 and HMDB51 datasets, focusing on MRHA such as sneezing, falling, walking, and sitting. It achieves 94.85% accuracy for cross-subject evaluations and 96.45% for cross-view evaluations on NTU RGB+D 120, along with 89.22% accuracy on HMDB51. Additionally, the system integrates IoT capabilities using a Raspberry Pi and a GSM module, delivering real-time alerts via Twilio's SMS service to caregivers and patients. This scalable and efficient solution bridges the gap between HMR and IoT, advancing patient monitoring, improving healthcare outcomes, and reducing costs.
Keywords: real-time human motion recognition (HMR); ENConvLSTM; EfficientNet; ConvLSTM; skeleton data; NTU RGB+D 120 dataset; MRHA
10. On large language models safety, security, and privacy: A survey (cited by 1)
Authors: Ran Zhang, Hong-Wei Li, Xin-Yuan Qian, Wen-Bo Jiang, Han-Xiao Chen. Journal of Electronic Science and Technology, 2025, No. 1, pp. 1-21 (21 pages).
The integration of artificial intelligence (AI) technology, particularly large language models (LLMs), has become essential across various sectors due to their advanced language comprehension and generation capabilities. Despite their transformative impact in fields such as machine translation and intelligent dialogue systems, LLMs face significant challenges. These include safety, security, and privacy concerns that undermine their trustworthiness and effectiveness, such as hallucinations, backdoor attacks, and privacy leakage. Previous works have often conflated safety issues with security concerns. In contrast, our study provides clearer and more reasonable definitions for safety, security, and privacy within the context of LLMs. Building on these definitions, we provide a comprehensive overview of the vulnerabilities and defense mechanisms related to safety, security, and privacy in LLMs. Additionally, we explore the unique research challenges posed by LLMs and suggest potential avenues for future research, aiming to enhance the robustness and reliability of LLMs in the face of emerging threats.
Keywords: large language models; privacy issues; safety issues; security issues
11. LoRa Sense: Sensing and Optimization of LoRa Link Behavior Using Path-Loss Models in Open-Cast Mines
Authors: Bhanu Pratap Reddy Bhavanam, Prashanth Ragam. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 425-466 (42 pages).
The Internet of Things (IoT) has orchestrated various domains in numerous applications, contributing significantly to the growth of the smart world, even in regions with low literacy rates, and boosting socio-economic development. The IoT revolution is advancing across industries, but harsh geometric environments, including open-pit mines, pose unique challenges for reliable communication. The advent of IoT in the mining industry has significantly improved communication for critical operations through the use of Radio Frequency (RF) protocols such as Bluetooth, Wi-Fi, GSM/GPRS, Narrow Band (NB)-IoT, SigFox, ZigBee, and Long Range Wide Area Network (LoRaWAN). This study addresses the optimization of network implementations by comparing two leading license-free IoT RF protocols, ZigBee and LoRaWAN. Intensive field tests were conducted in various opencast mines to investigate coverage potential and signal attenuation. ZigBee was tested in the Tadicherla open-cast coal mine in India; LoRaWAN field tests were conducted at one of the associated cement companies (ACC) limestone mines in Bargarh, India, covering both Indoor-to-Outdoor (I2O) and Outdoor-to-Outdoor (O2O) environments. A robust framework of path-loss models (Free space, Egli, Okumura-Hata, Cost231-Hata, and Ericsson), combined with key performance metrics, was employed to evaluate the patterns of signal attenuation. Extensive field testing and careful data analysis revealed that the Egli model is the most consistent path-loss model for the ZigBee protocol in the I2O environment, with a coefficient of determination (R²) of 0.907 and balanced error metrics: Normalized Root Mean Square Error (NRMSE) of 0.030, Mean Square Error (MSE) of 4.950, Mean Absolute Percentage Error (MAPE) of 0.249, and Scatter Index (SI) of 2.723. In the O2O scenario, the Ericsson model showed superior performance, with the highest R² value of 0.959, supported by strong correlation metrics: NRMSE of 0.026, MSE of 8.685, MAPE of 0.685, Mean Absolute Deviation (MAD) of 20.839, and SI of 2.194. For the LoRaWAN protocol, the Cost-231 model achieved the highest R² value of 0.921 in the I2O scenario, complemented by the lowest error metrics: NRMSE of 0.018, MSE of 1.324, MAPE of 0.217, MAD of 9.218, and SI of 1.238. In the O2O environment, the Okumura-Hata model achieved the highest R² value of 0.978, indicating a strong fit, with NRMSE of 0.047, MSE of 27.807, MAPE of 27.494, MAD of 37.287, and SI of 3.927. These results support decision-making for mining communication needs, promise reliable networks even in the face of formidable obstacles, and pave the way for a more connected and productive future in the mining industry.
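Two of the surveyed path-loss models are standard enough to sketch: free-space and the urban Okumura-Hata form (small/medium-city mobile-antenna correction), ranked by the same R² criterion the study uses. The measured values below are synthetic placeholders, and the antenna heights and 868 MHz carrier are assumptions.

```python
# Sketch: free-space and Okumura-Hata (urban, small/medium city) path loss,
# compared by R^2 against synthetic "measured" data. Hata is nominally valid
# for 150-1500 MHz and 1-20 km; the short distances here are illustrative.
import numpy as np

def free_space_db(d_km, f_mhz):
    return 32.44 + 20 * np.log10(d_km) + 20 * np.log10(f_mhz)

def okumura_hata_db(d_km, f_mhz, h_base=30.0, h_mobile=1.5):
    a_hm = ((1.1 * np.log10(f_mhz) - 0.7) * h_mobile
            - (1.56 * np.log10(f_mhz) - 0.8))       # mobile-antenna correction
    return (69.55 + 26.16 * np.log10(f_mhz) - 13.82 * np.log10(h_base)
            - a_hm + (44.9 - 6.55 * np.log10(h_base)) * np.log10(d_km))

def r_squared(measured, predicted):
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - measured.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

d = np.linspace(0.1, 2.0, 25)   # link distances, km
measured = free_space_db(d, 868.0) + np.random.default_rng(3).normal(0, 3, 25)
for name, model in [("free-space", free_space_db), ("okumura-hata", okumura_hata_db)]:
    print(name, round(r_squared(measured, model(d, 868.0)), 3))
```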
Keywords: Internet of Things; Long Range Wide Area Network; ZigBee; mining environments; path-loss models; coefficient of determination; mean square error
12. Providing Robust and Low-Cost Edge Computing in Smart Grid: An Energy Harvesting Based Task Scheduling and Resource Management Framework (cited by 1)
Authors: Xie Zhigang, Song Xin, Xu Siyang, Cao Jing. China Communications, 2025, No. 2, pp. 226-240 (15 pages).
One of the main challenges currently facing the smart grid is insufficient computing resources and intermittent energy supply for various distributed components (such as monitoring systems for renewable energy power stations). To solve this problem, we propose an energy-harvesting-based task scheduling and resource management framework to provide robust and low-cost edge computing services for the smart grid. First, we formulate an energy consumption minimization problem with regard to task offloading, time switching, and resource allocation for mobile devices, which can be decoupled and transformed into a typical knapsack problem. Solutions are then derived by two different algorithms. Furthermore, we deploy renewable energy and energy storage units at edge servers to tackle intermittency and instability problems. Finally, we design an energy management algorithm based on sampling average approximation for edge computing servers to derive the optimal charging/discharging strategies, number of energy storage units, and renewable energy utilization. The simulation results show the efficiency and superiority of our proposed framework.
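The knapsack reduction mentioned in the abstract can be sketched with a textbook 0/1 dynamic program: treat each task's offloading energy saving as an item value and its edge-resource demand as an item weight. The numbers below are illustrative assumptions, not the paper's data.

```python
# Sketch of the 0/1 knapsack view of task offloading (illustrative values;
# "saving" = local-minus-offloaded energy, "cost" = edge resource share).
def offload_knapsack(savings, costs, capacity):
    # dp[c] = best total energy saving using at most c units of edge capacity.
    dp = [0] * (capacity + 1)
    for s, w in zip(savings, costs):
        for c in range(capacity, w - 1, -1):   # backwards: each task used once
            dp[c] = max(dp[c], dp[c - w] + s)
    return dp[capacity]

savings = [9, 6, 5, 12, 3]   # energy saved by offloading each task (mJ)
costs = [4, 3, 2, 6, 1]      # edge computing resource each task consumes
print(offload_knapsack(savings, costs, capacity=10))  # -> 29
```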
Keywords: edge computing; energy harvesting; energy storage unit; renewable energy; sampling average approximation; task scheduling
13. Modeling and Comprehensive Review of Signaling Storms in 3GPP-Based Mobile Broadband Networks: Causes, Solutions, and Countermeasures
Authors: Muhammad Qasim Khan, Fazal Malik, Fahad Alturise, Noor Rahman. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 123-153 (31 pages).
Control signaling is mandatory for the operation and management of all types of communication networks, including Third Generation Partnership Project (3GPP) mobile broadband networks. However, it consumes important and scarce network resources, such as bandwidth and processing power. There have been several reports of control signaling turning into signaling storms that halt network operations and cause the respective telecom companies large financial losses. This paper draws its motivation from such real network disaster incidents attributed to signaling storms. We present a thorough survey of the causes of signaling storm problems in 3GPP-based mobile broadband networks and discuss in detail their possible solutions and countermeasures. We provide relevant analytical models to help quantify the effect of the potential causes and the benefits of their corresponding solutions. Another important contribution of this paper is a tabular comparison of the possible causes and solutions/countermeasures with respect to their effect on several important network aspects, such as architecture, additional signaling, and fidelity. This paper presents an update and extension of our earlier conference publication. To our knowledge, no similar survey study exists on the subject.
Keywords: signaling storm problems; control signaling load; analytical modeling; 3GPP networks; smart devices; Diameter signaling; mobile broadband data access; data traffic; mobility management signaling; network architecture; 5G mobile communication
14. ChatGPT in Research and Education: A SWOT Analysis of Its Academic Impact
Authors: Abu Saleh Musa Miah, Md Mahbubur Rahman Tusher, Md. Moazzem Hossain, Md Mamun Hossain, Md Abdur Rahim, Md Ekramul Hamid, Md. Saiful Islam, Jungpil Shin. Computer Modeling in Engineering & Sciences, 2025, No. 6, pp. 2573-2614 (42 pages).
Advanced artificial intelligence technologies such as ChatGPT and other large language models (LLMs) have significantly impacted fields such as education and research in recent years. ChatGPT benefits students and educators by providing personalized feedback, facilitating interactive learning, and introducing innovative teaching methods. While many researchers have studied ChatGPT across various subject domains, few analyses have focused on the engineering domain, particularly on the risks of academic dishonesty and potential declines in critical thinking skills. To address this gap, this study explores both the opportunities and limitations of ChatGPT in engineering contexts through a two-part analysis. First, we conducted experiments with ChatGPT to assess its effectiveness in tasks such as code generation, error checking, and solution optimization. Second, we surveyed 125 users, predominantly engineering students, to analyze ChatGPT's role in academic support. Our findings reveal that 93.60% of respondents use ChatGPT for quick academic answers, particularly among early-stage university students, and that 84.00% find it helpful for sourcing research materials. The study also highlights ChatGPT's strengths in programming assistance, with 84.80% of users utilizing it for debugging and 86.40% for solving coding problems. However, limitations persist: many users report inaccuracies in mathematical solutions and occasional false citations. Furthermore, the reliance on the free version by 96% of users underscores its accessibility but also suggests limitations in resource availability. This work provides key insights into ChatGPT's strengths and limitations, establishing a framework for responsible AI use in education. Highlighting areas for improvement marks a milestone in understanding and optimizing AI's role in academia for sustainable future use.
Keywords: academic course planning; ChatGPT; educational technology; research; programming education; large language model; GPT-3; ChatGPT survey; GPT-4; artificial intelligence; SWOT
15. Navigating the Blockchain Trilemma: A Review of Recent Advances and Emerging Solutions in Decentralization, Security, and Scalability Optimization
Authors: Saha Reno, Koushik Roy. Computers, Materials & Continua, 2025, No. 8, pp. 2061-2119 (59 pages).
The blockchain trilemma—balancing decentralization, security, and scalability—remains a critical challenge in distributed ledger technology. Despite significant advancements, achieving all three attributes simultaneously continues to elude most blockchain systems, often forcing trade-offs that limit their real-world applicability. This review synthesizes current research efforts aimed at resolving the trilemma, focusing on innovative consensus mechanisms, sharding techniques, layer-2 protocols, and hybrid architectural models. We critically analyze recent breakthroughs, including Directed Acyclic Graph (DAG)-based structures, cross-chain interoperability frameworks, and zero-knowledge proof (ZKP) enhancements, which aim to reconcile scalability with robust security and decentralization. Furthermore, we evaluate the trade-offs inherent in these approaches, highlighting their practical implications for enterprise adoption, decentralized finance (DeFi), and Web3 ecosystems. By mapping the evolving landscape of solutions, this review identifies gaps in current methodologies and proposes future research directions, such as adaptive consensus algorithms and artificial-intelligence-driven (AI-driven) governance models. Our analysis underscores that while no universal solution exists, interdisciplinary innovations are progressively narrowing the trilemma's constraints, paving the way for next-generation blockchain infrastructures.
Keywords: blockchain trilemma; scalability; decentralization; security; consensus algorithms; sharding; layer-2 solutions; DAG-based architectures; cross-chain interoperability; blockchain optimization
16. Adaptive Relay-Assisted WBAN Protocol: Enhancing Energy Efficiency and QoS through Advanced Multi-Criteria Decision-Making
Authors: Surender Singh, Naveen Bilandi. Computer Modeling in Engineering & Sciences, 2025, No. 7, pp. 489-509 (21 pages).
Wireless Body Area Networks (WBANs) are essential for continuous health monitoring; however, they face energy efficiency challenges due to the low power budget of sensor nodes. Current WBAN routing protocols have limitations in strategically minimizing energy consumption during the retrieval of vital health parameters. Efficient network traffic management remains a challenge, with existing approaches often resulting in increased delay and reduced throughput. Additionally, insufficient attention has been paid to enhancing channel capacity to maintain signal strength and mitigate fading effects under dynamic and robust operating scenarios. Several routing strategies have been developed to reduce communication-related energy consumption based on the selection of relay nodes, which is essential for data transmission in WBANs. This paper introduces the Adaptive Relay-Assisted Protocol (ARAP) for WBANs, a hybrid routing protocol designed to optimize energy use and Quality of Service (QoS) metrics such as network longevity, latency, throughput, and residual energy. ARAP employs neutrosophic relay node selection techniques, including the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), to optimally resolve data and decision-making uncertainties. The protocol was compared with existing protocols such as Low-Energy Adaptive Clustering Hierarchy (LEACH), Modified-Adaptive Threshold Testing and Evaluation Methodology for Performance Testing (M-ATTEMPT), Wireless Adaptive Sampling Protocol (WASP), and Tree-Based Multicast Quality of Service (TMQoS). The comparative results show that ARAP significantly outperforms these protocols, with lower communication cost, better throughput, reduced delay, increased network lifetime, and enhanced residual energy. The simulation results indicate improvements of 68%, 62%, 25%, and 50% in network longevity, residual energy, throughput, and latency, respectively. This significantly extends the functional lifespan of WBANs, making them promising candidates for sophisticated health monitoring systems.
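ARAP's neutrosophic AHP/TOPSIS machinery handles uncertainty explicitly; the sketch below shows only crisp, classical TOPSIS ranking of candidate relays, with hypothetical criteria and AHP-style weights, to make the selection step concrete.

```python
# Sketch of plain TOPSIS relay ranking (assumption: crisp scores; the paper
# uses a neutrosophic AHP/TOPSIS variant with uncertainty handling).
import numpy as np

def topsis(matrix, weights, benefit):
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
    m = m * weights                               # apply criterion weights
    ideal = np.where(benefit, m.max(axis=0), m.min(axis=0))
    anti = np.where(benefit, m.min(axis=0), m.max(axis=0))
    d_pos = np.linalg.norm(m - ideal, axis=1)     # distance to ideal point
    d_neg = np.linalg.norm(m - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                # closeness coefficient

# Candidate relays scored on residual energy, link quality, distance, delay.
relays = np.array([[0.80, 0.70, 12.0, 5.0],
                   [0.60, 0.90,  8.0, 7.0],
                   [0.95, 0.55, 15.0, 4.0]])
weights = np.array([0.35, 0.30, 0.20, 0.15])      # e.g., derived via AHP
benefit = np.array([True, True, False, False])    # higher/lower is better
scores = topsis(relays, weights, benefit)
print("best relay:", int(scores.argmax()), scores.round(3))
```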
Keywords: WBAN; energy efficiency; neutrosophic-AHP; TOPSIS; relay node; QoS; hybrid routing
17. A Comprehensive Study of Resource Provisioning and Optimization in Edge Computing
Authors: Sreebha Bhaskaran, Supriya Muthuraman. Computers, Materials & Continua, 2025, No. 6, pp. 5037-5070 (34 pages).
Efficient resource provisioning, allocation, and computation offloading are critical to realizing low-latency, scalable, and energy-efficient applications in cloud, fog, and edge computing. Despite its importance, integrating Software Defined Networking (SDN) to enhance resource orchestration, task scheduling, and traffic management remains a relatively underexplored area with significant innovation potential. This paper provides a comprehensive review of existing mechanisms, categorizing resource provisioning approaches into static, dynamic, and user-centric models, while examining applications across domains such as IoT, healthcare, and autonomous systems. The survey highlights challenges such as scalability, interoperability, and security in managing dynamic and heterogeneous infrastructures, and evaluates how SDN enables adaptive, policy-based handling of distributed resources through advanced orchestration processes. Furthermore, it proposes future directions, including AI-driven optimization techniques and hybrid orchestration models. By addressing these emerging opportunities, this work serves as a foundational reference for advancing resource management strategies in next-generation cloud, fog, and edge computing ecosystems, and concludes with guidance for managing SDN-enabled computing environments as these opportunities emerge.
Keywords: cloud computing; edge computing; fog computing; resource provisioning; resource allocation; computation offloading; optimization techniques; software defined network
18. Evaluating Domain Randomization Techniques in DRL Agents: A Comparative Study of Normal, Randomized, and Non-Randomized Resets
Author: Abubakar Elsafi. Computer Modeling in Engineering & Sciences, 2025, No. 8, pp. 1749-1766 (18 pages).
Domain randomization is a widely adopted technique in deep reinforcement learning (DRL) for improving agent generalization by exposing policies to diverse environmental conditions. This paper investigates the impact of three reset strategies (normal, non-randomized, and randomized) on agent performance using the Deep Deterministic Policy Gradient (DDPG) and Twin Delayed DDPG (TD3) algorithms within the CarRacing-v2 environment. Two experimental setups were conducted: an extended training regime with DDPG for 1000 steps per episode across 1000 episodes, and a fast-execution setup comparing DDPG and TD3 for 30 episodes with 50 steps per episode under constrained computational resources. A step-based reward scaling mechanism was applied under the randomized reset condition to promote broader state exploration. Experimental results show that randomized resets significantly enhance learning efficiency and generalization, with DDPG demonstrating superior performance across all reset strategies. In particular, DDPG combined with randomized resets achieves the highest smoothed rewards (reaching approximately 15), the best stability, and the fastest convergence. These differences are statistically significant, as confirmed by t-tests: DDPG outperforms TD3 under randomized (t = -101.91, p < 0.0001), normal (t = -21.59, p < 0.0001), and non-randomized (t = -62.46, p < 0.0001) reset conditions. The findings underscore the critical role of reset strategy and reward shaping in enhancing the robustness and adaptability of DRL agents in continuous control tasks, particularly where computational efficiency and training stability are crucial.
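The abstract reports t-tests over episode returns; a sketch of that comparison step is below, using synthetic reward traces as stand-ins for the DDPG/TD3 returns. Welch's unequal-variance test is an assumption here, since the listing does not state which variant the paper used.

```python
# Sketch of the significance test over per-episode returns (synthetic data;
# Welch's t-test is an assumed choice of variant).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
ddpg_rewards = rng.normal(15.0, 2.0, 30)   # stand-in smoothed returns
td3_rewards = rng.normal(9.0, 2.5, 30)

t_stat, p_value = ttest_ind(ddpg_rewards, td3_rewards, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4g}")  # p < 0.05 => significant gap
```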
Keywords: DDPG agent; TD3 agent; deep reinforcement learning; domain randomization; generalization; non-randomized reset; normal reset; randomized reset
19. Edge-Fog Enhanced Post-Quantum Network Security: Applications, Challenges and Solutions
Authors: Seo Yeon Moon, Byung Hyun Jo, Abir El Azzaoui, Sushil Kumar Singh, Jong Hyuk Park. Computers, Materials & Continua, 2025, No. 7, pp. 25-55 (31 pages).
With the rapid advancement of ICT and IoT technologies, the integration of Edge and Fog Computing has become essential to meet increasing demands for real-time data processing and network efficiency. However, these technologies face critical security challenges, exacerbated by the emergence of quantum computing, which threatens traditional encryption methods. The rise in cyber-attacks targeting IoT and Edge/Fog networks underscores the need for robust, quantum-resistant security solutions. To address these challenges, researchers are focusing on Quantum Key Distribution (QKD) and Post-Quantum Cryptography (PQC), which use quantum-resistant algorithms and the principles of quantum mechanics to ensure data confidentiality and integrity. This paper reviews current security practices in IoT and Edge/Fog environments, explores the latest advancements in QKD and PQC technologies, and discusses their integration into distributed computing systems. Additionally, it proposes an enhanced QKD protocol combining the Cascade protocol and the Kyber algorithm to address existing limitations. Finally, we highlight future research directions aimed at improving the scalability, efficiency, and practicality of QKD and PQC for securing IoT and Edge/Fog networks against evolving quantum threats.
Keywords: edge computing; fog computing; quantum key distribution; security; post-quantum cryptography; Cascade protocol
20. An Analytical Review of Large Language Models Leveraging KDGI Fine-Tuning, Quantum Embeddings, and Multimodal Architectures
Authors: Uddagiri Sirisha, Chanumolu Kiran Kumar, Revathi Durgam, Poluru Eswaraiah, G Muni Nagamani. Computers, Materials & Continua, 2025, No. 6, pp. 4031-4059 (29 pages).
A complete examination of Large Language Models' strengths, problems, and applications is needed due to their rising use across disciplines. Current studies frequently focus on single-use situations and lack a comprehensive understanding of LLM architectural performance, strengths, and weaknesses. This gap precludes finding the appropriate models for task-specific applications and limits awareness of emerging LLM optimization and deployment strategies. In this research, 50 studies on 25+ LLMs, including GPT-3, GPT-4, Claude 3.5, DeepKet, and hybrid multimodal frameworks like ContextDET and GeoRSCLIP, are thoroughly reviewed. We propose an LLM application taxonomy by grouping techniques by task focus: healthcare, chemistry, sentiment analysis, agent-based simulations, and multimodal integration. Advanced methods such as parameter-efficient tuning (LoRA), quantum-enhanced embeddings (DeepKet), retrieval-augmented generation (RAG), and safety-focused models (GalaxyGPT) are evaluated for dataset requirements, computational efficiency, and performance measures. Frameworks for ethical issues, data-limited hallucinations, and KDGI-enhanced fine-tuning, such as Woodpecker's post-remedy corrections, are highlighted. The review finds that domain-specialized, fine-tuned LLMs employing RAG and quantum-enhanced embeddings perform better for context-heavy applications. In medical text normalization, ChatGPT-4 outperforms previous models, while the multimodal framework GeoRSCLIP improves remote sensing performance. Parameter-efficient tuning technologies such as LoRA incur minimal computing cost at comparable performance, demonstrating the need for adaptive models across domains. The taxonomy helps identify optimal domain-specific models, explains domain-specific fine-tuning, and presents quantum and multimodal LLMs to address scalability and cross-domain issues. The framework helps academics and practitioners identify, adapt, and innovate LLMs for different purposes, advancing the field of efficient, interpretable, and ethical LLM application research.
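LoRA, the parameter-efficient tuning method the review highlights, admits a compact sketch: freeze a pretrained weight and learn only a low-rank update BA scaled by alpha/r. The layer below is a bare linear example (an assumption for illustration; real adapters are typically injected into an LLM's attention projections).

```python
# Sketch of a LoRA-style linear layer: the base weight is frozen and only
# the low-rank factors A and B are trainable.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_f, out_f, r=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_f, out_f)
        for p in self.base.parameters():          # freeze pretrained weights
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, r))  # zero init: no-op start
        self.scale = alpha / r

    def forward(self, x):
        # Only A and B (r * (in_f + out_f) parameters) receive gradients.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(768, 768)
print(sum(p.numel() for p in layer.parameters() if p.requires_grad))  # 12288
```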
Keywords: large language models; quantum embeddings; fine-tuning techniques; multimodal architectures; ethical AI scenarios