Journal Articles
12,986 articles found
1. Introduction to the Special Issue on Cutting-Edge Security and Privacy Solutions for Next-Generation Intelligent Mobile Internet Technologies and Applications
Authors: Ilsun You, Gaurav Choudhary, Gökhan Kul, Francesco Falmieri. Computer Modeling in Engineering & Sciences, 2026, Issue 3, pp. 34-36 (3 pages)
Abstract: The growing connectivity of the mobile internet has significantly enhanced day-to-day life through services and applications available on demand, at any time and anywhere. Amid continuous digital transformation, add-on technologies such as quantum computing and AI, together with next-generation networks such as 6G, are becoming integral to mobile internet systems. These emerging technologies in the next-generation mobile internet also bring many new security and privacy challenges.
Keywords: mobile internet, emerging technologies, next-generation networks, services, applications, AI, quantum computing, digital transformation
2. Research on an Application-Layer CC Attack Detection Model Based on One-Class SVM
Authors: Hu Xin, Zhang Xin, Zhang Qiao. Modern Transmission (现代传输), 2026, Issue 1, pp. 51-56 (6 pages)
Abstract: To address the strong stealth and detection difficulty of application-layer CC attacks, this paper proposes a CC attack detection method based on an ensemble of One-Class SVM models. First, multi-dimensional features are extracted from real Web access logs to build a training dataset, and strategies such as feature-subspace perturbation, sample-space perturbation, and parameter perturbation are applied to increase sub-model diversity and overall robustness. The multiple One-Class SVM sub-models are then combined into an ensemble decision mechanism to raise detection accuracy and lower the false-alarm rate. Experimental results show that the ensemble One-Class SVM model outperforms single models and traditional methods on accuracy, precision, recall, false-positive rate, and AUC, reaching an AUC of 0.935. Ablation experiments further verify the contribution of each module to overall performance, demonstrating the effectiveness and practicality of the proposed method for application-layer CC attack detection.
Keywords: One-Class SVM, application layer, CC attack detection
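As a rough illustration of the ensemble scheme this abstract describes (feature-subspace plus sample-space perturbation, then majority voting), here is a minimal pure-Python sketch. A simple distance-from-centroid scorer stands in for a real One-Class SVM, and the traffic features (request rate, URL entropy, error ratio) and all numbers are illustrative assumptions, not the paper's actual setup.

```python
import random

def train_one_class(samples, feature_idx):
    # "Train" a toy one-class scorer on a feature subspace: record the
    # per-feature centroid and a radius that covers the training data.
    centroid = [sum(s[i] for s in samples) / len(samples) for i in feature_idx]
    radius = max(
        sum((s[i] - c) ** 2 for i, c in zip(feature_idx, centroid)) ** 0.5
        for s in samples
    )
    return centroid, radius, feature_idx

def is_normal(model, x):
    centroid, radius, feature_idx = model
    dist = sum((x[i] - c) ** 2 for i, c in zip(feature_idx, centroid)) ** 0.5
    return dist <= radius

def ensemble_predict(models, x):
    # Majority vote across sub-models: flag as attack iff most sub-models
    # reject x as lying outside their learned "normal" region.
    votes = sum(is_normal(m, x) for m in models)
    return "normal" if votes > len(models) / 2 else "attack"

random.seed(0)
# Toy "normal traffic" feature vectors: [req_rate, url_entropy, err_ratio]
normal = [[random.gauss(10, 1), random.gauss(3, 0.2), random.gauss(0.05, 0.01)]
          for _ in range(200)]

# Build diverse sub-models via feature-subspace and sample-space perturbation.
models = []
for _ in range(5):
    idx = sorted(random.sample(range(3), 2))        # random feature subspace
    boot = [random.choice(normal) for _ in normal]  # bootstrap resample
    models.append(train_one_class(boot, idx))

print(ensemble_predict(models, [10.2, 3.1, 0.05]))  # typical traffic -> normal
print(ensemble_predict(models, [80.0, 0.5, 0.6]))   # flood-like traffic -> attack
```

A real implementation would replace the centroid scorer with `sklearn.svm.OneClassSVM` sub-models and add the parameter-perturbation axis (varying `nu` and kernel width per sub-model).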
3. Architecture, challenges and applications of dynamic reconfigurable computing (Cited by: 6)
Authors: Yanan Lu, Leibo Liu, Jianfeng Zhu, Shouyi Yin, Shaojun Wei. Journal of Semiconductors (EI, CAS, CSCD), 2020, Issue 2, pp. 4-13 (10 pages)
Abstract: As a computing paradigm that combines temporal and spatial computation, dynamic reconfigurable computing offers superior flexibility, energy efficiency, and area efficiency, attracting interest from both academia and industry. However, it is not yet mature because of several unsolved problems. This work introduces the concept, architecture, and compilation techniques of dynamic reconfigurable computing, discusses the major remaining challenges, and points out potential applications.
Keywords: reconfigurable computing, architecture, challenges, applications
4. Design of ECC-Based Secured Cloud Storage Mechanism for Transaction-Rich Applications (Cited by: 5)
Authors: V. Gopinath, R. S. Bhuvaneswaran. Computers, Materials & Continua (SCIE, EI), 2018, Issue 11, pp. 341-352 (12 pages)
Abstract: Cloud computing is in high demand today. Owing to its service-oriented architecture, seamless accessibility, and other advantages, many transaction-rich applications make use of it. At the same time, it is vulnerable to attacks and threats, so securing this environment is of utmost importance, and much research has focused on it. This paper proposes a safe storage mechanism using elliptic curve cryptography (ECC) for transaction-rich applications (TRA). The ECC-based security scheme raises the security level of the protected system and is well suited to securing data delivered to portable devices. The proposed scheme protects the mapping of different kinds of data elements to each provider using an ECC algorithm. Analysis, comparison, and simulation show that the proposed system is more effective and secure for transaction-rich applications in the cloud.
Keywords: ECC, SSL VPN, cloud computing, banking, security, transaction-rich applications
5. Virtualization Technology in Cloud Computing Based Radio Access Networks: A Primer (Cited by: 2)
Authors: Zhang Xian, Peng Mugen. ZTE Communications, 2017, Issue 4, pp. 47-66 (20 pages)
Abstract: Because virtualization technology enables flexible abstraction, sharing, and management of resources, it can significantly reduce the overall expense of network deployment, and it has been widely applied in the core network. With the tremendous growth in mobile traffic and services, it is natural to extend virtualization to cloud computing based radio access networks (CC-RANs) to achieve high spectral efficiency at low cost. This paper surveys virtualization technologies in CC-RANs, covering the system architecture, key enabling techniques, challenges, and open issues. The key enabling technologies, mainly virtual resource allocation, radio access network (RAN) slicing, mobility management, and social awareness, are comprehensively surveyed with respect to the isolation, customization, and high-efficiency utilization of radio resources. The challenges and open issues focus on virtualization levels for CC-RANs, signaling design, performance analysis, and network security for virtualized CC-RANs.
Keywords: network virtualization, CC-RAN, RAN slicing, fog computing
6. Impact of Coronavirus Pandemic Crisis on Technologies and Cloud Computing Applications
Authors: Ziyad R. Alashhab, Mohammed Anbar, Manmeet Mahinderjit Singh, Yu-Beng Leau, Zaher Ali Al-Sai, Sami Abu Alhayja'a. Journal of Electronic Science and Technology (CAS, CSCD), 2021, Issue 1, pp. 25-40 (16 pages)
Abstract: In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions instructed their employees to work from home as a precaution against contagion. Working from home, however, has exposed employees to new security risks. Moreover, the rapid global spread of COVID-19 increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees accomplish their tasks efficiently. The cloud computing environment (CCE) is an unsung hero of the COVID-19 pandemic crisis, supporting fast-paced service practices and rapidly deployable applications for maintaining data. Despite the increased use of CC applications, guaranteeing data security and the availability of CC applications remain open research challenges in the CCE. To the best of our knowledge, this is the first paper to thoroughly explain the impact of the COVID-19 pandemic on the CCE; it also highlights the security risks of working from home during the pandemic.
Keywords: big data privacy, cloud computing (CC) applications, COVID-19, digital transformation, security challenges, work from home
7. Improving Performance of Cloud Computing and Big Data Technologies and Applications (Cited by: 1)
Authors: Zhenjiang Dong. ZTE Communications, 2014, Issue 4, pp. 1-2 (2 pages)
Abstract: Cloud computing technology is changing the development and usage patterns of IT infrastructure and applications. Virtualized and distributed systems, together with unified management and scheduling, have greatly improved computing and storage; management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies allocate and use resources far more efficiently, and information security is ensured. In most existing virtual cloud desktop solutions, however, computing and storage are bound together and data is stored as image files. This limits the flexibility and expandability of such systems and is insufficient for meeting customers' requirements in different scenarios.
Keywords: cloud computing, big data, HBase
8. Mobile Cloud Computing and Applications
Authors: Chengzhong Xu. ZTE Communications, 2011, Issue 1, p. 3 (1 page)
Abstract: In 2010, cloud computing gained momentum. Cloud computing is a model for real-time, on-demand, pay-for-use network access to a shared pool of configurable computing and storage resources. It has matured from a promising business concept into a working reality in both the private and public IT sectors. The U.S. government, for example, has asked all its agencies to evaluate cloud computing alternatives as part of their budget submissions for new IT investment.
Keywords: mobile cloud computing, applications, IaaS
9. Edge Cloud Selection in Mobile Edge Computing (MEC)-Aided Applications for Industrial Internet of Things (IIoT) Services
Authors: Dae-Young Kim, SoYeon Lee, MinSeung Kim, Seokhoon Kim. Computer Systems Science & Engineering (SCIE, EI), 2023, Issue 11, pp. 2049-2060 (12 pages)
Abstract: In many IIoT architectures, devices connect to the edge cloud via gateway systems, and large volumes of data are delivered to the edge cloud for processing. Delivering data to an appropriate edge cloud is critical for IIoT service efficiency. Such an IoT network incurs two kinds of cost: a communication cost and a computing cost. For service efficiency, both the communication cost of data transmission and the computing cost in the edge cloud should be minimized. In this paper, the communication cost is defined as a delay factor and the computing cost as the waiting time implied by the computing intensity. The proposed method selects the edge cloud that minimizes the total of the communication and computing costs; a device then chooses a routing path to the selected edge cloud based on these costs. The method controls data flows in a mesh-structured network and appropriately distributes the data-processing load. Its performance is validated through extensive computer simulation: when the transition probability from good to bad is 0.3 and from bad to good is 0.7 in the wireless and edge cloud states, the proposed method reduced both the average delay and the service pause count to about 25% of the existing method's.
Keywords: Industrial Internet of Things (IIoT) network, IIoT service, mobile edge computing (MEC), edge cloud selection, MEC-aided application
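The cost-minimizing selection this abstract describes can be sketched as follows. The additive cost model (propagation delay plus queue length over service rate) and the topology numbers are assumptions for illustration, not the paper's exact formulation.

```python
def total_cost(comm_delay, queue_len, service_rate):
    # Total cost = communication delay to the edge cloud plus the expected
    # waiting time there (pending work divided by service rate).
    return comm_delay + queue_len / service_rate

def select_edge_cloud(device, clouds):
    # Pick the edge cloud with the minimum combined cost for this device;
    # the device would then route its data toward the winner.
    return min(
        clouds,
        key=lambda c: total_cost(c["delay_to"][device], c["queue"], c["rate"]),
    )

# Hypothetical topology: two edge clouds reachable from device "d1".
# edge-A is closer but heavily loaded; edge-B is farther but nearly idle.
clouds = [
    {"name": "edge-A", "delay_to": {"d1": 5.0}, "queue": 50, "rate": 10.0},
    {"name": "edge-B", "delay_to": {"d1": 8.0}, "queue": 10, "rate": 10.0},
]
best = select_edge_cloud("d1", clouds)
print(best["name"])  # edge-B: 8 + 1 = 9 beats edge-A's 5 + 5 = 10
```

The point of the example is that pure nearest-cloud routing would pick edge-A, while the joint communication-plus-computing cost shifts load to edge-B.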
10. Research on the Development Trend and General Applications of the Fuzzy Rough Granularity Computing Theory
Authors: Xin Li. International Journal of Technology Management, 2016, Issue 5, pp. 22-24 (3 pages)
Abstract: This paper studies the development trend and general applications of fuzzy rough granular computing theory. Granular computing is a general information-processing and computing paradigm that covers the theory, methods, techniques, and tools of granularity. Its basic ideas appear in many areas, such as interval analysis, rough set theory, clustering analysis, information retrieval, machine learning, and databases. Through domain-knowledge-based partitioning of the target concept and rule acquisition, it is widely used in knowledge discovery, data mining, and pattern recognition. On this basis, the paper proposes a fuzzy rough theory based computing paradigm that achieves good performance.
Keywords: development trend, applications, fuzzy rough granular computing, theory
11. The Benefits of Using Google Cloud Computing for Developing Distributed Applications
Authors: Isak Shabani, Amir Kovaci, Agni Dika. Journal of Mathematics and System Science, 2015, Issue 4, pp. 156-164 (9 pages)
Abstract: IT is a dynamic field that changes very rapidly; for most companies, efficiently managing IT systems means handling tremendously complex hardware and software setups. Hardware and software themselves change quickly, and keeping them up to date is difficult, especially for companies with large IT infrastructure such as data centers, which are expensive to maintain. Many applications run on company premises and require well-prepared staff to maintain them. Since the inception of cloud computing, many companies have moved their applications and data onto cloud platforms to reduce maintenance costs, simplify hardware and software upkeep, and obtain reliable, securely accessible services. This paper discusses the benefits of building distributed applications on Google's infrastructure.
Keywords: Datastore, Bigtable, distributed applications, cloud computing
12. Scaled Up Chip Pushes Quantum Computing a Bit Closer to Reality (Cited by: 1)
Authors: Chris Palmer. Engineering, 2025, Issue 7, pp. 6-8 (3 pages)
Abstract: In the 9 December 2024 issue of Nature [1], a team of Google engineers reported breakthrough results using "Willow", their latest quantum computing chip (Fig. 1). By meeting a milestone "below threshold" reduction in the rate of errors that plague superconducting circuit-based quantum computing systems (Fig. 2), the work moves the field another step toward its promised supercharged applications, albeit likely still many years away. Areas expected to benefit from quantum computing include, among others, drug discovery, materials science, finance, cybersecurity, and machine learning.
Keywords: materials science, breakthrough, drug discovery, Willow chip, quantum computing, superconducting circuits, error reduction, applications
13. Applications of photon-counting CT in oncologic imaging: A systematic review (Cited by: 1)
Authors: Arosh S Perera Molligoda Arachchige, Anna Dashiell, Anton Shiraan Jesuraj, Antonia Immacolata D'Urso, Benedetta Fiore, Martina Cattaneo, Emilia Pierzynska, Sandra Szydelko, Francesca Romana Centini, Yash Verma. World Journal of Radiology, 2025, Issue 8, pp. 74-83 (10 pages)
Abstract: BACKGROUND: Photon-counting detector (PCD) CT represents a transformative advancement in radiological imaging, offering superior spatial resolution, enhanced contrast-to-noise ratio, and reduced radiation dose compared with conventional energy-integrating detector CT. AIM: To evaluate PCD CT in oncologic imaging, focusing on its role in tumor detection, staging, and treatment-response assessment. METHODS: We performed a systematic PubMed search from January 1, 2017 to December 31, 2024, using the keywords "photon-counting CT", "cancer", and "tumor" to identify studies on its use in oncologic imaging. We included experimental studies on humans or human phantoms and excluded reviews, commentaries, editorials, non-English, animal, and non-experimental studies. Study selection followed Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Of 175 initial studies, 39 met the inclusion criteria after screening and full-text review. Data extraction focused on study type, country of origin, and oncologic applications of photon-counting CT. No formal risk-of-bias assessment was performed, and the review was not registered in PROSPERO because it did not include a meta-analysis. RESULTS: Key findings highlighted the advantages of PCD CT in imaging renal masses, adrenal adenomas, ovarian cancer, breast cancer, prostate cancer, pancreatic tumors, hepatocellular carcinoma, metastases, multiple myeloma, and lung cancer. PCD CT has also demonstrated improved lesion characterization and enhanced diagnostic accuracy in oncology. Despite its promising capabilities, challenges related to data processing, storage, and accessibility remain. CONCLUSION: As PCD CT technology evolves, its integration into routine oncologic imaging has the potential to significantly enhance cancer diagnosis and patient management.
Keywords: photon-counting detector CT, oncologic imaging, cancer detection, tumor characterization, spectral imaging, radiology, computed tomography, diagnostic imaging, radiation dose reduction
14. A Decade Review of Video Compressive Sensing: A Roadmap to Practical Applications
Authors: Zhihong Zhang, Siming Zheng, Min Qiu, Guohai Situ, David J. Brady, Qionghai Dai, Jinli Suo, Xin Yuan. Engineering, 2025, Issue 3, pp. 172-185 (14 pages)
Abstract: It has been over a decade since the first coded-aperture video compressive sensing (CS) system was reported. The underlying principle is to place a high-frequency modulator in the optical path to modulate a recorded high-speed scene within one integration time. The image captured in this manner is the superposition of multiple modulated frames and is therefore both modulated and compressed; reconstruction algorithms are then used to recover the desired high-speed scene. A leading advantage of video CS is that a single captured measurement can be used to reconstruct a multi-frame video, enabling a low-speed camera to capture high-speed scenes. Inspired by this, a number of variants of video CS systems have been built, mainly using different modulation devices. Meanwhile, to obtain high-quality reconstructed videos, many algorithms have been developed, from optimization-based iterative algorithms to deep-learning-based ones. Recently, deep learning methods have become dominant thanks to their high-speed inference and high-quality reconstruction, highlighting the possibility of deploying video CS in practical applications. Toward this end, this paper reviews the progress made in video CS during the past decade, analyzes the hardware and algorithmic efforts still needed to enable real applications, identifies research gaps, and summarizes future directions for researchers and engineers working on this topic.
Keywords: video compressive sensing, computational imaging, deep learning, practical applications
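The coded-aperture measurement principle this abstract describes can be written as y = Σ_t M_t ⊙ x_t: each high-speed frame is masked by a time-varying modulation pattern and the camera integrates the masked frames into one snapshot. The toy sketch below simulates that forward model with random binary masks; frame sizes and mask statistics are illustrative assumptions.

```python
import random

def cs_measure(frames, masks):
    # Coded-aperture video CS forward model: each high-speed frame x_t is
    # multiplied elementwise by a binary mask M_t, and the sensor integrates
    # the masked frames into a single snapshot y = sum_t M_t * x_t.
    h, w = len(frames[0]), len(frames[0][0])
    y = [[0.0] * w for _ in range(h)]
    for frame, mask in zip(frames, masks):
        for i in range(h):
            for j in range(w):
                y[i][j] += mask[i][j] * frame[i][j]
    return y

random.seed(1)
T, H, W = 8, 4, 4   # 8 high-speed frames compressed into one 4x4 snapshot
frames = [[[random.random() for _ in range(W)] for _ in range(H)]
          for _ in range(T)]
masks = [[[random.randint(0, 1) for _ in range(W)] for _ in range(H)]
         for _ in range(T)]

snapshot = cs_measure(frames, masks)
# One measurement now encodes T frames; a reconstruction algorithm
# (iterative optimization or a deep network) would invert this model.
print(len(snapshot), len(snapshot[0]))  # 4 4
```

This is only the sensing side; the review's subject is the much harder inverse problem of recovering the T frames from `snapshot` given the known masks.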
15. Stochastic Fractal Search: A Decade Comprehensive Review on Its Theory, Variants, and Applications
Authors: Mohammed A. El-Shorbagy, Anas Bouaouda, Laith Abualigah, Fatma A. Hashim. Computer Modeling in Engineering & Sciences, 2025, Issue 3, pp. 2339-2404 (66 pages)
Abstract: With rapid advances in technology and science, optimization theory and algorithms have become increasingly important. A wide range of real-world problems can be cast as optimization challenges, and meta-heuristic algorithms have proved remarkably effective at solving them across diverse domains such as machine learning, process control, and engineering design. The Stochastic Fractal Search (SFS) algorithm is one of the most popular meta-heuristic optimization methods, inspired by the fractal growth patterns of natural materials. Since its introduction by Hamid Salimi in 2015, SFS has attracted significant attention from researchers and has been applied to diverse optimization problems across multiple disciplines. Its popularity stems from several factors, including simplicity, practical computational efficiency, ease of implementation, rapid convergence, high effectiveness, and the ability to address single- and multi-objective optimization problems, often outperforming other established algorithms. This review offers a comprehensive, detailed analysis of the SFS algorithm, covering its standard version, modifications, hybridizations, and multi-objective implementations. It also examines SFS applications across domains including power and energy systems, image processing, machine learning, wireless sensor networks, environmental modeling, economics and finance, and numerous engineering challenges. Furthermore, the paper critically evaluates SFS performance, benchmarking it against recently published meta-heuristic algorithms, and concludes by highlighting key findings and suggesting directions for future developments and modifications of the SFS algorithm.
Keywords: meta-heuristic algorithms, stochastic fractal search, evolutionary computation, engineering applications, swarm intelligence, optimization
16. A Quality of Service Analysis of FPGA-Accelerated Conv2D Architectures for Brain Tumor Multi-Classification
Authors: Ayoub Mhaouch, Wafa Gtifa, Turke Althobaiti, Hamzah Faraj, Mohsen Machhout. Computers, Materials & Continua, 2025, Issue 9, pp. 5637-5663 (27 pages)
Abstract: Accurate brain tumor classification in medical imaging requires real-time processing and efficient computation, making hardware acceleration essential. Field-programmable gate arrays (FPGAs) offer parallelism and reconfigurability, making them well suited to such tasks. In this study, we propose a hardware-accelerated convolutional neural network (CNN) for brain cancer classification, implemented on the PYNQ-Z2 FPGA. Our approach optimizes the first Conv2D layer using different numerical representations, 8-bit fixed point (INT8), 16-bit fixed point (FP16), and 32-bit fixed point (FP32), while the remaining layers run on an ARM Cortex-A9 processor. Experimental results demonstrate that FPGA acceleration significantly outperforms the CPU-based approach and emphasize the importance of selecting an appropriate numerical representation for hardware acceleration in medical imaging. On the PYNQ-Z2, INT8 achieves a 16.8% latency reduction and 22.2% power savings compared with FP32, making it ideal for real-time and energy-constrained applications. FP16 offers a strong balance, with only a 0.1% drop in accuracy relative to FP32 (94.1% vs. 94.2%) while improving latency by 5% and reducing power consumption by 11.1%. Compared with prior work, the proposed FPGA-based CNN achieves the highest classification accuracy (94.2%) with a throughput of up to 1.562 FPS, outperforming GPU-based and traditional CPU methods in both accuracy and hardware efficiency. These findings demonstrate the effectiveness of FPGA-based AI acceleration for real-time, power-efficient, high-performance brain tumor classification in next-generation medical imaging systems.
Keywords: brain cancer, hardware implementation, convolutional neural networks, performance evaluation, efficient computing, real-time medical applications
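The fixed-point representations this abstract compares can be illustrated with a small quantize/dequantize round trip. The Q-format chosen here (an 8-bit word with 6 fractional bits) and the weight values are assumptions for illustration, since the abstract does not specify the paper's exact format.

```python
def quantize(x, frac_bits, total_bits):
    # Convert a float to signed fixed-point: scale by 2^frac_bits, round to
    # the nearest integer, then clamp to the range of a total_bits-bit word.
    scale = 1 << frac_bits
    q = round(x * scale)
    lo, hi = -(1 << (total_bits - 1)), (1 << (total_bits - 1)) - 1
    return max(lo, min(hi, q))

def dequantize(q, frac_bits):
    # Recover the approximate float value from its fixed-point code.
    return q / (1 << frac_bits)

# Hypothetical Conv2D weights; a real layer would quantize whole tensors.
weights = [0.731, -0.042, 0.305]
q8 = [quantize(w, frac_bits=6, total_bits=8) for w in weights]
back = [dequantize(q, 6) for q in q8]
print(q8)    # [47, -3, 20]
print(back)  # close to the originals, within the 2^-7 rounding bound
```

The trade-off the paper measures follows directly: narrower words (INT8) shrink memory traffic and arithmetic cost on the FPGA at the price of larger rounding error, while wider fixed-point formats recover accuracy at higher latency and power.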
17. Intelligent Energy-Efficient Resource Allocation for Multi-UAV-Assisted Mobile Edge Computing Networks
Authors: Hu Han, Shen Le, Zhou Fuhui, Wang Qun, Zhu Hongbo. China Communications, 2025, Issue 4, pp. 339-355 (17 pages)
Abstract: Unmanned aerial vehicle (UAV)-assisted mobile edge computing (MEC) is a promising way for energy-constrained devices to run smart applications with computation-intensive and latency-sensitive requirements, especially in infrastructure-limited areas or emergency scenarios. However, the multi-UAV-assisted MEC network remains largely unexplored. In this paper, dynamic trajectory optimization and computation offloading are studied in a multi-UAV-assisted MEC system where multiple UAVs fly over a target area along different trajectories to serve ground users. Considering dynamic channel conditions and random task arrivals, and jointly optimizing the UAVs' trajectories, user association, and subchannel assignment, the problem of minimizing the long-term average sum of user energy consumption is formulated. To handle a problem involving both discrete and continuous variables, a hybrid-decision deep reinforcement learning (DRL) based intelligent energy-efficient resource allocation and trajectory optimization algorithm, named HDRT, is proposed, in which a deep Q-network (DQN) processes the discrete variables and deep deterministic policy gradient (DDPG) processes the continuous ones. Simulation results show that HDRT converges quickly and outperforms other benchmarks in both user energy consumption and latency.
Keywords: dynamic trajectory optimization, intelligent resource allocation, UAV-assisted MEC, energy efficiency, smart applications, mobile edge computing (MEC), deep reinforcement learning
18. Deep Learning in Medical Image Analysis: A Comprehensive Review of Algorithms, Trends, Applications, and Challenges
Authors: Dawa Chyophel Lepcha, Bhawna Goyal, Ayush Dogra, Ahmed Alkhayyat, Prabhat Kumar Sahu, Aaliya Ali, Vinay Kukreja. Computer Modeling in Engineering & Sciences, 2025, Issue 11, pp. 1487-1573 (87 pages)
Abstract: Medical image analysis has become a cornerstone of modern healthcare, driven by the exponential growth of data from imaging modalities such as MRI, CT, PET, ultrasound, and X-ray. Traditional machine learning methods made early contributions; however, recent advances in deep learning (DL) have revolutionized the field, offering state-of-the-art performance in image classification, segmentation, detection, fusion, registration, and enhancement. This comprehensive review presents an in-depth analysis of deep learning methodologies across medical image analysis tasks, highlighting both foundational models and recent innovations. The article begins with conventional techniques and their limitations, setting the stage for DL-based solutions. Core DL architectures, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), generative adversarial networks (GANs), vision transformers (ViTs), and hybrid models, are discussed in detail, together with their advantages and domain-specific adaptations. Advanced learning paradigms such as semi-supervised, self-supervised, and few-shot learning are explored for their potential to mitigate data-annotation challenges in clinical datasets. The review then categorizes the major tasks in medical image analysis, elaborating on how DL techniques have enabled precise tumor segmentation, lesion detection, modality fusion, super-resolution, and robust classification across diverse clinical settings, with emphasis on applications in oncology, cardiology, neurology, and infectious diseases, including COVID-19. Challenges such as data scarcity, label imbalance, model generalizability, interpretability, and integration into clinical workflows are critically examined. Ethical considerations, explainable AI (XAI), federated learning, and regulatory compliance are discussed as essential components of real-world deployment. Benchmark datasets, evaluation metrics, and comparative performance analyses are presented to support future research. The article concludes with a forward-looking perspective on the role of foundation models, multimodal learning, edge AI, and bio-inspired computing in the future of medical imaging. Overall, this review serves as a valuable resource for researchers, clinicians, and developers aiming to harness deep learning for intelligent, efficient, and clinically viable medical image analysis.
Keywords: medical image analysis, deep learning (DL), artificial intelligence (AI), neural networks, convolutional neural networks (CNNs), generative adversarial networks (GANs), transformers, natural language processing (NLP), computational applications
19. Research on the Integrated Application of the Access Database with Other Office Software in Computerized Office Automation Systems
Authors: Yang Xiaorong. Science & Technology Information (科技资讯), 2025, Issue 22, pp. 33-35 (3 pages)
Abstract: As electronic information technology is applied ever more deeply along the innovation and industrial chains, building computerized office automation systems has become a prerequisite for the efficient operation of organizations across industries. Office automation is an important means of improving work efficiency, optimizing management processes, and driving enterprise digital transformation. This paper studies the underlying logic of integrating the Access database with other office software and works through the integration design in detail, aiming to meet the office automation needs of a changing era and to provide a useful reference for improving office efficiency.
Keywords: computerized office automation, Access database, office software, integrated application
20. The Competence of Volunteer Computing for MapReduce Big Data Applications
Authors: Wei Li, William Guo. Proceedings of the International Conference on Computer Frontiers (国际计算机前沿大会会议论文集), 2018, Issue 1, p. 2 (1 page)