Journal Articles: 253,342 articles found
1. Offload Strategy for Edge Computing in Satellite Networks Based on Software Defined Network (Cited by: 1)
Authors: Zhiguo Liu, Yuqing Gui, Lin Wang, Yingru Jiang. 《Computers, Materials & Continua》 (SCIE, EI), 2025, Issue 1, pp. 863-879, 17 pages.
Satellite edge computing has garnered significant attention from researchers; however, processing a large volume of tasks within multi-node satellite networks still poses considerable challenges. The sharp increase in user demand for latency-sensitive tasks has inevitably led to offloading bottlenecks and insufficient computational capacity on individual satellite edge servers, making it necessary to implement effective task offloading scheduling to enhance user experience. In this paper, we propose a priority-based task scheduling strategy based on a Software-Defined Network (SDN) framework for satellite-terrestrial integrated networks, which clarifies the execution order of tasks based on their priority. Subsequently, we apply a Dueling-Double Deep Q-Network (DDQN) algorithm enhanced with prioritized experience replay to derive a computation offloading strategy, improving the experience replay mechanism within the Dueling-DDQN framework. Next, we utilize the Deep Deterministic Policy Gradient (DDPG) algorithm to determine the optimal resource allocation strategy to reduce the processing latency of sub-tasks. Simulation results demonstrate that the proposed d3-DDPG algorithm outperforms other approaches, effectively reducing task processing latency and thus improving user experience and system efficiency.
Keywords: Satellite network; edge computing; task scheduling; computing offloading
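The dueling architecture referenced in this abstract splits the Q-value into a state-value stream and an advantage stream. The sketch below is a minimal, generic illustration of that idea in PyTorch, not the authors' d3-DDPG implementation; the state dimension, action count, and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DuelingQNetwork(nn.Module):
    """Generic dueling Q-network: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)."""
    def __init__(self, state_dim: int, action_dim: int, hidden: int = 128):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)               # state-value stream V(s)
        self.advantage = nn.Linear(hidden, action_dim)  # advantage stream A(s, a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.feature(state)
        v = self.value(h)
        a = self.advantage(h)
        # Subtracting the mean advantage keeps V and A identifiable.
        return v + a - a.mean(dim=-1, keepdim=True)

# Illustrative usage: a 10-dimensional task/network state and 4 candidate offloading targets.
q_net = DuelingQNetwork(state_dim=10, action_dim=4)
q_values = q_net(torch.randn(2, 10))   # batch of 2 states -> (2, 4) Q-values
print(q_values.shape)
```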
2. Role of photon-counting computed tomography in pediatric cardiovascular imaging (Cited by: 1)
Authors: Arosh S Perera Molligoda Arachchige, Yash Verma. 《World Journal of Clinical Pediatrics》, 2025, Issue 1, pp. 55-62, 8 pages.
Photon-counting computed tomography (PCCT) represents a significant advancement in pediatric cardiovascular imaging. Traditional CT systems employ energy-integrating detectors that convert X-ray photons into visible light, whereas PCCT utilizes photon-counting detectors that directly transform X-ray photons into electric signals. This direct conversion allows photon-counting detectors to sort photons into discrete energy levels, thereby enhancing image quality through superior noise reduction, improved spatial and contrast resolution, and reduced artifacts. In pediatric applications, PCCT offers substantial benefits, including lower radiation doses, which may help reduce the risk of malignancy in pediatric patients, with perhaps greater potential to benefit those with repeated exposure from a young age. Enhanced spatial resolution facilitates better visualization of small structures, vital for diagnosing congenital heart defects. Additionally, PCCT's spectral capabilities improve tissue characterization and enable the creation of virtual monoenergetic images, which enhance soft-tissue contrast and potentially reduce contrast media doses. Initial clinical results indicate that PCCT provides superior image quality and diagnostic accuracy compared to conventional CT, particularly in challenging pediatric cardiovascular cases. As PCCT technology matures, further research and standardized protocols will be essential to fully integrate it into pediatric imaging practices, ensuring optimized diagnostic outcomes and patient safety.
Keywords: Cardiovascular; Photon-counting detectors; Pediatric; Photon-counting computed tomography; Computed tomography
3. Streamlined photonic reservoir computer with augmented memory capabilities (Cited by: 3)
Authors: Changdi Zhou, Yu Huang, Yigong Yang, Deyu Cai, Pei Zhou, Kuenyao Lau, Nianqiang Li, Xiaofeng Li. 《Opto-Electronic Advances》, 2025, Issue 1, pp. 45-57, 13 pages.
Photonic platforms are gradually emerging as a promising option to meet the ever-growing demand for artificial intelligence, among which photonic time-delay reservoir computing (TDRC) is widely anticipated. While such a computing paradigm can employ only a single photonic device as the nonlinear node for data processing, its performance relies heavily on the fading memory provided by the delay feedback loop (FL), which restricts the extensibility of physical implementations, especially for highly integrated chips. Here, we present a simplified photonic scheme for more flexible parameter configurations leveraging the designed quasi-convolution coding (QC), which completely removes the dependence on the FL. Unlike delay-based TDRC, encoded data in QC-based RC (QRC) enables temporal feature extraction, facilitating augmented memory capabilities. Thus, our proposed QRC can handle time-related tasks or sequential data without implementing an FL. Furthermore, we can implement this hardware with a low-power, easily integrable vertical-cavity surface-emitting laser for high-performance parallel processing. We validate the concept through simulation and experimental comparison of QRC and TDRC, wherein the simpler-structured QRC outperforms across various benchmark tasks. Our results may point to an auspicious solution for the hardware implementation of deep neural networks.
Keywords: photonic reservoir computing; machine learning; vertical-cavity surface-emitting laser; quasi-convolution coding; augmented memory capabilities
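In reservoir computing generally, including the photonic variants discussed here, only the linear readout is trained, typically by ridge regression on the collected reservoir states. The NumPy sketch below illustrates that generic training step on a toy software reservoir; the reservoir dynamics, state size, task, and regularization value are assumptions for illustration and not the authors' QRC system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "reservoir": a fixed random recurrent map driven by a scalar input sequence.
n_nodes, n_steps = 50, 500
W_res = rng.normal(scale=0.1, size=(n_nodes, n_nodes))
W_in = rng.normal(scale=1.0, size=n_nodes)
u = rng.uniform(-1, 1, n_steps)          # input sequence
target = np.roll(u, 3)                   # toy memory task: recall the input 3 steps back

states = np.zeros((n_steps, n_nodes))
x = np.zeros(n_nodes)
for t in range(n_steps):
    x = np.tanh(W_res @ x + W_in * u[t])  # nonlinear node dynamics (illustrative)
    states[t] = x

# Train only the linear readout with ridge regression:
# W_out = Y X^T (X X^T + lam I)^(-1)
lam = 1e-6
X = states.T
W_out = target @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n_nodes))
prediction = W_out @ X
print("training NMSE:", np.mean((prediction - target) ** 2) / np.var(target))
```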
4. Optoelectronic memristor based on a-C:Te film for multi-mode reservoir computing (Cited by: 2)
Authors: Qiaoling Tian, Kuo Xun, Zhuangzhuang Li, Xiaoning Zhao, Ya Lin, Ye Tao, Zhongqiang Wang, Daniele Ielmini, Haiyang Xu, Yichun Liu. 《Journal of Semiconductors》, 2025, Issue 2, pp. 144-149, 6 pages.
Optoelectronic memristors are generating growing research interest for highly efficient computing and sensing-memory applications. In this work, an optoelectronic memristor with an Au/a-C:Te/Pt structure is developed. Synaptic functions, i.e., excitatory post-synaptic current and paired-pulse facilitation, are successfully mimicked with the memristor under electrical and optical stimulation. More importantly, the device exhibits distinguishable response currents when adjusting 4-bit input electrical/optical signals. A multi-mode reservoir computing (RC) system is constructed with the optoelectronic memristors to emulate human tactile-visual fusion recognition, and an accuracy of 98.7% is achieved. The optoelectronic memristor shows potential for developing multi-mode RC systems.
Keywords: optoelectronic memristor; volatile switching; multi-mode reservoir computing
5. Dynamic Task Offloading Scheme for Edge Computing via Meta-Reinforcement Learning (Cited by: 1)
Authors: Jiajia Liu, Peng Xie, Wei Li, Bo Tang, Jianhua Liu. 《Computers, Materials & Continua》, 2025, Issue 2, pp. 2609-2635, 27 pages.
As an important complement to cloud computing, edge computing can effectively reduce the workload of the backbone network. To reduce latency and energy consumption of edge computing, deep learning is used to learn the task offloading strategies by interacting with the entities. In actual application scenarios, users of edge computing are always changing dynamically. However, the existing task offloading strategies cannot be applied to such dynamic scenarios. To solve this problem, we propose a novel dynamic task offloading framework for distributed edge computing, leveraging the potential of meta-reinforcement learning (MRL). Our approach formulates a multi-objective optimization problem aimed at minimizing both delay and energy consumption. We model the task offloading strategy using a directed acyclic graph (DAG). Furthermore, we propose a distributed edge computing adaptive task offloading algorithm rooted in MRL. This algorithm integrates multiple Markov decision processes (MDP) with a sequence-to-sequence (seq2seq) network, enabling it to learn and adapt task offloading strategies responsively across diverse network environments. To achieve joint optimization of delay and energy consumption, we incorporate the non-dominated sorting genetic algorithm II (NSGA-II) into our framework. Simulation results demonstrate the superiority of our proposed solution, achieving a 21% reduction in time delay and a 19% decrease in energy consumption compared to alternative task offloading schemes. Moreover, our scheme exhibits remarkable adaptability, responding swiftly to changes in various network environments.
Keywords: Edge computing; adaptive; META; task offloading; joint optimization
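NSGA-II, mentioned in the abstract, ranks candidate offloading plans by Pareto dominance over the two objectives (delay, energy). The snippet below is a generic sketch of that dominance test and of extracting the first non-dominated front; the candidate values are invented for illustration, and this is not the authors' implementation.

```python
from typing import List, Tuple

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """a dominates b if it is no worse in both objectives and strictly better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def first_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the non-dominated (Pareto-optimal) candidates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (delay in ms, energy in J) pairs for five candidate offloading plans.
candidates = [(12.0, 0.8), (9.0, 1.1), (15.0, 0.6), (9.5, 0.9), (11.0, 1.2)]
print(first_front(candidates))   # the last plan is dominated and drops out
```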
6. Near-Sensor Edge Computing System Enabled by a CMOS Compatible Photonic Integrated Circuit Platform Using Bilayer AlN/Si Waveguides (Cited by: 1)
Authors: Zhihao Ren, Zixuan Zhang, Yangyang Zhuge, Zian Xiao, Siyu Xu, Jingkai Zhou, Chengkuo Lee. 《Nano-Micro Letters》, 2025, Issue 11, pp. 1-20, 20 pages.
The rise of large-scale artificial intelligence (AI) models, such as ChatGPT, DeepSeek, and autonomous vehicle systems, has significantly advanced the boundaries of AI, enabling highly complex tasks in natural language processing, image recognition, and real-time decision-making. However, these models demand immense computational power and are often centralized, relying on cloud-based architectures with inherent limitations in latency, privacy, and energy efficiency. To address these challenges and bring AI closer to real-world applications, such as wearable health monitoring, robotics, and immersive virtual environments, innovative hardware solutions are urgently needed. This work introduces a near-sensor edge computing (NSEC) system, built on a bilayer AlN/Si waveguide platform, to provide real-time, energy-efficient AI capabilities at the edge. Leveraging the electro-optic properties of AlN microring resonators for photonic feature extraction, coupled with Si-based thermo-optic Mach-Zehnder interferometers for neural network computations, the system represents a transformative approach to AI hardware design. Demonstrated through multimodal gesture and gait analysis, the NSEC system achieves high classification accuracies of 96.77% for gestures and 98.31% for gaits, ultra-low latency (<10 ns), and minimal energy consumption (<0.34 pJ). This system bridges the gap between AI models and real-world applications, enabling efficient, privacy-preserving AI solutions for healthcare, robotics, and next-generation human-machine interfaces, marking a pivotal advancement in edge computing and AI deployment.
Keywords: Photonic integrated circuits; Edge computing; Aluminum nitride; Neural networks; Wearable sensors
7. Synaptic devices based on silicon carbide for neuromorphic computing (Cited by: 1)
Authors: Boyu Ye, Xiao Liu, Chao Wu, Wensheng Yan, Xiaodong Pi. 《Journal of Semiconductors》, 2025, Issue 2, pp. 38-51, 14 pages.
To address the increasing demand for massive data storage and processing, brain-inspired neuromorphic computing systems based on artificial synaptic devices have been actively developed in recent years. Among the various materials investigated for the fabrication of synaptic devices, silicon carbide (SiC) has emerged as a preferred choice due to its high electron mobility, superior thermal conductivity, and excellent thermal stability, which give it promising potential for neuromorphic applications in harsh environments. In this review, the recent progress in SiC-based synaptic devices is summarized. Firstly, an in-depth discussion is conducted regarding the categories, working mechanisms, and structural designs of these devices. Subsequently, several application scenarios for SiC-based synaptic devices are presented. Finally, a few perspectives and directions for their future development are outlined.
Keywords: silicon carbide; wide bandgap semiconductors; synaptic devices; neuromorphic computing; high temperature
8. Providing Robust and Low-Cost Edge Computing in Smart Grid: An Energy Harvesting Based Task Scheduling and Resource Management Framework (Cited by: 1)
Authors: Xie Zhigang, Song Xin, Xu Siyang, Cao Jing. 《China Communications》, 2025, Issue 2, pp. 226-240, 15 pages.
Recently, one of the main challenges facing the smart grid is insufficient computing resources and intermittent energy supply for various distributed components (such as monitoring systems for renewable energy power stations). To solve the problem, we propose an energy harvesting based task scheduling and resource management framework to provide robust and low-cost edge computing services for smart grid. First, we formulate an energy consumption minimization problem with regard to task offloading, time switching, and resource allocation for mobile devices, which can be decoupled and transformed into a typical knapsack problem. Then, solutions are derived by two different algorithms. Furthermore, we deploy renewable energy and energy storage units at edge servers to tackle intermittency and instability problems. Finally, we design an energy management algorithm based on sampling average approximation for edge computing servers to derive the optimal charging/discharging strategies, number of energy storage units, and renewable energy utilization. The simulation results show the efficiency and superiority of our proposed framework.
Keywords: edge computing; energy harvesting; energy storage unit; renewable energy; sampling average approximation; task scheduling
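The abstract notes that the energy-minimization problem can be decoupled and transformed into a typical knapsack problem. As a generic illustration of that kind of reduction, not the authors' algorithm, the sketch below solves a small 0/1 knapsack instance by dynamic programming; the task "values" and "weights" are hypothetical.

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack DP: maximize total value under a capacity budget."""
    dp = [0] * (capacity + 1)  # dp[c] = best value achievable with capacity c
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):  # reverse order -> each item used at most once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Hypothetical tasks: value = energy saved by offloading, weight = resource demand.
values = [6, 10, 12, 7]
weights = [1, 2, 3, 2]
print(knapsack(values, weights, capacity=5))   # -> 23 (offload tasks 0, 1, and 3)
```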
9. CBBM-WARM: A Workload-Aware Meta-Heuristic for Resource Management in Cloud Computing (Cited by: 1)
Authors: K Nivitha, P Pabitha, R Praveen. 《China Communications》, 2025, Issue 6, pp. 255-275, 21 pages.
The rapid advent of artificial intelligence and big data has revolutionized the dynamic requirements placed on computing resources for executing specific tasks in the cloud environment. Achieving autonomic resource management is a herculean task due to the cloud's hugely distributed and heterogeneous environment. Moreover, the cloud network needs to provide autonomic resource management and deliver potential services to clients by complying with Quality-of-Service (QoS) requirements without impacting Service Level Agreements (SLAs). However, existing autonomic cloud resource management frameworks are not capable of handling cloud resources with their dynamic requirements. In this paper, the Coot Bird Behavior Model-based Workload Aware Autonomic Resource Management Scheme (CBBM-WARMS) is proposed for handling the dynamic requirements of cloud resources through estimation of the workload that needs to be policed by the cloud environment. CBBM-WARMS initially adopts an adaptive density peak clustering algorithm for clustering cloud workloads. It then utilizes fuzzy logic during workload scheduling to determine the availability of cloud resources. It further uses CBBM for potential Virtual Machine (VM) deployment, which contributes to the provision of optimal resources. The scheme is designed to achieve optimal QoS with minimized time, energy consumption, SLA cost, and SLA violations. Experimental validation of the proposed CBBM-WARMS confirms a minimized SLA cost of 19.21% and a reduced SLA violation rate of 18.74%, better than the compared autonomic cloud resource management frameworks.
Keywords: autonomic resource management; cloud computing; coot bird behavior model; SLA violation cost; workload
10. DeepSeek vs. ChatGPT vs. Claude: A comparative study for scientific computing and scientific machine learning tasks (Cited by: 1)
Authors: Qile Jiang, Zhiwei Gao, George Em Karniadakis. 《Theoretical & Applied Mechanics Letters》, 2025, Issue 3, pp. 194-206, 13 pages.
Large language models (LLMs) have emerged as powerful tools for addressing a wide range of problems, including those in scientific computing, particularly in solving partial differential equations (PDEs). However, different models exhibit distinct strengths and preferences, resulting in varying levels of performance. In this paper, we compare the capabilities of the most advanced LLMs (DeepSeek, ChatGPT, and Claude), along with their reasoning-optimized versions, in addressing computational challenges. Specifically, we evaluate their proficiency in solving traditional numerical problems in scientific computing as well as leveraging scientific machine learning techniques for PDE-based problems. We designed all our experiments so that a nontrivial decision is required, e.g., defining the proper space of input functions for neural operator learning. Our findings show that reasoning and hybrid-reasoning models consistently and significantly outperform non-reasoning ones in solving challenging problems, with ChatGPT o3-mini-high generally offering the fastest reasoning speed.
Keywords: Large language models (LLM); Scientific computing; Scientific machine learning; Physics-informed neural network
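As context for the scientific machine learning tasks mentioned above, the sketch below shows a minimal physics-informed loss for the 1D Poisson problem u''(x) = f(x), using automatic differentiation to form the PDE residual. It is a generic, textbook-style example under assumed settings (network size, forcing term, optimizer), not one of the benchmark tasks from this paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network approximating u(x) on [0, 1].
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))

def pde_residual(x: torch.Tensor) -> torch.Tensor:
    """Residual of u''(x) = f(x), with f(x) = -pi^2 sin(pi x) so that u(x) = sin(pi x)."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = -torch.pi ** 2 * torch.sin(torch.pi * x)
    return d2u - f

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
x_col = torch.rand(128, 1)                 # interior collocation points
x_bc = torch.tensor([[0.0], [1.0]])        # boundary points with u = 0

for step in range(500):
    optimizer.zero_grad()
    # PDE residual loss on collocation points plus boundary-condition loss.
    loss = pde_residual(x_col).pow(2).mean() + net(x_bc).pow(2).mean()
    loss.backward()
    optimizer.step()

print("final loss:", float(loss))
```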
11. Data-Driven Healthcare: The Role of Computational Methods in Medical Innovation
Authors: Hariharasakthisudhan Ponnarengan, Sivakumar Rajendran, Vikas Khalkar, Gunapriya Devarajan, Logesh Kamaraj. 《Computer Modeling in Engineering & Sciences》 (SCIE, EI), 2025, Issue 1, pp. 1-48, 48 pages.
The purpose of this review is to explore the intersection of computational engineering and biomedical science, highlighting the transformative potential this convergence holds for innovation in healthcare and medical research. The review covers key topics such as computational modelling, bioinformatics, machine learning in medical diagnostics, and the integration of wearable technology for real-time health monitoring. Major findings indicate that computational models have significantly enhanced the understanding of complex biological systems, while machine learning algorithms have improved the accuracy of disease prediction and diagnosis. The synergy between bioinformatics and computational techniques has led to breakthroughs in personalized medicine, enabling more precise treatment strategies. Additionally, the integration of wearable devices with advanced computational methods has opened new avenues for continuous health monitoring and early disease detection. The review emphasizes the need for interdisciplinary collaboration to further advance this field. Future research should focus on developing more robust and scalable computational models, enhancing data integration techniques, and addressing ethical considerations related to data privacy and security. By fostering innovation at the intersection of these disciplines, the potential to revolutionize healthcare delivery and outcomes becomes increasingly attainable.
Keywords: computational models; biomedical engineering; bioinformatics; machine learning; wearable technology
12. Experimental Observing Damage Evolution in Cement Pastes Exposed to External Sulfate Attack by in situ X-ray Computed Tomography
Authors: WU Min, CAO Kailei, XIAO Weirong, YU Zetai, CAO Jierong, DING Qingjun, LI Jinhui. 《Journal of Wuhan University of Technology (Materials Science)》 (SCIE, EI, CAS), 2025, Issue 1, pp. 164-170, 7 pages.
The paper presents experimental investigation results of crack pattern change in cement pastes caused by external sulfate attack (ESA). To visualize the formation and development of cracks in cement pastes under ESA, X-ray computed tomography (X-ray CT) was used, i.e., the Zeiss Xradia 510 Versa tomography system. The results indicate that X-ray CT can monitor the development process and distribution characteristics of the internal cracks of cement pastes under ESA over the attack time. In addition, the C3A content in the cement significantly affects the damage mode of cement paste specimens during sulfate erosion. The damage of ordinary Portland cement (OPC) pastes with high C3A content subjected to sulfate attack is severe, while the damage of sulfate-resistant Portland cement (SRPC) pastes is much smaller than that of OPC pastes. Furthermore, a quadratic function describes the correlation between the crack volume fraction and development depth for the two cement pastes immersed in sulfate solution.
Keywords: Concrete; external sulfate attack; damage evolution; in situ X-ray computed tomography
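The abstract reports that a quadratic function relates crack volume fraction to development depth. The snippet below is a generic illustration of fitting such a quadratic with NumPy; the depth and crack-fraction values are invented placeholders, not data from the paper.

```python
import numpy as np

# Hypothetical measurements: development depth (mm) vs. crack volume fraction (%).
depth = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
crack_fraction = np.array([0.2, 0.5, 1.1, 1.9, 3.0, 4.4])

# Least-squares fit of a quadratic: crack_fraction ~ a*depth^2 + b*depth + c
a, b, c = np.polyfit(depth, crack_fraction, deg=2)
print(f"fit: y = {a:.3f}x^2 + {b:.3f}x + {c:.3f}")

# Evaluate the fitted curve at a new depth.
print("predicted fraction at 2.2 mm:", np.polyval([a, b, c], 2.2))
```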
13. Computed tomography enterography-based radiomics for assessing mucosal healing in patients with small bowel Crohn's disease
Authors: Hao Ding, Yuan-Yuan Fang, Wen-Jie Fan, Chen-Yu Zhang, Shao-Fei Wang, Jing Hu, Wei Han, Qiao Mei. 《World Journal of Gastroenterology》 (SCIE, CAS), 2025, Issue 3, pp. 62-72, 11 pages.
BACKGROUND: Mucosal healing (MH) is the major therapeutic target for Crohn's disease (CD). As the most commonly involved intestinal segment, small bowel (SB) assessment is crucial for CD patients. Yet, it poses a significant challenge due to its limited accessibility through conventional endoscopic methods. AIM: To establish a noninvasive radiomic model based on computed tomography enterography (CTE) for MH assessment in SBCD patients. METHODS: Seventy-three patients diagnosed with SBCD were included and divided into a training cohort (n=55) and a test cohort (n=18). Radiomic features were obtained from CTE images to establish a radiomic model. Patient demographics were analysed to establish a clinical model. A radiomic-clinical nomogram was constructed by combining significant clinical and radiomic features. The diagnostic efficacy and clinical benefit were evaluated via receiver operating characteristic (ROC) curve analysis and decision curve analysis (DCA), respectively. RESULTS: Of the 73 patients enrolled, 25 patients achieved MH. The radiomic-clinical nomogram had an area under the ROC curve of 0.961 (95% confidence interval: 0.886-1.000) in the training cohort and 0.958 (0.877-1.000) in the test cohort and provided superior clinical benefit to either the clinical or radiomic models alone, as demonstrated by DCA. CONCLUSION: These results indicate that the CTE-based radiomic-clinical nomogram is a promising imaging biomarker for MH and serves as a potential noninvasive alternative to enteroscopy for MH assessment in SBCD patients.
Keywords: Crohn's disease; computed tomography enterography; mucosal healing; nomogram; radiomics
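The model in this study is evaluated by the area under the ROC curve (AUC). The sketch below shows that generic evaluation step with scikit-learn on synthetic labels and scores; it is purely illustrative, borrows only the cohort size as an analogy, and uses no data or model from the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)

# Synthetic example: 73 patients, 25 with mucosal healing (label 1), as a size analogy only.
y_true = np.array([1] * 25 + [0] * 48)
# Hypothetical predicted probabilities from some radiomic-clinical model.
y_score = np.clip(rng.normal(loc=np.where(y_true == 1, 0.7, 0.3), scale=0.15), 0, 1)

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"AUC = {auc:.3f}, ROC curve computed at {len(thresholds)} thresholds")
```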
14. Digital Humanities, Computational Criticism and the Stanford Literary Lab: An Interview with Mark Algee-Hewitt
Authors: Hui Haifeng, Mark Algee-Hewitt. 《外国文学研究》 (北大核心), 2025, Issue 4, pp. 1-10, 10 pages.
The Literary Lab at Stanford University is one of the birthplaces of digital humanities and has maintained significant influence in this field over the years. Professor Hui Haifeng has been engaged in research on digital humanities and computational criticism in recent years. During his visiting scholarship at Stanford University, he participated in the activities of the Literary Lab. Taking this opportunity, he interviewed Professor Mark Algee-Hewitt, the director of the Literary Lab, discussing important topics such as the current state and reception of DH (digital humanities) in the English Department, the operations of the Literary Lab, and the landscape of computational criticism. Mark Algee-Hewitt's research focuses on the eighteenth and early nineteenth centuries in England and Germany and seeks to combine literary criticism with digital and quantitative analyses of literary texts. In particular, he is interested in the history of aesthetic theory and the development and transmission of aesthetic and philosophical concepts during the Enlightenment and Romantic periods. He is also interested in the relationship between aesthetic theory and the poetry of the long eighteenth century. Although his primary background is English literature, he also has a degree in computer science. He believes that the influence of digital humanities within the humanities disciplines is growing increasingly significant. This impact is evident in both the attraction and assistance it offers to students, as well as in the new interpretations it brings to traditional literary studies. He argues that the key to effectively integrating digital humanities into the English Department is to focus on literary research questions, exploring how digital tools can raise new questions or provide new insights into traditional research.
Keywords: digital humanities; computational criticism; literary research; Literary Lab
15. A Comprehensive Study of Resource Provisioning and Optimization in Edge Computing
Authors: Sreebha Bhaskaran, Supriya Muthuraman. 《Computers, Materials & Continua》, 2025, Issue 6, pp. 5037-5070, 34 pages.
Efficient resource provisioning, allocation, and computation offloading are critical to realizing low-latency, scalable, and energy-efficient applications in cloud, fog, and edge computing. Despite its importance, integrating Software Defined Networks (SDN) for enhancing resource orchestration, task scheduling, and traffic management remains a relatively underexplored area with significant innovation potential. This paper provides a comprehensive review of existing mechanisms, categorizing resource provisioning approaches into static, dynamic, and user-centric models, while examining applications across domains such as IoT, healthcare, and autonomous systems. The survey highlights challenges such as scalability, interoperability, and security in managing dynamic and heterogeneous infrastructures. It evaluates how SDN enables adaptive, policy-based handling of distributed resources through advanced orchestration processes. Furthermore, it proposes future directions, including AI-driven optimization techniques and hybrid orchestration models. By addressing these emerging opportunities, this work serves as a foundational reference for advancing resource management strategies in next-generation cloud, fog, and edge computing ecosystems. The survey concludes that SDN-enabled computing environments offer essential guidance for addressing upcoming management opportunities.
Keywords: Cloud computing; edge computing; fog computing; resource provisioning; resource allocation; computation offloading; optimization techniques; software defined network
16. Efficient rock joint detection from large-scale 3D point clouds using vectorization and parallel computing approaches
Authors: Yunfeng Ge, Zihao Li, Huiming Tang, Qian Chen, Zhongxu Wen. 《Geoscience Frontiers》, 2025, Issue 5, pp. 1-15, 15 pages.
The application of three-dimensional (3D) point cloud parametric analyses on exposed rock surfaces, enabled by Light Detection and Ranging (LiDAR) technology, has gained significant popularity due to its efficiency and the high quality of data it provides. However, as research extends to address more regional and complex geological challenges, the demand for algorithms that are both robust and highly efficient in processing large datasets continues to grow. This study proposes an advanced rock joint identification algorithm leveraging artificial neural networks (ANNs), incorporating the parallel computing and vectorization techniques of high-performance computing. The algorithm utilizes point cloud attributes, specifically point normals and point curvatures, as input parameters for the ANNs, which classify data into rock joints and non-rock joints. Subsequently, individual rock joints are extracted using the density-based spatial clustering of applications with noise (DBSCAN) technique. Principal component analysis (PCA) is then employed to calculate their orientations. By fully utilizing the computational power of parallel computing and vectorization, the algorithm increases the running speed by 3-4 times, enabling the processing of large-scale datasets within seconds. This breakthrough maximizes computational efficiency while maintaining high accuracy (compared with manual measurement, the deviation of the automatic measurement is within 2°), making it an effective solution for large-scale rock joint detection challenges. © 2025 China University of Geosciences (Beijing) and Peking University.
Keywords: Rock joints; Point clouds; Artificial neural network; High-performance computing; Parallel computing; Vectorization
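As described in the abstract, each clustered joint's orientation can be obtained with PCA: the eigenvector with the smallest eigenvalue of the point covariance approximates the plane normal, from which dip and dip direction follow. The sketch below is a generic NumPy illustration on synthetic points, not the authors' code; the axis convention (x east, y north) and the dip/dip-direction conversion are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic planar "joint" cluster: points on a tilted plane plus a little noise.
n = 300
xy = rng.uniform(-1, 1, size=(n, 2))
z = 0.4 * xy[:, 0] - 0.2 * xy[:, 1] + rng.normal(scale=0.01, size=n)
points = np.column_stack([xy, z])

# PCA via the covariance matrix: the eigenvector of the smallest eigenvalue is the plane normal.
cov = np.cov(points - points.mean(axis=0), rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
normal = eigvecs[:, 0]
if normal[2] < 0:                             # make the normal point upward
    normal = -normal

# Convert the unit normal to dip angle and dip direction (degrees), assuming x=east, y=north.
dip = np.degrees(np.arccos(normal[2]))
dip_direction = np.degrees(np.arctan2(normal[0], normal[1])) % 360
print(f"dip = {dip:.1f} deg, dip direction = {dip_direction:.1f} deg")
```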
17. In-Memory Probabilistic Computing Using Gate-Tunable Layer Pseudospins in van der Waals Heterostructures
Authors: Jiao Xie, Jun-Lin Xiong, Bin Cheng, Shi-Jun Liang, Feng Miao. 《Chinese Physics Letters》, 2025, Issue 4, pp. 9-22, 14 pages.
Layer pseudospins, exhibiting quantum coherence and precise multistate controllability, present significant potential for the advancement of future computing technologies. In this work, we propose an in-memory probabilistic computing scheme based on the electrical manipulation of layer pseudospins in layered materials, by exploiting the interaction between real spins and layer pseudospins.
Keywords: layer pseudospins; layered materials; real spins; probabilistic computing; electrical manipulation; in-memory computing; gate-tunable layer pseudospins
18. Comparative study of IoT- and AI-based computing disease detection approaches
Authors: Wasiur Rhmann, Jalaluddin Khan, Ghufran Ahmad Khan, Zubair Ashraf, Babita Pandey, Mohammad Ahmar Khan, Ashraf Ali, Amaan Ishrat, Abdulrahman Abdullah Alghamdi, Bilal Ahamad, Mohammad Khaja Shaik. 《Data Science and Management》, 2025, Issue 1, pp. 94-106, 13 pages.
The emergence of different computing methods such as cloud-, fog-, and edge-based Internet of Things (IoT) systems has provided the opportunity to develop intelligent systems for disease detection. Compared to other machine learning models, deep learning models have gained more attention from the research community, as they have shown better results with a large volume of data compared to shallow learning. However, no comprehensive survey has been conducted on integrated IoT- and computing-based systems that deploy deep learning for disease detection. This study evaluated different machine learning and deep learning algorithms and their hybrid and optimized algorithms for IoT-based disease detection, using the most recent papers on IoT-based disease detection systems that include computing approaches, such as cloud, edge, and fog. The analysis focused on an IoT deep learning architecture suitable for disease detection. It also recognizes the different factors that require the attention of researchers to develop better IoT disease detection systems. This study can be helpful to researchers interested in developing better IoT-based disease detection and prediction systems based on deep learning using hybrid algorithms.
Keywords: Deep learning; Internet of Things (IoT); Cloud computing; Fog computing; Edge computing
19. A comprehensive survey of orbital edge computing: Systems, applications, and algorithms
Authors: Zengshan YIN, Changhao WU, Chongbin GUO, Yuanchun LI, Mengwei XU, Weiwei GAO, Chuanxiu CHI. 《Chinese Journal of Aeronautics》, 2025, Issue 7, pp. 310-339, 30 pages.
The number of satellites, especially those operating in Low-Earth Orbit (LEO), has been exploding in recent years. Additionally, the burgeoning development of Artificial Intelligence (AI) software and hardware has opened up new industrial opportunities in both air and space, with satellite-powered computing emerging as a new computing paradigm: Orbital Edge Computing (OEC). Compared to terrestrial edge computing, the mobility of LEO satellites and their limited communication, computation, and storage resources pose challenges in designing task-specific scheduling algorithms. Previous survey papers have largely focused on terrestrial edge computing or the integration of space and ground technologies, lacking a comprehensive summary of OEC architecture, algorithms, and case studies. This paper conducts a comprehensive survey and analysis of OEC's system architecture, applications, algorithms, and simulation tools, providing a solid background for researchers in the field. By discussing OEC use cases and the challenges faced, potential research directions for future OEC research are proposed.
Keywords: Orbital edge computing; Ubiquitous computing; Large-scale satellite constellations; Computation offloading
20. Role of computed tomography in the assessment of caustic ingestion severity: A comprehensive review
Authors: Alberto Martino, Marco Di Serafino, Francesco Paolo Zito, Luigi Orsini, Lorena Pietrini, Antonella Menchise, Martina Cargiolli, Lorenzo Anastasio, Rossana Martino, Raffaele Bennato, Giovanni Lombardi. 《World Journal of Radiology》, 2025, Issue 7, pp. 69-77, 9 pages.
Caustic ingestion is a relatively rare but potentially catastrophic gastroenterological emergency. Upper gastrointestinal (GI) endoscopy is currently regarded as the gold standard modality not only to assess the depth and the extension of GI caustic injury, but also to guide the appropriate treatment. Intriguingly, contrast-enhanced computed tomography (CECT) has recently emerged as a promising non-invasive and more accurate alternative to endoscopy in this setting. However, to date, evidence concerning the role of CECT as an alternative or complementary diagnostic tool to endoscopy in caustic ingestion is still limited. The aim of our review was to summarize and discuss the current evidence concerning the role of CECT in the emergency diagnosis of caustic ingestion and its value in assessing injury severity among non-pediatric patients.
Keywords: Caustic ingestion; Corrosive ingestion; Contrast-enhanced computed tomography; Computed tomography; Endoscopy; Upper gastrointestinal endoscopy