Journal Articles
28,387 articles found
1. Enhancing the data processing speed of a deep-learning-based three-dimensional single molecule localization algorithm (FD-DeepLoc) with a combination of feature compression and pipeline programming
Authors: Shuhao Guo, Jiaxun Lin, Yingjun Zhang, Zhen-Li Huang. Journal of Innovative Optical Health Sciences, 2025, No. 2, pp. 150-160.
Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. As the state-of-the-art 3D super-resolution localization algorithm based on deep learning, the recently reported FD-DeepLoc algorithm still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model and combines it with pipeline programming to accelerate the inference process of the deep learning model. Simulated data processing results show that the image processing speed of Lite-FD-DeepLoc is about twice that of FD-DeepLoc with a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. The results of biological experimental data processing imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
Keywords: real-time data processing; feature compression; pipeline programming
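The pipeline-programming idea mentioned above, overlapping data preparation with model inference so that neither stage waits on the other, can be sketched with a producer-consumer queue. The stage functions below are toy stand-ins; the actual feature-compression and network-inference steps of Lite-FD-DeepLoc are not reproduced here:

```python
import queue
import threading

def run_pipeline(frames, preprocess, infer, depth=4):
    """Overlap preprocessing with inference: a producer thread feeds a
    bounded queue while the main thread consumes and runs inference."""
    q = queue.Queue(maxsize=depth)
    results = []

    def producer():
        for f in frames:
            q.put(preprocess(f))
        q.put(None)  # sentinel: no more frames

    t = threading.Thread(target=producer)
    t.start()
    while (item := q.get()) is not None:
        results.append(infer(item))
    t.join()
    return results

# Toy stages: "compression" keeps every other feature, "inference" sums them.
frames = [list(range(8)) for _ in range(5)]
out = run_pipeline(frames, lambda f: f[::2], lambda f: sum(f))
print(out)  # [12, 12, 12, 12, 12]
```

With real workloads the two stages run on different resources (CPU preprocessing vs. GPU inference), which is where the overlap pays off.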
2. IDCE: Integrated Data Compression and Encryption for Enhanced Security and Efficiency
Authors: Muhammad Usama, Arshad Aziz, Suliman A. Alsuhibany, Imtiaz Hassan, Farrukh Yuldashev. Computer Modeling in Engineering & Sciences, 2025, No. 4, pp. 1029-1048.
Data compression plays a vital role in data management and information theory by reducing redundancy. However, it lacks built-in security features such as secret keys or password-based access control, leaving sensitive data vulnerable to unauthorized access and misuse. With the exponential growth of digital data, robust security measures are essential. Data encryption, a widely used approach, ensures data confidentiality by making it unreadable and unalterable through secret key control. Despite their individual benefits, both require significant computational resources, and performing them separately on the same data increases complexity and processing time. Recognizing the need for integrated approaches that balance compression ratio and security level, this research proposes an integrated data compression and encryption algorithm, named IDCE, for enhanced security and efficiency. The algorithm operates on 128-bit blocks with a 256-bit secret key. It combines Huffman coding for compression with a Tent map for encryption, and an iterative Arnold cat map further enhances cryptographic confusion properties. Experimental analysis validates the effectiveness of the proposed algorithm, showing competitive performance in terms of compression ratio, security, and overall efficiency compared with prior algorithms in the field.
Keywords: chaotic maps; security; data compression; data encryption; integrated compression and encryption
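The chaotic-map half of the design can be illustrated with a minimal Tent-map keystream cipher. This is only a sketch of the principle, not IDCE itself: the actual algorithm works on 128-bit blocks under a 256-bit key, couples Huffman compression with the Tent map, and adds an iterative Arnold cat map. The seed and control parameter below (`x0`, `mu`) are illustrative values:

```python
def tent_keystream(x0, mu, n):
    """Generate n pseudo-random bytes by iterating the Tent map:
    x -> mu*x for x < 0.5, else mu*(1 - x)."""
    x, out = x0, []
    for _ in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        out.append(int(x * 256) & 0xFF)  # quantize the chaotic state to a byte
    return bytes(out)

def tent_xor(data, x0=0.3141, mu=1.9999):
    """XOR data with the Tent-map keystream; applying it twice with the
    same (x0, mu) 'key' restores the input."""
    ks = tent_keystream(x0, mu, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

msg = b"compressed block"
enc = tent_xor(msg)
assert tent_xor(enc) == msg  # symmetric: the same parameters decrypt
```

In an integrated scheme the input to the cipher would be the Huffman-compressed bitstream, so redundancy is removed before encryption.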
3. Battery pack capacity prediction using deep learning and data compression technique: A method for real-world vehicles
Authors: Yi Yang, Jibin Yang, Xiaohua Wu, Liyue Fu, Xinmei Gao, Xiandong Xie, Quan Ouyang. Journal of Energy Chemistry, 2025, No. 7, pp. 553-564.
The accurate prediction of battery pack capacity in electric vehicles (EVs) is crucial for ensuring safety and optimizing performance. Despite extensive research on predicting cell capacity using laboratory data, predicting the capacity of onboard battery packs from field data remains challenging due to complex operating conditions and irregular EV usage in real-world settings. Most existing methods rely on extracting health feature parameters from raw data; however, selecting specific parameters often results in a loss of critical information, which reduces prediction accuracy. To this end, this paper introduces a novel framework combining deep learning and data compression techniques to accurately predict battery pack capacity onboard. The proposed data compression method converts monthly EV charging data into feature maps, which preserve essential data characteristics while reducing the volume of raw data. To address missing capacity labels in field data, a capacity labeling method is proposed, which calculates monthly battery capacity by transforming the ampere-hour integration formula and applying linear regression. Subsequently, a deep learning model is built that uses feature maps from historical months to predict the battery capacity of future months. The proposed framework, evaluated using field data from 20 EVs, achieves a mean absolute error of 0.79 Ah, a mean absolute percentage error of 0.65%, and a root mean square error of 1.02 Ah, highlighting its potential for real-world EV applications.
Keywords: lithium-ion battery; capacity prediction; real-world vehicle data; data compression; deep learning
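The capacity-labeling step, a transformed ampere-hour integration, can be sketched for a single charging segment as follows. This is a simplified one-segment version under assumed inputs (the paper additionally fits a linear regression over a month of such segments); the function name and arguments are illustrative:

```python
def label_capacity(currents_a, dt_s, soc_start, soc_end):
    """Estimate pack capacity (Ah) from one charging segment:
    integrate current over time, then divide by the SOC change,
    i.e. Q = (integral of I dt) / (delta-SOC)."""
    ah = sum(i * dt_s for i in currents_a) / 3600.0  # ampere-hour integration
    dsoc = (soc_end - soc_start) / 100.0             # SOC given in percent
    if dsoc <= 0:
        raise ValueError("charging segment must increase SOC")
    return ah / dsoc

# 30 A constant current for 1 h raising SOC from 20% to 60% -> 75 Ah pack
caps = label_capacity([30.0] * 3600, 1.0, 20.0, 60.0)
print(round(caps, 2))  # 75.0
```

Real charging segments have varying current and noisy SOC readings, which is why the paper smooths the monthly labels with a regression rather than trusting any single segment.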
4. A review of test methods for uniaxial compressive strength of rocks: Theory, apparatus and data processing
Authors: Wei-Qiang Xie, Xiao-Li Liu, Xiao-Ping Zhang, Quan-Sheng Liu, En-Zhi Wang. Journal of Rock Mechanics and Geotechnical Engineering, 2025, No. 3, pp. 1889-1905.
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development of the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and their development trends are summarized, followed by a discussion of rock specimens and data processing methods. Next, recommendations for selecting a method for UCS measurement are given. The review reveals that the rock failure mechanisms in UCS testing can be divided into compression-shear, compression-tension, composite, and no obvious failure mode. The apparatuses are trending towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size, and the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thus gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
Keywords: uniaxial compressive strength (UCS); UCS testing methods; test apparatus; data processing
5. Modeling and Performance Evaluation of Streaming Data Processing System in IoT Architecture
Authors: Feng Zhu, Kailin Wu, Jie Ding. Computers, Materials & Continua, 2025, No. 5, pp. 2573-2598.
With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
Keywords: system modeling; performance evaluation; streaming data processing; IoT system; PEPA
6. Automation and parallelization scheme to accelerate pulsar observation data processing
Authors: Xingnan Zhang, Minghui Li. Astronomical Techniques and Instruments, 2025, No. 4, pp. 226-238.
Previous studies aiming to accelerate data processing have focused on enhancement algorithms, using the graphics processing unit (GPU) to speed up programs, and thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the "Srun job submission mode", the "Sbatch job submission mode", and a "Monitor function". The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value of this scheme is 89%, whereas the average CPU utilizations of the "Srun job submission mode" and the "Sbatch job submission mode" are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the "Srun job submission mode" and 14 h in the "Sbatch job submission mode". In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
Keywords: astronomical data; parallel processing; PulsaR Exploration and Search TOolkit (PRESTO); CPU; FAST; Parkes
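The core ideas of command splitting and parallel execution can be sketched with a thread pool. This is a simplified stand-in under stated assumptions: the actual SSM scheme drives Slurm's srun/sbatch on an HPC cluster, the command strings here are hypothetical, and `execute` would wrap a subprocess call to the pulsar-search pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

def split_commands(cmds, n):
    """Command splitting: divide a long command list into n near-equal chunks."""
    k, r = divmod(len(cmds), n)
    chunks, start = [], 0
    for i in range(n):
        end = start + k + (1 if i < r else 0)
        chunks.append(cmds[start:end])
        start = end
    return chunks

def run_parallel(cmds, n, execute):
    """Parallel execution: run each chunk in its own worker, keeping order."""
    with ThreadPoolExecutor(max_workers=n) as pool:
        batches = pool.map(lambda chunk: [execute(c) for c in chunk],
                           split_commands(cmds, n))
        return [r for batch in batches for r in batch]

# Hypothetical per-chunk commands; `execute` just tags them for the demo.
cmds = [f"search_chunk_{i}" for i in range(7)]
results = run_parallel(cmds, 3, lambda c: "done:" + c)
print(results)
```

Splitting one long serial command stream into independent chunks is what lets otherwise idle CPU cores be kept busy.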
7. Multi-scale intelligent fusion and dynamic validation for high-resolution seismic data processing in drilling
Authors: YUAN Sanyi, XU Yanwu, XIE Renjun, CHEN Shuai, YUAN Junliang. Petroleum Exploration and Development, 2025, No. 3, pp. 680-691.
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution processing technology of seismic data based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Keywords: high-resolution seismic data processing; while-drilling update; while-drilling logging; multi-source information fusion; thin-bed weak reflection; artificial intelligence modeling
8. From Static and Dynamic Perspectives: A Survey on Historical Data Benchmarks of Control Performance Monitoring (cited: 1)
Authors: Pengyu Song, Jie Wang, Chunhui Zhao, Biao Huang. IEEE/CAA Journal of Automatica Sinica, 2025, No. 2, pp. 300-316.
In recent decades, control performance monitoring (CPM) has experienced remarkable progress in research and industrial applications. While CPM research has been investigated using various benchmarks, the historical data benchmark (HIS) has garnered the most attention due to its practicality and effectiveness. However, existing CPM reviews usually focus on the theoretical benchmark, and there is a lack of an in-depth review that thoroughly explores HIS-based methods. In this article, a comprehensive overview of HIS-based CPM is provided. First, we offer a novel static-dynamic perspective on the data-level manifestations of control performance underlying typical controller capacities, including regulation and servo: static and dynamic properties. The static property portrays time-independent variability in system output, and the dynamic property describes temporal behavior driven by closed-loop feedback. Accordingly, existing HIS-based CPM approaches and their intrinsic motivations are classified and analyzed from these two perspectives. Specifically, two mainstream solutions for CPM methods are summarized, static analysis and dynamic analysis, which match data-driven techniques with actual controller behavior. Furthermore, this paper also points out various opportunities and challenges faced by CPM in modern industry and provides promising directions in the context of artificial intelligence to inspire future research.
Keywords: control performance monitoring (CPM); data-driven method; historical data benchmark (HIS); industrial process; performance index; static and dynamic analysis
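A minimal illustration of the static property described above: a historical-data benchmark index that compares output variability in a user-chosen, well-performing ("golden") period against the current period. This is a generic textbook-style index under assumed data, not a specific method from the survey:

```python
def performance_index(benchmark_output, current_output):
    """HIS-style static index: variance of the controlled output in a
    well-tuned historical period divided by its current variance.
    Near 1 means performance comparable to the benchmark; much less
    than 1 means the loop has degraded."""
    def var(x):
        m = sum(x) / len(x)
        return sum((v - m) ** 2 for v in x) / len(x)
    return var(benchmark_output) / var(current_output)

golden = [0.0, 1.0, 0.0, -1.0] * 5   # tight output variation when well tuned
now = [0.0, 2.0, 0.0, -2.0] * 5      # output swings have doubled
idx = performance_index(golden, now)
print(idx)  # 0.25
```

Dynamic-analysis methods, by contrast, would examine the temporal structure (e.g. autocorrelation) of the output rather than only its variance.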
9. An Advanced Image Processing Technique for Backscatter-Electron Data by Scanning Electron Microscopy for Microscale Rock Exploration (cited: 2)
Authors: Zhaoliang Hou, Kunfeng Qiu, Tong Zhou, Yiwei Cai. Journal of Earth Science (SCIE, CAS, CSCD), 2024, No. 1, pp. 301-305.
Backscatter electron analysis from scanning electron microscopes (BSE-SEM) produces high-resolution image data of both rock samples and thin sections, showing detailed structural and geochemical (mineralogical) information. This allows an in-depth exploration of the rock microstructures and the coupled chemical characteristics in the BSE-SEM image to be made using image processing techniques. Although image processing is a powerful tool for revealing the more subtle data "hidden" in a picture, it is not a commonly employed method in geoscientific microstructural analysis. Here, we briefly introduce the general principles of image processing and further discuss its application in studying rock microstructures using BSE-SEM image data.
Keywords: image processing; rock microstructures; electron-based imaging; data mining
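A first image-processing step of the kind described, global thresholding of a grayscale BSE image to measure the area fraction of a bright phase (higher mean atomic number backscatters more electrons), can be sketched in pure Python. The tiny image and the threshold value are illustrative, not from the paper:

```python
def phase_fraction(image, threshold):
    """Binarize a grayscale BSE image (list of pixel rows) and return the
    area fraction occupied by the bright phase."""
    bright = total = 0
    for row in image:
        for px in row:
            total += 1
            bright += px >= threshold  # True counts as 1
    return bright / total

# Toy 4x4 "image": one bright mineral grain in a darker matrix
img = [[40, 40, 200, 210],
       [35, 45, 220, 205],
       [38, 42, 41, 39],
       [36, 44, 43, 37]]
print(phase_fraction(img, 128))  # 0.25
```

Real workflows would follow this with connected-component labeling to count and measure individual grains rather than just their total area.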
10. Big Data Application Simulation Platform Design for Onboard Distributed Processing of LEO Mega-Constellation Networks (cited: 1)
Authors: Zhang Zhikai, Gu Shushi, Zhang Qinyu, Xue Jiayin. China Communications (SCIE, CSCD), 2024, No. 7, pp. 334-345.
Due to the restricted satellite payloads in LEO mega-constellation networks (LMCNs), remote sensing image analysis, online learning, and other big data services desirably need onboard distributed processing (OBDP). In existing technologies, the efficiency of big data applications (BDAs) in distributed systems hinges on stable, low-latency links between worker nodes. However, LMCNs with highly dynamic nodes and long-distance links cannot provide such conditions, which makes the performance of OBDP hard to measure intuitively. To bridge this gap, a multidimensional simulation platform is indispensable that can simulate the network environment of LMCNs and put BDAs in it for performance testing. Using STK's APIs and a parallel computing framework, we achieve real-time simulation of thousands of satellite nodes, which are mapped to application nodes through software defined networking (SDN) and container technologies. We elaborate the architecture and mechanism of the simulation platform and take Starlink and Hadoop as realistic examples for simulations. The results indicate that LMCNs have dynamic end-to-end latency which fluctuates periodically with the constellation movement. Compared to ground data center networks (GDCNs), LMCNs deteriorate computing and storage job throughput, which can be alleviated by the utilization of erasure codes and data flow scheduling of worker nodes.
Keywords: big data application; Hadoop; LEO mega-constellation; multidimensional simulation; onboard distributed processing
11. Data processing method for aerial testing of rotating accelerometer gravity gradiometer (cited: 1)
Authors: QIAN Xuewu, TANG Hailiang. Journal of Chinese Inertial Technology (EI, CSCD), 2024, No. 8, pp. 743-752.
A novel method for noise removal for the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using the Chebyshev optimal consistent approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulation and prototype experiments show that, compared with other commonly used methods, the proposed processing method achieves a 14% improvement in the measurement variance of gravity gradient signals, with measurement accuracy reaching within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insights for high-precision airborne gravity gradiometry.
Keywords: airborne gravity gradiometer; data processing; band-pass filter; evaluation function
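The zero-phase filtering principle with head-to-tail data expansion can be sketched as follows, using a simple moving average in place of the paper's optimized band-pass filter: mirror-pad both ends of the record, filter forward, then filter the reversed signal so the phase lags of the two passes cancel, and finally drop the padding. The filter choice and pad length here are illustrative:

```python
def moving_average(x, w):
    """Causal moving-average low-pass filter (introduces a phase lag)."""
    out = []
    for i in range(len(x)):
        lo = max(0, i - w + 1)
        out.append(sum(x[lo:i + 1]) / (i + 1 - lo))
    return out

def zero_phase_filter(x, w, pad):
    """Zero-phase filtering with head-to-tail (mirror) data expansion."""
    ext = x[pad:0:-1] + list(x) + x[-2:-pad - 2:-1]  # mirror both ends
    y = moving_average(ext, w)                       # forward pass
    y = moving_average(y[::-1], w)[::-1]             # backward pass
    return y[pad:pad + len(x)]                       # drop the padding

# A constant (DC) signal passes through unchanged, as a zero-phase
# low-pass filter should leave it.
flat = zero_phase_filter([5.0] * 10, 3, 3)
print(flat)
```

The mirror expansion matters because forward-backward filtering otherwise produces transients at both ends of a finite record, exactly where short airborne survey lines can least afford them.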
12. Research on the Development Strategies of Real-time Data Analysis and Decision-support Systems
Author: Wei Tang. Journal of Electronic Research and Application, 2025, No. 2, pp. 204-210.
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore the development strategies of real-time data analysis and decision-support systems, and to analyze their application status and future development trends in various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
Keywords: real-time data analysis; decision-support system; big data; system architecture; data processing; visualization technology
13. Remote Diagnosis and Analysis of Rail Vehicle Status Based on Train Control Management System Data
Authors: Qiang Zhang, Feng Jiao, Fan Liu, Mengqi Yan, Xiaoyu Bai. Journal of Electronic Research and Application, 2025, No. 5, pp. 100-110.
This article focuses on the remote diagnosis and analysis of rail vehicle status based on data from the Train Control Management System (TCMS). It first expounds the importance of train diagnostic analysis and designs a unified TCMS data frame transmission format. Subsequently, a remote data transmission link using 4G signals and the associated data processing methods are introduced. The advantages of remote diagnosis are analyzed, and common methods such as correlation analysis, fault diagnosis, and fault prediction are explained in detail. Then, challenges such as data security and the balance between diagnostic accuracy and real-time performance are discussed, along with development prospects in technological innovation, algorithm optimization, and application promotion. This research provides ideas for remote analysis and diagnosis based on TCMS data, contributing to the safe and efficient operation of rail vehicles.
Keywords: rail vehicle; TCMS data; remote diagnosis; data processing; fault prediction
14. Development of data acquisition system for induction heating equipment of large caliber coated tubes
Authors: HE Chunyao, WEN Hongquan. Baosteel Technical Research, 2025, No. 1, pp. 41-46.
In the anticorrosive coating line of a welded tube plant, the current status and existing problems of the medium-frequency induction heating equipment were discussed. Partial renovations of the power control cabinet have been conducted. Parameters such as the DC current, DC voltage, intermediate-frequency power, heating temperature, and the positioning signal at the pipe end were collected. A data acquisition and processing system, which can process data according to user needs and provide convenient data processing functions, has been developed using LabVIEW software. This system has been successfully applied in the coating line for the automatic control of high-power induction heating equipment, production management, and digital steel tube and/or digital delivery.
Keywords: induction heating; data acquisition; data processing; coating line; welded steel tube
15. Analysis of the Impact of Legal Digital Currencies on Bank Big Data Practices
Author: Zhengkun Xiu. Journal of Electronic Research and Application, 2025, No. 1, pp. 23-27.
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By examining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models. The paper provides references for bank big data practices, promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Keywords: legal digital currency; bank big data; data processing efficiency; data analysis and application; countermeasures and suggestions
16. Preliminary exploration of constructing a standardized process for prognostic biomarker discovery based on genetic big data
Authors: Wang Min, Yang Yongqi, Li Xiawei. China Standardization, 2025, No. 3, pp. 60-64.
The paper applied a standardized methodology to identify prognostic biomarkers in hepatocellular carcinoma (HCC) by analyzing transcriptomic and clinical data from The Cancer Genome Atlas (TCGA) database. The approach, which included stringent data preprocessing, differential gene expression analysis, and Kaplan-Meier survival analysis, provided valuable insights into the genetic underpinnings of HCC. The comprehensive analysis of a dataset involving 370 HCC patients uncovered correlations between survival status and pathological characteristics, including tumor size, lymph node involvement, and distant metastasis. The processed transcriptome dataset, comprising 420 samples and annotating 26,783 genes, served as a robust platform for identifying differential gene expression patterns. Among the significantly differentially expressed genes, key genes such as FBXO43, HAGLROS, CRISPLD1, LRRC3.DT, and ERN2 were pinpointed, showing significant associations with patient survival outcomes and indicating their potential as novel prognostic biomarkers. This study not only enhances the understanding of HCC's genetic landscape but also establishes a blueprint for a standardized process to discover prognostic biomarkers of various diseases using genetic big data. Future research should focus on validating these biomarkers through independent cohorts and exploring their utility in the development of personalized treatment strategies.
Keywords: standardized process; genetic big data; prognostic biomarkers; Kaplan-Meier survival analysis; hepatocellular carcinoma
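The Kaplan-Meier survival analysis step of such a standardized process can be sketched in pure Python. The patient times and event flags below are made-up toy data, not TCGA values:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimator. times: follow-up duration per patient;
    events: 1 = event (death) observed, 0 = censored."""
    pairs = sorted(zip(times, events))
    at_risk, s, curve = len(pairs), 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = n_at_t = 0
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            s *= 1.0 - deaths / at_risk  # survival drops only at event times
            curve.append((t, s))
        at_risk -= n_at_t                # censored patients leave the risk set
    return curve

# Toy cohort of 6: deaths at months 2, 4, 4; censored at months 3, 5, 6
curve = kaplan_meier([2, 3, 4, 4, 5, 6], [1, 0, 1, 1, 0, 0])
print([(t, round(s, 3)) for t, s in curve])  # [(2, 0.833), (4, 0.417)]
```

In biomarker discovery, patients are typically split into high- and low-expression groups for a candidate gene and the two resulting curves are compared (e.g. with a log-rank test).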
17. Research on Airborne Point Cloud Data Registration Using Urban Buildings as an Example
Authors: Yajun Fan, Yujun Shi, Chengjie Su, Kai Wang. Journal of World Architecture, 2025, No. 4, pp. 35-42.
Airborne LiDAR (Light Detection and Ranging) is an evolving high-tech active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, this technology can further enrich and extract spatial geographic information. In practice, however, due to the limited operating range of airborne LiDAR and the large area of a survey task, it is necessary to register and stitch the point clouds of adjacent flight strips. By eliminating gross errors, the systematic errors in the data can be effectively reduced. This paper therefore studies point cloud registration methods in urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) point cloud registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for applications in related fields.
Keywords: airborne LiDAR; point cloud registration; point cloud data processing; systematic error
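The ICP idea at the heart of such registration can be sketched in a stripped-down, translation-only form: alternate nearest-neighbor correspondence with centroid alignment until the estimated shift converges. The full method also estimates rotation, and the paper's post-ICP improvements are not reproduced here; the point sets are illustrative:

```python
def icp_translation(src, dst, iters=20):
    """Stripped-down ICP: estimate only the 2-D translation aligning src
    to dst by alternating nearest-neighbor matching and centroid shifts."""
    tx = ty = 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in src]
        # correspondence step: nearest dst point for each shifted src point
        pairs = [min(dst, key=lambda q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
                 for p in moved]
        # alignment step: shift by the mean residual of the matched pairs
        dx = sum(q[0] - p[0] for p, q in zip(moved, pairs)) / len(src)
        dy = sum(q[1] - p[1] for p, q in zip(moved, pairs)) / len(src)
        tx, ty = tx + dx, ty + dy
        if abs(dx) + abs(dy) < 1e-9:  # converged
            break
    return tx, ty

src = [(0, 0), (1, 0), (0, 1), (2, 2)]
dst = [(x + 0.5, y - 0.25) for x, y in src]  # adjacent strip, shifted
print(icp_translation(src, dst))  # (0.5, -0.25)
```

Note how the first iteration picks some wrong correspondences yet still moves the cloud closer, after which the matches, and therefore the estimate, lock in; this self-correcting loop is what makes ICP work.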
18. Data matching and association based on the arc-segment difference method
Authors: Jiannan Sun, Zhe Kang, Zhenwei Li, Cunbo Fan. Astronomical Techniques and Instruments, 2025, No. 5, pp. 299-309.
In response to the issue of fuzzy matching and association when optical observation data are matched with the orbital elements in a catalog database, this paper proposes a matching and association strategy based on the arc-segment difference method. First, a matching error threshold is set to match the observation data with the known catalog database. Second, the matching results for the same day are sorted on the basis of target identity and observation residuals. Different matching error thresholds and arc-segment dynamic association thresholds are then applied to categorize the observation residuals of the same target across different arc-segments, yielding matching results under various thresholds. Finally, the orbital residual is computed through orbit determination (OD), and the positional error is derived by comparing the OD results with the orbit track from the catalog database. The appropriate matching error threshold is then selected on the basis of these results, leading to the final matching and association of the fuzzy correlation data. Experimental results showed that the correct matching rate for data arc-segments is 92.34% when the matching error threshold is set to 720″, with the arc-segment difference method achieving an average matching rate of 97.62% within 8 days. The remaining 5.28% of fuzzy correlation data are correctly matched and associated, enabling identification of orbital maneuver targets through further processing and analysis. This method substantially enhances the efficiency and accuracy of space target cataloging, offering robust technical support for dynamic maintenance of the space target database.
Keywords: Optical data processing; Space target identification; Fuzzy correlation; Arc-segment difference method; Orbit determination
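The first stage of the strategy above, matching observations to cataloged targets under a fixed error threshold, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, the data layout, and the simplified angular-residual model are all assumptions; only the 720″ default threshold comes from the abstract.

```python
# Illustrative sketch of threshold-based observation-to-catalog matching.
# Positions are taken as scalar angles in degrees for simplicity; a real
# implementation would use full angular coordinates and epoch handling.

def match_observations(observations, catalog, threshold_arcsec=720.0):
    """Match each observation to the catalog entry with the smallest
    angular residual, accepting the match only if the residual (in
    arcseconds) does not exceed the threshold."""
    matches = []
    for obs_id, obs_pos in observations.items():
        best_target, best_residual = None, float("inf")
        for target_id, cat_pos in catalog.items():
            # Placeholder residual: absolute angular difference, deg -> arcsec.
            residual = abs(obs_pos - cat_pos) * 3600.0
            if residual < best_residual:
                best_target, best_residual = target_id, residual
        if best_residual <= threshold_arcsec:
            matches.append((obs_id, best_target, best_residual))
    return matches
```

Observations whose best residual exceeds the threshold are left unmatched; in the paper's pipeline these fuzzy-correlation arcs would then go through the arc-segment difference and OD steps.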
ADGAP:a user-friendly online ancient DNA database and genome analysis platform
19
Authors: Yanwei Chen, Yu Xu, Kongyang Zhu, Chuan-Chao Wang. 《Journal of Genetics and Genomics》, 2025, Issue 8, pp. 1058-1061 (4 pages)
The analysis of ancient genomes provides opportunities to explore human population history across both temporal and geographic dimensions (Haak et al., 2015; Wang et al., 2021, 2024). To enhance the accessibility and utility of these ancient genomic datasets, a range of databases and advanced statistical models have been developed, including the Allen Ancient DNA Resource (AADR) (Mallick et al., 2024) and AdmixTools (Patterson et al., 2012). While upstream processes such as sequencing and raw data processing have been streamlined by resources like the AADR, the downstream analysis of these datasets, encompassing population genetics inference and spatiotemporal interpretation, remains a significant challenge. The AADR provides a unified collection of published ancient DNA (aDNA) data, yet its file-based format and reliance on command-line tools, such as those in AdmixTools (Patterson et al., 2012), require advanced computational expertise for effective exploration and analysis. These requirements can present significant barriers for researchers lacking such expertise, limiting the accessibility and broader application of these valuable genomic resources.
Keywords: Database; Raw data processing; Ancient genomics; Ancient DNA (aDNA); Human population history; Allen Ancient DNA Resource (AADR)
DSP-free coherent receivers in frequency-synchronous optical networks for next-generation data center interconnects
20
Authors: Lei Liu, Feng Liu, Cheng Peng, Bo Xue, William Shieh. 《Advanced Photonics Nexus》, 2025, Issue 3, pp. 141-148 (8 pages)
Propelled by the rise of artificial intelligence, cloud services, and data center applications, next-generation, low-power, local-oscillator-less, digital signal processing (DSP)-free, short-reach coherent optical communication has become an increasingly prominent area of research in recent years. Here, we demonstrate DSP-free coherent optical transmission by analog signal processing in a frequency-synchronous optical network (FSON) architecture that supports polarization multiplexing and higher-order modulation formats. The FSON architecture allows the numerous laser sources of optical transceivers within a data center to be quasi-synchronized by means of a tree-distributed homology architecture. In conjunction with our proposed pilot-tone-assisted Costas loop for an analog coherent receiver, we achieve a record dual-polarization 224-Gb/s 16-QAM 5-km mismatch transmission with reset-free carrier phase recovery in the optical domain. Our proposed DSP-free analog coherent detection system based on the FSON makes it a promising solution for next-generation, low-power, high-capacity coherent data center interconnects.
Keywords: Digital signal processing (DSP)-free; Data center interconnect; Frequency-synchronous optical network; Analog signal processing
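The carrier phase recovery in the paper above is performed by a pilot-tone-assisted Costas loop in the analog optical domain; that design is not reproduced here. As a purely illustrative digital analogue of the Costas-loop principle, a first-order decision-directed loop for QPSK can be sketched; the function name, loop gain, and error-detector form are all assumptions for the sketch.

```python
import numpy as np

def costas_qpsk(samples, loop_gain=0.05):
    """Track a slowly varying carrier phase on unit-amplitude QPSK
    symbols with a decision-directed Costas-style loop; returns the
    derotated symbol stream."""
    phase = 0.0
    out = np.empty_like(samples)
    for k, s in enumerate(samples):
        r = s * np.exp(-1j * phase)          # derotate by current estimate
        # QPSK phase-error detector: antisymmetric combination of I and Q
        # against the hard decisions; proportional to sin(residual phase).
        err = np.sign(r.real) * r.imag - np.sign(r.imag) * r.real
        phase += loop_gain * err             # first-order loop update
        out[k] = r
    return out
```

For a noiseless constant phase offset smaller than the QPSK decision boundary (π/4), the estimate converges monotonically and the derotated symbols decode correctly; the analog loop in the paper plays the same role continuously in hardware.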