Journal literature: 25,981 articles found
1. Enhancing the data processing speed of a deep-learning-based three-dimensional single molecule localization algorithm (FD-DeepLoc) with a combination of feature compression and pipeline programming
Authors: Shuhao Guo, Jiaxun Lin, Yingjun Zhang, Zhen-Li Huang. Journal of Innovative Optical Health Sciences, 2025, Issue 2, pp. 150-160 (11 pages)
Abstract: Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. As the state-of-the-art deep-learning-based 3D super-resolution localization algorithm, the recently reported FD-DeepLoc still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the number of model parameters and combines it with pipeline programming to accelerate the inference process of the deep learning model. Results on simulated data show that Lite-FD-DeepLoc processes images about twice as fast as FD-DeepLoc with only a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
Keywords: Real-time data processing; feature compression; pipeline programming
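Illustration (not from the paper): the abstract above combines two acceleration levers, feature compression (a slimmer network) and pipeline programming (overlapping CPU pre-processing with model inference). The sketch below shows the general pattern only, assuming PyTorch; the toy network, channel width, and queue depth are made up and are not the actual Lite-FD-DeepLoc implementation.

```python
# Hypothetical sketch: overlap CPU preprocessing with model inference using a
# producer-consumer pipeline, and shrink the network by reducing feature
# channels ("feature compression"). Names and channel counts are assumptions,
# not the actual Lite-FD-DeepLoc implementation.
import queue, threading
import torch
import torch.nn as nn

class CompressedLocalizer(nn.Module):
    """Toy localization head with a reduced channel width (e.g., 32 instead of 64)."""
    def __init__(self, width=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, 3, 3, padding=1),   # e.g., x/y/z offset maps
        )
    def forward(self, x):
        return self.body(x)

def preprocess_worker(frames, q):
    for f in frames:
        q.put(torch.from_numpy(f).float()[None, None])  # normalization/cropping would go here
    q.put(None)  # sentinel: no more frames

def run_pipeline(frames, device="cpu"):
    model = CompressedLocalizer().to(device).eval()
    q = queue.Queue(maxsize=8)
    threading.Thread(target=preprocess_worker, args=(frames, q), daemon=True).start()
    outputs = []
    with torch.no_grad():
        while (batch := q.get()) is not None:
            outputs.append(model(batch.to(device)).cpu())  # inference overlaps with queueing
    return outputs

if __name__ == "__main__":
    import numpy as np
    fake_frames = [np.random.rand(256, 256).astype("float32") for _ in range(4)]
    print(len(run_pipeline(fake_frames)))
```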
2. Multi-scale intelligent fusion and dynamic validation for high-resolution seismic data processing in drilling
Authors: YUAN Sanyi, XU Yanwu, XIE Renjun, CHEN Shuai, YUAN Junliang. Petroleum Exploration and Development, 2025, Issue 3, pp. 680-691 (12 pages)
Abstract: During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Keywords: high-resolution seismic data processing; while-drilling update; while-drilling logging; multi-source information fusion; thin-bed weak reflection; artificial intelligence modeling
3. Modeling and Performance Evaluation of Streaming Data Processing System in IoT Architecture
Authors: Feng Zhu, Kailin Wu, Jie Ding. Computers, Materials & Continua, 2025, Issue 5, pp. 2573-2598 (26 pages)
Abstract: With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
Keywords: System modeling; performance evaluation; streaming data processing; IoT system; PEPA
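Illustration (not from the paper): the what-if study above (varying the task execution rate and the number of TaskManagers) can be mimicked with a small discrete-event simulation. The sketch below is a stand-in for, not an implementation of, the PEPA model; it assumes the simpy library and made-up arrival and service rates.

```python
# Not the paper's PEPA model: a minimal discrete-event analogue (using simpy)
# that reproduces the same kind of study, i.e., how task execution rate and the
# number of TaskManagers affect mean response time. All rates are made up.
import random
import simpy

def source(env, servers, exec_rate, arrival_rate, samples):
    while True:
        yield env.timeout(random.expovariate(arrival_rate))
        env.process(task(env, servers, exec_rate, samples))

def task(env, servers, exec_rate, samples):
    t0 = env.now
    with servers.request() as req:
        yield req                                         # wait for a free TaskManager slot
        yield env.timeout(random.expovariate(exec_rate))  # service time
    samples.append(env.now - t0)                          # end-to-end response time

def mean_response(exec_rate, n_taskmanagers, arrival_rate=80.0, horizon=2000.0):
    random.seed(0)
    env = simpy.Environment()
    servers = simpy.Resource(env, capacity=n_taskmanagers)
    samples = []
    env.process(source(env, servers, exec_rate, arrival_rate, samples))
    env.run(until=horizon)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    for rate in (10, 100, 200, 500):   # same sweep as in the abstract, toy numbers
        print(rate, round(mean_response(exec_rate=rate, n_taskmanagers=10), 4))
```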
4. Automation and parallelization scheme to accelerate pulsar observation data processing
Authors: Xingnan Zhang, Minghui Li. Astronomical Techniques and Instruments, 2025, Issue 4, pp. 226-238 (13 pages)
Abstract: Previous studies aiming to accelerate data processing have focused on improved algorithms, on using the graphics processing unit (GPU) to speed up programs, and on thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the "Srun job submission mode", the "Sbatch job submission mode", and a "Monitor function". The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value of this scheme is 89%. In contrast, the average CPU utilizations of the "Srun job submission mode" and the "Sbatch job submission mode" are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the "Srun job submission mode" and 14 h in the "Sbatch job submission mode". In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
Keywords: Astronomical data; parallel processing; PulsaR Exploration and Search TOolkit (PRESTO); CPU; FAST; Parkes
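Illustration (not from the paper): the core "command splitting and parallel execution" idea can be sketched as splitting one long search into chunks and submitting each chunk as a separate Slurm batch job. The search command below is a placeholder, not the actual PRESTO invocation used by the SSM scheme, and only standard sbatch flags (-J, -c, --wrap) are assumed.

```python
# Illustrative sketch of "command splitting + parallel submission": split one
# long processing command into chunks (here, by dispersion-measure range) and
# submit each chunk as its own Slurm batch job. The command template is a
# placeholder, not the exact PRESTO invocation used by the SSM scheme.
import subprocess

def split_dm_range(lo, hi, n_chunks):
    """Split a dispersion-measure range into n_chunks contiguous sub-ranges."""
    step = (hi - lo) / n_chunks
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n_chunks)]

def submit_chunks(dm_lo=0.0, dm_hi=100.0, n_chunks=8, cpus_per_job=4, dry_run=True):
    job_ids = []
    for i, (a, b) in enumerate(split_dm_range(dm_lo, dm_hi, n_chunks)):
        cmd = f"my_pulsar_search --dm-lo {a:.2f} --dm-hi {b:.2f}"   # hypothetical tool
        sbatch = ["sbatch", "-J", f"ssm_chunk{i}", "-c", str(cpus_per_job),
                  f"--wrap={cmd}"]
        if dry_run:
            print(" ".join(sbatch))                 # show what would be submitted
        else:
            out = subprocess.run(sbatch, capture_output=True, text=True, check=True)
            job_ids.append(out.stdout.strip())      # e.g., "Submitted batch job 12345"
    return job_ids

if __name__ == "__main__":
    submit_chunks()   # dry run: prints 8 sbatch commands
```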
5. A review of test methods for uniaxial compressive strength of rocks: Theory, apparatus and data processing
Authors: Wei-Qiang Xie, Xiao-Li Liu, Xiao-Ping Zhang, Quan-Sheng Liu, En-Zhi Wang. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 3, pp. 1889-1905 (17 pages)
Abstract: The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status of and developments in the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion on rock specimens for the test apparatuses and on data processing methods. Next, recommendations are given for selecting a method for UCS measurement. The review reveals that the rock failure mechanisms in the UCS testing methods can be divided into compression-shear, compression-tension, composite failure mode, and no obvious failure mode. The trend of these apparatuses is towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size; the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The selection of the test method for UCS measurement can be carried out according to the testing scenario and the specimen size. Engineers can thereby gain a comprehensive understanding of the UCS testing methods and their potential developments in various rock engineering endeavors.
Keywords: Uniaxial compressive strength (UCS); UCS testing methods; test apparatus; data processing
6. GT-scopy: A Data Processing and Enhancing Package (Level 1.0-1.5) for Ground Solar Telescopes - Based on the 1.6 m Goode Solar Telescope
Authors: Ding Yuan, Wei Wu, Song Feng, Libo Fu, Wenda Cao, Jianchuan Zheng, Lin Mei. Research in Astronomy and Astrophysics, 2025, Issue 11, pp. 191-197 (7 pages)
Abstract: The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and in quiet-Sun regions. The enhanced data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. This package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
Keywords: techniques: image processing; methods: data analysis; Astronomical Instrumentation, Methods and Techniques
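Illustration (not from the paper): sub-pixel fine alignment of the kind mentioned above is often done with upsampled phase cross-correlation. The sketch below assumes scikit-image and SciPy and is a generic stand-in, not GT-scopy's actual algorithm.

```python
# A minimal sketch of sub-pixel fine alignment, using phase cross-correlation
# (scikit-image) and an interpolating shift (scipy). This is a generic
# stand-in, not the GT-scopy package's actual procedure.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def align_to_reference(reference, frame, upsample_factor=100):
    """Estimate the (dy, dx) offset to 1/upsample_factor pixel and correct it."""
    offset, error, _ = phase_cross_correlation(reference, frame,
                                               upsample_factor=upsample_factor)
    return nd_shift(frame, offset), offset, error

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((256, 256))
    moved = nd_shift(ref, (1.3, -0.7))           # simulate a (1.3, -0.7) px drift
    aligned, offset, err = align_to_reference(ref, moved)
    print("estimated shift:", offset)            # approximately [-1.3, 0.7]
```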
7. Status of UnDifferenced and Uncombined GNSS Data Processing Activities in China [Cited: 1]
Authors: Pengyu HOU, Delu CHE, Teng LIU, Jiuping ZHA, Yunbin YUAN, Baocheng ZHANG. Journal of Geodesy and Geoinformation Science (CSCD), 2023, Issue 3, pp. 135-144 (10 pages)
Abstract: With the continued development of multiple Global Navigation Satellite Systems (GNSS) and the emergence of various frequencies, UnDifferenced and UnCombined (UDUC) data processing has become an increasingly attractive option. In this contribution, we provide an overview of the current status of UDUC GNSS data processing activities in China. These activities encompass the formulation of Precise Point Positioning (PPP) models and PPP-Real-Time Kinematic (PPP-RTK) models for processing single-station and multi-station GNSS data, respectively. Regarding single-station data processing, we discuss the advancements in PPP models, particularly the extension from a single system to multiple systems, and from dual frequencies to single and multiple frequencies. Additionally, we introduce the modified PPP model, which accounts for the time variation of receiver code biases, a departure from the conventional PPP model that typically assumes these biases to be time-constant. In the realm of multi-station PPP-RTK data processing, we introduce the ionosphere-weighted PPP-RTK model, which enhances the model strength by considering the spatial correlation of ionospheric delays. We also review the phase-only PPP-RTK model, designed to mitigate the impact of unmodelled code-related errors. Furthermore, we explore GLONASS PPP-RTK, achieved through the application of the integer-estimable model. For large-scale network data processing, we introduce the all-in-view PPP-RTK model, which alleviates the strict common-view requirement at all receivers. Moreover, we present the decentralized PPP-RTK data processing strategy, designed to improve computational efficiency. Overall, this work highlights the various advancements in UDUC GNSS data processing, providing insights into the state-of-the-art techniques employed in China to achieve precise GNSS applications.
Keywords: Global Navigation Satellite Systems (GNSS); UnDifferenced and UnCombined (UDUC); Precise Point Positioning (PPP); PPP-Real-Time Kinematic (PPP-RTK); single-station data processing; multi-station data processing
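For reference, one commonly used form of the undifferenced, uncombined code and phase observation equations that PPP and PPP-RTK models build on is sketched below; the exact parameterization used in the reviewed models may differ.

```latex
% Undifferenced, uncombined code (p) and phase (phi) observation equations for
% receiver r, satellite s, frequency f (a commonly used form, not necessarily
% the exact parameterization of the reviewed models).
\begin{aligned}
E\{p^{s}_{r,f}\}       &= \rho^{s}_{r} + c\,(\mathrm{d}t_{r} - \mathrm{d}t^{s}) + T^{s}_{r}
                          + \mu_{f}\,\iota^{s}_{r} + d_{r,f} - d^{s}_{f},\\
E\{\varphi^{s}_{r,f}\} &= \rho^{s}_{r} + c\,(\mathrm{d}t_{r} - \mathrm{d}t^{s}) + T^{s}_{r}
                          - \mu_{f}\,\iota^{s}_{r} + \lambda_{f}\,(N^{s}_{r,f} + \delta_{r,f} - \delta^{s}_{f}),
\qquad \mu_{f} = f_{1}^{2}/f_{f}^{2},
\end{aligned}
```
where ρ is the receiver-satellite range, dt_r and dt^s are the receiver and satellite clock offsets, T the slant tropospheric delay, ι the first-order slant ionospheric delay on the first frequency, d and δ the code and phase biases, λ_f the carrier wavelength, and N the integer ambiguity.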
8. A novel technique for automatic seismic data processing using both integral and local feature of seismograms [Cited: 3]
Authors: Ping Jin, Chengliu Zhang, Xufeng Shen, Hongchun Wang, Changzhou Pan, Na Lu, Xiong Xu. Earthquake Science, 2014, Issue 3, pp. 337-349 (13 pages)
Abstract: A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature of seismograms" refers to features that depict the shape of whole seismograms. Unlike some previous efforts, which completely abandon the DIAL approach (i.e., signal detection, phase identification, association, and event localization) and seek to use envelope cross-correlation to detect seismic events directly, our technique keeps following the DIAL approach; however, in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. More concrete ideas about how to define wave-trains and combine them with various detections, as well as how to measure and utilize their features in seismic data processing, are elaborated in the paper. This approach has been applied in our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results have simultaneously low false-event and missed-event rates, showing that the new technique has good application prospects for improving automatic seismic data processing.
Keywords: Seismic; automatic data processing; features of seismograms
9. Cost of Multicast Logical Key Tree Based on Hierarchical Data Processing [Cited: 2]
Authors: ZHOU Fucai, XU Jian, LI Ting. Wuhan University Journal of Natural Sciences (CAS), 2006, Issue 5, pp. 1172-1176 (5 pages)
Abstract: How to design a multicast key management system with high performance is currently a hot issue. This paper applies the idea of hierarchical data processing to construct a common analytic model based on a directed logical key tree and supplies two important metrics for this problem: re-keying cost and key storage cost. The paper gives the basic theory of hierarchical data processing and the analytic model of multicast key management based on the logical key tree. It is proved that the 4-ary tree has the best performance under these metrics. The key management problem is also investigated under a user probability model, and two evaluation parameters for re-keying and key storage cost are given.
Keywords: multicast; logical key tree; hierarchical data processing
10. GF-3 data real-time processing method based on multi-satellite distributed data processing system [Cited: 7]
Authors: YANG Jun, CAO Yan-dong, SUN Guang-cai, XING Meng-dao, GUO Liang. Journal of Central South University (SCIE, EI, CAS, CSCD), 2020, Issue 3, pp. 842-852 (11 pages)
Abstract: Due to the limited scenes that synthetic aperture radar (SAR) satellites can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process the large amounts of data produced by spaceborne synthetic aperture radars. A new method of networked satellite data processing is proposed to improve the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field programmable gate array (FPGA) chips as the kernel. Different from traditional CS algorithm processing, the system divides data processing into three stages. The computing tasks are reasonably allocated to different data processing units (i.e., satellites) in each stage. The method effectively saves computing and storage resources of the satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data are processed by the system, verifying the performance of the method.
Keywords: synthetic aperture radar; full-track utilization rate; distributed data processing; CS imaging algorithm; field programmable gate array; Gaofen-3
11. SLC-index: A scalable skip list-based index for cloud data processing [Cited: 2]
Authors: HE Jing, YAO Shao-wen, CAI Li, ZHOU Wei. Journal of Central South University (SCIE, EI, CAS, CSCD), 2018, Issue 10, pp. 2438-2450 (13 pages)
Abstract: Due to the increasing number of cloud applications, the amount of data in the cloud shows signs of growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data and have high performance. However, most cloud storage systems currently adopt a hash-like approach to retrieving data that only supports simple keyword-based enquiries but lacks various forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present a skip list-based cloud index, called SLC-index, which is a novel, scalable skip list-based indexing scheme for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and facilitating better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, it is a flexible system due to its dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
Keywords: cloud computing; distributed index; cloud data processing; skip list
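Illustration (not from the paper): the SLC-index builds on the classic skip list. The toy single-machine skip list below shows why the structure supports both point and range queries; the two-layer distributed architecture, node migration, and load balancing of the actual SLC-index are not modelled.

```python
# A toy in-memory skip list supporting point and range queries, to show the
# data structure the SLC-index builds on. Distribution across servers is out
# of scope for this sketch.
import random

class _Node:
    def __init__(self, key, value, level):
        self.key, self.value = key, value
        self.forward = [None] * level          # forward[i] = next node on level i

class SkipList:
    MAX_LEVEL, P = 16, 0.5

    def __init__(self):
        self.head = _Node(None, None, self.MAX_LEVEL)
        self.level = 1

    def _random_level(self):
        lvl = 1
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key, value):
        update = [self.head] * self.MAX_LEVEL
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, value, lvl)
        for i in range(lvl):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def get(self, key):                         # point query
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node.value if node and node.key == key else None

    def range(self, lo, hi):                    # range query over [lo, hi]
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < lo:
                node = node.forward[i]
        node = node.forward[0]
        out = []
        while node and node.key <= hi:
            out.append((node.key, node.value))
            node = node.forward[0]
        return out

if __name__ == "__main__":
    sl = SkipList()
    for k in (5, 1, 9, 3, 7):
        sl.insert(k, f"blob-{k}")
    print(sl.get(7), sl.range(3, 8))
```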
12. Research on Constructing Contours from Regular Terrain Grids Containing Invalid Data [Cited: 1]
Authors: 粟卫民, 吴凡. Journal of China University of Mining and Technology, 2004, Issue 2, pp. 138-142 (5 pages)
Abstract: A new method for constructing contours from complicated terrain elevation grids containing invalid data is put forward. By using this method, the topological consistency of contour groups can be maintained effectively, and the contours can be drawn smoothly based on boundary pre-searching and local correction. An experimental example is given to demonstrate that the contours constructed by this method are of good quality.
Keywords: contours; terrain elevation grid; invalid data; curve intersection; topological consistency
13. Effects of the differences between the ITRF2000 and ITRF2005 models in GNSS data processing [Cited: 1]
Authors: Zhan Wei, Zhu Shuang, Yang Bo, Wu Yanqiang, Liu Zhiguang, Meng Xiangang. Geodesy and Geodynamics, 2013, Issue 4, pp. 46-50 (5 pages)
Abstract: In comparison with the ITRF2000 model, the ITRF2005 model represents a significant improvement in solution generation, datum definition, and realization. However, these improvements cause a frame difference between the ITRF2000 and ITRF2005 models, which may impact GNSS data processing. To quantify this impact, the differences of the GNSS results obtained using the two models, including station coordinates, baseline length, and horizontal velocity field, were analyzed. After transformation, the differences in position were at the millimeter level, and the differences in baseline length were less than 1 mm. The differences in the horizontal velocity fields decreased as the study area was reduced. For a large region, the differences in value were less than 1 mm/a, with a systematic difference of approximately 2 degrees in direction, while for a medium-sized region, the differences in value and direction were not significant.
Keywords: ITRF2000; ITRF2005; GNSS data processing; difference; effect
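Illustration (not from the paper): coordinates are mapped between ITRF realizations with a 7-parameter (Helmert) similarity transformation. The sketch below uses the standard small-angle form with placeholder parameter values, not the published ITRF2000-ITRF2005 values.

```python
# A minimal 7-parameter (Helmert) similarity transformation of the kind used to
# map coordinates between terrestrial reference frames. The parameter values in
# the example are placeholders for illustration only.
import numpy as np

def helmert(xyz, tx, ty, tz, d, rx, ry, rz):
    """Apply X' = X + T + d*X + R x X (small-angle form).
    Translations in metres, scale d dimensionless, rotations in radians."""
    xyz = np.asarray(xyz, dtype=float)
    T = np.array([tx, ty, tz])
    R = np.array([[0.0, -rz,  ry],
                  [ rz, 0.0, -rx],
                  [-ry,  rx, 0.0]])
    return xyz + T + d * xyz + R @ xyz

if __name__ == "__main__":
    mas = np.deg2rad(1.0 / 3.6e6)                    # milliarcsecond -> radians
    station = [-2148744.0, 4426641.0, 4044656.0]     # made-up ECEF position (m)
    print(helmert(station, 0.001, -0.002, -0.005, 4e-10, 0.0, 0.0, 0.1 * mas))
```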
14. APPLICATION OF NOISE REDUCTION METHOD BASED ON CURVELET THRESHOLDING NEURAL NETWORK FOR POLAR ICE RADAR DATA PROCESSING [Cited: 1]
Authors: Wang Wenpeng, Zhao Bo, Liu Xiaojun. Journal of Electronics (China), 2013, Issue 4, pp. 377-383 (7 pages)
Abstract: Driven by the data processing requirements of the polar ice radar in our laboratory, a Curvelet Thresholding Neural Network (TNN) noise reduction method is proposed, and a new threshold function with infinite-order continuous derivatives is constructed. The method is based on the TNN model. In the learning process of the TNN, the gradient descent method is adopted to solve for the adaptive optimal thresholds of different scales and directions in the Curvelet domain, and to achieve optimal mean square error performance. In this paper, the specific implementation steps are presented, and the superiority of the method is verified by simulation. Finally, the proposed method is used to process the ice radar data obtained during the 28th Chinese National Antarctic Research Expedition in the region of Zhongshan Station, Antarctica. Experimental results show that the proposed method can reduce the noise effectively while preserving the edges of the ice layers.
Keywords: Radar data processing; Thresholding Neural Network (TNN); Curvelet; ice radar
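Illustration (not from the paper): the key ingredient above is a threshold function smooth enough to be tuned by gradient descent. The sketch below uses a wavelet transform (pywt) as a stand-in for the Curvelet transform and a sigmoid-gated shrinkage as one possible infinitely differentiable surrogate; it is not the threshold function constructed in the paper.

```python
# Sketch of smooth per-scale thresholding of transform coefficients. A wavelet
# transform (pywt) stands in for the Curvelet transform, and the sigmoid-gated
# shrinkage below is one possible smooth surrogate, not the paper's function.
import numpy as np
import pywt

def smooth_threshold(c, t, k=500.0):
    """Shrink coefficients with a smooth gate: c * sigmoid(k * (c**2 - t**2))."""
    return c / (1.0 + np.exp(-k * (c * c - t * t)))

def denoise(image, thresholds, wavelet="db4"):
    coeffs = pywt.wavedec2(image, wavelet, level=len(thresholds))
    new = [coeffs[0]]                                    # keep approximation band
    for (ch, cv, cd), t in zip(coeffs[1:], thresholds):  # one threshold per scale
        new.append(tuple(smooth_threshold(c, t) for c in (ch, cv, cd)))
    return pywt.waverec2(new, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clean = np.outer(np.hanning(128), np.hanning(128))
    noisy = clean + 0.1 * rng.standard_normal((128, 128))
    out = denoise(noisy, thresholds=[0.05, 0.1, 0.2])
    h, w = min(out.shape[0], 128), min(out.shape[1], 128)
    print("residual std:", np.std(out[:h, :w] - clean[:h, :w]))
```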
15. Seismic Data Processing Approaches for the Study of Gas Hydrates in the East China Sea [Cited: 1]
Authors: LIU Huaishan, ZHOU Zhengyun. Journal of Ocean University of Qingdao, 2002, Issue 1, pp. 87-92 (6 pages)
Abstract: A comprehensive study of the data profiles, including 2D seismic data, single-channel seismic data, shallow sections, etc., reveals that gas hydrates occur in the East China Sea. A series of special techniques are used in the processing of the seismic data, including enhancing the accuracy of velocity analysis and resolution, estimating the wavelet, suppressing multiples, preserving relative amplitude, using DMO and AVO techniques, and applying some special techniques to wave impedance. The existence of gas hydrates is reflected in the sub-bottom profiles in the appearance of BSRs, amplitude anomalies, velocity anomalies, AVO anomalies, etc. Hence the gas hydrates can be identified and predicted. It is pointed out that the East China Sea is a favorable area for gas hydrate resources, and the Okinawa Trough is a target area for gas hydrate reservoirs.
Keywords: gas hydrates; seismic data processing; seismic characteristics; bottom simulating reflector (BSR)
16. Research on the Algorithm of Extracting Ridge and Valley Lines from Contour Data
Authors: JIN Hailiang, GAO Jingxiang. Geo-Spatial Information Science, 2005, Issue 4, pp. 282-286 (5 pages)
Abstract: In this paper, after analyzing and comparing existing algorithms, the authors put forward a concise and practical algorithm for automatically extracting terrain lines from digital terrain data. Experimental results show that the ridge and valley lines extracted by the algorithm are consistent with the experimental terrain.
Keywords: contour data; ridge line; valley line
17. Seismic Data Processing and Interpretation on the Loess Plateau, Part 1: Seismic Data Processing
Authors: 蒋茄钰, 付守献, 李九灵. Applied Geophysics (SCIE, CSCD), 2005, Issue 4, pp. 241-246 (6 pages)
Abstract: Branching river channels and the coexistence of valleys, ridges, hills, and slopes, the result of long-term weathering and erosion, form the unique loess topography. The Changqing Geophysical Company, working in these complex conditions, has established a suite of technologies for high-fidelity processing and fine interpretation of seismic data. This article introduces the processes involved in the data processing and interpretation and illustrates the results.
Keywords: loess plateau; seismic data processing; statics; bin optimization; noise attenuation; data interpretation
18. Data Augmentation Using Contour Image for Convolutional Neural Network
Authors: Seung-Yeon Hwang, Jeong-Joon Kim. Computers, Materials & Continua (SCIE, EI), 2023, Issue 6, pp. 4669-4680 (12 pages)
Abstract: With the development of artificial intelligence-related technologies such as deep learning, various organizations, including the government, are making efforts to generate and manage big data for use in artificial intelligence. However, it is difficult to acquire big data due to various social problems and restrictions such as personal information leakage. Many problems arise when introducing the technology in fields that do not have enough training data to apply deep learning. Therefore, this study proposes a mixed contour data augmentation technique, a data augmentation technique using contour images, to address the problem caused by a lack of data. ResNet, a well-known convolutional neural network (CNN) architecture, and CIFAR-10, a benchmark dataset, are used for experimental performance evaluation to prove the superiority of the proposed method. To show that a large performance improvement can be achieved even with a small training dataset, the ratio of the training dataset was varied over 70%, 50%, and 30% for comparative analysis. As a result of applying the mixed contour data augmentation technique, a classification accuracy improvement of up to 4.64% was achieved, along with high accuracy even with a small amount of data. In addition, since the excellence of the proposed data augmentation technique is demonstrated on a benchmark dataset, the mixed contour data augmentation technique is expected to be applicable in various fields.
Keywords: data augmentation; image classification; deep learning; convolutional neural network; mixed contour image; benchmark dataset
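Illustration (not from the paper): a "mixed contour" augmentation can be approximated by blending each training image with its own edge map. The Canny thresholds and mixing weight below are guesses for illustration; the paper's exact mixing rule may differ.

```python
# A rough sketch of a "mixed contour" augmentation: blend each training image
# with its own edge/contour map. Thresholds and the mixing weight are guesses.
import cv2
import numpy as np

def mixed_contour(image_bgr, alpha=0.7):
    """Return alpha*image + (1-alpha)*contour image (contour replicated to 3 channels)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                   # 0/255 contour map
    edges_bgr = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(image_bgr, alpha, edges_bgr, 1.0 - alpha, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)  # CIFAR-sized dummy
    aug = mixed_contour(img)
    print(aug.shape, aug.dtype)
```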
19. Reconfigurable-System-on-Chip Implementation of Data Processing Units for Space Applications
Authors: 张宇宁, 常亮, 杨根庆, 李华旺. Transactions of Tianjin University (EI, CAS), 2010, Issue 4, pp. 270-274 (5 pages)
Abstract: Application-specific data processing units (DPUs) are commonly adopted for operational control and data processing in space missions. To overcome the limitations of traditional radiation-hardened or fully commercial design approaches, a reconfigurable-system-on-chip (RSoC) solution based on state-of-the-art FPGAs is introduced. The flexibility and reliability of this approach are outlined, and the requirements for an enhanced RSoC design with in-flight reconfigurability for space applications are presented. The design has been demonstrated as an on-board computer prototype, providing an in-flight reconfigurable DPU design approach using integrated hardwired processors.
Keywords: RSoC; in-flight reconfigurability; spaceborne data processing unit
20. Application and evaluation of layering shear method in LADCP data processing
Authors: Zijian Cui, Chujin Liang, Binbin Guo, Feilong Lin, Yong Mu. Acta Oceanologica Sinica (SCIE, CAS, CSCD), 2023, Issue 12, pp. 9-21 (13 pages)
Abstract: Current velocity observation with the LADCP (Lowered Acoustic Doppler Current Profiler) has the advantages of a large vertical observation range and high operability compared with traditional current measurement methods, and is being widely used in the field of ocean observation. The shear and inverse methods are now commonly used by the international marine community to process LADCP data and calculate ocean current profiles. The two methods have their own advantages and shortcomings. The shear method calculates the current shear more accurately, while its accuracy in the absolute value of the current is lower. The inverse method calculates the absolute value of the current velocity more accurately, but its current shear is less accurate. Based on the shear method, this paper proposes a layering shear method that calculates the current velocity profile by "layering averaging", and proposes corresponding current calculation methods according to the different types of problems found in several field observations from the western Pacific, forming an independent LADCP data processing system. The comparison results show that the layering shear method can achieve the same accuracy as the inverse method in the calculation of the absolute value of the current velocity, while retaining the advantage of the shear method in the calculation of the current shear.
Keywords: LADCP; data processing; layering shear method; western Pacific
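Illustration (not from the paper): the shear-method backbone of the layering shear approach is to bin-average vertical shear from many overlapping ensembles into depth layers and then integrate it into a velocity profile. Bin sizes and the referencing step in the sketch below are assumptions.

```python
# A simplified illustration of the shear-method idea: average vertical shear
# samples pooled from many ensembles into depth bins ("layers"), then integrate
# the layer-mean shear to recover a baroclinic velocity profile. The referencing
# step (here: remove the depth mean) is an assumption for illustration.
import numpy as np

def layered_shear_profile(depths, velocities, bin_edges):
    """depths, velocities: 1-D samples pooled from all ensembles (m, m/s).
    Returns bin-centre depths and a velocity profile with zero depth mean."""
    dz = np.diff(depths)
    shear = np.diff(velocities) / dz                        # du/dz per sample pair
    mid = 0.5 * (depths[:-1] + depths[1:])
    idx = np.digitize(mid, bin_edges) - 1
    nbins = len(bin_edges) - 1
    layer_shear = np.array([shear[idx == i].mean() if np.any(idx == i) else 0.0
                            for i in range(nbins)])
    centres = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    profile = np.cumsum(layer_shear * np.diff(bin_edges))   # integrate shear downward
    return centres, profile - profile.mean()                # remove unknown constant

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    z = np.sort(rng.uniform(0, 1000, 5000))                 # synthetic sample depths (m)
    u = 0.3 * np.sin(2 * np.pi * z / 800) + 0.02 * rng.standard_normal(z.size)
    zc, uc = layered_shear_profile(z, u, bin_edges=np.arange(0, 1001, 20))
    print(zc[:3], np.round(uc[:3], 3))
```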