Journal Articles
24,542 articles found
1. Improved SDT Process Data Compression Algorithm (Cited by: 3)
Authors: 冯晓东, Cheng Changling, Liu Changling, Shao Huihe — High Technology Letters (EI, CAS), 2003, Issue 2, pp. 91-96
Process data compression and trending are essential for improving control system performance. The Swing Door Trending (SDT) algorithm is well designed to follow the process trend while retaining the merit of simplicity, but it cannot handle outliers or adapt to the fluctuations of actual data. An Improved SDT (ISDT) algorithm is proposed in this paper. The effectiveness and applicability of the ISDT algorithm are demonstrated by computations on both synthetic and real process data. By applying an adaptive recording limit together with outlier-detecting rules, a higher compression ratio is achieved and outliers are identified and eliminated. The fidelity of the algorithm is also improved. It can be used in both online and batch modes, and integrated into existing software packages without change.
Keywords: SDT, data compression, process data treatment
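The swing-door test that the ISDT paper builds on can be sketched in a few lines. This is a minimal generic SDT compressor, not the paper's ISDT variant (the function name and the fixed `dev` tolerance are illustrative; ISDT adds an adaptive recording limit and outlier rules on top of this):

```python
def sdt_compress(points, dev):
    """Swing Door Trending: keep only the points needed so that linear
    interpolation of the kept points stays within +/-dev of the data.
    `points` is a list of (t, y) pairs with strictly increasing t."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    t0, y0 = points[0]              # last archived point
    slope_hi = float("-inf")        # running max slope from the upper pivot (t0, y0+dev)
    slope_lo = float("inf")         # running min slope from the lower pivot (t0, y0-dev)
    prev = points[0]
    for t, y in points[1:]:
        slope_hi = max(slope_hi, (y - (y0 + dev)) / (t - t0))
        slope_lo = min(slope_lo, (y - (y0 - dev)) / (t - t0))
        if slope_hi > slope_lo:     # the two doors have swung past parallel
            kept.append(prev)       # archive the last in-band point and restart
            t0, y0 = prev
            slope_hi = (y - (y0 + dev)) / (t - t0)
            slope_lo = (y - (y0 - dev)) / (t - t0)
        prev = (t, y)
    kept.append(points[-1])
    return kept
```

On a perfectly linear signal only the two endpoints survive, which is the source of SDT's high compression ratio on slowly trending process data.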
2. A Mixture Modeling Approach to Detect Different Behavioral Patterns for Process Data
Authors: Yue Xiao, Hongyun Liu — Fudan Journal of the Humanities and Social Sciences, 2025, Issue 1, pp. 79-113
Process data recorded by computer-based assessments reflect how respondents solve problems and thus contain rich information about respondents as well as tasks. Considering that different respondents may exhibit different behavioral characteristics during the problem-solving process, in this study we propose a mixture one-parameter state response (Mix1P-SR) measurement model. The model assumes that respondents belong to discrete latent classes with different propensities towards responses to task states during the problem-solving process, and the varying response propensities are captured by different state parameters across classes. A Markov chain Monte Carlo algorithm for the estimation of model parameters and classification of respondents is described. The simulation study shows that the Mix1P-SR model recovers parameters well, provided that the average sequence length is not too short. Moreover, larger sample size, longer sequences, more uniform mixing proportions, and lower interclass similarity facilitated model convergence, model selection, and parameter estimation accuracy, with sequence length being particularly important. Based on empirical data from PISA 2012, the Mix1P-SR model identified two latent classes of respondents. They had different patterns of state easiness parameters and exhibited different state response patterns, which affected their problem-solving results. Implications for model application and future research directions are discussed.
Keywords: process data, mixture modeling, state response, behavioral patterns, Markov chain Monte Carlo estimation
3. Basic processing of the InSight seismic data from Mars for further seismological research
Authors: Shuguang Wang, Shuoxian Ning, Zhixiang Yao, Jiaqi Li, Wanbo Xiao, Tianfan Yan, Feng Xu — Earthquake Science, 2025, Issue 5, pp. 450-460
The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
Keywords: Mars, InSight, seismology, data processing, seismic noise
4. Modeling and Performance Evaluation of Streaming Data Processing System in IoT Architecture
Authors: Feng Zhu, Kailin Wu, Jie Ding — Computers, Materials & Continua, 2025, Issue 5, pp. 2573-2598
With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
Keywords: system modeling, performance evaluation, streaming data processing, IoT system, PEPA
5. Enhancing the data processing speed of a deep-learning-based three-dimensional single molecule localization algorithm (FD-DeepLoc) with a combination of feature compression and pipeline programming
Authors: Shuhao Guo, Jiaxun Lin, Yingjun Zhang, Zhen-Li Huang — Journal of Innovative Optical Health Sciences, 2025, Issue 2, pp. 150-160
Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. As the state-of-the-art deep-learning-based 3D super-resolution localization algorithm, the recently reported FD-DeepLoc algorithm still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of the FD-DeepLoc algorithm to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model, and combines it with pipeline programming to accelerate the inference process of the deep learning model. Simulated data processing results show that the image processing speed of Lite-FD-DeepLoc is about twice that of FD-DeepLoc, with a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of the FD-DeepLoc algorithm.
Keywords: real-time data processing, feature compression, pipeline programming
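The "pipeline programming" idea in the abstract above — overlapping stages so that, while one stage processes frame i, the previous stage is already working on frame i+1 — can be sketched with stdlib threads and queues. This is a generic illustration, not the Lite-FD-DeepLoc code; the stage functions and queue depth are placeholders:

```python
import threading
import queue

def run_pipeline(frames, stages):
    """Chain one worker thread per stage with bounded queues between them,
    so the stages execute concurrently while frame order is preserved."""
    qs = [queue.Queue(maxsize=4) for _ in range(len(stages) + 1)]
    done = object()  # sentinel marking end of the stream

    def worker(fn, q_in, q_out):
        while True:
            item = q_in.get()
            if item is done:
                q_out.put(done)   # propagate shutdown downstream
                return
            q_out.put(fn(item))

    threads = [threading.Thread(target=worker, args=(fn, qs[i], qs[i + 1]))
               for i, fn in enumerate(stages)]
    for t in threads:
        t.start()
    for f in frames:              # feed the first stage
        qs[0].put(f)
    qs[0].put(done)
    out = []
    while True:                   # drain the last stage
        item = qs[-1].get()
        if item is done:
            break
        out.append(item)
    for t in threads:
        t.join()
    return out
```

With one worker per stage and FIFO queues, results come out in input order; throughput is limited by the slowest stage rather than the sum of all stages.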
6. A review of test methods for uniaxial compressive strength of rocks: Theory, apparatus and data processing
Authors: Wei-Qiang Xie, Xiao-Li Liu, Xiao-Ping Zhang, Quan-Sheng Liu, En-Zhi Wang — Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 3, pp. 1889-1905
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development of the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion of rock specimens for the test apparatuses and of data processing methods. Next, method selection for UCS measurement is recommended. The review reveals that the rock failure mechanism in UCS testing methods can be divided into compression-shear, compression-tension, composite, and no obvious failure modes. The apparatuses are trending towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size; the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
Keywords: uniaxial compressive strength (UCS), UCS testing methods, test apparatus, data processing
7. Multi-scale intelligent fusion and dynamic validation for high-resolution seismic data processing in drilling
Authors: YUAN Sanyi, XU Yanwu, XIE Renjun, CHEN Shuai, YUAN Junliang — Petroleum Exploration and Development, 2025, Issue 3, pp. 680-691
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The updated while-drilling seismic information demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Keywords: high-resolution seismic data processing, while-drilling updating, while-drilling logging, multi-source information fusion, thin-bed weak reflection, artificial intelligence modeling
8. APPLICATION OF GREY SYSTEM THEORY TO PROCESSING OF MEASURING DATA IN REVERSE ENGINEERING (Cited by: 3)
Authors: 平雪良, 周儒荣, 安鲁陵 — Transactions of Nanjing University of Aeronautics and Astronautics (EI), 2003, Issue 1, pp. 36-41
The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we propose methods for the processing of measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and GM(1,1) are used to create new data for these blanks. For the uneven data sequence created by measuring error, mean generation is used to smooth it, and then stepwise and smooth generations are used to improve the data sequence.
Keywords: reverse engineering, grey system theory, digitization, data processing, grey generation
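The GM(1,1) model used above to regenerate filtered-out points can be written compactly. A minimal sketch assuming the standard formulation (accumulated generating operation, mean generation of consecutive neighbours, least squares for the development coefficient a and grey input b); the function name is illustrative:

```python
import math

def gm11(x0, steps=0):
    """GM(1,1) grey model: fit the positive sequence x0 and return fitted
    values plus `steps` forecasts (usable to fill blanks left by filtering)."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                 # accumulated series
    z1 = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]   # mean generation
    # Least squares for x0[k] = -a*z1[k-1] + b, solved via 2x2 normal equations.
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    sy = sum(x0[1:])
    det = szz * m - sz * sz
    a = -(szy * m - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    # Time response: x1_hat(k) = (x0[0] - b/a) * exp(-a*k) + b/a
    c = x0[0] - b / a
    x1_hat = [c * math.exp(-a * k) + b / a for k in range(n + steps)]
    return [x1_hat[0]] + [x1_hat[k] - x1_hat[k - 1] for k in range(1, n + steps)]
```

On near-exponential sequences the fit is very tight, which is why GM(1,1) works well for the short, trend-dominated gaps that outlier filtering leaves behind.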
9. Modeling and Analysis of Data Dependencies in Business Process for Data-Intensive Services (Cited by: 1)
Authors: Yuze Huang, Jiwei Huang, Budan Wu, Junliang Chen — China Communications (SCIE, CSCD), 2017, Issue 10, pp. 151-163
With the growing popularity of data-intensive services on the Internet, the traditional process-centric model for business processes faces challenges due to its inability to describe data semantics and dependencies, resulting in inflexibility in the design and implementation of processes. This paper proposes a novel data-aware business process model able to describe both explicit control flow and implicit data flow. A data model with dependencies formulated in Linear-time Temporal Logic (LTL) is presented, and their satisfiability is validated by an automaton-based model checking algorithm. Data dependencies are fully considered in the modeling phase, which helps to improve the efficiency and reliability of programming during the development phase. Finally, a prototype system for data-aware workflow based on jBPM is designed using this model, and has been deployed to the Beijing Kingfore heating management system to validate the flexibility, efficacy and convenience of our approach for massive coding and large-scale system management in reality.
Keywords: data-aware business process, data-intensive services, data dependency, linear-time temporal logic (LTL), services computing
10. Data processing of small samples based on grey distance information approach (Cited by: 14)
Authors: Ke Hongfa, Chen Yongguang, Liu Yi (1. College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, P. R. China; 2. Unit 63880, Luoyang 471003, P. R. China) — Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2007, Issue 2, pp. 281-289
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use traditional probability theory to process the samples and assess the degree of uncertainty. Using grey relational theory and norm theory, the grey distance information approach, based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples, is proposed in this article. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, with their characteristics and algorithms, are introduced. The correlative problems, including the algorithm of the estimated value, the standard deviation, and the acceptance and rejection criteria of the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare different samples. Several examples demonstrate the application of the proposed approach. They show that the approach, which makes no demand on the probability distribution of small samples, is feasible and effective.
Keywords: data processing, grey theory, norm theory, small samples, uncertainty assessment, grey distance measure, information whitening ratio
11. Optimal design of hot rolling process for C-Mn steel by combining industrial data-driven model and multi-objective optimization algorithm (Cited by: 7)
Authors: Si-wei Wu, Xiao-guang Zhou, Jia-kuang Ren, Guang-ming Cao, Zhen-yu Liu, Nai-an Shi — Journal of Iron and Steel Research International (SCIE, EI, CAS, CSCD), 2018, Issue 7, pp. 700-705
A successful mechanical property data-driven prediction model is the core of the optimal design of the hot rolling process for hot-rolled strips. However, the original industrial data, usually unbalanced, are inevitably mixed with fluctuant and abnormal values. Models established on such data without data processing can give misleading results, which cannot be used for the optimal design of the hot rolling process. Thus, a method of industrial data processing for C-Mn steel was proposed based on data analysis. A Bayesian neural network was employed to establish reliable mechanical property prediction models for the optimal design of the hot rolling process. By using the multi-objective optimization algorithm and considering the individual requirements of customers and the constraints of the equipment, the optimal design of the hot rolling process was successfully applied to the rolling process design for Q345B steel with the 0.017% Nb and 0.046% Ti content removed. The optimal process design results were in good agreement with industrial trial results, which verifies the effectiveness of the optimal design of the hot rolling process.
Keywords: industrial data, data processing, mechanical property, optimal design, hot rolling process, C-Mn steel
12. Magnetic field data processing methods of the China Seismo-Electromagnetic Satellite (Cited by: 15)
Authors: Bin Zhou, YanYan Yang, YiTeng Zhang, XiaoChen Gou, BingJun Cheng, JinDong Wang, Lei Li — Earth and Planetary Physics, 2018, Issue 6, pp. 455-461
The High Precision Magnetometer (HPM) on board the China Seismo-Electromagnetic Satellite (CSES) allows highly accurate measurement of the geomagnetic field; it includes FGM (Fluxgate Magnetometer) and CDSM (Coupled Dark State Magnetometer) probes. This article introduces the main processing methods, algorithms, and processing procedure for the HPM data. First, the FGM and CDSM probes are calibrated according to ground sensor data. Then the FGM linear parameters can be corrected in orbit by applying the absolute vector magnetic field correction algorithm from CDSM data. At the same time, the magnetic interference of the satellite is eliminated according to ground-satellite magnetic test results. Finally, according to the characteristics of the magnetic field direction in the low-latitude region, the transformation matrix between the FGM probe and the star sensor is calibrated in orbit to determine the correct direction of the magnetic field. Comparing the magnetic field data of the CSES and SWARM satellites over five continuous geomagnetically quiet days, the difference in measurements of the vector magnetic field is about 10 nT, which is within the uncertainty interval of geomagnetic disturbance.
Keywords: China Seismo-Electromagnetic Satellite (CSES), High Precision Magnetometer (HPM), fluxgate magnetometer, CPT magnetometer, data processing
13. The development of data acquisition and processing application system for RF ion source (Cited by: 3)
Authors: Xiaodan ZHANG, Xiaoying WANG, Chundong HU, Caichao JIANG, Yahong XIE, Yuanzhe ZHAO — Plasma Science and Technology (SCIE, EI, CAS, CSCD), 2017, Issue 7, pp. 124-129
As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied to provide a source plasma with the advantages of ease of control and high reliability, and it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a lot of original experimental data are generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system realizing the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in implementing this software mainly include long-pulse data acquisition, multi-threading, the transmission control protocol (TCP), and the Lempel-Ziv-Oberhumer (LZO) data compression algorithm. The design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With its help, stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, which has very practical significance for RF experiments.
Keywords: RF ion source, data acquisition, data processing, TCP, LZO algorithm
14. A machine learning framework for low-field NMR data processing (Cited by: 5)
Authors: Si-Hui Luo, Li-Zhi Xiao, Yan Jin, Guang-Zhi Liao, Bin-Sen Xu, Jun Zhou, Can Liang — Petroleum Science (SCIE, CAS, CSCD), 2022, Issue 2, pp. 581-593
Low-field nuclear magnetic resonance (NMR) has been widely used in the petroleum industry, for example in well logging and laboratory rock core analysis. However, the signal-to-noise ratio is low due to the low magnetic field strength of NMR tools and the complex petrophysical properties of the detected samples. Suppressing the noise and highlighting the available NMR signals is very important for subsequent data processing. Most denoising methods are based on fixed mathematical transformations or hand-designed feature selectors to suppress noise characteristics, which may not perform well because they do not adapt to different noisy signals. In this paper, we propose a data processing framework to improve the quality of low-field NMR echo data based on dictionary learning. Dictionary learning is a machine learning method based on redundancy and sparse representation theory. Available information in noisy NMR echo data can be adaptively extracted and reconstructed by dictionary learning. The advantages and effectiveness of the proposed method were verified with a number of numerical simulations, NMR core data analyses, and NMR logging data processing. The results show that dictionary learning can significantly improve the quality of NMR echo data with high noise levels and effectively improve the accuracy and reliability of inversion results.
Keywords: dictionary learning, low-field NMR, denoising, data processing, T2 distribution
15. Data Processing Model of Coalmine Gas Early-Warning System (Cited by: 8)
Authors: QIAN Jian-sheng, YIN Hong-sheng, LIU Xiu-rong, HUA Gang, XU Yong-gang — Journal of China University of Mining and Technology (EI), 2007, Issue 1, pp. 20-24
The data processing mode is vital to the performance of an entire coalmine gas early-warning system, especially its real-time performance. Our objective was to characterize the structural features of coalmine gas data, so that the data could be processed at different priority levels in C language. Two data processing models, one with priority and one without, were built based on queuing theory. Their theoretical formulas were determined via an M/M/1 model in order to calculate the average occupation time of each measuring point in an early-warning program. We validated the model with the gas early-warning system of the Huaibei Coalmine Group Corp. The results indicate that the average occupation time for gas data processing using the queuing model with priority is nearly 1/30 of that of the model without priority.
Keywords: gas early-warning, data processing, queuing theory, priority model, high efficiency
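The M/M/1 quantities behind the paper's "average occupation time" comparison follow from a few closed-form results. A sketch of the standard formulas only (the priority-class extension analyzed in the paper is not reproduced here); the λ and μ values in the test are illustrative:

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics for Poisson arrivals at rate lam and
    exponential service at rate mu; requires lam < mu for stability."""
    assert lam < mu, "unstable queue: arrival rate must be below service rate"
    rho = lam / mu              # server utilization
    L = rho / (1 - rho)         # mean number in system
    W = 1 / (mu - lam)          # mean time in system ("occupation time")
    Wq = rho / (mu - lam)       # mean waiting time in queue
    return {"rho": rho, "L": L, "W": W, "Wq": Wq}
```

Little's law ties the quantities together (L = λW), and W − Wq equals the mean service time 1/μ, which is a quick consistency check on any queueing computation.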
16. GF-3 data real-time processing method based on multi-satellite distributed data processing system (Cited by: 7)
Authors: YANG Jun, CAO Yan-dong, SUN Guang-cai, XING Meng-dao, GUO Liang — Journal of Central South University (SCIE, EI, CAS, CSCD), 2020, Issue 3, pp. 842-852
Due to the limited scenes that synthetic aperture radar (SAR) satellites can detect, the full-track utilization rate is not high. Because of the computing and storage limitations of a single satellite, it is difficult to process the large amounts of data produced by spaceborne synthetic aperture radars. A new method of networked satellite data processing is proposed to improve the efficiency of data processing. A multi-satellite distributed SAR real-time processing method based on the Chirp Scaling (CS) imaging algorithm is studied in this paper, and a distributed data processing system is built with field programmable gate array (FPGA) chips as the kernel. Different from traditional CS algorithm processing, the system divides data processing into three stages, and the computing tasks are reasonably allocated to different data processing units (i.e., satellites) in each stage. The method effectively saves the computing and storage resources of satellites, improves the utilization rate of a single satellite, and shortens the data processing time. Gaofen-3 (GF-3) satellite SAR raw data were processed by the system, verifying the performance of the method.
Keywords: synthetic aperture radar, full-track utilization rate, distributed data processing, CS imaging algorithm, field programmable gate array, Gaofen-3
17. A novel technique for automatic seismic data processing using both integral and local feature of seismograms (Cited by: 3)
Authors: Ping Jin, Chengliu Zhang, Xufeng Shen, Hongchun Wang, Changzhou Pan, Na Lu, Xiong Xu — Earthquake Science, 2014, Issue 3, pp. 337-349
A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature" refers to features that depict the shape of the whole seismogram. Unlike some previous efforts that completely abandon the DIAL approach — signal detection, phase identification, association, and event localization — and seek to detect seismic events directly by envelope cross-correlation, our technique keeps following the DIAL approach; but in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. More concrete ideas about how to define wave-trains and combine them with various detections, as well as how to measure and utilize their features in seismic data processing, are expatiated in the paper. This approach has been applied in our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results have fairly low false-alarm and missed-event rates simultaneously, showing that the new technique has good prospects for improving automatic seismic data processing.
Keywords: seismic, automatic data processing, feature of seismograms
18. Cost of Multicast Logical Key Tree Based on Hierarchical Data Processing (Cited by: 2)
Authors: ZHOU Fucai, XU Jian, LI Ting — Wuhan University Journal of Natural Sciences (CAS), 2006, Issue 5, pp. 1172-1176
How to design a multicast key management system with high performance is currently a hot issue. This paper applies the idea of hierarchical data processing to construct a common analytic model based on a directed logical key tree and provides two important metrics for this problem: re-keying cost and key storage cost. The paper gives the basic theory of hierarchical data processing and the analysis model for multicast key management based on a logical key tree. It is proved that the 4-ary tree has the best performance under these metrics. The key management problem is also investigated based on a user probability model, and two evaluation parameters for re-keying and key storage cost are given.
Keywords: multicast, logical key tree, hierarchical data processing
19. SLC-index: A scalable skip list-based index for cloud data processing (Cited by: 2)
Authors: HE Jing, YAO Shao-wen, CAI Li, ZHOU Wei — Journal of Central South University (SCIE, EI, CAS, CSCD), 2018, Issue 10, pp. 2438-2450
Due to the increasing number of cloud applications, the amount of data in the cloud shows signs of growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most cloud storage systems currently adopt a hash-like approach to retrieving data that only supports simple keyword-based enquiries and lacks other forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present a skip list-based cloud index, called SLC-index, a novel, scalable skip list-based indexing scheme for cloud data processing. The SLC-index offers a two-layered architecture for extending indexing scope and facilitating better throughput. Dynamic load-balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, it is a flexible system due to its dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
Keywords: cloud computing, distributed index, cloud data processing, skip list
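The two-layered idea described above — an upper routing layer over sorted runs held by individual servers — can be illustrated with a toy stdlib structure. This is a simplified stand-in (sorted lists plus binary search instead of actual skip lists, and no node migration or server dynamics); the class and method names are invented for illustration and assume the runs cover disjoint key ranges:

```python
import bisect

class TwoLayerIndex:
    """Toy two-layered range index: an upper routing layer maps key ranges
    to per-'server' sorted runs, supporting point and range queries."""
    def __init__(self, runs):
        self.runs = sorted((sorted(r) for r in runs if r), key=lambda r: r[0])
        self.route = [r[0] for r in self.runs]   # upper layer: first key per run

    def _run(self, key):
        # Route to the last run whose first key is <= key.
        return max(bisect.bisect_right(self.route, key) - 1, 0)

    def contains(self, key):                     # point query
        r = self.runs[self._run(key)]
        i = bisect.bisect_left(r, key)
        return i < len(r) and r[i] == key

    def range(self, lo, hi):                     # range query over [lo, hi]
        out = []
        for r in self.runs[self._run(lo):]:
            if r[0] > hi:
                break
            out.extend(r[bisect.bisect_left(r, lo):bisect.bisect_right(r, hi)])
        return out
```

Both query types touch the routing layer once and then only the runs that overlap the request, which is the property the two-layer design is after.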
20. The Implementation of Computer Data Processing Software for EAST NBI (Cited by: 1)
Authors: 张小丹, 胡纯栋, 盛鹏, 赵远哲, 吴德云, 崔庆龙 — Plasma Science and Technology (SCIE, EI, CAS, CSCD), 2014, Issue 10, pp. 984-987
One of the most important project missions of neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power energy to the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for processing experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is programmed in C and runs on the Linux operating system, based on the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server, and PXI DAQ cards. This software has been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well.
Keywords: NBI, data processing, LZO algorithm, TCP
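The compress-before-store step that both NBI papers rely on can be sketched with the stdlib. LZO itself is not in Python's standard library, so zlib at its fastest level stands in for the same speed-over-ratio trade-off; the record fields (`shot`, `data`) are invented for illustration:

```python
import json
import time
import zlib

def pack_shot(shot_id, samples, level=1):
    """Serialize one acquisition record and compress it before storage.
    level=1 favours speed over ratio, the trade-off that motivates LZO
    in the real system (zlib stands in here: LZO is not in the stdlib)."""
    raw = json.dumps({"shot": shot_id, "t": time.time(), "data": samples}).encode()
    comp = zlib.compress(raw, level)
    return comp, len(raw), len(comp)

def unpack_shot(blob):
    """Decompress and deserialize a stored record for query and analysis."""
    return json.loads(zlib.decompress(blob))
```

Long-pulse discharge waveforms are highly repetitive, so even a fast compression level shrinks them substantially before they hit the data server.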