While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging [1-3]. Persistent obstacles such as complex 3D survey planning, low signal-to-noise ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavior of nodal seismometer-based high-density seismic acquisition on the loess plateau. Tailored advancements in waveform enhancement and depth velocity modeling methodologies have been engineered. Field validations confirm that the optimized workflow delivers marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization.
As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source has been progressively developed and applied to provide a source plasma that is easy to control and highly reliable; it also readily achieves long-pulse steady-state operation. Development and testing of the RF ion source generate a large volume of raw experimental data, so a stable and reliable computer application system is needed to realize data acquisition, storage, access, and real-time monitoring. This paper presents the development of such a data acquisition and processing application system for the RF ion source. The hardware platform is based on the PXI system, and the software is programmed in the LabVIEW development environment. The key technologies used in the software implementation include long-pulse data acquisition, multi-threaded processing, the transmission control communication protocol, and the Lempel-Ziv-Oberhumer (LZO) data compression algorithm. The design has been tested and applied on the RF ion source, and the results show that it works reliably and stably. With its help, stable plasma discharge data from the RF ion source are collected, stored, accessed, and monitored in real time, demonstrating practical significance for RF experiments.
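The abstract names the LZO algorithm for compressing long-pulse acquisition records. LZO is not in the Python standard library, so the sketch below illustrates the same idea (fast lossless compression of repetitive waveform data before storage) with `zlib` at its fastest setting; the record format and sample values are hypothetical, not taken from the RF ion source system.

```python
import struct
import zlib

def pack_samples(samples):
    """Pack a list of 16-bit ADC samples into little-endian bytes."""
    return struct.pack("<%dh" % len(samples), *samples)

def compress_shot(samples, level=1):
    """Compress one long-pulse record; level=1 favors speed over ratio,
    roughly matching LZO's design goal of low-latency compression."""
    return zlib.compress(pack_samples(samples), level)

def decompress_shot(blob):
    raw = zlib.decompress(blob)
    return list(struct.unpack("<%dh" % (len(raw) // 2), raw))

# A slowly varying plasma-like waveform compresses well because
# neighboring samples repeat byte patterns.
samples = [1000 + 50 * ((i // 100) % 4) for i in range(10000)]
blob = compress_shot(samples)
assert decompress_shot(blob) == samples
assert len(blob) < len(pack_samples(samples))
```

The round trip is lossless, which matters for archival discharge data; a real system would also frame each record with shot number and timestamp metadata.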
In China, most oil fields are continental sediments with strong heterogeneity, which on one hand makes reservoir prospecting and development more difficult, but on the other hand provides more scope for locating residual oil in mature fields. Time-lapse seismic reservoir monitoring is one of the most important techniques for defining residual oil distribution. In line with the demand for and development of time-lapse seismic reservoir monitoring in China, time-lapse processing of seismic data from surveys that were repeated without purpose-designed (repeatable) acquisition was studied. The four key steps in such processing, including amplitude-preserved processing with relative consistency, rebinning, match filtering, and difference calculation, were analyzed by combining theory with real seismic data processing. Quality control during actual time-lapse seismic processing was also emphasized.
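Of the four steps listed, match filtering and difference calculation can be illustrated on a single trace. This is a deliberately minimal sketch, not the paper's workflow: a real 4D match filter solves for a least-squares shaping filter (amplitude, phase, and time shift), whereas here only a single least-squares amplitude scalar is estimated before subtraction.

```python
def match_scale(base, monitor):
    """Least-squares amplitude match: the scalar a minimizing ||monitor - a*base||^2."""
    num = sum(b * m for b, m in zip(base, monitor))
    den = sum(b * b for b in base)
    return num / den

def difference_section(base, monitor):
    """Match the monitor trace to the base trace, then subtract.
    Residual energy in the difference highlights time-lapse (4D) changes."""
    a = match_scale(base, monitor)
    return [m - a * b for b, m in zip(base, monitor)]

base = [0.0, 1.0, 2.0, 1.0, 0.0]
monitor = [0.0, 2.0, 4.0, 2.0, 0.0]   # same reflector, doubled amplitude
diff = difference_section(base, monitor)
assert all(abs(d) < 1e-9 for d in diff)  # no 4D change remains after matching
```

Without the matching step, the acquisition-related amplitude difference would masquerade as a reservoir change, which is exactly why match filtering precedes differencing.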
Detecting and identifying radionuclides with a gamma energy spectrum data acquisition and processing system is one of the key issues in intelligent gamma-ray spectrum measurement. To this end, a hardware and software design based on an ARM (Advanced RISC Machines) + DSP (Digital Signal Processor) architecture is proposed for such a system. The paper discusses in detail key technologies such as the communication interface between the ARM microcontroller and the DSP, multi-task scheduling under ARM-Linux, and DSP handling procedures for multi-channel high-speed A/D sampling. Because traditional Gaussian fitting does not determine peak boundaries well, a weighted least-squares Gaussian fitting method is put forward to determine the boundaries. Finally, gamma-spectrum data from a NaI(Tl) scintillation detector were acquired and processed with the new system. The results show that the system is fully functional, stable, and well converged on single peaks. Compared with data from conventional spectrometers, the system maintains better energy resolution over a wide range of pulse throughput rates.
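The weighted least-squares Gaussian fit can be sketched with a standard trick: taking the logarithm turns a Gaussian peak into a parabola, which is fit by weighted linear least squares (weights `w = y^2` are a common choice that de-emphasizes noisy low-count channels). The weighting scheme and the ±3σ boundary convention below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def weighted_gaussian_fit(xs, ys):
    """Fit y = A*exp(-(x-mu)^2/(2*sigma^2)) by weighted least squares on ln(y)."""
    x0 = sum(xs) / len(xs)                 # center x for numerical stability
    us = [x - x0 for x in xs]
    w = [y * y for y in ys]                # assumed weighting: w = y^2
    ly = [math.log(y) for y in ys]
    # Normal equations for ln y ~ c0 + c1*u + c2*u^2
    S = [sum(wi * u ** k for wi, u in zip(w, us)) for k in range(5)]
    b = [sum(wi * u ** k * l for wi, u, l in zip(w, us, ly)) for k in range(3)]
    A = [[S[0], S[1], S[2], b[0]],
         [S[1], S[2], S[3], b[1]],
         [S[2], S[3], S[4], b[2]]]
    for i in range(3):                     # Gauss-Jordan elimination
        A[i] = [v / A[i][i] for v in A[i]]
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
    c1, c2 = A[1][3], A[2][3]
    sigma = math.sqrt(-1.0 / (2.0 * c2))
    mu = x0 - c1 / (2.0 * c2)
    return mu, sigma

# A noiseless 41-channel Gaussian peak: centroid and width are recovered exactly.
xs = list(range(80, 121))
ys = [100.0 * math.exp(-((x - 100.0) ** 2) / (2.0 * 5.0 ** 2)) for x in xs]
mu, sigma = weighted_gaussian_fit(xs, ys)
assert abs(mu - 100.0) < 1e-6 and abs(sigma - 5.0) < 1e-6
# Peak boundaries taken at mu +/- 3*sigma enclose ~99.7% of the peak area.
left, right = mu - 3 * sigma, mu + 3 * sigma
```

Because the fit is linear in the parabola coefficients, it needs no iteration, which suits an embedded DSP pipeline.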
To realize high-speed data acquisition and fast Fourier analysis, this paper puts forward a high-speed data acquisition and analysis system based on an FPGA. The system uses a Cyclone-series FPGA with a high-speed A/D converter, employs Altera's FFT IP core for the fast Fourier analysis, communicates with a PC over the standard TCP/IP protocol, and is paired with host analysis software built on a Matlab GUI. High-speed acquisition and fast Fourier analysis were tested on multiple groups of high-frequency analog signals, with the results displayed on the computer at the same time. The experimental results validate the fast Fourier analysis and realize a complete low-cost, high-performance data acquisition and analysis system.
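The core computation that the FPGA's FFT IP core performs is the radix-2 Cooley-Tukey algorithm. As a hedged software reference model (what a host-side checker might run against the hardware output, not the vendor core itself), a minimal recursive version looks like this:

```python
import cmath
import math

def fft(x):
    """Radix-2 Cooley-Tukey FFT; input length must be a power of two."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even, odd = fft(x[0::2]), fft(x[1::2])   # decimation in time
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * math.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# A cosine at bin 3 of a 64-point record produces a single spectral line
# of magnitude N/2 -- a sanity check the Matlab-style host GUI could apply.
N = 64
signal = [math.cos(2 * math.pi * 3 * i / N) for i in range(N)]
mags = [abs(c) for c in fft(signal)]
assert max(range(N // 2), key=lambda k: mags[k]) == 3
assert abs(mags[3] - N / 2) < 1e-9
```

Hardware cores compute the same butterfly structure iteratively in fixed point; this float version is for verification only.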
A multi-beam chirp sonar based on IP connections and DSP processing nodes was proposed and designed to provide an expandable system with high-speed processing and mass storage of real-time signals for multi-beam profiling sonar. The system was designed for seabed petroleum pipeline detection and orientation; it receives echo signals and processes the data in real time, refreshing the display 10 times per second. Every node of the chirp sonar connects with the data processing nodes through TCP/IP. Merely by adding nodes, the system's processing ability can be increased proportionately without changing the software. System debugging and experimental testing proved the system to be practical and stable. This design provides a new method for high-speed active sonar.
This article describes the data processing and acquisition system for the HT-7 multipulse Thomson scattering diagnostic. An eight-pulse laser is used in the Thomson scattering system to obtain electron temperature profiles at eight different times throughout an entire plasma discharge. The major components of the diagnostic system are a multipulse Nd-glass laser, a photodetector subsystem, a calibration set, and a CAMAC data processing and acquisition system. The processing software, together with a LeCroy 2250L, performs the data acquisition. To simplify operation and extend compatibility with other mathematical software, the processing software has been improved by the authors. The new software, based on VC++, readily calls mathematical software packages to calculate the electron temperature, and is simpler and easier to operate than the old one.
Driven by the requirements of weapon test and identification tasks, an experimental data acquisition and processing space station suited to a variety of extreme natural environments, such as alpine, plateau, mountain, jungle, desert, and island-and-reef settings, has been studied theoretically and in practice. The station is a dome-shaped structure built from scale-shaped modules of basalt-fiber-reinforced composite material, providing thermal insulation, ventilation, and continuous power supply. It supports real-time monitoring, recovery, and transmission of test data, and meets the basic working and living needs of test personnel.
This report presents the design and implementation of a Distributed Data Acquisition, Monitoring and Processing System (DDAMAP). Operations of a factory are assumed to be organized into two levels: client machines at the plant level collect real-time raw data from sensors and measurement instrumentation and transfer them to a central processor over Ethernet, and the central processor handles real-time data processing and monitoring. The system utilizes the computational power of an Intel T2300 dual-core processor and parallel computation supported by multi-threading techniques. Our experiments show that these techniques can significantly improve system performance and are viable solutions for real-time high-speed data processing.
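The chunk-and-dispatch pattern behind the dual-core, multi-threaded central processor can be sketched as follows. The per-chunk processing function is a hypothetical stand-in (the report does not specify it), and note that in CPython threads mainly help when acquisition is I/O-bound; CPU-bound stages would use processes instead.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Hypothetical per-channel processing: remove DC offset and rectify."""
    mean = sum(chunk) / len(chunk)
    return [abs(v - mean) for v in chunk]

def process_parallel(samples, workers=2, chunk_size=1000):
    """Split the raw stream into chunks and process them on a thread pool,
    mirroring the two-level client / central-processor split in the text."""
    chunks = [samples[i:i + chunk_size] for i in range(0, len(samples), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(process_chunk, chunks))  # order is preserved
    return [v for chunk in results for v in chunk]

samples = [float(i % 50) for i in range(5000)]
parallel = process_parallel(samples)
serial = [v for i in range(0, 5000, 1000) for v in process_chunk(samples[i:i + 1000])]
assert parallel == serial   # parallel result matches the serial reference
```

`ThreadPoolExecutor.map` keeps chunk order, so the reassembled stream is identical to serial processing regardless of which worker finishes first.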
In the anticorrosive coating line of a welded tube plant, the current status and existing problems of the medium-frequency induction heating equipment were discussed, and partial renovation of the power control cabinet was carried out. Parameters such as DC current, DC voltage, intermediate-frequency power, heating temperature, and the pipe-end positioning signal were collected. A data acquisition and processing system that processes data according to user needs and provides convenient data processing functions was developed using LabVIEW. The system has been successfully applied in the coating line for automatic control of high-power induction heating equipment, production management, and digital steel tube and/or digital delivery.
Children can acquire their mother tongue easily in a relatively short time, whereas adults compare poorly when learning a second language. This paper studies the background and process of children's and adults' language learning, makes comparisons and contrasts between them, and seeks an effective way to promote adults' second language learning.
The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
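Glitch-type disturbances are impulsive outliers superimposed on the seismic trace. As a deliberately simplified illustration of the suppression idea (the published InSight deglitching methods fit instrument-response templates; this sketch only shows generic MAD-based despiking), one can flag samples that deviate strongly from a local median:

```python
def despike(trace, half_window=3, threshold=5.0):
    """Replace samples deviating from the local median by more than
    `threshold` times the local median absolute deviation (MAD)."""
    out = list(trace)
    n = len(trace)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        window = sorted(trace[lo:hi])
        med = window[len(window) // 2]
        mad = sorted(abs(v - med) for v in window)[len(window) // 2]
        if mad > 0 and abs(trace[i] - med) > threshold * mad:
            out[i] = med          # patch the outlier with the local median
    return out

trace = [0.1, -0.2, 0.0, 0.15, 50.0, -0.1, 0.05, 0.2, -0.15]  # one glitch
clean = despike(trace)
assert clean[4] != 50.0 and max(abs(v) for v in clean) < 1.0
```

The MAD is preferred over the standard deviation here because the glitch itself would inflate a standard deviation and mask its own detection.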
With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
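The diminishing-returns behavior the abstract reports can be reproduced with elementary queueing theory. This is not the paper's PEPA model: as a stand-in, a pool of TaskManagers is approximated as an M/M/c queue, with the Erlang-C formula giving the mean response time for a fixed arrival rate and varying service rate.

```python
import math

def mmc_metrics(lam, mu, c):
    """M/M/c approximation of a TaskManager pool.
    lam = task arrival rate, mu = per-task service rate, c = TaskManagers.
    Returns (utilization, mean response time)."""
    a = lam / mu
    rho = a / c
    assert rho < 1, "system unstable"
    # Erlang-C: probability that an arriving task must queue
    s = sum(a ** k / math.factorial(k) for k in range(c))
    top = a ** c / math.factorial(c) / (1 - rho)
    p_wait = top / (s + top)
    wq = p_wait / (c * mu - lam)       # mean queueing delay
    return rho, wq + 1 / mu            # response time = waiting + service

lam, c = 50.0, 10
t1 = mmc_metrics(lam, 10.0, c)[1]
t2 = mmc_metrics(lam, 20.0, c)[1]
t3 = mmc_metrics(lam, 50.0, c)[1]
# Faster components always help, but each doubling of the service rate
# buys less -- the diminishing returns the abstract observes.
assert t1 > t2 > t3
assert (t1 - t2) > (t2 - t3)
```

At low utilization the response time is dominated by the service term 1/mu, so further rate increases yield ever-smaller absolute gains, matching the 500-rate plateau described above.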
Three-dimensional (3D) single-molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. FD-DeepLoc, the state-of-the-art deep-learning-based 3D super-resolution localization algorithm reported recently, still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model and combines it with pipelined programming to accelerate the inference process of the deep learning model. Simulated data processing results show that Lite-FD-DeepLoc processes images about twice as fast as FD-DeepLoc with a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
Previous studies aiming to accelerate data processing have focused on enhancement algorithms, using the graphics processing unit (GPU) to speed up programs, and thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the "Srun job submission mode", the "Sbatch job submission mode", and a "Monitor function". The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value of this scheme is 89%. In contrast, the average CPU utilizations of the "Srun job submission mode" and "Sbatch job submission mode" are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the "Srun job submission mode" and 14 h in the "Sbatch job submission mode". In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
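The core innovations named above, command splitting plus parallel execution with a monitor, can be sketched generically. This is an illustrative shape only: the real SSM submits Slurm `srun`/`sbatch` commands on a cluster, whereas here each "command" is a plain Python callable so the sketch stays self-contained.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def split_commands(commands, n_groups):
    """Command splitting: divide a long job list into n_groups batches."""
    return [commands[i::n_groups] for i in range(n_groups)]

def run_batch(batch):
    # Stand-in for dispatching a batch to the cluster; here each "command"
    # is just a callable returning its result.
    return [cmd() for cmd in batch]

def ssm_like_run(commands, n_groups=4):
    """Parallel execution with a simple monitor that collects batches
    as they finish, analogous to SSM's Monitor function."""
    done = []
    with ThreadPoolExecutor(max_workers=n_groups) as pool:
        futures = [pool.submit(run_batch, b)
                   for b in split_commands(commands, n_groups)]
        for fut in as_completed(futures):   # monitor: react on completion
            done.extend(fut.result())
    return sorted(done)

jobs = [lambda i=i: i * i for i in range(20)]
assert ssm_like_run(jobs) == sorted(i * i for i in range(20))
```

Striding (`commands[i::n_groups]`) balances batch sizes even when the job count is not divisible by the group count, one plausible reason command splitting keeps all CPUs busy.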
A state-of-the-art detector array with a digital data acquisition system has been developed for charged-particle decay studies, including β-delayed protons, α decay, and direct proton emissions from exotic proton-rich nuclei. The digital data acquisition system enables precise synchronization and processing of complex signals from various detectors, such as plastic scintillators, silicon detectors, and germanium γ detectors. The system's performance was evaluated using the β decay of ^(32)Ar and its neighboring nuclei, produced via projectile fragmentation at the first Radioactive Ion Beam Line in Lanzhou (RIBLL1). Key measurements, including the half-life, charged-particle spectrum, and γ-ray spectrum, were obtained and compared with previous results for validation. Using the implantation–decay method, the isotopes of interest were implanted into two double-sided silicon strip detectors, where their subsequent decays were measured and correlated with preceding implantations using both position and time information. This detection system has potential for further applications, including the study of β-delayed charged-particle decay and direct proton emissions from even more exotic proton-rich nuclei.
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status of and developments in the theories, test apparatuses, and data processing of existing methods for UCS measurement. It starts by elaborating the theories behind these test methods. The test apparatuses and their development trends are then summarized, followed by a discussion of rock specimens for each apparatus and of data processing methods. Next, recommendations are given for selecting a method for UCS measurement. The review reveals that the rock failure mechanisms in UCS testing methods can be divided into compression-shear, compression-tension, composite, and no-obvious-failure modes. The apparatuses are trending toward automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size, while the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thus gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and quiet-Sun regions. The enhanced data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. This package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
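Sub-pixel alignment of this kind is commonly done by locating the cross-correlation peak to integer precision and then refining it with parabolic interpolation. The 1D sketch below illustrates that idea only; it is not GT-scopy's algorithm, and real solar frames would use 2D (often Fourier-domain) correlation.

```python
import math

def cross_corr_shift(ref, img, max_lag=5):
    """Estimate the shift of `img` relative to `ref` to sub-pixel precision:
    pick the integer lag with the largest cross-correlation, then refine
    by fitting a parabola through the peak and its two neighbors."""
    n = len(ref)
    def corr(lag):
        return sum(ref[i] * img[i + lag]
                   for i in range(max(0, -lag), min(n, n - lag)))
    lags = list(range(-max_lag, max_lag + 1))
    c = {l: corr(l) for l in lags}
    l0 = max(lags[1:-1], key=lambda l: c[l])   # keep both neighbors in range
    ym, y0, yp = c[l0 - 1], c[l0], c[l0 + 1]
    denom = ym - 2.0 * y0 + yp
    frac = 0.5 * (ym - yp) / denom if denom != 0 else 0.0
    return l0 + frac

# Two samplings of the same Gaussian brightness profile, offset by 2.3 pixels.
ref = [math.exp(-((i - 30.0) ** 2) / 20.0) for i in range(64)]
img = [math.exp(-((i - 32.3) ** 2) / 20.0) for i in range(64)]
shift = cross_corr_shift(ref, img)
assert abs(shift - 2.3) < 0.05   # fractional shift recovered to sub-pixel accuracy
```

The parabola fit costs three correlation values beyond the integer search, which is why this refinement is cheap enough to run on every frame.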
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of the seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Abstract: A multi-beam chirp sonar based on IP connections and DSP processing nodes was proposed and designed to provide an expandable system with high-speed processing and mass storage of real-time signals for multi-beam profiling sonar. The system was designed for seabed petroleum pipeline detection and orientation; it receives echo signals and processes the data in real time, refreshing the display 10 times per second. Every node of the chirp sonar connects with the data processing nodes through TCP/IP. Merely by adding nodes, the system's processing ability can be increased proportionately without changing the software. System debugging and experimental testing proved the system to be practical and stable. This design provides a new approach to high-speed active sonar.
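The core of chirp-sonar echo processing is pulse compression, which can be sketched with a matched filter; the chirp band, sample rate, and echo delay below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 100_000.0                        # assumed sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms linear FM transmit pulse
tx = chirp(t, f0=5_000, f1=20_000, t1=t[-1], method='linear')

# Simulated received trace: an attenuated echo delayed by 500 samples
# (5 ms two-way travel time) buried in noise.
delay_samples = 500
rx = np.zeros(4096)
rx[delay_samples:delay_samples + tx.size] += 0.5 * tx
rx += 0.05 * np.random.default_rng(1).standard_normal(rx.size)

# Matched filtering (pulse compression): correlate with the replica.
mf = correlate(rx, tx, mode='valid')
est_delay = int(np.argmax(np.abs(mf)))
print(est_delay)
```

The compressed peak localizes the echo to one sample despite the noise, which is why chirp (FM) pulses are preferred over plain pings for profiling.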
Funding: The project is supported by the National Science Foundation of China (Nos. 10075049 and 10275068)
Abstract: This article describes the data processing and acquisition system for the HT-7 multipulse Thomson scattering diagnostic. An eight-pulse laser is used in the Thomson scattering system to obtain electron temperature profiles at eight different times throughout an entire plasma discharge. The major components of the diagnostic system are a multipulse Nd-glass laser, a photodetector subsystem, a calibration set, and a CAMAC data processing and acquisition system. The data processing software, together with a LeCroy 2250L, performs the data acquisition. To simplify operation and extend compatibility with other mathematical software, the authors have improved the processing software. The new software, based on VC++, easily calls external mathematical software to calculate the electron temperature, and is simpler and easier to operate than the old one.
Abstract: With the rapid development of science and technology, and driven by the requirements of weapon test and identification tasks, an experimental data acquisition and processing space station suited to a variety of extreme natural environments, such as alpine regions, plateaus, mountains, jungles, deserts, islands, and reefs, has been studied in theory and in practice. The space station is a dome-shaped structure built from scale-shaped modules of basalt-fiber-reinforced composite material, providing thermal insulation, ventilation, and continuous power supply. It supports and guarantees the real-time monitoring, recovery, and transmission of test data, and meets the basic working and living needs of test personnel.
Abstract: This report presents the design and implementation of a Distributed Data Acquisition, Monitoring and Processing System (DDAMAP). Operations of a factory are assumed to be organized into two levels: client machines at the plant level collect real-time raw data from sensors and measurement instruments and transfer them to a central processor over Ethernet, and the central processor handles the tasks of real-time data processing and monitoring. The system utilizes the computational power of an Intel T2300 dual-core processor and parallel computation supported by multi-threading techniques. Our experiments show that these techniques can significantly improve system performance and are a viable solution for real-time high-speed data processing.
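The two-level split above — plant-level producers feeding a central multi-threaded processor — can be sketched with Python's thread-safe queue; the thread counts and the doubling "processing" step are placeholders for the real workload:

```python
import queue
import threading

# Shared queue: "client machines" push raw samples, central workers drain it.
data_q: "queue.Queue" = queue.Queue()
results = []
results_lock = threading.Lock()
SENTINEL = None

def producer(n_samples: int) -> None:
    """One plant-level client streaming raw samples to the center."""
    for i in range(n_samples):
        data_q.put(float(i))

def worker() -> None:
    """Central-processor worker thread; exits on the sentinel."""
    while True:
        item = data_q.get()
        if item is SENTINEL:
            data_q.task_done()
            break
        with results_lock:
            results.append(item * 2.0)   # stand-in for real processing
        data_q.task_done()

producers = [threading.Thread(target=producer, args=(100,)) for _ in range(2)]
workers = [threading.Thread(target=worker) for _ in range(2)]
for th in producers + workers:
    th.start()
for th in producers:
    th.join()
for _ in workers:                # one sentinel per worker to shut down cleanly
    data_q.put(SENTINEL)
for th in workers:
    th.join()
print(len(results))
```

On a dual-core processor such as the T2300, the two worker threads can run truly in parallel whenever the processing step releases the interpreter lock (e.g., in native numeric code).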
Abstract: The current status and existing problems of the medium-frequency induction heating equipment in the anticorrosive coating line of a welded tube plant were discussed, and partial renovations of the power control cabinet were carried out. Parameters such as the DC current, DC voltage, intermediate-frequency power, heating temperature, and the pipe-end positioning signal were collected. A data acquisition and processing system, which can process data according to user needs and provides convenient data processing functions, was developed using LabVIEW software. The system has been successfully applied in the coating line for the automatic control of high-power induction heating equipment, production management, and digital steel tube and/or digital delivery.
Abstract: Children can acquire their mother tongue easily in a relatively short time, whereas adults compare poorly when learning a second language. This paper studies the background and process of children's and adults' language learning, compares and contrasts the two, and seeks an effective way to promote adults' second language learning.
Funding: supported by the National Key R&D Program of China (Nos. 2022YFF0503203 and 2024YFF0809900), the Research Funds of the Institute of Geophysics, China Earthquake Administration (No. DQJB24X28), and the National Natural Science Foundation of China (Nos. 42474226 and 42441827).
Abstract: The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick-noise removal, glitch-signal suppression, multicomponent synchronization, instrument response correction, and rotation of the orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
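The final rotation stage can be illustrated with the standard north/east-to-radial/transverse transform; the sign convention below is one common seismological choice, and the traces are synthetic rather than InSight data:

```python
import numpy as np

def rotate_ne_to_rt(n, e, baz_deg):
    """Rotate north (N) and east (E) components into radial (R) and
    transverse (T) using the back-azimuth (angle from the station to
    the source, degrees clockwise from north); one common convention."""
    baz = np.radians(baz_deg)
    r = -n * np.cos(baz) - e * np.sin(baz)
    t = n * np.sin(baz) - e * np.cos(baz)
    return r, t

# Synthetic example: a pulse arriving purely on the north component
# from a source due south of the station (back-azimuth 180 degrees);
# the energy should map entirely onto the radial component.
n = np.array([0.0, 1.0, 0.0])
e = np.zeros(3)
r, t = rotate_ne_to_rt(n, e, 180.0)
```

After rotation, P- and Rayleigh-wave energy concentrates on the radial component and SH/Love-wave energy on the transverse, which is what makes this step useful for single-station planetary seismology.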
Funding: funded by the Joint Project of Industry-University-Research of Jiangsu Province (Grant: BY20231146).
Abstract: With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed stream-processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response-time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed stream-processing systems. Additionally, a generic service-flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service-flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 yields a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
Funding: supported by the Start-up Fund from Hainan University (No. KYQD(ZR)-20077).
Abstract: Three-dimensional (3D) single-molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated, and deep learning is a potential tool for this problem. FD-DeepLoc, the state-of-the-art deep-learning-based 3D super-resolution localization algorithm reported recently, still falls short of the goal of online image processing, even though it has greatly improved data-processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image-processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the number of model parameters and combines it with pipelined programming to accelerate the inference process of the deep-learning model. Results on simulated data show that Lite-FD-DeepLoc processes images about twice as fast as FD-DeepLoc with a slight decrease in localization accuracy, enabling real-time processing of 256×256-pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to, or even better than, that of FD-DeepLoc.
Funding: supported by the National Natural Science Foundation of China (12363010), the Guizhou Provincial Basic Research Program (Natural Science) (ZK[2023]039), and the Key Technology R&D Program ([2023]352).
Abstract: Previous studies aiming to accelerate data processing have focused on enhanced algorithms, on using the graphics processing unit (GPU) to speed up programs, and on thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the “Srun job submission mode”, the “Sbatch job submission mode”, and a “Monitor function”. The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value under this scheme is 89%, whereas the average CPU utilizations of the “Srun job submission mode” and the “Sbatch job submission mode” are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the “Srun job submission mode” and 14 h in the “Sbatch job submission mode”. In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
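The command-splitting-plus-parallel-execution idea at the heart of SSM can be sketched with a worker pool; the sub-commands here are trivial placeholders, not actual pulsar-search jobs:

```python
import concurrent.futures as cf
import subprocess
import sys

# Hypothetical split of one large job into independent sub-commands
# (in SSM these would be per-beam or per-segment search commands).
commands = [[sys.executable, "-c", f"print({i} * {i})"] for i in range(8)]

def run(cmd):
    """Run one split sub-command and capture its stdout."""
    done = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return done.stdout.strip()

# Execute the split commands in parallel to keep CPU cores busy,
# mirroring SSM's command splitting and parallel execution.
with cf.ThreadPoolExecutor(max_workers=4) as pool:
    outputs = list(pool.map(run, commands))
print(outputs)
```

A monitor component would sit on top of such a pool, tracking per-command status and resubmitting failures, which is what keeps average CPU utilization high across a long processing campaign.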
Funding: supported by the National Key Research and Development Project, China (No. 2023YFA1606404), the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB34010300), the National Natural Science Foundation of China (Nos. 12022501, 12105329, 12475127), the Guangdong Major Project of Basic and Applied Basic Research (No. 2021B0301030006), the Research Program of the Heavy Ion Science and Technology Key Laboratory, Institute of Modern Physics, Chinese Academy of Sciences (Nos. HIST2024KS04, HIST2024CO04), the Longyuan Youth Innovation and Entrepreneurship Talent Project of Gansu Province (No. 2024GZT04), the State Key Laboratory of Nuclear Physics and Technology, Peking University (No. NPT2023KFY01), and the Major Science and Technology Projects in Gansu Province (No. 24GD13GA005).
Abstract: A state-of-the-art detector array with a digital data acquisition system has been developed for charged-particle decay studies, including β-delayed protons, α decay, and direct proton emission from exotic proton-rich nuclei. The digital data acquisition system enables precise synchronization and processing of complex signals from various detectors, such as plastic scintillators, silicon detectors, and germanium γ detectors. The system's performance was evaluated using the β decay of ^(32)Ar and its neighboring nuclei, produced via projectile fragmentation at the first Radioactive Ion Beam Line in Lanzhou (RIBLL1). Key measurements, including the half-life, the charged-particle spectrum, and the γ-ray spectrum, were obtained and compared with previous results for validation. Using the implantation-decay method, the isotopes of interest were implanted into two double-sided silicon strip detectors, where their subsequent decays were measured and correlated with preceding implantations using both position and time information. This detection system has potential for further applications, including the study of β-delayed charged-particle decay and direct proton emission from even more exotic proton-rich nuclei.
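The position-and-time correlation between implantations and decays can be sketched as follows; the strip coordinates, correlation window, and event lists are hypothetical illustrations of the method, not measured data:

```python
# Each decay event in a double-sided silicon strip detector (DSSD) pixel
# is correlated with the most recent implantation in the same pixel
# within a time window of a few half-lives (value here is arbitrary).
CORR_WINDOW = 0.5   # seconds

implants = [  # (time_s, x_strip, y_strip)
    (0.00, 10, 20),
    (0.10, 11, 20),
    (0.20, 10, 20),
]
decays = [(0.25, 10, 20), (0.90, 11, 20), (2.00, 10, 20)]

def correlate_decays(implants, decays, window=CORR_WINDOW):
    """Pair each decay with the latest implantation in the same pixel
    inside the correlation window; unmatched decays are dropped."""
    pairs = []
    for t_d, x, y in decays:
        cands = [t_i for t_i, xi, yi in implants
                 if xi == x and yi == y and 0.0 < t_d - t_i <= window]
        if cands:
            pairs.append((max(cands), t_d))
    return pairs

print(correlate_decays(implants, decays))
```

The distribution of the time differences t_d - t_i from such pairs is what a half-life fit is built on; random (uncorrelated) pairs form a flat background under the exponential.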
Funding: the National Natural Science Foundation of China (Grant Nos. 52308403 and 52079068), the Yunlong Lake Laboratory of Deep Underground Science and Engineering (No. 104023005), and the China Postdoctoral Science Foundation (Grant No. 2023M731998) provided funding for this work.
Abstract: The uniaxial compressive strength (UCS) of rock is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development of the theories, test apparatuses, and data processing of the existing methods for UCS measurement. It starts by elaborating the theories behind these test methods. The test apparatuses and their development trends are then summarized, followed by a discussion of rock specimens for the test apparatuses and of data processing methods, and a recommendation on method selection for UCS measurement. The review reveals that the rock failure mechanisms in UCS testing can be divided into compression-shear, compression-tension, composite, and no-obvious-failure modes. The apparatuses are trending towards automation, digitization, precision, and multi-modal testing. Two size-correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size; the other uses a standard specimen to calculate a size-correction factor. Three to five input parameters are commonly utilized in soft-computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thus gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
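As an example of the empirical size-correction approach mentioned above, one widely cited relation (after Hoek and Brown) scales a measured UCS to the 50 mm standard-diameter equivalent; the worked numbers below are illustrative, and other correlations exist:

```python
def ucs_to_standard_50mm(ucs_measured_mpa, diameter_mm):
    """Empirical size correction after Hoek and Brown:
    sigma_c,d = sigma_c,50 * (50 / d) ** 0.18,
    solved here for the equivalent 50 mm standard strength."""
    return ucs_measured_mpa * (diameter_mm / 50.0) ** 0.18

# A 38 mm core that measured 120 MPa, expressed as a 50 mm equivalent:
# smaller specimens test stronger, so the corrected value is lower.
print(round(ucs_to_standard_50mm(120.0, 38.0), 1))
```

The exponent 0.18 encodes the observed strength-size trend for intact rock; a site-specific correlation calibrated on local cores is preferable when available.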
Funding: supported by the National Natural Science Foundation of China (NSFC, 12173012 and 12473050), the Guangdong Natural Science Funds for Distinguished Young Scholars (2023B1515020049), the Shenzhen Science and Technology Project (JCYJ20240813104805008), the Shenzhen Key Laboratory Launching Project (No. ZDSYS20210702140800001), and the Specialized Research Fund for the State Key Laboratory of Solar Activity and Space Weather.
Abstract: The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study presents a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to remove atmospheric effects and achieve accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. We implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction of atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures in both sunspots and quiet-Sun regions. The data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and a better understanding of solar dynamics. The package can be adapted to other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
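A core step of coarse and fine alignment — estimating the translation between successive frames — can be sketched with phase correlation on synthetic data; this does not reproduce GT-scopy's actual algorithm, and a sub-pixel stage would additionally interpolate around the correlation peak:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer translation between two frames by phase
    correlation (normalized cross-power spectrum)."""
    cross = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap peaks past the array midpoint into negative shifts.
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))

# Synthetic misalignment: the second frame is the first shifted by
# (+3, -5) pixels (circularly, so the recovery is exact).
rng = np.random.default_rng(2)
ref = rng.random((64, 64))
img = np.roll(ref, shift=(3, -5), axis=(0, 1))
print(estimate_shift(ref, img))
```

Because the cross-power spectrum is normalized, the correlation peak is sharp even when the frames differ in brightness, which is why phase correlation is a common first stage before sub-pixel refinement.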
Funding: supported by the National Natural Science Foundation of China (U24B2031), the National Key Research and Development Project (2018YFA0702504), and the "14th Five-Year Plan" Science and Technology Project of CNOOC (KJGG2022-0201).
Abstract: During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored to drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequency bands, together with information from drilled wells and while-drilling data, and by applying artificial-intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient updates of seismic information during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of the seismic data are significantly improved, and thin-bed weak reflections are imaged more clearly. The seismic information updated while drilling is highly valuable for predicting geological bodies ahead of the drill bit. Validation against logging, mud logging, and drilling engineering data confirms the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.