While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging [1-3]. Persistent obstacles such as complex 3D survey planning, low signal-to-noise ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavior of nodal-seismometer-based high-density seismic acquisition on the loess plateau. Tailored advances in waveform enhancement and depth velocity modeling methodologies have been engineered. Field validations confirm that the optimized workflow delivers marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization.
The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
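The tick-removal stage lends itself to a compact illustration. The sketch below estimates a strictly periodic disturbance by stacking (averaging) windows of one tick period and subtracting the stacked template; the actual InSight deglitching pipeline is considerably more sophisticated, and the function name and parameters here are illustrative assumptions, not the paper's code.

```python
import numpy as np

def remove_tick_noise(trace, fs, tick_period=1.0):
    """Suppress a strictly periodic 'tick' disturbance by stacking
    windows of one tick period and subtracting the average template."""
    n = int(round(fs * tick_period))      # samples per tick period
    m = len(trace) // n                   # number of whole periods
    segments = trace[: m * n].reshape(m, n)
    template = segments.mean(axis=0)      # estimate of the periodic part
    cleaned = trace.copy()
    cleaned[: m * n] -= np.tile(template, m)
    return cleaned, template
```

Stacking works because the seismic wavefield is incoherent from one tick period to the next, so it averages toward zero, while the tick reinforces itself in the template.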
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
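The diminishing-returns pattern reported in these experiments can be reproduced qualitatively with even the simplest analytical queueing model. The sketch below uses an M/M/1 queue, not PEPA, as a hedged stand-in: response time, throughput, and utilization fall out of two rates, and each further increase in service rate buys less latency.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of an M/M/1 queue (requires utilization < 1)."""
    rho = arrival_rate / service_rate            # resource utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization >= 1")
    return {
        "utilization": rho,
        "response_time": 1.0 / (service_rate - arrival_rate),  # mean sojourn time
        "throughput": arrival_rate,              # all arrivals are eventually served
    }

# Diminishing returns: raising the service rate 10 -> 100 -> 200 -> 500
# shrinks the mean response time by smaller and smaller margins.
gains = [mm1_metrics(5.0, mu)["response_time"] for mu in (10, 100, 200, 500)]
```

A PEPA model composes many such service components and derives the same metrics from the full state space; the single-queue formula only conveys the shape of the trade-off.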
A new method is introduced to suppress noise in seismic data processing. Based on the subtle difference in shape between the noise and the actual signal, we introduce morphological filtering into seismic data processing. In terms of both waveform shape and S/N, the effect of morphological filtering is superior to that of other methods such as mid-value (median) filtering and neighbor-average filtering. The SNR of the signal after morphological filtering is comparatively high, and the precision of the seismic data is well preserved: characteristics of the actual signal, such as frequency and amplitude, are retained. We give an example of real seismic data processing using morphological filtering in which the actual signal is retained while high-intensity random noise is removed.
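For readers unfamiliar with morphological filtering, a minimal 1-D grey-scale version can be sketched in a few lines. The window size and the opening/closing average below are illustrative choices, not the paper's exact operator.

```python
import numpy as np

def erode(x, size):
    """Grey-scale erosion: sliding-window minimum (edge-padded)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].min() for i in range(len(x))])

def dilate(x, size):
    """Grey-scale dilation: sliding-window maximum (edge-padded)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].max() for i in range(len(x))])

def morph_filter(x, size=5):
    """Average of opening and closing: attenuates spikes narrower than
    the window while leaving wider waveforms largely untouched."""
    opening = dilate(erode(x, size), size)
    closing = erode(dilate(x, size), size)
    return 0.5 * (opening + closing)
```

A one-sample spike is wiped out by the opening yet preserved by the closing, so their average halves it; structures wider than the window pass both operations unchanged, which is exactly the shape selectivity the method exploits.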
Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. The recently reported FD-DeepLoc algorithm, the state-of-the-art deep-learning-based 3D super-resolution localization algorithm, still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the number of model parameters and combines this with pipelined programming to accelerate the inference process of the deep learning model. Simulated data processing results show that Lite-FD-DeepLoc is about twice as fast as FD-DeepLoc with only a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
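The pipelining idea, overlapping preprocessing of the next frame with inference on the current one, can be illustrated independently of any deep learning framework. This is a generic two-stage sketch with hypothetical stage functions, not the actual Lite-FD-DeepLoc code.

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(frames, preprocess, infer):
    """Two-stage pipeline: while frame i is in 'infer', frame i+1 is
    already being preprocessed on a worker thread.
    Assumes at least one frame."""
    results = []
    with ThreadPoolExecutor(max_workers=1) as worker:
        pending = worker.submit(preprocess, frames[0])
        for nxt in frames[1:]:
            batch = pending.result()                  # wait for preprocess
            pending = worker.submit(preprocess, nxt)  # overlap next frame
            results.append(infer(batch))
        results.append(infer(pending.result()))       # drain the last frame
    return results
```

When the two stages have comparable cost, overlapping them roughly halves the per-frame latency, which is the kind of gain the paper combines with a smaller (feature-compressed) model.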
In China, most oil fields are in continental sediments with strong heterogeneity, which on the one hand makes reservoir prospecting and development more difficult, but on the other hand provides more opportunities for finding residual oil in mature fields. Time-lapse seismic reservoir monitoring is one of the most important techniques for delineating residual oil distribution. In response to the demand for and development of time-lapse seismic reservoir monitoring in China, the processing of purposeless (non-dedicated) repeated-acquisition time-lapse seismic data was studied. The four key steps in such processing, namely amplitude-preserved processing with relative consistency, rebinning, match filtering, and difference calculation, were analyzed by combining theory with real seismic data processing. Meanwhile, quality control during real time-lapse seismic processing was emphasized.
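Of the four steps, match filtering followed by difference calculation is the most algorithmic, so a toy version may help. The sketch below solves for a least-squares shaping filter that maps the base trace onto the monitor trace and then takes the difference; circular shifts are used for simplicity, and the filter length is an arbitrary assumption, not the paper's parameterization.

```python
import numpy as np

def match_and_difference(base, monitor, nfilt=11):
    """Design f minimizing ||A f - monitor||, where A's columns are
    shifted copies of the base trace, then return the residual."""
    half = nfilt // 2
    A = np.column_stack([np.roll(base, k - half) for k in range(nfilt)])
    f, *_ = np.linalg.lstsq(A, monitor, rcond=None)
    return monitor - A @ f, f
```

On real time-lapse data the residual should vanish outside the reservoir after matching, so what remains in the difference section highlights production-related changes.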
Previous studies aiming to accelerate data processing have focused on enhanced algorithms, on using the graphics processing unit (GPU) to speed up programs, and on thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the "Srun job submission mode", the "Sbatch job submission mode", and a "Monitor function". The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value for this scheme is 89%, whereas the average CPU utilizations of the "Srun job submission mode" and the "Sbatch job submission mode" are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the "Srun job submission mode" and 14 h in the "Sbatch job submission mode". In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
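The core innovations named here, command splitting and parallel execution, can be sketched generically. The SSM scheme drives Slurm's srun/sbatch; the snippet below only demonstrates the splitting/parallelism pattern with an arbitrary runner callable, not Slurm itself.

```python
from concurrent.futures import ThreadPoolExecutor

def split_commands(commands, n_chunks):
    """Round-robin split of a command list into balanced chunks."""
    chunks = [[] for _ in range(n_chunks)]
    for i, cmd in enumerate(commands):
        chunks[i % n_chunks].append(cmd)
    return chunks

def run_parallel(commands, runner, n_workers=4):
    """Execute commands concurrently, preserving input order of results."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(runner, commands))
```

Keeping chunks balanced is what lifts CPU utilization: idle workers are the gap between the 43-52% of the plain submission modes and the 89% the scheme reports.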
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status of and developments in the theories, test apparatuses, and data processing of existing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion of rock specimens for each apparatus and of data processing methods. Next, recommendations are given for selecting a UCS measurement method. The review reveals that the rock failure mechanisms in UCS testing can be divided into compression-shear, compression-tension, composite failure, and no obvious failure mode. The apparatuses are trending toward automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size, while the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thereby gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
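As an example of the second size-correction route (a correction factor relative to a standard specimen), the widely used Hoek-Brown power law relates the UCS of a d mm diameter core to that of a 50 mm specimen. The abstract does not name a specific formula, so treat the exponent 0.18 below as one common published choice rather than the review's own recommendation.

```python
def ucs_to_50mm(ucs_measured, diameter_mm, exponent=0.18):
    """Normalize a measured UCS to the 50 mm standard specimen using
    sigma_d = sigma_50 * (50 / d) ** exponent  (Hoek-Brown form)."""
    return ucs_measured / (50.0 / diameter_mm) ** exponent
```

The correction reflects the size effect: smaller cores sample fewer flaws and therefore read systematically stronger than the 50 mm standard.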
The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and in quiet-Sun regions. The enhanced data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. This package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
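The coarse-alignment step can be illustrated with FFT-based cross-correlation; GT-scopy's sub-pixel fine alignment goes further, but the integer-pixel stage below captures the idea. The function is a generic sketch (circular shifts assumed), not the package's API.

```python
import numpy as np

def integer_shift(ref, img):
    """Locate the peak of the circular cross-correlation between two
    images and return the (row, col) shift of img relative to ref."""
    cc = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(cc), cc.shape)
    # map peaks in the upper half of each axis to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, cc.shape))
```

Sub-pixel methods typically refine this integer estimate, for example by fitting a parabola through the correlation peak or by upsampling the cross-power spectrum.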
Statics pose big challenges for the processing of deep reflection seismic data. In this paper, several different statics solutions were implemented in the processing of deep reflection seismic data from South China, and their results were compared in order to find proper statics solutions. Either statics solutions based on the tomographic principle, or solutions combining the low-frequency components of field statics with the high-frequency components of refraction statics, can provide reasonable statics for deep reflection seismic data in South China with very rugged surface topography, and both can correct statics anomalies of long and short spatial wavelengths. Surface-consistent residual static corrections serve as good complements to these first-pass statics solutions. Proper statics solutions can improve both the quality and the resolution of seismic sections, especially for reflections from the Moho in the uppermost mantle.
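The second statics strategy mentioned above, splicing the low-frequency part of field statics onto the high-frequency part of refraction statics, amounts to simple spatial-wavelength filtering along the line. The moving-average split below is a hedged sketch; the window length controlling the wavelength cut is an arbitrary choice.

```python
import numpy as np

def moving_average(x, w):
    """Edge-padded running mean; w should be odd."""
    xp = np.pad(x, w // 2, mode="edge")
    return np.convolve(xp, np.ones(w) / w, mode="valid")[: len(x)]

def combine_statics(field, refraction, window=11):
    """Long-wavelength component of field statics plus the
    short-wavelength component of refraction statics."""
    low = moving_average(field, window)
    high = refraction - moving_average(refraction, window)
    return low + high
```

The rationale is that elevation-based field statics are trustworthy at long wavelengths while refraction statics resolve short-wavelength near-surface anomalies, so each contributes the band where it is reliable.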
A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature of seismograms" refers to features that depict the shape of the whole seismogram. Unlike some previous efforts that completely abandon the DIAL approach (signal detection, phase identification, association, and event localization) and seek to use envelope cross-correlation to detect seismic events directly, our technique keeps following the DIAL approach; in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. Concrete ideas about how to define wave-trains and combine them with various detections, as well as how to measure and utilize their features in seismic data processing, are elaborated in the paper. This approach has been applied in our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results have simultaneously low false-event and missed-event rates, showing that the new technique has good prospects for improving automatic seismic data processing.
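The abstract does not spell out its detector, so as background, the classic STA/LTA ratio is the textbook signal-detection stage of the DIAL chain. Below is a minimal cumulative-sum implementation with illustrative window lengths, offered as a common baseline rather than the paper's method.

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=5.0):
    """Short-term over long-term average of squared amplitude.
    The ratio jumps when energy rises above the background level."""
    x = np.asarray(trace, dtype=float) ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(x)))
    ratio = np.zeros(len(x))
    for i in range(nl, len(x) - ns + 1):     # need full LTA history first
        sta = (csum[i + ns] - csum[i]) / ns  # short window ahead of i
        lta = (csum[i] - csum[i - nl]) / nl  # long window behind i
        ratio[i] = sta / lta if lta > 0 else 0.0
    return ratio
```

A detection is declared where the ratio crosses a threshold; the wave-train analysis described in the paper then operates on top of such per-phase detections.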
The Kuiyang-ST2000 deep-towed high-resolution multichannel seismic system was designed by the First Institute of Oceanography, Ministry of Natural Resources (FIO, MNR). The system is mainly composed of a plasma spark source (source level: 216 dB, main frequency: 750 Hz, frequency bandwidth: 150-1200 Hz) and a towed hydrophone streamer with 48 channels. Because the source and the towed hydrophone streamer are constantly moving according to the towing configuration, accurately positioning the towed hydrophone array and applying moveout correction to deep-towed multichannel seismic data before imaging are challenging. Initially, according to the characteristics of the system and the shape of the towed streamer in deep water, a travel-time positioning method was used to construct the hydrophone streamer shape, and the results were corrected using polynomial curve fitting. Then, a new data-processing workflow for Kuiyang-ST2000 data was introduced, mainly including floating-datum setting, residual static correction, and phase-based moveout correction, which allows the imaging algorithms of conventional marine seismic data processing to be extended to deep-towed seismic data. We successfully applied the Kuiyang-ST2000 system and this data-processing methodology to a gas hydrate survey of the Qiongdongnan and Shenhu areas in the South China Sea; the results show that the profiles have very high vertical and lateral resolution (0.5 m and 8 m, respectively), providing full and accurate details of gas hydrate-related and geohazard sedimentary and structural features in the South China Sea.
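The polynomial correction of the travel-time-positioned streamer shape can be pictured as fitting a smooth depth profile through the 48 per-channel estimates. The degree and the synthetic profile below are assumptions for illustration, not the system's actual geometry.

```python
import numpy as np

def fit_streamer_shape(offsets, depths, degree=2):
    """Smooth noisy per-channel depth estimates (e.g., from travel-time
    positioning) with a low-order polynomial along the streamer."""
    coeffs = np.polyfit(offsets, depths, degree)
    return np.polyval(coeffs, offsets), coeffs
```

A low polynomial order enforces the physically smooth catenary-like shape of a towed streamer, rejecting channel-to-channel positioning jitter before moveout correction.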
In this paper, I describe the methods that I used for the creation of Xlets, which are Java applets developed for the IDTV environment, and the methods for online data retrieval and processing that I utilized in these Xlets. The themes I chose for the Xlets of the IDTV applications are Earthquake and Tsunami Early Warning; Recent Seismic Activity Report; and Emergency Services. The online data for the Recent Seismic Activity Report application are provided by the Kandilli Observatory and Earthquake Research Institute (KOERI) of Bogazici University in Istanbul, while the online data for the Earthquake and Tsunami Early Warning and the Emergency Services applications are provided by the Godoro website, which I used for storing (and retrieving via the Xlets) the earthquake and tsunami early warning simulation data and the DVB network subscriber data (such as name and address information) utilized in the Emergency Services (Police, Ambulance, and Fire Department) application. I have focused on methodologies for using digital television as an efficient medium to convey timely and useful seismic warning information to the public, which forms the main research topic of this paper.
A comprehensive study of the available data, including 2D seismic data, single-channel seismic data, and shallow sections, reveals that gas hydrates occur in the East China Sea. A series of special techniques are used in the processing of the seismic data, including enhancing the accuracy of velocity analysis and resolution, estimating the wavelet, suppressing multiples, preserving relative amplitude, applying DMO and AVO techniques, and using special techniques for wave impedance. The existence of gas hydrates is reflected in the sub-bottom profiles by the appearance of BSRs, amplitude anomalies, velocity anomalies, and AVO anomalies, so the gas hydrates can be identified and predicted. It is pointed out that the East China Sea is a favorable area for gas hydrate resources, and the Okinawa Trough is a target area for gas hydrate reservoirs.
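Among the listed techniques, AVO analysis has a compact mathematical core: in the two-term Shuey approximation the reflection coefficient varies as R(θ) ≈ R0 + G sin²θ, and the intercept R0 and gradient G are what anomaly screening inspects. The least-squares sketch below is generic background, not the study's workflow.

```python
import numpy as np

def avo_intercept_gradient(angles_deg, reflectivities):
    """Fit the two-term Shuey model R(theta) = R0 + G * sin(theta)**2
    to an angle gather by linear least squares."""
    s2 = np.sin(np.radians(angles_deg)) ** 2
    design = np.column_stack([np.ones_like(s2), s2])
    (r0, g), *_ = np.linalg.lstsq(design, reflectivities, rcond=None)
    return r0, g
```

Anomalous intercept-gradient pairs (for instance, strong negative R0 with negative G in classic Class 3 gas sands) are one of the screening criteria behind the AVO anomalies the abstract cites; interpretation conventions vary by AVO class.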
Nowadays, finding remaining oil has become very urgent under the worldwide oil shortage. However, most simple reservoirs have already been discovered, and those still undiscovered are mostly complex structural, stratigraphic, and lithologic ones. Summarized in this paper is an integrated seismic processing/interpretation technique established on the basis of pre-stack AVO processing and interpretation. Information feedback between the pre-stack and post-stack processes improves the accuracy of data utilization and avoids pitfalls in seismic attributes. Through the integration of seismic data with geologic data, the parameters most essential to describing hydrocarbon characteristics were determined and comprehensively appraised, and the regularities of reservoir generation and distribution were described, so as to accurately appraise reservoirs, delineate favorable traps, and pinpoint well locations.
Through analyzing the needs of seismic data processing and interpretation, a system model based on CSCW (computer-supported cooperative work) is designed. Using CSCW technology to build a cooperative work environment gives field data acquisition the capabilities of remote real-time guidance by experts and remote real-time processing of the data.
Branching river channels and the coexistence of valleys, ridges, hills, and slopes, the result of long-term weathering and erosion, form the unique loess topography. The Changqing Geophysical Company, working in these complex conditions, has established a suite of technologies for high-fidelity processing and fine interpretation of seismic data. This article introduces the processes involved in the data processing and interpretation and illustrates the results.
Increasing the resolution of seismic data has long been a major topic in seismic exploration. Because of high-frequency noise, traditional methods can improve resolution only to a limited extent. To this end, this paper proposes a high-resolution seismic data processing method based on well-seismic combination, after summarizing the research status of high-resolution processing. Synthetic records and seismograms are similar in their effective signals but dissimilar in their noise: the effective signals are regular while the noise is irregular, and the two are similar in adjacent frequency bands. Based on these "three-regularity" characteristics, the relationship between the synthetic record and the seismogram was established using a neural network algorithm. A corresponding extrapolation algorithm was then proposed based on the self-adaptive geological and geophysical variation of a multi-layer network structure. A model was established with this method and theoretical simulation was carried out; the method was tested in terms of frequency-component and amplitude-energy recovery, phase correction, regularity elimination, and stochastic noise. The following results were obtained. First, the new method can extract as much high-frequency information as possible and retain effective middle- and low-frequency information while eliminating noise. Second, the method completely changes the traditional idea of denoising first and then expanding the frequency band, breaking the limitation of traditional methods: it establishes the idea of expanding frequency and denoising simultaneously, increasing resolution to the utmost. Third, the new method has been applied to a variety of reservoir descriptions, and the high-resolution processing results have been improved significantly in precision and accuracy.
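As a toy stand-in for the learned synthetic-to-seismogram relationship, one can measure how the two agree in a shared ("adjacent") frequency band via an average spectral ratio. The band edges and function below are illustrative assumptions; the paper's actual mapping is a multi-layer neural network, not a single scalar.

```python
import numpy as np

def band_spectral_ratio(synthetic, seismogram, fs, band):
    """Mean amplitude-spectrum ratio |Seis(f)| / |Syn(f)| inside a band
    where both signals are assumed reliable."""
    freqs = np.fft.rfftfreq(len(synthetic), d=1.0 / fs)
    syn = np.abs(np.fft.rfft(synthetic))
    seis = np.abs(np.fft.rfft(seismogram))
    mask = (freqs >= band[0]) & (freqs <= band[1]) & (syn > 1e-12)
    return float(np.mean(seis[mask] / syn[mask]))
```

Consistency in the shared band is what licenses extrapolation beyond it: a relationship calibrated where both records carry signal can then be applied to the frequencies only the synthetic record resolves.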
The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for the processing of measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, the mean generation is used to smooth it, and then the stepwise and smooth generations are used to improve the data sequence.
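GM(1,1) itself is compact enough to show in full: accumulate the series, fit the grey differential equation by least squares, then difference the restored series to fill or extend the data. This is a generic textbook implementation with an illustrative test series, not the paper's code.

```python
import numpy as np

def gm11(x0, n_forecast=1):
    """Grey model GM(1,1) for a short positive series: fit and extend."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated generation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # mean (background) sequence
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate(([x1_hat[0]], np.diff(x1_hat)))  # inverse AGO
```

Because the model fits a smooth exponential trend to the accumulated series, it is well suited to bridging short blanks left by removing abnormal measurements, exactly the use the abstract describes.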
文摘While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects,its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging[1-3].Persistent obstacles such as complex 3D survey planning,low signal-tonoise ratio raw data,inadequate near-surface velocity modeling,and imaging inaccuracy have long hindered the advancement of seismic exploration across this region.Through a problem-solving approach rooted in geological target analysis,this research systematically investigates the behavioral patterns of nodal seismometer-based high-density seismic acquisition in loess plateau.Tailored advancements in waveform enhancement and depth velocity modelling methodologies have been engineered.Field validations confirm that the optimized workflow demonstrates marked improvements in amplitude preservation and imaging resolution,offering novel insights for future reservoir characterization endeavors.
基金supported by the National Key R&D Program of China(Nos.2022YFF 0503203 and 2024YFF0809900)the Research Funds of the Institute of Geophysics,China Earthquake Administration(No.DQJB24X28)the National Natural Science Foundation of China(Nos.42474226 and 42441827).
文摘The InSight mission has obtained seismic data from Mars,offering new insights into the planet’s internal structure and seismic activity.However,the raw data released to the public contain various sources of noise,such as ticks and glitches,which hamper further seismological studies.This paper presents step-by-step processing of InSight’s Very Broad Band seismic data,focusing on the suppression and removal of non-seismic noise.The processing stages include tick noise removal,glitch signal suppression,multicomponent synchronization,instrument response correction,and rotation of orthogonal components.The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
基金Supported by the National Natural Science Foundation of China(U24B2031)National Key Research and Development Project(2018YFA0702504)"14th Five-Year Plan"Science and Technology Project of CNOOC(KJGG2022-0201)。
文摘During drilling operations,the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit.This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios.The high-resolution processing of seismic data is divided into three stages:pre-drilling processing,post-drilling correction,and while-drilling updating.By integrating seismic data from different stages,spatial ranges,and frequencies,together with information from drilled wells and while-drilling data,and applying artificial intelligence modeling techniques,a progressive high-resolution processing technology of seismic data based on multi-source information fusion is developed,which performs simple and efficient seismic information updates during drilling.Case studies show that,with the gradual integration of multi-source information,the resolution and accuracy of seismic data are significantly improved,and thin-bed weak reflections are more clearly imaged.The updated seismic information while-drilling demonstrates high value in predicting geological bodies ahead of the drill bit.Validation using logging,mud logging,and drilling engineering data ensures the fidelity of the processing results of high-resolution seismic data.This provides clearer and more accurate stratigraphic information for drilling operations,enhancing both drilling safety and efficiency.
基金funded by the Joint Project of Industry-University-Research of Jiangsu Province(Grant:BY20231146).
Abstract: With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed stream-processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response-time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed stream-processing systems. Additionally, a generic service-flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service-flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 yields a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
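The diminishing returns reported above follow the shape predicted by elementary queueing theory. As a hedged illustration only (a toy M/M/1 model, not the paper's PEPA formulation; the arrival rate of 9 tasks/s is an assumed workload), raising a component's service rate far beyond the arrival rate shrinks mean response time less and less:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: T = 1 / (mu - lambda)."""
    if service_rate <= arrival_rate:
        raise ValueError("unstable queue: service rate must exceed arrival rate")
    return 1.0 / (service_rate - arrival_rate)

arrival = 9.0  # tasks per second (hypothetical workload, not from the paper)
for mu in (10, 100, 200, 500):
    # Most of the gain comes from the first increase; later ones barely help.
    print(f"service rate {mu:3d}: mean response time {mm1_response_time(arrival, mu):.4f} s")
```

The same saturation effect motivates the paper's conclusion that component efficiency and resource management must be co-optimized rather than scaled blindly.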
Abstract: A new method is introduced to suppress noise in seismic data processing. Exploiting the subtle difference in shape between the noise and the actual signal, we introduce morphologic filtering into seismic data processing. Judged by both waveform shape and S/N, the effect of morphologic filtering is superior to methods such as median filtering and neighborhood-average filtering: the SNR after morphological filtering is comparatively high, the precision of the filtered seismic data is high, and the characteristics of the actual signal, such as frequency and amplitude, are preserved. We give an example of real seismic data processed with morphological filtering, in which the actual signal is retained while random high-intensity noise is removed.
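The shape-based idea above can be sketched with grey-scale morphology on a single trace. This is a minimal 1-D illustration, not the paper's implementation: an opening removes narrow positive spikes, a subsequent closing removes narrow negative ones, and the window size (a tuning choice) must be wider than the noise spikes but narrower than the signal's dominant period.

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def morph_filter(trace, size=5):
    """Suppress narrow spikes: opening removes positive spikes,
    the following closing removes negative ones."""
    return grey_closing(grey_opening(trace, size=size), size=size)

# Hypothetical trace: a smooth 5 Hz signal plus two isolated noise spikes.
t = np.linspace(0.0, 1.0, 200)
trace = np.sin(2 * np.pi * 5 * t)
trace[50] += 4.0   # positive spike
trace[120] -= 4.0  # negative spike
clean = morph_filter(trace)
```

Because the structuring element is shorter than the signal period, the sinusoid passes nearly unchanged while both spikes are flattened toward the local signal level.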
Fund: supported by the Start-up Fund from Hainan University (No. KYQD(ZR)-20077).
Abstract: Three-dimensional (3D) single-molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. The recently reported FD-DeepLoc algorithm, the state-of-the-art deep-learning-based 3D super-resolution localization algorithm, still falls short of the goal of online image processing, even though it has greatly improved data-processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image-processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model and combines it with pipeline programming to accelerate the inference process of the deep-learning model. Results on simulated data show that Lite-FD-DeepLoc processes images about twice as fast as FD-DeepLoc with a slight decrease in localization accuracy, enabling real-time processing of 256×256-pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle-point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of the FD-DeepLoc algorithm.
Abstract: In China, most oil fields are of continental sedimentation with strong heterogeneity, which on the one hand makes reservoir prospecting and development more difficult, but on the other hand provides more scope for finding residual oil in mature fields. Time-lapse seismic reservoir monitoring is one of the most important techniques for defining residual oil distribution. According to the demand for, and state of development of, time-lapse seismic reservoir monitoring in China, time-lapse seismic data processing for purposeless (non-purpose-designed) repeated acquisition was studied. The four key steps in such processing, including amplitude-preserved processing with relative consistency, rebinning, match filtering, and difference calculation, were analyzed by combining theory with real seismic data processing. Quality control during real time-lapse seismic processing was also emphasized.
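Of the four steps, match filtering is the most algorithmic. A minimal least-squares (Wiener) shaping-filter sketch is given below; this is the generic textbook formulation, not the authors' implementation, and the trace shapes and filter length are illustrative assumptions.

```python
import numpy as np

def match_filter(base, monitor, nf=11):
    """Design a causal least-squares (Wiener) filter that shapes the monitor
    trace toward the base trace, then apply it. Filter length nf is a tuning
    choice."""
    n = len(monitor)
    # Columns of A are the monitor trace delayed by 0 .. nf-1 samples.
    A = np.stack([np.concatenate([np.zeros(k), monitor[:n - k]])
                  for k in range(nf)], axis=1)
    f = np.linalg.lstsq(A, base, rcond=None)[0]
    return np.convolve(monitor, f)[:n]

# Hypothetical survey pair: the monitor wavelet is scaled and leads the base
# wavelet by 2 samples, mimicking a non-repeatable acquisition difference.
i = np.arange(64, dtype=float)
wavelet = np.exp(-((i - 20.0) / 4.0) ** 2)
base = np.concatenate([np.zeros(2), wavelet[:-2]])
monitor = 0.8 * wavelet
matched = match_filter(base, monitor)
```

After matching, the monitor-minus-base difference section carries only genuine reservoir changes rather than acquisition mismatches, which is the point of the step.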
基金supported by the National Nature Science Foundation of China(12363010)supported by the Guizhou Provincial Basic Research Program(Natural Science)(ZK[2023]039)the Key Technology R&D Program([2023]352).
Abstract: Previous studies aiming to accelerate data processing have focused on enhancement algorithms, on using the graphics processing unit (GPU) to speed up programs, and on thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the “Srun job submission mode”, the “Sbatch job submission mode”, and a “Monitor function”. The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. The average CPU utilization of this scheme is 89%, whereas the “Srun job submission mode” and “Sbatch job submission mode” average significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the “Srun job submission mode” and 14 h in the “Sbatch job submission mode”. In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
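The core "command splitting and parallel execution" idea can be sketched without a cluster at all. The following is a minimal stand-in using plain subprocess calls; the real SSM scheme drives Slurm's srun/sbatch and monitors jobs on a supercomputer, none of which is modeled here, and the echo commands are placeholders for real processing commands.

```python
import concurrent.futures
import subprocess

def run_split_commands(commands, max_workers=4):
    """Run independent shell commands in parallel and collect return codes,
    keeping the CPU busy instead of executing the commands one by one."""
    def run(cmd):
        return cmd, subprocess.run(cmd, shell=True).returncode
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(run, commands))

# Hypothetical split: one search divided into 4 independent sub-commands.
cmds = [f"echo processing chunk {i}" for i in range(4)]
results = run_split_commands(cmds)
```

Splitting one long serial command into independent chunks and dispatching them concurrently is what raises the average CPU utilization figure quoted above.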
Fund: the National Natural Science Foundation of China (Grant Nos. 52308403 and 52079068), the Yunlong Lake Laboratory of Deep Underground Science and Engineering (No. 104023005), and the China Postdoctoral Science Foundation (Grant No. 2023M731998).
Abstract: The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development of the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. The test apparatuses and development trends for UCS measurement are then summarized, followed by a discussion of rock specimens for the test apparatuses and of data-processing methods. Next, recommendations are given for selecting a UCS measurement method. The review reveals that the rock failure mechanisms in UCS testing can be divided into compression-shear, compression-tension, composite failure modes, and no obvious failure mode. The trend in apparatus design is toward automation, digitization, precision, and multi-modal testing. Two size-correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size; the other uses a standard specimen to calculate a size-correction factor. Three to five input parameters are commonly utilized in soft-computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thus gain a comprehensive understanding of UCS testing methods and their potential development in various rock engineering endeavors.
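Of the two size-correction routes the review mentions, the "standard specimen" one is commonly implemented with the Hoek-Brown relation sigma_c50 = sigma_cd * (d/50)^0.18. A small sketch follows; the exponent 0.18 is the widely quoted Hoek-Brown value, an assumption here rather than a number taken from this paper.

```python
def ucs_size_corrected(ucs_d, diameter_mm):
    """Convert a UCS measured on a cylinder of the given diameter (mm) to the
    equivalent 50 mm standard-specimen value:
    sigma_c50 = sigma_cd * (d / 50) ** 0.18 (Hoek-Brown form)."""
    return ucs_d * (diameter_mm / 50.0) ** 0.18

# A 100 mm specimen measuring 90 MPa maps to a higher 50 mm equivalent,
# reflecting the size effect (larger specimens test weaker).
sigma_c50 = ucs_size_corrected(90.0, 100.0)
```

The correction is the identity at d = 50 mm, so results from standard specimens pass through unchanged.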
Fund: supported by the National Natural Science Foundation of China (NSFC, 12173012 and 12473050), the Guangdong Natural Science Funds for Distinguished Young Scholars (2023B1515020049), the Shenzhen Science and Technology Project (JCYJ20240813104805008), the Shenzhen Key Laboratory Launching Project (No. ZDSYS20210702140800001), and the Specialized Research Fund for the State Key Laboratory of Solar Activity and Space Weather.
Abstract: The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data-processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric-effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data-processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and in quiet-Sun regions. The package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. It can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
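Sub-pixel fine alignment of this kind is typically built on correlation-peak refinement. Below is a minimal 1-D stand-in using a parabolic three-point fit around the cross-correlation maximum; GT-scopy's actual 2-D algorithm is more elaborate and is not reproduced here, and the Gaussian test profiles are illustrative assumptions.

```python
import numpy as np

def subpixel_shift_1d(a, b):
    """Estimate how far b is shifted relative to a (positive = to the right)
    by locating the cross-correlation peak and refining it with a parabolic
    (three-point) fit, giving sub-sample precision."""
    c = np.correlate(b, a, mode="full")     # lag axis runs -(n-1) .. n-1
    k = int(np.argmax(c))
    if 0 < k < len(c) - 1:
        denom = c[k - 1] - 2.0 * c[k] + c[k + 1]
        if denom != 0:
            k = k + 0.5 * (c[k - 1] - c[k + 1]) / denom
    return k - (len(a) - 1)
```

The parabolic fit interpolates the peak position between integer lags, which is what "alignment at the sub-pixel level" amounts to in one dimension.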
Fund: supported by the Foundation of the Institute of Geology, Chinese Academy of Geological Sciences (No. J1315), the 3D Geological Mapping Project (No. D1204), and the SinoProbe-02 project of China.
Abstract: Statics corrections are a major challenge in the processing of deep reflection seismic data. In this paper, several different statics solutions were implemented in the processing of deep reflection seismic data from South China, and their results were compared in order to find proper statics solutions. Either statics based on tomographic principles or a combination of the low-frequency components of field statics with the high-frequency components of refraction statics can provide reasonable solutions for deep reflection seismic data in South China, where the surface topography is very rugged; both approaches correct statics anomalies of long and short spatial wavelengths. Surface-consistent residual static corrections serve as a good complement to these first-pass statics solutions. Proper statics solutions improve both the quality and the resolution of seismic sections, especially for Moho reflections from the uppermost mantle.
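The combination strategy described above amounts to a spatial-wavelength split between the two statics estimates. A hedged sketch follows: a simple moving average stands in for the wavelength cutoff, and the window length is an illustrative tuning choice, not a value from the paper.

```python
import numpy as np

def combine_statics(field_statics, refraction_statics, window=21):
    """Blend the low-spatial-frequency part of field (elevation) statics with
    the high-spatial-frequency part of refraction statics along a line of
    stations, as in the combined statics strategy."""
    kernel = np.ones(window) / window
    low = np.convolve(field_statics, kernel, mode="same")       # long wavelengths
    high = refraction_statics - np.convolve(refraction_statics, kernel, mode="same")
    return low + high
```

Field statics are trusted for the long-wavelength trend (elevation is well surveyed), while refraction statics resolve the short-wavelength near-surface anomalies; the split keeps the reliable part of each.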
Abstract: A novel technique for automatic seismic data processing using both integral and local features of seismograms is presented in this paper. Here, the term "integral feature" refers to features that depict the shape of the whole seismogram. Unlike some previous efforts that completely abandon the DIAL approach (signal detection, phase identification, association, and event localization) and seek instead to detect seismic events directly via envelope cross-correlation, our technique keeps following the DIAL approach; in addition to detecting signals corresponding to individual seismic phases, it also detects continuous wave-trains and exploits their features for phase-type identification and signal association. More concrete ideas on how to define wave-trains and combine them with various detections, and on how to measure and utilize their features in seismic data processing, are explained in the paper. This approach has been applied to our routine data processing for years, and test results for a 16-day period using data from the Xinjiang seismic station network are presented. The automatic processing results achieve fairly low false-alarm and missed-event rates simultaneously, showing that the new technique has good prospects for improving automatic seismic data processing.
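The "signal detection" stage of a DIAL-style pipeline is classically an STA/LTA trigger. The following is a generic sketch of that detector, not the authors' wave-train feature code; the window lengths and the synthetic trace are illustrative assumptions.

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Short-term over long-term average of the rectified trace: the ratio
    rises sharply when a transient signal enters the short window."""
    env = np.abs(trace)
    sta = np.convolve(env, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(env, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)

# Hypothetical trace: low-level noise with a signal burst in the middle.
rng = np.random.default_rng(0)
trace = 0.05 * rng.standard_normal(1000)
trace[500:540] += np.sin(np.linspace(0.0, 20.0 * np.pi, 40))
ratio = sta_lta(trace, nsta=10, nlta=200)
```

A detection is declared wherever the ratio exceeds a threshold; the paper's contribution is what happens after this stage, when detections are grouped into wave-trains whose features drive phase identification and association.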
Fund: supported by the National Key R&D Program of China (No. 2016YFC0303900), the Laoshan Laboratory (Nos. MGQNLM-KF201807, LSKJ202203604), and the National Natural Science Foundation of China (No. 42106072).
Abstract: The Kuiyang-ST2000 deep-towed high-resolution multichannel seismic system was designed by the First Institute of Oceanography, Ministry of Natural Resources (FIO, MNR). The system is mainly composed of a plasma spark source (source level: 216 dB; main frequency: 750 Hz; frequency bandwidth: 150-1200 Hz) and a towed hydrophone streamer with 48 channels. Because the source and the towed hydrophone streamer move constantly according to the towing configuration, accurately positioning the hydrophone array and applying moveout correction to deep-towed multichannel seismic data before imaging are challenging. First, according to the characteristics of the system and the streamer shape in deep water, a travel-time positioning method was used to construct the hydrophone streamer shape, and the results were corrected using polynomial curve fitting. Then, a new data-processing workflow for Kuiyang-ST2000 data was introduced, mainly comprising floating-datum setting, residual static correction, and phase-based moveout correction, which allows the imaging algorithms of conventional marine seismic data processing to be extended to deep-towed seismic data. We successfully applied the Kuiyang-ST2000 system and this processing methodology to a gas hydrate survey of the Qiongdongnan and Shenhu areas in the South China Sea; the results show that the profiles have very high vertical and lateral resolution (0.5 m and 8 m, respectively) and provide full, accurate details of gas-hydrate-related and geohazard sedimentary and structural features in the South China Sea.
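For comparison with conventional processing, the textbook hyperbolic moveout correction of a single trace can be sketched as follows. This is the generic NMO formulation; the phase-based moveout correction developed for the deep-towed Kuiyang-ST2000 geometry is more involved, and the sampling interval, offset, and velocity below are illustrative values.

```python
import numpy as np

def nmo_correct(trace, dt, offset, velocity):
    """Hyperbolic normal-moveout correction of a single trace: the output
    sample at zero-offset time t0 is read from the input at
    t = sqrt(t0^2 + (offset / velocity)^2). Samples mapped past the end of
    the trace are held at the last value in this sketch."""
    k = np.arange(len(trace), dtype=float)                # sample index = t0 / dt
    idx = np.sqrt(k ** 2 + (offset / (velocity * dt)) ** 2)
    return np.interp(idx, k, trace)
```

At zero offset the mapping is the identity, a useful sanity check; in the deep-towed case the source and receivers sit near the seafloor, which is why the conventional surface-geometry hyperbola must be replaced.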
Abstract: In this paper, I describe the methods I used to create Xlets, Java applets developed for the IDTV environment, and the methods for online data retrieval and processing that I utilized in these Xlets. The themes I chose for the Xlets of the IDTV applications are Earthquake and Tsunami Early Warning; Recent Seismic Activity Report; and Emergency Services. The online data for the Recent Seismic Activity Report application are provided by the Kandilli Observatory and Earthquake Research Institute (KOERI) of Bogazici University in Istanbul, while the online data for the Earthquake and Tsunami Early Warning and the Emergency Services applications are provided by the Godoro website, which I used for storing (and retrieving via the Xlets) the earthquake and tsunami early-warning simulation data and the DVB network subscriber data (such as name and address information) used in the Emergency Services (Police, Ambulance and Fire Department) application. I have focused on methodologies for using digital television as an efficient medium to convey timely and useful seismic warning information to the public, which forms the main research topic of this paper.
Abstract: A comprehensive study of the data profiles, including 2D seismic data, single-channel seismic data, and shallow sections, reveals that gas hydrates occur in the East China Sea. A series of special techniques is used in processing the seismic data, including enhancing the accuracy of velocity analysis and the resolution, estimating the wavelet, suppressing multiples, preserving relative amplitude, applying the DMO and AVO techniques, and special techniques for wave-impedance processing. The existence of gas hydrates is reflected in the sub-bottom profiles by the appearance of BSRs, amplitude anomalies, velocity anomalies, AVO anomalies, and so on; hence gas hydrates can be identified and predicted. It is pointed out that the East China Sea is a favorable area for gas hydrate resources, and the Okinawa Trough is a target area for gas hydrate reservoirs.
Abstract: Finding remaining oil has become very urgent under the worldwide oil shortage. However, most simple reservoirs have already been discovered, and those still undiscovered are mostly complex structural, stratigraphic, and lithologic ones. Summarized in this paper is an integrated seismic processing/interpretation technique established on the basis of pre-stack AVO processing and interpretation. Information feedback between the pre-stack and post-stack processes improves the accuracy of data utilization and avoids pitfalls in seismic attributes. Through the integration of seismic data with geologic data, the parameters most essential to describing hydrocarbon characteristics were determined and comprehensively appraised, and the regularities of reservoir generation and distribution were described, so as to accurately appraise reservoirs, delineate favorable traps, and pinpoint well locations.
Abstract: Through analyzing the needs of seismic data processing and interpretation, a system model based on CSCW (computer-supported cooperative work) is designed. Using CSCW technology to build a cooperative work environment gives field data acquisition the capabilities of remote real-time guidance by experts and remote real-time processing of the data.
Abstract: Branching river channels and the coexistence of valleys, ridges, hills, and slopes, the result of long-term weathering and erosion, form the unique loess topography. Working in these complex conditions, the Changqing Geophysical Company has established a suite of technologies for high-fidelity processing and fine interpretation of seismic data. This article introduces the processes involved in the data processing and interpretation and illustrates the results.
Abstract: Increasing the resolution of seismic data has long been a major topic in seismic exploration. Because of high-frequency noise, traditional methods can improve resolution only to a limited extent. To this end, after summarizing the research status of high-resolution processing, this paper proposes a new high-resolution seismic data processing method based on well-seismic combination. Synthetic records and seismograms are similar in their effective signals but dissimilar in their noise: the effective signals are regular while the noise is irregular, and the two are similar in adjacent frequency bands. Based on these "three-regularity" characteristics, the relationship between synthetic record and seismogram was established using a neural network algorithm. A corresponding extrapolation algorithm was then proposed based on the self-adaptive geological and geophysical variation of a multi-layer network structure. A model was established with this method, theoretical simulation was carried out, and the method was tested in terms of frequency-component and amplitude-energy recovery, phase correction, regularity elimination, and stochastic noise. The following results were obtained. First, the new method extracts as much high-frequency information as possible and retains the effective middle- and low-frequency information while eliminating noise. Second, it completely reverses the traditional idea of denoising first and then extending the frequency band, breaking the limitation of traditional methods: it establishes the idea of extending the frequency band and denoising simultaneously, increasing resolution to the utmost. Third, the new method has been applied to a variety of reservoir descriptions, and the precision and accuracy of the high-resolution processing results have improved significantly.
Abstract: The processing of measured data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for processing measured data in reverse engineering. Measured data usually contain some abnormalities; when the abnormal data are eliminated by filtering, blanks are created. Grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, mean generation is used to smooth it, and then stepwise and smooth generations are used to improve the data sequence.
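The GM(1,1) step referred to above is well defined in grey system theory. A compact sketch of the standard textbook formulation follows; the short geometric test series is an assumed example, not data from the paper.

```python
import numpy as np

def gm11_fit(x0, n_forecast=1):
    """Grey GM(1,1) model: fit the whitened equation dx1/dt + a*x1 = b to the
    accumulated series x1 = cumsum(x0), then reconstruct and forecast the
    original series. Used here to generate values for blanks left by
    filtering out abnormal data."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                           # accumulated generation
    z1 = 0.5 * (x1[1:] + x1[:-1])                # mean generation of x1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_forecast)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
```

Because GM(1,1) assumes near-exponential behavior of the accumulated series, it reproduces smooth trends from very few points, which is exactly what filling short blanks in a measured sequence requires.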