While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging [1-3]. Persistent obstacles such as complex 3D survey planning, low signal-to-noise ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavioral patterns of nodal seismometer-based high-density seismic acquisition in the loess plateau. Tailored advancements in waveform enhancement and depth velocity modelling methodologies have been engineered. Field validations confirm that the optimized workflow demonstrates marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization endeavors.
The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
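As an illustration of the final stage above, the rotation of orthogonal components can be sketched as an ObsPy-style back-azimuth rotation of the North/East traces into Radial/Transverse; the function name and sign convention here are illustrative assumptions, not the InSight pipeline's actual code.

```python
import numpy as np

def rotate_ne_to_rt(n, e, back_azimuth_deg):
    """Rotate North/East components to Radial/Transverse.

    Convention (as in ObsPy's rotate_ne_rt): R points from the station
    toward the event along the back-azimuth, T is 90 degrees clockwise of R.
    """
    ba = np.deg2rad(back_azimuth_deg)
    r = -n * np.cos(ba) - e * np.sin(ba)
    t = n * np.sin(ba) - e * np.cos(ba)
    return r, t
```

For a back-azimuth of 0 degrees (event due north), the radial component is simply the negated north trace and the transverse is the negated east trace.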
False Data Injection Attacks (FDIAs) pose a critical security threat to modern power grids, corrupting state estimation and enabling malicious control actions that can lead to severe consequences, including cascading failures, large-scale blackouts, and significant economic losses. While detecting attacks is important, accurately localizing compromised nodes or measurements is even more critical, as it enables timely mitigation, targeted response, and enhanced system resilience beyond what detection alone can offer. Existing research typically models topological features using fixed structures, which can introduce irrelevant information and affect the effectiveness of feature extraction. To address this limitation, this paper proposes an FDIA localization model with adaptive neighborhood selection, which dynamically captures spatial dependencies of the power grid by adjusting node relationships based on data-driven similarities. An improved Transformer is employed to pre-fuse global spatial features of the graph, enriching the feature representation. To improve spatio-temporal correlation extraction for FDIA localization, the proposed model employs dilated causal convolution with a gating mechanism, combined with graph convolution, to capture and fuse long-range temporal features and adaptive topological features. This fully exploits the temporal dynamics and spatial dependencies inherent in the power grid. Finally, multi-source information is integrated to generate highly robust node embeddings, enhancing FDIA detection and localization. Experiments are conducted on the IEEE 14-, 57-, and 118-bus systems, and the results demonstrate that the proposed model substantially improves the accuracy of FDIA localization. Additional experiments verify the effectiveness and robustness of the proposed model.
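The gated dilated causal convolution at the heart of the temporal branch can be sketched in a single-channel NumPy form: the output at time t only sees inputs at t, t-d, t-2d, ..., and a WaveNet-style tanh/sigmoid gate modulates the filtered signal. This is a minimal illustration of the mechanism, not the authors' multi-channel implementation.

```python
import numpy as np

def dilated_causal_conv(x, w, dilation):
    """1-D causal convolution: output at t depends only on x[t - j*dilation]."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so no future leaks in
    return np.array([sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
                     for t in range(len(x))])

def gated_activation(x, w_filter, w_gate, dilation):
    """WaveNet-style gate: tanh(filter branch) * sigmoid(gate branch)."""
    f = dilated_causal_conv(x, w_filter, dilation)
    g = dilated_causal_conv(x, w_gate, dilation)
    return np.tanh(f) * (1.0 / (1.0 + np.exp(-g)))
```

An impulse fed through a two-tap kernel with dilation 2 produces responses at lags 0 and 2 only, confirming causality and the dilated receptive field.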
With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 results in a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
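PEPA models compile to continuous-time Markov chains from which such metrics are derived. The flavor of the computed quantities (response time, throughput, utilization, and diminishing returns as service rates grow) can be illustrated with the closed-form M/M/c queue below; this is a textbook stand-in, not the paper's service flow generation algorithm, and the rates are hypothetical.

```python
import math

def mmc_metrics(arrival_rate, service_rate, servers):
    """Steady-state metrics of an M/M/c queue via the Erlang C formula."""
    lam, mu, c = arrival_rate, service_rate, servers
    rho = lam / (c * mu)                  # per-server utilization
    assert rho < 1, "system must be stable"
    a = lam / mu                          # offered load in Erlangs
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    p_wait = (a**c / (math.factorial(c) * (1 - rho))) * p0  # Erlang C
    wq = p_wait / (c * mu - lam)          # mean waiting time in queue
    return {"response_time": wq + 1.0 / mu,
            "throughput": lam,
            "utilization": rho}
```

With one server this reduces to the familiar M/M/1 response time 1/(mu - lam), and doubling the service rate shows the sub-linear response-time gains noted in the abstract.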
Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. The recently reported FD-DeepLoc algorithm, the state-of-the-art deep-learning-based 3D super-resolution localization algorithm, has greatly improved data processing throughput but still falls short of the goal of online image processing. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model and combines it with pipeline programming to accelerate the inference process of the deep learning model. Simulated data processing results show that Lite-FD-DeepLoc is about twice as fast as FD-DeepLoc with a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
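The pipeline-programming idea, overlapping data preparation with model inference so neither stage sits idle, can be sketched as a two-stage producer/consumer; the stage functions below are placeholders, not Lite-FD-DeepLoc's actual preprocessing or network kernels.

```python
import queue
import threading

def run_pipeline(frames, preprocess, infer, depth=4):
    """Overlap preprocessing (producer thread) and inference (consumer),
    so the inference stage never waits for the next frame to be prepared."""
    q = queue.Queue(maxsize=depth)  # bounded buffer between the two stages
    results = []

    def producer():
        for f in frames:
            q.put(preprocess(f))
        q.put(None)  # sentinel: no more frames

    t = threading.Thread(target=producer)
    t.start()
    while True:
        item = q.get()
        if item is None:
            break
        results.append(infer(item))
    t.join()
    return results
```

Because the bounded queue decouples the stages, a slow camera read and a fast GPU inference (or vice versa) proceed concurrently instead of strictly in sequence.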
Previous studies aiming to accelerate data processing have focused on enhancement algorithms, using the graphics processing unit (GPU) to speed up programs, and thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines the “Srun job submission mode”, the “Sbatch job submission mode”, and a “Monitor function”. The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value of this scheme is 89%, whereas the average CPU utilizations of the “Srun job submission mode” and the “Sbatch job submission mode” are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the “Srun job submission mode” and 14 h in the “Sbatch job submission mode”. In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme for pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
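SSM's core idea, splitting one long job into independent per-chunk commands and running them concurrently so allocated CPU slots stay busy, can be sketched as below; the real scheme drives Slurm's srun/sbatch on a cluster rather than an in-process worker pool, so this is only an illustration of the splitting/parallelism principle.

```python
from concurrent.futures import ThreadPoolExecutor

def split_and_run(items, worker, n_parallel=4):
    """Split a long job into per-item tasks and execute them concurrently,
    returning results in the original order (the command-splitting idea)."""
    with ThreadPoolExecutor(max_workers=n_parallel) as pool:
        return list(pool.map(worker, items))
```

In the cluster setting, `worker` would format and submit one srun/sbatch command per data chunk, and a monitor function would poll job states instead of blocking on `pool.map`.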
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status of and developments in the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion of rock specimens for the test apparatuses and of data processing methods. Next, recommendations are given for selecting a method for UCS measurement. The review reveals that the rock failure mechanisms in the UCS testing methods can be divided into compression-shear, compression-tension, composite failure mode, and no obvious failure mode. The apparatuses are trending toward automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used: one develops an empirical correlation between the measured indices and the specimen size; the other uses a standard specimen to calculate a size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can gain a comprehensive understanding of the UCS testing methods and their potential developments in various rock engineering endeavors.
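The first size correction strategy, an empirical correlation between measured strength and specimen size, is often written in the widely cited Hoek-Brown form sigma_cd = sigma_c50 * (50/d)^0.18; a sketch of the corresponding conversion to the 50 mm reference diameter follows. The exponent 0.18 is the commonly quoted value, not necessarily the one any particular study adopts.

```python
def ucs_to_50mm(ucs_measured, diameter_mm, exponent=0.18):
    """Convert a UCS measured on a d-mm-diameter specimen to its 50 mm
    equivalent, using the Hoek-Brown empirical size relation
    sigma_cd = sigma_c50 * (50 / d) ** exponent."""
    return ucs_measured * (diameter_mm / 50.0) ** exponent
```

A 50 mm specimen is returned unchanged, and smaller specimens (which test stronger) are corrected downward, consistent with the usual size effect.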
The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and in quiet-Sun regions. The data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. The package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
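Sub-pixel alignment of the kind described can be illustrated in one dimension with phase correlation refined by a parabolic fit around the correlation peak; this sketches the principle only and is not GT-scopy's actual algorithm.

```python
import numpy as np

def subpixel_shift(ref, img):
    """Estimate the 1-D shift of img relative to ref via phase correlation,
    refined to sub-pixel precision with a parabolic fit at the peak."""
    F = np.conj(np.fft.fft(ref)) * np.fft.fft(img)
    F /= np.abs(F) + 1e-12                    # normalized cross-power spectrum
    corr = np.real(np.fft.ifft(F))
    k = int(np.argmax(corr))
    y0, y1, y2 = corr[k - 1], corr[k], corr[(k + 1) % len(corr)]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # parabola vertex offset
    shift = k + delta
    if shift > len(ref) / 2:                  # unwrap negative shifts
        shift -= len(ref)
    return shift
```

In practice the same idea is applied in 2-D, tile by tile, so local atmospheric distortions can be measured and removed before the fine co-alignment step.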
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
This paper introduces techniques in Gaussian process regression modeling for spatio-temporal data collected from complex systems. The study focuses on extracting local structures and then constructing surrogate models based on Gaussian process assumptions. The proposed Dynamic Gaussian Process Regression (DGPR) consists of a sequence of local surrogate models related to each other. In DGPR, time-based spatial clustering is carried out to divide the system into sub-spatio-temporal parts whose interiors have similar variation patterns, with the temporal information used as prior information for training the spatial surrogate model. DGPR is robust, especially suitable for loosely coupled model structures, and amenable to parallel computation. Numerical results on a test function show the effectiveness of DGPR. Furthermore, the shock tube problem is successfully approximated under different levels of phenomenon complexity.
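Each local surrogate rests on standard Gaussian process regression, whose posterior mean takes only a few lines; the sketch below is a generic RBF-kernel GP for 1-D inputs with illustrative hyperparameters, not the clustered DGPR machinery itself.

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, noise=1e-6):
    """Posterior mean of Gaussian process regression with an RBF kernel:
    mean(x*) = K(x*, X) @ (K(X, X) + noise*I)^-1 @ y."""
    def rbf(A, B):
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / length ** 2)

    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)       # (K + noise*I)^-1 y
    return rbf(X_test, X_train) @ alpha
```

Because each cluster in DGPR fits its own small surrogate of this form, the cubic cost of the kernel solve stays local and the clusters can be trained in parallel.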
The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we propose methods for the processing of measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. The grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, the mean generation is used to smooth it, and then the stepwise and smooth generations are used to improve the data sequence.
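The GM(1,1) model mentioned above can be sketched in its textbook form: accumulate the series (AGO), fit the whitening equation dx1/dt + a*x1 = b by least squares on the mean-generated sequence, then difference the fitted cumulative curve to fill or forecast values. The paper's variant may differ in detail.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a GM(1,1) grey model to sequence x0 and forecast `steps` ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                         # accumulated generation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])              # mean generation of x1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # x0[k] = -a*z1[k] + b
    n = len(x0)
    x1_hat = lambda k: (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.array([x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)])
```

On a near-geometric sequence the model recovers the growth ratio almost exactly, which is why it works well for the short, smooth gap-filling sequences that arise after filtering.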
To solve the query processing correctness problem for semantic-based relational data integration, the semantics of SPARQL (Simple Protocol and RDF Query Language) queries is defined. In the course of query rewriting, all relevant tables are found and decomposed into minimal connectable units. Minimal connectable units are joined according to semantic queries to produce semantically correct query plans. Algorithms for query rewriting and transformation are presented, and their computational complexity is discussed. In the worst case, the query decomposition algorithm finishes in O(n²) time and the query rewriting algorithm requires O(nm) time. The performance of the algorithms is verified by experiments, and the results show that when the length of a query is less than 8, the query processing algorithms provide satisfactory performance.
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, it is difficult to use traditional probability theory to process the samples and assess the degree of uncertainty. Using grey relational theory and norm theory, the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples, is proposed in this article. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, with their characteristics and algorithms, are introduced. The related problems, including the algorithm for the estimated value, the standard deviation, and the acceptance and rejection criteria for the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare different samples. Several examples are given to demonstrate the application of the proposed approach. The examples show that the proposed approach, which makes no demand on the probability distribution of small samples, is feasible and effective.
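The grey distance quantities above build on grey relational analysis. The classical Deng grey relational coefficient and grade, sketched below, give the flavor of how closeness between small-sample sequences is scored without any distributional assumption; this is the standard formulation, not the article's specific grey distance information quantity.

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Deng's grey relational grade of each candidate sequence with respect
    to a reference sequence; rho is the distinguishing coefficient."""
    ref = np.asarray(reference, float)
    cands = np.asarray(candidates, float)
    delta = np.abs(cands - ref)                       # pointwise distances
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)                         # grade = mean coefficient
```

A candidate identical to the reference scores the maximum grade of 1, and grades fall as sequences diverge, so candidates can be ranked with no probability model.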
High-resolution vehicular emission inventories are important for managing vehicular pollution and improving urban air quality. This study developed a vehicular emission inventory with high spatio-temporal resolution for the main urban area of Chongqing, based on real-time traffic data from 820 RFID detectors covering 454 roads, and analysed the differences in spatio-temporal emission characteristics between the inner and outer districts. The results showed that the daily vehicular emission intensities of CO, hydrocarbons, PM2.5, PM10, and NO_(x) were 30.24, 3.83, 0.18, 0.20, and 8.65 kg/km per day, respectively, in the study area during 2018. The pollutant emission intensities in the inner district were higher than those in the outer district. Light passenger cars (LPCs) were the main contributors to all-day CO emissions in both the inner and outer districts, while the contributors to NO_(x) emissions differed between them. Diesel and natural gas buses were the major contributors to daytime NO_(x) emissions in the inner district, accounting for 40.40%, but buses and heavy-duty trucks (HDTs) were the major contributors in the outer district. At nighttime, owing to the lifting of truck restrictions and the suspension of buses, HDTs became the main NO_(x) contributor in both the inner and outer districts, with three NO_(x) emission peak hours that differ from the peak hours of total NO_(x) emissions from all vehicles. Unlike in most other cities, bridges and connecting channels are persistent emission hotspots due to long-lasting traffic congestion. This knowledge helps give a full understanding of vehicular emission characteristics and is useful for policymakers in designing precise prevention and control measures.
This study combines the Defense Meteorological Satellite Program Operational Linescan System (DMSP-OLS) nighttime light data and the Visible Infrared Imaging Radiometer Suite (VIIRS) nighttime light data into a “synthetic DMSP” dataset covering 1992 to 2020, to retrieve the spatio-temporal variations in energy-related carbon emissions in Xinjiang, China. The paper then analyzes several factors influencing the spatial differentiation of carbon emissions in Xinjiang using the geographical detector technique. Results reveal that (1) total carbon emissions continued to grow, while the growth rate slowed down in the past five years. (2) Large regional differences exist in total carbon emissions; in descending order, they are the northern slope of the Tianshan (Mountains) > the southern slope of the Tianshan > the three prefectures in southern Xinjiang > the northern part of Xinjiang. (3) Economic growth, population size, and energy consumption intensity are the most important factors in the spatial differentiation of carbon emissions. The interaction between economic growth and population size, as well as between economic growth and energy consumption intensity, also enhances the explanatory power for the spatial differentiation of carbon emissions. This paper aims to help formulate differentiated carbon reduction targets and strategies for cities in different economic development stages and with different carbon intensities, so as to achieve carbon peak goals in a stepwise manner.
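The geographical detector quantifies a factor's explanatory power with the q statistic, q = 1 - SSW/SST, where SSW is the within-stratum variance sum and SST the total variance; q = 1 means the factor fully explains the spatial pattern. A minimal sketch (the stratum labels and values are toy data, not the Xinjiang dataset):

```python
import numpy as np

def factor_detector_q(values, strata):
    """Geographical-detector q statistic: the share of the spatial variance
    of `values` explained by the categorical factor `strata` (0 <= q <= 1)."""
    values = np.asarray(values, float)
    strata = np.asarray(strata)
    sst = len(values) * values.var()        # total sum of squares
    ssw = 0.0
    for s in np.unique(strata):
        group = values[strata == s]
        ssw += len(group) * group.var()     # within-stratum sum of squares
    return 1.0 - ssw / sst
```

Interaction detection then compares q of the overlaid strata of two factors against their individual q values, which is how the economic-growth/population-size enhancement above is identified.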
As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source has been developed and gradually applied to provide a source plasma with the advantages of ease of control and high reliability, and it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of original experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer-based data acquisition and processing application system realizing the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system, and the software is programmed in the LabVIEW development environment. The key technologies used in the software implementation include long-pulse data acquisition, multi-threaded processing, the Transmission Control Protocol for communication, and the Lempel-Ziv-Oberhumer data compression algorithm. The design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, demonstrating very practical application significance for RF experiments.
The High Precision Magnetometer (HPM) on board the China Seismo-Electromagnetic Satellite (CSES) allows highly accurate measurement of the geomagnetic field; it includes FGM (Fluxgate Magnetometer) and CDSM (Coupled Dark State Magnetometer) probes. This article introduces the main processing methods, algorithms, and procedures for the HPM data. First, the FGM and CDSM probes are calibrated according to ground sensor data. Then the FGM linear parameters are corrected in orbit by applying the absolute vector magnetic field correction algorithm with CDSM data. At the same time, the magnetic interference of the satellite is eliminated according to ground-satellite magnetic test results. Finally, according to the characteristics of the magnetic field direction in the low-latitude region, the transformation matrix between the FGM probe and the star sensor is calibrated in orbit to determine the correct direction of the magnetic field. Comparing the magnetic field data of the CSES and SWARM satellites over five consecutive geomagnetically quiet days, the difference in measurements of the vector magnetic field is about 10 nT, which is within the uncertainty interval of geomagnetic disturbance.
Underground coal fires are among the most common and serious geohazards in most coal-producing countries in the world. Monitoring their spatio-temporal changes plays an important role in controlling and preventing the effects of coal fires and their environmental impact. In this study, the spatio-temporal changes of underground coal fires in the Khanh Hoa coal field (north-east Viet Nam) were analyzed using Landsat time-series data over the 2008-2016 period. Based on land surface temperatures retrieved from Landsat thermal data, thermal anomalies related to underground coal fires were identified using the MEDIAN+1.5×IQR (IQR: interquartile range) threshold technique. The locations of underground coal fires were validated against a coal fire map produced from field survey data and cross-validated using daytime ASTER thermal infrared imagery. Based on the fires extracted from seven Landsat thermal images, the spatio-temporal changes of underground coal fire areas were analyzed. The results showed that the thermal-anomalous zones correlated with known coal fires. Cross-validation of coal fires using ASTER TIR data showed a high consistency of 79.3%. The largest coal fire area of 184.6 hectares was detected in 2010, followed by 2014 (181.1 hectares) and 2016 (178.5 hectares). Smaller coal fire areas of 133.6 and 152.5 hectares were extracted in 2011 and 2009, respectively. Underground coal fires were mainly detected in the northern and southern parts of the coal field and tend to spread toward its north-west.
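The MEDIAN+1.5×IQR threshold used above is a simple robust outlier rule on the land-surface-temperature distribution; a minimal sketch (the input temperatures are toy values, not the Landsat retrievals):

```python
import numpy as np

def thermal_anomalies(lst):
    """Flag pixels whose land surface temperature exceeds MEDIAN + 1.5*IQR,
    the threshold used to delineate coal-fire-related thermal anomalies."""
    lst = np.asarray(lst, float)
    q1, med, q3 = np.percentile(lst, [25, 50, 75])
    threshold = med + 1.5 * (q3 - q1)   # robust to the hot-pixel outliers
    return lst > threshold, threshold
```

Because the median and IQR are insensitive to the very anomalies being sought, the threshold tracks the background temperature field rather than being dragged upward by the fires themselves.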
Low-field nuclear magnetic resonance (NMR) has been widely used in the petroleum industry, for example in well logging and laboratory rock core analysis. However, the signal-to-noise ratio is low due to the low magnetic field strength of NMR tools and the complex petrophysical properties of the detected samples. Suppressing the noise and highlighting the available NMR signals is very important for subsequent data processing. Most denoising methods are based on fixed mathematical transformations or hand-designed feature selectors to suppress noise characteristics, which may not perform well because they do not adapt to different noisy signals. In this paper, we propose a data processing framework to improve the quality of low-field NMR echo data based on dictionary learning. Dictionary learning is a machine learning method built on redundancy and sparse representation theory. Available information in noisy NMR echo data can be adaptively extracted and reconstructed by dictionary learning. The advantages and effectiveness of the proposed method were verified with a number of numerical simulations, NMR core data analyses, and NMR logging data processing. The results show that dictionary learning can significantly improve the quality of NMR echo data with high noise levels and effectively improve the accuracy and reliability of inversion results.
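The sparse-representation idea behind dictionary-based denoising can be illustrated with a fixed orthonormal DCT dictionary standing in for a learned one: project the noisy echo onto the dictionary, keep only the few largest coefficients, and reconstruct. Dictionary learning additionally adapts the atoms to the data, which this sketch does not do.

```python
import numpy as np

def dct_dictionary(n):
    """Orthonormal DCT-II basis (rows are atoms); a fixed stand-in for a
    learned dictionary. Satisfies D @ D.T = I."""
    k, t = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    D = np.cos(np.pi * (2 * t + 1) * k / (2 * n))
    D[0] *= np.sqrt(1.0 / n)
    D[1:] *= np.sqrt(2.0 / n)
    return D

def sparse_denoise(signal, n_atoms=5):
    """Sparse coding by hard thresholding: keep the n_atoms largest-magnitude
    coefficients and reconstruct, discarding broadband noise."""
    D = dct_dictionary(len(signal))
    coeffs = D @ signal
    keep = np.argsort(np.abs(coeffs))[-n_atoms:]
    sparse = np.zeros_like(coeffs)
    sparse[keep] = coeffs[keep]
    return D.T @ sparse
```

Because a smooth NMR echo concentrates in a few atoms while white noise spreads its energy over all of them, discarding the small coefficients removes most of the noise.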
Marine information has been increasing quickly. Traditional database technologies have disadvantages in manipulating large amounts of marine information, which relates to position in 3-D and to time. Recently, greater emphasis has been placed on GIS (geographical information systems) to deal with marine information. GIS has shown great success in terrestrial applications over recent decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems, or their data models, are designed for land applications and cannot cope well with the nature of the marine environment and marine information. This poses a fundamental challenge to traditional GIS and its data structures. This work designed a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data, which bases itself on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM can handle the data well and can easily extract the aggregations that management needs at different levels.
Funding: Supported by the National Key R&D Program of China (Nos. 2022YFF0503203 and 2024YFF0809900), the Research Funds of the Institute of Geophysics, China Earthquake Administration (No. DQJB24X28), and the National Natural Science Foundation of China (Nos. 42474226 and 42441827).
Abstract: The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
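As a rough illustration of the tick-removal idea (not the InSight pipeline itself, whose code is released separately), a strictly periodic disturbance with a known period can be estimated by folding the trace at that period, averaging, and subtracting the resulting template. The period and amplitudes below are invented for the sketch:

```python
import numpy as np

def remove_tick(trace, period):
    """Estimate a strictly periodic disturbance by fold-averaging the trace
    at the known period, then subtract the (zero-mean) template."""
    n = len(trace) // period * period
    folded = trace[:n].reshape(-1, period)
    template = folded.mean(axis=0) - folded.mean()  # zero-mean tick estimate
    cleaned = trace.copy()
    cleaned[:n] -= np.tile(template, n // period)
    return cleaned

rng = np.random.default_rng(0)
tick = np.zeros(100)
tick[0] = 5.0                                  # one spike every 100 samples
signal = 0.1 * rng.standard_normal(100 * 40)   # background "seismic" noise
noisy = signal + np.tile(tick, 40)

cleaned = remove_tick(noisy, period=100)
resid = np.abs(cleaned - signal).max()         # leftover tick after removal
```

The spike of amplitude 5 is reduced to a residual on the order of the fold-average noise; real processing additionally handles sub-sample timing and data gaps.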
Funding: Supported by the National Key Research and Development Plan of China (No. 2022YFB3103304).
Abstract: False Data Injection Attacks (FDIAs) pose a critical security threat to modern power grids, corrupting state estimation and enabling malicious control actions that can lead to severe consequences, including cascading failures, large-scale blackouts, and significant economic losses. While detecting attacks is important, accurately localizing compromised nodes or measurements is even more critical, as it enables timely mitigation, targeted response, and enhanced system resilience beyond what detection alone can offer. Existing research typically models topological features using fixed structures, which can introduce irrelevant information and reduce the effectiveness of feature extraction. To address this limitation, this paper proposes an FDIA localization model with adaptive neighborhood selection, which dynamically captures spatial dependencies of the power grid by adjusting node relationships based on data-driven similarities. An improved Transformer is employed to pre-fuse global spatial features of the graph, enriching the feature representation. To improve spatio-temporal correlation extraction for FDIA localization, the proposed model employs dilated causal convolution with a gating mechanism combined with graph convolution to capture and fuse long-range temporal features and adaptive topological features, fully exploiting the temporal dynamics and spatial dependencies inherent in the power grid. Finally, multi-source information is integrated to generate highly robust node embeddings, enhancing FDIA detection and localization. Experiments conducted on the IEEE 14-, 57-, and 118-bus systems demonstrate that the proposed model substantially improves the accuracy of FDIA localization; additional experiments verify its effectiveness and robustness.
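The adaptive neighborhood idea can be sketched as follows: instead of a fixed grid topology, each node is connected to the k nodes whose measurement series are most similar to its own. This is a hypothetical minimal version using cosine similarity; the paper's model learns the relationships jointly with the rest of the network:

```python
import numpy as np

def adaptive_adjacency(x, k=2):
    """Data-driven neighborhood: connect each node to its k most similar
    nodes by cosine similarity of their measurement series (rows of x)."""
    xn = x / np.linalg.norm(x, axis=1, keepdims=True)
    sim = xn @ xn.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-loops
    adj = np.zeros_like(sim)
    idx = np.argsort(sim, axis=1)[:, -k:]   # top-k neighbours per node
    rows = np.repeat(np.arange(x.shape[0]), k)
    adj[rows, idx.ravel()] = 1.0
    return adj

rng = np.random.default_rng(0)
base = rng.standard_normal((2, 50))
# nodes 0-2 follow measurement pattern A, nodes 3-5 follow pattern B
x = np.vstack([base[0] + 0.01 * rng.standard_normal((3, 50)),
               base[1] + 0.01 * rng.standard_normal((3, 50))])
A = adaptive_adjacency(x, k=2)
```

Nodes sharing a pattern end up connected only to each other, so the graph convolution aggregates from genuinely related measurements rather than a fixed grid.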
Funding: Funded by the Joint Project of Industry-University-Research of Jiangsu Province (Grant BY20231146).
Abstract: With the widespread application of Internet of Things (IoT) technology, the processing of massive real-time streaming data poses significant challenges to the computational and data-processing capabilities of systems. Although distributed streaming data processing frameworks such as Apache Flink and Apache Spark Streaming provide solutions, meeting stringent response time requirements while ensuring high throughput and resource utilization remains an urgent problem. To address this, the study proposes a formal modeling approach based on Performance Evaluation Process Algebra (PEPA), which abstracts the core components and interactions of cloud-based distributed streaming data processing systems. Additionally, a generic service flow generation algorithm is introduced, enabling the automatic extraction of service flows from the PEPA model and the computation of key performance metrics, including response time, throughput, and resource utilization. The novelty of this work lies in the integration of PEPA-based formal modeling with the service flow generation algorithm, bridging the gap between formal modeling and practical performance evaluation for IoT systems. Simulation experiments demonstrate that optimizing the execution efficiency of components can significantly improve system performance. For instance, increasing the task execution rate from 10 to 100 improves system performance by 9.53%, while further increasing it to 200 yields a 21.58% improvement. However, diminishing returns are observed when the execution rate reaches 500, with only a 0.42% gain. Similarly, increasing the number of TaskManagers from 10 to 20 improves response time by 18.49%, but the improvement slows to 6.06% when increasing from 20 to 50, highlighting the importance of co-optimizing component efficiency and resource management to achieve substantial performance gains. This study provides a systematic framework for analyzing and optimizing the performance of IoT systems for large-scale real-time streaming data processing. The proposed approach not only identifies performance bottlenecks but also offers insights into improving system efficiency under different configurations and workloads.
Funding: Supported by the Start-up Fund from Hainan University (No. KYQD(ZR)-20077).
Abstract: Three-dimensional (3D) single molecule localization microscopy (SMLM) plays an important role in biomedical applications, but its data processing is very complicated. Deep learning is a potential tool to solve this problem. FD-DeepLoc, the state-of-the-art deep learning-based 3D super-resolution localization algorithm, still falls short of the goal of online image processing, even though it has greatly improved data processing throughput. In this paper, a new algorithm, Lite-FD-DeepLoc, is developed on the basis of FD-DeepLoc to meet the online image processing requirements of 3D SMLM. The new algorithm uses feature compression to reduce the parameters of the model and combines this with pipeline programming to accelerate the inference process of the deep learning model. Simulated data processing results show that the image processing speed of Lite-FD-DeepLoc is about twice that of FD-DeepLoc with only a slight decrease in localization accuracy, enabling real-time processing of 256×256 pixel images. Results on biological experimental data imply that Lite-FD-DeepLoc can successfully analyze data based on astigmatism and saddle point engineering, and the global resolution of the reconstructed image is equivalent to or even better than that of FD-DeepLoc.
Funding: Supported by the National Natural Science Foundation of China (12363010), the Guizhou Provincial Basic Research Program (Natural Science) (ZK[2023]039), and the Key Technology R&D Program ([2023]352).
Abstract: Previous studies aiming to accelerate data processing have focused on enhancement algorithms, using the graphics processing unit (GPU) to speed up programs, and thread-level parallelism. These methods overlook maximizing the utilization of existing central processing unit (CPU) resources and reducing human and computational time costs via process automation. Accordingly, this paper proposes a scheme, called SSM, that combines “Srun job submission mode”, “Sbatch job submission mode”, and a “Monitor function”. The SSM scheme includes three main modules: data management, command management, and resource management. Its core innovations are command splitting and parallel execution. The results show that this method effectively improves CPU utilization and reduces the time required for data processing. In terms of CPU utilization, the average value of this scheme is 89%. In contrast, the average CPU utilizations of the Srun and Sbatch job submission modes are significantly lower, at 43% and 52%, respectively. In terms of data-processing time, SSM testing on Five-hundred-meter Aperture Spherical radio Telescope (FAST) data requires only 5.5 h, compared with 8 h in the Srun mode and 14 h in the Sbatch mode. In addition, tests on the FAST and Parkes datasets demonstrate the universality of the SSM scheme, which can process data from different telescopes. The compatibility of the SSM scheme with pulsar searches is verified using 2 days of observational data from the globular cluster M2, with the scheme successfully discovering all published pulsars in M2.
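The two core innovations named above, command splitting and parallel execution, can be sketched generically: split a list of processing commands into roughly equal chunks and run the chunks concurrently. The `run_command` stand-in is hypothetical; the real scheme wraps Slurm `srun`/`sbatch` submissions and a monitor function:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import islice

def run_command(cmd):
    # hypothetical stand-in for launching one data-processing job;
    # the real scheme submits srun/sbatch jobs and monitors them
    return f"done:{cmd}"

def split_commands(cmds, n_chunks):
    """Command splitting: divide a command list into n roughly equal chunks."""
    k, r = divmod(len(cmds), n_chunks)
    it = iter(cmds)
    return [list(islice(it, k + (i < r))) for i in range(n_chunks)]

def run_chunk(chunk):
    return [run_command(c) for c in chunk]

cmds = [f"process file_{i:03d}" for i in range(10)]
chunks = split_commands(cmds, 4)

# parallel execution: one worker per chunk; map keeps results in order
with ThreadPoolExecutor(max_workers=4) as pool:
    results = [r for out in pool.map(run_chunk, chunks) for r in out]
```

Chunking keeps each worker busy with a similar load, which is the mechanism behind the reported gain in CPU utilization.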
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52308403 and 52079068), the Yunlong Lake Laboratory of Deep Underground Science and Engineering (No. 104023005), and the China Postdoctoral Science Foundation (Grant No. 2023M731998).
Abstract: The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and development of the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts by elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion of rock specimens for the test apparatuses and of data processing methods. Next, recommendations are given for selecting a method for UCS measurement. It reveals that the rock failure mechanism in the UCS testing methods can be divided into compression-shear, compression-tension, composite failure mode, and no obvious failure mode. These apparatuses are trending towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used. One is to develop an empirical correlation between the measured indices and the specimen size. The other is to use a standard specimen to calculate the size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The test method for UCS measurement can be selected according to the testing scenario and the specimen size. Engineers can thereby gain a comprehensive understanding of UCS testing methods and their potential developments in various rock engineering endeavors.
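As an example of the first size-correction style (an empirical correlation between measured strength and specimen size), the widely used Hoek-Brown relation scales a measured UCS to an equivalent 50 mm specimen via sigma_c50 = sigma_cd * (d/50)^0.18. A minimal sketch; the exponent is the standard empirical value, not one taken from this study:

```python
def ucs_size_correct(ucs_d, d_mm):
    """Scale a UCS measured on a d_mm-diameter core to the equivalent
    50 mm specimen strength: sigma_c50 = sigma_cd * (d / 50) ** 0.18."""
    return ucs_d * (d_mm / 50.0) ** 0.18

# a 100 mm core measuring 80 MPa maps to a higher equivalent 50 mm strength,
# reflecting that larger specimens tend to measure weaker
ucs_50 = ucs_size_correct(80.0, 100.0)
```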
Funding: Supported by the National Natural Science Foundation of China (NSFC, 12173012 and 12473050), the Guangdong Natural Science Funds for Distinguished Young Scholars (2023B1515020049), the Shenzhen Science and Technology Project (JCYJ20240813104805008), the Shenzhen Key Laboratory Launching Project (No. ZDSYS20210702140800001), and the Specialized Research Fund for the State Key Laboratory of Solar Activity and Space Weather.
Abstract: The increasing demand for high-resolution solar observations has driven the development of advanced data processing and enhancement techniques for ground-based solar telescopes. This study focuses on developing a Python-based package (GT-scopy) for data processing and enhancement for giant solar telescopes, with application to the 1.6 m Goode Solar Telescope (GST) at Big Bear Solar Observatory. The objective is to develop modern data processing software that refines existing data acquisition, processing, and enhancement methodologies to achieve atmospheric effect removal and accurate alignment at the sub-pixel level, particularly within processing levels 1.0-1.5. In this research, we implemented an integrated and comprehensive data processing procedure that includes image de-rotation, zone-of-interest selection, coarse alignment, correction for atmospheric distortions, and fine alignment at the sub-pixel level with an advanced algorithm. The results demonstrate a significant improvement in image quality, with enhanced visibility of fine solar structures both in sunspots and quiet-Sun regions. The enhanced data processing package developed in this study significantly improves the utility of data obtained from the GST, paving the way for more precise solar research and contributing to a better understanding of solar dynamics. This package can be adapted for other ground-based solar telescopes, such as the Daniel K. Inouye Solar Telescope (DKIST), the European Solar Telescope (EST), and the 8 m Chinese Giant Solar Telescope, potentially benefiting the broader solar physics community.
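Coarse alignment of the kind described above is commonly done by phase correlation; the sketch below recovers an integer-pixel shift between two frames, which sub-pixel methods then refine by interpolating around the correlation peak. This is an illustrative stand-in, not GT-scopy's actual algorithm:

```python
import numpy as np

def integer_shift(a, b):
    """Return (dy, dx) such that np.roll(b, (dy, dx), axis=(0, 1))
    aligns b with a, estimated by phase correlation."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real  # normalized cross-power
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # indices past the midpoint wrap around to negative shifts
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
frame = rng.standard_normal((64, 64))          # reference frame
moved = np.roll(frame, (3, -5), axis=(0, 1))   # same scene, shifted

dy, dx = integer_shift(frame, moved)
```

Normalizing by the magnitude of the cross-power spectrum leaves a pure phase term, whose inverse transform is a sharp peak at the displacement.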
Funding: Supported by the National Natural Science Foundation of China (U24B2031), the National Key Research and Development Project (2018YFA0702504), and the “14th Five-Year Plan” Science and Technology Project of CNOOC (KJGG2022-0201).
Abstract: During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored for drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are more clearly imaged. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation using logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Funding: Co-supported by the National Natural Science Foundation of China (No. 12101608), the NSAF (No. U2230208), and the Hunan Provincial Innovation Foundation for Postgraduate, China (No. CX20220034).
Abstract: This paper introduces techniques in Gaussian process regression modeling for spatio-temporal data collected from complex systems. The study focuses on extracting local structures and then constructing surrogate models based on Gaussian process assumptions. The proposed Dynamic Gaussian Process Regression (DGPR) consists of a sequence of local surrogate models related to each other. In DGPR, time-based spatial clustering is carried out to divide the system into sub-spatio-temporal parts whose interiors have similar variation patterns, with the temporal information used as prior information for training the spatial surrogate models. DGPR is robust, especially suitable for loosely coupled model structures, and allows for parallel computation. Numerical results on a test function show the effectiveness of DGPR. Furthermore, the shock tube problem is successfully approximated under different levels of phenomenon complexity.
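The DGPR idea, partition the domain into sub-spatio-temporal parts and fit a local Gaussian process surrogate in each, can be sketched with a tiny numpy GP. The clustering here is a hand-picked split at a known regime change, standing in for the paper's time-based spatial clustering:

```python
import numpy as np

def rbf_kernel(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between 1-D input sets a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_fit_predict(x, y, noise=1e-4):
    """Posterior mean of a zero-mean GP evaluated at its own training inputs."""
    K = rbf_kernel(x, x) + noise * np.eye(len(x))
    return rbf_kernel(x, x) @ np.linalg.solve(K, y)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 80)
# two regimes with different variation patterns, plus small noise
y = np.where(x < 1.0, np.sin(4.0 * x), 2.0 + 0.1 * np.sin(4.0 * x))
y = y + 0.01 * rng.standard_normal(x.size)

# "clustering" stand-in: split at the known regime change, then train one
# local surrogate per cluster (the loop is embarrassingly parallel)
pred = np.empty_like(y)
for mask in (x < 1.0, x >= 1.0):
    pred[mask] = gp_fit_predict(x[mask], y[mask])

rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Fitting each regime separately avoids forcing one stationary kernel across the discontinuity, which is the motivation for local surrogates.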
Abstract: The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for the processing of measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and GM(1,1) are used to create new data for these blanks. For the uneven data sequence created by measuring error, the mean generation is used to smooth it, and then the stepwise and smooth generations are used to improve the data sequence.
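The GM(1,1) model used to fill such blanks is standard grey theory: accumulate the sequence (AGO), fit the whitened equation dx1/dt + a*x1 = b by least squares on the mean-generated sequence, and difference the fitted accumulation back. A minimal sketch:

```python
import numpy as np

def gm11_forecast(x0, n_ahead=1):
    """Fit GM(1,1) to a short positive sequence and forecast n_ahead values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                       # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])            # mean generation of adjacent x1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])  # inverse AGO
    return x0_hat[len(x0):]

# a near-exponential sequence is forecast to about 0.1% relative error
seq = 2.0 * 1.1 ** np.arange(6)
pred = gm11_forecast(seq[:5], n_ahead=1)
```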
Funding: Supported by the Weaponry Equipment Pre-Research Foundation of PLA Equipment Ministry (No. 9140A06050409JB8102) and the Pre-Research Foundation of PLA University of Science and Technology (No. 2009JSJ11).
Abstract: To solve the query processing correctness problem for semantic-based relational data integration, the semantics of SPARQL (Simple Protocol and RDF Query Language) queries is defined. In the course of query rewriting, all relevant tables are found and decomposed into minimal connectable units. Minimal connectable units are joined according to semantic queries to produce semantically correct query plans. Algorithms for query rewriting and transforming are presented, and their computational complexity is discussed. In the worst case, the query decomposing algorithm finishes in O(n²) time and the query rewriting algorithm requires O(nm) time. The performance of the algorithms is verified by experiments, and the results show that when the length of a query is less than 8, the query processing algorithms provide satisfactory performance.
Abstract: Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory cannot readily process such samples or assess their degree of uncertainty. Using grey relational theory and norm theory, this article proposes the grey distance information approach, based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples. The definitions of these two quantities, with their characteristics and algorithms, are introduced. Related problems, including the algorithm of the estimated value, the standard deviation, and the acceptance and rejection criteria for the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare different samples. Several examples demonstrate the application of the proposed approach and show that it is feasible and effective without requiring the probability distribution of small samples.
Funding: Supported by the National Key Research Program (Nos. 2018YFB1601105 and 2018YFB1601102), the Natural Science Foundation of China (Nos. 41975165 and U1811463), and the Chongqing Science and Technology Project (No. cstc2019jscxfxydX0035).
Abstract: High-resolution vehicular emissions inventories are important for managing vehicular pollution and improving urban air quality. This study developed a vehicular emission inventory with high spatio-temporal resolution in the main urban area of Chongqing, based on real-time traffic data from 820 RFID detectors covering 454 roads, and analysed the differences in spatio-temporal emission characteristics between inner and outer districts. The results showed that the daily vehicular emission intensities of CO, hydrocarbons, PM2.5, PM10, and NOx were 30.24, 3.83, 0.18, 0.20, and 8.65 kg/km per day, respectively, in the study area during 2018. The pollutant emission intensities in the inner district were higher than those in the outer district. Light passenger cars (LPCs) were the main contributors to all-day CO emissions in both inner and outer districts, whereas the contributors to NOx emissions differed. Diesel and natural gas buses were major contributors to daytime NOx emissions in the inner district, accounting for 40.40%, but buses and heavy duty trucks (HDTs) were the major contributors in the outer district. At nighttime, due to the lifting of truck restrictions and the suspension of buses, HDTs became the main NOx contributor in both inner and outer districts, with three nighttime NOx emission peak hours that differ from the peak hours of total NOx emissions by all vehicles. Unlike in most other cities, bridges and connecting channels are persistent emission hotspots due to long-lasting traffic congestion. This knowledge helps to fully understand vehicular emission characteristics and is useful for policymakers designing precise prevention and control measures.
Funding: Supported by the Third Xinjiang Scientific Expedition Program (2021xjkk0905), GDAS Special Projects of Science and Technology Development (2020GDASYL-20200301003 and 2020GDASYL-20200102002), the National Natural Science Foundation of China (41501144), and a Project of the Department of Natural Resources of Guangdong Province (GDZRZYKJ2022005).
Abstract: This paper combines the Defense Meteorological Satellite Program Operational Linescan System (DMSP-OLS) nighttime light data and the Visible Infrared Imaging Radiometer Suite (VIIRS) nighttime light data into a “synthetic DMSP” dataset covering 1992 to 2020, to retrieve the spatio-temporal variations in energy-related carbon emissions in Xinjiang, China. It then analyzes several influencing factors for the spatial differentiation of carbon emissions in Xinjiang using the geographical detector technique. Results reveal that (1) total carbon emissions continued to grow, although the growth rate slowed in the past five years. (2) Large regional differences exist in total carbon emissions. In descending order, total carbon emissions are: the northern slope of the Tianshan (Mountains) > the southern slope of the Tianshan > the three prefectures in southern Xinjiang > the northern part of Xinjiang. (3) Economic growth, population size, and energy consumption intensity are the most important factors in the spatial differentiation of carbon emissions. The interaction between economic growth and population size, as well as between economic growth and energy consumption intensity, further enhances the explanatory power for the spatial differentiation of carbon emissions. This paper aims to help formulate differentiated carbon reduction targets and strategies for cities in different economic development stages and with different carbon intensities, so as to achieve carbon peak goals in steps.
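The geographical detector's factor detector measures how much of the variance of an attribute a stratification explains, q = 1 - (Σ_h N_h σ_h²) / (N σ²). A minimal sketch with invented numbers:

```python
import numpy as np

def geodetector_q(y, strata):
    """Factor-detector q: fraction of the variance of y explained by strata."""
    y = np.asarray(y, dtype=float)
    strata = np.asarray(strata)
    within = sum(y[strata == s].size * y[strata == s].var()
                 for s in np.unique(strata))        # sum of N_h * sigma_h^2
    return 1.0 - within / (y.size * y.var())        # 1 - within / (N * sigma^2)

# toy data: "emissions" stratified by two economic-growth classes
y = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.8])
q = geodetector_q(y, np.array([0, 0, 0, 1, 1, 1]))
```

A q near 1 means the stratification (here, economic-growth class) explains nearly all of the spatial variation; q near 0 means it explains none.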
Funding: The authors acknowledge the NBI team and the partial support of the National Natural Science Foundation of China (No. 61363019) and the Natural Science Foundation of Qinghai Province (No. 2014-ZJ-718).
Abstract: As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source has been developed and gradually applied to provide a source plasma with the advantages of ease of control and high reliability, and it readily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of original experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system to realize the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used in the implementation of this software mainly include long-pulse data acquisition, multi-threading, the transmission control communication protocol, and the Lempel-Ziv-Oberhumer data compression algorithm. This design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time, which is of considerable practical significance for RF experiments.
Funding: Supported by the National Key Research and Development Program of China from MOST (2016YFB0501503).
Abstract: The High Precision Magnetometer (HPM) on board the China Seismo-Electromagnetic Satellite (CSES) allows highly accurate measurement of the geomagnetic field; it includes FGM (Fluxgate Magnetometer) and CDSM (Coupled Dark State Magnetometer) probes. This article introduces the main processing methods, algorithms, and procedures for the HPM data. First, the FGM and CDSM probes are calibrated according to ground sensor data. The FGM linear parameters can then be corrected in orbit by applying an absolute vector magnetic field correction algorithm based on CDSM data. At the same time, the magnetic interference of the satellite is eliminated according to ground-satellite magnetic test results. Finally, according to the characteristics of the magnetic field direction in the low-latitude region, the transformation matrix between the FGM probe and the star sensor is calibrated in orbit to determine the correct direction of the magnetic field. Comparing the magnetic field data of the CSES and SWARM satellites over five consecutive geomagnetically quiet days, the difference in vector magnetic field measurements is about 10 nT, which is within the uncertainty interval of geomagnetic disturbance.
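The vector-scalar correction step can be illustrated with a toy model: if the fluxgate axes have unknown gains, the condition |B|² = Σ_j meas_j² / g_j² is linear in u_j = 1/g_j², so the gains can be recovered by least squares against the scalar magnetometer readings. The real in-orbit calibration also handles offsets, non-orthogonality, and satellite interference; this sketch uses synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
true_gain = np.array([1.02, 0.97, 1.01])        # unknown per-axis FGM gains
B = 3.0e4 * rng.standard_normal((200, 3))       # synthetic field vectors, nT
meas = B * true_gain                            # mis-scaled vector readings
scalar = np.linalg.norm(B, axis=1)              # CDSM-style absolute |B|

# |B|^2 = sum_j meas_j^2 / g_j^2 is linear in u_j = 1 / g_j^2
u, *_ = np.linalg.lstsq(meas ** 2, scalar ** 2, rcond=None)
gain_est = 1.0 / np.sqrt(u)
```

Because the scalar instrument measures the field magnitude absolutely, it anchors the vector instrument's scale without any attitude information.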
Funding: Funded by the Ministry-level Scientific and Technological Key Programs of the Ministry of Natural Resources and Environment of Viet Nam, “Application of thermal infrared remote sensing and GIS for mapping underground coal fires in Quang Ninh coal basin” (Grant No. TNMT.2017.08.06).
Abstract: Underground coal fires are one of the most common and serious geohazards in most coal-producing countries in the world. Monitoring their spatio-temporal changes plays an important role in controlling and preventing the effects of coal fires and their environmental impact. In this study, the spatio-temporal changes of underground coal fires in the Khanh Hoa coal field (North-East Viet Nam) were analyzed using Landsat time-series data over the 2008-2016 period. Based on land surface temperatures retrieved from Landsat thermal data, thermal anomalies related to underground coal fires were identified using the MEDIAN+1.5×IQR (IQR: interquartile range) threshold technique. The locations of underground coal fires were validated using a coal fire map produced from field survey data and cross-validated using daytime ASTER thermal infrared imagery. Based on the fires extracted from seven Landsat thermal images, the spatio-temporal changes of underground coal fire areas were analyzed. The results showed that the thermal-anomalous zones correlate with known coal fires. Cross-validation of coal fires using ASTER TIR data showed a high consistency of 79.3%. The largest coal fire area of 184.6 hectares was detected in 2010, followed by 2014 (181.1 hectares) and 2016 (178.5 hectares). Smaller coal fire areas of 133.6 and 152.5 hectares were extracted in 2011 and 2009, respectively. Underground coal fires were mainly detected in the northern and southern parts of the coal field and tend to spread towards its north-west.
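The MEDIAN+1.5×IQR rule is straightforward to apply to a land-surface-temperature grid: pixels hotter than the median plus 1.5 times the interquartile range are flagged as thermal anomalies. A sketch on synthetic data with a planted hot spot (all temperatures invented):

```python
import numpy as np

rng = np.random.default_rng(1)
lst = rng.normal(300.0, 2.0, size=(50, 50))   # toy LST grid, kelvin
lst[10:12, 10:12] = 330.0                     # planted coal-fire hot spot

med, q1, q3 = np.percentile(lst, [50, 25, 75])
threshold = med + 1.5 * (q3 - q1)             # MEDIAN + 1.5*IQR rule
fire_mask = lst > threshold
n_anomalous = int(fire_mask.sum())
```

Using the median and IQR rather than mean and standard deviation keeps the threshold itself insensitive to the hot pixels being detected.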
Funding: Supported by the Science Foundation of China University of Petroleum, Beijing (Grant Number ZX20210024), the Chinese Postdoctoral Science Foundation (Grant Number 2021M700172), the Strategic Cooperation Technology Projects of CNPC and CUP (Grant Number ZLZX2020-03), and the National Natural Science Foundation of China (Grant Number 42004105).
Abstract: Low-field nuclear magnetic resonance (NMR) has been widely used in the petroleum industry, for example in well logging and laboratory rock core analysis. However, the signal-to-noise ratio is low due to the low magnetic field strength of NMR tools and the complex petrophysical properties of the detected samples. Suppressing the noise and highlighting the available NMR signal is very important for subsequent data processing. Most denoising methods are based on fixed mathematical transformations or hand-designed feature selectors to suppress noise characteristics, which may not perform well because they do not adapt to different noisy signals. In this paper, we propose a data processing framework to improve the quality of low-field NMR echo data based on dictionary learning. Dictionary learning is a machine learning method based on redundancy and sparse representation theory. Available information in noisy NMR echo data can be adaptively extracted and reconstructed by dictionary learning. The advantages and effectiveness of the proposed method were verified with a number of numerical simulations, NMR core data analyses, and NMR logging data processing. The results show that dictionary learning can significantly improve the quality of NMR echo data with high noise levels and effectively improve the accuracy and reliability of inversion results.
Funding: Supported by the National Key Basic Research and Development Program of China under contract No. 2006CB701305, the National Natural Science Foundation of China under contract No. 40571129, and the National High-Technology Program of China under contract Nos. 2002AA639400, 2003AA604040, and 2003AA637030.
Abstract: Marine information has been increasing quickly. Traditional database technologies have disadvantages in manipulating large amounts of marine information, which relates to position in 3-D and to time. Recently, greater emphasis has been placed on GIS (geographical information systems) to deal with marine information. GIS has shown great success in terrestrial applications over the last decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems, or their data models, are designed for land applications and cannot cope well with the nature of the marine environment and marine information, which poses a fundamental challenge to traditional GIS and its data structures. This work designed a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data, which bases itself on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM handles the data well and can easily extract the aggregations that management needs at different levels.
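The hierarchical aggregation that the RSHDM supports can be illustrated as a raster pyramid: each level merges blocks of cells of the level below, so management queries at coarser levels read precomputed sums. A minimal sketch with a toy catch raster (the data and levels are invented):

```python
import numpy as np

def aggregate_up(raster, factor=2):
    """One level up the raster hierarchy: merge factor x factor cells by sum."""
    r, c = raster.shape
    blocks = raster.reshape(r // factor, factor, c // factor, factor)
    return blocks.sum(axis=(1, 3))

catch = np.arange(16, dtype=float).reshape(4, 4)  # toy monthly catch raster
level1 = aggregate_up(catch)                      # 2 x 2 regional summary
level2 = aggregate_up(level1)                     # 1 x 1 basin-wide total
```

Stacking such levels per time step gives exactly the multi-level aggregations a fishery data warehouse serves without rescanning the base raster.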