In order to solve the so-called "bull-eye" problem caused by using a simple bilinear interpolation as the observational mapping operator in the cost function of the multigrid three-dimensional variational (3DVAR) data assimilation scheme, a smoothing term, equivalent to a penalty term, is introduced into the cost function as a remedy. A theoretical analysis is first performed to identify what actually causes the "bull-eye" problem; the meaning of the smoothing term is then elucidated, and the uniqueness of the solution of the multigrid 3DVAR with the smoothing term added is discussed through a theoretical derivation for the one-dimensional (1D) case and two idealized data assimilation experiments (1D and two-dimensional (2D) cases). By exploring the relationship between the smoothing term and the recursive filter both theoretically and practically, it is shown why satisfactory analysis results can be achieved with the proposed remedy in the multigrid 3DVAR.
Funding: the National Basic Research Program of China under contract No. 2013CB430304; the National High-Tech R&D Program of China under contract No. 2013AA09A505; the National Natural Science Foundation of China under contract Nos 41030854, 40906015, 40906016, 41106005 and 41176003.
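The general shape of such a penalized cost function can be written schematically as follows; the smoothing weight \gamma and the discrete second-difference (Laplacian-like) operator \mathbf{D} are illustrative assumptions, not the paper's exact formulation:

\[
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}(H\mathbf{x}-\mathbf{y})^{\mathrm T}\mathbf{R}^{-1}(H\mathbf{x}-\mathbf{y})
+ \tfrac{\gamma}{2}\,\lVert \mathbf{D}\mathbf{x} \rVert^{2}
\]

Here H is the bilinear-interpolation observation operator, \mathbf{B} and \mathbf{R} are the background and observation error covariances, and the last term penalizes grid-scale roughness in the analysis, which is what suppresses the "bull-eye" artifacts around isolated observations.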
Internal multiples are commonly present in seismic data due to variations in the velocity or density of subsurface media. They can reduce the signal-to-noise ratio of seismic data and degrade the quality of the image. As seismic exploration moves to deep and ultra-deep targets, especially the complex targets in the western region of China, internal multiple elimination becomes increasingly challenging. Currently, three-dimensional (3D) seismic data are primarily used for oil and gas target recognition and drilling. Effectively eliminating internal multiples in 3D seismic data of complex structures and mitigating their adverse effects is crucial for enhancing the success rate of drilling. In this study, we propose an internal multiple prediction algorithm for 3D seismic data in complex structures using the Marchenko autofocusing theory. This method can predict internal multiples with accurate time differences without an accurate velocity model, and its implementation mainly consists of several steps. First, direct waves are simulated with a 3D macroscopic velocity model. Second, the direct waves and the full 3D seismic acquisition records are used to obtain the upgoing and downgoing Green's functions between virtual source points and the surface. Third, internal multiples of the relevant layers are constructed from the upgoing and downgoing Green's functions. Finally, an adaptive matching subtraction method removes the predicted internal multiples from the original data to obtain seismic records free of these multiples. Compared with the two-dimensional (2D) Marchenko algorithm, the performance of the 3D Marchenko algorithm for internal multiple prediction is significantly enhanced, resulting in higher computational accuracy. Numerical simulation tests indicate that the proposed method can effectively eliminate internal multiples in 3D seismic data and thus has important theoretical and industrial application value.
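As an illustration of the final step only, the sketch below matches a predicted multiple trace to the recorded trace with a least-squares shaping filter before subtracting it; the filter length is a hypothetical choice, circular shifts are used for brevity, and this single-trace version is a simplified stand-in for adaptive matching subtraction as used in practice, which is normally estimated in overlapping time-space windows.

import numpy as np

def adaptive_subtract(trace, predicted, flen=11):
    """Least-squares match `predicted` to `trace`, then subtract (single trace)."""
    pad = flen // 2
    # columns are time-shifted copies of the predicted multiples
    A = np.stack([np.roll(predicted, k) for k in range(-pad, pad + 1)], axis=1)
    f, *_ = np.linalg.lstsq(A, trace, rcond=None)   # shaping filter coefficients
    return trace - A @ f                            # estimate of the multiple-free trace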
Multi-source data fusion provides high-precision spatial situational awareness essential for analyzing granular urban social activities. This study used Shanghai's catering industry as a case study, leveraging electronic reviews and consumer data sourced from third-party restaurant platforms collected in 2021. By performing weighted processing on two-dimensional point-of-interest (POI) data, clustering hotspots of high-dimensional restaurant data were identified. A hierarchical network of restaurant hotspots was constructed following the Central Place Theory (CPT) framework, while the Geo-Informatic Tupu method was employed to resolve the challenges posed by network deformation in multi-scale processes. The findings suggest the necessity of enhancing the spatial balance of Shanghai's urban centers by moderately increasing the number and service capacity of suburban centers at the urban periphery. Such measures would contribute to a more optimized urban structure and facilitate the outward dispersion of comfort-oriented facilities such as the restaurant industry. At a finer spatial scale, the distribution of restaurant hotspots demonstrates a polycentric and symmetric spatial pattern, with a developmental trend radiating outward along the city's ring roads. This trend can be attributed to the efforts of restaurants to establish connections with other urban functional spaces, leading to the reconfiguration of urban spaces, the expansion of restaurant-dedicated land use, and the reorganization of associated commercial activities. The results validate the existence of a polycentric urban structure in Shanghai but also highlight the instability of the restaurant hotspot network during cross-scale transitions.
Funding: the Key Program of the National Natural Science Foundation of China (No. 42030409).
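A minimal sketch of one way to turn weighted two-dimensional POI points into a hotspot surface, using a consumption-weighted kernel density estimate; the weighting scheme, grid size, and hotspot threshold are illustrative assumptions and do not reproduce the study's Geo-Informatic Tupu workflow.

import numpy as np
from scipy.stats import gaussian_kde

def weighted_hotspots(xy, weights, grid_n=200, quantile=0.95):
    """Weighted KDE over restaurant POIs; cells above `quantile` count as hotspots."""
    kde = gaussian_kde(xy.T, weights=weights)        # xy: (n, 2) projected coordinates
    gx, gy = np.meshgrid(np.linspace(xy[:, 0].min(), xy[:, 0].max(), grid_n),
                         np.linspace(xy[:, 1].min(), xy[:, 1].max(), grid_n))
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(grid_n, grid_n)
    return density, density >= np.quantile(density, quantile)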
Geochemical survey data are essential across Earth Science disciplines but are often affected by noise, which can obscure important geological signals and compromise subsequent prediction and interpretation. Quantifying prediction uncertainty is hence crucial for robust geoscientific decision-making. This study proposes a novel deep learning framework, the Spatially Constrained Variational Autoencoder (SC-VAE), for denoising geochemical survey data with integrated uncertainty quantification. The SC-VAE incorporates spatial regularization, which enforces spatial coherence by modeling inter-sample relationships directly within the latent space. The performance of the SC-VAE was systematically evaluated against a standard Variational Autoencoder (VAE) using geochemical data from the gold polymetallic district in the northwestern part of Sichuan Province, China. Both models were optimized using Bayesian optimization, with objective functions specifically designed to maintain essential geostatistical characteristics. Evaluation metrics include variogram analysis, quantitative measures of spatial interpolation accuracy, visual assessment of denoised maps, and statistical analysis of data distributions, as well as decomposition of uncertainties. Results show that the SC-VAE achieves superior noise suppression and better preservation of spatial structure compared to the standard VAE, as demonstrated by a significant reduction in the variogram nugget effect and an increased partial sill. The SC-VAE produces denoised maps with clearer anomaly delineation and more regularized data distributions, effectively mitigating outliers and reducing kurtosis. Additionally, it delivers improved interpolation accuracy and spatially explicit uncertainty estimates, facilitating more reliable and interpretable assessments of prediction confidence. The SC-VAE framework thus provides a robust, geostatistically informed solution for enhancing the quality and interpretability of geochemical data, with broad applicability in mineral exploration, environmental geochemistry, and other Earth Science domains.
Funding: the National Natural Science Foundation of China (Nos. 42530801, 42425208); the Natural Science Foundation of Hubei Province, China (No. 2023AFA001); the MOST Special Fund from the State Key Laboratory of Geological Processes and Mineral Resources, China University of Geosciences (No. MSFGPMR2025-401); the China Scholarship Council (No. 202306410181).
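To make the idea of spatial regularization in the latent space concrete, here is a minimal PyTorch-style loss sketch in which the latent means of spatially neighboring samples are pulled together by an extra term; the neighbor radius and the weight lam_sp are assumptions, and this is not the published SC-VAE architecture.

import torch
import torch.nn.functional as F

def vae_spatial_loss(x, x_hat, mu, logvar, coords, lam_sp=1.0, radius=1000.0):
    """Reconstruction + KL terms plus a penalty keeping latent codes of nearby samples similar."""
    recon = F.mse_loss(x_hat, x, reduction="mean")
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    d_geo = torch.cdist(coords, coords)                       # (n, n) geographic distances
    mask = (d_geo < radius).float() - torch.eye(len(coords), device=coords.device)
    d_lat = torch.cdist(mu, mu).pow(2)                        # squared latent distances
    spatial = (mask * d_lat).sum() / mask.sum().clamp(min=1.0)
    return recon + kld + lam_sp * spatial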
In this work, based on the pre-ionization role of the non-uniform electric field and its effect of reducing the collisional ionization coefficient, a diffuse dielectric barrier discharge plasma is formed in the open space outside the electrode structure at a lower voltage by constructing a three-dimensional non-uniform spatial electric field with a contact electrode structure, and an air purification study is carried out. Firstly, a contact electrode structure is constructed using a three-dimensional wire electrode. The distribution characteristics of the spatial electric field formed by this electrode structure are analyzed, and the effects of the non-uniform electric field and of different angles of the vertical wires on the generation of three-dimensional diffuse discharge are investigated. Secondly, a copper foam contact electrode structure is constructed using copper foam material, and the effects of different mesh sizes on the electric field distribution are analyzed. The results show that as the mesh size of the copper foam becomes larger, a strong electric field region exists not only on the surface of the insulating layer but also on the surface of the vertical wires inside the copper foam, i.e., the strong electric field region shows a three-dimensional distribution. In addition, as the mesh size increases, the area of the vertical strong electric field also increases; however, the electric field strength on the surface of the insulating layer gradually decreases. Therefore, an appropriate mesh size can effectively increase the discharge area, which is conducive to improving the air purification efficiency. Finally, a highly permeable stacked electrode structure of multilayer wire-copper foam is designed. In combination with an ozone treatment catalyst, an air purification device is fabricated and the air purification experiment is carried out.
Funding: the Fundamental Research Funds for the Central Universities (No. 2022YJS094).
Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Currently, one commonly used approach to ocean temperature prediction is data-driven, but research on this approach is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two unknown dependencies between sequences based on the original 3D-OT data without prior knowledge. Temporal and spatial dependencies in the time series are then captured using temporal and graph convolutions. We also integrated dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. We conducted prediction experiments using high-resolution 3D-OT from the Copernicus global ocean physical reanalysis, with data covering the vertical variation of temperature from the sea surface to 1000 m below the sea surface. We compared five mainstream models commonly used for ocean temperature prediction, and the results show that our method achieves the best prediction results at all prediction scales.
Funding: the National Key R&D Program of China under contract No. 2021YFC3101603.
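The sketch below illustrates the general mechanism of learning a graph from node embeddings and applying one graph-convolution step over it; the embedding size and the softmax construction of the adjacency are assumptions and do not reproduce the DSTGN architecture, which additionally learns a dynamic, time-varying graph.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnedGraphConv(nn.Module):
    """Adjacency learned from node embeddings, followed by one graph convolution."""
    def __init__(self, num_nodes, emb_dim, in_feats, out_feats):
        super().__init__()
        self.emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.lin = nn.Linear(in_feats, out_feats)

    def forward(self, x):                         # x: (batch, num_nodes, in_feats)
        logits = self.emb @ self.emb.t()          # node-affinity scores
        adj = F.softmax(F.relu(logits), dim=-1)   # row-normalized learned adjacency
        return F.relu(self.lin(adj @ x))          # aggregate neighbors, then transform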
For forward-looking array synthetic aperture radar (FASAR), the scattering intensity of ground scatterers fluctuates greatly because of the variety of vegetation and topography on the ground surface, and thus the signal-to-noise ratio (SNR) of the echo signals corresponding to different vegetation and topography also varies markedly. As is well known, the performance of sparse reconstruction in compressed sensing (CS) degrades at lower SNR, so the quality of sparse three-dimensional imaging for FASAR can be affected significantly in practical applications. In this paper, the spatial continuity of ground scatterers is introduced into the CS sparse recovery algorithm for three-dimensional imaging of FASAR, in which a weighted least-squares method based on cubic interpolation is used to filter out bad and isolated scatterers. Simulation results show that the proposed method can realize sparse three-dimensional imaging of FASAR more effectively in the case of low SNR.
Funding: the National Natural Science Foundation of China (61640006); the Natural Science Foundation of Shaanxi Province, China (2019JM-386).
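A heavily simplified stand-in for the spatial-continuity idea: after sparse reconstruction, voxels whose neighborhood contains too few other nonzero voxels are treated as isolated false scatterers and removed. The 3x3x3 neighborhood count and threshold are assumptions; the paper instead uses a weighted least-squares filter based on cubic interpolation.

import numpy as np
from scipy.ndimage import uniform_filter

def remove_isolated_scatterers(img, min_support=3):
    """Zero voxels whose 3x3x3 neighborhood holds fewer than `min_support` other nonzeros."""
    occupancy = (np.abs(img) > 0).astype(float)
    # neighbor count = 27 * local mean occupancy, minus the voxel itself (boundaries simplified)
    support = uniform_filter(occupancy, size=3) * 27 - occupancy
    out = img.copy()
    out[support < min_support] = 0.0
    return out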
In order to realize visualization of a three-dimensional data field (TDDF) in instruments, two methods of TDDF visualization and the usual means of fast graphics and image processing are analyzed, and it is described how to use the OpenGL technique and the characteristics of the analyzed data to construct a TDDF, together with the approaches to realistic rendering and interactive processing. Then the intermediate geometric elements and a related realistic model are constructed by means of the first algorithm, and models for attaching the third dimension to the three-dimensional data field are presented. An example of TDDF realization for machine measuring is provided. Analysis of the resulting graphics indicates that the three-dimensional graphics built by the developed method feature good realism, fast processing, and strong interactivity.
Funding: the National Natural Science Foundation of China (No. 50405009).
There are some limitations when we apply conventional methods to analyze the massive amounts of seismic data acquired with high-density spatial sampling, since processors usually obtain the properties of raw data from common shot gathers or other datasets located at certain points or along lines. We propose a novel method in this paper to observe seismic data on time slices from spatial subsets. The composition of a spatial subset and the unique character of orthogonal or oblique subsets are described, and pre-stack subsets are shown by 3D visualization. In seismic data processing, spatial subsets can be used for the following aspects: (1) to check the trace distribution uniformity and regularity; (2) to observe the main features of ground-roll and linear noise; (3) to find abnormal traces from slices of datasets; and (4) to QC the results of pre-stack noise attenuation. The field data application shows that seismic data analysis in spatial subsets is an effective method that may lead to a better discrimination among various wavefields and help us obtain more information.
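For illustration, a tiny numpy sketch of pulling one time slice out of an orthogonal spatial subset of a pre-stack volume organized as (shot, receiver, time); the decimation factors and slice index are arbitrary placeholders.

import numpy as np

def orthogonal_subset_slice(data, shot_step=10, rcv_step=10, t_index=500):
    """Keep every shot_step-th shot and rcv_step-th receiver, then take one time slice."""
    subset = data[::shot_step, ::rcv_step, :]     # orthogonal spatial subset
    return subset[:, :, t_index]                  # 2D time slice for QC display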
To improve the performance of traditional map matching algorithms in freeway traffic state monitoring systems using low logging frequency GPS (global positioning system) probe data, a map matching algorithm based on the Oracle spatial data model is proposed. The algorithm uses the Oracle road network data model to analyze the spatial relationships between massive GPS positioning points and freeway networks, builds an N-shortest path algorithm to find reasonable candidate routes between GPS positioning points efficiently, and uses a fuzzy logic inference system to determine the final matched traveling route. According to the implementation with field data from Los Angeles, the computation speed of the algorithm is about 135 GPS positioning points per second and the accuracy is 98.9%. The results demonstrate the effectiveness and accuracy of the proposed algorithm for mapping massive GPS positioning data onto freeway networks with complex geometric characteristics.
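The snippet below shows only the geometric core of such a matcher, projecting a GPS point onto candidate road segments and keeping the closest one; the link representation and the distance-only score are simplified assumptions and do not use the Oracle network data model or the paper's fuzzy inference rules.

import numpy as np

def snap_to_segment(p, a, b):
    """Project point p onto segment ab (assumed non-degenerate); return (snapped point, distance)."""
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    q = a + t * ab
    return q, float(np.linalg.norm(p - q))

def best_candidate(p, segments):
    """Among candidate links [(a, b), ...], keep the one with the smallest offset."""
    return min((snap_to_segment(p, a, b) + (i,) for i, (a, b) in enumerate(segments)),
               key=lambda r: r[1])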
An integrated processing system for visualization of three-dimensional laser scanning information in goafs was developed. It provides multiple functions, such as laser scanning information management for goafs, point cloud data de-noising and optimization, construction, display and manipulation of three-dimensional models, model editing, profile generation, calculation of goaf volume and roof area, Boolean operations among models, and interaction with third-party software. With a concise interface and plentiful data input/output interfaces, the system features high integration and simple, convenient operation. Practice shows that, in addition to being well adapted, the system is reliable and stable.
Funding: Project (51274250) supported by the National Natural Science Foundation of China; Project (2012BAK09B02-05) supported by the National Key Technology R&D Program during the 12th Five-Year Plan of China.
Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, image processing, etc. We combine a sampling technique with the DBSCAN algorithm to cluster large spatial databases, and two sampling-based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces the sampling technique inside DBSCAN, and the other uses a sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large-scale spatial databases.
Funding: the Open Research Fund Program of LIESMARS (WKL(00)0302).
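A minimal sketch of the "sampling outside DBSCAN" idea: cluster a random sample with scikit-learn's DBSCAN, then label every remaining point by its nearest core sample if that core sample lies within eps; the sampling rate and parameters are placeholders, and this is not the authors' exact SDBSCAN implementation.

import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

def sdbscan_outside(X, sample_frac=0.2, eps=0.5, min_samples=5, seed=0):
    """Run DBSCAN on a sample, then propagate cluster labels to the full dataset."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=max(1, int(sample_frac * len(X))), replace=False)
    db = DBSCAN(eps=eps, min_samples=min_samples).fit(X[idx])
    core = X[idx][db.core_sample_indices_]            # core points found in the sample
    core_labels = db.labels_[db.core_sample_indices_]
    dist, j = NearestNeighbors(n_neighbors=1).fit(core).kneighbors(X)
    return np.where(dist[:, 0] <= eps, core_labels[j[:, 0]], -1)   # -1 marks noise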
Eighty percent of big data are associated with spatial information and thus constitute Big Spatial Data (BSD). BSD provides new and great opportunities to rework problems in urban and environmental sustainability with advanced BSD analytics. To fully leverage the advantages of BSD, it is integrated with conventional data (e.g., remote sensing images) and improved methods are developed. This paper introduces four case studies: (1) detection of polycentric urban structures; (2) evaluation of urban vibrancy; (3) estimation of population exposure to PM2.5; and (4) urban land-use classification via deep learning. The results provide evidence that integrated methods can harness the advantages of both traditional data and BSD. Meanwhile, they can also improve the effectiveness of big data itself. Finally, this study makes three key recommendations for the development of BSD with regard to data fusion, data and predictive analytics, and theoretical modeling.
A novel Hilbert curve is introduced for parallel spatial data partitioning, with consideration of the huge-volume property of spatial information and the variable-length characteristic of vector data items. Based on the improved Hilbert curve, an algorithm can be designed to achieve almost-uniform spatial data partitioning among multiple disks in parallel spatial databases. Thus, data imbalance can be significantly avoided and search and query efficiency can be enhanced.
Funding: the National 863 Program of China (No. 2005AA113150) and the National Natural Science Foundation of China (No. 40701158).
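To illustrate the basic mechanism (a standard Hilbert curve, not the paper's improved variant), the sketch below computes the Hilbert index of a 2D grid cell and then assigns variable-length records to disks in Hilbert order while balancing the total bytes per disk; the 16-bit grid order and the greedy balancing rule are assumptions.

def hilbert_d(order, x, y):
    """Hilbert index of cell (x, y) on a 2**order x 2**order grid (standard construction)."""
    n = 1 << order
    d = 0
    s = n >> 1
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                          # rotate/reflect the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s >>= 1
    return d

def partition(items, n_disks, order=16):
    """items: iterable of (x, y, nbytes); balance bytes across disks in Hilbert order."""
    loads = [0] * n_disks
    assign = {}
    for x, y, nbytes in sorted(items, key=lambda it: hilbert_d(order, it[0], it[1])):
        disk = loads.index(min(loads))       # place the next record on the lightest disk
        assign[(x, y)] = disk
        loads[disk] += nbytes
    return assign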
China's continental deposition basins are characterized by complex geological structures and various reservoir lithologies; therefore, high-precision exploration methods are needed. High-density spatial sampling is a new technology for increasing the accuracy of seismic exploration. We briefly discuss point-source and point-receiver technology, analyze field methods for high-density spatial sampling, introduce the symmetric sampling principles presented by Gijs J. O. Vermeer, and discuss high-density spatial sampling technology from the point of view of wavefield continuity. We emphasize the analysis of high-density spatial sampling characteristics, including the advantages of high-density first breaks for investigating near-surface structure and improving static correction precision, the use of dense receiver spacing at short offsets to increase the effective coverage at shallow depth, and the accuracy of reflection imaging. Coherent noise is not aliased, so noise analysis precision and suppression improve as a result. High-density spatial sampling enhances wavefield continuity and the accuracy of various mathematical transforms, which benefits wavefield separation. Finally, we point out that the difficult part of high-density spatial sampling technology is data processing; more research needs to be done on methods for analyzing and processing such huge amounts of seismic data.
The efficacy of vegetation dynamics simulations in offline land surface models (LSMs) largely depends on the quality and spatial resolution of meteorological forcing data. In this study, the Princeton Global Meteorological Forcing Data (PMFD) and the high spatial resolution and upscaled China Meteorological Forcing Data (CMFD) were used to drive the Simplified Simple Biosphere model version 4/Top-down Representation of Interactive Foliage and Flora Including Dynamics (SSiB4/TRIFFID) and investigate how meteorological forcing datasets with different spatial resolutions affect simulations over the Tibetan Plateau (TP), a region with complex topography and sparse observations. By comparing the monthly Leaf Area Index (LAI) and Gross Primary Production (GPP) against observations, we found that SSiB4/TRIFFID driven by the upscaled CMFD improved the performance in simulating the spatial distributions of LAI and GPP over the TP, reducing RMSEs by 24.3% and 20.5%, respectively. The multi-year averaged GPP decreased from 364.68 gC m^(-2) yr^(-1) to 241.21 gC m^(-2) yr^(-1), with the percentage bias dropping from 50.2% to -1.7%. When using the high spatial resolution CMFD, the RMSEs of the spatial distributions of the LAI and GPP simulations were further reduced by 7.5% and 9.5%, respectively. This study highlights the importance of more realistic and high-resolution forcing data in simulating vegetation growth and carbon exchange between the atmosphere and biosphere over the TP.
Funding: the National Natural Science Foundation of China (Grant Nos. 42130602, 42175136); the Collaborative Innovation Center for Climate Change, Jiangsu Province, China.
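For clarity on the two skill scores quoted above, a small sketch of how RMSE and percentage bias are typically computed from paired simulated and observed fields; the NaN masking and this particular percentage-bias definition (100 times the sum of errors over the sum of observations) are assumptions about the evaluation setup.

import numpy as np

def rmse_and_pbias(sim, obs):
    """RMSE and percentage bias, ignoring grid cells where either field is missing."""
    m = ~(np.isnan(sim) | np.isnan(obs))
    diff = sim[m] - obs[m]
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    pbias = float(100.0 * diff.sum() / obs[m].sum())
    return rmse, pbias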
There are hundreds of villages in the western mountainous area of Beijing, quite a few of which have a profound history and together form the settlement culture of western Beijing. Taking dozens of ancient villages in Mentougou District as the research sample and village space as the research object, and based on the ASTER GDEM database and quantitative analysis tools such as Global Mapper and ArcGIS, this study analyzed the villages from the perspectives of altitude, topography, slope direction, and building density distribution, and carried out a quantitative study of the spatial distribution and plane structure of the ancient villages. The laws of village space characteristic of western Beijing were thereby summarized, supplementing and improving the research achievements in the field of ancient villages in western Beijing.
Funding: the National Natural Science Foundation of China (51608007); the Young Top-notch Talent Cultivation Project of North China University of Technology (2018).
With the deepening informationization of the Resources & Environment Remote Sensing geological survey, some potential problems and deficiencies are: (1) the lack of a uniformly planned running environment; (2) inconsistent methods of data integration; and (3) the disadvantages of different ways of performing data integration. This paper solves the above problems through overall planning and design, and constructs a unified running environment, consistent methods of data integration, and a system structure, in order to advance informationization.
In order to provide a provincial spatial database, this paper presents a scheme for spatial database construction that meets the needs of China. The objective and overall technical route of spatial database construction are described, and the logical and physical database models are designed. Key issues are addressed, such as the integration of multi-scale heterogeneous spatial databases, spatial data version management based on metadata, and the integrated management of map cartography and the spatial database.
Funding: the 863 High Technology Program of China (No. 2007AA12Z214), the National Natural Science Foundation of China (No. 40601083) and the National Key Basic Research and Development Program of China (No. 2004CB318206).
In this paper a review of current research on 3DCM is presented, and an alternative approach is suggested that integrates the concepts and techniques of the object-oriented method and Computer Aided Design (CAD). Through this approach, urban spatial entities are extracted as objects, which are represented with primary 3D elements (node, edge, face and body) and their combinations. In light of the concept of object, the method supports multiple representations of Levels of Detail (LOD). More importantly, topological relationships between objects are described so that 3D topological operations can be implemented.
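A minimal sketch of how the four primary 3D elements and a simple topological query might look as object classes; the attribute names and the shared-face adjacency test are illustrative assumptions, not the paper's data model.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    x: float
    y: float
    z: float

@dataclass
class Edge:
    start: Node
    end: Node

@dataclass
class Face:
    edges: List[Edge] = field(default_factory=list)

@dataclass
class Body:
    faces: List[Face] = field(default_factory=list)
    lod: int = 0                          # level of detail this representation belongs to

    def adjacent(self, other: "Body") -> bool:
        """Two bodies are adjacent if they share at least one Face object."""
        return any(f is g for f in self.faces for g in other.faces)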