Processing large-scale 3-D gravity data is an important topic in geophysics. Many existing inversion methods cannot handle massive datasets and are difficult to apply in practice. This study applies GPU parallel processing technology to the focusing inversion method, aiming to improve inversion accuracy while speeding up computation and reducing memory consumption, thereby obtaining fast and reliable inversion results for large, complex models. Equivalent storage of the geometric trellis is used to calculate the sensitivity matrix, and the inversion is implemented with GPU parallel computing. The parallel computing program, optimized by reducing data transfers, easing memory-access and instruction restrictions, and hiding latency, greatly reduces memory usage, speeds up calculation, and makes fast inversion of large models possible. Comparing the computing speed of a traditional single-threaded CPU implementation with the CUDA-based GPU parallel implementation verifies the excellent acceleration performance of GPU parallel computing, which offers a path to practical application for theoretical inversion methods otherwise restricted by computing speed and computer memory. Model tests verify that the focusing inversion method can overcome the problems of severe skin effect and ambiguity of geological body boundaries. Moreover, increasing the number of model cells and inversion data depicts the boundary position of the anomalous body more clearly and delineates its specific shape.
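The forward step at the heart of such an inversion can be sketched as a sensitivity matrix G mapping cell densities to station anomalies, d = G·rho. The following is a minimal illustrative sketch only: the point-mass kernel, grid geometry, and all names are assumptions, not the paper's equivalent-storage scheme or its CUDA implementation (vectorized NumPy stands in for the GPU kernels).

```python
import numpy as np

GAMMA = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sensitivity_matrix(stations, cells, cell_volume):
    """G[i, j]: vertical gravity at station i from unit density in cell j.

    z axis points up; cells lie below the stations, so dz > 0 and the
    downward attraction is positive. Point-mass approximation of a prism.
    """
    dx = stations[:, None, 0] - cells[None, :, 0]
    dy = stations[:, None, 1] - cells[None, :, 1]
    dz = stations[:, None, 2] - cells[None, :, 2]
    r = np.sqrt(dx**2 + dy**2 + dz**2)
    return GAMMA * cell_volume * dz / r**3

stations = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0]])   # m
cells = np.array([[0.0, 0.0, -200.0], [100.0, 0.0, -200.0]])  # cell centers, m
G = sensitivity_matrix(stations, cells, cell_volume=10.0**3)
rho = np.array([500.0, 0.0])   # density contrast per cell, kg/m^3
d = G @ rho                    # predicted anomaly at each station, m/s^2
```

The station directly above the anomalous cell sees the larger signal; in the real problem G has (stations × cells) entries, which is why equivalent storage and GPU parallelism matter for large models.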
Background A task assigned to space exploration satellites involves detecting the physical environment within a certain region of space. However, space detection data are complex and abstract, and do not lend themselves to researchers' visual perception of the evolution and interaction of events in the space environment. Methods A time-series dynamic data sampling method for large-scale space was proposed to sample detection data in space and time, and correspondences between data location features and other attribute features were established. A tone-mapping method based on statistical histogram equalization was proposed and applied to the final attribute feature data. The visualization process was optimized for rendering by merging materials, reducing the number of patches, and performing other operations. Results Sampling, feature extraction, and uniform visualization were achieved for detection data of complex types, long duration spans, and uneven spatial distributions. Real-time visualization of large-scale spatial structures on augmented reality devices, particularly low-performance devices, was also investigated. Conclusions The proposed visualization system can reconstruct the three-dimensional structure of a large-scale space, express the structure of and changes in the spatial environment using augmented reality, and assist in intuitively discovering spatial environmental events and evolutionary rules.
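Tone mapping by histogram equalization, as described in the Methods, remaps values through the empirical cumulative distribution so the display range is used evenly. A minimal sketch, assuming a bin count and output range of our own choosing (not taken from the paper):

```python
import numpy as np

def equalize(values, bins=256):
    """Map values through the empirical CDF (histogram equalization)."""
    hist, edges = np.histogram(values, bins=bins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                        # normalize CDF to [0, 1]
    # assign each value to its bin, then look up the cumulative fraction
    idx = np.clip(np.digitize(values, edges[:-1]) - 1, 0, bins - 1)
    return cdf[idx]

rng = np.random.default_rng(0)
skewed = rng.exponential(scale=1.0, size=10_000)  # heavily skewed attribute data
mapped = equalize(skewed)                          # roughly uniform on [0, 1]
```

A heavily skewed attribute distribution, which would waste most of a linear color scale, comes out approximately uniform, so contrasts are visible across the whole value range.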
Social media data have created a paradigm shift in assessing situational awareness during natural disasters and emergencies such as wildfires, hurricanes, and tropical storms. Twitter, as an emerging data source, is an effective and innovative digital platform for observing trends from the perspective of social media users who are direct or indirect witnesses of the calamitous event. This paper collects and analyzes Twitter data related to the recent wildfire in California to perform a trend analysis by classifying firsthand and credible information from Twitter users. Tweets on the wildfire are classified by witness type into two groups: 1) direct witnesses and 2) indirect witnesses. The collected and analyzed information can be useful to law enforcement agencies and humanitarian organizations for communicating and verifying situational awareness during wildfire hazards. Trend analysis is an aggregated approach that includes sentiment analysis and topic modeling performed through domain-expert manual annotation and machine learning, ultimately building a fine-grained analysis to assess evacuation routes and provide valuable information to first responders.
Faced with a growing number of large-scale data sets, the affinity propagation (AP) clustering algorithm must build a full similarity matrix, which incurs huge storage and computation costs. This paper therefore proposes an improved affinity propagation clustering algorithm. First, subtractive clustering is added, using the density values of the data points to obtain initial cluster points. Then, the similarity distances between the initial cluster points are calculated and, borrowing the idea of semi-supervised clustering, pairwise constraint information is added to construct a sparse similarity matrix. Finally, AP clustering is run on the cluster representative points until a suitable cluster division is reached. Experimental results show that the algorithm greatly reduces computation, lowers the storage required for the similarity matrix, and outperforms the original algorithm in clustering quality and processing speed.
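The subtractive-clustering seeding step can be sketched as follows: each point's "mountain" density is a sum of Gaussian kernels over all points, and the densest point seeds the first initial cluster. The kernel form and the radius parameter r_a are standard subtractive-clustering choices assumed here, not details from the paper.

```python
import numpy as np

def subtractive_density(X, r_a=1.0):
    """Mountain density of each point: sum of Gaussian kernels over all points."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    return np.exp(-d2 / (r_a / 2) ** 2).sum(axis=1)

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],  # tight cluster
              [5.0, 5.0]])                           # isolated point
density = subtractive_density(X)
center = int(np.argmax(density))  # densest point seeds the first cluster
```

Running AP only on such seed points, with a sparse similarity matrix, is what cuts the quadratic storage and computation of the original algorithm.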
Based on Pilkington's synchronous joint gravity and magnetic inversion for a single interface, and motivated by the need to reveal Cenozoic and crystalline basement thickness in the new round of oil and gas exploration, we propose a joint gravity and magnetic inversion method for two-layer models, concentrating on the relationship between the anomaly and the change in thickness and position of the middle layer, and discuss the effects of the key parameters. Model tests and application to field data show the validity of this method.
The Tikhonov regularization (TR) method has played a very important role in processing gravity and magnetic data. In this paper, the Tikhonov regularization method for the inversion of gravity data is discussed, and the extrapolated TR method (EXTR) is introduced to improve the fitting error. Furthermore, the effects of the EXTR method's parameters on the fitting error, number of iterations, and inversion results are discussed in detail. Computations using synthetic models with the same and with different densities indicate that, compared with the TR method, the EXTR method not only achieves the a priori fitting error level set by the interpreter but also increases the fitting precision, although at the cost of more computation time and iterations. The EXTR inversion results are also more compact than the more divergent TR results: the range of the inverted values is closer to the default range of the model parameters, and the recovered model features agree well with the default model density distribution.
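The plain TR step underlying both methods can be written in a few lines: the regularized solution of d = G·m minimizes ||G·m − d||² + α||m||², giving the normal equations (GᵀG + αI)m = Gᵀd. This sketch shows only plain TR on an assumed random test system; the extrapolation scheme of EXTR is not reproduced here.

```python
import numpy as np

def tikhonov(G, d, alpha):
    """Tikhonov-regularized least squares: (G^T G + alpha I) m = G^T d."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ d)

rng = np.random.default_rng(1)
G = rng.normal(size=(20, 10))            # illustrative forward operator
m_true = rng.normal(size=10)
d = G @ m_true + 0.01 * rng.normal(size=20)  # noisy data
m_est = tikhonov(G, d, alpha=1e-3)
```

The regularization parameter α trades data fit against model norm; EXTR, as the abstract describes, pushes the fit below the a priori error level at the price of extra iterations.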
The Atlas region in northwest Africa is characterized by Quaternary volcanism and elevated topography, with a complex past tectonic interaction between the African and European plates. The geodynamics of this atypical region has indubitably left imprints in the crustal architecture, mainly in the crustal thickness and the crustal density structure. Knowledge of crustal thickness variations is of significant interest, since it provides a crucial constraint for geodynamic and geophysical modelling of this region. In this study, we use gravity, topographic, bathymetric, and sediment data together with results of seismic surveys to image the Moho topography beneath the Atlas region. The Bouguer gravity anomalies used for the gravimetric Moho recovery are obtained from free-air gravity anomalies after subtracting the gravitational contributions of topography, bathymetry, and sediments. The regional gravimetric Moho inversion, constrained by seismic data, is carried out by applying a regularized inversion technique based on a Gauss-Newton formulation of an improved Bott's method, adopting a spherical approximation of the Earth. The numerical result reveals relatively significant Moho depth variations in the Moroccan Atlas, with minima of approximately 24 km along the continental margins of the Mediterranean Sea and maxima exceeding 51 km beneath the Rif Cordillera. The Moho depth beneath the West African Craton varies from 32 km at its southern margin to 45 km beneath the Middle Atlas. The Tell Atlas is characterized by a shallow Moho depth of approximately 22 km, deepening to 42 km towards the northern edge of the Aures Mountains. Our findings indicate limited tectonic shortening of the High Atlas, with the crustal thickness mostly within 36-42 km. Topographic discrepancies between the Rif Cordillera and the Atlas Mountains suggest that the hypothesis of isostatic compensation cannot be fully established.
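The classical Bott iteration that the regularized Gauss-Newton scheme improves upon updates the interface depth from the gravity residual through the Bouguer slab relation Δg = 2πGΔρΔh. A minimal sketch, assuming a flat-slab forward model and an illustrative crust-mantle density contrast (the paper's actual scheme adds regularization and spherical geometry):

```python
import numpy as np

G_CONST = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
DRHO = 400.0          # assumed crust-mantle density contrast, kg/m^3

def bott_update(depth, g_obs, g_calc):
    """One Bott iteration: convert the gravity residual to a depth correction."""
    return depth + (g_obs - g_calc) / (2 * np.pi * G_CONST * DRHO)

# Against a forward model that is exactly the Bouguer slab formula,
# a single update recovers the Moho relief.
true_dh = np.array([1000.0, -500.0])              # relief to recover, m
g_obs = 2 * np.pi * G_CONST * DRHO * true_dh      # synthetic observations
depth = bott_update(np.zeros(2), g_obs, g_calc=np.zeros(2))
```

In practice the forward response of real relief is not an infinite slab, so the update is iterated; the regularization constrained on seismic Moho depths stabilizes that iteration.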
With the continuous development of full tensor gradiometry (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is increasingly used in oil and gas exploration. For fast processing and interpretation of large-scale, high-precision data, the graphics processing unit (GPU) and preconditioning methods are very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique with the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a GPU-based parallel implementation is proposed. The improved method is then applied to the inversion of noise-contaminated synthetic data to prove its suitability for the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times over a serial program on a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method for fast inversion of 3D FTG data.
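The building block here is a preconditioned conjugate gradient (PCG) solver. As a hedged illustration, the sketch below applies a standard SSOR preconditioner M = ω/(2−ω)·(D/ω + L)·D⁻¹·(D/ω + L)ᵀ inside textbook PCG; the incomplete-Cholesky part and the GPU kernels of the paper's combined SSOR-ICCG scheme are omitted, and the test system is illustrative.

```python
import numpy as np

def ssor_apply(A, r, omega=1.0):
    """Solve M z = r for the SSOR preconditioner M of the SPD matrix A."""
    D = np.diag(np.diag(A))
    L = np.tril(A, -1)
    y = np.linalg.solve(D / omega + L, r)          # forward sweep
    y = (2 - omega) / omega * (np.diag(A) * y)     # diagonal scaling
    return np.linalg.solve((D / omega + L).T, y)   # backward sweep

def pcg(A, b, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients for A x = b, A symmetric positive definite."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = ssor_apply(A, r)
    p = z.copy()
    for _ in range(maxit):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = ssor_apply(A, r_new)
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # small SPD test system
b = np.array([1.0, 2.0, 3.0])
x = pcg(A, b)
```

A better preconditioner lowers the iteration count but costs time to apply, which is exactly the trade-off the paper moves onto the GPU.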
The density inversion of gravity gradiometry data has attracted considerable attention; however, for large datasets, the non-uniqueness and low depth resolution, as well as the efficiency, are constrained by time and computer memory requirements. To solve these problems, we improve the reweighted focusing inversion and probability tomography inversion with joint multiple tensors and prior information constraints, and assess the inversion results, computing efficiency, and dataset size. A Message Passing Interface (MPI)-Open Multi-Processing (OpenMP)-Compute Unified Device Architecture (CUDA) multilevel hybrid parallel inversion, named Hybrinv for short, is proposed. Using model data and real data from the Vinton Dome, we confirm that Hybrinv can be used to compute the density distribution. For a data size of 100×100×20, the hybrid parallel algorithm is fast, and from the run time and scalability we infer that it can be used to process large-scale data.
A new gravity survey was carried out in the northern part of the onshore Kribi-Campo sub-basin in Cameroon. The data were incorporated into the existing ones and then analyzed and modeled in order to elucidate the subsurface structure of the area. The area is characterized in its north-western part by considerably high positive anomalies indicative of the presence of a dense intrusive body. We find: 1) from the analysis of the gravity residual anomaly map, that the high positive anomalies observed are the signature of a shallow dense structure; 2) from the multi-scale analysis of the maxima of the horizontal gradient, that the structure is confined between depths of 0.5 km and 5 km; 3) from the quantitative interpretation of residual anomalies by spectral analysis, that the depth to the upper surface of the intrusive body is not uniform: the average depth of the bottom is h1 = 3.6 km and the depths to particular sections of the roof of the intrusion are h2 = 1.6 km and h3 = 0.5 km; 4) and from 3D modeling, results suggestive of contacts between rocks of different densities at different depths and of a dense intrusive igneous body in the upper crust of the Kribi zone. In the 3D model, the dense intrusive igneous block is surrounded by sedimentary formations to the south-west and metamorphic formations to the north-east; both formations have a density of about 2.74 g/cm3. The near-surface portions of this igneous block lie at depths of 0.5 km to 1.5 km, while its lower surface lies at depths of 3.6 km to 5.2 km. The shape of the edges and bottom of the intrusive body suggests that it forms part of a broader structure underlying the Kribi-Campo sub-basin with a great influence on the sedimentary cover.
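The spectral-analysis depth estimate used in point 3) rests on the fact that, for an ensemble of sources at depth h, the radially averaged log power spectrum decays roughly linearly, ln P(k) ≈ −2hk, so h follows from the slope of a straight-line fit. A sketch on an idealized synthetic spectrum (constructed here, not the field data):

```python
import numpy as np

h_true = 3.6                        # assumed source depth, km
k = np.linspace(0.05, 0.5, 40)      # wavenumber, rad/km
lnP = -2.0 * h_true * k + 0.8       # idealized log power spectrum (+ constant)

slope, _ = np.polyfit(k, lnP, 1)    # straight-line fit to ln P vs k
h_est = -slope / 2.0                # recovered depth, km
```

On real spectra the fit is done piecewise over wavenumber bands, which is how distinct depths such as the top and bottom of the intrusion can be separated.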
Joint inversion based on a correlation constraint utilizes a linear correlation function as a structural constraint. The linear correlation function contains a denominator, which may produce a singularity as the objective function is optimized, leading to an unstable inversion calculation. To improve the robustness of this calculation, this paper proposes a new method in which a sinusoidal correlation function is employed as the structural constraint for joint inversion instead of the conventional linear correlation function. This structural constraint contains no denominator, thereby preventing a singularity. Compared with the joint inversion method based on a cross-gradient constraint, the joint inversion method based on a sinusoidal correlation constraint exhibits good performance. An application to field data demonstrates that the method can process real data.
Lineaments in the southeastern Jordan plateau were mapped using gravity data and field studies in order to understand the tectonic origin of these lineaments, especially in relation to the Dead Sea transform (DST) and the Red Sea opening. Four sets trending E-W, NW-SE, NE-SW, and N-S are identified in the gravity data, and field studies generally reveal similar orientations. Field and gravity studies indicate that most of the lineaments are extensional features corresponding to normal faults, most of which were subsequently reactivated as strike-slip shear fractures. The NW-SE and N-S lineaments represent dilatational fractures. The N-S trending lineaments are the oldest. The E-W lineaments form conjugate shear fractures and are younger than the N-S lineaments; these conjugate shear fractures are in turn older than another set of conjugate shear fractures oriented NE-SW. The evolution of all these fractures is attributed to the DST and the Red Sea spreading. Kinematic and dynamic analysis of the older and younger pairs of conjugate strike-slip fractures revealed, respectively, broadly NW-SE and N-S oriented transpressional stress, with corresponding transtensional stress oriented NE-SW and E-W.
By systematic processing, comprehensive analysis, and interpretation of gravity data, we confirmed the existence of the west segment of the coastal fault zone (west of Yangjiang to Beibu Bay) in the coastal region of South China. This segment shows an apparent high gravity gradient in the NEE direction, with worse linearity and less compactness than the segment at the Pearl River mouth, and reveals a relatively large curvature and a complicated gravity structure. In the images produced from the processed gravity data, each fault is well reflected and primarily characterized by isolines or thick black stripes with a cutting depth greater than 30 km. Though mutually cut by NW-trending and NE-trending faults, the apparent NEE stripe-shaped structure of the west segment of the coastal fault zone remains unchanged, with good continuity and an activity strength higher than that of the NW- and NE-trending faults. Moreover, we determined that the west segment of the coastal fault zone is the major seismogenic structure responsible for strong earthquakes in the coastal border area of Guangdong, Guangxi, and Hainan.
The Earth gravity field model CDS01S of degree and order 36 has been recovered from the post-processed science orbits and on-board accelerometer data of GFZ's CHAMP satellite. The model resolves the geoid with an accuracy better than 4 cm at a resolution of 700 km half-wavelength. Comparing CDS01S with EIGEN3P, EIGEN1S, and EGM96 using the degree difference variances of the geopotential coefficients indicates that the coefficients of CDS01S are closest to those of EIGEN3P. Comparison of the coefficient accuracies of the above models indicates that the accuracy of the coefficients of CDS01S is higher than that of EGM96. The geoid undulations of CDS01S and GGM01C up to degree 30 were calculated, and the standard deviation between them is 4.7 cm.
Regarding rapid compensation of the influence of the Earth's disturbing gravity field on trajectory calculation, the key lies in deriving analytical solutions for the partial derivatives of the burnout-point state with respect to the launch data. In view of this, this paper addresses two issues: first, based on an approximate analytical solution of the equations of motion for the vacuum flight section of a long-range rocket, it derives analytical solutions for the partial derivatives of the burnout-point state with respect to the rate of change of the final-stage pitch program; second, based on the propagation mechanism of initial positioning and orientation errors, it proposes an analytical formula for the partial derivatives of the burnout-point state with respect to the launch azimuth. The calculated data corrections are simulated and verified under different circumstances. The simulation results show that: (1) the agreement between the analytical solutions and the results obtained via the difference method is better than 90%, while the ratio of their calculation times is below 0.2%, demonstrating both the accuracy of the data corrections and an advantage in calculation speed; (2) after compensation using the analytical solutions, the longitudinal landing deviation of the rocket is less than 20 m and the lateral landing deviation less than 10 m, demonstrating that the corrected data meet the hit-accuracy requirements of a long-range rocket.
The processing and interpretation of gravity and gravity gradient data play an important role in geophysics. Cross-gradient joint inversion is commonly used to achieve structural coupling of multiple geophysical models. In order to couple gravity and gravity tensor data, the authors analyzed each tensor component; the results show that different types of data contain information in different directions. A joint inversion based on the cross-gradient function was then derived and applied to model data. The theoretical model results show that the cross-gradient method can reduce the non-uniqueness of the solution and significantly improve the resolution of the inversion. The method was also applied to invert the gravity tensor data over the Vinton salt dome, showing that it yields higher-resolution results than separate linear inversion and is closer to the real density known from drilling data.
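The cross-gradient structural measure that couples two models can be sketched directly: t = ∇m1 × ∇m2 vanishes wherever the two models' spatial gradients are parallel, i.e. wherever their structures agree. The 2-D grid and grid spacing below are illustrative.

```python
import numpy as np

def cross_gradient(m1, m2, dx=1.0, dz=1.0):
    """Out-of-plane component of grad(m1) x grad(m2) on a 2-D grid."""
    g1x, g1z = np.gradient(m1, dx, dz)
    g2x, g2z = np.gradient(m2, dx, dz)
    return g1x * g2z - g1z * g2x

x = np.linspace(0.0, 1.0, 50)
X, Z = np.meshgrid(x, x, indexing="ij")
m1 = X + Z          # two models whose gradients are everywhere parallel...
m2 = 3.0 * (X + Z)  # ...give a cross-gradient of (numerically) zero
t = cross_gradient(m1, m2)
```

In the joint inversion, ||t||² is added to the objective function, penalizing structural disagreement without forcing the two property values themselves to be correlated.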
To generate carbon credits under the Reducing Emissions from Deforestation and forest Degradation program (REDD+), accurate estimates of forest carbon stocks are needed. Carbon accounting efforts have focused on carbon stocks in aboveground biomass (AGB). Although wood specific gravity (WSG) is known to be an important variable in AGB estimates, there is currently a lack of WSG data for Malagasy tree species. This study aimed to determine whether estimates of carbon stocks calculated from literature-based WSG values differed from those based on WSG values measured on wood core samples. Carbon stocks in forest biomass were assessed using two WSG data sets: (i) values measured from 303 wood core samples extracted in the study area, and (ii) values derived from international databases. Results suggested that the field and literature-based WSG differ at the 0.05 level, the latter data set being on average 16% higher than the former. However, carbon stocks calculated from the two data sets did not differ significantly at the 0.05 level. This can be attributed to the form of the allometric equation used, which gives more weight to tree diameter and tree height than to WSG. The choice of dataset should depend on the level of accuracy (Tier II or III) desired under REDD+. As higher levels of accuracy are rewarded with higher prices, species-specific WSG data would be highly desirable.
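The damping effect of the allometric form can be made concrete. As an assumed stand-in for the study's equation, the widely used Chave et al. (2014) pantropical allometry AGB = 0.0673·(ρD²H)^0.976 raises WSG (ρ) to an exponent below 1, so a 16% shift in WSG moves the biomass estimate by somewhat less than 16%; the tree dimensions below are invented for illustration.

```python
def agb_kg(rho, dbh_cm, height_m):
    """Chave et al. (2014) pantropical allometry (assumed here, not the paper's).

    rho: wood specific gravity, g/cm^3; dbh: diameter at breast height, cm;
    height: total tree height, m; returns aboveground biomass in kg.
    """
    return 0.0673 * (rho * dbh_cm**2 * height_m) ** 0.976

field = agb_kg(rho=0.60, dbh_cm=30.0, height_m=20.0)         # field-measured WSG
literature = agb_kg(rho=0.60 * 1.16, dbh_cm=30.0, height_m=20.0)  # +16 % WSG
rel_increase = literature / field - 1.0   # slightly below 0.16: damped by the exponent
```

Since diameter enters squared while WSG enters linearly (both under the same sub-unit exponent), measurement effort on D and H pays off more than refining WSG, consistent with the abstract's finding.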
Global bathymetry models are usually of low accuracy along the coastlines of polar areas due to the harsh climatic environment and complex topography. Satellite altimetric gravity data can supplement them and play a key role in bathymetry modeling over these regions. The Synthetic Aperture Radar (SAR) altimeters in missions such as CryoSat-2 and Sentinel-3A/3B relieve the waveform contamination that affects conventional altimeters and provide data with improved accuracy and spatial resolution. In this study, we investigate the potential application of SAR altimetric gravity data in enhancing coastal bathymetry, quantifying and evaluating the effects of SAR altimetry data on local bathymetry modeling. Furthermore, we study the effects of different scale-factor calculation approaches on bathymetry modeling, implementing a partition-wise scheme. A numerical experiment over the South Sandwich Islands near Antarctica suggests that using SAR-based altimetric gravity data improves local coastal bathymetry modeling by a magnitude of 3.55 m within 10 km of offshore areas, compared with the model calculated without SAR altimetry data. Moreover, using the partition-wise scheme for scale-factor calculation improves the quality of the coastal bathymetry model by 7.34 m compared with the result derived from the traditional method. These results indicate the superiority of using SAR altimetry data in coastal bathymetry inversion.
A novel method for noise removal for the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, a smoothness index, and the cross-correlation coefficient is designed using Chebyshev optimal uniform approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulations and prototype experiments show that the proposed processing method improves the measurement variance of gravity gradient signals by 14% compared with other commonly used methods, with the measurement accuracy reaching within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insight for high-precision airborne gravity gradiometry.
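The zero-phase idea behind the head-to-tail expansion can be sketched in a few lines: the record is extended with mirrored copies so the filter sees no edge discontinuity, the data are filtered forward and then backward so phase shifts cancel, and the extension is trimmed off. A simple moving-average low-pass stands in here for the paper's Chebyshev band-pass; the window length, pad size, and test signal are illustrative.

```python
import numpy as np

def zero_phase_smooth(x, win=9):
    """Forward-backward moving-average filter with mirrored head-to-tail padding."""
    pad = len(x)
    # mirror a full copy onto each end so the filter has no edge transient
    xe = np.concatenate([x[pad - 1::-1], x, x[:-pad - 1:-1]])
    k = np.ones(win) / win
    y = np.convolve(xe, k, mode="same")               # forward pass
    y = np.convolve(y[::-1], k, mode="same")[::-1]    # backward pass cancels phase
    return y[pad:pad + len(x)]                        # trim the extension

t = np.linspace(0.0, 1.0, 400)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean + 0.3 * np.random.default_rng(2).normal(size=t.size)
smooth = zero_phase_smooth(noisy)
```

Because the two passes cancel each other's phase response, filtered features stay time-aligned with the raw gradient record, which matters when the denoised signal is compared against a reference.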
Funding (GPU-accelerated focusing inversion of 3-D gravity data): Supported by a Project of the National Natural Science Foundation of China (No. 41874134).
Funding (improved affinity propagation clustering): Partially supported by the National Natural Science Foundation of China (51175169) and the National Science and Technology Support Program (2012BAF02B01).
Abstract: When facing a growing number of large-scale data sets, the affinity propagation clustering algorithm must build a full similarity matrix, which incurs enormous storage and computation costs. This paper therefore proposes an improved affinity propagation clustering algorithm. First, subtractive clustering is added, and the density values of the data points are used to obtain initial cluster centers. Next, the similarity distances between the initial cluster centers are calculated and, borrowing the idea of semi-supervised clustering, pairwise constraint information is added to construct a sparse similarity matrix. Finally, AP clustering is performed on the cluster representative points until a suitable cluster division is reached. Experimental results show that the algorithm greatly reduces the amount of computation and the storage required for the similarity matrix, and outperforms the original algorithm in both clustering quality and processing speed.
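The subtractive-clustering step can be sketched as follows. The density measure below is the standard subtractive-clustering potential; the radius `ra` and the toy data are illustrative assumptions, not values from the paper:

```python
import math

def density(points, i, ra=1.0):
    """Subtractive-clustering potential of point i: a sum of Gaussian
    weights over all points, so points in dense regions score high."""
    xi = points[i]
    alpha = 4.0 / (ra * ra)          # the standard 4/ra^2 factor
    total = 0.0
    for xj in points:
        d2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
        total += math.exp(-alpha * d2)
    return total

def initial_centers(points, k, ra=1.0):
    """Pick the k highest-density points as initial cluster centers."""
    ranked = sorted(range(len(points)),
                    key=lambda i: density(points, i, ra), reverse=True)
    return ranked[:k]

# toy data: a tight cluster near the origin plus one outlier
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (5.0, 5.0)]
centers = initial_centers(pts, 1)
print(centers)  # the densest point lies inside the tight cluster
```

Running AP only on such high-density representatives, rather than on all points, is what shrinks the similarity matrix.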
Funding: Supported by the National Natural Science Foundation of China (Grant No. 40674063) and the National Hi-tech Research and Development Program of China (863 Program) (Grant No. 2006AA09Z311).
Abstract: Building on Pilkington's synchronous joint gravity and magnetic inversion for a single interface, and motivated by the need to reveal Cenozoic and crystalline basement thickness in the new round of oil-gas exploration, we propose a joint gravity and magnetic inversion method for two-layer models. The method concentrates on the relationship between the anomaly and the change in thickness and position of the middle layer, and we discuss the effects of the key parameters. Model tests and application to field data show the validity of the method.
基金supported by the National Scientific and Technological Plan(Nos.2009BAB43B00 and 2009BAB43B01)
Abstract: The Tikhonov regularization (TR) method has played a very important role in processing gravity and magnetic data. In this paper, the TR method is discussed with respect to the inversion of gravity data, and the extrapolated TR method (EXTR) is introduced to improve the fitting error. Furthermore, the effects of the parameters of the EXTR method on the fitting error, number of iterations, and inversion results are discussed in detail. Computations using a synthetic model with uniform and non-uniform densities indicate that, compared with the TR method, the EXTR method not only achieves the a priori fitting error level set by the interpreter but also increases the fitting precision, although it increases the computation time and number of iterations. The EXTR inversion results are also more compact than the TR inversion results, which are more divergent; the range of the inverted values is closer to the default range of the model parameters, and the recovered model features agree well with the default model density distribution.
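The core of Tikhonov-regularized inversion is the damped normal-equation solve m = (GᵀG + λI)⁻¹Gᵀd. A minimal pure-Python sketch for a two-parameter toy problem (the kernel G, the data, and λ below are illustrative assumptions; the paper's extrapolation step is not reproduced):

```python
def solve2(a, b, c, d, e, f):
    """Solve the 2x2 system  a*x + b*y = e,  c*x + d*y = f."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

def tikhonov_2param(G, data, lam):
    """Solve (G^T G + lam*I) m = G^T d for a 2-parameter model."""
    # normal-equation matrix N = G^T G + lam*I
    n11 = sum(g[0] * g[0] for g in G) + lam
    n12 = sum(g[0] * g[1] for g in G)
    n22 = sum(g[1] * g[1] for g in G) + lam
    # right-hand side G^T d
    r1 = sum(g[0] * y for g, y in zip(G, data))
    r2 = sum(g[1] * y for g, y in zip(G, data))
    return solve2(n11, n12, n12, n22, r1, r2)

# toy kernel and noise-free data generated by the model m_true = (2, -1)
G = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
m_true = (2.0, -1.0)
data = [g[0] * m_true[0] + g[1] * m_true[1] for g in G]

m_exact = tikhonov_2param(G, data, lam=0.0)    # recovers m_true
m_damped = tikhonov_2param(G, data, lam=0.1)   # biased toward zero
print(m_exact, m_damped)
```

The damping parameter λ trades fitting error against model-norm stability, which is exactly the knob the EXTR extrapolation manipulates across iterations.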
基金conducted under the HK science project 1-ZE8F:Remote-sensing data for studding the Earth’s and planetary inner structure.
Abstract: The Atlas region in northwest Africa is characterized by Quaternary volcanism and elevated topography, with a past of complex tectonic interaction between the African and European plates. The geodynamics of this atypical region has indubitably left imprints in the crustal architecture, mainly in the crustal thickness as well as the crustal density structure. Knowledge of crustal thickness variations is of significant interest, since it provides a crucial constraint on geodynamic and geophysical modelling of this region. In this study, we use gravity, topographic, bathymetric and sediment data together with results of seismic surveys to image the Moho topography beneath the Atlas region. The Bouguer gravity anomalies used for the gravimetric Moho recovery are obtained from free-air gravity anomalies after subtracting the gravitational contributions of topography, bathymetry and sediments. The regional gravimetric Moho inversion, constrained by seismic data, is carried out by applying a regularized inversion technique based on a Gauss-Newton formulation of improved Bott's method, while adopting a spherical approximation of the Earth. The numerical result reveals relatively significant Moho depth variations in the Moroccan Atlas, with minima of approximately 24 km along the continental margins of the Mediterranean Sea and maxima exceeding 51 km beneath the Rif Cordillera. The Moho depth beneath the West African Craton varies from 32 km at its southern margin to 45 km beneath the Middle Atlas. The Tell Atlas is characterized by a shallow Moho depth of approximately 22 km, deepening to 42 km towards the northern edge of the Aures Mountains. Our findings indicate limited tectonic shortening of the High Atlas, with crustal thickness mostly within 36-42 km. Topographic discrepancies between the Rif Cordillera and the Atlas Mountains suggest that the hypothesis of isostatic compensation cannot be fully established.
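The classical update at the heart of Bott's method corrects the Moho depth from the residual anomaly via the Bouguer-plate relation Δt = Δg / (2πGΔρ). A minimal sketch under a flat-slab forward model (the density contrast, reference depth, and "observed" anomaly are illustrative assumptions; the paper's regularized Gauss-Newton refinement in spherical approximation is not reproduced):

```python
import math

G_CONST = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
DRHO = 400.0          # crust-mantle density contrast, kg/m^3 (assumed)
T_REF = 30e3          # reference Moho depth, m (assumed)

def slab_anomaly(t):
    """Bouguer-plate effect of the Moho deviating from T_REF (m/s^2).
    A deeper Moho means a crustal root, i.e. a mass deficit, hence
    a negative anomaly."""
    return -2.0 * math.pi * G_CONST * DRHO * (t - T_REF)

def bott_invert(g_obs, n_iter=10):
    """Iteratively correct the Moho depth with Bott's update rule."""
    t = T_REF
    k = 2.0 * math.pi * G_CONST * DRHO
    for _ in range(n_iter):
        residual = g_obs - slab_anomaly(t)
        t -= residual / k
    return t

t_true = 42e3                               # "true" Moho depth, m
t_est = bott_invert(slab_anomaly(t_true))   # recovers t_true for a slab
print(t_est / 1e3, "km")
```

For a genuine slab the iteration converges in one step; for realistic geometries the forward model is a full 3-D integral and several iterations are needed.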
基金the Sub-project of National Science and Technology Major Project of China(No.2016ZX05027-002-003)the National Natural Science Foundation of China(No.41404089)+1 种基金the State Key Program of National Natural Science of China(No.41430322)the National Basic Research Program of China(973 Program)(No.2015CB45300)
Abstract: With the continuous development of full tensor gradiometer (FTG) measurement techniques, three-dimensional (3D) inversion of FTG data is becoming increasingly used in oil and gas exploration. For the fast processing and interpretation of large-scale high-precision data, the graphics processing unit (GPU) and preconditioning methods are very important in the data inversion. In this paper, an improved preconditioned conjugate gradient algorithm is proposed by combining the symmetric successive over-relaxation (SSOR) technique with the incomplete Cholesky decomposition conjugate gradient algorithm (ICCG). Since preparing the preconditioner requires extra time, a parallel implementation based on the GPU is proposed. The improved method is then applied to the inversion of noise-contaminated synthetic data to prove its adaptability to the inversion of 3D FTG data. Results show that the parallel SSOR-ICCG algorithm on an NVIDIA Tesla C2050 GPU achieves a speedup of approximately 25 times over a serial program on a 2.0 GHz central processing unit (CPU). Real airborne gravity-gradiometry data from the Vinton salt dome (southwest Louisiana, USA) are also considered. Good results are obtained, which verifies the efficiency and feasibility of the proposed parallel method for fast inversion of 3D FTG data.
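The SSOR-preconditioned conjugate gradient idea can be sketched in pure Python on a tiny dense SPD system. This is a serial illustration only; the paper's incomplete-Cholesky variant and GPU parallelization are omitted, and the matrix below is an illustrative assumption:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def ssor_solve(A, r, omega=1.2):
    """Apply the SSOR preconditioner, i.e. solve M z = r with
    M = (1 / (w*(2-w))) * (D + wL) * D^-1 * (D + wU)."""
    n = len(r)
    y = [0.0] * n
    for i in range(n):                       # forward sweep: (D + wL) y = r
        s = r[i] - omega * sum(A[i][j] * y[j] for j in range(i))
        y[i] = s / A[i][i]
    y = [A[i][i] * y[i] for i in range(n)]   # scale by D
    z = [0.0] * n
    for i in reversed(range(n)):             # backward sweep: (D + wU) z = y
        s = y[i] - omega * sum(A[i][j] * z[j] for j in range(i + 1, n))
        z[i] = s / A[i][i]
    return [omega * (2.0 - omega) * zi for zi in z]

def pcg(A, b, tol=1e-12, max_iter=50):
    """Preconditioned conjugate gradients with the SSOR preconditioner."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                                 # residual for x = 0
    z = ssor_solve(A, r)
    p = z[:]
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) < tol * tol:
            break
        z = ssor_solve(A, r)
        rz_new = dot(r, z)
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
print(x)
```

In the full FTG problem, A is the (dense or compressed) normal-equation matrix, and the two triangular sweeps of `ssor_solve` are the part that the GPU implementation parallelizes.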
基金support by the China Postdoctoral Science Foundation(2017M621151)Northeastern University Postdoctoral Science Foundation(20180313)+1 种基金the Fundamental Research Funds for Central Universities(N180104020)NSFCShandong Joint Fund of the National Natural Science Foundation of China(U1806208)
Abstract: The density inversion of gravity gradiometry data has attracted considerable attention; however, for large datasets, non-uniqueness, low depth resolution, and efficiency are constrained by time and computer memory requirements. To solve these problems, we improve reweighting focusing inversion and probability tomography inversion with joint multiple tensors and prior-information constraints, and assess the inversion results, computing efficiency, and dataset size. A Message Passing Interface (MPI)-Open Multi-Processing (OpenMP)-Compute Unified Device Architecture (CUDA) multilevel hybrid parallel inversion, named Hybrinv for short, is proposed. Using model data and real data from the Vinton Dome, we confirm that Hybrinv can be used to compute the density distribution. For a data size of 100×100×20, the hybrid parallel algorithm is fast, and from the run time and scalability we infer that it can be used to process large-scale data.
Abstract: A new gravity survey was carried out in the northern part of the onshore Kribi-Campo sub-basin in Cameroon. The data were incorporated into the existing ones and then analyzed and modeled in order to elucidate the subsurface structure of the area. The area is characterized in its north-western part by considerably high positive anomalies indicative of the presence of a dense intrusive body. We find: 1) from the analysis of the gravity residual anomaly map, that the high positive anomalies observed are the signature of a shallow dense structure; 2) from the multi-scale analysis of the maxima of the horizontal gradient, that the structure is confined between depths of 0.5 km and 5 km; 3) from the quantitative interpretation of residual anomalies by spectral analysis, that the depth to the upper surface of the intrusive body is not uniform: the average depth of the bottom is h1 = 3.6 km and the depths to particular sections of the roof of the intrusion are h2 = 1.6 km and h3 = 0.5 km; 4) and from 3D modeling, results suggestive of contacts between rocks of different densities at different depths and of a dense intrusive igneous body in the upper crust of the Kribi zone. In the 3D model, the dense intrusive igneous block is surrounded by sedimentary formations to the south-west and metamorphic formations to the north-east, both with a density of about 2.74 g/cm3. The near-surface portions of this igneous block lie at depths of 0.5 km to 1.5 km, while its lower surface lies at depths of 3.6 km to 5.2 km. The shape of the edges and bottom of the intrusive body suggests that it forms part of a broader structure underlying the Kribi-Campo sub-basin with a great influence on the sedimentary cover.
基金supported by the National Key Research and Development Project of China(No:2017YFC0602201)
Abstract: Joint inversion based on a correlation constraint utilizes a linear correlation function as a structural constraint. The linear correlation function contains a denominator, which may produce a singularity as the objective function is optimized, leading to an unstable inversion calculation. To improve the robustness of the calculation, this paper proposes a new method in which a sinusoidal correlation function is employed as the structural constraint for joint inversion instead of the conventional linear correlation function. This structural constraint does not contain a denominator, thereby preventing a singularity. Compared with the joint inversion method based on a cross-gradient constraint, the joint inversion method based on a sinusoidal correlation constraint exhibits good performance. An application to field data demonstrates that the method can process real data.
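The singularity problem with a linear (Pearson-type) correlation constraint can be seen directly: its denominator vanishes whenever one model is locally constant. The toy vectors below are illustrative; the paper's own sinusoidal constraint is not reproduced here:

```python
import math

def pearson(u, v):
    """Linear correlation coefficient; the denominator su*sv can be zero."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)    # ZeroDivisionError if u or v is constant

print(pearson([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))   # perfectly correlated

try:
    pearson([1.0, 2.0, 3.0], [5.0, 5.0, 5.0])      # constant model patch
except ZeroDivisionError:
    print("singular: a constant patch makes the denominator zero")
```

A denominator-free constraint avoids exactly this failure mode inside the optimization loop, where homogeneous model patches are common.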
基金supported by the National Plan for Science,Technology and Innovation (NPST) Program,King Saud University,Saudi Arabia (No.11-WAT1731-02)
Abstract: Lineaments in the southeastern Jordan plateau are mapped using gravity data and field studies in order to understand the tectonic origin of these lineaments, especially in relation to the Dead Sea transform (DST) and the Red Sea opening. Four sets trending E-W, NW-SE, NE-SW, and N-S are identified in the gravity data. Field studies generally reveal similar orientations. Field and gravity studies indicate that most of the lineaments are extensional features that correspond to normal faults. Most of these were subsequently reactivated as strike-slip shear fractures. The NW-SE and N-S lineaments represent dilatational fractures. The N-S trending lineaments are the oldest. The E-W lineaments form conjugate shear fractures and are younger than the N-S lineaments; these conjugate shear fractures are in turn older than another set of conjugate shear fractures oriented NE-SW. The evolution of all these fractures is attributed to the DST and the Red Sea spreading. Kinematic and dynamic analysis of the two, older and younger, pairs of conjugate strike-slip fractures revealed, respectively, broadly NW-SE and N-S oriented transpressional stress with corresponding transtensional stress oriented NE-SW and E-W.
基金financially supported by Guangdong Provincial Science and Technology Plan Projects(20178030314082)General Project of National Natural Science Foundation of China (41676057)National Science and Technology Support Program (2015BAK18B01)
Abstract: By systematic processing, comprehensive analysis, and interpretation of gravity data, we confirmed the existence of the west segment of the coastal fault zone (west of Yangjiang to Beibu Bay) in the coastal region of South China. This segment shows an apparent high gravity gradient in the NEE direction, with worse linearity and less compactness than the segment near the Pearl River mouth, and reveals a relatively large curvature and a complicated gravity structure. In the images produced from the gravity data, each fault is well reflected and primarily characterized by isolines or thick black stripes with a cutting depth greater than 30 km. Though mutually cut by NW-trending and NE-trending faults, the apparent NEE stripe-shaped structure of the west segment of the coastal fault zone remains unchanged, with good continuity and an activity strength higher than that of the NW- and NE-trending faults. Moreover, we determined that the west segment of the coastal fault zone is the major seismogenic structure responsible for strong earthquakes in the coastal region at the border of Guangdong, Guangxi, and Hainan.
Abstract: The Earth gravity field model CDS01S, of degree and order 36, has been recovered from the post-processed science orbits and on-board accelerometer data of GFZ's CHAMP satellite. The model resolves the geoid with an accuracy of better than 4 cm at a resolution of 700 km half-wavelength. Using the degree difference variances of the geopotential coefficients to compare CDS01S with EIGEN3P, EIGEN1S, and EGM96 indicates that the coefficients of CDS01S are closest to those of EIGEN3P. A comparison of the accuracies of the geopotential coefficients in the above models indicates that the coefficients of CDS01S are more accurate than those of EGM96. The geoid undulations of CDS01S and GGM01C up to degree 30 were calculated, and the standard deviation between them is 4.7 cm.
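Comparing models by degree (difference) variances means summing squared coefficients (or coefficient differences) over order m at each degree n: σₙ² = Σₘ (C̄ₙₘ² + S̄ₙₘ²). A minimal sketch with made-up coefficient values (real models carry fully normalized coefficients to degree 36 and beyond):

```python
def degree_variance(coeffs, n):
    """sigma_n^2 = sum over m of (C_nm^2 + S_nm^2)."""
    return sum(c * c + s * s
               for (deg, m), (c, s) in coeffs.items() if deg == n)

def degree_difference_variance(model_a, model_b, n):
    """The same sum applied to coefficient differences of two models."""
    diff = {key: (model_a[key][0] - model_b[key][0],
                  model_a[key][1] - model_b[key][1])
            for key in model_a}
    return degree_variance(diff, n)

# tiny illustrative coefficient sets, keyed as {(n, m): (C_nm, S_nm)}
A = {(2, 0): (1.0e-6, 0.0), (2, 1): (2.0e-7, 1.0e-7)}
B = {(2, 0): (1.1e-6, 0.0), (2, 1): (2.0e-7, 1.0e-7)}

print(degree_variance(A, 2))
print(degree_difference_variance(A, B, 2))  # small: models nearly agree
```

A small degree difference variance at degree n says the two models carry nearly the same signal at that spatial wavelength, which is how "closest to EIGEN3P" is quantified.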
Abstract: For rapid compensation of the influence of the Earth's disturbing gravity field on trajectory calculation, the key point lies in deriving analytical solutions for the partial derivatives of the burnout-point state with respect to the launch data. This paper therefore addresses two issues. First, based on an approximate analytical solution to the equations of motion for the vacuum flight section of a long-range rocket, it derives analytical solutions for the partial derivatives of the burnout-point state with respect to the rate of change of the final-stage pitch program. Second, based on the propagation mechanism of initial positioning and orientation errors, it proposes an analytical formula for the partial derivatives of the burnout-point state with respect to the launch azimuth. The correction data were simulated and verified under different circumstances, with the following results: (1) the agreement between the analytical solutions and the results of the difference method exceeds 90%, while the ratio of their computation times is below 0.2%, demonstrating both the accuracy of the data corrections and the speed advantage; (2) after the analytical solutions are compensated, the longitudinal landing deviation of the rocket is less than 20 m and the lateral landing deviation is less than 10 m, demonstrating that the corrected data meet the hit-accuracy requirements of a long-range rocket.
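The difference-method baseline against which the analytical solutions are checked is simply a central finite difference of the burnout state with respect to a launch parameter. A sketch with a toy one-parameter burnout model (the function and numbers are illustrative assumptions, not the rocket dynamics of the paper):

```python
import math

def central_diff(f, x, h=1e-6):
    """Central finite difference df/dx: the numerical baseline that the
    analytical partial derivatives are compared against."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# toy burnout-state quantity as a function of one launch parameter
def burnout_velocity(pitch_rate):
    return 7000.0 + 1500.0 * math.sin(pitch_rate)

analytic = 1500.0 * math.cos(0.3)            # exact derivative of the toy model
numeric = central_diff(burnout_velocity, 0.3)
print(abs(analytic - numeric))
```

The paper's point is the cost asymmetry: the difference method needs two (or more) full trajectory integrations per parameter, whereas the closed-form derivative is a single evaluation, hence the reported 500x speed ratio.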
基金Supported by Project of National Key Research and Development Plan(No.2017YFC0601606,2017YFC0602203)National Science and Technology Major Project(No.2016ZX05027-002-03)+1 种基金National Natural Science Foundation of China(No.41604098,41404089) State Key Program of National Natural Science of China(No.41430322)
Abstract: The processing and interpretation of gravity and gradient data play an important role in geophysics. Cross-gradient joint inversion is usually used to achieve structural coupling of multiple geophysical models. In order to realize the coupling of gravity and gravity tensor data, the authors analyzed each component; the results show that different types of data contain information in different directions. The authors then derived a joint inversion based on the cross-gradient function and applied it to model data. The theoretical model results show that the cross-gradient method can reduce non-uniqueness and significantly improve the resolution of the inversion. The method was also applied to invert the gravity tensor data of the Vinton salt dome, showing that it yields higher-resolution results than separate linear inversions and densities closer to the real density known from drilling data.
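The structural coupling in cross-gradient joint inversion is enforced by t = ∇m₁ × ∇m₂, which vanishes wherever the two models' gradients are parallel, i.e. where their structures coincide. A 2-D (x, z) finite-difference sketch (the grids are illustrative assumptions):

```python
def cross_gradient(m1, m2):
    """y-component of grad(m1) x grad(m2) on a 2-D (x, z) grid,
    using forward differences; zero where the gradients are parallel."""
    nz, nx = len(m1), len(m1[0])
    t = [[0.0] * (nx - 1) for _ in range(nz - 1)]
    for i in range(nz - 1):
        for j in range(nx - 1):
            d1x = m1[i][j + 1] - m1[i][j]
            d1z = m1[i + 1][j] - m1[i][j]
            d2x = m2[i][j + 1] - m2[i][j]
            d2z = m2[i + 1][j] - m2[i][j]
            t[i][j] = d1x * d2z - d1z * d2x
    return t

m1 = [[0.0, 1.0, 2.0],
      [1.0, 2.0, 3.0],
      [2.0, 3.0, 4.0]]
m2_parallel = [[2 * v for v in row] for row in m1]   # same structure as m1
m2_rotated = [[0.0, 0.0, 0.0],
              [1.0, 1.0, 1.0],
              [2.0, 2.0, 2.0]]                        # gradient along z only

print(cross_gradient(m1, m2_parallel))   # all zeros: structures coincide
print(cross_gradient(m1, m2_rotated))    # nonzero: structures differ
```

In the joint objective function, the sum of t² over the grid is the penalty term that drives the gravity and tensor models toward a common structure.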
基金supported by TWAS (The World Academy of Sciences) and CIRAD (Centre de Coopération Internationale en Recherche Agronomique pour le Développement)
Abstract: To generate carbon credits under the Reducing Emissions from Deforestation and forest Degradation program (REDD+), accurate estimates of forest carbon stocks are needed. Carbon accounting efforts have focused on carbon stocks in aboveground biomass (AGB). Although wood specific gravity (WSG) is known to be an important variable in AGB estimates, there is currently a lack of WSG data for Malagasy tree species. This study aimed to determine whether estimates of carbon stocks calculated from literature-based WSG values differed from those based on WSG values measured on wood core samples. Carbon stocks in forest biomass were assessed using two WSG data sets: (i) values measured from 303 wood core samples extracted in the study area, and (ii) values derived from international databases. Results suggested a difference between the field-based and literature-based WSG at the 0.05 level, the latter data set being on average 16% higher than the former. However, carbon stocks calculated from the two data sets did not differ significantly at the 0.05 level. This finding could be attributed to the form of the allometric equation used, which gives more weight to tree diameter and tree height than to WSG. The choice of data set should depend on the level of accuracy (Tier II or III) desired under REDD+. As higher levels of accuracy are rewarded by higher prices, species-specific WSG data would be highly desirable.
基金supported by the National Natural Science Foundation of China(No.42004008)the Natural Science Foundation of Jiangsu Province,China(No.BK20190498)+1 种基金the Fundamental Research Funds for the Central Universities(No.B220202055)the State Scholarship Fund from Chinese Scholarship Council(No.201306270014).
Abstract: Global bathymetry models are usually of low accuracy along the coastlines of polar areas owing to the harsh climatic environment and complex topography. Satellite altimetric gravity data can serve as a supplement and play a key role in bathymetry modeling over these regions. The Synthetic Aperture Radar (SAR) altimeters on missions such as CryoSat-2 and Sentinel-3A/3B relieve the waveform contamination that affects conventional altimeters and provide data with improved accuracy and spatial resolution. In this study, we investigate the potential application of SAR altimetric gravity data to enhancing coastal bathymetry, where the effects of SAR altimetry data on local bathymetry modeling are quantified and evaluated. Furthermore, we study the effects on bathymetry modeling of different scale-factor calculation approaches, where a partition-wise scheme is implemented. A numerical experiment over the South Sandwich Islands near Antarctica suggests that using SAR-based altimetric gravity data improves local coastal bathymetry modeling by a magnitude of 3.55 m within 10 km of offshore areas, compared with the model calculated without SAR altimetry data. Moreover, using the partition-wise scheme for scale-factor calculation improves the quality of the coastal bathymetry model by 7.34 m compared with the result derived from the traditional method. These results indicate the superiority of SAR altimetry data in coastal bathymetry inversion.
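In gravity-to-bathymetry inversion, the band-limited gravity signal is mapped to depth through a scale factor fitted against known soundings; a partition-wise scheme simply fits one factor per sub-region instead of one global value. A simplified least-squares sketch (the data, the contiguous partitioning, and the linear model are illustrative assumptions, not the paper's exact scheme):

```python
def scale_factor(grav, bathy):
    """Least-squares scale factor S minimizing sum of (b - S*g)^2."""
    return sum(g * b for g, b in zip(grav, bathy)) / sum(g * g for g in grav)

def partition_scale_factors(grav, bathy, n_parts):
    """Partition-wise scheme: one scale factor per contiguous partition,
    so regions with different seafloor character get different fits."""
    m = len(grav) // n_parts
    return [scale_factor(grav[i * m:(i + 1) * m], bathy[i * m:(i + 1) * m])
            for i in range(n_parts)]

# toy band-limited gravity values and control depths in two regimes
grav = [1.0, 2.0, 3.0, 4.0]
bathy = [2.0, 4.0, 9.0, 12.0]
print(scale_factor(grav, bathy))             # single global factor
print(partition_scale_factors(grav, bathy, 2))  # per-partition factors
```

The per-partition factors track the locally different gravity-depth relationship, which is why the partition-wise model outperforms the single-factor one near complex coasts.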
Abstract: A novel method for noise removal from the rotating accelerometer gravity gradiometer (MAGG) is presented. It introduces a head-to-tail data expansion technique based on the zero-phase filtering principle. A scheme for determining band-pass filter parameters based on signal-to-noise ratio gain, smoothness index, and cross-correlation coefficient is designed using Chebyshev optimal consistent approximation theory. Additionally, a wavelet denoising evaluation function is constructed, with the dmey wavelet basis function identified as the most effective for processing gravity gradient data. The results of hardware-in-the-loop simulations and prototype experiments show that, compared with other commonly used methods, the proposed processing method improves the measurement variance of gravity gradient signals by 14% and achieves a measurement accuracy within 4 E. This verifies that the proposed method effectively removes noise from the gradient signals, improves gravity gradiometry accuracy, and offers technical insight for high-precision airborne gravity gradiometry.
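Head-to-tail expansion combats the edge transients of zero-phase (forward-backward) filtering by reflecting the record at both ends before filtering and trimming afterwards. A pure-Python sketch with a moving-average kernel standing in for the paper's Chebyshev band-pass filter (the filter choice and pad length are illustrative assumptions):

```python
def moving_average(x, width):
    """Causal moving-average filter (a stand-in for a real band-pass IIR)."""
    out = []
    for i in range(len(x)):
        lo = max(0, i - width + 1)
        out.append(sum(x[lo:i + 1]) / (i + 1 - lo))
    return out

def zero_phase_filter(x, width=5, pad=20):
    """Head-to-tail expansion + forward/backward filtering.
    Reflect `pad` samples at each end, filter forward, then filter the
    reversed signal (cancelling the phase lag), and trim the padding."""
    head = x[1:pad + 1][::-1]          # mirror of the head
    tail = x[-pad - 1:-1][::-1]        # mirror of the tail
    ext = head + x + tail
    y = moving_average(ext, width)               # forward pass
    y = moving_average(y[::-1], width)[::-1]     # backward pass: zero phase
    return y[pad:pad + len(x)]                   # drop the padding

sig = [1.0] * 50               # a constant record should pass through intact
out = zero_phase_filter(sig)
print(len(out), out[0], out[-1])
```

Because the backward pass undoes the forward pass's phase lag, the filtered gradient signal stays time-aligned with the raw record, which matters when gradients are later combined with position data.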