This study examines the effectiveness of artificial intelligence techniques in generating high-quality environmental data for species introduction site selection systems. A network framework model (SAE-GAN) combining Strengths, Weaknesses, Opportunities, Threats (SWOT) analysis data with a Variational Autoencoder (VAE) and a Generative Adversarial Network (GAN) is proposed for environmental data reconstruction. The model combines the two popular generative models, GAN and VAE, to generate features conditioned on the categorical data embedding produced by the SWOT analysis. It can generate features that resemble real feature distributions and incorporate per-sample factors to track individual sample data more accurately, and the reconstructed data retains more of the semantic information used to generate features. The model was applied to species in Southern California, USA, using SWOT analysis data to train the model. Experiments show that the model integrates data from more comprehensive analyses than traditional methods and generates high-quality reconstructed data from them, effectively addressing insufficient data collection in development environments. The model is further validated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) classification assessment commonly used in the environmental data domain. This study provides a reliable and rich source of training data for species introduction site selection systems and contributes to ecological and sustainable development.
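The two generative ingredients the abstract names can be sketched compactly: the VAE's reparameterization trick, and conditioning the latent code on a categorical (SWOT-style) embedding before it enters a generator. This is a minimal NumPy illustration of those two steps, not the authors' SAE-GAN architecture; the one-hot conditioning scheme is an assumption.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """VAE reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def condition_on_category(z, category, num_categories):
    """Concatenate a one-hot category embedding (e.g., a SWOT class) onto
    the latent code so a downstream generator is conditioned on it."""
    onehot = np.zeros(num_categories)
    onehot[category] = 1.0
    return np.concatenate([z, onehot])
```

In a full model, the conditioned vector would be fed to the GAN generator; here it simply shows how categorical information enters the latent space.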
Satellite communication plays an important role in 6G systems. However, satellite communication systems are more susceptible to intentional or unintentional interference than other communication systems because of their transparent-forwarding mechanism. To eliminate the influence of interference, this paper develops an angle-reciprocity interference suppression scheme based on reconstruction of the interference-plus-noise covariance matrix (ARIS-RIN). First, we exploit the reciprocity between the known beam central angle and the unknown signal arrival angle under multi-beam coverage to estimate the angle of arrival (AOA) of the desired signal. Then, using the a priori known spatial spectrum distribution, the interference-plus-noise covariance matrix (INCM) is reconstructed by integrating over the angular range excluding the direction of the desired signal. To correct the estimation bias of the first two steps, worst-case performance optimization is adopted when solving for the beamforming vector. Numerical simulation results show that the developed scheme: 1) achieves a higher output signal-to-interference-plus-noise ratio (SINR) at arbitrary signal-to-noise ratio (SNR); 2) retains good performance with small numbers of snapshots; and 3) is more robust and easier to realize than minimum variance distortionless response (MVDR) and traditional diagonal loading algorithms.
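The core INCM-reconstruction step can be sketched for a uniform linear array: integrate Capon-spectrum estimates over all directions outside a guard sector around the desired angle, then form distortionless weights from the reconstructed matrix. This is a generic textbook-style sketch (without the paper's angle-reciprocity AOA estimation or worst-case robustification); the array geometry, grid spacing, and guard width are assumptions.

```python
import numpy as np

def steering(theta_deg, n_ant, d=0.5):
    """Steering vector of a uniform linear array, spacing d in wavelengths."""
    k = np.arange(n_ant)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

def incm_reconstruct_beamformer(R_sample, theta0, n_ant, guard=5.0):
    """Reconstruct the INCM by summing Capon-spectrum-weighted outer
    products over directions outside a guard sector around theta0,
    then solve for MVDR-style weights with a distortionless constraint."""
    R_inv = np.linalg.inv(R_sample)
    R_in = np.zeros((n_ant, n_ant), dtype=complex)
    for th in np.arange(-90.0, 90.0, 0.5):
        if abs(th - theta0) <= guard:
            continue  # exclude the desired-signal sector from the integral
        a = steering(th, n_ant)
        p = 1.0 / np.real(a.conj() @ R_inv @ a)  # Capon power estimate
        R_in += p * np.outer(a, a.conj())
    a0 = steering(theta0, n_ant)
    w = np.linalg.solve(R_in, a0)
    return w / (a0.conj() @ w)  # enforce w^H a0 = 1
```

Because the desired direction is excluded from the integral, the weights suppress interference without self-cancelling the desired signal.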
Data reconstruction is a crucial step in seismic data preprocessing. To improve reconstruction speed and save memory, the commonly used three-dimensional (3D) seismic data reconstruction method divides the missing data into a series of time slices and reconstructs each slice independently. However, this strategy ignores the potential correlations between adjacent time slices, which degrades reconstruction performance. This study therefore proposes a two-dimensional curvelet transform combined with the fast iterative shrinkage-thresholding algorithm for data reconstruction. Because the curvelet coefficient support sets of two adjacent time slices overlap significantly, a weighting operator is constructed in the curvelet domain using the prior support set provided by the previously reconstructed time slice to delineate the main energy distribution range, effectively providing prior information for reconstructing the adjacent slice. The resulting weighted fast iterative shrinkage-thresholding algorithm can then be used to reconstruct 3D seismic data. Processing of synthetic and field data shows that the proposed method achieves higher reconstruction accuracy and faster computation than the conventional fast iterative shrinkage-thresholding algorithm for handling missing 3D seismic data.
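The support-weighted FISTA idea can be sketched in one dimension: coefficients on the prior support (from the previously reconstructed slice) get a smaller threshold weight, so their energy is preserved. As a stand-in for the curvelet transform, this sketch uses an orthonormal Fourier dictionary; the signal, weights, and regularization level are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def weighted_fista(y, mask, weights, lam=0.05, n_iter=300):
    """Weighted FISTA for reconstructing a signal sparse in the (unitary)
    Fourier domain from masked samples y = mask * x. `weights` scales the
    soft threshold per coefficient; small weights protect prior support."""
    n = y.size
    c = np.zeros(n, dtype=complex)  # transform-domain coefficients
    z = c.copy()
    t = 1.0
    for _ in range(n_iter):
        # Gradient of 0.5 * ||mask * ifft(c) - y||^2 (operator norm <= 1)
        grad = np.fft.fft(mask * (np.fft.ifft(z, norm="ortho") - y), norm="ortho")
        u = z - grad
        thr = lam * weights
        mag = np.abs(u)
        c_new = u * np.maximum(1 - thr / np.maximum(mag, 1e-12), 0)  # weighted soft threshold
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t ** 2))
        z = c_new + ((t - 1) / t_new) * (c_new - c)  # FISTA momentum
        c, t = c_new, t_new
    return np.fft.ifft(c, norm="ortho").real
```

In the paper's 3D setting, the weights for slice k would come from the thresholded curvelet support of slice k-1.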
We investigate null tests of cosmic accelerated expansion using the baryon acoustic oscillation (BAO) data measured by the Dark Energy Spectroscopic Instrument (DESI), reconstructing the dimensionless Hubble parameter E(z) from the DESI BAO Alcock-Paczynski (AP) data with a Gaussian process to perform the null test. We find strong evidence of accelerated expansion from the DESI BAO AP data. By reconstructing the deceleration parameter q(z) from the DESI BAO AP data, we find that accelerated expansion persisted until z ≈ 0.7 at the 99.7% confidence level. Additionally, to provide insight into the Hubble tension problem, we propose combining the reconstructed E(z) with D_H/r_d data to derive the model-independent result r_d h = 99.8 ± 3.1 Mpc. This result is consistent with measurements of cosmic microwave background (CMB) anisotropies under the ΛCDM model. We also propose a model-independent method for reconstructing the comoving angular diameter distance D_M(z) from the distance modulus μ using SNe Ia data, and combine this result with the DESI BAO D_M/r_d data to constrain the value of r_d. The value of r_d derived from this model-independent method is smaller than that obtained from CMB measurements, with a significant discrepancy of at least 4.17σ. All conclusions drawn in this paper are independent of cosmological models and gravitational theories.
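A Gaussian-process reconstruction of E(z) reduces, at its core, to the GP posterior mean with a smooth kernel. This is a minimal sketch of that step with a squared-exponential kernel; the mock flat-ΛCDM E(z) with Ω_m = 0.3 and the hyperparameter values are assumptions for illustration, not the DESI analysis choices.

```python
import numpy as np

def gp_mean(x_train, y_train, x_test, ell=0.5, sf=1.0, sn=1e-3):
    """Posterior mean of a zero-mean GP with squared-exponential kernel:
    k(a, b) = sf^2 exp(-(a - b)^2 / (2 ell^2)), noise level sn."""
    def k(a, b):
        return sf ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(x_train, x_train) + sn ** 2 * np.eye(x_train.size)
    return k(x_test, x_train) @ np.linalg.solve(K, y_train)
```

A real analysis would also propagate the data covariance and marginalize or optimize the hyperparameters; here the point is only the model-independent interpolation of E(z) between data points.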
Irregular seismic data causes problems for multi-trace processing algorithms and degrades processing quality. We introduce the Projection Onto Convex Sets (POCS) image restoration method into seismic data reconstruction to interpolate irregularly missing traces. For entirely dead traces, we transfer the POCS iterative reconstruction from the time domain to the frequency domain to save computational cost, because forward and inverse Fourier time transforms are no longer needed. In each iteration, the choice of threshold parameter is important for reconstruction efficiency. In this paper, we design two types of threshold models to reconstruct irregularly missing seismic data. The experimental results show that an exponential threshold greatly reduces the number of iterations and improves reconstruction efficiency compared with a linear threshold, for the same reconstruction quality. We also analyze the anti-noise and anti-alias ability of the POCS reconstruction method. Finally, theoretical model tests and real data examples indicate that the proposed method is efficient and applicable.
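A POCS interpolation loop with the exponential threshold schedule the abstract describes can be sketched in 1D: threshold the transform coefficients, inverse-transform, reinsert the observed samples, and decay the threshold exponentially from the largest coefficient down to a floor. The FFT here is a stand-in sparsifying transform and the schedule endpoints are assumptions.

```python
import numpy as np

def pocs_interpolate(y, mask, n_iter=100, t_max=None, t_min=1e-3):
    """POCS interpolation of masked samples y = mask * x with an
    exponentially decreasing hard threshold in the Fourier domain."""
    if t_max is None:
        t_max = np.abs(np.fft.fft(y)).max()
    x = y.astype(float).copy()
    for k in range(n_iter):
        # Exponential schedule: t_max at k = 0 down to t_min at the end
        thr = t_max * (t_min / t_max) ** (k / (n_iter - 1))
        c = np.fft.fft(x)
        c[np.abs(c) < thr] = 0.0          # keep only strong coefficients
        x_rec = np.fft.ifft(c).real
        x = np.where(mask > 0, y, x_rec)  # project back onto observed data
    return x
```

The exponential decay spends early iterations on the dominant events and later ones on weak energy, which is the mechanism behind the iteration savings reported over a linear schedule.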
Missing data are a problem in geophysical surveys, and interpolation and reconstruction of missing data are part of data processing and interpretation. Exploiting the sparseness of geophysical data in the spatial or transform domain, we can improve the accuracy and stability of reconstruction by recasting it as a sparse optimization problem. In this paper, we propose a mathematical model for sparse data reconstruction based on L0-norm minimization. Furthermore, we discuss two approximation algorithms for L0-norm minimization, chosen according to the size and characteristics of the geophysical data: the iteratively reweighted least-squares (IRLS) algorithm and the fast iterative hard thresholding algorithm. Theoretical and numerical analysis shows that applying the IRLS algorithm to the reconstruction of potential-field data exploits its fast convergence rate, short computation time, and high precision, whereas the fast iterative hard thresholding algorithm is more suitable for processing seismic data; moreover, its computational efficiency exceeds that of the traditional iterative hard thresholding algorithm.
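The hard-thresholding family the abstract compares is built on one iteration: a gradient step on the data misfit, followed by keeping only the s largest entries (the L0 constraint). This sketch shows the traditional (non-accelerated) iterative hard thresholding on a generic compressed-sensing problem; the Gaussian measurement matrix and step-size rule are illustrative assumptions.

```python
import numpy as np

def iht(A, y, s, n_iter=300):
    """Iterative hard thresholding for min ||y - A x||^2 s.t. ||x||_0 <= s:
    gradient step, then zero all but the s largest-magnitude entries."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / ||A||_2^2 keeps the step stable
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * A.T @ (y - A @ x)
        small = np.argsort(np.abs(x))[:-s]  # indices of all but the s largest
        x[small] = 0.0
    return x
```

The "fast" variant the paper favors for seismic data adds a FISTA-style momentum term to this loop; IRLS instead solves a sequence of reweighted least-squares problems.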
Seismic data contain random noise and are affected by irregular subsampling. At present, most data reconstruction methods are carried out separately from noise suppression, and most are not ideal for noisy data. In this paper, we choose the multiscale, multidirectional 2D curvelet transform to perform simultaneous data reconstruction and noise suppression of 3D seismic data. We introduce the POCS algorithm, an exponentially decreasing square-root threshold, and a soft-threshold operator to interpolate the data at each time slice, and a weighting strategy to reduce the noise in the reconstructed data. On this basis, a 3D simultaneous data reconstruction and noise suppression method based on the curvelet transform is proposed. Compared with data reconstruction followed by denoising, and with the Fourier transform, the proposed method is more robust and effective. The proposed method has important implications for data acquisition in complex areas and for reconstructing missing traces.
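The weighting strategy that distinguishes this from plain POCS interpolation is a reinsertion step: observed (noisy) traces are blended with their denoised reconstruction instead of being copied back verbatim, while missing traces come entirely from the reconstruction. A minimal sketch of that operator, with the blend weight `a` as an assumed tuning parameter:

```python
import numpy as np

def weighted_reinsertion(y, recon, mask, a=0.7):
    """Per-sample blend for simultaneous interpolation and denoising:
    observed samples (mask == 1) mix the noisy data (weight a) with the
    reconstruction (weight 1 - a); missing samples (mask == 0) are taken
    from the reconstruction alone."""
    return mask * (a * y + (1.0 - a) * recon) + (1.0 - mask) * recon
```

With a = 1 this reduces to standard POCS reinsertion (no denoising on live traces); smaller a trades data fidelity for noise suppression.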
Model reconstruction from points scanned on existing physical objects is important in a variety of situations, such as reverse engineering of mechanical products, computer vision, and recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, point clouds containing fine details of the object can be obtained conveniently. On the other hand, the large number of sampled points complicates model reconstruction. This paper first presents an algorithm that automatically reduces the number of cloud points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set with the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes, so an approach is proposed that creates new, shape-optimized triangles to cover the unexpected holes in the mesh. After hole filling, the watertight triangle mesh can be output directly in STL format, which is widely used in rapid prototype manufacturing. Practical examples are included to demonstrate the method.
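Tolerance-based point reduction is commonly done by binning points into cubic cells of side equal to the tolerance and keeping one representative per cell. This sketch uses that standard heuristic; the paper's own reduction algorithm may differ in how the representative is chosen.

```python
import numpy as np

def simplify_points(points, tol):
    """Decimate an (N, 3) point cloud: quantize coordinates into cells of
    side `tol` and keep the first point seen in each cell, so no two kept
    points within one cell are closer than the tolerance allows."""
    cells = np.floor(points / tol).astype(np.int64)
    _, keep = np.unique(cells, axis=0, return_index=True)
    return points[np.sort(keep)]
```

Sorting the kept indices preserves the original scan order, which keeps downstream meshing deterministic.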
Traditional seismic data sampling follows the Nyquist sampling theorem. In this paper, we introduce the theory of compressive sensing (CS), breaking through the limitations of the traditional Nyquist sampling theorem: random undersampling renders the coherent aliases of regular undersampling into harmless incoherent random noise, effectively turning the reconstruction problem into a much simpler denoising problem. We introduce the projection onto convex sets (POCS) algorithm in the data reconstruction process, apply an exponentially decaying threshold parameter in the iterations, and modify the traditional reconstruction process, which performs forward and inverse transforms in both the time and space domains, into a new method that uses forward and inverse transforms in the space domain only. The proposed method uses less computer memory and improves computational speed. We also analyze the anti-noise and anti-aliasing ability of the proposed method, and compare 2D and 3D data reconstruction. Theoretical models and real data show that the proposed method is effective and of practical importance: it can reconstruct missing traces and reduce the exploration cost of complex data acquisition.
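The key CS premise, that regular undersampling produces coherent aliases while random undersampling produces an incoherent noise floor, can be verified numerically on a single tone: decimating every 4th sample creates alias spikes exactly as strong as the (masked) true peak, whereas a random mask with the same sample count spreads that energy below the peak. The signal and mask sizes here are illustrative assumptions.

```python
import numpy as np

n = 128
t = np.arange(n)
x = np.cos(2 * np.pi * 8 * t / n)  # single tone: DFT support {8, 120}

# Regular undersampling: keep every 4th sample.
reg = np.zeros(n)
reg[::4] = 1.0
# Random undersampling with the same sample count.
rng = np.random.default_rng(4)
rnd = np.zeros(n)
rnd[rng.choice(n, n // 4, replace=False)] = 1.0

def off_support_max(mask):
    """Largest spectral magnitude away from the true tone's bins."""
    spec = np.abs(np.fft.fft(mask * x))
    spec[[8, n - 8]] = 0.0  # ignore the true tone
    return spec.max()

reg_alias = off_support_max(reg)   # coherent alias, equal to the masked peak
rnd_floor = off_support_max(rnd)   # incoherent noise floor, well below it
peak = np.abs(np.fft.fft(reg * x))[8]
```

The thresholding in POCS then only has to separate the true peaks from this incoherent floor, which is why the reconstruction behaves like denoising.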
To make industrial products traceable, direct part marking (DPM), in which a barcode is machined directly onto the part surface, has attracted increasing attention worldwide. For metal parts, high surface reflectivity means that barcode images captured by a camera often contain local specular highlights, which hinder correct decoding. To address the local highlights that appear on laser-marked 2D barcodes on metal surfaces, a barcode reconstruction method based on a five-step reconstruction model is proposed to recover the barcode information in the highlight regions. The captured barcode image is first deskewed so that the "L"-shaped solid boundary lies at the lower-left corner of the image, and the barcode is divided into a grid to locate each module. Highlight regions are then detected based on a Modified Specular-Free (MSF) image. Finally, the five-step reconstruction model fills in the value of each barcode module, and the barcode is decoded. Experiments show that the algorithm removes local highlights from barcodes on metal surfaces and achieves a high decoding accuracy.
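The specular-free image underlying the highlight-detection step can be sketched simply: under the dichromatic reflection model, subtracting each pixel's minimum color channel removes most of the (achromatic) specular component. This is a simplified stand-in for the paper's MSF image, and the min-channel highlight-detection rule and threshold below are hypothetical illustrations, not the paper's exact procedure.

```python
import numpy as np

def specular_free(img):
    """Approximate specular-free image of an HxWx3 float image: subtract
    the per-pixel minimum channel (which carries the specular component),
    then add back its mean to preserve overall brightness."""
    mn = img.min(axis=2, keepdims=True)
    return img - mn + mn.mean()

def detect_highlights(img, thresh=0.5):
    """Illustrative rule: specular highlights are bright and nearly
    achromatic, so flag pixels whose minimum channel is large."""
    return img.min(axis=2) > thresh
```

Detected highlight modules would then be filled in by the five-step reconstruction model rather than read directly from the corrupted pixels.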
Funding (SAE-GAN environmental data study): supported by the Fundamental Research Funds for the Liaoning Universities (LJ212410146025).
Funding (ARIS-RIN satellite interference suppression study): supported by the National Natural Science Foundation of China under Grants No. 61671367 and 62471381, the Research Foundation of Science and Technology on Communication Networks Laboratory, and the National Key Laboratory of Wireless Communications Foundation under Grant No. IFN202401.
Funding (weighted FISTA 3D seismic reconstruction study): supported by the National Natural Science Foundation of China under Grant 42304145; the Jiangxi Provincial Natural Science Foundation under Grants 20242BAB26051, 20242BAB25191, and 20232BAB213077; the Foundation of the National Key Laboratory of Uranium Resources Exploration-Mining and Nuclear Remote Sensing under Grant 2024QZ-TD-13; and the Open Fund (FW0399-0002) of the SINOPEC Key Laboratory of Geophysics.
Funding (DESI BAO null test study): supported in part by the National Key Research and Development Program of China (Grant No. 2020YFC2201504) and the National Natural Science Foundation of China (Grant Nos. 12588101 and 12535002).
Funding (POCS seismic interpolation study): financially supported by the National 863 Program (Grant No. 2006AA09A102-09) and the National Science and Technology Major Projects (Grant No. 2008ZX05025-001-001).
Funding (L0-norm sparse reconstruction study): supported by the National Natural Science Foundation of China (Grant No. 41074133).
Funding (curvelet simultaneous reconstruction and denoising study): sponsored by the National Natural Science Foundation of China (Nos. 41304097 and 41664006), the Natural Science Foundation of Jiangxi Province (No. 20151BAB203044), the China Scholarship Council (No. 201508360061), and the Distinguished Young Talent Foundation of Jiangxi Province (2017).
Funding (compressive-sensing POCS reconstruction study): sponsored by the National Natural Science Foundation of China (No. 41174107) and the National Science and Technology Major Project of Oil and Gas (No. 2011ZX05023-005).