Abstract
Due to environmental and economic constraints, acquired seismic data often contain missing traces, which affects subsequent processing and interpretation. Seismic data reconstruction methods based on convolutional neural networks (CNNs) have attracted wide attention in recent years. Without any prior assumptions, such a method automatically learns the mapping from missing data to complete data from a massive training dataset and achieves good reconstruction results. However, training the network depends on the training data: if the training dataset is not informative enough, the network generalizes poorly and the reconstruction quality degrades. To this end, this paper proposes a CNN-based seismic data reconstruction method built on data augmentation. The classical Unet and Resnet architectures are used. The training set consists of patches randomly cropped from synthetic data, and the augmentation strategy enriches the feature diversity of the samples through multi-scale sampling along the spatial direction, flipping, and noise addition. Finally, the CNN is trained on the augmented dataset to perform reconstruction. Tests on synthetic and field data show that the proposed augmentation strategy effectively improves the generalization ability of the network and achieves a higher signal-to-noise ratio than reconstruction without data augmentation.
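As a rough illustration of the augmentation strategy described in the abstract (spatial flipping, multi-scale subsampling of traces, and noise addition), the following sketch shows how one patch could be expanded into several training samples. The function name, patch size, subsampling stride, and 5% noise level are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def augment_patch(patch, rng):
    """Return augmented variants of a 2-D seismic patch (time samples x traces)."""
    variants = [patch]
    # Flip along the spatial (trace) axis.
    variants.append(patch[:, ::-1])
    # Multi-scale sampling in the spatial direction: keep every 2nd trace,
    # simulating a coarser acquisition geometry.
    variants.append(patch[:, ::2])
    # Additive Gaussian noise scaled to 5% of the patch's RMS amplitude.
    rms = np.sqrt(np.mean(patch ** 2))
    variants.append(patch + 0.05 * rms * rng.standard_normal(patch.shape))
    return variants

rng = np.random.default_rng(0)
patch = rng.standard_normal((64, 32))  # synthetic patch: 64 time samples, 32 traces
augmented = augment_patch(patch, rng)  # 4 variants, including the original
```

Each variant would then be paired with its decimated (missing-trace) counterpart to form an input/label pair for the Unet or Resnet.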
Authors
Chen Rui; Wang Qin (School of Mathematics and Physics, China University of Geosciences, Wuhan, Hubei 430074, China)
Source
Chinese Journal of Engineering Geophysics (《工程地球物理学报》), 2021, Issue 4, pp. 471-478 (8 pages)
Funding
Director's Fund of the Key Laboratory of Geological Survey and Evaluation, Ministry of Education (Grant No. GLAB2020ZR13).
Keywords
seismic data reconstruction
convolutional neural network
data augmentation