Recognition: unknown
Deep learning for denoising
Abstract
Compared with traditional seismic noise attenuation algorithms that depend on signal models and their corresponding prior assumptions, a deep neural network removes noise after being trained on a large training set, where the inputs are the raw datasets and the corresponding outputs are the desired clean data. After training is complete, the deep learning method achieves adaptive denoising with no requirement for (i) accurate modeling of the signal and noise, or (ii) optimal parameter tuning. We call this intelligent denoising. We use a convolutional neural network as the basic tool for deep learning. For random and linear noise attenuation, the training set is generated with artificially added noise. For multiple attenuation, the training set is generated with the acoustic wave equation. Stochastic gradient descent is used to solve for the optimal parameters of the convolutional neural network. The runtime of deep learning on a graphics processing unit for denoising is of the same order as that of the $f$-$x$ deconvolution method. Synthetic and field results show the potential applications of deep learning in automatic attenuation of random noise (with unknown variance), linear noise, and multiples.
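The training scheme the abstract describes — supervised pairs of noisy inputs and clean targets, with the noise added artificially and the parameters fit by stochastic gradient descent — can be sketched in miniature. This is a toy illustration under stated assumptions (a 1-D synthetic trace, a single learnable convolution kernel, hand-picked kernel size and learning rate), not the paper's actual multi-layer 2-D CNN:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128      # trace length (assumed)
K = 9        # size of the single learnable convolution kernel (assumed)
SIGMA = 0.5  # standard deviation of the artificially added random noise

def make_pair():
    """One (noisy, clean) training pair: a windowed wavelet plus random noise."""
    t = np.linspace(0.0, 1.0, N)
    clean = np.sin(2 * np.pi * 8 * t) * np.exp(-((t - 0.5) ** 2) / 0.02)
    noisy = clean + SIGMA * rng.standard_normal(N)
    return noisy, clean

def conv(x, w):
    """'Same'-size cross-correlation of trace x with kernel w."""
    pad = len(w) // 2
    xp = np.pad(x, pad)
    return np.array([xp[i:i + len(w)] @ w for i in range(len(x))])

w = 0.1 * rng.standard_normal(K)  # kernel parameters to be learned
lr = 1e-3

for step in range(2000):
    noisy, clean = make_pair()       # fresh artificial noise each step (SGD)
    resid = conv(noisy, w) - clean   # prediction error against the clean target
    xp = np.pad(noisy, K // 2)
    # Gradient of 0.5 * ||conv(noisy, w) - clean||^2 with respect to each w[k]
    grad = np.array([resid @ xp[k:k + N] for k in range(K)])
    w -= lr * grad
```

After training, applying `conv(noisy, w)` to a fresh trace should yield a lower mean-squared error against the clean trace than the noisy input itself: the kernel adapts to the data without being told the noise variance, which is the "adaptive denoising" point the abstract makes, here reduced to a single linear filter.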
This paper has not been read by Pith yet.
Forward citations
Cited by 2 Pith papers
-
Learning Stratigraphically Consistent Relative Geologic Time from 3D Seismic Data via Sinusoidal Mapping
RGT-Est transforms RGT estimation into a sinusoidal space with joint losses to capture fine horizons and global stratigraphic order from seismic data, outperforming prior AI methods especially with sparse horizon priors.
Discussion (0)