

Style transfer for generation of realistically textured subsurface models



Training datasets consisting of numerous pairs of subsurface models and target variables are essential for building machine learning solutions for geophysical applications. We apply an iterative style transfer approach from image processing to produce realistically textured subsurface models from synthetic prior models. The key idea of style transfer is that the content and texture representations within a convolutional neural network are, to some extent, separable; thus, the style of one image can be transferred to match the content of another. We demonstrate examples in which randomly generated models are stylized to mimic texture patterns from the Marmousi II velocity model and from a section of the BP 2004 benchmark velocity model.
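The separation of content and style the abstract refers to is usually realized, following Gatys-style iterative style transfer, by matching raw CNN feature activations for content and channel-correlation (Gram) matrices for style. As a minimal illustrative sketch (not the authors' implementation), the two loss terms can be written in plain NumPy over a single layer's flattened activations; the shapes, weights `alpha` and `beta`, and function names here are assumptions for illustration:

```python
import numpy as np

def gram_matrix(features):
    """Style representation: correlations between feature channels.

    features: array of shape (channels, height * width), i.e. one
    convolutional layer's activations flattened spatially.
    """
    return features @ features.T / features.shape[1]

def style_content_loss(gen, content, style, alpha=1.0, beta=1e3):
    """Weighted sum of a content loss (direct feature match) and a
    style loss (Gram-matrix match). alpha and beta are illustrative
    weights; in iterative style transfer the generated image is
    optimized to minimize this objective."""
    content_loss = np.mean((gen - content) ** 2)
    style_loss = np.mean((gram_matrix(gen) - gram_matrix(style)) ** 2)
    return alpha * content_loss + beta * style_loss
```

In the full method, these losses are summed over several network layers and the pixels of the generated model are updated by gradient descent, which is what lets a random prior model acquire, e.g., Marmousi-like texture while keeping its own large-scale structure.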

Presentation Date: Wednesday, September 18, 2019

Session Start Time: 8:30 AM

Presentation Time: 9:45 AM

Location: 221D

Presentation Type: Oral