

  1. THE AAPG 2019 ANNUAL CONVENTION & EXHIBITION CONDITIONING STRATIGRAPHIC, RULE-BASED MODELS WITH GENERATIVE ADVERSARIAL NETWORKS: A DEEPWATER LOBE, DEEP LEARNING EXAMPLE HONGGEUN JO, JAVIER E. SANTOS, MICHAEL J. PYRCZ

  2. Agenda • Rule-based Model: Deep-water depositional setting • DCGAN and Image Inpainting • Proposed Method for Data Conditioning • Results • Conclusion Presenter's notes: I will start with the basic idea of rule-based models, including a literature review and the motivation for this study. Two deep learning algorithms are then covered, DCGAN and semantic image inpainting, followed by the method proposed in this study for data conditioning in rule-based models. After presenting the results, I will conclude with the implementation and the key points of this study.

  3. Rule-based model • Simulate sediment dynamics to generate a numerical description of reservoir architecture that captures features informed by geological processes. • Enable the geomodeller to integrate geological concepts directly. • Preserve the resulting geologic heterogeneity and continuity, which are not readily achievable with conventional geostatistical methods. • Referred to as: event-based (Pyrcz and Strebelle, 2006), hybrid (Michael et al., 2010), surface-based (Pyrcz et al., 2005; Bertoncello et al., 2013), process-oriented (Wen, 2005), and rule-based modeling (Pyrcz et al., 2015; Jo et al., 2019). Presenter's notes: With rule-based models, we can 1) integrate geological concepts directly and 2) preserve realistic geological heterogeneity and continuity. In recent work, rule-based modeling is referenced by a variety of names. Despite the different names, these methods share a common approach: they 1) apply depositional rules in a temporal sequence and 2) update the topographic surface accordingly.

  4. Rule-based model • Comparison between rule-based models and conventional geostatistical models: architecture of the deposit vs. heterogeneity/continuity of elements (Pyrcz et al., 2015) Presenter's notes: In this graph, Pyrcz compares the rule-based model with other geostatistical modeling methods. By integrating stacking pattern, forward modeling, topography, and sediment flow paths, rule-based models capture a more realistic deposit architecture while preserving heterogeneity and continuity in the depositional elements.

  5. Rule-based model – Input parameters • Geometry of depositional element: – Ellipsoidal, lobate element (similar to Zie et al., 2000) – Turbidite lobe complex – Controlled by its width, length, and thickness (Deptuck et al., 2008) • Depositional stacking pattern – Random stacking vs. perfect compensational stacking (Jo et al., 2019) – Tendency measured by the compensation index (Straub et al., 2009) – Controlled by the compensation exponent (0 for random, >5 for perfect compensational stacking) • Distribution of petrophysical properties – After building the compositional surfaces, allocate petrophysical properties (i.e., porosity and permeability) with a hierarchical trend model (Pyrcz, 2004) – Coarsening-up at the complex scale, but fining-up is expected within the element scale Presenter's notes: Our rule-based model is designed for deep-water depositional settings, or distal submarine fans, where turbidite lobe complexes are dominant. Three input parameters must be defined in our model: 1) geometry of the depositional element, 2) stacking pattern, and 3) distributions of reservoir properties. Stacking: compensational stacking is the tendency for sediments to preferentially deposit in topographic lows, whereas random stacking means sediments are deposited regardless of topography. Different stacking patterns are commonly observed at different locations and scales, and they can be measured by the compensation index of Straub et al. (2009). The compensation index is mainly controlled by reorganization of the sediment transport field to minimize the potential energy of the natural system (Mutti and Normark, 1987; Stow and Johansson, 2000; Straub et al., 2012). (Presenter's notes continued on next slide)

  6. (Presenter's notes continued from previous slide) Distribution: After building the compositional surfaces, the hierarchical trend model is applied to allocate petrophysical properties.
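To make the three input groups on the previous slide concrete, here is a minimal Python sketch of how they might be collected in one place. The class and field names are illustrative assumptions, not the authors' implementation; the stacking behaviour is reduced to a single scalar exponent as described above.

```python
from dataclasses import dataclass

@dataclass
class LobeModelInputs:
    """Hypothetical grouping of the rule-based model inputs (illustrative only)."""
    # reservoir extent
    extent_x_m: float
    extent_y_m: float
    extent_z_m: float
    # geometry of the ellipsoidal lobe element (width/length/thickness-style controls)
    lobe_radius_m: float
    lobe_thickness_m: float
    # depositional stacking: ~0 -> random, >5 -> effectively perfect compensation
    compensation_exponent: float
    # petrophysical property settings for the hierarchical trend model
    porosity_mean: float
    permeability_mean_md: float
```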

  7. Rule-based model – Flow chart Presenter's notes: 1. First, we set the initial bathymetry, the reservoir extent, and the lobe element geometry. 2. We then use these to calculate the probability map for the center of the next lobe. 3. With the probability map, we apply Monte Carlo simulation to stochastically locate a lobe element. 4. We then update the topography accordingly and recalculate the probability map. 5. This is repeated until the maximum number of iterations is reached. 6. After building the compositional surfaces, the hierarchical trend model is applied to allocate petrophysical properties.
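The loop in these notes can be summarised in a short, self-contained Python sketch. It only illustrates the structure of the workflow under simplifying assumptions (flat initial bathymetry, a radially tapered lobe shape, and a probability map taken as relief raised to the compensation exponent); the function names and details are not the authors' implementation, and the property allocation of step 6 is omitted.

```python
import numpy as np

def place_lobe(topo, cx, cy, radius=15, thickness=1.0):
    """Deposit a radially tapered lobe of sediment centred on grid cell (cx, cy)."""
    ny, nx = topo.shape
    y, x = np.mgrid[0:ny, 0:nx]
    r2 = ((x - cx) ** 2 + (y - cy) ** 2) / radius ** 2
    return topo + thickness * np.clip(1.0 - r2, 0.0, None)

def run_rule_based_model(nx=100, ny=100, n_lobes=50, comp_exponent=5.0, seed=0):
    rng = np.random.default_rng(seed)
    topo = np.zeros((ny, nx))                     # 1. initial bathymetry (flat here)
    for _ in range(n_lobes):                      # 5. repeat to the maximum iteration
        relief = topo.max() - topo + 1e-6         # 2. probability map favours lows
        p = relief ** comp_exponent               #    exponent ~0 -> random stacking
        p /= p.sum()
        cell = rng.choice(p.size, p=p.ravel())    # 3. Monte Carlo lobe placement
        cy, cx = np.unravel_index(cell, p.shape)
        topo = place_lobe(topo, cx, cy)           # 4. update the topography
    return topo                                   # 6. petrophysics assigned afterwards

surface = run_rule_based_model()
```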

  8. Rule-based model – Deep-water lobe reservoir • 5 km x 5 km x 60 m reservoir extent • Lobe element: 750 m in radius, 10 m in thickness • Perfect compensational stacking • Two different scales of hierarchical trends (Jo et al., 2019) Presenter's notes: The figure shows an example of our rule-based model, which has 5 km x 5 km x 60 m dimensions. The lobe element is 750 m in radius and 10 m in thickness. Perfect compensation is assumed, and two different scales of hierarchical trends are observed.
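Using the hypothetical LobeModelInputs container sketched earlier, the example reservoir on this slide could be written down as follows; the petrophysical values are placeholders, since the slide does not report them.

```python
example = LobeModelInputs(
    extent_x_m=5000.0, extent_y_m=5000.0, extent_z_m=60.0,  # 5 km x 5 km x 60 m
    lobe_radius_m=750.0, lobe_thickness_m=10.0,
    compensation_exponent=6.0,   # >5, i.e. effectively perfect compensational stacking
    porosity_mean=0.20,          # placeholder values, not reported on the slide
    permeability_mean_md=200.0,
)
```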

  9. Rule-based model – Limitation • Conditioning to well data is an obstacle to broadening the application of rule-based models in reservoir modeling. • Pyrcz (2004) generated multiple candidate surfaces, accepted them based on minimum misfit, and added stochastic residuals to match the data. • Michael et al. (2010) combined rule-based models with conventional geostatistical methods. • Bertoncello et al. (2013) selected the most significant parameters and used a sequential optimization scheme. • However, all of these attempts have strengths and weaknesses, and robust, direct conditioning to dense well data remains unsolved. Presenter's notes: There have been several attempts to solve the data conditioning problem (Pyrcz, 2004; Michael et al., 2010; Bertoncello et al., 2013).
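For illustration only, the accept-by-misfit idea attributed to Pyrcz (2004) can be sketched as below. This is a schematic, not the original algorithm: it substitutes a simple inverse-distance-weighted correction for the stochastic residuals used in that work, and all names are hypothetical.

```python
import numpy as np

def idw_correction(errors, wells_rc, shape, power=2.0):
    """Spread the well mismatches over the grid by inverse-distance weighting."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    num, den = np.zeros(shape), np.zeros(shape)
    for (r, c), e in zip(wells_rc, errors):
        w = (np.hypot(yy - r, xx - c) + 1e-6) ** -power
        num += w * e
        den += w
    return num / den

def condition_to_wells(candidates, wells_rc, well_tops):
    """Pick the candidate surface with minimum misfit at the wells, then correct it."""
    rows = [rc[0] for rc in wells_rc]
    cols = [rc[1] for rc in wells_rc]
    well_tops = np.asarray(well_tops, dtype=float)
    misfits = [np.sum((s[rows, cols] - well_tops) ** 2) for s in candidates]
    best = candidates[int(np.argmin(misfits))]
    return best + idw_correction(well_tops - best[rows, cols], wells_rc, best.shape)
```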

  10. Bridges from RB to ML • A machine that could make the model – Learn the features – Put the features into the reservoir while conserving the heterogeneity/continuity of the rule-based model • Broadening applications – Conditioning hard data (e.g., well logs, core samples) – Navigating the reservoir manifold Presenter's notes: The overall goal of this study is to build a bridge between rule-based modeling and machine learning to broaden the application of rule-based models. If a machine can 1) learn the features of rule-based models and 2) put those features into the reservoir models directly, we can solve the conditioning problem and navigate the reservoir manifold. Moreover, we can use the machine for dimensionality reduction and for optimization problems such as history matching. In this study, we focus on the first two items.

  11. DCGAN • Generative Adversarial Networks (GANs): a framework for training generative models in an adversarial manner against discriminative models (Goodfellow et al., 2014) • DCGAN: applies Convolutional Neural Networks (CNNs) to the GAN framework to improve its performance on high-resolution images (Radford et al., 2015) • DCGAN consists of – Generative model (G): maps a latent vector z to image space – Discriminative model (D): maps an input image to the probability that it is a real image • The loss function of DCGAN is: \min_G \max_D V(D, G) = \mathbb{E}_x[\log D(x)] + \mathbb{E}_z[\log(1 - D(G(z)))], where x is a sample from the real images and z is a random variable from the latent space. Presenter's notes: Two machine learning algorithms are used in our study: the first is DCGAN and the second is semantic image inpainting. Generative Adversarial Networks, or GANs, are a framework for training generative models in an adversarial manner against discriminative models. After GANs were first suggested by Goodfellow in 2014, Radford improved their performance by using CNNs and named the algorithm DCGAN. In DCGAN, there are two different models, a generative model and a discriminative model, as shown in this figure. Real images from the training data set and fake images from the generative model are input to the discriminative model sequentially. (Presenter's notes continued on next slide)

  12. (Presenter's notes continued from previous slide) The discriminative model is trained to distinguish real images from fake ones, while the generative model is trained to generate more realistic images to deceive the discriminative model. The formula shown on the previous slide represents these processes.
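The adversarial objective described above can be made concrete with a short PyTorch sketch of one training step. The tiny fully connected generator and discriminator are stand-ins for illustration only; the study itself uses DCGAN's convolutional architectures, and the generator update below uses the common non-saturating variant of the min-max loss.

```python
import torch
import torch.nn as nn

latent_dim, img_size = 100, 64

G = nn.Sequential(nn.Linear(latent_dim, img_size * img_size), nn.Tanh())  # z -> image
D = nn.Sequential(nn.Linear(img_size * img_size, 1), nn.Sigmoid())        # image -> P(real)

bce = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

def train_step(real_images):
    batch = real_images.size(0)
    real_flat = real_images.view(batch, -1)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator: maximise log D(x) + log(1 - D(G(z)))
    fake = G(torch.randn(batch, latent_dim)).detach()
    loss_D = bce(D(real_flat), ones) + bce(D(fake), zeros)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # Generator: deceive the discriminator (non-saturating form of the min-max loss)
    loss_G = bce(D(G(torch.randn(batch, latent_dim))), ones)
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()
    return loss_D.item(), loss_G.item()

losses = train_step(torch.rand(8, img_size, img_size))  # one step on a dummy batch
```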
