Multiscale fusion of digital rock images based on deep generative adversarial networks
  • Mingliang Liu, Stanford University (Corresponding Author: [email protected])
  • Tapan Mukerji, Stanford University

Abstract

Computation of petrophysical properties from digital rock images is becoming increasingly important in geoscience. However, such computation is usually complicated for natural heterogeneous porous media because of their multiscale pore structures. To capture the heterogeneity of rocks, we develop a method based on deep generative adversarial networks that assimilates multiscale imaging data to generate synthetic high-resolution digital rocks with a large field of view. The reconstructed images not only honor the geometric structures of 3-D micro-CT images but also recover fine details present at the scale of 2-D scanning electron microscopy images. Furthermore, the consistency between the real and synthetically generated images in terms of porosity, specific surface area, two-point correlation, and effective permeability confirms the validity of the proposed method. The approach provides an effective way to fuse multiscale digital rock images for better characterization of heterogeneous porous media and better prediction of pore-scale flow and petrophysical properties.
Published 16 May 2022 in Geophysical Research Letters, volume 49, issue 9. DOI: 10.1029/2022GL098342
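
The abstract validates the synthetic images against the real data using porosity, specific surface area, two-point correlation, and effective permeability. As a minimal sketch (not the authors' implementation), the Python code below computes two of these statistics, porosity and a directional two-point correlation, on a binary 3-D digital rock image; the function names and the synthetic test volume are hypothetical and stand in for a segmented micro-CT or SEM-derived image.

```python
import numpy as np

def porosity(image):
    """Fraction of pore voxels in a binary {0, 1} image (1 = pore, 0 = grain)."""
    return image.mean()

def two_point_correlation(image, max_lag, axis=0):
    """Probability that two voxels separated by a lag along `axis` are both pore,
    estimated for lags 0..max_lag."""
    s2 = []
    for lag in range(max_lag + 1):
        a = np.take(image, range(0, image.shape[axis] - lag), axis=axis)
        b = np.take(image, range(lag, image.shape[axis]), axis=axis)
        s2.append(np.mean(a * b))  # joint pore probability at this lag
    return np.array(s2)

# Hypothetical example: a random 64^3 binary volume with ~20% pore fraction,
# used only to exercise the two statistics above.
rng = np.random.default_rng(0)
rock = (rng.random((64, 64, 64)) < 0.2).astype(np.uint8)
print("porosity:", porosity(rock))
print("S2(lag 0..4):", two_point_correlation(rock, max_lag=4))
```

Note that S2 at lag 0 equals the porosity, and for an uncorrelated medium it decays toward the porosity squared at large lags; comparing such curves between real and generated volumes is one way the kind of consistency reported in the abstract can be checked.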