Multiscale fusion of digital rock images based on deep generative
adversarial networks
- Mingliang Liu
- Tapan Mukerji
Abstract
Computation of petrophysical properties from digital rock images is
becoming important in geoscience. However, such computation is usually
complicated for natural heterogeneous porous media due to the presence of
multiscale pore structures. To capture the heterogeneity of rocks, we develop a
method based on deep generative adversarial networks to assimilate
multiscale imaging data for the generation of synthetic high-resolution
digital rocks with a large field of view. The reconstructed images not
only honor the geometric structures of 3-D micro-CT images but also
recover fine details existing at the scale of 2-D scanning electron
microscopy images. Furthermore, the consistency between the real and
synthetically generated images in terms of porosity, specific surface
area, two-point correlation, and effective permeability demonstrates the
validity of our proposed method. The approach provides an effective way to fuse
multiscale digital rock images for better characterization of
heterogeneous porous media and better prediction of pore-scale flow and
petrophysical properties.
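
The abstract describes an adversarial setup in which a generator produces high-resolution 3-D volumes and a discriminator enforces realistic fine-scale texture. The following is a minimal illustrative sketch, not the paper's actual architecture: it assumes paired low-resolution and high-resolution 3-D training patches (whereas the paper obtains fine detail from 2-D SEM images), and the module names `Generator3D`, `Discriminator3D`, and `train_step` are hypothetical.

```python
# Minimal sketch of a 3-D super-resolution GAN, assuming paired LR/HR patches.
import torch
import torch.nn as nn

class Generator3D(nn.Module):
    """Upsamples a low-resolution volume by 2x with 3-D convolutions."""
    def __init__(self, channels=1, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(channels, features, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False),
            nn.Conv3d(features, features, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(features, channels, 3, padding=1),
            nn.Sigmoid(),  # grayscale output in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

class Discriminator3D(nn.Module):
    """Scores 3-D patches as real (high-resolution) or generated."""
    def __init__(self, channels=1, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(channels, features, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(features, 2 * features, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.AdaptiveAvgPool3d(1),
            nn.Flatten(),
            nn.Linear(2 * features, 1),
        )

    def forward(self, x):
        return self.net(x)

def train_step(gen, disc, opt_g, opt_d, lr_vol, hr_vol, content_weight=10.0):
    """One adversarial update: discriminator step, then generator step."""
    bce = nn.BCEWithLogitsLoss()
    l1 = nn.L1Loss()

    # Discriminator: distinguish real HR patches from generated volumes.
    opt_d.zero_grad()
    sr_vol = gen(lr_vol).detach()
    d_real = disc(hr_vol)
    d_fake = disc(sr_vol)
    loss_d = bce(d_real, torch.ones_like(d_real)) + \
             bce(d_fake, torch.zeros_like(d_fake))
    loss_d.backward()
    opt_d.step()

    # Generator: fool the discriminator while staying close to the HR target.
    opt_g.zero_grad()
    sr_vol = gen(lr_vol)
    d_fake = disc(sr_vol)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + content_weight * l1(sr_vol, hr_vol)
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

if __name__ == "__main__":
    gen, disc = Generator3D(), Discriminator3D()
    opt_g = torch.optim.Adam(gen.parameters(), lr=1e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
    lr_vol = torch.rand(2, 1, 16, 16, 16)  # stand-in for coarse micro-CT patches
    hr_vol = torch.rand(2, 1, 32, 32, 32)  # stand-in for high-resolution targets
    print(train_step(gen, disc, opt_g, opt_d, lr_vol, hr_vol))
```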
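
Two of the validation metrics mentioned above, porosity and the two-point correlation, can be computed directly on a segmented binary image. The sketch below is an assumption for illustration (NumPy, FFT-based estimator with periodic boundaries), not the paper's implementation; effective permeability would additionally require a pore-scale flow solver and is not shown.

```python
# Porosity and radially averaged two-point correlation on a binary
# (pore = 1, grain = 0) volume; function names are hypothetical.
import numpy as np

def porosity(image):
    """Fraction of pore voxels in a binary pore/grain image."""
    return image.mean()

def two_point_correlation(image, max_lag=None):
    """Radially averaged S2(r): probability that two voxels a distance r
    apart are both pore. Uses FFT autocorrelation (periodic boundaries)."""
    img = image.astype(float)
    n = img.size
    # Circular autocorrelation via the Wiener-Khinchin theorem.
    spectrum = np.abs(np.fft.fftn(img)) ** 2
    autocorr = np.fft.ifftn(spectrum).real / n
    # Wrap-around lag distance of every voxel from the origin.
    grids = np.meshgrid(*[np.fft.fftfreq(s) * s for s in img.shape], indexing="ij")
    r = np.sqrt(sum(g ** 2 for g in grids)).round().astype(int)
    max_lag = max_lag or min(img.shape) // 2
    s2 = np.array([autocorr[r == k].mean() for k in range(max_lag)])
    return s2  # s2[0] equals the porosity; s2 decays toward porosity**2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in binary volume; replace with a segmented micro-CT image.
    vol = (rng.random((64, 64, 64)) < 0.2).astype(np.uint8)
    print("porosity:", porosity(vol))
    print("S2(0..4):", two_point_correlation(vol, max_lag=5))
```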