Yoonjin Lee


Imagery from the GOES series has been a key element of U.S. operational weather forecasting for four decades. While GOES observations are used extensively by human forecasters for situational awareness, GOES imager data have seen limited use in numerical weather prediction (NWP), and operational data assimilation (DA) has ignored cloudy and precipitating pixels. The motivation of this project is to bring the benefits of the GOES-R Series' enhanced capabilities to convective-scale DA and thereby improve convective-scale forecasts. We have developed a convolutional neural network (CNN) prototype, dubbed “GOES Radar Estimation via Machine Learning to Inform NWP” (GREMLIN), that fuses GOES-R Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) information to produce maps of synthetic composite radar reflectivity. We find that the ability of CNNs to exploit spatial context is essential for this application and offers a breakthrough improvement in skill over traditional pixel-by-pixel approaches. Making use of ABI spatial information also offers potential benefits over radiance-assimilation approaches, which are limited by saturation of radiances in precipitating pixels. This presentation will briefly describe the GREMLIN model and characterize its performance across meteorological regimes. We are developing a dense neural network (DNN) to produce vertical profiles of radar reflectivity and latent heating from the two-dimensional fields output by GREMLIN. The resulting three-dimensional fields of latent heating will be used to initialize NWP simulations of convective-scale phenomena. Another DNN will be developed to produce per-pixel uncertainty estimates of latent heating.
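To illustrate why spatial context matters for this kind of retrieval, the toy sketch below (not the GREMLIN architecture; the field values and kernel are invented for illustration) applies a single 3x3 convolution pass to a small brightness-temperature grid. A pixel-by-pixel mapping sees only one value at a time, whereas each convolved output value blends its whole neighborhood, so an isolated cold pixel and the center of a broad cold cloud core yield different responses.

```python
# Toy sketch: one 3x3 convolution pass over a brightness-temperature grid,
# showing how a CNN layer draws on spatial context that a pixel-by-pixel
# mapping cannot see. Values and kernel are illustrative only.

def conv2d(grid, kernel):
    """Valid-mode 2D convolution of a 2D list `grid` with a 3x3 `kernel`."""
    h, w = len(grid), len(grid[0])
    out = []
    for i in range(1, h - 1):
        row = []
        for j in range(1, w - 1):
            acc = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    acc += grid[i + di][j + dj] * kernel[di + 1][dj + 1]
            row.append(acc)
        out.append(row)
    return out

# Invented brightness-temperature field (K): a cold cloud-top core at center.
bt = [
    [290, 290, 290, 290, 290],
    [290, 250, 240, 250, 290],
    [290, 240, 210, 240, 290],
    [290, 250, 240, 250, 290],
    [290, 290, 290, 290, 290],
]

# 3x3 averaging kernel: each output pixel blends its neighborhood, so the
# response at the center depends on the surrounding cold core, not just
# the single coldest pixel.
k = [[1.0 / 9.0] * 3 for _ in range(3)]
smoothed = conv2d(bt, k)
print(smoothed[1][1])  # center pixel now reflects its whole neighborhood
```

In a trained CNN the kernels are learned rather than fixed averages, but the principle is the same: the output at each pixel is a function of a neighborhood, which is what lets spatial patterns carry information.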
Our approach to data assimilation will be described; it is innovative in that it marks the first time machine learning (ML) will be used as the nonlinear latent-heating observation operator in the NOAA hybrid Gridpoint Statistical Interpolation (GSI) and/or JEDI systems. The approach provides all the elements of the Jacobian needed for GSI DA and has the advantage of automatically maintaining the tangent-linear and adjoint models through finite-difference mathematics.
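The finite-difference idea above can be sketched in a few lines (an assumed illustration, not the operational GSI/JEDI code; the toy operator `H` stands in for an ML latent-heating operator): perturbing each element of the state vector in turn yields the columns of the Jacobian, whose matrix-vector product gives the tangent-linear model and whose transpose gives the adjoint, so neither needs to be hand-derived and maintained separately.

```python
# Sketch (assumed, not operational DA code): Jacobian of a nonlinear
# observation operator H via one-sided finite differences. J @ dx is the
# tangent-linear model; J^T applied to an observation-space vector is the
# adjoint, so both come for free from the nonlinear operator.

def jacobian_fd(H, x, eps=1e-6):
    """Finite-difference Jacobian of H: R^n -> R^m at state vector x."""
    y0 = H(x)
    m, n = len(y0), len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x)
        xp[j] += eps          # perturb one state variable at a time
        yp = H(xp)
        for i in range(m):
            J[i][j] = (yp[i] - y0[i]) / eps
    return J

# Hypothetical toy operator standing in for an ML latent-heating operator.
def H(x):
    return [x[0] * x[1], x[0] + x[1] ** 2]

J = jacobian_fd(H, [2.0, 3.0])
# Analytic Jacobian at (2, 3) is [[3, 2], [1, 6]] for comparison.
```

For an ML operator the same loop works unchanged, since the network is just a nonlinear function of the state vector; the cost is one extra forward evaluation per state dimension.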
Radiosonde observations are the gold standard for quantifying vertical profiles of atmospheric state variables, knowledge of which is critical for quantifying moisture and instability, two main ingredients for severe weather. Unfortunately, radiosondes are very sparse, averaging just one observation per 500 x 500 km area over CONUS, and most locations have only two observations per day. This creates uncertainty in the representation of short-wavelength and rapidly evolving synoptic and mesoscale features in numerical weather prediction (NWP) and provides few points of comparison for human forecasters interpreting NWP output when making forecasts. To fill this gap in our knowledge of the atmospheric state, human forecasters use satellite imagery to estimate airmass properties and mentally increment NWP output. Data from geostationary satellites have been especially useful because of their high temporal resolution (5 minutes) and high spatial resolution (2 km). While the Advanced Baseline Imager (ABI) was not designed as a sounding sensor, its three water vapor bands and three infrared window bands do provide some sounding capability. Satellite data are particularly useful for assessing position and timing errors, the representation of short waves, and humidity. The key question addressed by this work is whether the mental process used by human forecasters can be translated into a machine learning (ML) algorithm that provides automated, objective estimates of airmass properties from ABI. Experiments with convolutional neural networks (CNNs) show that ML can indeed be used this way. Related research efforts, such as the NOAA Unique Combined Atmospheric Processing System (NUCAPS), have explored the use of dense neural networks (DNNs), which essentially replace a radiative transfer model with an ML model. However, we find that more skill can be achieved by making use of the spatial information captured by CNNs.
This more closely mimics the human imagery-interpretation process: it is the spatial patterns in the features (as much as the pixel-wise values themselves) that carry the useful information content. We will present our latest results, focusing especially on relative humidity, compare them against radiosondes, and discuss whether the skill is sufficient to make a positive impact on NWP analyses.