An efficient Bayesian approach to learning droplet collision kernels:
Proof of concept using “Cloudy”, a new n-moment bulk microphysics
scheme
Abstract
The small-scale microphysical processes governing the formation of
precipitation particles cannot be resolved explicitly by cloud-resolving
and climate models. Instead, they are represented by microphysics
schemes that are based on a combination of theoretical knowledge,
statistical assumptions, and fitting to data (“tuning”). Historically,
tuning was done in an ad hoc fashion, leading to parameter choices that
are not explainable or repeatable. Recent work has treated it as an
inverse problem that can be solved by Bayesian inference. The posterior
distribution of the parameters given the data—the solution of Bayesian
inference—is found through computationally expensive sampling methods,
which typically require O(10^5) or more evaluations of the forward model; this
cost is prohibitive for many models. We present a proof-of-concept of Bayesian
learning applied to a new bulk microphysics scheme named “Cloudy”,
using the recently developed Calibrate-Emulate-Sample (CES) algorithm.
Cloudy models collision-coalescence and collisional breakup of cloud
droplets with an adjustable number of prognostic moments and with easily
modifiable assumptions for the cloud droplet mass distribution and the
collision kernel. The CES algorithm uses machine learning tools to
accelerate Bayesian inference by reducing the number of forward
evaluations needed to O(10^2). It also exhibits a smoothing effect
when forward evaluations are polluted by noise. In a suite of
perfect-model experiments, we show that CES enables computationally
efficient Bayesian inference of parameters in Cloudy from noisy
observations of moments of the droplet mass distribution. In an
additional imperfect-model experiment, a collision kernel parameter is
successfully learned from output generated by a Lagrangian
particle-based microphysics model.
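As a minimal illustration of the kind of moment dynamics a bulk scheme like Cloudy evolves, the sketch below integrates the zeroth and first moments of the droplet mass distribution for collision-coalescence with a constant collision kernel, the one case where the moment equations close exactly. The constant kernel, the parameter values, and the function name are illustrative assumptions for this sketch, not Cloudy's actual kernel or configuration.

```python
def constant_kernel_moments(m0_init, m1_init, b, t_end, dt=1e-4):
    """Integrate the first two moments of the droplet mass distribution
    under collision-coalescence with a constant collision kernel b.

    For the Smoluchowski coagulation equation with a constant kernel the
    moment equations close exactly:
        dM0/dt = -0.5 * b * M0**2   # droplet number decreases
        dM1/dt = 0                  # total mass is conserved
    Integrated here with forward Euler; dt is an illustrative choice.
    """
    m0, m1 = m0_init, m1_init
    for _ in range(int(t_end / dt)):
        m0 += dt * (-0.5 * b * m0 * m0)
    return m0, m1

# Analytic solution for comparison: M0(t) = M0(0) / (1 + 0.5*b*M0(0)*t)
m0_num, m1_num = constant_kernel_moments(m0_init=1.0, m1_init=1.0,
                                         b=2.0, t_end=1.0)
m0_exact = 1.0 / (1.0 + 0.5 * 2.0 * 1.0 * 1.0)  # = 0.5
```

For more realistic kernels (e.g. hydrodynamic kernels with collision efficiencies) the moment equations do not close, which is where the closure assumptions and tunable kernel parameters discussed in the abstract enter.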