Old dog, new trick: Reservoir computing advances machine learning for
climate modeling
Abstract
Physics-informed machine learning (ML) applied to geophysical simulation
is developing explosively. Recently, graph neural network and vision
transformer architectures have shown 1-7 day global weather forecast
skill superior to any conventional model, with integration times over
1000 times faster, but their skill degrades rapidly in longer simulations. ML that
achieves high skill in both weather and climate applications is a
tougher goal. This Commentary was inspired by
\citeA{ArcomanoEtAl2023}, who show impressive progress
toward that goal using hybrid ML, combining reservoir computing with a
coarse-grid climate model and coupling it to a separate data-driven
reservoir computing model that interactively predicts sea-surface
temperature. This opens new horizons: where will the next ML
breakthrough come from, and is conventional climate modeling about to be
disrupted?