Old dog, new trick: Reservoir computing advances machine learning for climate modeling
  • Christopher S. Bretherton
Allen Institute for Artificial Intelligence

Corresponding Author: christopherb@allenai.org

Physics-informed machine learning (ML) applied to geophysical simulation is developing explosively. Recently, graph neural network and vision transformer architectures have shown 1-7 day global weather forecast skill superior to any conventional model while running more than 1000 times faster, but longer simulations rapidly degrade. ML that achieves high skill in both weather and climate applications is a tougher goal. This Commentary was inspired by \citeA{ArcomanoEtAl2023}, who show impressive progress toward that goal using hybrid ML: they combine reservoir computing with a coarse-grid climate model and couple it to a separate, data-driven reservoir computing model that interactively predicts sea-surface temperature. This opens new horizons: where will the next ML breakthrough come from, and is conventional climate modeling about to be disrupted?
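For readers unfamiliar with the technique, the core idea of reservoir computing (in its echo state network form) is that a large, fixed random recurrent network nonlinearly expands an input time series, and only a linear readout is trained. The sketch below is a minimal illustration on a toy one-step-ahead prediction task; the network sizes, spectral radius, and ridge penalty are illustrative choices, not the configuration used by \citeA{ArcomanoEtAl2023}.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1  # reservoir size and input dimension (illustrative)

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled so its spectral radius is below 1 (echo state property).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with input sequence u, collecting states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave from its history.
t = np.arange(0, 60, 0.1)
u = np.sin(t)
X = run_reservoir(u[:-1])  # reservoir states for each input step
Y = u[1:]                  # one-step-ahead targets

# Discard an initial washout period, then train ONLY the linear
# readout W_out by ridge regression -- the reservoir stays fixed.
washout, ridge = 100, 1e-6
Xw, Yw = X[washout:], Y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + ridge * np.eye(n_res), Xw.T @ Yw)

pred = X @ W_out
err = np.sqrt(np.mean((pred[washout:] - Y[washout:]) ** 2))
print(f"one-step RMSE: {err:.4f}")
```

Because training reduces to a single linear solve, reservoir computers are cheap to fit and fast to run, which is part of their appeal for hybrid climate modeling, where the ML component must stay stable over very long integrations.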
Submitted to ESS Open Archive: 20 Apr 2023
Published in ESS Open Archive: 30 Apr 2023