
Mengze Wang

and 3 more

Prediction of extreme events under climate change is challenging but essential for risk management of natural disasters. Although earth system models (ESMs) are arguably our best tool for predicting climate extremes, their high computational cost restricts their application to projecting only a few future scenarios. Emulators, or reduced-complexity models, complement ESMs by providing fast predictions of the local response to a variety of climate change scenarios. Here we propose a data-driven framework to emulate the full statistics of spatially resolved climate extremes. The variable of interest is the near-surface daily maximum temperature. The spatial patterns of temperature variations are assumed to be independent of time and are extracted using Empirical Orthogonal Functions (EOFs). The time dependence is encoded in the coefficients of the leading EOFs, which are decomposed into long-term seasonal variations and daily fluctuations. The former are assumed to be functions of the global mean temperature, while the latter are modelled as Gaussian stochastic processes with temporal correlation conditioned on the season. The emulator is trained and tested on simulation data from CMIP6. By generating multiple realizations, the emulator accurately predicts the temporal evolution of the probability distribution of local daily maximum temperature. Furthermore, the uncertainty of the emulated statistics is quantified to account for internal variability. The emulation accuracy on testing scenarios remains consistent with that on the training datasets. The performance of the emulator suggests that the proposed framework can be generalized to other climate extremes and more complicated climate change scenarios.
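A minimal sketch of the emulation idea described in this abstract, not the authors' code: spatial patterns are obtained from an SVD of daily-maximum-temperature anomalies, the leading-EOF coefficients are split into a slow component regressed on global mean temperature plus an annual harmonic, and the daily residuals are modelled as an AR(1) Gaussian process. All array shapes, the synthetic data, and the AR(1) choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a (time, space) field of daily maximum temperature.
n_days, n_grid = 3650, 300
tasmax = rng.normal(size=(n_days, n_grid)).cumsum(axis=0) * 0.01 \
         + 10 * np.sin(2 * np.pi * np.arange(n_days)[:, None] / 365.25)
gmt = 0.002 * np.arange(n_days) + rng.normal(scale=0.1, size=n_days)  # global mean temperature proxy

# 1) EOFs: time-invariant spatial patterns from an SVD of the anomaly field.
anom = tasmax - tasmax.mean(axis=0)
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
n_modes = 5
eofs = Vt[:n_modes]                 # (n_modes, n_grid) spatial patterns
pcs = U[:, :n_modes] * S[:n_modes]  # (n_days, n_modes) time coefficients

# 2) Slow/seasonal part of each coefficient as a function of GMT plus an annual harmonic.
doy = np.arange(n_days) % 365
X = np.column_stack([np.ones(n_days), gmt,
                     np.sin(2 * np.pi * doy / 365), np.cos(2 * np.pi * doy / 365)])
beta, *_ = np.linalg.lstsq(X, pcs, rcond=None)
pcs_slow = X @ beta
resid = pcs - pcs_slow

# 3) Daily fluctuations as an AR(1) Gaussian process fitted per mode.
phi = np.array([np.corrcoef(resid[:-1, k], resid[1:, k])[0, 1] for k in range(n_modes)])
sigma = resid.std(axis=0) * np.sqrt(1 - phi**2)

def emulate_once():
    """Generate one stochastic realization of the local daily maximum temperature field."""
    eps = np.zeros_like(resid)
    for t in range(1, n_days):
        eps[t] = phi * eps[t - 1] + sigma * rng.normal(size=n_modes)
    return tasmax.mean(axis=0) + (pcs_slow + eps) @ eofs

# Multiple realizations give the emulated statistics of the local extremes.
ensemble = np.stack([emulate_once() for _ in range(10)])
print(ensemble.shape)  # (10, n_days, n_grid)
```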

Shixuan Zhang

and 6 more

Large-scale dynamical and thermodynamical processes are common environmental drivers of extreme weather events. However, such large-scale environmental conditions often display systematic biases in climate simulations, posing challenges to evaluating extreme weather events and the associated risks in the current and future climate. In this paper, a machine learning (ML) approach was employed to bias correct the large-scale wind, temperature, and humidity simulated by the E3SM atmosphere model at $\sim 1^\circ$ resolution. The usefulness of the proposed ML approach for extreme weather analysis was demonstrated with a focus on three types of extreme weather events: tropical cyclones (TCs), extratropical cyclones (ETCs), and atmospheric rivers (ARs). We show that the ML model can effectively reduce climate biases in large-scale wind, temperature, and humidity while preserving their responses to imposed climate change perturbations. The bias correction directly improves the water vapor transport associated with ARs and the representation of the thermodynamic flows associated with ETCs. When the bias-corrected large-scale winds are used to drive a synthetic TC track forecast model over the Atlantic basin, the resulting TC track density agrees better with that of the TC track model driven by observed winds. In addition, the ML model does not significantly alter the mean climate change signals of large-scale storm environments, or the occurrence and intensity of the three types of extreme events. This study suggests that the proposed ML approach can improve the downscaling of extreme weather events by providing more realistic large-scale storm environments from low-resolution climate models.
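An illustrative sketch of the bias-correction setup, not the paper's actual ML architecture: a regression is fitted to map the low-resolution model's large-scale wind, temperature, and humidity toward a reference (e.g., reanalysis) on a training period, and is then applied to a perturbed-climate simulation to check that the imposed change signal is preserved. A per-grid-point ridge regression stands in for the ML model; the field names, shapes, and synthetic bias are assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_time, n_grid, n_vars = 2000, 300, 3                       # e.g., (u, T, q) at each grid point

truth = rng.normal(size=(n_time, n_grid, n_vars))           # reference (reanalysis-like) fields
bias = 0.5 + 0.3 * rng.normal(size=(n_grid, n_vars))        # systematic model bias
model = truth * 1.1 + bias + 0.2 * rng.normal(size=truth.shape)

# Fit one multi-output ridge per grid point: biased model fields -> reference fields.
correctors = [Ridge(alpha=1.0).fit(model[:, j, :], truth[:, j, :]) for j in range(n_grid)]

def bias_correct(fields):
    """Apply the trained per-grid-point correction to new model output."""
    out = np.empty_like(fields)
    for j, reg in enumerate(correctors):
        out[:, j, :] = reg.predict(fields[:, j, :])
    return out

# A warming-perturbed simulation: the correction should remove the systematic bias
# while leaving the imposed climate-change signal (+1 everywhere here) intact.
future = (truth + 1.0) * 1.1 + bias + 0.2 * rng.normal(size=truth.shape)
corrected = bias_correct(future)
print(np.round(corrected.mean() - truth.mean(), 2))          # close to the imposed +1.0 signal
```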

Stephen Guth

and 2 more

For many design applications in offshore engineering, including offshore wind turbine foundations, engineers need accurate statistics for kinematic and dynamic quantities, such as hydrodynamic forces, whose statistics depend on the stochastic sea surface elevation. Nonlinear phenomena in the wave-structure interaction require high-fidelity simulations to be analyzed accurately. However, accurate quantification of the statistics requires a massive number of simulations, whose computational cost is prohibitive. To avoid that cost, this study presents a machine learning framework for developing a reliable surrogate model that minimizes the need for computationally expensive numerical simulations; the framework is implemented for the monopile foundation of an offshore wind turbine. The framework consists of two parts. The first focuses on dimensionality reduction of the stochastic irregular wave episodes and the resulting hydrodynamic force time series. The second focuses on the development of a Gaussian process regression surrogate model that learns a mapping between wave episodes and the force on the structure. This surrogate uses a Bayesian active learning method that sequentially samples the wave episodes most likely to contribute to accurate prediction of extreme hydrodynamic forces, and these samples are used to design subsequent CFD simulations. Additionally, the study implements a spectrum transfer technique to combine CFD results from quiescent and extreme waves. The principal advantage of this framework is that the trained surrogate model is orders of magnitude faster to evaluate than classical modeling methods, while its built-in uncertainty quantification allows for efficient sampling of the parameter space with the CFD tools traditionally employed.
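A minimal sketch of the surrogate-plus-active-learning loop described above, not the authors' implementation: a Gaussian process maps a low-dimensional wave-episode descriptor to a peak hydrodynamic force, and new "simulations" are chosen where the predicted force and the predictive uncertainty are both large, a simple stand-in for the Bayesian active learning criterion in the study. The expensive CFD solver is replaced by a cheap analytic stand-in, and the two-dimensional descriptor is an assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

def cfd_peak_force(theta):
    """Stand-in for an expensive CFD run: peak force for a 2D wave-episode descriptor."""
    a, k = theta
    return a**3 * np.exp(-0.5 * (k - 1.0)**2) + 0.05 * rng.normal()

# Candidate wave episodes, reduced to two descriptors (e.g., amplitude, steepness).
candidates = rng.uniform([0.0, 0.2], [2.0, 2.0], size=(500, 2))

# Start from a handful of quiescent episodes.
idx = list(rng.choice(len(candidates), size=5, replace=False))
X = candidates[idx]
y = np.array([cfd_peak_force(x) for x in X])

kernel = 1.0 * RBF(length_scale=[0.5, 0.5]) + WhiteKernel(1e-3)
for it in range(15):                          # sequential design of the next CFD runs
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mu, std = gp.predict(candidates, return_std=True)
    score = mu + 2.0 * std                    # favour episodes that look extreme and uncertain
    score[idx] = -np.inf                      # do not re-run episodes already simulated
    j = int(np.argmax(score))
    idx.append(j)
    X = np.vstack([X, candidates[j]])
    y = np.append(y, cfd_peak_force(candidates[j]))

print("largest observed peak force:", y.max())
```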

Bianca Champenois

and 1 more

Despite advancements in computational science, nonlinear geophysical processes still present important modeling challenges. Physical sensors (such as satellites, AUVs, or buoys) can collect data at specific points or regions, but these data are often scarce or inaccurate. Here, we present a method to build improved spatio-temporal models that combine dynamics inferred from high-fidelity numerical models (reanalysis data) with data from sensors. We are motivated by a data set of ocean temperature in which sensor measurements are only available at the surface of the ocean. We first employ reanalysis data in the form of a 3D temperature field and apply standard principal component analysis (PCA) at every ocean surface coordinate. For each coordinate, the vertical structure of the field can be represented with just two PCA modes and their corresponding time coefficients, significantly reducing the dimensionality of the data. Next, a conditionally Gaussian model, implemented through a temporal convolutional neural network, is built to predict the time coefficients of the PCA modes (i.e., the vertical structure), as well as their variance, as a function of the surface temperature. These probabilistic predictions take the satellite data as input and are used with the PCA modes to stochastically reconstruct the full temperature field. The estimated temperature field is then combined with data from buoys through a multi-fidelity Gaussian process regression scheme, in which the buoys have the highest fidelity and the satellite-based predictions have lower fidelity. The techniques described provide a framework for building less expensive and more accurate conditionally Gaussian estimates of full 3D fields, and they can be applied to geophysical systems where data from both sensors and numerical simulations are available. We implement these techniques to estimate the full 3D temperature field of Massachusetts Bay and Cape Cod Bay, where temperature can serve as a useful indicator of ocean acidification. Finally, we discuss how the developed ideas can be leveraged to make more informed decisions about optimal in-situ sampling and path planning.
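A compact sketch of the reconstruction step described above, not the authors' code: vertical temperature profiles at one surface location are compressed to two PCA modes, and the mode coefficients plus their variance are predicted from the surface temperature. A plain least-squares fit with a residual-variance estimate stands in for the temporal convolutional network, the multi-fidelity buoy correction is omitted, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n_time, n_depth = 1000, 40

# Synthetic "reanalysis" profiles at one location: surface-driven thermocline plus noise.
sst = 15 + 5 * np.sin(2 * np.pi * np.arange(n_time) / 365) + rng.normal(scale=0.5, size=n_time)
z = np.linspace(0, 1, n_depth)
profiles = sst[:, None] * np.exp(-3 * z) + 4 * np.exp(-10 * z) \
           + 0.3 * rng.normal(size=(n_time, n_depth))

# 1) PCA of the vertical structure: two modes capture most of the variance.
mean_profile = profiles.mean(axis=0)
U, S, Vt = np.linalg.svd(profiles - mean_profile, full_matrices=False)
modes = Vt[:2]                      # (2, n_depth) vertical patterns
coeffs = U[:, :2] * S[:2]           # (n_time, 2) time coefficients

# 2) Conditionally Gaussian model: coefficient mean and variance given surface temperature.
X = np.column_stack([np.ones(n_time), sst])
beta, *_ = np.linalg.lstsq(X, coeffs, rcond=None)
resid_var = (coeffs - X @ beta).var(axis=0)

def reconstruct(surface_temp, n_samples=100):
    """Stochastically reconstruct full vertical profiles from a surface temperature."""
    x = np.array([1.0, surface_temp])
    mu = x @ beta                                        # (2,) mean coefficients
    samples = mu + np.sqrt(resid_var) * rng.normal(size=(n_samples, 2))
    return mean_profile + samples @ modes                # (n_samples, n_depth)

ensemble = reconstruct(18.0)
print(ensemble.mean(axis=0)[:5], ensemble.std(axis=0)[:5])
```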