Siyoon Kwon

and 4 more

Remote sensing has been widely applied to investigate fluvial processes, but depth retrievals face significant constraints in deep and turbid conditions. This study evaluates the potential for depth retrievals under such challenging conditions using NASA’s Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) imagery. We employ interpretable machine learning to construct a hyperspectral regressor for water depth and explore the spectral characteristics of deep and turbid waters in the Wax Lake Delta (WLD), LA. Because of turbidity, the reflectance spectra of the WLD show only minor sensitivity to differences in depth. Nevertheless, a Random Forest with Recursive Feature Elimination (RF-RFE) effectively generalizes across high- and low-turbidity cases in a single model, achieving an R² of 0.94 ± 0.005. Moreover, this model attains a maximum detectable depth of approximately 30 m, outperforming other methods. A spectral analysis using Shapley additive explanations (SHAP) underscores the importance of learning from diverse spectral bands and of the non-linear relationships between depth and reflectance. Specifically, the short-blue and near-infrared (NIR) bands, which have high attenuation coefficients, play a crucial role. This finding identifies attenuation as the key process for deep-depth retrievals. The depth maps of the WLD produced by this model clearly delineate the spatial distribution of deep river channels and shallow delta regions. However, the strong dependence on the short-blue and NIR bands produces discontinuous areas, owing to the noise sensitivity of these bands. This result highlights a drawback of remote sensing with empirical models. Future research will focus on correcting such discontinuities by integrating data from multiple remote sensing sources.
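A minimal sketch of an RF-RFE depth regressor of the kind described above, using scikit-learn. The synthetic reflectance spectra, band count, and hyperparameters are illustrative assumptions, not the study's configuration.

```python
# Hypothetical sketch of a Random Forest regressor with Recursive Feature
# Elimination (RF-RFE) for hyperspectral depth retrieval. Data and settings
# are stand-ins, not the study's configuration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_pixels, n_bands = 1000, 100          # e.g., AVIRIS-NG-like band count (assumed)
X = rng.random((n_pixels, n_bands))    # per-pixel reflectance spectra (synthetic)
depth = 30 * X[:, 5] * (1 - X[:, 80]) + rng.normal(0, 0.5, n_pixels)  # synthetic depth (m)

X_tr, X_te, y_tr, y_te = train_test_split(X, depth, random_state=0)

# RFE repeatedly drops the least important bands until a target subset remains.
rf = RandomForestRegressor(n_estimators=100, random_state=0)
selector = RFE(rf, n_features_to_select=20, step=5).fit(X_tr, y_tr)

print("selected bands:", np.flatnonzero(selector.support_))
print("R^2:", r2_score(y_te, selector.predict(X_te)))
```

Band-level SHAP values could then be computed on the fitted forest (e.g., with shap.TreeExplainer) to attribute predictions to individual wavelengths, analogous to the spectral analysis above.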

Matthew Preisser

and 3 more

Increased interest in combining compound flood hazards and social vulnerability has driven recent advances in flood impact mapping. However, current methods to estimate event-specific compound flooding at the household level require high-performance computing resources that are frequently not available to local stakeholders. Government and non-government agencies currently lack methods to repeatedly and rapidly create flood impact maps that incorporate local variability in both hazards and social vulnerability. We address this gap by developing a methodology to estimate a flood impact index at the household level in near-real time, utilizing high-resolution elevation data to approximate event-specific inundation from both pluvial and fluvial sources in conjunction with a social vulnerability index. Our analysis uses the 2015 Memorial Day flood in Austin, Texas, as a case study and proof of concept for our methodology. We show that 37% of the Census Block Groups in the study area experience flooding from pluvial sources only and are not identified in local or national flood hazard maps as being at risk. Furthermore, averaging hazard estimates to cartographic boundaries masks household variability: 60% of the Census Block Groups in the study area have a coefficient of variation around the mean flood depth exceeding 50%. Comparing our pluvial flooding estimates to a 2D physics-based model, we classify household impact accurately for 92% of households. Our methodology can be used as a tool to create household-level compound flood impact maps that provide computationally efficient information to local stakeholders.
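A toy illustration of the coefficient-of-variation statistic reported above: averaging household flood depths to a Census Block Group can hide large within-group differences. All depths and group labels below are made up.

```python
# Coefficient of variation (CV = std / mean) of household flood depths,
# grouped by Census Block Group. Values are fabricated for illustration.
import pandas as pd

households = pd.DataFrame({
    "block_group": ["A"] * 4 + ["B"] * 4,
    "flood_depth_m": [0.0, 0.1, 1.8, 2.2, 0.9, 1.0, 1.1, 1.0],
})

stats = households.groupby("block_group")["flood_depth_m"].agg(["mean", "std"])
stats["cv_percent"] = 100 * stats["std"] / stats["mean"]
print(stats)  # group A's mean hides both dry and deeply flooded homes (high CV)
```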

Nelson Tull

and 6 more

Hydrologic connectivity controls the lateral exchange of water, solids, and solutes between rivers and floodplains, and is critical to ecosystem function, water treatment, flood attenuation, and geomorphic processes. This connectivity has been well studied, typically through the lens of fluvial flooding. In regions prone to heavy rainfall, the timing and magnitude of lateral exchange may be altered by pluvial flooding on the floodplain. We collected measurements of flow depth and velocity in the Trinity River floodplain in coastal Texas (USA) during Tropical Storm Imelda (2019), which produced up to 75 cm of rainfall locally. We developed a high-resolution two-dimensional hydrodynamic model for a section of the Trinity River floodplain, motivated by the compound flooding of Imelda. We then employed Lagrangian particle routing to quantify how residence times and particle velocities changed as flooding shifted from rainfall-driven to river-driven. Our results show that heavy rainfall initiated lateral exchange before river discharge reached flood levels. The presence of rainwater also reduced floodplain storage, confining river water to a narrow corridor on the floodplain, while rainwater residence times increased under the influence of high river flow. Finally, we analyzed the role of floodplain channels in facilitating hydrologic connectivity by varying model resolution in the floodplain. While the resolution of floodplain channels was important locally, it had comparatively little effect on overall floodplain behavior. This study demonstrates the complexity of floodplain hydrodynamics under conditions of heavy rainfall, with implications for sediment deposition and nutrient removal during floods.
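A minimal sketch of Lagrangian particle routing on a gridded 2D velocity field, of the kind a hydrodynamic model would output: particles are advected with forward-Euler steps, and residence time is the time to exit the domain. The uniform synthetic velocities, grid size, and time step are assumptions for illustration only.

```python
# Lagrangian particle routing sketch: advect particles through a velocity
# field and record how long each stays in the domain. Field is synthetic.
import numpy as np

nx, ny, dx = 100, 100, 10.0            # grid cells and spacing (m), assumed
u = np.full((ny, nx), 0.2)             # eastward velocity (m/s), synthetic
v = np.full((ny, nx), 0.05)            # northward velocity (m/s), synthetic

def residence_time(x, y, dt=60.0, t_max=7 * 86400.0):
    """Advect one particle until it leaves the domain; return elapsed seconds."""
    t = 0.0
    while t < t_max:
        i, j = int(y / dx), int(x / dx)
        if not (0 <= i < ny and 0 <= j < nx):
            return t                   # particle has exited the floodplain
        x += u[i, j] * dt              # forward-Euler position update
        y += v[i, j] * dt
        t += dt
    return t_max                       # still inside when the clock runs out

times = [residence_time(x0, 500.0) for x0 in np.linspace(0, 900, 10)]
print([round(s / 3600, 1) for s in times])  # residence times in hours
```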

Matthew Preisser

and 2 more

The frequency of major flooding events continues to increase, fueling the already growing concern in numerous fields about quantifying the inequitable distribution of flood hazards. Our previous work overlaying high-resolution flood exposure data with social vulnerability information has already begun to highlight how different communities experience varying levels of risk. However, this fails to capture the complex ways in which flooding affects interconnected infrastructure and service networks, which in turn shape individual and community risk. Our goal is to quantitatively define an individual’s vulnerability to flooding, encompassing how both pluvial and fluvial inundation impact an individual’s place of residence and disrupt their access to critical resources, including food, gas, healthcare, and emergency services, while still considering an individual’s socioeconomic standing. With the goal of estimating household-level disruption of access to critical resources in near real time, our approach relies on a multilayer network of social vulnerability, transportation infrastructure, essential resources, and emergency services. To estimate inundation in near real time, we utilize the Height Above Nearest Drainage (HAND) method and a topographic depression hierarchy algorithm to estimate fluvial and pluvial flooding. Using a minimum cost flow algorithm, we determine an individual’s relative cost to access resources before, during, and after a major flooding event. Combining technical and social information leads to the identification of communities that are more vulnerable to the physical, economic, and social components of floods. This model will be useful in future descriptive and prescriptive analytical frameworks by identifying critical nodes across networks and providing actionable knowledge on at-risk communities. Our model will inform agencies involved in flood management, urban planning, and emergency response on where they can best apply resources to increase the resiliency of communities and the infrastructure they rely on.
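A hypothetical sketch of the minimum-cost-flow step using networkx: households (supplies) must reach a resource node (demand) over a road network whose edge costs rise when segments flood. The node names, costs, and capacities are illustrative assumptions, not the study's network.

```python
# Minimum cost flow over a toy household-to-resource network. Edge weights
# stand in for flood-inflated travel costs; all values are made up.
import networkx as nx

G = nx.DiGraph()
G.add_node("house_1", demand=-1)       # each household supplies one unit
G.add_node("house_2", demand=-1)
G.add_node("clinic", demand=2)         # the resource node absorbs both units

G.add_edge("house_1", "junction", weight=2, capacity=1)
G.add_edge("house_2", "junction", weight=8, capacity=1)  # partially flooded route
G.add_edge("junction", "clinic", weight=3, capacity=2)

flow = nx.min_cost_flow(G)
print(flow)                                          # per-edge routing
print("total access cost:", nx.cost_of_flow(G, flow))
```

Recomputing the flow with pre-, during-, and post-event edge weights gives the relative access cost over the course of the flood.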

Chris J Keylock

and 3 more

A long-standing question in geomorphology concerns the applicability of statistical models for elevation data based on fractal or multifractal representations of terrain. One difficulty in addressing this question has been the challenge of ascribing statistical significance to the metrics adopted to measure landscape properties. In this paper, we use a recently developed surrogate data algorithm to generate synthetic surfaces with the same elevation values as the source dataset, while also preserving the value of the Hölder exponent at every point (the underpinning characteristic of a multifractal surface). Our primary data are from an experimental study of landscape evolution. This allows us to examine how the statistical properties of the surfaces evolve through time and the extent to which they depart from simple (multi)fractal formalisms. We also study elevation data from Florida and Washington State. We show that the properties of the experimental and actual terrains depart from the simple statistical models. Of particular note, the number of sub-basins of a given channel order (for orders sufficiently small relative to the basin order) exhibits a clear increase in complexity after a flux steady state is established in the experimental study. The actual number of basins is much lower than occurs in the surrogates. The imprint of diffusive processes on elevation statistics means that, at the very least, a stochastic model for terrain based on a local formalism needs to consider the joint behavior of the elevations and their scaling (as measured by the pointwise Hölder exponents).
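For intuition, one common way to estimate a pointwise Hölder exponent on a 1D elevation profile is from how the local oscillation (max minus min within a window of radius r) scales with r. This is a generic oscillation-based estimator, not necessarily the one used in the paper; the synthetic profile and window radii are assumptions.

```python
# Pointwise Hölder exponent via oscillation scaling: osc_r ~ r**h, so h is
# the slope of log(osc_r) vs log(r). Profile is synthetic Brownian motion.
import numpy as np

rng = np.random.default_rng(1)
z = np.cumsum(rng.normal(size=4096))   # Brownian-like profile, true h ~ 0.5

def holder_exponent(z, i, radii=(2, 4, 8, 16, 32)):
    """Estimate h at index i from the scaling of local oscillations."""
    osc = [z[i - r:i + r + 1].max() - z[i - r:i + r + 1].min() for r in radii]
    slope, _ = np.polyfit(np.log(radii), np.log(osc), 1)
    return slope

print(round(holder_exponent(z, 2048), 2))  # close to 0.5 for Brownian motion
```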

Kyle Wright

and 4 more

In response to a growing number of natural and anthropogenic threats, the long-term sustainability of coastal river deltas and wetlands has come into question worldwide. Tools such as remote sensing and numerical modeling have been implemented in an effort to monitor and predict the hydro-geomorphological evolution of our coasts. Hydrological connectivity is known to play an important role in deltaic evolution by delivering flow, sediment, and nutrients to the interior of deltaic islands and wetlands. However, estimating connectivity typically requires detailed field work or numerical modeling, which is difficult to implement over broad spatial and temporal scales. In the present work, we investigate the potential of remote sensing to estimate hydrological connectivity in the Wax Lake Delta (WLD) and Atchafalaya Delta region of the Louisiana coast. During a three-hour window, five difference maps of water level in the WLD and surrounding wetlands were collected using UAVSAR L-band radar in repeat-pass interferometric mode. We then modeled the WLD subsection of the domain using a 2D shallow-water hydrodynamic model configured to run with the same discharge, tide, and wind conditions recorded at nearby monitoring stations during the observational window, with vegetation parameterized as a source of additional drag in the deltaic islands. Modeling allowed us to determine the relative influence of tides, vegetation, and wind on WLD water levels, which could then be extrapolated to infer behavior throughout the rest of the domain. Over the observational window, UAVSAR measured a cumulative loss of over 22 megatons of water from non-channelized wetlands as tides fell. We find that the model tends to under-predict both the observed water-level drawdown and the degree of hydrological activity in proximal islands seen in the UAVSAR data. Models that neglect the influence of wind underestimate the volume of water leaving the islands by up to two-thirds, underscoring the importance of wind to deltaic hydrodynamics during the observational window. With the information gained from the numerical modeling, as well as the computation of information theory statistics, we extend the WLD results to analyze and quantify the water-level behavior in the surrounding wetlands and the Atchafalaya Delta.
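A back-of-the-envelope sketch of how a water-level difference map converts to a mass change, as in the 22-megaton figure above. The pixel size, water-level changes, and wetland mask below are all synthetic stand-ins, not UAVSAR values.

```python
# Convert a per-pixel water-level change map (m) into a water mass change.
# 1 megaton = 1e9 kg. All inputs are fabricated for illustration.
import numpy as np

rng = np.random.default_rng(2)
dh = rng.normal(-0.05, 0.02, (500, 500))   # water-level change per pixel (m)
wetland = rng.random((500, 500)) < 0.6     # mask of non-channelized wetlands
pixel_area = 6.0 * 6.0                     # m^2 per pixel (assumed posting)
rho = 1000.0                               # water density, kg/m^3

mass_kg = rho * pixel_area * dh[wetland].sum()
print(f"net change: {mass_kg / 1e9:.2f} megatons")
```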

Christopher Keylock

and 3 more

Understanding the complex interplay between erosional and depositional processes, and their relative roles in shaping landscape morphology, is a question at the heart of geomorphology. A unified framework for examining this question can be developed by simultaneously considering terrain elevation statistics over multiple scales. We show how a long-standing tool for landscape analysis, the elevation-area relation or hypsometry, can be complemented by an analysis of the elevation scalings to produce a more sensitive tool for studying the interplay between processes and their impact on morphology. We then use this method, together with well-known geomorphic techniques (slope-area scaling relations, and the number and size of basins as a function of channel order), to demonstrate how the complexity of an experimental landscape evolves through time. Our primary result is that complexity increases once a flux equilibrium is established, as a consequence of diffusive processes acting at intermediate elevations. We gauge landscape complexity by comparing results between the experimental landscape surfaces and those produced by a new algorithm that fixes in place the elevation-scaling statistics but randomizes the elevations with respect to these scalings. We constrain the degree of randomization systematically and use the amount of constraint as a measure of complexity. The starting point for the method is illustrated in the figure, which shows the original landscape (top-left) and three synthetic variants generated with no constraints on the randomization. The value quoted in these panels is the root-mean-squared difference in elevation values for the synthetic cases relative to the original terrain; this value is greatest where the original ridge becomes a valley. All these landscapes contain the same elevation values (i.e., the same probability distribution functions) and the same elevation scalings at a point. The differences emerge because the elevations themselves are distributed randomly across the surface.
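A minimal sketch of the hypsometric (elevation-area) curve mentioned above for a gridded DEM: the fraction of basin area lying above each relative elevation, summarized by the hypsometric integral. The random surface is a placeholder for real terrain.

```python
# Hypsometric curve and integral for a synthetic DEM. The curve relates
# relative height h/H to the fraction of area above that height.
import numpy as np

rng = np.random.default_rng(3)
dem = rng.random((200, 200)).cumsum(axis=0)        # synthetic elevations

z = np.sort(dem.ravel())
rel_height = (z - z[0]) / (z[-1] - z[0])           # h/H in [0, 1], ascending
rel_area = 1.0 - np.arange(z.size) / z.size        # fraction of area above h

# hypsometric integral: area under the curve, via the trapezoid rule
hi = np.sum(0.5 * (rel_area[1:] + rel_area[:-1]) * np.diff(rel_height))
print(round(hi, 3))
```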