Whyjay Zheng

and 8 more

Accurate assessments of glacier velocity are essential for understanding ice flow mechanics, monitoring natural hazards, and projecting future sea-level rise. However, the most commonly used method for deriving glacier velocity maps, known as feature tracking, relies on empirical parameter choices that rarely account for glacier physics or uncertainty. The GLAcier Feature Tracking testkit (GLAFT) assesses velocity maps using two statistically and physically based metrics. Velocity maps with metrics falling within our recommended ranges contain fewer erroneous measurements and less spatially correlated noise than velocity maps with metrics that deviate from those ranges. Consequently, these metric ranges are suitable for refining feature-tracking workflows and evaluating the resulting velocity products. GLAFT provides modularized workflows for calculating these metrics and the associated visualizations, facilitating velocity map assessment. To ensure the package is available, reusable, and redistributable to the maximum extent, GLAFT adopts several open science practices, including narrative documentation and demos built with Jupyter Book and cloud access through Ghub. By providing a benchmarking framework for evaluating the quality of glacier velocity maps, GLAFT enables the cryospheric sciences and natural hazards communities to leverage the rich glacier velocity data now available, whether sourced from public archives or generated through custom feature-tracking workflows.
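The physics-based side of such a metric rests on the fact that surface strain rates derivable from a velocity map must stay physically plausible. A minimal NumPy sketch of computing strain-rate components from velocity gradients is below; the function name and the toy velocity field are ours for illustration, not GLAFT's API:

```python
import numpy as np

def strain_rate_components(vx, vy, dx=1.0):
    """Surface strain-rate components from velocity gradients.
    Physically implausible strain rates flag noisy matches, which is
    the idea behind a physics-based velocity-map quality metric."""
    dudx = np.gradient(vx, dx, axis=1)
    dudy = np.gradient(vx, dx, axis=0)
    dvdx = np.gradient(vy, dx, axis=1)
    dvdy = np.gradient(vy, dx, axis=0)
    exx, eyy = dudx, dvdy               # normal components
    exy = 0.5 * (dudy + dvdx)           # shear component
    return exx, eyy, exy

# toy field: uniform along-flow stretching of 0.01 per unit distance
y, x = np.mgrid[0:10, 0:10].astype(float)
vx = 0.01 * x
vy = np.zeros_like(vx)
exx, eyy, exy = strain_rate_components(vx, vy, dx=1.0)
```

For a purely linear velocity field like this, the normal strain rate exx is exactly 0.01 everywhere and the shear component vanishes.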

J. Michelle Hu

and 2 more

Fine-scale, sub-annual satellite stereo observations of snow cover and snow depth can help improve quantification of snow water equivalent at critical times during the accumulation and ablation season. We are refining very-high-resolution (VHR) spaceborne optical stereo methods to generate spatially continuous digital surface models (DSMs) and maps of snow depth and snow water equivalent (SWE) over mountain sites in the Western U.S. In this work, we leverage NASA's open-source Ames Stereo Pipeline software for extensive and iterative testing of stereogrammetric processing parameters to produce snow-free and snow-covered DSMs. Using open-source tools, we customize and improve automated surface co-registration using snow-free DSMs generated from spaceborne stereogrammetry and airborne lidar. High-resolution land cover classification maps derived from the input stereo images using machine learning methods improve the co-registration results and snow depth product quality. We assess our stereo-derived DSM and snow depth mapping methods across multiple sites in Colorado using USGS 3D Elevation Program (3DEP) and Airborne Snow Observatory (ASO) airborne lidar DSMs and snow depth products. We present initial evaluations of our surface elevation reconstructions across variable terrain and land cover. Finally, we use a bulk density approach and empirical density models to convert snow depth maps into maps of snow water equivalent. We are developing a user-friendly notebook for the full workflow with default processing parameters tuned for mountain terrain. We hope that these tools will enable new users with limited photogrammetry experience to produce maps of snow depth and snow water equivalent from VHR satellite imagery.
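The core arithmetic of the workflow, differencing co-registered snow-on and snow-free DSMs and applying a bulk density, can be sketched in a few lines. This is a schematic illustration under the bulk-density assumption; the function name and density value are ours, not part of the described pipeline:

```python
import numpy as np

def snow_depth_and_swe(dsm_snow_on, dsm_snow_free, density_kg_m3=300.0):
    """Difference two co-registered DSMs to get snow depth (m),
    then convert to SWE via a bulk density:
    SWE = depth * rho_snow / rho_water."""
    depth = dsm_snow_on - dsm_snow_free
    depth = np.where(depth < 0, 0.0, depth)   # negative depths are noise
    swe = depth * density_kg_m3 / 1000.0      # meters of water equivalent
    return depth, swe

# toy 3x3 grids (elevations in meters)
snow_free = np.zeros((3, 3))
snow_on = snow_free + np.array([[1.0, 2.0, 0.5]] * 3)
depth, swe = snow_depth_and_swe(snow_on, snow_free, density_kg_m3=250.0)
```

With a 250 kg m-3 bulk density, a 1 m snow depth maps to 0.25 m of water equivalent; in practice the density would come from the empirical models mentioned above.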

George Brencher

and 2 more

Atmospheric errors in interferometric synthetic aperture radar (InSAR)-derived estimates of surface deformation often obscure real displacement signals, especially in mountainous regions. As climate change disproportionately impacts the mountain cryosphere, developing a technique for atmospheric correction that performs well in high-relief terrain is increasingly important. Here, we developed and implemented a statistical machine learning-based atmospheric correction that relies on the differing spatial and topographic characteristics of periglacial features and atmospheric noise. Our correction is applied at the native spatial and temporal resolution of the InSAR data (40 m, 12 days), does not require external atmospheric data, and can correct both stratified and turbulent atmospheric noise. Using Sentinel-1 data from 2015-2022, we trained a convolutional neural network (CNN) on atmospheric noise from 136 short-baseline interferograms and displacement signals from time-series inversion of 337 interferograms. The CNN correction was then tested on a densely connected network of 202 Sentinel-1 interferograms which were inverted to create a displacement time series. We used the Rocky Mountains in Colorado as our training, validation, and testing areas. When applied to our validation data, our correction offers a 690% improvement in performance over a global meteorological reanalysis-based correction and a 209% improvement over a high-pass filter correction. We found that our correction reveals previously hidden time-dependent kinematic behavior of three representative rock glaciers in our testing dataset. Our flexible, robust approach can be used to correct arbitrary InSAR data to analyze subtle surface deformation signals for a range of science and engineering applications.
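For context, the high-pass filter baseline against which the CNN is compared treats long-wavelength signal as atmosphere and keeps the short-wavelength residual as candidate deformation. A minimal sketch using SciPy (filter width and toy scene are our illustrative choices, not the study's configuration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def highpass_correct(unwrapped, sigma_px=40):
    """Baseline atmospheric correction: estimate the atmosphere as
    the low-pass (Gaussian-smoothed) component of the unwrapped
    interferogram and subtract it, keeping the high-pass residual
    as candidate surface deformation."""
    atmosphere_est = gaussian_filter(unwrapped, sigma=sigma_px)
    return unwrapped - atmosphere_est

# toy scene: broad atmospheric ramp plus a compact deformation patch
y, x = np.mgrid[0:200, 0:200]
ramp = 0.01 * x                        # long-wavelength "atmosphere"
signal = np.zeros((200, 200))
signal[95:105, 95:105] = 1.0           # 10x10 px deformation feature
corrected = highpass_correct(ramp + signal, sigma_px=40)
```

The compact feature survives the correction nearly intact while the ramp is largely removed, which also illustrates the baseline's weakness: any real deformation broader than the filter scale is removed along with the atmosphere, a limitation the learned correction is designed to avoid.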

Tushar Khurana

and 3 more

Shashank Bhushan

and 3 more

Image feature tracking with medium-resolution optical satellite imagery (e.g., Landsat-8) offers measurements of glacier surface velocity on a global scale. However, for slow-moving glaciers (<0.1 m/day), the larger pixel sizes (~15-30 m) and longer repeat intervals (minimum of 16 days, assuming no cloud cover) limit temporal sampling, often precluding analysis of sub-annual velocity variability. As a result, detailed records of short-term glacier velocity variations are limited to a subset of glaciers, often from dedicated SAR image tasking and/or field observations. To address these issues, we are leveraging large archives of very-high-resolution (~0.3-0.5 m) DigitalGlobe WorldView/GeoEye imagery with ~monthly repeat interval and high-resolution (~3-5 m) Planet PlanetScope imagery with ~daily-weekly repeat interval for the period from 2014 to 2019. We are using automated, open-source tools to develop corrections for sensor geometry and image geolocation, and integrating new, high-resolution DEMs for improved orthorectification, reducing the uncertainty of short-term (monthly to seasonal) velocity measurements. These temporally dense records will be integrated with other velocity products (e.g., NASA ITS_LIVE), which will allow us to study the evolution of glacier dynamics, and their relationships with local climatology, geomorphology, and hydrology on a regional scale. In this study, we present initial results for surface velocity mapping for glaciers in the Khumbu Himalaya, Nepal, and at Mt. Rainier, USA. We are using high-performance computing environments to scale this analysis to larger glacierized regions in High Mountain Asia and the continental U.S.
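The feature-tracking step underlying all of these velocity products amounts to locating a reference-image template in a later image by maximizing normalized cross-correlation. A minimal, brute-force NumPy sketch (real packages use FFT-based correlators, subpixel refinement, and filtering; names and window sizes here are illustrative):

```python
import numpy as np

def track_offset(ref, sec, cy, cx, half=8, search=5):
    """Find the integer pixel offset of a template centered at
    (cy, cx) between a reference and a secondary image by maximizing
    normalized cross-correlation over a small search window.
    Velocity = offset * pixel_size / time_separation."""
    tpl = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
    best, best_dyx = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = sec[cy + dy - half:cy + dy + half + 1,
                      cx + dx - half:cx + dx + half + 1]
            w = (win - win.mean()) / (win.std() + 1e-12)
            score = float((tpl * w).mean())   # normalized correlation
            if score > best:
                best, best_dyx = score, (dy, dx)
    return best_dyx

# synthetic test: shift a random scene by a known (2, 3) px offset
rng = np.random.default_rng(1)
ref = rng.normal(size=(64, 64))
sec = np.roll(ref, shift=(2, 3), axis=(0, 1))
dy, dx = track_offset(ref, sec, 32, 32)
```

Recovering the known synthetic shift is also the logic behind validating trackers with artificially offset image pairs.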

Friedrich Knuth

and 3 more

We present interannual to decadal glacier and geomorphic change measurements at multiple sites across Western North America from the 1950s to the present. Glacierized study sites differ in terms of glacial geometry and climatology, from continental mountains (e.g., Glacier National Park) to maritime stratovolcanoes (e.g., Mt. Rainier). Quantitative measurements of glacier and land surface change are obtained using the Historical Structure from Motion (HSfM) package. The automated HSfM processing pipeline can derive high-resolution (0.5-2.0 m) Digital Elevation Models (DEMs) and orthomosaics from historical aerial photography, without manual ground control point selection. All DEMs are co-registered to modern airborne lidar and commercial satellite stereo reference DEMs to accurately measure geodetic surface elevation change and uncertainty. We use scanned historical images from the USGS North American Glacier Aerial Photography (NAGAP) archive and other aerial photography campaigns from the USGS EROS Aerial Photo Single Frames archive. We examine the impact of regional climate forcing on glacier volume change and dynamics using downscaled climate reanalysis products. By augmenting the record of quantitative glacier change measurements and better understanding the relationship between climate forcing and heterogeneous glacier response patterns, we aim to improve our understanding of regional glacier mass change, as well as inform decisions impacting downstream water resources, ecosystem management, and geohazard risks.
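The co-registration step is what makes geodetic differencing trustworthy: any datum offset between a historical DEM and the modern reference would otherwise masquerade as glacier change. The simplest form, a vertical bias correction estimated over stable (ice-free) terrain, can be sketched as follows (full co-registration also solves for horizontal shifts; this function and the toy grids are our illustration, not the HSfM implementation):

```python
import numpy as np

def coregister_vertical(dem, ref_dem, stable_mask):
    """Remove the median elevation difference over stable terrain so
    glacier elevation change is measured against an unbiased datum."""
    dz_stable = (dem - ref_dem)[stable_mask]
    bias = float(np.median(dz_stable))
    return dem - bias, bias

ref = np.full((4, 4), 100.0)          # modern lidar reference DEM
hist = ref + 2.5                      # historical DEM, 2.5 m datum offset
hist[0, 0] = 90.0                     # glacier cell that actually thinned
stable = np.ones_like(ref, dtype=bool)
stable[0, 0] = False                  # exclude glacier from stable terrain
aligned, bias = coregister_vertical(hist, ref, stable)
dh = aligned - ref                    # geodetic elevation change
```

After removing the 2.5 m bias, the stable terrain differences collapse to zero and only the genuine thinning signal remains; the spread of the residuals over stable terrain then serves as the uncertainty estimate.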

Whyjay Zheng

and 4 more

Glacier velocity reflects the dynamics of ice flow, and its change over time plays a key role in predicting future sea-level rise. Glacier feature tracking (also known as offset tracking or pixel tracking) is one of the most widely used approaches for mapping glacier velocity using remote sensing data. However, running this workflow relies on multiple empirical parameter choices such as correlation kernel selection, image filter, and template size. As each target glacier area has different data availability, surface feature density, and ice flow width, there is no one-size-fits-all parameter set for glacier feature tracking. Finding an ideal parameter set for a given glacier requires quantitative and objective metrics to determine the quality of resulting velocity maps. The objective of our Glacier feature tracking test (gftt) project is both to devise a set of widely applicable metrics and to build a Python-based tool for calculating them. These metrics can thus be used for comparing the performance of different tracking parameters. We use Kaskawulsh Glacier, Canada, as a test case to compare the velocity mapping results using Landsat 8 and Sentinel-2 images, various software packages (including Auto-RIFT, CARST, GIV, and vmap), and a range of input parameters. To begin with, we calculate random error over stable terrain, a metric that has been used for evaluating the uncertainty of velocity products. We develop two other workflows for exploring new metrics and validating existing metrics, including a test with synthetic pixel offsets and a comparison with GNSS records. These existing and new metrics, calculated through the gftt software, will help determine optimal parameter sets for feature tracking of Kaskawulsh Glacier and any other glacier around the world. This work is supported by the NSF EarthCube Program under awards 1928406 and 1928374.
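The stable-terrain metric rests on a simple premise: over ice-free ground the true velocity is zero, so any apparent motion is measurement noise. A minimal sketch of that calculation (the function name and synthetic noise level are ours for illustration):

```python
import numpy as np

def stable_terrain_random_error(vx, vy, stable_mask):
    """Random error proxy for a velocity map: the dispersion of
    apparent velocity over ice-free terrain, where true motion is
    zero, combined across both components."""
    u, v = vx[stable_mask], vy[stable_mask]
    return float(np.hypot(np.std(u), np.std(v)))

# synthetic map: pure measurement noise (~0.05 m/day per component)
rng = np.random.default_rng(0)
vx = rng.normal(0.0, 0.05, (40, 40))
vy = rng.normal(0.0, 0.05, (40, 40))
mask = np.ones_like(vx, dtype=bool)   # treat the whole toy scene as bedrock
err = stable_terrain_random_error(vx, vy, mask)
```

Re-running the same tracking with different kernels or template sizes and comparing this number is the basic mode of parameter comparison described above.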

Michelle Hu

and 1 more

Satellite remote sensing often requires a compromise between spatial resolution and spatial coverage for timely and accurate measurements of earth-system processes. In recent years, however, the increased availability of submeter-scale imagery has dramatically altered this balance. Commercial satellite imagery from DigitalGlobe and Planet offers on-demand, very high-resolution panchromatic stereo and multispectral (MS) image collection over snow-covered landscapes, with individual image coverage of up to ~1900 km². Repeat stereo-derived digital elevation models can be used to accurately estimate snow depth. Integration of contemporaneous ~1-2 m land cover classification maps can provide precise snow-covered area (SCA) products and improved processing, analysis, and interpretation of these snow depth estimates. We are developing machine learning classification algorithms to identify snow, vegetation, water, and exposed rock using varying combinations of available bands (panchromatic, 4/8-band multispectral, SWIR) and band ratios (e.g., NDVI, NDSI) from these products. We present findings for NASA SnowEx campaign sites (Grand Mesa and Senator Beck Basin, CO) and other snow monitoring sites in the Western U.S. using WorldView-3, PlanetScope, and Landsat 8 imagery. Preliminary results show that a tuned random forest algorithm using WorldView-3 MS and SWIR bands yielded the most accurate estimates of SCA of all band combinations and imagery products. With the power to resolve individual trees, these products offer direct measurements of SCA, without the need to account for mixed pixels and fractional SCA as with lower-resolution products. This open-source workflow will be used to process longer time series and larger areas in a semi-automated fashion, allowing for rapid analysis, increased portability, and broader utility for the community.
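Of the band ratios mentioned, NDSI is the workhorse for snow mapping: snow is bright in the green band and dark in SWIR. A minimal sketch of an NDSI-based snow mask (the 0.4 threshold is a common starting point for Landsat-class sensors, not a value tuned in this study; the toy reflectances are ours):

```python
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index: high over snow because snow
    is bright in green and dark in shortwave infrared."""
    return (green - swir) / (green + swir + 1e-12)

def classify_snow(green, swir, threshold=0.4):
    """Threshold NDSI to produce a binary snow-covered-area mask."""
    return ndsi(green, swir) > threshold

# toy top-of-atmosphere reflectances: snow, rock, snow
green = np.array([0.8, 0.3, 0.6])
swir = np.array([0.1, 0.25, 0.05])
sca = classify_snow(green, swir)
```

A random forest generalizes this by letting the classifier learn decision boundaries over many bands and ratios at once rather than relying on a single hand-picked threshold.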

Scott Springer

and 5 more

Pine Island Glacier Ice Shelf (PIGIS) is melting rapidly from beneath due to the circulation of relatively warm water under the ice shelf, driven primarily by buoyancy of the meltwater plume. Basal melt rates predicted by ocean models with thermodynamically active ice shelves depend on the representation of environmental characteristics, including geometry (grounding line location, ice draft, and seabed bathymetry), ocean hydrographic conditions, and subgrid-scale parameterizations. We developed a relatively high-resolution (lateral grid spacing of 0.5 km, 24 terrain-following levels) model for the PIGIS vicinity based on the Regional Ocean Modeling System (ROMS). Initial stratification was specified with idealized profiles based on observed hydrographic data seaward of the ice front. Predicted basal melt rate distributions were compared with satellite-derived estimates, and stratification beneath PIGIS was compared with Autosub profiles. As in previous studies, we found that the melt rate was strongly dependent on the (specified) depth of the thermocline separating cold surface waters from deep, relatively warm waters, and on the presence of a submarine ridge under the ice shelf that impedes circulation of warm deep water into the back portion of the cavity. Melt rates were sensitive to the model's subgrid-scale parameterizations. The quadratic drag coefficient, which parameterizes roughness of the ice shelf base, had a substantial effect on the melt rate through its role in the three-equation formulation for ice-ocean buoyancy exchange. Turbulent tracer diffusion, which was parameterized by a constant value or various mixed layer models, played an important role in determining stratification in the cavity. Numerical diffusion became significant in some cases.
We conclude that flow of warm water into the inner portion of the PIGIS cavity near the deep grounding line is sensitive to poorly constrained mixing parameterizations, both at the ice base and as a mechanism for allowing inflowing ocean heat to cross the sub-ice-shelf sill. Improved understanding of mixing processes is required as the community moves towards fully coupled ocean/ice-sheet models with evolving ice thickness and grounding lines.
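The three-equation formulation referenced above balances heat, salt, and the pressure-dependent freezing point at the ice-ocean interface. A simplified numerical sketch is given below: it neglects heat conduction into the ice, and the exchange velocities, liquidus coefficients, and forcing values are illustrative textbook-style numbers, not the ROMS configuration used in this study:

```python
import numpy as np

C_W = 3974.0      # seawater heat capacity, J kg-1 K-1
L_F = 3.34e5      # latent heat of fusion, J kg-1
LAM1, LAM2, LAM3 = -0.0573, 0.0832, 7.61e-4   # liquidus coefficients

def three_equation_melt(T_w, S_w, z, gamma_T=1e-4, gamma_S=1e-4 / 35.0):
    """Solve the three-equation formulation for interface temperature
    T_b, interface salinity S_b, and melt rate m (m s-1), given ocean
    temperature T_w (degC), salinity S_w (psu), and ice draft z (m,
    negative below sea level). Heat conduction into the ice is
    neglected for brevity."""
    k = C_W * gamma_T / L_F
    # Substituting m = k*(T_w - T_b) and the liquidus
    # T_b = LAM1*S_b + LAM2 + LAM3*z into the salt balance
    # gamma_S*(S_w - S_b) = m*S_b yields a quadratic in S_b:
    a = k * LAM1
    b = -(k * (T_w - LAM2 - LAM3 * z) + gamma_S)
    c = gamma_S * S_w
    roots = np.roots([a, b, c])
    S_b = float(min(r for r in roots.real if 0 < r < 2 * S_w))
    T_b = LAM1 * S_b + LAM2 + LAM3 * z
    m = k * (T_w - T_b)
    return T_b, S_b, m

# warm deep water reaching an ice base at 800 m depth
T_b, S_b, m = three_equation_melt(T_w=1.0, S_w=34.5, z=-800.0)
melt_m_per_yr = m * 3.15e7
```

The sketch makes the sensitivity discussed above concrete: gamma_T and gamma_S enter the melt rate directly, and in the full model they in turn depend on the friction velocity set by the quadratic drag coefficient, so poorly constrained drag and mixing choices propagate straight into the predicted melt.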