Warm and dry föhn winds on the lee side of the Antarctic Peninsula (AP) mountain range cause surface melt that can destabilize vulnerable ice shelves. Topographic funneling of these winds through mountain passes and canyons leads to localized wind-induced melt that is difficult to identify without direct measurements. Our Föhn Detection Algorithm (FonDA) identifies the surface föhn signature using data from twelve Automatic Weather Stations on the AP and uses machine learning to detect föhn in 5 km Regional Atmospheric Climate Model 2 (RACMO2.3p2) output and ERA5 reanalysis data. We estimate and compare the climatology and impact of föhn winds on the AP surface energy budget, surface melt pattern, and melt quantity from 1979 to 2018. We show that föhn-induced melt is strongest at the eastern base of the AP and on the northern portion of the Larsen C ice shelf. We identify previously unknown wind-induced melt, possibly katabatic in nature, on the Wilkins and George VI ice shelves. Neither the RACMO2 nor the ERA5 dataset shows a significant increase in föhn melt thus far, despite a more positive Southern Annular Mode and increasing surface temperatures. The warming climate and associated southward shift of westerly winds over the AP suggest a likely increase in wind-induced melt that can densify firn, form melt ponds, and weaken ice shelf stability; however, that trend has remained insignificant over the past 40 years.
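As an illustration of the machine-learning detection step, here is a minimal sketch (not the authors’ actual FonDA pipeline): a classifier is trained on station records labelled föhn/no-föhn and then applied to model or reanalysis grid-point series. The data, features, and labelling rule below are synthetic assumptions.

```python
# Minimal, illustrative sketch (synthetic data, assumed features).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# assumed surface predictors: temperature (degC), relative humidity (%), wind speed (m/s)
X = np.column_stack([rng.normal(-5, 5, n), rng.uniform(20, 100, n), rng.gamma(2, 3, n)])
# toy labelling rule standing in for AWS-derived föhn labels: warm, dry, and windy
y = (X[:, 0] > -2) & (X[:, 1] < 50) & (X[:, 2] > 6)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
# clf.predict(...) would then flag föhn-like conditions in model/reanalysis series
```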
Finding quality solar data can be difficult and cumbersome, especially for students and early career researchers. The tendency to serve static files in obscure formats from hidden directories on seemingly random servers certainly doesn’t help, to say nothing of the datasets accessible only via a researcher’s hard drive. The LASP Interactive Solar IRradiance Datacenter (LISIRD), lasp.colorado.edu/lisird, seeks to alleviate many of these pains. LISIRD is a website where students and researchers can discover, visualize, and download solar data from a variety of space missions, instruments, models, and laboratories. LISIRD focuses on making heliophysics research as effortless as possible by making solar data openly available and easy to analyze through an intuitive user interface, detailed metadata, interactive plotting capabilities, and a catalog of over 75 datasets. This poster will demonstrate the key features of LISIRD, provide details on the datasets it serves, outline future plans for improvement and growth, and discuss how it can be used as a valuable resource in space physics curricula.
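LISIRD data can also be fetched programmatically; below is a minimal sketch assuming a LaTiS-style CSV endpoint and an illustrative dataset identifier (consult the site for actual dataset names and access patterns).

```python
# Illustrative programmatic access; endpoint pattern and dataset id are assumptions.
import pandas as pd

BASE = "https://lasp.colorado.edu/lisird/latis/dap"  # assumed LaTiS CSV endpoint
dataset = "historical_tsi"                           # illustrative dataset identifier
df = pd.read_csv(f"{BASE}/{dataset}.csv")
print(df.head())
```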
Sensitivity analysis in Earth and environmental systems modelling typically comes at an onerous computational cost. Compounding this issue, existing algorithms rely on ad hoc designs of experiments, which hampers making the most of existing datasets. We tackle this problem by introducing a method for sensitivity analysis, based on the theory of variogram analysis of response surfaces (VARS), that works on any sample of input-output data or pre-computed model evaluations. Called data-driven VARS (D-VARS), this method characterizes the strength of the relationship between inputs and outputs by investigating their covariograms. We also propose a method to assess the ‘robustness’ of the results against sampling variability and the imperfectness of numerical methods. Using two hydrologic modelling case studies, we show that D-VARS is highly efficient and statistically robust, even when the sample size is small. Therefore, D-VARS can provide unique opportunities to investigate geophysical systems whose models are computationally expensive or whose available data are scarce.
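The core idea can be illustrated with a toy empirical variogram computed directly from an input-output sample (a simplified sketch, not the full D-VARS algorithm).

```python
# Toy empirical variogram per input factor (simplified sketch, not full D-VARS).
import numpy as np

def empirical_variogram(x, y, n_bins=10):
    """gamma(h) = 0.5 * E[(y_i - y_j)^2] over pairs with |x_i - x_j| ~ h."""
    i, j = np.triu_indices(len(x), k=1)
    h = np.abs(x[i] - x[j])
    sq = 0.5 * (y[i] - y[j]) ** 2
    edges = np.linspace(0.0, h.max() + 1e-12, n_bins + 1)
    gamma = np.array([sq[(h >= lo) & (h < hi)].mean()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return 0.5 * (edges[:-1] + edges[1:]), gamma

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(size=(2, 200))                  # a given sample of two input factors
y = 3.0 * x1 + 0.3 * x2 + rng.normal(0, 0.05, 200)   # toy response: x1 dominates
for name, x in [("x1", x1), ("x2", x2)]:
    _, g = empirical_variogram(x, y)
    # a variogram that grows strongly with lag signals an influential factor
    print(name, "gamma at small/large lags:", g[0].round(3), g[-1].round(3))
```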
An NSF EarthCube-funded project supported a field-based workshop designed to evaluate and refine the sedimentology/stratigraphy portion of the StraboSpot digital data management system. Eleven academics attended the workshop, representing a spectrum of career levels and specialties. The participants teach classes in sedimentology and conduct sedimentary research, but had not previously used digital mobile apps in the field. The field component focused on learning the basic functionality of the StraboSpot app as a method of collecting digital data in the field. On the first day, teams of 2-3 participants measured a stratigraphic section in a highly visited locality of the well-studied Book Cliffs of central Utah. Teams saw how the vocabulary and spot functionality worked to collect sedimentary field data and to generate stratigraphic columns. The second day was spent measuring a more complex mixed carbonate-clastic sequence in the San Rafael Swell (Utah). Half of the third day was spent discussing major issues with workflow and vocabulary, gathering feedback on how to simplify and streamline the descriptive data-collection functions (stratal attributes), and reviewing the more challenging interpretation functions (processes, depositional environments, and architecture). A major discussion point was how best to handle data collection and stratigraphic plotting of ‘interbedded’ intervals. As a result of the workshop, we streamlined workflow options and refined portions of the vocabulary. This field testing followed up on two previous workshops that solicited expert advice to develop the program categories and basic vocabulary for the sedimentary community. Overall, workshop participants were enthusiastic about the potential of digital data systems and the ability to link annotated photographs and sketches to georeferenced localities. All participants indicated they were inclined to use StraboSpot in both teaching and research, particularly given its versatile and customizable options.
Ground-based digital cameras are frequently used for monitoring atmospheric conditions, and several methods have been developed to use their images for synoptic observation, cloud assessment, short-term forecasting, and related tasks. However, these methods do not account for some important restrictions, especially when a linear camera is used to observe logarithmic ranges of atmospheric luminance. Cameras compress the scene onto a linear scale, distorting pattern distributions through pixel value saturation (PVS) and drifts from the original hues. To overcome these problems, the literature commonly adopts simplifying practices, but those practices result in loss of data, misinterpretation of valid pixels, and restrictions on the use of computer vision algorithms. The present work begins by illustrating these problems through supervised learning, for two reasons: all observation systems seek to automate human synoptic observation, and they must rest on a sound mathematical modeling of the observed patterns. A new modeling paradigm is proposed to map sky patterns onto the physical atmospheric phenomena not considered in the literature. We validated the proposed method and compared its results against two well-established methods using 1630 images. A hypothesis test showed that the results are compatible with the currently used binary approach, with added advantages; the differences were due to PVS and other restrictions not handled by existing methods. Finally, we conclude that the new paradigm yields more meaningful interpretations of sky patterns, allows extended daylight observation periods, and operates in a higher-dimensional space.
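As a simple illustration of the PVS problem (assuming an 8-bit sensor; this is not the paper’s method), clipped pixels can be flagged before any classification.

```python
# Flag pixel value saturation (PVS) in an assumed 8-bit RGB sky image.
import numpy as np

def saturation_mask(img_u8, limit=255):
    """True wherever any channel is clipped at the sensor limit; shape (H, W, 3)."""
    return (img_u8 >= limit).any(axis=-1)

img = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)  # stand-in image
mask = saturation_mask(img)
print(f"{mask.mean():.1%} of pixels saturated")  # such pixels distort pattern statistics
```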
Understanding global soil moisture (SM) dynamics and their governing controls beyond the Darcy scale is critical for various hydrologic, meteorological, agricultural, and environmental applications. In this study, we parameterize the pathways of seasonal drydowns using global surface soil moisture (θ_RS) observations from the SMAP satellite (2015-2019) at 36 km × 36 km resolution. We develop a new data-driven, non-parametric approach to identify the canonical shapes of θ_RS drydowns, followed by a non-linear least-squares parameterization of the seasonal drydown pathways at each SMAP footprint. The derived parameters provide effective soil water retention parameters (SWRPeff), land-atmosphere coupling strength, and soil hydrologic regimes for each SMAP footprint. Depending on footprint heterogeneity, climate, and season, characteristic curves comprising different drydown phases emerge at SMAP footprints. Drydown curves respond to within-footprint changes in meteorological drivers, land-surface characteristics, and soil-vegetation-atmosphere dynamics. Drydown parameters display high inter-seasonal variability, especially in grassland, cropland, and savannah landscapes, owing to significant changes in landscape characteristics and moisture patterns at the subgrid scale. Soil texture influences the characteristic soil water retention and drydown parameters only when the footprint-mean θ_RS is low, specifically in arid and sparsely vegetated regions. The influence of soil texture on the inter-seasonal variability of SWRPeff is low compared with land use and climate at the RS-footprint scale. This global characterization of SM drydown features at SMAP footprints is a significant step toward scale-specific, effective soil hydrologic parameterization for various applications.
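As an illustration, a single drydown can be parameterized with non-linear least squares; the sketch below assumes a commonly used exponential decay form, not necessarily the study’s exact canonical shapes.

```python
# Fit one drydown event with an assumed exponential decay toward a dry limit.
import numpy as np
from scipy.optimize import curve_fit

def drydown(t, theta_0, theta_f, tau):
    """theta(t) = theta_f + (theta_0 - theta_f) * exp(-t / tau)."""
    return theta_f + (theta_0 - theta_f) * np.exp(-t / tau)

t = np.arange(0, 15.0)                                   # days since a rainfall event
theta_obs = drydown(t, 0.35, 0.12, 4.0) \
    + np.random.default_rng(1).normal(0, 0.01, t.size)   # synthetic SMAP-like series

(theta_0, theta_f, tau), _ = curve_fit(drydown, t, theta_obs, p0=(0.3, 0.1, 3.0))
print(f"theta_0={theta_0:.3f}, theta_f={theta_f:.3f}, tau={tau:.2f} days")
```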
MagIC (earthref.org/MagIC) is an organization dedicated to improving research capacity in the Earth and ocean sciences by maintaining an open community digital data archive for rock magnetic and paleomagnetic data, with portals that allow scientists and others to access the archive and to search, visualize, download, and combine versioned datasets. To further this goal, a recent focus of MagIC has been making our data more accessible, discoverable, and interoperable. In collaboration with the GeoCodes/P418 group, we have continued to add more schema.org metadata fields to our data sets, which allows more detailed and deeper automated searches. We are involved with the Earth Science Information Partners (ESIP) schema.org cluster, which is working on extending the schema.org schema to the sciences; MagIC has been focusing on geoscience issues such as standards for describing deep time. We are also collaborating with the European Plate Observing System (EPOS) Thematic Core Service Multi-scale Laboratories (TCS MSL). MagIC is sending its contributions’ metadata to TCS MSL via DataCite records for representation in the EPOS system. This collaboration should allow European scientists to use MagIC as an official repository for European rock magnetic and paleomagnetic data and help prevent the fragmentation of global paleomagnetic and rock magnetic data across many separate repositories. By having our data well described by an EarthCube-supported standard (schema.org/JSON-LD), we will be able to more easily share data with other EarthCube projects in the future.
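An illustrative schema.org/Dataset record of the kind described, with entirely made-up field values, might look like the following (expressed here as a Python dict serialized to JSON-LD).

```python
# Illustrative schema.org/Dataset record (all values made up).
import json

record = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Example paleomagnetic contribution",
    "url": "https://earthref.org/MagIC/12345",        # hypothetical contribution URL
    "identifier": "https://doi.org/10.xxxx/example",  # placeholder DOI
    "keywords": ["paleomagnetism", "rock magnetism"],
    "temporalCoverage": "2015-01-01/2018-12-31",
    "license": "https://creativecommons.org/licenses/by/4.0/",
}
print(json.dumps(record, indent=2))
```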
Storm-time geomagnetic disturbances induce significant geoelectric fields within the Earth that can adversely affect the operation of electric power grids. The recently completed magnetotelluric survey supported by the NSF EarthScope program (2006-2018) has produced a large public archive of impedance tensors across much of the continental United States (US). In this work, the EarthScope tensors are convolved with long time series of geomagnetic field variation recorded at USGS observatories to obtain estimated time series of historical geoelectric fields. Integrating these geoelectric fields along power transmission lines yields time series of geomagnetically induced voltages on each power line. These voltages are analyzed statistically to construct hazard maps of the maximum voltages that could be realized in transmission lines across the US for an extreme, once-in-one-hundred-year geomagnetic storm. In combination with grounding resistance data and network topology, these voltage estimates can be utilized by power companies to estimate extreme geomagnetically induced currents within their networks. The estimates indicate which power lines and substations are most vulnerable to geomagnetic storms and can guide power companies in assessing where to install additional protections within their grids.
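The convolution step can be sketched in the frequency domain; the toy example below uses a synthetic, frequency-independent 2×2 impedance tensor and a hypothetical 100 km line (real MT tensors vary with frequency).

```python
# Toy frequency-domain sketch of E(w) = Z(w) B(w) with a synthetic, constant
# 2x2 impedance tensor (real MT tensors are frequency dependent).
import numpy as np

fs = 1.0 / 60.0                       # 1-minute observatory sampling (Hz)
n = 4096
t = np.arange(n) / fs
B = np.vstack([np.sin(2 * np.pi * t / 3600.0),   # synthetic Bx, By variations (nT)
               np.cos(2 * np.pi * t / 5400.0)])

Z = np.array([[0.1, 1.0],             # toy impedance (mV/km per nT)
              [-1.0, 0.1]])

Ew = Z @ np.fft.rfft(B, axis=1)       # (Ex, Ey) spectra from (Bx, By) spectra
E = np.fft.irfft(Ew, n=n, axis=1)     # geoelectric field time series (mV/km)

# Voltage on a straight line: V = E . L, with L the line vector in km
L = np.array([100.0, 0.0])            # hypothetical 100 km east-west line
V = (E.T @ L) / 1000.0                # volts
print("peak |V| =", round(float(np.abs(V).max()), 2), "V")
```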
Empirical Mode Decomposition (EMD) is used to examine the relationship between precipitation and surface temperature in six regions. Three regions are defined by physiography: world, ocean, and land. The other three are defined by averaged precipitation: dry, normal, and wet. Monthly averaged daily precipitation rates from the Global Precipitation Climatology Project are compared with monthly average surface air temperature anomalies from the Goddard Institute for Space Studies using EMD. The EMD process produces component time series referred to as intrinsic mode functions (IMFs). These IMFs are ordered by frequency from high to low. Eight IMFs were produced for each time series. The first three IMFs corresponded to seasonal, semi-annual, and annual variations, respectively. IMFs 4 to 6 corresponded to biennial, pentennial, and decadal climate signals, respectively. IMF 7 was related to the broad 20-30 year period, with the trend being revealed in IMF 8. The time series spanned the period from January 1980 to December 2015 at monthly intervals. Temperature and precipitation time series from the six sampling regions were analyzed for evidence of correlation. Results from the analysis reveal the following: (1) The EMD process reveals both linear and non-linear trends. The trends are not entirely consistent between regions, though they are highly correlated. (2) Wave-to-wave interactions between high- and low-frequency components appear in IMFs 1 and 2. These distortions appear to correspond to the troughs and peaks of the decadal cycle captured in IMF 6 and may be related to the solar cycle. (3) The correlation between precipitation and temperature increases with increasing IMF number.
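As an illustration, a synthetic monthly series can be decomposed into IMFs with the open-source PyEMD package (an assumed tooling choice, not necessarily what the authors used).

```python
# Decompose a synthetic monthly series into IMFs with PyEMD (pip install EMD-signal).
import numpy as np
from PyEMD import EMD

t = np.arange(432) / 12.0                          # 36 years of monthly samples
signal = (np.sin(2 * np.pi * t)                    # annual cycle
          + 0.5 * np.sin(2 * np.pi * t / 10.0)     # decadal cycle
          + 0.02 * t                               # slow trend
          + 0.1 * np.random.default_rng(2).normal(size=t.size))

imfs = EMD().emd(signal)                 # IMFs ordered high to low frequency
print("IMFs extracted (residual/trend in the last row):", imfs.shape[0])
```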
Formal data citation is a growing practice increasingly required by scientific journals. Roughly a decade ago, the Federation of Earth Science Information Partners (ESIP) began developing formal guidelines for data citation, including acknowledgement of authors and archives and careful use of persistent identifiers (PIDs). Many Earth science data centers now provide formal citation text and PIDs for their data sets, typically a Digital Object Identifier (DOI). A central purpose of data citation (among many) is to aid scientific reproducibility through direct, unambiguous reference to the precise data used in a particular study, i.e., to aid provenance tracking. How has that worked in practice? ESIP is now in the process of revising and updating its guidelines and seeks to ensure that data citation meets its stated purpose. This presentation explores whether and how formal citation and the use of PIDs for data sets have improved the tracking of data provenance. For example, is there some commonality in the nature and granularity of objects that are assigned PIDs? We review how the guidelines are being revised to further enhance the transparency and reusability of data.
Reliable, accurate, and affordable linear motion systems for agricultural applications are not currently easy to obtain because of their elevated cost. Most systems available to the public have price tags in the thousands, and their dimensions cannot be easily customized. Current systems have a maximum length of about ten meters, which may not be sufficient for a typical greenhouse application. Price increases with length, and with a base price already in the thousands, buying such a system becomes almost impractical for this application. The HyperRail is a modular linear motion system with a repeatability of 2 mm and a current top speed of 100 mm/s. One advantage of this system is that its length can be increased or decreased with minimal effort and a nominal increase in price. The HyperRail can be mounted on a set of tripods or directly on the structure of a building such as a greenhouse. The base price for a three-meter system on tripods is US$240, plus US$45 for each additional one and a half meters. The HyperRail was designed for hyperspectral imaging but can be adapted for other sensor systems. We report on a nine-meter study over pine seedlings infected with a virus, in which a push-broom hyperspectral camera (Headwall Nano) mounted on the system’s carriage imaged the seedlings. The rail is currently being adapted to an environmental sensor suite that will monitor CO2, luminosity, humidity, temperature, and dust concentration. The HyperRail also includes bidirectional wireless communication between the drive and the carriage; this means the sensor suite can operate autonomously and instruct the HyperRail drive to move to a specific location and take measurements. The system includes a graphical user interface for users unfamiliar with programming, but it can also be driven through a command line interface by individuals who want to work with the code and see the effects of their changes immediately. This system was developed at Oregon State University’s OPEnS Lab; more detailed information is available on the project page: http://www.open-sensing.org/hyper-rail/
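A hypothetical sketch of the resulting “move, then measure” workflow over a serial link is shown below; the command strings are invented for illustration and are not the actual HyperRail protocol (see the project page for the real interfaces).

```python
# Hypothetical "move, then measure" loop over a serial link (pip install pyserial).
import time
import serial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=5) as rail:
    for position_mm in range(0, 9000, 500):            # sample every 0.5 m along 9 m
        rail.write(f"MOVE {position_mm}\n".encode())   # invented command string
        rail.readline()                                # wait for an acknowledgement
        time.sleep(2)                                  # let the carriage settle
        rail.write(b"MEASURE\n")                       # invented command string
        print(rail.readline().decode().strip())        # sensor reading
```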
Some machine learning (ML) methods, such as classification trees, are useful tools for generating hypotheses about how hydrologic systems function. However, data limitations dictate that ML alone often cannot differentiate between causal and associative relationships. For example, a previous ML analysis suggested that soil thickness is the key physiographic factor determining storage-streamflow correlations in the eastern US. That conclusion was not robust, especially when the data were perturbed, and there were alternative, competing explanations, including soil texture and terrain slope. Meanwhile, typical causal analysis based on process-based models (PBMs) is inefficient and susceptible to human bias. Here we demonstrate a more efficient and objective analysis procedure in which ML is first applied to generate data-consistent hypotheses and a PBM is then invoked to verify them. We employed a surface-subsurface processes model and conducted perturbation experiments to implement the competing hypotheses and assess the impacts of the changes. The experimental results strongly support the soil-thickness hypothesis as opposed to the terrain-slope and soil-texture ones, which are co-varying and coincidental factors. Thicker soil permits larger saturation excess and longer system memory that carries wet-season water storage forward to influence dry-season baseflows. We further suggest this analysis could be formalized into a novel, data-centric Bayesian framework. This study demonstrates that PBMs offer indispensable value for problems that ML cannot solve alone, and it is meant to encourage more synergy between ML and PBMs in the future.
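The hypothesis-generation step might look like the following sketch, which fits a decision tree to synthetic catchment attributes and inspects feature importances (illustrative only, not the study’s exact analysis).

```python
# Fit a tree to synthetic catchment attributes; inspect which factor explains
# the storage-streamflow correlation (stand-in data, illustrative only).
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 300
X = pd.DataFrame({
    "soil_thickness": rng.uniform(0.5, 3.0, n),   # m
    "soil_texture": rng.uniform(0, 1, n),         # arbitrary index
    "terrain_slope": rng.uniform(0, 30, n),       # degrees
})
# toy target: correlation driven mainly by soil thickness
y = 0.2 + 0.25 * X["soil_thickness"] + 0.02 * rng.normal(size=n)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
for name, imp in zip(X.columns, tree.feature_importances_):
    print(f"{name}: importance {imp:.2f}")
```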
Sustainable concepts of ecologically functional rivers challenge engineers, researchers, and planners. Advanced numerical modeling techniques nowadays produce high-precision terrain maps and spatially explicit hydrodynamic data that aid river design. Because of their complexity, however, ecomorphological processes can only be reproduced to a limited extent in numerical models. Intelligent post-processing of hydrodynamic numerical model results nevertheless enables ecological river engineering measures to be designed sustainably. We have embedded state-of-the-art concepts in novel algorithms to effectively plan self-maintaining, habitat-enhancing design features with high physical stability, such as vegetation plantings or the artificial introduction of streamwood. The algorithms apply a previously developed lifespan-mapping technique and habitat suitability analysis to terraforming and bioengineering river design features. The results not only include analytical synopses but also provide automatically generated project plans, optimized as a function of an efficiency metric that describes “costs per m² net gain in seasonal habitat area for target species”. To make the benefits of these novel algorithms available to a wide audience, we have implemented the codes in an open-source program called River Architect. In this contribution, we present the novel design concepts and algorithms as well as a case study of their application to a river restoration project on the Yuba River in California (USA). With River Architect, we ultimately created an objective, parameter-based, and automated framework for the design of vegetative river engineering features. We are also able to define a framework for stable and ecologically viable terraforming features, though part of the planning of earthworks is still left to expert assessment. Improving the algorithms to plan terraforming of permanent, self-sustaining, and eco-morphodynamic riverbed structures based on site-specific parameters thus remains a future challenge.
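The efficiency metric itself is simple to express; here is a minimal sketch with hypothetical numbers.

```python
# Minimal sketch of the efficiency metric quoted above:
# cost per square metre of net gain in seasonal habitat area.
def cost_efficiency(total_cost_usd, habitat_after_m2, habitat_before_m2):
    """Return USD per m^2 of net seasonal habitat gain (None if no net gain)."""
    net_gain = habitat_after_m2 - habitat_before_m2
    return total_cost_usd / net_gain if net_gain > 0 else None

print(cost_efficiency(250_000, 18_000, 12_000))  # hypothetical numbers: ~41.7 USD/m^2
```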
We will present signal processing techniques based on shape analysis of the time-frequency signatures of chorus elements in the Van Allen radiation belts. Specifically, we will employ the Radon transform and the distance transform, which are well known in the image processing literature for isolating local shapes and patterns, to achieve automated detection of the spectral morphology of chorus elements. In particular, we will introduce “shrink-wrapping” techniques that quantify the salient features defining the microstructure of chorus elements. We will present preliminary results based on case studies of EMFISIS data from the Van Allen Probes mission.
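Both operations are available in standard Python imaging libraries; the sketch below applies them to a synthetic spectrogram-like array (illustrative, not our detection pipeline).

```python
# Apply the Radon transform (scikit-image) and the Euclidean distance
# transform (scipy) to a toy spectrogram containing one rising tone.
import numpy as np
from skimage.transform import radon
from scipy.ndimage import distance_transform_edt

spec = np.zeros((128, 128))
rr = np.arange(30, 100)
spec[rr, rr // 2 + 20] = 1.0                      # synthetic rising "chorus element"

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(spec, theta=theta)               # peaks where a line shape aligns
print("dominant angle (deg):", theta[np.argmax(sinogram.max(axis=0))])

dist = distance_transform_edt(spec > 0.5)         # distances inside the element mask
print("max interior distance (px):", dist.max())
```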
The Unified Access Framework (UAF) project of the NOAA Global Earth Observation - Integrated Data Environment (GEO-IDE) is an ongoing effort to provide access to NOAA-wide data in a way that is FAIR and meets PARR requirements. The first priority of UAF is to copy success. We recognize that data following the Climate and Forecast (CF) netCDF convention are readily used by working scientists; that THREDDS Data Servers and ERDDAP servers are popular ways to serve such data; that these servers can be interrogated by software to determine whether the data follow the conventions; and that the servers can be federated. To build the collection, we construct a master “raw” catalog of candidate data sets from THREDDS servers around NOAA and other organizations. The raw catalog is examined by custom software to eliminate large data collections that are not aggregated in time and to organize the results into a “clean” catalog. The catalog is then examined by ERDDAP to provide ERDDAP GridDAP access and to verify that the data sources follow the CF convention. The gridded data sets are merged into a collection of TableDAP (netCDF Discrete Sampling Geometry) data sources. Currently the UAF ERDDAP server is home to 10,712 data sets. After the UAF ERDDAP server has examined the data collection, a Live Access Server (LAS) is configured to offer data analysis and visualization access to all the data sets. The final piece of the puzzle is to make the data FAIR and to achieve PARR compliance, which requires tools adapted and developed for this purpose. We resurrected the ncISO tool, which can examine the contents of CF netCDF data sources, create ISO metadata, and score the data according to the Unidata Attribute Convention for Data Discovery (ACDD). We can help the centers hosting the data meet their PARR requirements by properly integrating the resulting metadata from ncISO into NOAA’s central data catalog. We have recently updated the templates used to generate the metadata to ensure they meet the latest ISO and ACDD specifications. Work is underway at NOAA and Unidata to integrate the ncISO code back into the GitHub repository for the THREDDS Data Server, which will bring together two disparate ncISO implementations. UAF is a few people working a few hours a month to maintain a large and useful data collection, and in this talk we’ll tell you how we do it.
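Because ERDDAP exposes RESTful CSV endpoints, any of these data sets can be read directly into analysis tools; the sketch below uses placeholder server, dataset, and variable names.

```python
# Read a tabledap slice straight into pandas; names below are placeholders.
import pandas as pd

server = "https://upwell.pfeg.noaa.gov/erddap"      # example ERDDAP endpoint
dataset_id = "some_dataset_id"                      # placeholder dataset id
url = (f"{server}/tabledap/{dataset_id}.csv"
       "?time,latitude,longitude,sea_surface_temperature"
       "&time>=2020-01-01&time<2020-02-01")
df = pd.read_csv(url, skiprows=[1])                 # row 1 holds units in ERDDAP CSVs
print(df.head())
```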
A system capable of reliably detecting catastrophic landslides and centimeter-scale movement of a land mass could save lives and offer landowners valuable information about gradual changes in soil displacement on their land. With precise acceleration and relative positioning data collected from an accelerometer and a set of GPS receivers, a system can be designed to detect subtle changes in sensor position due to land movement. With the rapid production of new microprocessors and greater memory storage capabilities, the limits of microcontroller systems are continually expanding. The Slide Sentinel project offers landowners a low-cost alternative to commercial equipment, consisting of a network of remote, low-power sensors that detect fast linear slides and, eventually, slower soil movements such as creep. Long-range, low-power (LoRa) radio connections on these sensor nodes wirelessly transmit three-dimensional acceleration, Real Time Kinematic (RTK) GPS coordinates, and sudden-shift alerts to a common base station, where they are exported to an online spreadsheet to be processed remotely.
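A minimal sketch of such a sudden-shift alert from accelerometer samples follows (the threshold and data are illustrative, not the deployed firmware logic).

```python
# Flag samples where the change in acceleration magnitude between consecutive
# readings exceeds a threshold (values are illustrative).
import numpy as np

def sudden_shift(accel_xyz, threshold_g=0.5):
    """accel_xyz: (N, 3) array in g. Returns indices where |delta |a|| > threshold."""
    mag = np.linalg.norm(accel_xyz, axis=1)
    return np.flatnonzero(np.abs(np.diff(mag)) > threshold_g) + 1

samples = np.tile([0.0, 0.0, 1.0], (100, 1))       # quiescent: 1 g of gravity
samples[60] = [0.4, 0.3, 1.8]                      # simulated jolt
print("alert at sample(s):", sudden_shift(samples))
```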
NASA’s Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) carries the Advanced Topographic Laser Altimeter System (ATLAS), which sends 10,000 laser pulses per second toward Earth and records the individual photons reflected back to its telescope. The volume of data produced by the instrument, nearly a TB every day, presents a challenge for users wishing to explore and quickly analyze the data. Although NSIDC, the data center responsible for archiving and distributing ICESat-2 data, provides services such as browsing and spatial, temporal, and parameter subsetting, these are not necessarily conducive to exploratory work. OpenAltimetry, a collaborative project between NSIDC, Scripps Institution of Oceanography, and the San Diego Supercomputer Center at the University of California San Diego, has created an online platform that allows users to quickly view photon clouds (or waveform energy profiles in the case of ICESat/GLAS, the predecessor mission to ICESat-2/ATLAS) for any time and location of interest, as well as the surface-specific elevations from the higher-level ATLAS products. OpenAltimetry emphasizes ease of use and rapid response times. A user can carry out more in-depth analysis in a Jupyter notebook invoked through OpenAltimetry’s map-based interface, which provides a full data analysis stack that lives in the cloud and enables scientists to do their work without investing a lot of time thinking about dependencies and deployments.
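A quick-look query might resemble the sketch below; the endpoint path and parameter names are placeholders rather than the documented OpenAltimetry API.

```python
# Illustrative photon-cloud request; URL and parameter names are placeholders.
import requests

params = {
    "date": "2019-06-01",            # placeholder query parameters
    "minx": -50.0, "maxx": -49.5,
    "miny": 68.0, "maxy": 68.5,
}
resp = requests.get("https://openaltimetry.org/data/api/icesat2/atl03", params=params)
resp.raise_for_status()
photons = resp.json()
print(type(photons))
```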
Hyperpycnal flows are produced when the density of a fluid flowing into a relatively quiescent basin is greater than the density of the fluid in the basin. The density difference can be due to differences in temperature, salinity, turbidity, concentration, or a combination of these. Turbulence-resolving numerical simulations of such flows, in particular DNS (Direct Numerical Simulation), generate vast amounts of data. In simulations of polydisperse particle-laden gravity currents, where several sediment diameters are considered, very detailed data on the concentration field are available near the bed. These data can be post-processed and analysed as the deposition map of one geological event. Traditional visualization tools lack the geological visual metaphor, so a new visual and interactive tool is proposed in this work. The aim of this new tool is to provide a better way to visualize numerical simulation results of particle-laden gravity currents, plotting the results with the visual resemblance of a stratigraphic image. Since numerical simulation results usually have better spatio-temporal resolution than traditional stratigraphy, limited only by the amount of computing power available, which grows every day, the proposed interactive tool lets the user visualize how the deposition map evolves in time and space. The tool can be employed to analyse the link between the deposition map and the turbulent flow that produced it, as well as the influence of all governing parameters. The numerical data were provided by Incompact3d, a code based on a Boussinesq system for incompressible fluids designed for supercomputers. However, this particular approach is a data-driven post-processing tool and thus should be compatible with any numerical solver.
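The stratigraphic metaphor can be sketched by stacking per-timestep deposition profiles so that each event’s thickness accumulates upward (synthetic data standing in for DNS output).

```python
# Stack per-timestep deposition profiles into a stratigraphy-like plot.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 10, 200)                        # along-basin distance
rng = np.random.default_rng(4)
layers = [np.exp(-(x - 2 - 0.5 * k) ** 2) * rng.uniform(0.5, 1.0) for k in range(8)]

bottom = np.zeros_like(x)
for layer in layers:                               # later events plot on top
    plt.fill_between(x, bottom, bottom + layer, alpha=0.8)
    bottom += layer
plt.xlabel("distance")
plt.ylabel("cumulative deposit thickness")
plt.show()
```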