The frequency, size, and intensity of wildfires in California have increased substantially in recent years, leading to widespread mandatory evacuations affecting millions of residents. However, because evacuation orders are implemented by local agencies, there is limited quantitative evidence on the scope of evacuations statewide. To improve understanding of wildfire evacuations, we assembled information on historical evacuation orders for two distinct wildfire-prone regions in California --- Fresno and Sonoma Counties. We used these data to understand how the frequency and extent of evacuations have changed over time, and then combined them with census data to characterize which populations have been most affected by evacuation orders. Ultimately, our work aims to quantify this important element of wildfire impacts in key regions of California. Collectively, it provides a starting point for a public database of evacuation orders that researchers and policymakers could use to better understand evacuation dynamics and improve decision-making around wildfire evacuations.
This article is composed of three independent commentaries on the state of ICON principles (Goldman et al. 2021) in the AGU Biogeosciences section and a discussion of the opportunities and challenges of adopting them. Each commentary focuses on a different topic: global collaboration, technology transfer, and application (Section 2); community engagement, citizen science, education, and stakeholder involvement (Section 3); and field, experimental, remote sensing, and real-time data research and application (Section 4). We discuss needs and strategies for implementing ICON and outline short- and long-term goals. The inclusion of global data and international community engagement are key to tackling grand challenges in biogeosciences. Although recent technological advances and growing open-access information across the world have enabled global collaborations to some extent, several barriers ranging from technical to organizational to cultural continue to impede interoperability and tangible scientific progress in biogeosciences. Overcoming these hurdles is necessary to address pressing large-scale research questions and applications in the biogeosciences, where ICON principles are essential. Here, we list several opportunities for ICON, including coordinated experimentation and field observations across global sites, that are ripe for implementation in biogeosciences as a means to scientific advancement and societal progress.
Future precipitation changes are controlled by the atmospheric energy budget, with temperature, water vapor, and absorbing aerosols playing dominant roles in driving radiative changes. Atmospheric energy budgets are calculated for different Shared Socioeconomic Pathways (SSPs) using ScenarioMIP projections from phase 6 of the Coupled Model Intercomparison Project (CMIP6) and are used to quantify the influence of 21st century aerosol cleanup on precipitation. Absorbing aerosol influences on shortwave absorption are isolated from the effects of water vapor. Apparent hydrologic sensitivity is ~40% higher for the “Middle of the Road” (SSP2-4.5) scenario with aerosol cleanup than for the “Regional Rivalry” (SSP3-7.0) scenario that maintains aerosol. Regionally, cleanup-induced changes in the atmospheric energy budget are of a similar magnitude to the precipitation increases themselves and are larger than the influence of changes in atmospheric circulation. Policy choices about future absorbing aerosol emissions will therefore have major impacts on global and regional precipitation changes.
The future of Arctic social systems and natural environments is highly uncertain. Climate change will lead to unprecedented phenomena in the pan-Arctic region, such as regular shipping traffic through the Arctic Ocean, urban growth, military activity, expanding agricultural frontiers, and transformed Indigenous societies. While organizations from the intergovernmental to the local level have produced numerous synthesis-based visions of the future, a challenge in any scenario exercise is capturing the ‘possibility’ space of change. In this work, we employ computational text analysis to generate unique thematic input for novel, story-based visions of the Arctic. Specifically, we develop a corpus of more than 2,000 articles in publicly accessible, English-language Arctic newspapers that discuss the future in the Arctic. We then perform latent Dirichlet allocation (LDA), resulting in ten distinct topics and sets of associated keywords. From these topics and keywords, we design ten story-based scenarios employing the Mānoa mashup, science fiction prototyping, and other methods. Our results demonstrate that computational text analysis can feed directly into a creative futuring process, whereby the output stories can be traced clearly back to the original topics and keywords. We discuss our findings in the context of the broader field of Arctic scenarios and show that the results of this computational text analysis produce stories complementary to the existing scenario literature. We conclude that story-based scenarios can provide vital texture toward understanding the myriad possible Arctic futures.
Near-Earth asteroids and meteoroids pose varying levels of impact danger to our planet. At one end, billions of events associated with small meteoroids have had trivial effects. At the other end, large asteroidal collisions that can cause mass extinctions and could wipe out modern human civilization are extremely rare. In addition, large near-Earth asteroids are monitored constantly for accurate and precise predictions of potentially hazardous visits to our planet. However, small asteroids and large meteoroids can still go under the radar and cause bolide explosions with the potential for significant damage to communities on the ground. To facilitate management of bolide hazard, a number of scholarly works have been dedicated to estimating the frequency of bolide events from a global perspective for planetary defense and mitigation. Nevertheless, few of the existing bolide frequency models were developed for local hazard management. In this presentation, the author introduces two recently developed frequency models for local management of bolide hazard. The first, called the Dome model, computes the expected frequency of bolide explosions within a dome-shaped volume around a location. The second, called the Coffee Cup model, is for a column-shaped volume above an area. Both models are based on empirical calibrations with historical data on the energy, latitude, altitude, and frequency of bolide events. The modeling results indicate a linearly decreasing trend in the frequency of bolide events from south to north latitudinally around the globe. The presented models can be applied to any location or area on Earth, including the entire surface of the planet.
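To make the idea of a local frequency model concrete, the sketch below scales a global power-law event rate down to a local area of interest. The global rate constant, the power-law exponent, and the uniform-in-area assumption are all illustrative placeholders; the Dome and Coffee Cup models described above use empirical calibrations (including latitude and altitude dependence) that are not reproduced here.

```python
# Toy local bolide-rate calculation: events per year above an energy
# threshold, scaled from a whole-Earth power-law rate to a local area.
EARTH_AREA_KM2 = 510_072_000  # total surface area of Earth

def local_annual_rate(area_km2, energy_kt, global_rate_1kt=5.0, exponent=0.9):
    """Expected events/year above `energy_kt` within `area_km2` (toy model).

    `global_rate_1kt` and `exponent` are illustrative assumptions, not the
    calibrated parameters of the Dome or Coffee Cup models.
    """
    global_rate = global_rate_1kt * energy_kt ** (-exponent)  # events/yr, globally
    return global_rate * (area_km2 / EARTH_AREA_KM2)

# e.g., a metropolitan area of 10,000 km^2, events above 1 kt
rate = local_annual_rate(10_000, 1.0)
print(f"~{rate:.2e} events/year, i.e., one per ~{1/rate:,.0f} years")
```

A latitude-dependent model like the one described in the abstract would replace the uniform area scaling with a calibrated function of latitude.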
Several bills moving through Congress are likely to provide significant funding for expanding research and delivering results in climate change solutions (CCS). This is also a priority of the Biden-Harris Administration. The National Science Foundation (NSF) will be expected to distribute and manage much of this funding through its grant processes. Effective solutions require both a continuation and expansion of research on climate change (to understand and thus plan for potential impacts from local to global scales, and to continually assess solutions against a changing climate) and rapid adoption and implementation of this science with society at all levels. NSF asked AGU to convene its community to help provide guidance and recommendations for enabling significant and impactful CCS outcomes by 1 June. In particular, AGU was asked to: 1. Identify the biggest, most important interdisciplinary/convergent challenges in climate change that can be addressed in the next 2 to 3 years. 2. Create 2-year and 3-year roadmaps to address the identified challenges, indicating the partnerships required to deliver on the promise. 3. Provide ideas for an aggressive outreach/communications plan to inform the public and decision makers of the critical importance of geoscience. 4. Identify the information, training, and other resources needed to embed a culture of innovation, entrepreneurialism, and translational research in the geosciences. Given the short time frame for this report, AGU reached out to key leaders, including Council members, members of several committees, journal editors, and early career scientists, and also included additional stakeholders from sectors relevant to CCS, including community leaders, planners and architects, business leaders, NGO representatives, and others. Participants were provided a form to submit ideas and were also invited to two workshops.
The first workshop was aimed at ideation around the broad efforts and activities needed for impactful CCS; the second at in-depth development of several broad efforts at scale. Overall, about 125 people participated: 78 responded to the survey, 82 attended the first workshop, and 28 attended the more focused second workshop (see contributor list). This report provides a high-level summary of these inputs and recommendations, focusing on guiding principles and several ideas that received broad support at the workshops and in post-workshop review. These guiding principles and ideas cover a range of activities viewed as highly important for realizing impactful CCS at the scale of funding anticipated, spanning the major areas of the charge: research and solutions, education, communication, and training. The participants and full list of ideas and suggestions are provided as an appendix. Many contributed directly to this report; the listed authors are the steering committee.
This article provides a commentary on the state of integrated, coordinated, open, and networked (ICON) principles in Earth and Planetary Science Processes (EPSP) and a discussion of the opportunities and challenges of adopting them. This commentary focuses on the challenges of making current science inclusive, equitable, and accessible, and highlights how research in the earth and planetary surface processes community currently benefits from ICON principles and how the discipline could grow through their more directed implementation.
Of immediate widespread concern is the accelerating transition from Holocene-like weather patterns to unknown, and likely unstable, Anthropocene patterns. A stark example is irreversible Arctic phase change. It is not clear whether existing AOGCMs are adequate to model anticipated global impacts in detail; however, the GISS ModelE AOGCM can be used to locally compare and extend the PIOMAS Arctic Ocean historical ice-volume dataset into the near future. Arctic Amplification (AA) mechanisms are poorly understood; to enable timely results, a simple linear Arctic TOA grid-boundary energy input is used to enforce AA, avoiding the perils of arbitrarily modifying relatively well-studied parameterizations (e.g., restricting cloud-top height to induce local warming). Only the PIOMAS springtime/max and fall/min Arctic ice-volume decadal linear trends were enforced. This temporally broad grid-boundary modification produces a surprisingly detailed consonance, with 10 out of 12 temporal profiles falling within 1-sigma of the PIOMAS temporal data for the entire history modeled (2003 to 2021). The simulation is then integrated to 2050. The result is a zero-ice-volume summer/fall half-year beginning ca. 2035 (onset 1-sigma of ± ~5 years), with mean annual Arctic temperatures increasingly trending above freezing. Persistent Arctic phase change follows this half-year transition about 20 years later. In later stages, the 500 hPa height minimum is also no longer nearly coincident with the pole, suggesting jet stream disruption and its consequences. Hypothesized large clathrate-methane releases likely associated with Arctic temperature and phase change are also examined. A basic assumption is that the Arctic ice (i.e., temperature) must be preserved at all costs. This work establishes a reasonably detailed timeline for the Arctic phase change based on well-studied AOGCM physics, slightly tuned to decades of PIOMAS data.
This result also points to the Arctic as a key near-term site for localized, nondestructive intervention to mitigate Arctic phase change (e.g., Stjern), thereby slowing the Holocene-to-Anthropocene growing-season disruption. Although such an intervention cannot itself accomplish the requirements of the IPCC SR15, nor of Planetary Boundaries theory, delaying the Arctic phase change will likely extend the time window for accomplishing those critical tasks and, ultimately, at least slow the rate of increase of climate emergencies.
Extreme weather conditions are associated with a variety of water quality issues that can harm humans and aquatic ecosystems. Under dry extremes, contaminants become more concentrated in streams, with a greater potential for harmful algal blooms, while wet extremes can cause flooding that spreads pollution widely. Developing appropriate interventions to improve water quality in a changing climate requires a better understanding of how extremes affect watershed processes, and of which places are most vulnerable. We developed a Soil and Water Assessment Tool model of the Cape Fear River Basin (CFRB) in North Carolina, USA, representing contemporary land use, point and non-point sources, and weather conditions from 1979 to 2019. The CFRB is a large and complex river basin undergoing urbanization and agricultural intensification, with a history of extreme droughts and floods, making it an excellent case study. To identify intervention priorities, we developed a Water Quality Risk Index (WQRI) using the load average and load variability across normal conditions, dry extremes, and wet extremes. We found that the landscape generated the majority of contaminants, including 90.1% of sediment, 85.4% of total nitrogen, and 52.6% of total phosphorus arriving at the City of Wilmington’s drinking water intake. Approximately 16% of the watershed contributed most of the pollutants across conditions; these represent high-priority locations for interventions. By considering risks to water quality across different weather conditions, the WQRI approach can help identify locations where interventions are most likely to improve water quality under climate change.
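An index built from load average and load variability, as described above, can be sketched as follows. The exact formula and weights of the study's WQRI are not given in the abstract, so this version (mean load scaled by the coefficient of variation across weather conditions) is an illustrative assumption, as are the toy load values.

```python
# Toy risk index: combine mean contaminant load with its variability
# across normal, dry-extreme, and wet-extreme conditions, so that
# subbasins with both high and erratic loads score highest.
import statistics

def risk_index(loads_by_condition):
    """loads_by_condition: {'normal': [...], 'dry': [...], 'wet': [...]} loads (e.g., kg/day)."""
    all_loads = [x for v in loads_by_condition.values() for x in v]
    mean_load = statistics.mean(all_loads)
    cv = statistics.stdev(all_loads) / mean_load if mean_load > 0 else 0.0
    return mean_load * (1.0 + cv)  # penalize both high and variable loads

subbasin = {"normal": [12.0, 15.0], "dry": [30.0, 28.0], "wet": [55.0, 60.0]}
print(round(risk_index(subbasin), 1))
```

Ranking subbasins by such an index is one way the high-priority ~16% of the watershed could be identified for intervention.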
Water scarcity is a growing problem around the world, and regions such as California are working to develop diversified, interconnected, and flexible water supply portfolios. To meet their resilient water portfolio goals, water utilities and irrigation districts will need to cooperate across scales to finance, build, and operate shared water supply infrastructure. However, planning studies to date have generally focused on partnership-level outcomes (i.e., highly aggregated mean cost-benefit analyses) while ignoring the heterogeneity of benefits, costs, and risks across the individual investing partners. This study contributes an exploratory modeling analysis that tests thousands of alternative water supply investment partnerships in the Central Valley of California, using a high-resolution simulation model to evaluate the effects of new infrastructure on individual water providers. The viability of conveyance and groundwater banking investments is as strongly shaped by partnership design choices (i.e., which water providers participate, and how they distribute the project’s debt obligation) as by extreme hydrologic conditions (i.e., floods and droughts). Importantly, most of the analyzed partnership structures yield highly unequal distributions of water supply and financial risks across the partners, limiting the viability of cooperative partnerships. Partnership viability is especially rare in the absence of groundwater banking facilities, or under dry hydrologic conditions, even under explicitly optimistic assumptions regarding climate change. These results emphasize the importance of high-resolution simulation models and careful partnership structure design when developing resilient water supply portfolios for institutionally complex regions confronting scarcity.
Urbanization and climate change are exacerbating stress on aging urban critical infrastructure systems, including water, energy, mobility, and telecommunication networks. Simulation tools and scenario analyses able to capture the interdependencies among these different infrastructure systems are crucial to support decision making and realize sustainable and resilient development. Yet existing simulation tools are mostly developed within the boundaries of individual application sectors, and information often remains siloed, despite the increasing data and computational opportunities offered by the digital transformation of many infrastructure sectors. In this work, we present how the ide3a project (international alliance for digital e-learning, e-mobility and e-research in academia – https://ide3a.net) addresses this research gap. ide3a is building a digital campus to support digital learning, research, and mobility in collaboration within a network of six European partner universities. Several senior and early career researchers with multidisciplinary backgrounds in water management, IT systems, mobility, energy, urban planning, sustainability, and psychology work together to integrate state-of-the-art research on critical infrastructure and digitalization into traditional higher education curricula. As part of the ide3a portfolio of digital tools for learning and research, we present a prototype of “ConnectiCity”, an open-source simulation-based serious game that integrates multi-sectoral models to simulate interconnected critical infrastructure systems and quantify cascading effects under various climate, social, and technical scenarios. Along with other ide3a activities, it is used to train early career researchers and students alike to enrich their transdisciplinary knowledge, foster critical systems thinking, drive research on urban critical infrastructure dynamics, and ultimately work across disciplines to tackle contemporary urban challenges.
The NEXUS area covers approximately 30% of the Brazilian territory. To assist preservation and sustainable development policies in that region, this study proposes to replicate the work done by Yeh et al. in Africa, in which a convolutional neural network estimates socioeconomic indicators from satellite images, each covering a region of approximately 45 km². This work compares the size and distribution of Brazil’s census tracts with those in Africa to determine whether the image scale can be maintained and to define the clusters that will be used. To avoid biasing the model, special care must be taken in selecting clusters, such as keeping a balance between urban and rural sectors and, most importantly, ensuring that there is little to no overlap between clusters. To this end, two approaches were proposed: the first samples tracts in each municipality as centroids for clusters; the second merges neighboring urban tracts into a single group and fits clusters to these groups.
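The overlap constraint on cluster selection can be sketched as a simple greedy filter: keep a candidate centroid only if it is farther from every already-kept centroid than one cluster width (a 45 km² footprint is roughly 6.7 km on a side). The coordinates, the distance threshold, and the greedy rule are illustrative assumptions, not the study's actual selection procedure.

```python
# Greedy non-overlap filter for candidate cluster centroids: a candidate
# is kept only if it lies at least `min_km` from every kept centroid,
# so the ~45 km^2 image footprints do not overlap.
import math

def select_non_overlapping(candidates, min_km=6.7):
    """candidates: list of (x_km, y_km) centroids in a planar projection."""
    kept = []
    for c in candidates:
        if all(math.dist(c, k) >= min_km for k in kept):
            kept.append(c)
    return kept

pts = [(0, 0), (3, 4), (10, 0), (10, 1), (0, 20)]
print(select_non_overlapping(pts))  # → [(0, 0), (10, 0), (0, 20)]
```

Stratifying the candidate list by urban/rural sector before filtering would additionally preserve the balance the abstract calls for.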
The PRISM Data Library (DL) is designed to optimize the display, analysis, and retrieval of datasets from multiple domains. Originally created for climate data, the DL now also aggregates data from the agriculture and hydrology domains, as well as from domains non-traditional for the DL such as ecology, finance, power outages, and space weather. These datasets range from simple geospatial point observations, to spatially gridded data products, to high-resolution satellite measurements, to GIS representations of administrative or domain-specific geographic entities. All are represented in a consistent multi-dimensional (most often spatial and temporal) framework, so dimension-wise comparisons are easily enabled through selection or transformation. Gridded data can be averaged over discrete geometric entities (e.g., counties, Bird Conservation Regions). The DL can be used in a browser by connecting over the internet to servers at the San Diego Supercomputer Center (SDSC). Data selection, processing, and analysis are performed on the SDSC DL servers, and the resulting images or data files are sent back to the client’s desktop. This model optimizes the use of internet bandwidth.
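The "average a gridded field over a geographic entity" operation mentioned above reduces to masking grid cells by region membership and taking the mean. The grid, the region mask, and the field values below are toy assumptions (and a production system would typically area-weight the cells); the DL's actual server-side implementation is not shown here.

```python
# Toy version of averaging a gridded field over a geographic entity:
# select the cells inside the entity with a boolean mask, then average.
import numpy as np

field = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 9.0]])        # gridded data (e.g., a climate variable)
region = np.array([[True,  True,  False],
                   [True,  False, False],
                   [False, False, False]])  # cells inside, say, one county

county_mean = field[region].mean()  # unweighted mean over the masked cells
print(county_mean)
```

Repeating this per time step yields the county-level time series that the DL's dimension-wise comparisons operate on.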
The spatial and angular emission patterns of artificial and natural light emitted, scattered, and reflected from the Earth at night are far more complex than those for scattered and reflected solar radiation during daytime. Here we demonstrate (through examples) that there is additional information contained in the angular distribution of emitted light. We argue that this information could be used to improve existing remote sensing retrievals based on night lights, and in some cases could make entirely new remote sensing analyses possible. We encourage researchers and funding agencies to pursue further study of how multi-angle views can be analyzed or acquired.
Spatial econometric models estimated on big geo-located point data face at least two problems: limited computational capability and inefficient forecasting for new out-of-sample geo-points. This is because the spatial weights matrix W is defined for in-sample observations only, and because of the computational complexity involved. Machine learning models suffer similarly when using kriging for predictions; thus the problem remains unsolved. This paper presents a novel methodology for estimating spatial models on big data and predicting in new locations. The approach uses bootstrap and tessellation to calibrate both the model and the space. The best bootstrapped model is selected with the PAM (Partitioning Around Medoids) algorithm by classifying the regression coefficients jointly, in a non-independent manner. Voronoi polygons for the geo-points used in the best model provide a representative division of space. New out-of-sample points are assigned to tessellation tiles and linked to the spatial weights matrix as replacements for the original points, which makes it feasible to use calibrated spatial models as a forecasting tool for new locations. There is no trade-off between forecast quality and computational efficiency in this approach. An empirical example illustrates a model of business locations and firms’ profitability.
Since first diagnosed in the early 1990s, chronic kidney disease of unknown etiology (CKDu) has markedly increased in the North Central Province in the dry zone of Sri Lanka. CKDu has been identified as a global health issue in more than a dozen countries in Asia, South America, and the Middle East. Of these countries, Sri Lanka is reportedly the most affected, with the highest number of CKDu cases and the highest mortality rates. In Sri Lanka, the disease primarily affects male paddy (rice) farmers from low socioeconomic levels. A major river diversion scheme completed in the 1970s feeds water from the wet zones to ancient tanks that previously relied on rainwater alone. The drinking water for the CKDu-affected farming communities comes from the irrigation canals, shallow regolith water-table aquifers recharged by canal seepage and precipitation, and deep bored wells. Many contributing factors and hypotheses have been presented and discussed in the literature. Among these multiple factors, the suspected environmental exposure pathways are through water (potable water and food) and air (unprotected pesticide spraying). Extensive data on water quality have been collected to develop, test, and support hypotheses on the role of water in the disease. However, no systematic investigations have been conducted to identify and analyze how exposure pathways develop through the water storage and distribution systems, from sources to the receptors where human exposure occurs. This study proposes a systems-based framework for such analysis using numerical models of the integrated surface and subsurface system. The models will simulate the fate and transport of naturally occurring toxins and agrichemicals and their geo-bio-chemical transformation products. These models should incorporate characterization parameters of the surface water storage and distribution system, hydrogeologic data for shallow and deep aquifers, water quality data, epidemiological data, and climate drivers.
Innovative methods will be developed to use downscaled climate and regional hydrological model simulations to evaluate exposure pathways at local scales (e.g., villages) under different climate scenarios.
The hotter the climate, the higher the demand for cooling, leading to more electricity consumption and CO2 emissions. To understand the effect of future regional warming on electricity demand and CO2 emissions in the Arabian Peninsula region, we selected a representative country, Qatar, and developed a model that relates daily electricity demand to temperature. By combining this model with temperature projections from the CMIP6 database (bias adjusted and statistically downscaled), as well as GDP and population projections from four SSP scenarios, we calculated Qatar’s demand for electricity until the end of the century. We found an average sensitivity of 1.7 GWh/°C for the electricity demand, equivalent to 0.4 MtCO2/°C for CO2 emissions. Electricity demand is projected to increase by 5 to 35% due to warming alone by the end of this century. Under SSP1, the warming-induced CO2 emissions could be offset by improvements in carbon intensity. Under SSP5, assuming no improvement in carbon intensity, future warming could add 20 to 35% of CO2 emissions per year by the end of the century, with half of the additional electricity demand related to extremely hot days, which become more frequent in the future. Our findings suggest that it is important to account for the additional CO2 emissions arising from future warming when projecting future emissions.
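The first modeling step above, relating daily electricity demand to temperature, can be sketched as a simple linear fit whose slope is the sensitivity (the study reports ~1.7 GWh/°C for Qatar). The data points below are synthetic and chosen only for illustration; real demand-temperature relationships are typically nonlinear, with demand rising only above a cooling threshold.

```python
# Fit a linear demand-temperature relation on synthetic daily data and
# read off the slope as the sensitivity in GWh per degree Celsius.
import numpy as np

temp_c = np.array([30.0, 32.0, 35.0, 38.0, 40.0, 43.0])         # daily mean temperature
demand_gwh = np.array([95.0, 99.0, 104.0, 109.0, 112.0, 117.0])  # daily demand

slope, intercept = np.polyfit(temp_c, demand_gwh, 1)
print(f"sensitivity ~ {slope:.2f} GWh/degC")

# Scenario use: extra daily demand for a +3 degC warmer climate, all else equal
print(f"added demand ~ {slope * 3.0:.1f} GWh/day")
```

Multiplying the slope by projected warming, then by a carbon intensity factor (the study's 0.4 MtCO2/°C), gives the warming-induced emissions term described in the abstract.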