
Near-Earth asteroids and meteoroids pose varying levels of impact danger to our planet. At one end of the spectrum, billions of events associated with small meteoroids have had trivial effects. At the other end, large asteroidal collisions capable of causing mass extinctions and potentially wiping out modern human civilization are extremely rare. In addition, large near-Earth asteroids are monitored constantly to enable accurate and precise predictions of potentially hazardous approaches to our planet. However, small asteroids and large meteoroids can still slip under the radar and cause bolide explosions with the potential for significant damage to communities on the ground. To facilitate management of bolide hazards, a number of scholarly works have been dedicated to estimating the frequencies of bolide events from a global perspective for planetary defense and mitigation. Nevertheless, few existing bolide frequency models were developed for local hazard management. In this presentation, the author introduces two recently developed frequency models for local management of bolide hazards. The first, called the Dome model, computes the expected frequency of bolide explosions within a dome-shaped volume around a location. The second, called the Coffee Cup model, addresses a column-shaped volume above an area. Both models are based on empirical calibrations with historical data on the energy, latitude, altitude, and frequency of bolide events. The modeling results indicate a linearly decreasing trend in bolide event frequency from south to north across latitudes around the globe. The presented models can be applied to any location or area on Earth, including the entire surface of the planet.
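The abstract does not include implementation details, so the following is only a minimal sketch of what a Dome-model-style frequency query might look like, assuming a global power-law energy-frequency relation scaled by the dome's surface footprint and modulated linearly by latitude. All function names, coefficients, and the specific formulation below are hypothetical illustrations, not the author's calibrated model.

```python
import numpy as np

def dome_expected_frequency(lat_deg, dome_radius_km, min_energy_kt,
                            a=-0.57, b=2.0, c=-0.005):
    """Hypothetical sketch: expected annual number of bolide explosions with
    energy >= min_energy_kt inside a dome of radius dome_radius_km centered
    on a surface location at latitude lat_deg.

    Assumptions (not from the source): a power-law cumulative energy-frequency
    relation for the whole Earth, scaled by the fraction of the Earth's surface
    covered by the dome footprint, and a linear latitudinal modulation.
    """
    # Global annual rate of events at or above the energy threshold (power law).
    global_rate = 10.0 ** (b + a * np.log10(min_energy_kt))

    # Fraction of the Earth's surface covered by the dome footprint
    # (spherical-cap area 2*pi*R*h divided by total surface area 4*pi*R^2).
    earth_radius_km = 6371.0
    cap_height = earth_radius_km * (1.0 - np.cos(dome_radius_km / earth_radius_km))
    area_fraction = cap_height / (2.0 * earth_radius_km)

    # Linear latitudinal modulation (decreasing from south to north for c < 0).
    lat_factor = max(0.0, 1.0 + c * lat_deg)

    return global_rate * area_fraction * lat_factor


# Example: expected annual frequency of >= 1 kt bolides within 100 km of 30 deg N.
print(dome_expected_frequency(lat_deg=30.0, dome_radius_km=100.0, min_energy_kt=1.0))
```

A Coffee Cup-style query for a column above an area would presumably follow the same pattern, with the spherical-cap footprint replaced by the fractional area of the region of interest.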

Yi (Victor) Wang

and 4 more authors

Traditional deterministic and geostatistical methods for rainfall interpolation usually fall short in integrating data on a variety of variables. These omitted variables include seasonal variables such as time of year, topographic variables such as elevation, and remote sensing variables such as radar reflectivity. Meanwhile, poor data quality for certain variables at some data points poses challenges to modelers using machine learning approaches to estimate rainfall amounts at locations without gauge measurements. To overcome these limitations, this presentation introduces a novel deep learning-based approach to recreate rainfall histories for large geographic areas at high spatiotemporal resolution. The proposed approach enables integration of data on a variety of variables by adopting a multi-layer perceptron modeling framework. Introducing binary data-quality indicators as additional input variables resolves the issue of unequal data quality across data points. As a demonstration, historical hourly and daily rainfall records from 139 rain gauge stations in or near Harris County, Texas, spanning 1986 to 2013, are used, along with other auxiliary variables, to train deep learning regression models to interpolate rainfall at the surface level. Validation results and the recreated spatiotemporal distributions of rainfall indicate good performance of the proposed approach compared with both gauge and radar data. The final product of the proposed approach can be applied to other regions, with information on hindcast historical rainfall events, for pluvial flood risk analysis. The approach will assist researchers and policy specialists in validating hydrologic models and in training machine learning models to identify extreme rainfall events, thereby facilitating early warning and emergency response.
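The abstract describes the modeling framework only at a high level; a minimal sketch of a multi-layer perceptron rainfall-interpolation model with binary data-quality indicator inputs might look like the following. The feature layout, network sizes, and synthetic training data are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

# Assumed feature layout (not from the source): spatial coordinates, a seasonal
# encoding, elevation, and radar reflectivity, plus one binary quality flag per
# auxiliary variable indicating whether its value is trustworthy for that record.
N_CONTINUOUS = 6     # e.g., lat, lon, day-of-year sin/cos, elevation, reflectivity
N_QUALITY_FLAGS = 2  # e.g., flags for elevation and reflectivity quality
N_FEATURES = N_CONTINUOUS + N_QUALITY_FLAGS

# Multi-layer perceptron regressor: input features -> rainfall depth.
model = nn.Sequential(
    nn.Linear(N_FEATURES, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-in for gauge records (the real model would use the 1986-2013
# hourly and daily records from the 139 stations described in the abstract).
X = torch.cat([torch.randn(1024, N_CONTINUOUS),
               torch.randint(0, 2, (1024, N_QUALITY_FLAGS)).float()], dim=1)
y = torch.rand(1024, 1)  # placeholder rainfall depths

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Interpolation: predict rainfall at an ungauged location/time from its features.
with torch.no_grad():
    new_point = torch.randn(1, N_FEATURES)
    print(model(new_point).item())
```

The design intuition behind the quality flags, as described in the abstract, is that the network can learn to discount unreliable auxiliary inputs on a per-record basis rather than requiring those records to be dropped.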