Abstract
We combine earthquake spectra from multiple studies to investigate
whether the increase in stress drop with depth often observed in the
crust is real, or an artifact of attenuation decreasing (quality
factor Q increasing) with depth. Many studies assume empirical path
and attenuation corrections to be independent of earthquake source
depth. We test this assumption by investigating whether a realistic
increase in Q with depth (as is widely observed) could account for
some of the apparent increase in stress drop with depth.
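The trade-off follows from a standard spectral parameterization (a
common form, not necessarily the exact one used in the underlying
studies):

    \[ A(f) = S(f)\, G\, e^{-\pi f t^{*}}, \qquad t^{*} = \int_{\mathrm{ray}} \frac{dt}{Q}, \]

where A(f) is the observed amplitude spectrum, S(f) the source
spectrum, and G a frequency-independent path amplitude. If Q increases
with depth, rays from deeper sources accrue a smaller t*, so deeper
events retain more high-frequency energy even when their sources are
identical.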
We combine event spectra, previously obtained using spectral
decomposition methods, for over 50,000 earthquakes (M 0–5) from 12
studies in California, Nevada, Kansas, and Oklahoma.
We find that the relative high-frequency content of the spectra
increases systematically with earthquake depth, at all magnitudes. By
analyzing spectral ratios between large and small events as a function
of source depth, we explore the relative importance of source and
attenuation contributions to this depth dependence.
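To illustrate this trade-off, consider a minimal sketch (hypothetical
parameters, not the analysis of the underlying studies) in which a
small uncorrected differential t* enriches a deep event's spectrum at
high frequencies in the same way a higher corner frequency would:

    import numpy as np

    def brune(f, M0, fc):
        """Brune omega-squared source spectrum (moment units)."""
        return M0 / (1.0 + (f / fc) ** 2)

    f = np.logspace(-1.0, 2.0, 400)   # 0.1-100 Hz

    # Two hypothetical events with identical sources (M ~ 3)
    M0, fc = 3.5e13, 3.0              # moment (N*m), corner frequency (Hz)

    # Assumed differential attenuation: the deeper ray samples higher Q,
    # so the deep event's path t* is smaller by dt_star seconds.
    dt_star = 0.01

    shallow = brune(f, M0, fc)
    deep = shallow * np.exp(np.pi * f * dt_star)

    # Relative to the shallow event, the deep spectrum is enriched at
    # high frequencies; fit with a depth-independent path term, it
    # yields a higher apparent corner frequency and stress drop.
    ratio = deep / shallow            # = exp(pi * f * dt_star)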
Without any correction for depth-dependent attenuation, we find a
systematic increase with depth in stress drop, rupture velocity, or
both, as observed in the original studies. When we apply an empirical,
depth-dependent attenuation correction, the depth dependence of stress
drop systematically decreases, often becoming negligible. The largest
corrections occur in regions with the largest increase in seismic
velocity with depth. We conclude that source parameter analyses,
whether in the frequency or time domain, should not assume that path
terms are independent of source depth, and should more explicitly
consider depth-dependent attenuation corrections.
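A minimal sketch of how such a correction might be applied, assuming a
hypothetical linear decrease of path t* with source depth (the
functional form, constants, and reference depth are illustrative, not
those used in the studies above):

    import numpy as np

    def path_tstar(depth_km, t0=0.04, dtstar_dz=-0.002):
        """Hypothetical path t* (s), decreasing linearly with source
        depth, consistent with Q increasing with depth."""
        return max(t0 + dtstar_dz * depth_km, 0.0)

    def correct_for_depth(f, amp, depth_km, ref_depth_km=8.0):
        """Normalize a spectrum to a reference source depth by removing
        the differential attenuation exp(-pi f [t*(z) - t*(z_ref)])."""
        d_tstar = path_tstar(depth_km) - path_tstar(ref_depth_km)
        return amp * np.exp(np.pi * f * d_tstar)

    # Example: normalize a 12 km deep event to the 8 km reference; the
    # correction suppresses its high frequencies, removing the apparent
    # enrichment attributable to higher Q at depth.
    f = np.logspace(-1.0, 2.0, 400)
    amp = 1.0 / (1.0 + (f / 3.0) ** 2)    # toy observed spectrum
    amp_corr = correct_for_depth(f, amp, depth_km=12.0)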