Distortion of the apparent S-wave radiation pattern in the high-frequency wavefield: Tottori-Ken Seibu, Japan, earthquake of 2000

SUMMARY

The distortion of the apparent S-wave radiation pattern in the high-frequency (f > 2 Hz) seismic wavefield is investigated using a large number of waveform records of the main shock and 29 aftershocks of the 2000 Tottori-Ken Seibu, Japan, earthquake (Mw 6.6). The dense records from the KiK-net strong-motion network show a clear four-lobe apparent S-wave radiation pattern in the low-frequency wavefield (f < 2 Hz), which becomes almost isotropic in all directions as the frequency increases above 5 Hz. The distortion of the apparent S-wave radiation pattern in the high-frequency wavefield grows with travel distance, indicating that the path effect caused by the scattering of seismic waves by small-scale heterogeneities in the crust is a major cause of the distortion. This hypothesis is examined by 2-D finite-difference simulations of seismic-wave propagation in heterogeneous structure models. The simulations clearly demonstrate the collapse of the S-wave front due to scattering in the heterogeneous structure. By comparing the observed wavefield with simulations for different classes of stochastic heterogeneity models, the model that best explains the observations is described by a von Kármán autocorrelation function with correlation distance a = 3–5 km, order κ = 0.5 and rms fluctuation ε = 0.07. However, the simple stochastic random-heterogeneity model proposed herein somewhat overestimates the scattering of low-frequency signals below 2 Hz.
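A random medium of the kind described above (von Kármán autocorrelation with parameters a, κ and ε) is commonly realized by spectral filtering: white Gaussian noise is shaped in the wavenumber domain by the square root of the von Kármán power spectrum, inverse-transformed, and normalized to the target rms fluctuation. The sketch below is illustrative only and is not the authors' simulation code; the grid size, spacing and the 2-D filter form (1 + k²a²)^(-(κ+1)/2) are assumptions for the example.

```python
import numpy as np

def von_karman_medium(n=256, dx=0.1, a=4.0, kappa=0.5, eps=0.07, seed=0):
    """Generate a 2-D random velocity-fluctuation field whose
    autocorrelation approximates a von Karman function.

    n      : grid points per side (assumed for this example)
    dx     : grid spacing in km (assumed)
    a      : correlation distance in km (paper's preferred range: 3-5 km)
    kappa  : order of the von Karman function (paper: 0.5)
    eps    : rms fractional fluctuation (paper: 0.07)
    """
    rng = np.random.default_rng(seed)
    kx = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    k2 = kx[:, None] ** 2 + ky[None, :] ** 2
    # Amplitude filter ~ sqrt of the 2-D von Karman power spectrum
    filt = (1.0 + k2 * a ** 2) ** (-(kappa + 1.0) / 2.0)
    noise = rng.standard_normal((n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * filt).real
    field *= eps / field.std()  # normalize rms fluctuation to eps
    return field

medium = von_karman_medium()
print(medium.shape, round(medium.std(), 4))
```

Such a field would then perturb the background velocity model fed to a finite-difference solver, as in the 2-D simulations described in the summary.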