Long-term slip history discriminates among occurrence models for seismic hazard assessment

Authors


Abstract

[1] Today, the probabilistic seismic hazard assessment (PSHA) community relies on stochastic models to compute occurrence probabilities for large earthquakes. Considerable efforts have been devoted to extracting information from long catalogs of large earthquakes based on instrumental, historical, archeological and paleoseismological data. However, the models remain constrained only by these rare single-event data, and insufficiently so. Therefore, the selection of the models and their respective weights necessarily relies on rulings by a panel of experts. Since cumulative slip data with high temporal and spatial resolution are now available, we propose a new approach to incorporate this evidence of mid- to long-term fault behavior into PSHA: the Cumulative Offset-Based Bayesian Recurrence Analysis (COBBRA). For the Dead Sea Fault, our method provides weights for the competing recurrence and rupture models, allows time-independent models to be ruled out, and provides a means to compute the cumulative probability of occurrence of the next full-segment event that reflects all available data.

1. Introduction

[2] During the instrumental period (the past 120 yr), no major active fault segment worldwide has ruptured entirely more than once. Additional information on the recurrence of such events may be found in historical documents and damaged archeological structures. Surface ruptures can also be preserved in sedimentary deposits, leading to paleoseismological records. On average, the compilation of all these data sources for a given fault segment may yield catalogs of large earthquakes (CLE) of 3 to 4 events, exceptionally up to 15 (e.g., for the San Andreas Fault [Rockwell and Ben-Zion, 2007]). Using these data, mean recurrence intervals and other useful parameters are computed for each candidate occurrence model [Parsons, 2008; Rhoades and Dissen, 2003], and the spatial extent of paleoearthquakes is evaluated [Biasi and Weldon, 2009]. Still, unambiguously inferring the occurrence model for large (full-segment) events using CLEs alone remains out of reach [Parsons, 2008; Ellsworth et al., 1999]. The resulting epistemic uncertainties, stemming from the choice of a recurrence model, are currently incorporated into probabilistic seismic hazard assessment (PSHA) by means of subjective weights determined by expert panels [Working Group on California Earthquake Probabilities, 2003].

[3] Over several tens of earthquakes, successive deformation episodes affect markers of known age, such as stream channels for strike-slip faults and stratigraphic units for dip-slip faults. So far, these dated cumulative offsets have been used to compare long-term (1,000s to 10,000s yr) slip rates with present-day GPS strain rates. Thanks to recent advances in high-resolution dating and imaging techniques, cumulative slip histories have started to reveal more complex patterns in the long-term behavior of active faults [Ferry et al., 2007; Ludwig et al., 2010; Zielke et al., 2010], which need to be incorporated into the next generation of PSHA. Here, we propose such an approach and show that measurements of cumulative slip at different times in the past can be used in combination with CLEs to better characterize the occurrence of earthquakes that rupture a whole fault or fault segment.

2. Why Choose the Jordan Valley for Our Case Study

[4] To illustrate our novel methodology, we apply it to the Jordan Valley segment (JVF) of the Dead Sea fault (Figure 1b). The JVF has a simple geometry and segmentation as well as a relatively long CLE (historical and archeological) and detailed cumulative slip history data (tectonic geomorphology). With a length of 120–150 km, the JVF is capable of producing Mw 7.2–7.6 earthquakes [Wells and Coppersmith, 1994] and has been identified as a major source of seismic hazard for northern Israel and northern Jordan [Yücemen et al., 2005]. The JVF is weakly sub-segmented and strongly separated from nearby segments by large (≥10 km) pull-apart basins, making it unlikely that a large event would stop short of the segment tips or propagate beyond them, and equally unlikely that an event initiated on another well-expressed segment of the Dead Sea Fault could propagate onto the JVF co-seismically [Wesnousky, 2006] (Figure 1a). Hence, we model large ruptures as occupying the full segment length. The CLE for the JVF contains no instrumental event but three historical events (AD 1033 ± 0, AD 749 ± 1, and 759 BC ± 1 [Ambraseys, 2009]) and three archeoseismic events (1150 BC ± 50, 2300 BC ± 50, and 2900 BC ± 50 [Franken, 1992; Savage et al., 2003]). Furthermore, combined analysis of high-resolution paleoclimatic records and satellite images [Ferry et al., 2007] provides 20 data points in the form of 6 distinct classes of dated cumulative offsets (with their uncertainties), spanning 48.5 kyr (Figure 1c). This dataset was updated with one supplementary point (114 m, 25 kyr BP) derived from a more recent study (Table C1; M. Ferry et al., unpublished data, 2010). It reveals significant variability in shorter-term slip rates, which can switch from 3.5 to 11 mm/yr within a couple of thousand years, i.e., over a few events (Figure 1c). In this study, we use the CLE as well as the detailed record of dated cumulative displacements to constrain recurrence model parameters, estimate the relative plausibility of each model, and improve conditional probability estimates of a future event.
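This catalog translates directly into the five inter-event times listed in Appendix C. A minimal check, treating BC dates as negative year numbers (an assumption that reproduces the values quoted in Appendix C):

```python
# Event dates of the compiled CLE for the JVF (negative values denote BC).
event_dates = [1033, 749, -759, -1150, -2300, -2900]

# Inter-event times in years, youngest pair first.
inter_event_times = [event_dates[i] - event_dates[i + 1]
                     for i in range(len(event_dates) - 1)]
print(inter_event_times)  # [284, 1508, 391, 1150, 600], as in Appendix C
```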

Figure 1.

The Jordan Valley segment of the Dead Sea Fault: location, structure, and associated cumulative slip data. (a) The Jordan Valley Fault (SRTM elevation data) is bounded by large pull-apart basins. (b) Location of the JVF within the Dead Sea fault system. (c) Examples of synthetic catalogs using each inter-event time distribution, superimposed on the geomorphic markers [Ferry et al., 2007]. Slip rates are constrained to range from 2 to 13 mm/yr.

3. COBBRA Method

[5] Among frequently used occurrence models, the two end members are 1) the periodic or quasi-periodic model, initially supported by Reid's elastic rebound theory [Reid, 1910], and 2) the Poisson model, in which all events are independent in time [Ang and Tang, 1975]. In Reid's model, while tectonic plates accumulate strain along faults at a constant rate, each large event releases the whole strain and resets the system. The next event does not happen until the strain has built up again. The corresponding inter-event times are of equal length or, in the presence of minor variations, form narrow Gaussian distributions. The Poisson model corresponds to an exponential inter-event time distribution and is applied in PSHA when the occurrence of large earthquakes does not depend on the time since the last event. When large event recurrence follows neither of these end members, three stochastic models have been favored to describe their occurrence: Weibull, lognormal (logN), and Brownian Passage Time (BPT) (see Appendix B). For these distributions, the probability of the next event generally depends on the time since the last event, and particular parameter choices “include” the two cases described above: the Dirac is the asymptotic limit of Weibull at large shape parameters (≥15), the Gaussian is included in BPT for large scale parameters (≥50,000), and Poisson corresponds to Weibull with a shape parameter of 1. In the following, we will therefore focus on these three laws, without loss of generality.

[6] Our three-step COBBRA method (see auxiliary material) can be summarized as follows. Step 1: choose a prior distribution for the parameters, compute the likelihood of the past known inter-event times from the CLE under each parameter set (CLE likelihood), and the corresponding posterior. Step 2: compute the likelihood of the cumulative slip data for each parameter set using our novel algorithm. Step 3: use the posterior computed in Step 1 as a prior and multiply it by the likelihood from Step 2 to obtain the posterior of the parameters given both the CLE and the cumulative slip data. The different models are ranked by evaluating the integral of the probability density functions (pdfs) obtained in Step 3, i.e., the evidence. Using the ratios of evidence as weights, we compute the best combination of models [MacKay, 2003, chap. 28].
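The bookkeeping behind these three steps can be sketched on a discretized (a, b) parameter grid. In the sketch below, cle_likelihood and slip_likelihood are placeholder arrays standing in for the CLE likelihood of Step 1 and the cumulative-slip likelihood of Step 2 (the latter is defined by equation (4) of the auxiliary material); the flat prior and the regular grid spacing are illustrative assumptions.

```python
import numpy as np

def cobbra_posterior_and_evidence(a_grid, b_grid, cle_likelihood, slip_likelihood):
    """Steps 1-3 of COBBRA on a regular (a, b) grid (illustrative sketch).

    cle_likelihood, slip_likelihood: 2-D arrays of shape (len(a_grid), len(b_grid))
    holding the two likelihoods evaluated at each grid node.
    Returns the normalized joint posterior and the evidence (its integral).
    """
    da, db = a_grid[1] - a_grid[0], b_grid[1] - b_grid[0]
    prior = np.ones((a_grid.size, b_grid.size))      # flat prior on (a, b)
    post_cle = prior * cle_likelihood                 # Step 1 (unnormalized)
    post_all = post_cle * slip_likelihood             # Steps 2-3 (unnormalized)
    evidence = post_all.sum() * da * db               # integral over the grid
    return post_all / evidence, evidence

# Model weights are the normalized evidences, e.g.:
# weights = {m: ev[m] / sum(ev.values()) for m in ev}   # ev: evidence per model
```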

4. A Better Constrained Large Earthquake Hazard for Jordan and Israel

[7] We choose a flat prior on b and a flat prior on the mean recurrence interval. The latter corresponds to a flat prior on a for the BPT and Weibull models, and is proportional to exp(a) for the lognormal model (see discussion in the auxiliary material). The shape of the CLE posteriors strongly depends on the model (Figure 2a). Although the optimum is well defined, the posterior remains significant for a wide range of parameters for each distribution, and the Poisson model (b = 1 in Weibull, Appendix B) cannot be excluded at this stage. However, both the periodic and the Gaussian models become negligible (the posteriors for b ≥ 15 for Weibull and b ≥ 50,000 for BPT are more than 10 orders of magnitude lower than at the optimum). The predictive inter-event time distribution accounts for the pdfs of all parameters through their posterior. Figure 2d shows that the broad pdfs are reflected in the rather shallow slopes of the present-day cumulative density functions (cdfs) (auxiliary material and Figure 1 therein). Also, the minimum probability of having the next event in the next 30 yr (at the 95% confidence level) is very low (0.5% for BPT and logN and 1.5% for Weibull), and even in the next 300 yr it remains below 25% (11.5%, 2.5%, and 24.5%, respectively (Figure 2d)).
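For a single (a, b) pair, the present-day conditional probabilities behind these cdfs follow from the time elapsed since the last full-segment event (AD 1033, i.e., about 977 yr before 2010). A minimal sketch for the Weibull case; the parameter values below are illustrative assumptions, and the figures quoted in the text additionally marginalize over the full parameter posterior.

```python
from scipy import stats

T_ELAPSED = 2010 - 1033          # yr since the last full-segment event (AD 1033)

def conditional_prob(dist, t_elapsed, dt):
    """P(next event within dt | quiescent for t_elapsed) for a frozen
    inter-event time distribution."""
    F = dist.cdf
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

# Illustrative (not optimal) Weibull parameters: a = scale in yr, b = shape.
a, b = 900.0, 1.5
weibull = stats.weibull_min(c=b, scale=a)
print(conditional_prob(weibull, T_ELAPSED, 30.0))    # 30-yr conditional probability
print(conditional_prob(weibull, T_ELAPSED, 300.0))   # 300-yr conditional probability
```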

Figure 2.

Parameter space for each model as constrained by seismicity, cumulative slip, or both, and the consequences for the probabilities of occurrence; a is in years. (a) CLE posterior. (b) Geomorphology likelihood. (c) Posterior combining both, for each model. (d) Cdfs for the occurrence of the next event computed from 2010 using the posterior of Figure 2a (red) or of Figure 2c (green). Insets: 95% confidence intervals at 30 yr and 300 yr.

[8] The originality of our approach resides in the implementation of Step 2. We create synthetic CLEs (Appendix A) and compute the probability of matching the cumulative slip data for each possible set of parameters using our novel algorithm (auxiliary material, equation (4); Figure 2b). We arbitrarily choose a uniform distribution for the co-seismic slip. Its bounds (2.5–4 m) bracket the 3 to 3.5 m values obtained from the (poorly constrained) empirical scaling relationship for strike-slip faults of Wells and Coppersmith [1994] for rupture lengths around L = 135 km. Note that bounds of 1–4 m yield similar results.

[9] The most constraining element is the (very conservative) slip rate bound that we impose: between 2 and 13 mm/yr, which encompasses the known excursions of the slip rate (Figure 1c).

[10] In Step 3, we combine real and synthetic CLEs. For each model, the parameter space is greatly reduced (compare Figures 2a and 2c). The Poisson hypothesis has posterior probabilities that rarely differ from zero. We conclude that earthquake recurrence on the JVF is not random in time but rather includes some measure of time-dependence (see Ogata [1999] and Scharer et al. [2010] for similar results for faults in Japan and for the Southern San Andreas). Reducing the parameter space also constrains the 30 yr and 300 yr conditional probability estimates: the minimum hazard increases 2-fold at 30 yr and up to 10-fold at 300 yr for the lognormal model. These results could have a large impact on mitigation strategies.

[11] Finally, the evidence of each model is computed and the normalized weights are 0.45 for Weibull, 0.33 for lognormal, and 0.22 for BPT, i.e., no model is significantly better than the others for this specific fault with the data currently available. That the models are equivalent with respect to the data does not mean that their behavior is the same. Therefore, the answer is not to choose one of them but to use a combination of all. Figure 3 shows the cdf for the combination of models: we obtain a 9% probability for the next event to occur within 30 yr and 52% within 300 yr. This suggests that we could be in a long quiescence period comparable to the one observed before the AD 749 event. The synthetic catalogs computed for each model with its optimum parameters (Figure 1c) exhibit such episodic behavior.
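The final hazard curve is simply the evidence-weighted mixture of the three per-model cdfs. A minimal sketch, in which the per-model conditional cdfs (already marginalized over their posteriors and conditioned on the time elapsed since AD 1033) are placeholders:

```python
# Evidence-based weights quoted above.
WEIGHTS = {"weibull": 0.45, "logn": 0.33, "bpt": 0.22}

def combined_cdf(t, model_cdfs):
    """Weighted combination of per-model cdfs for the time to the next event.

    model_cdfs: dict mapping model name to a callable cdf(t) (placeholders for
    the posterior-averaged, conditional cdfs of Figure 3).
    """
    return sum(WEIGHTS[name] * cdf(t) for name, cdf in model_cdfs.items())

# combined_cdf(30.0, model_cdfs) ~ 0.09 and combined_cdf(300.0, model_cdfs) ~ 0.52
# would reproduce the 9% and 52% values quoted above.
```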

Figure 3.

Cumulative density function for the next event in the JVF, computed from 2010. The red, green, and blue curves show the cdfs computed using the posterior for the BPT, lognormal, and Weibull models, respectively. The pink curve shows the weighted combination (final model: 0.45 for Weibull, 0.33 for lognormal, and 0.22 for BPT). We obtain a 9% probability of occurrence within 30 yr and 52% within 300 yr.

5. Conclusion

[12] We show that cumulative slip measured at different times in the past can be used in combination with catalogs of large earthquakes to better characterize the occurrence of ruptures affecting a whole fault or fault segment. Working with these two independent datasets has major beneficial effects: i) it greatly reduces the parameter space for each recurrence model to be used in probabilistic seismic hazard assessment; in the case of the Jordan Valley fault, the minimum probabilities of having the next earthquake within 30 yr and 300 yr increase by factors of 2 and 10, respectively; ii) it shows, for an intermediate tectonic regime with long and detailed historical and archeological records, how inter-event time models can be constrained even when only a small number of past events is known; and iii) the proposed final model is a Bayesian combination of the candidate models, weighted by their overall adequacy at explaining the data, which could replace weights determined by an expert panel. If one of the three laws we used is the correct one, a minimum of 50 consecutive events would be needed to prove it from a CLE alone [Matthews et al., 2002]. Since it is unlikely that we will ever obtain this much information, we urge the seismic hazard community to consider what cumulative slip data can tell us about the long-term behavior of faults and the hazard they pose.

Appendix A: Synthetic CLE Computation

[13] Choose a model (BPT, Weibull, Lognormal)

[14] Loop on model parameter values

[15] Loop on number of realizations (number of times we start building a catalog, 100,000)

[16] 1. Draw a sample of inter-event time.

[17] 2. Check that bounds of coseismic slip D comply with the slip rate constraints.

[18] 3. Redefine the bounds if needed, and draw D.

[19] 4. Update time and cumulative displacement, or reject catalog and start over.

[20] 5. When the catalog is finished, update the likelihood.

[21] The slip rate constraint ensures that slip rates over the last 4 inter-event times remain between 2 and 13 mm/yr, so as to include the known excursions of 3.5 and 11 mm/yr over 3 to 5 events [Ferry et al., 2007]. When a point is reached in the catalog at which no new pair of inter-event time and slip per event complying with this constraint can be drawn, the catalog is rejected. We attempted 100,000 catalogs for each pair of parameters, for each model; the number retained is reduced by the rejections. For some areas of the parameter space (such as b = 1 for Weibull, i.e., the Poisson case), nearly all catalogs were rejected, showing the inadequacy of these parameters at explaining the observations while remaining consistent with the slip rate constraint. We filter the results using the following arbitrary relationship: if N(a,b) < 100, L(a,b) becomes L(a,b) × N(a,b)/100, so as to suppress the outliers. N(a,b) is the number of accepted synthetic catalogs for (a,b), and L(a,b) is the synthetic-CLE likelihood.
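A minimal sketch of this generation-with-rejection loop is given below. The inter-event time sampler is a placeholder for whichever model and (a, b) pair is being tested, the slip-rate window is simplified to whatever intervals are available early in the catalog, and the likelihood update against the dated cumulative offsets (equation (4) of the auxiliary material) is not reproduced here.

```python
import numpy as np

RATE_MIN, RATE_MAX = 0.002, 0.013   # slip-rate constraint, m/yr (2-13 mm/yr)
SLIP_MIN, SLIP_MAX = 2.5, 4.0       # bounds on coseismic slip D, m
WINDOW = 4                          # number of inter-event times in the rate window

def synthetic_catalog(draw_dt, n_events, rng):
    """One synthetic CLE (inter-event times in yr, slips in m), or None if rejected.

    draw_dt: callable returning one inter-event time sample for the chosen
    model and (a, b) pair (placeholder).
    """
    dts, slips = [], []
    for _ in range(n_events):
        dt = draw_dt(rng)                              # 1. draw an inter-event time
        t_win = sum(dts[-(WINDOW - 1):]) + dt          # duration of the rate window
        s_win = sum(slips[-(WINDOW - 1):])             # slip already in the window
        # 2.-3. bounds on D so the windowed slip rate stays within [RATE_MIN, RATE_MAX]
        d_lo = max(SLIP_MIN, RATE_MIN * t_win - s_win)
        d_hi = min(SLIP_MAX, RATE_MAX * t_win - s_win)
        if d_lo > d_hi:
            return None                                # 4. no admissible slip: reject
        dts.append(dt)
        slips.append(rng.uniform(d_lo, d_hi))          # 4. update time and displacement
    return dts, slips

# Example with an assumed Weibull model (scale a = 900 yr, shape b = 1.5):
rng = np.random.default_rng(0)
catalog = synthetic_catalog(lambda r: 900.0 * r.weibull(1.5), 20, rng)
```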

Appendix B: Three Distributions

[22] In Probabilistic Seismic Hazard Assessment, three stochastic models have been favored to describe the inter-event time distribution for large, characteristic events. Their fluctuations represent the variability introduced by natural perturbations (e.g., fault roughness, seismicity in the surroundings): Weibull (equation (B1)), lognormal (logN, equation (B2)), and Brownian Passage Time (BPT, or inverse Gaussian, equation (B3)). They all have two parameters, which we call a and b.

$$f_\mathrm{Weibull}(t; a, b) = \frac{b}{a}\left(\frac{t}{a}\right)^{b-1}\exp\!\left[-\left(\frac{t}{a}\right)^{b}\right] \qquad \text{(B1)}$$

$$f_\mathrm{logN}(t; a, b) = \frac{1}{t\,b\sqrt{2\pi}}\exp\!\left[-\frac{(\ln t - a)^{2}}{2b^{2}}\right] \qquad \text{(B2)}$$

$$f_\mathrm{BPT}(t; a, b) = \sqrt{\frac{b}{2\pi t^{3}}}\exp\!\left[-\frac{b\,(t - a)^{2}}{2a^{2}t}\right] \qquad \text{(B3)}$$

[23] We want to point out that these models do not rely heavily on physical bases, though Weibull is often used to describe fatigue processes in materials, and BPT is thought of as a combination of steady loading and random walk-type perturbations [Matthews et al., 2002]. Our model is ready to test more process-based models for inter-event time distributions [e.g., Fitzenz et al., 2007] as they become available.
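For reference, all three densities are available as standard library distributions. A minimal sketch of the mapping, assuming the (a, b) parameterization written in equations (B1)–(B3) above (a a scale or mean parameter in years, b a shape parameter):

```python
import numpy as np
from scipy import stats

def inter_event_dist(model, a, b):
    """Frozen scipy distribution for the inter-event time under the assumed
    (a, b) parameterization of equations (B1)-(B3)."""
    if model == "weibull":            # (B1); b = 1 recovers the exponential (Poisson) case
        return stats.weibull_min(c=b, scale=a)
    if model == "logn":               # (B2); a is the mean of ln(t)
        return stats.lognorm(s=b, scale=np.exp(a))
    if model == "bpt":                # (B3); inverse Gaussian with mean a and shape b
        return stats.invgauss(mu=a / b, scale=b)
    raise ValueError(f"unknown model: {model}")

# e.g. inter_event_dist("weibull", a=900.0, b=1.0).pdf(500.0) matches the exponential pdf
```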

Appendix C: Datasets

[24] The inter-event times from the compiled seismicity catalog for the JVF are: 284, 1508, 391, 1150, and 600 yr. The geomorphological dataset is presented in Table C1.

Table C1. Geomorphology Dataset

Date of the Marker (yr BP) | Cumulative Slip (m) | Standard Deviation (m)
5000 ± 100                 | 17                  | 2.5
7000 ± 100                 | 25                  | 2.5
9000 ± 100                 | 46                  | 2.5
13000 ± 500                | 63                  | 5.0
25000 ± 500                | 114                 | 1.5
37500 ± 500                | 175                 | 10.0
47500 ± 500                | 230                 | 15.0
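As a quick consistency check (not part of the COBBRA computation itself), the interval slip rates between consecutive markers of Table C1 can be computed directly; they illustrate the slip-rate variability discussed in section 2.

```python
# Marker ages (yr BP) and cumulative slips (m) from Table C1.
ages_yr = [5000, 7000, 9000, 13000, 25000, 37500, 47500]
slips_m = [17, 25, 46, 63, 114, 175, 230]

# Slip rate between consecutive markers, in mm/yr.
rates = [1000.0 * (s2 - s1) / (a2 - a1)
         for (a1, s1), (a2, s2) in zip(zip(ages_yr, slips_m),
                                       zip(ages_yr[1:], slips_m[1:]))]
print([round(r, 1) for r in rates])   # [4.0, 10.5, 4.2, 4.2, 4.9, 5.5]
```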

Acknowledgments

[25] This research was funded by the Fundação para a Ciência e a Tecnologia (PTDC/CTE-GIX/101852/2008 and FCOMP-01-0124-FEDER-009326). DF, MF, and AJ all benefit from the FCT Ciência 2007-08 program.
