NZ Geomechanics News

Update on New Zealand National Seismic Hazard Model 

1.0 Introduction

A revision project for the New Zealand National Seismic Hazard Model (NSHM) has recently commenced. In this project, funded by the Ministry of Business, Innovation and Employment (MBIE) and the Earthquake Commission (EQC), we expect to deliver a revised NSHM by 31 August 2022.

The primary deliverables are: (1) a revised NSHM in an open-source computational library, (2) a web tool to make the NSHM and its results openly available and (3) peer-reviewed documentation.

The project design has been guided by the last decade or more of scientific research in seismic hazard modelling in New Zealand and internationally, by a 2017 international review of the NSHM and through interaction with key stakeholders.

This paper is a summary of GNS Science Report 2020/38 which can be downloaded at https://www.gns.cri.nz/NSHM/project-outputs. The full report includes comments and recommendations made by the Technical Advisory Group.

1.1 NSHM Project Structure

There are three main constituents within the project structure:

  • Science working groups
  • Technical Advisory Group (TAG)
  • Project Steering Group.

There are four main science working groups, each having sub-groups for specific tasks. The Core NSHM group leads the overall direction of the model, including the model integration. The Seismicity Rate Models (SRM) working group develops the models that forecast magnitudes and locations of earthquakes. The Ground Motion Characterisation Models (GMCM) working group develops the models and methods that estimate the shaking that an earthquake will produce, including any local site effects. The Service Delivery working group ensures that we have software and hardware solutions to the problems we need to solve.

The full science team (all working groups) comprises more than 50 scientists from across New Zealand, Australia, the USA, Canada and Europe.

The TAG consists of 16 members who are tasked with providing technical advice to the science teams. TAG membership is roughly evenly split between hazard scientists and technical end-users. The TAG’s role is to ensure that the NSHM is based on best-available science and that it provides the most useful outputs for end-users; this includes input into the development of the framework laid out in this document. As part of this task, the TAG works as a participatory review panel and will provide ongoing peer review of the NSHM as it is developed. The TAG also provides the first port-of-call for end-user engagement, and TAG members are expected to engage with their wider communities on NSHM topics.

The Project Steering Group has membership from GNS Science, MBIE and EQC. This group provides governance for the NSHM Programme.

1.2 End-User Engagement

Interaction with the end-user community is required for several primary reasons. Firstly, we must ensure that the outputs of the model are useful and provided in appropriate formats for users. Secondly, we need to understand and consider any impact that science decisions made by the NSHM team may have on end-users. Finally, success of the NSHM requires continual dialogue so that the end-user communities understand the NSHM and the implications of the hazard estimates it provides.

The TAG is our primary contact for our end-user engagement; however, we intend to be in regular communication with New Zealand technical societies and to have direct engagement with end-users from across sectors. A series of end-user workshops are planned throughout the next two years to introduce the NSHM and to receive feedback on the proposed NSHM outputs.

1.3 Purpose of This Document

The purpose of this document is to describe the major components of work we intend to undertake prior to delivering a revised NSHM in August 2022. It details the overall philosophy of the model and the main components that we will explore and implement within the NSHM. The document is not intended to fully specify all details of the NSHM project but should provide enough insight for the reader to have an overview of the scientific components of the project and to tease out any further questions the reader may have.

Over the last five months, with input from the TAG, we have identified the work packages we will pursue over the next two years. In many cases, the work is well underway. As part of the recent prioritisation work, we have identified work that needs further understanding (either scientifically or from the end-user community) before final priorities can be determined, and other work that may require redirection during the course of the NSHM development. Where redirection is likely, we have developed alternative, aligned procedures. As such, this document reflects the view of the NSHM team at the time of writing and is subject to change based on further scientific understanding and community and stakeholder engagement. A primary purpose of this document is to present the plan for development of the NSHM over the next two years and to promote the engagement process.

2 Overall Model Considerations

2.1 Epistemic Uncertainty

To convey the full extent of our knowledge of hazard to end-users, it is necessary that epistemic uncertainty be considered, modelled and presented to end-users in a digestible form. However, within a two-year project (or a project of any length, for that matter), it is impossible to completely explore and quantify the full range of epistemic uncertainties. Our goal over the next two years will be to evaluate and model what can be considered, with manageable effort, to be the largest and most critical uncertainties affecting hazard estimates. This exploration will not be exhaustive but will aim to provide indicative confidence bounds on the hazard estimates. The form in which the uncertainties will be provided to end-users is yet to be determined; we envision that fractiles will be considered, as well as less rigorous approaches. We also note that, while the need to provide useful and rigorous estimates of epistemic uncertainty in NSHMs is recognised internationally, limited efforts have been made to investigate it thoroughly. Developing a feasible plan to address epistemic uncertainty in the 2022 revision requires additional consideration, and we expect the thoroughness of the uncertainty quantification to increase in future NSHM revisions.
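
To illustrate one way fractiles could be reported, the minimal sketch below computes a weighted mean and weighted fractiles from a small set of hypothetical logic-tree branch hazard curves. The curves, weights and function names are illustrative assumptions, not project outputs.

```python
import numpy as np

def weighted_fractiles(curves, weights, fractiles=(0.05, 0.5, 0.95)):
    """Weighted fractiles of logic-tree branch hazard curves (sketch).

    curves  : (n_branches, n_levels) annual probabilities of exceedance
    weights : (n_branches,) branch weights summing to 1
    """
    weights = np.asarray(weights, dtype=float)
    out = np.empty((len(fractiles), curves.shape[1]))
    for j in range(curves.shape[1]):
        order = np.argsort(curves[:, j])      # rank branches at this level
        cum_w = np.cumsum(weights[order])     # weighted CDF over branches
        for i, f in enumerate(fractiles):
            out[i, j] = curves[order, j][np.searchsorted(cum_w, f)]
    return out

# Three hypothetical branch curves over four shaking levels
curves = np.array([[1e-2, 3e-3, 8e-4, 1e-4],
                   [2e-2, 6e-3, 2e-3, 4e-4],
                   [5e-3, 1e-3, 2e-4, 2e-5]])
weights = [0.5, 0.3, 0.2]
print(np.average(curves, axis=0, weights=weights))   # weighted mean hazard
print(weighted_fractiles(curves, weights))           # 5th/50th/95th fractiles
```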

2.2 Subjectivity and Expert Judgment

As with any scientific project, there is no escaping the need to use expert judgment in the development of the NSHM. Expert judgment in the NSHM ranges from minor, uninfluential decisions around parameter choices to highly influential decisions around, for example, logic-tree weights. It is not practical or pragmatic to require formal elicitation of expert judgment through all aspects of the NSHM development. Formal elicitation will be reserved for the weighting of logic trees and will follow guidelines from Christophersen and Gerstenberger (Forthcoming 2021), which is similar to what was applied in the development of the Canterbury Seismic Hazard Model (Gerstenberger et al. 2014, 2016) and other recent work. Subject-relevant experts from the NSHM team will provide weights via a structured expert elicitation process (Cooke 1991). For aspects in which we cannot apply structured elicitation, appropriate checks and balances to minimise bias (e.g. as described in Christophersen and Gerstenberger, forthcoming 2021) are required and are provided by the use of open team-based decision making and by participatory peer review by the TAG.
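
As a rough illustration of performance-weighted pooling (a simplified gesture at the structured-elicitation spirit of Cooke 1991, not the full classical model), the sketch below combines hypothetical expert-assigned branch weights using assumed calibration and information scores; all numbers and names are invented for illustration.

```python
import numpy as np

def pooled_branch_weights(expert_weights, calibration, information, cutoff=0.05):
    """Performance-weighted pooling of expert logic-tree weights (sketch).

    expert_weights : (n_experts, n_branches), each row sums to 1
    calibration    : (n_experts,) statistical-accuracy scores in [0, 1]
    information    : (n_experts,) informativeness scores (> 0)
    Experts below the calibration cutoff receive zero weight, loosely
    following the spirit of Cooke's classical model.
    """
    cal = np.asarray(calibration, dtype=float)
    perf = np.where(cal >= cutoff, cal * np.asarray(information, dtype=float), 0.0)
    perf = perf / perf.sum()                      # normalised expert weights
    pooled = perf @ np.asarray(expert_weights)    # weighted mix of opinions
    return pooled / pooled.sum()

# Three hypothetical experts weighting two logic-tree branches
w = pooled_branch_weights(
    expert_weights=[[0.7, 0.3], [0.5, 0.5], [0.9, 0.1]],
    calibration=[0.4, 0.02, 0.6],   # second expert falls below the cutoff
    information=[1.2, 2.0, 0.8])
print(w)   # pooled weights for the two branches
```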

2.3 Model Delivery Timeline

We aim to have a draft of all SRM sub-models of the NSHM ready by 1 July 2021. In the eight months following, we will refine the draft models and endeavour to have a complete draft SRM by 28 February 2022. The GMCM framework is proposed to be complete by 31 August 2021, with a draft GMCM available by 28 February 2022. The final NSHM is due by 31 August 2022, allowing six months for: (1) final review of the model and its implications, (2) further engagement with the technical end-user community and (3) final steps of the participatory review process by the TAG. NSHM outputs will be available online following completion of the project.

3 Seismicity Rate Model

3.1 Time-Dependence

The degree to which we will explore time-dependence is not yet decided. However, we are not intending to explore the implementation of a short-term clustering model, which would require regular updates on, for example, yearly cycles. The NSHM will be based on a static earthquake rate forecast for the time-window of interest. The time-window will be adaptable to specific needs but will be based on the same forecast information, and the forecast will not be updated during the life of the 2022 NSHM (i.e. we are not considering a ‘living’ model). Over the next 18 months, we will consider the application of three forms of explicit time-dependence.

  1. We will consider the conditional probability of rupture on a small selection of well-studied faults, as we have done in some past versions of the NSHM. This involves using uncertainty distributions of slip rate, single-event displacement, times of past fault rupture and time since last rupture to estimate a pseudo-Poisson rate for the time-period of the forecast (a minimal calculation sketch follows this list). Our default starting point will be the work of Rhoades and Van Dissen (2003), which uses a combination of log-normal, Weibull and Brownian Passage Time recurrence-time models to estimate the conditional probability of rupture on a fault. Our effort will build upon the Kaikōura Seismic Hazard Model, and we will consider approximately 12 major faults (or others as data, time and priorities allow).
    • In deciding if we should apply these rates, we will explore the quality of timing data; the consistency of the conditional probability assumptions, which assume semi-regular recurrence of earthquakes, with our overall NSHM assumptions; and whether a time-dependence model is ultimately desirable.
  2. Past NSHMs (in New Zealand and internationally) have not considered the impact of aftershocks on hazard and, in fact, have explicitly removed their impact. This results in a systematic underestimation of hazard. More recently, the method of Boyd (2012) allows for adding aftershock rates that are conditional on the occurrence of a main shock; in other words, it accounts for potential aftershocks prior to their occurrence. This is not a model that requires updating, and it can be thought of as modelling potential aftershocks in a manner consistent with the modelling of main shocks.
    • Before applying the Boyd method to the final NSHM, we will consider its robustness, our ability to apply it in New Zealand and the impact on ground-motion model calculations (e.g. potentially ignored reductions in within-event uncertainty). Implications to the end-user community must also be considered.
  3. We will consider the impact of medium-term clustering on expected earthquake rates over the time-period covered by the NSHM. Most major earthquakes are preceded in the medium term by an increase in the magnitude and rate of minor earthquakes, in an area similar to that occupied by the eventual aftershocks. This increase takes place over a period ranging from months to decades, depending on magnitude. The ‘Every Earthquake a Precursor According to Scale’ (EEPAS) model (Rhoades and Evison 2004) estimates the increase in earthquake rate expected when an earthquake of any magnitude occurs, based on the observation that every earthquake may be a precursor to a larger one. In constructing the seismicity rate model, we will use the EEPAS model to estimate the net effect of medium-term clustering on the expected rates at any magnitude and location for the forecast period of the NSHM, not the time variation of rates within the period.
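
As referenced in item 1 above, here is a minimal sketch of the conditional-probability calculation: a log-normal renewal model converted to an equivalent Poisson rate. The function, parameter values and the use of a single recurrence model are illustrative assumptions, not the project's implementation.

```python
import numpy as np
from scipy.stats import lognorm

def equivalent_poisson_rate(mean_recurrence, cov, elapsed, window=50.0):
    """Equivalent Poisson rate from a log-normal renewal model (sketch).

    mean_recurrence : mean recurrence interval (years)
    cov             : coefficient of variation (aperiodicity) of recurrence
    elapsed         : time since the last rupture (years)
    window          : forecast window (years)
    """
    sigma = np.sqrt(np.log(1.0 + cov**2))             # log-normal shape
    scale = mean_recurrence / np.exp(0.5 * sigma**2)  # median recurrence
    F = lognorm(s=sigma, scale=scale).cdf
    # Conditional probability of rupture within the window, given survival
    p_cond = (F(elapsed + window) - F(elapsed)) / (1.0 - F(elapsed))
    return -np.log(1.0 - p_cond) / window             # pseudo-Poisson rate

# Hypothetical fault: 500-year mean recurrence, CoV 0.5, 300 years elapsed
print(equivalent_poisson_rate(500.0, 0.5, 300.0))     # per-year rate
```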

The NSHM will include other data sets and decisions that are impacted by the temporal variability of seismicity and other related observations (i.e. non-stationarity of earthquake occurrence and earth processes). This is true of any NSHM. We will aim for a consistent philosophy on how we apply data sets and assumptions and will document known and implied impacts of these assumptions. The final form of the model is yet to be determined, but we will aim to produce one or more of the following:

a. A model that attempts, as well as possible, to remove all explicit and implied time-dependence (e.g. a hazard forecast for any 50/100 years).

b. A second model that includes known time-dependence (e.g. a hazard forecast for the next 50/100 years).

c. A model that acknowledges the non-stationarity of seismicity, explicitly allows for time-dependent (TD) or time-independent (TI) contributions and quantifies this uncertainty.

It is our current opinion that a true TI model (model a above) will never be possible due to implications of, and inconsistencies in, the datasets used to constrain seismic hazard. Because of this, a model that is transparent and acknowledges the uncertainty arising from TI or TD assumptions will provide a forecast that is more consistent with our understanding of earthquake occurrence.

3.2 Constructing the Seismicity Rate Model

The SRM will follow a traditional logic tree structure that combines fault-based models with distributed seismicity models. We will pursue multiple options within each of the two components of the SRM logic tree; these components will aim to capture different hypotheses of how to model the earthquake occurrence process (e.g. epistemic uncertainty). We will initially scope a broad range of models with an aim of reducing the broad set down to a representative and unbiased set.

The fault-based models will be developed using ‘inversion’ techniques, following the recipe and software developed for the United States Geological Survey Third California Earthquake Rupture Forecast (UCERF3) in their OpenSHA software engine. The main steps involved in developing the inversion models include: (1) development of rupture sets, (2) development of deformation models and (3) defining the constraints for the inversion. Informally, the inversion method has become known as the ‘Grand Inversion’. Questions remain about the level of complexity we will be able to include in each of the three steps.
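
The core of this inversion approach is a large constrained optimisation: find non-negative long-term rupture rates that reproduce fault-section slip rates (plus other constraints). Below is a toy sketch of that central idea, with invented numbers and a bare non-negative least-squares solve standing in for the far richer constraint set and simulated-annealing machinery used by UCERF3.

```python
import numpy as np
from scipy.optimize import nnls

# Toy system: 3 fault sections, 4 candidate ruptures.
# A[s, r] = average slip (m) on section s in rupture r; zero if the
# rupture does not involve that section. (Illustrative numbers only.)
A = np.array([[1.0, 0.0, 1.2, 2.0],
              [0.0, 1.5, 1.0, 2.2],
              [0.0, 0.0, 0.0, 2.5]])
target_slip_rates = np.array([0.005, 0.008, 0.002])   # m/yr per section

# Non-negative least squares: long-term rupture rates (events/yr) that
# best reproduce the section slip rates, the core idea of the inversion.
rates, residual = nnls(A, target_slip_rates)
print(rates)      # annual rate of each candidate rupture
print(A @ rates)  # implied section slip rates
```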

We envision that the biggest challenge to this ‘inversion’ approach will be the Hikurangi and Puysegur subduction zones; an inversion model has never been developed for a subduction interface before. Furthermore, the size and potential connectedness of the interface to crustal faults adds considerable complexity to the problem. If we are unable to develop an inversion model that includes Hikurangi and Puysegur, we will use a classical approach for the interfaces instead. Reducing the plausible ruptures on the interfaces, including joint ruptures with crustal faults, is a focus in the early part of the project. We will aim to allow for joint ruptures between each interface and its nearby crustal faults; however, this may prove too challenging in the next two years and may be reserved for future versions of the NSHM. Key research gaps include constraining the plausible joint ruptures and development (or applicability) of source-scaling relations. The Puysegur Subduction Zone will be treated with the same methods as the Hikurangi; however, with a much lower hazard and risk profile, it may be considered a lower priority and require simplification.

3.3 Seismicity Rate Model Work Packages

The primary components and work packages to develop the SRM include the following:

3.3.1 Inversion Models

Community Fault Model

A version 1.0 of the Community Fault Model (CFM) is in development. This CFM builds upon Litchfield et al. (2014) and aims to incorporate new and previously missing fault data through a series of community workshops in late 2020.

  • Seismogenic depth across New Zealand will be re-evaluated and is expected to contain large changes from previous models.
  • The NSHM fault model will be extracted from the CFM with simplification of complex geometries where necessary.
  • Uncertainties on dip and on seismogenic depth will be provided, and hazard sensitivity to these will be explored. Their impact may prove to be as significant as that of rupture length.

Crustal Rupture Sets

The aim of the rupture sets is to define the plausible ruptures that will be considered in the inversion. We will start with the constraints as implemented in UCERF3, with some adaptations as necessary for New Zealand.

  • Plausibility filters: The plausibility filters aim to reduce the rupture sets down to a number of ruptures that can be efficiently run using high-performance computing. We aim to adopt the UCERF3 plausibility filters as much as possible; however, New-Zealand-specific adaptations will be necessary. It is currently unknown if we will pursue Coulomb-based filters (UCERF4 will use a new implementation of the Coulomb filters).
  • Splays: UCERF3 was limited to only linear ruptures, which do not adequately represent known New Zealand ruptures. We will allow for possible splay ruptures and will work with UCERF4 on this.

Hikurangi Rupture Sets

Similar to the crustal faults, we must define the starting rupture sets along the entire interface. This is something that UCERF3 was not required to do, due to the tectonic setting of California.

  • Plausibility filters: Instead of starting with a nominally comprehensive set of ruptures, as for crustal faults, we will build a limited set of ruptures using a limited range of aspect ratios and reduce these down to a representative set.
    • We will also explore the impact of uncertainty in constraint of up-dip and down-dip locking of the interface and uncertainty in the shape of the interface (e.g. depth beneath Wellington).

Puysegur Rupture Sets
  • We will follow similar methods as for Hikurangi; however, with fewer constraints and a lower risk profile for New Zealand, Puysegur is considered a lower priority than Hikurangi in the 2022 revision.

Joint Crustal-Hikurangi Ruptures

This work is given a lower priority than developing independent ruptures. Potential implementation paths include: no joint ruptures, a limited selection of key joint ruptures (e.g. Wairarapa + Hikurangi) or a more extensive and systematic suite. The path chosen will depend on the time available once successful independent results have been confirmed. This will include developing an understanding of the sensitivity of hazard to joint ruptures.

Deformation Models

The deformation models provide the fundamental slip rate information for all faults in the fault model.

Earthquake Geology Deformation Model

Known slip-rate data are being compiled with the CFM and will be applied to fault sections. Uncertainty estimates will be included in the form of a best estimate with upper and lower credible bounds. Ideally, the Earthquake Geology Deformation Model will be independent of the Geodetic Deformation Model to maintain independence in the logic tree; however, for faults with unknown slip rate, we currently have no known approach other than using geodetic estimates. We anticipate developing a single Earthquake Geology Deformation Model.

Geodetic Deformation Models

We will focus on using backslip-based geodetic deformation models to determine contemporary moment accumulation rates on the faults in the fault model (i.e. we will be determining slip-deficit rates). The backslip rates will be obtained for all faults in the source model by fitting to geodetic strain rate models, using four different approaches to derive strain rates from the existing geodetic velocity field. A subset of the backslip models will be coupled with a traditional block modelling approach (constrained by the geodetic velocity field rather than strain rates), which will be used to estimate the backslip rates for the Hikurangi subduction zone and other major faults. We will pursue alternative approaches to partitioning the block model slip rates and backslip model slip rates to generate a suite of geodetically based slip-deficit rate models. This will allow us to understand the impact of modelling methodologies and data uncertainties on the variability and uncertainties in the strain rates and slip-deficit rates.
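
To make the link between slip-deficit rates and moment accumulation concrete, here is a minimal sketch; the rigidity value, patch dimensions and slip-deficit rate are illustrative assumptions, not model values.

```python
import numpy as np

MU = 3.0e10   # crustal rigidity (Pa); a commonly assumed value

def moment_rate(area_km2, slip_deficit_mm_yr, mu=MU):
    """Scalar moment accumulation rate (N*m/yr) for one fault patch."""
    return mu * (area_km2 * 1.0e6) * (slip_deficit_mm_yr * 1.0e-3)

# Hypothetical 30 km x 20 km patch accumulating 20 mm/yr of slip deficit
m0_rate = moment_rate(600.0, 20.0)
print(f"{m0_rate:.2e} N*m/yr")
# Equivalent Mw if 500 years of accumulation were released in one event
# (Hanks-Kanamori moment magnitude, M0 in N*m)
print((2.0 / 3.0) * (np.log10(m0_rate * 500.0) - 9.05))
```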

The output suite of deformation models will need to be reduced to a representative suite of models. We will investigate multiple methods to represent the model space in the least biased way (e.g. combining strain rate models, combining deformation models, developing a ‘backbone’ model and/or weighting a suite of models). We will also explore utilising the residuals to the backslip models as input to the background seismicity model to characterise off-fault deformation rates.

3.3.2 Distributed Seismicity Models

Distributed seismicity models allow for earthquakes on unknown faults. A recent study estimated that about 50% of major active faults in New Zealand may be unknown (Nicol et al. 2016). We will pursue two approaches: (i) hybrid gridded models and (ii) uniform area zones. 

Earthquake Catalogue

We will develop a revised earthquake catalogue based on the GeoNet catalogue but with homogenised Mw. The aim is to develop a catalogue that is homogeneous in magnitude but with variable completeness, back to the beginning of the GeoNet catalogue (~1800). Pre-1900 data may have to be used as is, and we will determine whether the large uncertainty introduced by these data is offset by sufficient information gain.
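
As an example of the kind of catalogue-dependent estimate such a product feeds, here is a sketch of completeness-aware Gutenberg-Richter b-value estimation using the standard maximum-likelihood estimator commonly attributed to Aki (1965), with the usual correction for binned magnitudes. The synthetic catalogue and cut-off are invented for illustration.

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.1):
    """Maximum-likelihood Gutenberg-Richter b-value, with the usual
    correction for magnitudes binned at dm. Sketch only."""
    m = np.asarray(mags)
    m = m[m >= m_c]                                   # completeness cut
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic catalogue: continuous GR law (b = 1) above the lower edge of
# the Mc = 4.0 bin, then binned at 0.1 magnitude units
rng = np.random.default_rng(42)
raw = 3.95 + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=5000)
mags = np.round(raw, 1)
print(b_value_aki(mags, m_c=4.0))                     # ~1.0
```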

Hybrid Model

We will use multiplicative hybrid modelling techniques to develop a gridded seismicity model that includes multiple data sources and other sources of uncertainty. The hybrid modelling technique allows for multiplicative or additive scaling of multiple models and data sets. The scaling is optimised in a learning period and tested in independent time periods. Testing will occur both forward and backward in time. How epistemic uncertainty from the hybrid will be expressed in the final SRM needs to be considered. Components of the final distributed hybrid model will likely include the following (a minimal sketch of the multiplicative combination follows this list):

  • Smoothed Seismicity (PPE; proximity to past earthquakes) – an approach similar to a density-based smoothed seismicity model.
  • Uniform Poisson model – typically used as a base model of least information of the hybrid.
  • Geodetic Strain Rate – four strain rate maps will be considered as input into the strain rate hybrid. The potential for double counting of strain between the Hybrid and the Geodetic Deformation Model will need to be understood and corrected.
  • EEPAS (every earthquake a precursor according to scale) – a clustering model that has peak information over roughly a 15–20 year time-frame. After 20 years, the information in the model decays. This model adds significant forecasting skill over smoothed seismicity models. The rate will be applied to the model by calculating, for example, a mean annual rate for the forecast time of interest (similar to conditional time-dependent rates on faults). We will not update the EEPAS rates during the life of the 2022 NSHM.
  • Other data inputs – proximity to mapped faults (PMF), proximity to plate interface (PPI).
  • Catalogue options:
    • Declustering. There is no uniquely correct declustering method, and the choice of method can have a significant impact on the final estimated hazard, particularly in regions dominated by the distributed seismicity model. We will explore multiple declustering methods and assess their impact on hazard estimates. If the Boyd (2012) method is used, we will need to ensure consistency with that method. Non-declustered catalogues will also be considered.
    • Time-dependence. The choice of the time-length of the catalogue used has an impact on the final hazard forecast. Any earthquake catalogue has insufficient data to ensure a random and robust sample and will be impacted by non-stationarity of earthquake rates. In other words, the longest time period may not be any more likely to represent long-term earthquake rates than a shorter time period. Time variability represents an uncertainty in the hazard estimates and will be explored. Additionally, magnitude uncertainty and the threshold of completeness increase significantly as the age of the catalogue increases, particularly pre-1900. Increased magnitude uncertainty may bias the hazard upwards. The trade-off of this increased uncertainty against the additional information the older data provide needs to be considered.
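
As referenced above, here is a minimal sketch of how a multiplicative hybrid combines gridded component models. The grid, component densities, exponents and normalisation are all illustrative assumptions; in the real workflow, the scaling is optimised over a learning period and tested out of sample.

```python
import numpy as np

def multiplicative_hybrid(baseline, covariates, exponents, total_rate):
    """Multiplicative hybrid of gridded rate models (sketch).

    baseline   : (n_cells,) base rate density (e.g. uniform Poisson)
    covariates : list of (n_cells,) component densities (PPE, strain, ...)
    exponents  : per-covariate scaling exponents (here fixed, not fitted)
    total_rate : target total rate used to renormalise the hybrid
    """
    rate = np.array(baseline, dtype=float)
    for cov, a in zip(covariates, exponents):
        rate *= np.power(cov, a)                  # multiplicative scaling
    return rate * (total_rate / rate.sum())       # renormalise total rate

# Toy grid of 5 cells: uniform base, a smoothed-seismicity covariate and
# a strain-rate covariate (all values hypothetical)
uniform = np.ones(5) / 5
ppe     = np.array([0.05, 0.10, 0.40, 0.30, 0.15])
strain  = np.array([0.10, 0.10, 0.30, 0.35, 0.15])
print(multiplicative_hybrid(uniform, [ppe, strain], [0.8, 0.5], total_rate=2.0))
```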

Regionalised and Uniform Area Zone Model

Past New Zealand NSHMs have relied solely on smoothed seismicity models to estimate distributed seismicity rates. Recent work (Gerstenberger and Schorlemmer, in revision) has shown that, for low seismicity regions, zones with constant rate are likely to have more forecast skill than smoothed seismicity models. We will therefore develop an alternative model to the hybrid based on uniform area zones that contain a uniform earthquake rate.

  • Gutenberg-Richter (GR) parameters: each zone will contain a single GR a-value and b-value. The 2012 NSHM used area zones for b-value calculations.
  • Seismotectonic zones.
  • Other seismicity parameters will also be regionalised and will be applied to both the uniform area zones and hybrid seismicity models for the hazard calculations:
    • Maximum magnitude within a zone.
    • Preferential strike: we will consider if there is evidence for preferential strike within a zone. A magnitude-dependent function will be considered.
    • Faulting mechanisms.
    • Hypocentral depth distributions.

3.3.3 Slab Model

Development of the slab models for Hikurangi and Puysegur has been a low priority to date. We currently consider two options: a 3D gridded model based on Stirling et al. (2012) and alternatives based on OpenQuake capabilities that require further investigation.

3.3.4 Conditional Aftershock Model

We will explore the appropriateness of the application of the Boyd (2012) conditional aftershock model, including any implications it may have for end-users. Note: Boyd (2012) is not a time-dependent clustering model and is not similar to what was done in the Canterbury Seismic Hazard Model (Gerstenberger et al. 2014, 2016). It does not require updating and does not model a specific aftershock sequence. This method simply acknowledges that the total NSHM rate of earthquakes is too low when aftershocks are not included.

3.3.5 Total Rate / Moment Balance

The total rate of earthquakes expected by the NSHM can be considered an output of the model (e.g. as in Stirling et al. 2012), or the NSHM output can be constrained based on expected rates, which is the more common approach. In the UCERF3 model, this constraint contributed the largest uncertainties when loss was considered. When used as a constraint, it implicitly includes time-dependence unless a truly long-term rate is used (whether such a rate can be determined remains unknown). We will consider a range of catalogue-, geodetic- and geology-based constraints. Both moment- and rate-based constraints will be considered.
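
As a sketch of a moment-balance check of this kind (comparing the moment rate implied by a truncated Gutenberg-Richter forecast against, say, a geodetic target), with a/b values, magnitude range and target invented for illustration:

```python
import numpy as np

def gr_moment_rate(a_value, b_value, m_min, m_max, dm=0.1):
    """Total seismic moment rate implied by a truncated Gutenberg-Richter
    forecast (sketch), summing incremental annual rates per bin."""
    mags = np.arange(m_min, m_max + dm / 2.0, dm)
    # Incremental annual rate in each magnitude bin from cumulative GR
    rates = 10.0 ** (a_value - b_value * mags) * (1.0 - 10.0 ** (-b_value * dm))
    moments = 10.0 ** (1.5 * mags + 9.05)          # Hanks-Kanamori (N*m)
    return float((rates * moments).sum())

# Hypothetical GR budget vs a geodetic moment-rate target
forecast = gr_moment_rate(a_value=5.0, b_value=1.0, m_min=5.0, m_max=8.5)
geodetic_target = 1.0e19                           # N*m/yr (illustrative)
print(forecast, forecast / geodetic_target)        # balance check
```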

3.3.6 Magnitude Scaling Relations

Epistemic uncertainty in the scaling of magnitude as a function of source rupture area will be a feature of the new source model. Multiple relations will be assembled for crustal fault sources and also for interface sources. Weighting for the relations will be informed by their performance in residual analyses. A flatfile of historical large-to-great earthquakes from 1990 onwards is being assembled to provide the basis for the residual analyses.
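
A sketch of the residual-based weighting idea follows; the generic magnitude-area form, the coefficients, the residual standard deviation and the events are all hypothetical placeholders, not the relations that will actually be assembled.

```python
import numpy as np

# Candidate magnitude-area scaling relations of the generic form
# Mw = log10(A) + c (coefficients here are illustrative placeholders)
relations = {"rel_a": 4.0, "rel_b": 4.2}

def relation_weights(areas_km2, observed_mw, relations, sigma=0.25):
    """Weight scaling relations by their Gaussian likelihood against a
    flatfile of observed events (sketch; sigma is an assumed residual
    standard deviation)."""
    weights = {}
    for name, c in relations.items():
        resid = observed_mw - (np.log10(areas_km2) + c)
        loglike = -0.5 * np.sum((resid / sigma) ** 2)
        weights[name] = np.exp(loglike)
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

# Three hypothetical events (rupture area km^2, observed Mw)
areas = np.array([1000.0, 5000.0, 20000.0])
mw    = np.array([7.0, 7.9, 8.3])
print(relation_weights(areas, mw, relations))
```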

4 Ground Motion Characterisation Model

4.1 2010 National Seismic Hazard Model

The GMCM largely defines what the details of the outputs of the NSHM look like. For example, the GMCM provides models for a suite of ground-motion intensity measure types, which are of interest to NSHM users. For the 2010 NSHM, the GMCM consisted of the single ground-motion model of McVerry et al. (2006). The McVerry et al. (2006) ground-motion model provided predictions for peak ground acceleration (PGA) and absolute acceleration response spectra at oscillator periods between 0.075 and 3 s.

Ground motion aleatory uncertainty was considered in the 2010 NSHM via the McVerry et al. (2006) standard deviation model but no epistemic uncertainty was considered. With no consideration of epistemic uncertainty, and also influenced by how the SRM was constructed, the outputs of the 2010 NSHM were considered ‘best estimate’ and not explicitly mean hazard.

4.2 Goals for National Seismic Hazard Model Revision

The NSHM revision will improve on the GMCM used for the 2010 NSHM. Particular areas for improvement are:

  • to provide models for a wider range of intensity measures that are of interest to NSHM users
  • to have a model that better utilises recorded New Zealand ground-motion data, and
  • to provide a modern characterisation of epistemic uncertainty.

The latter point is critical for the NSHM to provide estimates of epistemic uncertainty in the hazard results.

The programme of work for the GMCM has yet to be defined beyond the six-month period between August 2020 and January 2021. In this time, the GMCM working group has two goals: to develop a ‘minimum viable product’ GMCM and to decide the work priorities for the following 18 months.

A ‘minimum viable product’ GMCM is a draft model that may not be technically satisfying but stitches the pieces of work together in an efficient manner and is able to produce hazard outputs. The benefits of a minimum viable product are that it allows us to determine the type of computational resources we need to run the model, define data format and storage solutions and design the architecture of the software for an efficiently maintained data product.

To inform the work priority decisions for the following 18 months, a suite of tests will be performed. These tests will centre around evaluating the performance of existing models against New Zealand data and the sensitivity of hazard results to various aspects of existing ground-motion models. By testing the models against data, we will understand if New Zealand data differs markedly from existing ‘global’ ground-motion models and whether adjustments of these models are necessary for reliable application in New Zealand. The hazard sensitivity tests will help us to understand which aspects of ground-motion models have the greatest influence on the final results, for example:

  • How important is the large-magnitude scaling of Hikurangi subduction interface ground-motions to the seismic hazard in New Zealand?
  • How important are distant Hikurangi and Kermadec interface motions to the long-period seismic hazard in Auckland?
  • Do non-linear soil response models greatly influence hazard results in New Zealand’s highest hazard regions?
  • Do elaborate epistemic uncertainty models influence hazard greatly at probabilities of exceedance that are of interest to New Zealand’s NSHM users, as opposed to simpler methods?
  • Do path attenuation effects greatly influence hazard results in low-to-moderate seismicity regions of New Zealand?

Depending on how sensitive the hazard results are to these and other effects, we will prioritise our subsequent work accordingly.
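
One common way to score ground-motion models against data, in the spirit of the testing described above, is an average log-likelihood of the observations under each model's log-normal prediction. Here is a sketch with invented observations and predictions; the log2 convention follows widely used model-selection scores, but the project's actual tests may differ.

```python
import numpy as np
from scipy.stats import norm

def llh_score(obs_ln_im, pred_mean_ln, pred_sigma_ln):
    """Average negative log2-likelihood of observed ground motions under a
    model's log-normal prediction (lower is better)."""
    ll = norm.logpdf(obs_ln_im, loc=pred_mean_ln, scale=pred_sigma_ln)
    return -np.mean(ll / np.log(2.0))              # bits per observation

# Hypothetical: compare two candidate models against the same records
obs = np.log(np.array([0.12, 0.30, 0.08, 0.22]))   # observed PGA (g)
m1  = (np.log(np.array([0.10, 0.28, 0.09, 0.20])), 0.6)
m2  = (np.log(np.array([0.20, 0.50, 0.15, 0.40])), 0.6)
for name, (mu, sig) in {"model_1": m1, "model_2": m2}.items():
    print(name, llh_score(obs, mu, sig))
```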

4.3 Minimum Ground Motion Characterisation Model

To get to the minimum viable product GMCM, the working group has separated into three sub-groups, each working on:

  • the ground-motion database
  • the ground-motion models, and
  • the previously described hazard sensitivity studies.

New Zealand already has an existing ground-motion database (Van Houtte et al. 2017; Kaiser et al. 2017) that contains nearly all of the major events recorded in New Zealand (Mw approximately greater than 4.5). There is a general desire from the GMCM working group to expand the database to include smaller-magnitude data, thereby increasing the database size. Smaller-magnitude data tend to better constrain certain aspects of the model, particularly the spatial variation of path attenuation and site-specific amplification at seismic recording stations, although these data also come with modelling issues. Perhaps the primary benefit of updating the ground-motion database is to incorporate the large quantity of measured site information (e.g. the Vs30, Z1 and Z2.5 parameters) that has been collected in the past few years. This will allow the models to better partition uncertainty across source, path and site terms and possibly improve median model predictions.

Another improvement to the New Zealand ground-motion database will be a parallel database of simulated shaking estimates that can be used jointly or separately from the observational database. Many GMCMs around the world have simulations underpinning the ground-motion models, particularly the hanging wall and non-linear soil response models underpinning the NGA-West2 models. Unfortunately, these simulated data are not typically made available alongside the recorded ground-motion data, which precludes adjustments and updates of the GMCMs as simulated data methods improve. Any simulated data used in this project will be included in the ground-motion database. The six-month workplan includes setting up this workflow, but how simulated data will be used in the project will become clearer after the hazard sensitivity studies are completed.

The ground-motion models working group is initially focusing on collating all available models and incorporating them into the OpenQuake hazard software. With the recent publication of parts of the NGA-Subduction project, many new subduction interface models are now available, including some specifically fit to New Zealand data. The group will then compare available models (crustal, subduction interface and subducted slab models) to New Zealand data to ascertain whether adjustments are necessary. These adjustments can be facilitated by the Hassani and Atkinson (2017) approach, where adjustments for ‘stress drop’, path attenuation, site attenuation and crustal amplification are simple to apply. An initial challenge will be determining whether it is necessary to derive a reference rock profile for New Zealand and whether the reference rock profile needs to be constrained with independent datasets.

4.4 Hazard Outputs

The GMCM largely defines the type of outputs provided by the NSHM. The working group is currently working under the assumption of providing:

  • Models for peak ground velocity (PGV), PGA and 5%-damped pseudo-acceleration spectra for 22 oscillator periods between 0.01 and 10 s. These models can be used for mean hazard, hazard disaggregation and mean magnitudes, at a minimum.
  • These hazard metrics will be provided for average horizontal motions (‘RotD50’) and maximum horizontal motions (‘RotD100’) to take into account horizontal polarisation of ground motion, as well as the ‘larger of two as-recorded horizontal components’, for consistency with NZS1170.5:2004 (a minimal RotD sketch follows this list).
  • Hazard estimates as a function of Vs30. Such parameterisation of site effects is very common overseas but largely incompatible with the NZS1170.5:2004 site classes. The NSHM Core Team is liaising with the TAG to determine how best to address this issue.
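
As referenced above, here is a sketch of the RotD idea applied to peak ground acceleration: rotate the two horizontal components through a sweep of azimuths, take the peak at each angle and then the median (RotD50) or maximum (RotD100) over angles. Real implementations compute these per oscillator period from response histories; the synthetic record here is purely illustrative.

```python
import numpy as np

def rotd(acc_x, acc_y, percentiles=(50, 100), n_angles=180):
    """RotD values of peak ground acceleration from two orthogonal
    horizontal components (sketch)."""
    angles = np.radians(np.arange(n_angles))
    # Peak absolute amplitude of the rotated trace at each azimuth
    peaks = [np.max(np.abs(acc_x * np.cos(t) + acc_y * np.sin(t)))
             for t in angles]
    return np.percentile(peaks, percentiles)

# Synthetic two-component record
rng = np.random.default_rng(1)
ax, ay = rng.normal(0, 0.1, 2000), rng.normal(0, 0.08, 2000)
rotd50, rotd100 = rotd(ax, ay)
print(rotd50, rotd100)   # median and maximum over rotation angles
```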

Additional hazard outputs may be provided, subject to budgetary constraints. These additional outputs will be prioritised with guidance from the end-user group on the TAG and include conditional mean spectra (and other conditional intensity measures), inelastic response spectra and other values of damping.

4.5 Wellington Basin

Sedimentary basins around New Zealand are known to amplify and modify earthquake ground motions. In Wellington, amplification effects arising from the propagation of waves through the Wellington basin were identified as one factor likely to have exacerbated damage during the Kaikōura earthquake (Bradley et al. 2018; MBIE 2017). Basin amplification effects were observed in the 1–2 s spectral period range, corresponding to typical fundamental resonant periods of mid-rise structures. The geometry and sediment fill of the Wellington basin are relatively well characterised (Kaiser et al. 2019) and provide a good case study to investigate ways to model local site and basin amplification effects not always fully captured in NSHMs. We aim to gain an understanding of how well new NSHM ground-motion modelling captures basin amplification effects and to trial advanced approaches to model these specific effects in Wellington.

Based on the lessons learned, we will provide a suggested roadmap (white paper) to improve how we capture local basin amplification effects in urban seismic hazard nationally.

To develop an understanding of what the final roadmap might look like, we will initially focus on the following:

  • Detailed Vs30 maps of the Wellington basin, an updated velocity model and the definition of generic velocity profiles, as needed by the GMCM modelling sub-group.
  • Simulated 3D ground motions from large events impacting the Wellington basin (e.g. Hikurangi interface, Wellington Fault, Kaikōura earthquake)
  • Non-ergodic linear site effects terms at strong motion stations and assessment of how well site/basin effects are captured by modern and updated NSHM ground-motion models
  • Assessment of spatially variable amplification by merging results of the above.

5 National Seismic Hazard Model Testing

Useful testing of the forecast skill of seismic hazard models remains a difficult challenge. Statistical tests of model components will be included in both the SRM and GMCM model development (e.g. retrospective testing of geodesy and catalogue-based models). A later stage of the project will test the hazard estimates produced from the final 2022 NSHM against observations. A focus will be ground-motion-based testing at sites around New Zealand (e.g. Stirling and Petersen 2006; Stirling and Gerstenberger 2010). Sensitivity tests will also need to be conducted in both hazard and risk space.
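
A sketch of one simple site-based consistency check of the kind used in ground-motion-based hazard testing: count observed exceedances of a shaking level at a site and compare against the model's predicted Poisson rate. The rates and counts below are invented for illustration.

```python
from scipy.stats import poisson

def exceedance_test(n_observed, annual_rate, years):
    """Two-sided Poisson consistency check of observed ground-motion
    exceedances at a site against the model's predicted rate (sketch)."""
    expected = annual_rate * years
    # Probability of an outcome at least as extreme as observed
    p_low = poisson.cdf(n_observed, expected)
    p_high = poisson.sf(n_observed - 1, expected)
    return expected, min(1.0, 2.0 * min(p_low, p_high))

# Hypothetical site: model predicts 0.02 exceedances/yr of a PGA level,
# 3 exceedances observed in 50 years of recording
print(exceedance_test(3, 0.02, 50.0))   # expected count, p-value
```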

6 Service Delivery

We have five primary Service Delivery objectives for the NSHM revision (Objectives 1–5 below). In order to achieve the first four, we will be working closely with the SRM and GMCM working groups to ensure that their science is incorporated, stored and documented in a way that makes sense to them and supports the aims listed below. The final objective, in particular, relies heavily on input from the TAG and wider end-user communities.

In addition to the development of a revised seismic hazard model and delivery of useful and usable results, we are working to support the following aims for the project:

  • Reliability through implementation of best-practice development and systems, where possible. GitHub is our primary tool in ensuring our codebase is version controlled and tested.
  • Transparency through collaboration, testing and documentation and open-source data, models and results (e.g. use of GitHub, Slack).
  • Accessibility to data, models and results for both internal and external users and end-users.

Objective 1: The Grand Inversion and OpenSHA

The development of inversion models by the SRM working group follows the same (or a similar) recipe as UCERF3 and uses software developed for UCERF3 and OpenSHA. The main steps for this group include:

  • developing tools to translate the New Zealand CFM into a format appropriate for OpenSHA
  • ensuring that we are able to use visualisation tools such as SCEC-vdo in order to understand and analyse our results
  • working with the OpenSHA team to understand the OpenSHA codebase (including new code to enable inversion modelling of a subduction zone) and its computational requirements, in order to implement New-Zealand-specific constraints and to set up sufficient infrastructure to test and run code to create rupture sets consistently
  • producing rupture sets, and
  • testing and benchmarking the inversion procedure to validate/confirm technical feasibility of the different options for SRM fault-based models (see Figure 3.2 of the full report).

Objective 2: Hazard Calculation

We consider OpenQuake and OpenSHA to be our main options for seismic hazard calculation engines.

Currently, our preference is to use OpenQuake. The GMCM working group is using OpenQuake for the hazard sensitivity analysis to inform its work planning and is developing new New-Zealand-relevant ground-motion models for this engine. OpenQuake’s wide international user base, integration with risk and current uptake by New Zealand stakeholders are also compelling reasons for this preference. However, several concerns remain:

  • Additional work is required to efficiently use inversion models in OpenQuake. Note: work has begun between the USGS and the Global Earthquake Model Foundation / OpenQuake on this topic.
  • Uncertainty surrounding the long-term plans for continued development of OpenQuake.

As an alternative, we are confident that OpenSHA will be able to ingest the source models that the SRM working group is aiming to develop. If it is necessary to use only OpenSHA, our understanding is that the GMCM working group will be able to implement any new models into that engine; however, this needs to be better understood.
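
For orientation, the classical hazard integral both engines implement can be sketched in a few lines. The toy source model and ground-motion parameters below are invented; real engines add logic trees, spatial integration and much more.

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(imls, rupture_rates, ln_medians, sigma_ln):
    """Classical PSHA hazard curve (sketch): annual probability that
    shaking exceeds each level, combining all ruptures as a Poisson
    process."""
    ln_imls = np.log(imls)[:, None]
    # P(IM > iml | rupture) from a log-normal ground-motion distribution
    p_exceed = norm.sf((ln_imls - ln_medians[None, :]) / sigma_ln)
    rate = p_exceed @ rupture_rates        # annual exceedance rate
    return 1.0 - np.exp(-rate)             # annual probability of exceedance

imls = np.array([0.05, 0.1, 0.2, 0.4, 0.8])    # PGA levels (g)
rates = np.array([1e-2, 2e-3, 5e-4])           # ruptures/yr (toy)
medians = np.log(np.array([0.05, 0.15, 0.35])) # median PGA per rupture (g)
print(hazard_curve(imls, rates, medians, sigma_ln=0.6))
```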

We still need to:

  • Determine and set up a consistent environment in which we will develop on and run these engines.
  • Evaluate and implement a storage system for inputs and results.

Objective 3: Tools and Infrastructure

In order to achieve our objectives for this project, we are making use of a number of tools to support development and implementation of the NSHM and different aspects of its science. As key examples, we are currently:

  • using GitHub for code storage, version control and documentation;
  • implementing a continuous integration system around our versions of OpenSHA and OpenQuake in order to run tests and verify that changes we make to those codebases are working correctly;
  • exploring higher-power computing options for producing rupture sets and eventually running hazard calculations (e.g. an internal GNS Science cluster, New Zealand supercomputing resources, cloud resources); and
  • using Jupyter notebooks as a tool for developing repeatable tests and sharing codes that can be easily run by less technical users.

Objective 4: Storage and Sharing

Other important components that underpin our ability to achieve any of the outcomes of this project are how we store and enable sharing of data and models and how we manage that information and changes to it. These components need to be stored in a way that is flexible, reliable, secure and accessible by the intended users and must be clearly and reliably documented.

Currently, we are exploring cloud storage solutions such as Amazon S3, which would enable scalable, secure, accessible and reliable storage that we can use to develop appropriate solutions for sharing our data and models.

We are also evaluating our datasets, models and codebases in order to develop appropriate data management plans for them. These plans will centrally record our storage and access solutions (including GitHub), as well as evaluate maintenance and update expectations, risks and procedures around these components.

Objective 5: Results Delivery

A primary output of this project is the dissemination of useful and usable results to our end-users via a publicly available, web-based tool. In order to do this successfully, we need to:

  • Determine the infrastructure needs of a web tool that can:
    • provide necessary documentation and clearly communicate liability and any limitations/constraints of the results being provided
    • accept results from any calculation engine we are required to use and appropriately access our storage solutions, and
    • deliver static results and potentially dynamically calculate results with user input in a user-friendly way.
  • Confirm with end-user communities that the outputs we believe we need to provide and the formats in which they are provided are appropriate. We are currently aiming to provide:
    • The NSHM sub-components, including SRM component models, GMCM component models (e.g. via OpenQuake) and the logic-tree specifications.
    • Hazard curves: annual probability of exceedance versus shaking (in terms of spectral acceleration with 5% damping) by location, site condition and period.
    • Hazard curves with some level of uncertainty to be determined.
    • Hazard spectra: shaking versus period by location, site condition and probability of exceedance.
    • Hazard spectra with some level of uncertainty to be determined.
    • Hazard maps showing shaking by probability of exceedance, period and for various site conditions (using Vs30, Z1 and Z2.5).
    • Sets of results that are appropriate for risk will also be considered and made available.
    • The above results will be provided in CSV and HDF5 formats, ideally with appropriate plots, tables and maps.

Ideally, we would also like to include:

  • Disaggregation:
    • Average magnitudes usable by geotechnical engineers.

Other items on the table but not necessarily in scope for this two-year project include:

  • vertical spectra
  • inelastic response spectra
  • conditional mean spectra (and other conditional intensity measures)
  • other values of damping, and
  • measures of duration.

For the Reference list and the response by the TAG to the framework, including comments and recommendations, see the full report at https://www.gns.cri.nz/NSHM/project-outputs 

Tags: #NSHM #Seismic

Published: 16/12/2020
Author: Matt Gerstenberger
Collection: NZ Geomechanics News > Issue 100 - December 2020
Location: New Zealand
Type: Technical
ISSN: 0111-6851
