
Bayesian Elastic Full-Waveform Inversion using Hamiltonian Monte Carlo
  • Lars Gebraad, ETH Zürich (corresponding author: lars.gebraad@erdw.ethz.com)
  • Andreas Fichtner, ETH Zürich

Abstract

We develop a Hamiltonian Monte Carlo (HMC) sampler that solves multi-parameter elastic full-waveform inversion (FWI) in a probabilistic setting for the first time. This gives novel access to the full posterior distribution for this type of highly non-linear inverse problem. Typically, FWI has relied on gradient-descent methods with appropriate regularization to iteratively update models towards a minimum-misfit solution; non-uniqueness and uncertainties are mostly ignored in this approach. Bayesian inversions offer an alternative by assigning a probability to each model in model space, given the data and prior constraints. Their drawback is the need to evaluate a very large number of models. Markov chain Monte Carlo random walks counter this effect by exploring only those regions of model space where the probability is significant. The HMC method additionally incorporates gradient information, i.e. local structure, which is typically available in numerical waveform tomography experiments. So far, HMC has only been implemented for acoustic FWI. We implement HMC for several 2D elastic FWI set-ups. Using parallelized wave propagation code, wavefields and kernels are computed on a regular numerical grid and projected onto basis functions. These gradients are subsequently used to explore the posterior distributions of different target models using HMC. The free parameters in these experiments are P velocity, S velocity, and density. Although the Hamiltonian dynamics in the resulting phase space are only approximated numerically, the results of the Markov chain are nevertheless very insightful. No prior tuning of kernels, data, or model space is required, provided that the sampler itself is properly tuned. After a burn-in phase, during which the mass matrix is iteratively optimized, the Markov chain is run on multiple nodes. After approximately 100,000 samples (combined from all nodes), the Markov chain mixes well. The resulting samples give access to the full posterior distribution, including the mean and maximum-likelihood models, conditional probabilities, inter-parameter correlations, and marginal distributions.
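To make the sampling scheme described in the abstract concrete, the following is a minimal, illustrative HMC sketch in Python/NumPy. It is not the authors' implementation: the function names, the diagonal mass matrix, and the step-size and trajectory-length values are assumptions chosen for illustration. In an elastic FWI setting, log_posterior and grad_log_posterior would wrap the waveform misfit and its adjoint-based gradient, projected onto the chosen basis functions.

    # Illustrative HMC sketch with a diagonal mass matrix (hypothetical names, not the authors' code).
    import numpy as np

    def hmc_sample(m0, log_posterior, grad_log_posterior, mass_diag,
                   n_samples=1000, n_leapfrog=20, step_size=0.01, rng=None):
        """Draw samples with Hamiltonian Monte Carlo using a diagonal mass matrix."""
        rng = np.random.default_rng() if rng is None else rng
        m = np.asarray(m0, dtype=float)
        samples = []

        def hamiltonian(model, momentum):
            # Total energy H = U + K with potential U = -log posterior and kinetic K = p^T M^-1 p / 2.
            return -log_posterior(model) + 0.5 * np.sum(momentum**2 / mass_diag)

        for _ in range(n_samples):
            # Draw an auxiliary momentum from N(0, M).
            p = rng.normal(size=m.size) * np.sqrt(mass_diag)
            m_new, p_new = m.copy(), p.copy()

            # Leapfrog integration of the Hamiltonian dynamics.
            p_new += 0.5 * step_size * grad_log_posterior(m_new)
            for _ in range(n_leapfrog - 1):
                m_new += step_size * p_new / mass_diag
                p_new += step_size * grad_log_posterior(m_new)
            m_new += step_size * p_new / mass_diag
            p_new += 0.5 * step_size * grad_log_posterior(m_new)

            # Metropolis accept/reject on the change in total energy.
            if np.log(rng.uniform()) < hamiltonian(m, p) - hamiltonian(m_new, p_new):
                m = m_new
            samples.append(m.copy())
        return np.array(samples)

The posterior statistics mentioned above (mean model, inter-parameter correlations, marginal distributions) can then be estimated directly from the returned samples, e.g. samples.mean(axis=0) and np.corrcoef(samples.T).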
Published in Journal of Geophysical Research: Solid Earth, volume 125, issue 3, March 2020. DOI: 10.1029/2019JB018428