Working group leader: Karin Sigloch

When all random processes are described by Gaussian random variables, fast analytic expressions for the posterior model can be employed. This is most likely not the case for either the data noise or the approximations in the forward theory.
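To make the Gaussian special case concrete, here is a minimal sketch of the closed-form posterior update for a hypothetical scalar linear problem d = G m + noise, with a Gaussian prior on m and Gaussian noise; all symbols and numerical values are illustrative assumptions, not taken from the text.

```python
import numpy as np

def gaussian_posterior(G, d, m0, sigma_m, sigma_d):
    """Closed-form posterior mean and std for a scalar linear Gaussian model.

    Prior: m ~ N(m0, sigma_m^2); noise: N(0, sigma_d^2); data: d = G*m + noise.
    """
    prec = 1.0 / sigma_m**2 + G**2 / sigma_d**2        # posterior precision
    mean = (m0 / sigma_m**2 + G * d / sigma_d**2) / prec
    return mean, 1.0 / np.sqrt(prec)

# Illustrative numbers only: one datum, weak prior, moderately accurate data.
mean, std = gaussian_posterior(G=2.0, d=4.2, m0=0.0, sigma_m=1.0, sigma_d=0.5)
```

No sampling is needed here: the posterior is obtained in one analytic step, which is exactly what is lost once the noise or the forward theory is non-Gaussian.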

The only way forward is then to employ sampling techniques. These are well studied in the mathematical literature, but they require a fast solution of the forward problem to overcome the curse of dimensionality caused by the large number of parameters to be determined. The strongest approximations are therefore required to make the forward problem computationally tractable.

It is essential to include a realistic description of the uncertainty in ray-theoretical approximations; otherwise the posterior probability density could be seriously biased. On the sampling side, much progress has been made in high dimensions, for example with nested sampling algorithms such as MultiNest and with Markov chain Monte Carlo methods.
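The sampling step itself can be sketched with a textbook random-walk Metropolis sampler; the 2-D standard normal used as the target is a stand-in for a real posterior in which every evaluation of `log_post` would require one forward-model solve, which is why forward-model speed dominates the cost.

```python
import numpy as np

def metropolis(log_post, m0, n_steps, step=0.5, rng=None):
    """Random-walk Metropolis: log_post is the (unnormalised) log posterior."""
    rng = rng or np.random.default_rng(0)
    m = np.asarray(m0, dtype=float)
    lp = log_post(m)
    chain = []
    for _ in range(n_steps):
        prop = m + step * rng.standard_normal(m.shape)   # random-walk proposal
        lp_prop = log_post(prop)                          # one forward solve
        if np.log(rng.random()) < lp_prop - lp:           # accept/reject
            m, lp = prop, lp_prop
        chain.append(m.copy())
    return np.array(chain)

# Toy target: 2-D standard normal (a placeholder for a real posterior).
chain = metropolis(lambda m: -0.5 * np.sum(m**2), np.zeros(2), 5000)
```

Even this toy run makes 5000 calls to the target density, illustrating why sampling in high dimensions is infeasible unless each forward evaluation is cheap.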

A fundamental question is whether fast approximate forward algorithms with appropriate uncertainty quantification give satisfactory solutions with enough structural detail, or whether we need to include better forward theories, which in a Monte Carlo application are computationally exorbitant. A solution to this problem might lie in surrogate modelling.

The idea is to replace the expensive forward theory with a computationally cheap surrogate based on neural networks or support vector machines. The surrogate, although not exact, captures most of the complexity of the expensive forward problem and is considerably more accurate than ray-theoretical approximations.
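As a minimal sketch of this idea, the following trains a support vector regressor as a cheap surrogate; the `expensive_forward` function below is a smooth nonlinear toy map standing in for a costly simulation, and all parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

def expensive_forward(M):
    """Toy stand-in for an expensive forward simulation (one datum per model)."""
    return np.sin(3 * M[:, 0]) + M[:, 1] ** 2

rng = np.random.default_rng(0)
M_train = rng.uniform(-1, 1, size=(500, 2))     # design points in model space
d_train = expensive_forward(M_train)            # 500 expensive solves, done once

# Cheap surrogate trained once, then reused for every sampling step.
surrogate = SVR(C=100.0, epsilon=0.01).fit(M_train, d_train)

M_test = rng.uniform(-1, 1, size=(100, 2))
err = np.max(np.abs(surrogate.predict(M_test) - expensive_forward(M_test)))
```

The expensive model is evaluated only at the training design points; inside the sampler, each step then costs a surrogate prediction instead of a full simulation.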

The surrogate's own uncertainty can also be modelled at no additional cost. While this approach is quite common in complex engineering problems, the concept of surrogates still needs to be fully investigated for the imaging and source-modelling problems.
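One way the surrogate uncertainty comes essentially for free is with a Gaussian-process surrogate, whose predictive standard deviation is produced by the same call as the prediction. The sketch below uses a 1-D version of the same toy forward map; the kernel choice and query points are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_forward(m):
    """1-D toy stand-in for an expensive forward simulation."""
    return np.sin(3 * m)

m_train = np.linspace(-1, 1, 25)[:, None]       # training design points
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3))
gp.fit(m_train, expensive_forward(m_train[:, 0]))

# One call yields both the prediction and its standard deviation, so the
# surrogate error can enter the posterior uncertainty budget directly.
m_query = np.array([[0.1], [2.0]])              # in- vs out-of-sample point
pred, std = gp.predict(m_query, return_std=True)
```

The predictive standard deviation grows away from the training design, flagging regions of model space where the surrogate should not be trusted and where extra expensive forward solves would pay off most.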