Friday session 1 (Zoom) (13:00–14:40 GMT)

View slides (pdf) (available under a CC BY 4.0 license)

Inverse problem applications often require finding the minimizer and Hessian of an optimization problem with partial differential equation constraints. Computing the Hessian of the cost functional is useful for estimating the uncertainty in the inverse problem solution from both deterministic and Bayesian points of view. Direct computation of the Hessian, however, is prohibitively expensive for high-dimensional inverse problems. We present a computational algorithm that recovers the Hessian as a by-product of solving the inverse problem, at practically no additional cost. It is based on an inexact Newton method in which conjugate gradient (CG) inner iterations solve for the Newton update in each outer iteration. As an iterative matrix solver, CG has the advantage that its short-term recurrence preserves global conjugacy of the search directions, so conjugacy holds without revisiting prior directions. By saving the conjugate directions and the action of the Hessian on those directions, we show that the full Hessian can be recovered while computing the minimizer. We present the algorithm in weak form in Hilbert space and implement it in FEniCS. We verify the implementation on simulated inverse problems of modest size and demonstrate its applicability to real data in an ultrasound elastography application.
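The core idea can be illustrated in finite dimensions: if an SPD Hessian H and a set of H-conjugate directions p_i spanning the space are available, then H = Σ_i (H p_i)(H p_i)ᵀ / (p_iᵀ H p_i), since all cross terms vanish by conjugacy. Below is a minimal NumPy sketch of this reconstruction from the directions and Hessian actions generated by CG; it is a hypothetical illustration, not the talk's FEniCS implementation, and runs a fixed n iterations (early termination would yield only a low-rank approximation of H).

```python
import numpy as np

def cg_with_hessian_recovery(H_action, b, n):
    """Solve H x = b by CG, saving each conjugate direction p_k and its
    Hessian action H p_k, then reconstruct H from those saved pairs.
    H_action need only provide matrix-vector products (matrix-free)."""
    x = np.zeros(n)
    r = b.copy()          # residual of the zero initial guess
    p = r.copy()          # first search direction
    dirs, actions = [], []
    for _ in range(n):    # n steps: exact solve and full-rank recovery
        Hp = H_action(p)
        dirs.append(p.copy())
        actions.append(Hp.copy())
        alpha = (r @ r) / (p @ Hp)
        x = x + alpha * p
        r_new = r - alpha * Hp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    # Conjugacy of the p_k gives H = sum_k (H p_k)(H p_k)^T / (p_k^T H p_k)
    H_rec = sum(np.outer(Hp, Hp) / (p @ Hp) for p, Hp in zip(dirs, actions))
    return x, H_rec

# Usage on a small synthetic SPD system
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
H = A @ A.T + 6 * np.eye(6)          # well-conditioned SPD "Hessian"
b = rng.standard_normal(6)
x, H_rec = cg_with_hessian_recovery(lambda v: H @ v, b, 6)
```

The saved pairs cost one extra vector copy per CG iteration; the Hessian actions H p_k are already computed by CG itself, which is why the recovery comes at practically no additional cost.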