Most inverse problems are ill-posed because inputs such as parameters, physics, and data are missing or inconsistent. As a result, solution estimates are not unique or are unstable, i.e. small changes in the inputs produce large changes in the estimates. One common approach to resolving ill-posedness is regularization, whereby information is added to the problem so that the data are not over-fitted. Alternatively, one can take the Bayesian point of view, assign a probability distribution to the unknowns, and estimate it with Monte Carlo techniques.
In this work we take the regularization approach and use uncertainties to weight the added information and the data in an optimization problem. This allows us to apply statistical tests under the null hypothesis that the inputs combine within their uncertainty ranges to produce estimates of the unknowns. For example, the Discrepancy Principle can be viewed as a chi-squared test used to determine the regularization parameter.
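As an illustration (my sketch, not material from the talk): for a linear Tikhonov problem with m data and a known noise standard deviation sigma, the Discrepancy Principle chooses the regularization parameter so that the squared data residual matches its expected value m·sigma², the mean of a sigma²-scaled chi-squared variable with m degrees of freedom. The operator, signal, and noise level below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ill-posed linear problem with a known singular spectrum.
m = n = 60
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = 10.0 ** np.linspace(0, -6, n)        # singular values decay over 6 decades
A = (U * s) @ U.T                        # A = U diag(s) U^T
x_true = np.ones(n)
sigma = 0.01                             # known noise standard deviation
b = A @ x_true + sigma * rng.standard_normal(m)

def data_residual2(lam):
    """Squared data residual of the Tikhonov solution
       x_lam = argmin ||A x - b||^2 + lam ||x||^2."""
    x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    r = A @ x - b
    return r @ r

# Discrepancy Principle: find lam with ||A x_lam - b||^2 = m * sigma^2.
# The residual grows monotonically with lam, so bisect on log(lam).
target = m * sigma**2
lo, hi = 1e-12, 1e2
for _ in range(200):
    mid = np.sqrt(lo * hi)
    if data_residual2(mid) < target:
        lo = mid
    else:
        hi = mid
lam_dp = np.sqrt(lo * hi)
```

Only the noise statistics enter the test: lam_dp is accepted when the data are fit to within their uncertainty, not better.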
The chi-squared method developed by my colleagues and me uses a chi-squared test similar to the Discrepancy Principle, but differs in that the test is applied to the regularized residual rather than the data residual. This leads to a general methodology of using statistical tests to estimate regularization parameters or uncertainties in an inversion. I will give statistical tests for nonlinear algorithms and show results from benchmark problems in geophysics. I will also describe how statistical tests can be used to find a regularization parameter for Total Variation regularization and show results from imaging.
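A minimal sketch of testing the regularized residual (again my illustration on an assumed synthetic linear problem, not code from the talk): when data and prior terms are weighted by their uncertainties, the minimum of the regularized functional is approximately chi-squared with m degrees of freedom, so the regularization parameter can be chosen to make that minimum equal m.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic linear problem with a known singular spectrum and noise level.
m = n = 60
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
s = 10.0 ** np.linspace(0, -6, n)        # singular values decay over 6 decades
A = (U * s) @ U.T                        # A = U diag(s) U^T
x_true = np.ones(n)
sigma = 0.01                             # known noise standard deviation
b = A @ x_true + sigma * rng.standard_normal(m)
x0 = np.zeros(n)                         # prior estimate of the unknowns

def j_min(lam):
    """Minimum of the weighted regularized functional
       J(x) = ||A x - b||^2 / sigma^2 + lam ||x - x0||^2."""
    x = np.linalg.solve(A.T @ A / sigma**2 + lam * np.eye(n),
                        A.T @ b / sigma**2 + lam * x0)
    r = A @ x - b
    return r @ r / sigma**2 + lam * (x - x0) @ (x - x0)

# Chi-squared test on the regularized residual: at the minimizer, J is
# approximately chi-squared with m degrees of freedom, so choose lam with
# J_min(lam) = m, the mean of that distribution. J_min increases with
# lam, so bisect on log(lam).
lo, hi = 1e-10, 1e10
for _ in range(200):
    mid = np.sqrt(lo * hi)
    if j_min(mid) < m:
        lo = mid
    else:
        hi = mid
lam_chi2 = np.sqrt(lo * hi)
```

In this sketch the parameter lam plays the role of an inverse prior variance, so the test simultaneously selects a regularization parameter and an uncertainty estimate for the prior term.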

Additional Information

Location: ESB 4133
Speaker: Jodi Mead (Boise State University)