Computational Methods in Uncertainty Quantification (TCC Short Graduate Course)
The term “Uncertainty Quantification” is as old as the disciplines of probability and statistics, but as a field of study it is newly emerging. It combines probability and statistics with mathematical and numerical analysis, large-scale scientific computing, experimental data, model development and the application sciences to provide a computational framework for quantifying input and response uncertainties, which can ultimately be used for more meaningful predictions with quantified and reduced uncertainty.
We will motivate the central questions in computational uncertainty quantification through illustrative examples from subsurface flow, weather and climate prediction, materials science, nuclear reactor physics and biology. The key challenge we face in all these applications is the need for fast (tractable) computational tools for high-dimensional quadrature. After a short overview of the available techniques, we study sampling-based approaches in more detail, with particular emphasis on multilevel (or multiscale) methods that exploit the natural model hierarchies in numerical methods for partial differential equations. In the final part of the course, we will briefly consider the inverse problems of Bayesian inference, data assimilation and filtering, and show how the multilevel techniques presented in the earlier parts of the course can be extended to these more challenging tasks.
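As a small taster of the sampling-based approaches treated in the course, the following sketch applies plain Monte Carlo quadrature to an illustrative product integrand on the 100-dimensional unit cube; the integrand, dimension and sample size here are assumptions chosen purely for illustration:

```python
import numpy as np

def f(x):
    """Illustrative integrand on [0,1]^d: f(x) = prod_j (1 + 0.5*(x_j - 0.5)).
    Its exact integral over [0,1]^d is 1, so the Monte Carlo error is visible."""
    return np.prod(1.0 + 0.5 * (x - 0.5), axis=-1)

def monte_carlo(d, n, rng):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^d with n samples."""
    x = rng.random((n, d))  # n independent uniform points in the d-cube
    return f(x).mean()

rng = np.random.default_rng(0)
est = monte_carlo(d=100, n=10**5, rng=rng)
# The root-mean-square error decays like n^(-1/2), independently of d --
# the property that makes sampling attractive for high-dimensional quadrature.
```

Note that a tensor-product quadrature rule with only two points per coordinate would already need 2^100 evaluations in this setting, which is the tractability issue the course addresses.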
A rough outline of the course is:
- Introduction: What is Uncertainty Quantification?
- Motivating Examples from the Earth Sciences, Materials Science, Physics and Biology
- High-dimensional quadrature and tractability
- Uncertainty Propagation: “The Forward Problem” (sampling-based approaches)
  - Basic Monte Carlo Simulation
  - Quasi-Monte Carlo Methods
  - Multilevel Monte Carlo Methods
  - Stochastic Collocation and Polynomial Chaos
- Uncertainty Quantification: “The Inverse Problem”
  - Bayes’ Rule and Bayesian Inference
  - Markov Chain Monte Carlo
  - Multilevel Bayesian Inference
  - Future perspectives: Data Assimilation and Filtering
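To hint at where the multilevel items in the outline are heading, here is a minimal sketch of the multilevel Monte Carlo telescoping sum on a toy quantity of interest. The grid-based approximation `P` below stands in for a PDE solve at mesh level l and is an assumption purely for illustration:

```python
import numpy as np

def P(x, level):
    """Toy level-l approximation of exp(x): evaluate exp on x rounded down to a
    grid of spacing 2**-level (a stand-in for a coarse/fine model hierarchy)."""
    h = 2.0 ** (-level)
    return np.exp(np.floor(x / h) * h)

def mlmc(L, N, rng):
    """Multilevel Monte Carlo estimate of E[exp(U)], U ~ Uniform(0,1), via the
    telescoping sum E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]."""
    # Coarsest level: plain Monte Carlo with many cheap samples.
    x = rng.random(N[0])
    est = P(x, 0).mean()
    # Correction terms: coupled samples (the same x at both levels) keep the
    # variance of P_l - P_{l-1} small, so few expensive samples are needed.
    for l in range(1, L + 1):
        x = rng.random(N[l])
        est += (P(x, l) - P(x, l - 1)).mean()
    return est

rng = np.random.default_rng(1)
# Sample sizes decrease with level, mirroring the MLMC cost balance.
est = mlmc(L=6, N=[10**5, 4 * 10**4, 10**4, 4000, 1000, 400, 100], rng=rng)
# Exact value: E[exp(U)] = e - 1, approximately 1.71828.
```

The point of the coupling is that most of the sampling cost is shifted to the coarse levels, while the fine levels only correct a small-variance difference; this is the complexity gain the course develops for PDE models with random input data.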
The course is aimed at doctoral students, postdocs and master’s students with an interest in reliable scientific computing.
Aims of the Course
The aim of the course is to give a basic, hands-on introduction to the evolving field of large-scale uncertainty quantification, with particular emphasis on novel sampling-based approaches for high-dimensional parameter spaces. Using the tools and techniques addressed in the course, students should be able to decide independently which computational approaches are best suited to a given problem and to carry out simple uncertainty quantification studies in their own field. They should also be able to give a basic assessment of the complexity of the various approaches and of their feasibility in a given situation.
Department of Mathematical Sciences
University of Bath
Last updated 03/12/2008