Uncertainty in computer simulations, deterministic and probabilistic methods for quantifying uncertainty, OpenTurns software, Uranie software
Uncertainty quantification takes into account the fact that most inputs to a simulation code are known only imperfectly. It seeks to propagate this uncertainty in the data through to the simulation results. This training will introduce the main methods and techniques by which uncertainty propagation can be handled without resorting to an exhaustive exploration of the data space. HPC plays an important role in the subject, as it provides the computing power required by the large number of simulations involved.
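To make the idea concrete, here is a minimal sketch of Monte Carlo uncertainty propagation on a hypothetical toy model (the model and the input distributions are assumptions chosen purely for illustration, not course material): uncertain inputs are sampled from probability distributions, pushed through the model, and the output uncertainty is summarized by sample statistics.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy model standing in for an expensive simulation code
def model(x1, x2):
    return x1**2 + np.sin(x2)

# Imperfectly known inputs, described by assumed probability distributions
n = 100_000
x1 = rng.normal(loc=1.0, scale=0.1, size=n)    # Gaussian uncertainty on x1
x2 = rng.uniform(low=0.0, high=np.pi, size=n)  # uniform uncertainty on x2

# Propagate: evaluate the model on each sampled input
y = model(x1, x2)

# The output is now itself uncertain; summarize it statistically
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")
```

In practice each model evaluation may be a full simulation run, which is why the large sample sizes needed here call for HPC resources and for the more economical methods presented in the course.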
The course will present the most important theoretical tools for probability and statistical analysis, and will illustrate the concepts using the OpenTurns software.
- General methodology for handling uncertainty, presentation of a case study
- Fundamental notions from probability and statistics
- General introduction to the software tools: OpenTurns and Uranie
- Statistical estimation: parametric and non-parametric, testing
- Modeling with non-numerical data: expert judgement, entropy
- Central tendency: local and global sensitivity indices (design of experiments, sampling, Sobol indices)
- Computing the probability of rare events, simulation methods, FORM/SORM
- Distributed computing: parallel solvers, batch job submission on a parallel computer, implementation within OpenTurns / Salome
- Introduction to meta-model building: least squares, other response surfaces, Kriging, neural networks
- Introduction to polynomial chaos
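As a taste of the sensitivity-analysis topic above, the sketch below estimates first-order Sobol indices with a pick-freeze (Saltelli-type) Monte Carlo estimator on a hypothetical linear test model; the model, its coefficients, and the sample size are assumptions for illustration only (in the course such computations would typically go through OpenTurns rather than hand-rolled NumPy).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test model (assumed): y = 3*x1 + 1*x2, with x1, x2 ~ U(0, 1).
# Analytically, Var(y) = 9/12 + 1/12, so S1 = 0.9 and S2 = 0.1.
def model(x):
    return 3.0 * x[:, 0] + 1.0 * x[:, 1]

n = 200_000
A = rng.uniform(size=(n, 2))  # first independent sample matrix
B = rng.uniform(size=(n, 2))  # second independent sample matrix

yA = model(A)
var_y = yA.var()

# Pick-freeze: freeze column i of B to the values from A, then the
# covariance of model(A) and model(C) isolates the effect of input i.
S = []
for i in range(2):
    C = B.copy()
    C[:, i] = A[:, i]
    yC = model(C)
    S.append((np.mean(yA * yC) - yA.mean() * yC.mean()) / var_y)

print(f"S1 ≈ {S[0]:.3f}, S2 ≈ {S[1]:.3f}")
```

Each index lies between 0 and 1 and measures the share of the output variance attributable to that input alone, which is how "global sensitivity" is quantified in the course.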
Learn to recognize when uncertainty quantification can bring new insight to simulations.
Know the main tools and techniques to investigate uncertainty propagation.
Gain familiarity with modern tools for actually carrying out the computations in an HPC context.
Basic knowledge of probability will be useful, as will a basic familiarity with Linux.