- have a clear understanding of the foundational concepts of probability theory and Bayesian inference.
- have developed methods for uncertainty quantification for solving real-world problems.
- understand the different components of Bayesian inference, namely the prior, likelihood, marginal likelihood and posterior.
- be able to solve simple inverse uncertainty quantification problems using direct methods and Markov chain Monte Carlo.
Uncertainty Quantification is coming! Modern approaches to Bayesian Inference are leaving traditional predictive models in the dust. Get behind the wheel and be part of a new dawn of simple and interpretable probabilistic modelling!
We will focus on giving you an intuition for Bayesian inference, so that you can apply these methods to your own problems.
In this course we will cover:
- The foundations of probability and uncertainty, including the frequentist and Bayesian interpretations.
- Interpretations of the components of Bayes' Theorem and how they interact.
- How to fit probabilistic models to data, to yield, not one result, but all the probable results!
- Formulating a Bayesian Inference problem from scratch and solving it using both analytical and numerical methods.
We'll work on the basis that you're fairly new to Python, but have a basic understanding of fundamental programming concepts and can run code locally on your machine.
If you feel you could do with an introduction or a refresher, take a look at Seán's free courses, Getting Started and Python Basics, which will get you up to speed on Python.
Section 1: Intro to Bayes
In this first section, we introduce the foundations of probability and Bayes' Theorem.
- What Bayes' Theorem is and how to use it to compute conditional probabilities.
- How to apply Bayes' theorem to solve inference problems and compute posterior densities from expert knowledge and data.
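To give a flavour of what computing a conditional probability with Bayes' Theorem looks like, here is a minimal sketch using a classic diagnostic-test setup. All the numbers are made-up illustration values, not real statistics, and this is not course code:

```python
# Bayes' Theorem for a diagnostic test: compute P(disease | positive test).
# The prevalence, sensitivity and false-positive rate below are illustrative.

prior = 0.01            # P(disease): prevalence in the population
sensitivity = 0.95      # P(positive | disease): the likelihood
false_positive = 0.05   # P(positive | no disease)

# Marginal likelihood P(positive): sum over both hypotheses
evidence = sensitivity * prior + false_positive * (1 - prior)

# Posterior via Bayes' Theorem
posterior = sensitivity * prior / evidence

print(f"P(disease | positive test) = {posterior:.3f}")
```

Note how all four components from the course outline appear: the prior, the likelihood, the marginal likelihood (evidence) and the posterior.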
We are going to start gently by introducing different perspectives on probability and uncertainty. We will then use those insights to formulate inference problems in the Bayesian formalism and solve them on a computer.
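As a taste of what "solving an inference problem on a computer" can look like, here is a hedged sketch of a grid-based posterior for a coin's bias, assuming a flat prior and a binomial likelihood (an illustrative example, not the course's own exercise):

```python
import numpy as np
from math import comb

# Grid-based posterior for a coin's bias theta, given 7 heads in 10 flips.
theta = np.linspace(0, 1, 201)      # grid of candidate biases
prior = np.ones_like(theta)         # flat prior on [0, 1]
heads, flips = 7, 10
likelihood = comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

# Posterior is proportional to prior times likelihood; normalise on the grid
unnormalised = prior * likelihood
posterior = unnormalised / unnormalised.sum()

print("posterior mode:", theta[np.argmax(posterior)])
```

With a flat prior the posterior mode coincides with the maximum-likelihood estimate, 7/10 = 0.7.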
Section 2: Bayesian Linear Regression and Parameter Estimation
In section two, we will introduce the Bayesian framework for linear regression and non-linear parameter estimation.
- Why would you need Bayesian Uncertainty Quantification for something as simple as linear regression? An example of an ill-posed regression problem.
- How to formulate and solve a nonlinear parameter estimation problem using the Bayesian formalism to explore the full posterior of probable model parameters.
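For the linear-regression case, the posterior over the weights can be computed in closed form when both the prior and the noise are Gaussian. The following is a sketch under those assumptions (the data, precisions and seed are illustrative, not from the course):

```python
import numpy as np

# Conjugate Bayesian linear regression: Gaussian prior w ~ N(0, (1/alpha) I)
# and Gaussian observation noise with precision beta.
rng = np.random.default_rng(0)

true_w = np.array([1.0, 2.0])                  # intercept and slope (illustrative)
x = np.linspace(-1, 1, 50)
X = np.column_stack([np.ones_like(x), x])      # design matrix with a bias column
y = X @ true_w + rng.normal(0, 0.2, size=len(x))

alpha, beta = 1.0, 25.0                        # prior and noise precisions
# The posterior over the weights is Gaussian with this covariance and mean
cov = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)
mean = beta * cov @ X.T @ y

print("posterior mean of weights:", mean)
```

Rather than a single point estimate, `mean` and `cov` describe a full distribution over probable weight vectors, which is exactly the "all the probable results" idea above.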
We are going to solve these problems using direct methods where possible; where that fails, we will harness the power of the Metropolis-Hastings algorithm, a simple and elegant approach to asymptotically exact Bayesian inference!
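To show how little machinery Metropolis-Hastings actually needs, here is a minimal random-walk sketch targeting a standard normal as a stand-in for a posterior (an illustrative toy, not the course's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(theta):
    # Unnormalised log-density of a standard normal (stand-in for a posterior);
    # MH only ever needs the target up to a constant.
    return -0.5 * theta**2

def metropolis_hastings(n_samples, step=1.0, theta0=0.0):
    samples = np.empty(n_samples)
    theta = theta0
    for i in range(n_samples):
        proposal = theta + rng.normal(0, step)   # symmetric random-walk proposal
        # Accept with probability min(1, target(proposal) / target(theta))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
        samples[i] = theta                       # on rejection, repeat theta
    return samples

samples = metropolis_hastings(20_000)
print("sample mean:", samples.mean(), "sample std:", samples.std())
```

The chain's samples approximate draws from the target, so their mean and standard deviation should land near 0 and 1 respectively.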