“AMSI Summer School is one of the first times [many students]
have interacted with mathematicians from outside of their home
institution. It is a great way to form long-lasting connections.”

Forrest Koch, UNSW

Bayesian Statistical Learning

Lecturer

Dr Matias Quiroz, University of Technology Sydney

Synopsis

The course aims to give a solid introduction to the Bayesian approach to statistical inference. It contains a mix of theoretical and methodological concepts, with an emphasis on computer implementation of modern Bayesian methods.

Course Overview

The course consists of four modules (see details below). The first module introduces the Bayesian paradigm and develops inferential tools for some simple models. The second module considers more advanced models, such as linear regression, spline regression and classification models. The third module focuses on computation and presents state-of-the-art algorithms to carry out Bayesian inference. Finally, the fourth module presents model comparison techniques and advanced topics such as Bayesian variable selection and hierarchical models.

Module 1: The basics.
The Bayesian paradigm. Single-parameter models. Conjugate priors. Prior elicitation. Noninformative priors. Jeffreys’ prior. Multi-parameter models. Bayesian computation via simulation. Analytic marginalisation. Marginalisation via simulation.
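
To give a flavour of the conjugate-prior idea in Module 1, here is a minimal sketch in Python (the labs themselves recommend R; the function name and numbers below are illustrative only). A Beta prior on a Binomial success probability yields a Beta posterior whose parameters are updated by simply counting successes and failures.

```python
# Conjugacy: a Beta(a, b) prior on a Binomial success probability theta,
# combined with k successes in n trials, gives a Beta(a + k, b + n - k)
# posterior. No numerical integration is needed.
def beta_binomial_posterior(a, b, k, n):
    return a + k, b + n - k

# Hypothetical data: 7 successes in 10 trials, uniform Beta(1, 1) prior.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # (a + k) / (a + b + n)
```

The posterior mean (a + k) / (a + b + n) is a weighted compromise between the prior mean a / (a + b) and the sample proportion k / n, which is the shrinkage behaviour the course develops more generally.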

Module 2: Regression models.
Bayesian prediction. Bayesian inference as a decision theory problem. Bayesian linear regression. Shrinkage priors. Bayesian spline regression. Asymptotics. Normal approximation. Bayesian classification. Generative models (naïve Bayes). Discriminative models (logistic regression).
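
As a preview of the Bayesian linear regression in Module 2, the following Python sketch computes the Gaussian posterior of the regression coefficients under a normal likelihood and an independent normal (ridge-type shrinkage) prior. The function name and the demo data are illustrative, not part of the course material.

```python
import numpy as np

def bayes_linreg_posterior(X, y, sigma2, tau2):
    """Posterior of beta under y ~ N(X beta, sigma2 I), beta ~ N(0, tau2 I).

    Returns the Gaussian posterior mean m and covariance V:
      V = (X'X / sigma2 + I / tau2)^(-1),   m = V X'y / sigma2.
    The prior shrinks the least-squares estimate toward zero.
    """
    p = X.shape[1]
    V = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
    m = V @ (X.T @ y) / sigma2
    return m, V

# Hypothetical demo data (sizes and values are illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=50)
m, V = bayes_linreg_posterior(X, y, sigma2=0.25, tau2=10.0)
```

With enough data the posterior mean m sits close to the generating coefficients, while a small tau2 would pull it toward zero.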

Module 3: Bayesian computation.
Monte Carlo integration. Importance sampling. The inverse-CDF method. Rejection sampling. Markov processes. The Gibbs sampler. Data augmentation. The Metropolis and Metropolis-Hastings samplers. Hamiltonian Monte Carlo proposals. Simulation efficiency. Assessing convergence.
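
To illustrate the Metropolis sampler listed in Module 3, here is a minimal random-walk Metropolis sketch in Python targeting a standard normal distribution (the function name, step size and iteration count are illustrative choices, not from the course).

```python
import math
import random

def metropolis_normal(n_iter, step=1.0, seed=1):
    """Random-walk Metropolis targeting a standard normal density."""
    random.seed(seed)
    log_target = lambda z: -0.5 * z * z  # log N(0, 1) up to a constant
    x, samples = 0.0, []
    for _ in range(n_iter):
        prop = x + random.gauss(0.0, step)  # symmetric proposal
        # Accept with probability min(1, target(prop) / target(x));
        # the proposal is symmetric, so the Hastings ratio cancels.
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop
        samples.append(x)
    return samples

draws = metropolis_normal(20000)
```

The empirical mean and variance of the draws approximate those of the target (0 and 1), and the Metropolis-Hastings generalisation covered in the module handles asymmetric proposals by retaining the Hastings ratio.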

Module 4: Model inference and hierarchical models.
Bayesian model comparison. Marginal likelihoods. Bayesian model averaging. Bayesian variable selection. Posterior predictive model evaluation. Hierarchical models. Pooling estimates. MCMC sampling with RStan.

Prerequisites

  • This course is suitable for students who have taken at least one statistics course above the introductory level and have some programming experience. Students should be confident with concepts such as integrals, derivatives, probability distributions, conditional probability and expectations. They should also have some basic knowledge of linear algebra, including matrix operations such as multiplication and inversion, and concepts such as eigenvalues and eigenvectors.

Assessment*

  • Each module has a corresponding computer lab.

(*Assessment components subject to change)

Resources/pre-reading (if available)

A textbook that contains most of the material we will cover is

Bayesian Data Analysis (3rd edition) by Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin.
The book is freely available via the first author’s webpage:
http://www.stat.columbia.edu/~gelman/book/.

A good, compact text on some of the mathematical tools used in this subject is available here: https://gwthomas.github.io/docs/math4ml.pdf. Note, however, that we will only encounter a subset of this material, so students are not expected to know all of it.

Each module has a computer lab in which students implement Bayesian procedures. The recommended programming language for the labs is R, because the presented material will use it and the datasets provided will be in R format. However, students may use any software they prefer (e.g. Python or Julia).

It is recommended that students install R and an editor for writing R code. RStudio is an excellent choice, and it is free for academic use. Students can obtain RStudio from https://www.rstudio.com/.

Not sure if you should sign up for this course?

Take this quiz and look at some of the expected foundational skills in this topic.

Dr Matias Quiroz, University of Technology Sydney

Matias Quiroz is a Lecturer at the University of Technology Sydney (UTS). Matias was born in Chile but raised in Sweden, where he obtained his PhD in Statistics from Stockholm University in 2015 and a Master’s degree in Engineering Mathematics (major in Financial Mathematics) from Lund University in 2009. Prior to his PhD studies, he worked as a Research Assistant in the Research Division of the Central Bank of Sweden. Matias held a Post-Doctoral Fellowship at the University of New South Wales Business School from 2017 to 2019, before joining the School of Mathematical and Physical Sciences at UTS as a Lecturer in May 2019.

His research interests lie in the area of Bayesian Statistics, particularly in Bayesian computation, such as Monte Carlo methods and variational Bayes. His research has been published in top-tier journals and conference proceedings in Statistics and Machine Learning, such as the Journal of the American Statistical Association, the Journal of Computational and Graphical Statistics, the Journal of Machine Learning Research, the International Conference on Artificial Intelligence and Statistics (AISTATS) and the International Conference on Machine Learning (ICML).

You can find out more about Matias’ research on his personal webpage www.matiasquiroz.com.