Department of Information Technology

Probabilistic Machine Learning (PML) PhD course (5+3hp)

Spring 2018

Lectures

Each lecture comes with a list of recommended problems, shown in the table below. Numbers without a prefix refer to problems in the book by Bishop; numbers prefixed with HTF refer to exercises in the book by Hastie, Tibshirani and Friedman.

Note that the slides provided below cover only a small part of each lecture; the blackboard is used quite extensively.

The schedule is available via TimeEdit.

Nr. | Contents | Chapter | Pres. | Problems
1. | Introduction | 1-2, notes | le1 | 2.13, 2.29, 2.32, 2.34, 2.40, 2.44, 2.47
2. | Linear regression | 3, HTF:3 | le2 | 1.25, 1.26, 3.8, 3.9, 3.12, 3.13
3. | Linear classification | 4 | le3 | 4.5, 4.19, 4.25, HTF:2.8
4. | Learning (deep) neural networks | 5 | le4 | 5.4, 5.16, HTF:11.5
5. | Introducing kernel methods, Gaussian processes | 6 | le5 | 6.3, problem set (GP part)
6. | Gaussian processes and support vector machines | 6-7 | le6 | problem set (SVM part), m-file
7. | EM and clustering | 9, notes | le7 | 9.8, 9.9, 9.11, 12.24 (also in Matlab, see lecture 1)
8. | Graphical models | 8 | le8 | 8.1, 8.3, 8.4, 8.7
9. | Inference in graphical models, and probabilistic programming | 8 | le9 | 8.10, 8.11, 8.19, 8.23, 8.27
10. | Variational inference | 9.4, 10.1, 10.2, 10.4, 10.5 | le10 | 10.3, 10.12, 10.13, 10.19, 10.32
11. | Variational inference | notes1, notes2 | | See notes
Updated 2019-01-21 11:57:57 by Niklas Wahlström.