**Prerequisites**

- Basic probability theory (in particular conditional probability, expectations, discrete and continuous distributions, and Markov's and Hoeffding's inequalities)
- Basic linear algebra (finite-dimensional vector spaces, positive definite matrices, the singular value decomposition)
- Basic calculus (differentiation and minimisation of multivariate convex functions)

These topics are covered in, for example, any bachelor's programme in mathematics in the Netherlands, and are reviewed in the appendix of the course book [1]. The course also requires general 'mathematical maturity', in particular the ability to combine insights from all three fields when proving theorems.
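As a quick self-check on the probability prerequisite, the following sketch (not part of the course material; all names are illustrative) empirically compares the deviation probability of a Bernoulli sample mean with the Hoeffding bound \(P(|\bar{X}_n - p| \ge t) \le 2e^{-2nt^2}\):

```python
import math
import random

def hoeffding_demo(p=0.5, n=100, t=0.1, trials=10_000, seed=0):
    """Estimate P(|sample mean - p| >= t) for n i.i.d. Bernoulli(p)
    draws and compare it with Hoeffding's bound 2*exp(-2*n*t^2)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        # Sample mean of n Bernoulli(p) variables.
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= t:
            exceed += 1
    empirical = exceed / trials
    bound = 2 * math.exp(-2 * n * t**2)
    return empirical, bound

empirical, bound = hoeffding_demo()
# The empirical tail probability should sit (well) below the bound.
print(f"empirical: {empirical:.4f}, Hoeffding bound: {bound:.4f}")
```

The bound is distribution-free, so for concrete cases like this it is typically loose by a sizeable margin; sharpening such bounds is one of the themes the course touches on.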

**Aim of the course**

Machine learning is one of the fastest-growing areas of science, with far-reaching applications. In this course we focus on the fundamental ideas, theoretical frameworks, and rich array of mathematical tools and techniques that power machine learning. The course covers the core paradigms and results in machine learning theory, drawing on probability and statistics, combinatorics, information theory, optimization and game theory. By the end of the course, students will be able to:

- Formalize learning problems in statistical and game-theoretic settings.
- Examine the statistical complexity of learning problems using the core notions of complexity.
- Analyze the statistical efficiency of learning algorithms.
- Master the design of learning strategies using proper regularization.

**Lecturers**

- Wouter Koolen, CWI and UT
- Tim van Erven, UvA

**Teaching staff**

- Hidde Fokkema
- Wouter Koolen
- Jack Mayo
- Tim van Erven