Prerequisites
- Basic probability theory (in particular conditional probability, expectations, discrete and continuous distributions, Markov's and Hoeffding's inequalities)
- Basic linear algebra (finite dimensional vector spaces, positive definite matrices, singular value decomposition)
- Basic calculus (differentiation and minimisation of multivariate convex functions)
as covered, for example, in any bachelor's programme in mathematics in the Netherlands, and as reviewed in the Appendix of the course book [1]. The course does require general 'mathematical maturity', in particular the ability to combine insights from all three fields when proving theorems.
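To indicate the expected level, here is an illustrative statement of one of the listed prerequisite results, Hoeffding's inequality (this formulation is given for orientation only and is not quoted from the course book): if X_1, ..., X_n are independent random variables with X_i taking values in [a_i, b_i], then for every t > 0,

\[
\Pr\!\left( \frac{1}{n}\sum_{i=1}^{n} \bigl( X_i - \mathbb{E}[X_i] \bigr) \ge t \right)
\;\le\;
\exp\!\left( - \frac{2 n^2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).
\]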
Aim of the course
Machine learning is one of the fastest-growing areas of science, with far-reaching applications. In this course we focus on the fundamental ideas, theoretical frameworks, and rich array of mathematical tools and techniques that power machine learning. The course covers the core paradigms and results in machine learning theory, using a mix of probability and statistics, combinatorics, information theory, optimisation and game theory.
During the course you will learn to
- Formalize learning problems in statistical and game-theoretic settings.
- Examine the statistical complexity of learning problems using core notions of complexity such as VC dimension and Rademacher complexity.
- Analyze the statistical efficiency of learning algorithms.
- Master the design of learning strategies using proper regularisation.
This course focuses strongly on theory. (Good applied master's-level courses on machine learning are widely available.) We will cover statistical learning theory, including PAC learning, VC dimension, Rademacher complexity and boosting, as well as online learning, including prediction with expert advice, online convex optimisation, bandits and reinforcement learning.
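To give a concrete flavour of the online learning part, below is a minimal Python sketch of the exponential weights (Hedge) algorithm for prediction with expert advice. The function name, interface and learning-rate choice are illustrative assumptions, not official course material.

```python
import numpy as np

def hedge(loss_matrix, eta):
    """Exponential weights (Hedge) for prediction with expert advice.

    loss_matrix: array of shape (T, K) with losses in [0, 1] for K experts over T rounds.
    eta: learning rate.
    Returns the algorithm's cumulative loss and the loss of the best expert in hindsight.
    """
    T, K = loss_matrix.shape
    weights = np.ones(K) / K          # start with the uniform distribution over experts
    algo_loss = 0.0
    for t in range(T):
        losses = loss_matrix[t]
        # Expected loss of playing the current distribution over experts.
        algo_loss += weights @ losses
        # Multiplicative update: down-weight experts in proportion to their loss.
        weights *= np.exp(-eta * losses)
        weights /= weights.sum()
    best_expert_loss = loss_matrix.sum(axis=0).min()
    return algo_loss, best_expert_loss

# Example: 2 experts, 100 rounds of random {0, 1} losses (hypothetical data).
rng = np.random.default_rng(0)
L = rng.integers(0, 2, size=(100, 2)).astype(float)
print(hedge(L, eta=np.sqrt(8 * np.log(2) / 100)))
```

With the learning rate eta = sqrt(8 ln K / T), the algorithm's cumulative loss exceeds that of the best expert by at most sqrt((T/2) ln K), one of the classical regret bounds studied under prediction with expert advice.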
Lecturers
Wouter Koolen, CWI and UT
Tim van Erven, KdVI, UvA
- Lecturer: Hidde Fokkema
- Lecturer: Wouter Koolen
- Lecturer: Sarah Sachs
- Lecturer: Tim van Erven