Level: M2


Lecturers: Bodo Manthey (U Twente) and Tjark Vredeveld (Maastricht U)


Description:
Traditionally, the analysis of algorithms focuses on worst-case behavior: e.g., the worst-case running time, or the worst-case approximation or competitive ratio. In many settings, this yields overly pessimistic conclusions and does not reflect empirical observations. A classic example is the simplex algorithm for linear programming, which takes exponential time in the worst case but works spectacularly fast in practice.

In recent years, there has been increasing focus on exploring more realistic input models for various problems and on developing a robust theory for the design and analysis of algorithms. This has led to both beautiful mathematical ideas and new algorithmic insights. A nice example is the smoothed analysis framework developed by Spielman and Teng to explain the empirical behavior of the simplex algorithm.
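To make the framework concrete (this toy sketch is illustrative only and not part of the course materials): the smoothed complexity of an algorithm is the worst case, over inputs, of its expected cost after the input is slightly perturbed at random. The sketch below estimates this empirically for insertion sort, whose running time is governed by the number of inversions. As the noise level sigma grows, the adversarial reverse-sorted input loses its structure and the cost drops from the worst case toward the average case.

```python
import random

def inversions(a):
    # Insertion sort runs in Theta(n + #inversions),
    # so the inversion count serves as the cost measure.
    count = 0
    for i in range(len(a)):
        for j in range(i + 1, len(a)):
            if a[i] > a[j]:
                count += 1
    return count

def smoothed_cost(a, sigma, trials=200):
    # Empirical smoothed cost: expected cost after perturbing each
    # entry independently with Gaussian noise of std. deviation sigma.
    total = 0
    for _ in range(trials):
        perturbed = [x + random.gauss(0.0, sigma) for x in a]
        total += inversions(perturbed)
    return total / trials

n = 30
worst = list(range(n, 0, -1))  # reverse-sorted: n(n-1)/2 inversions
for sigma in [0.0, 1.0, 10.0, 100.0]:
    print(sigma, smoothed_cost(worst, sigma))
```

With sigma = 0 the cost equals the worst case, n(n-1)/2; for noise that dwarfs the value range, the perturbed order is essentially random and the cost approaches the average case, n(n-1)/4.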

This course will focus on methods and techniques to rigorously analyze algorithms beyond classical worst-case analysis, with particular attention to heuristics that work well in practice. We will consider a wide variety of algorithmic problems spanning different areas such as classic NP-hard problems, online algorithms, compressed sensing, and machine learning. These problems will be explored using both probabilistic approaches, such as average-case analysis, smoothed analysis, and semi-random input models, and deterministic approaches, such as local search, resource augmentation, stability and perturbation resilience, and refined analyses based on input parameters such as condition numbers or non-degeneracy conditions.


Prerequisites
* Mathematical maturity
* Basic knowledge of discrete algorithms; see, e.g., Ahuja, Magnanti, and Orlin, “Network Flows”, Chapters 2-6, or Cormen, Leiserson, Rivest, and Stein, “Introduction to Algorithms”, Parts IV and VI.
* Basic knowledge of probability theory: any bachelor-level textbook on probability theory, e.g., Grimmett and Stirzaker, “Probability and Random Processes”, Chapters 1-5.
* Familiarity with randomized algorithms and the probabilistic method in combinatorics will be helpful, but is not strictly necessary.
* Knowledge of discrete optimization is a plus.


Aim of the course
* Understanding and applying methods to analyze algorithms for which classical worst-case analysis fails to explain the observed performance.

Rules about Homework/Exam
* Homework assignments and a final exam (depending on the number of participants, the final exam will be written or oral). The homework assignments count for 30% of the final grade; the exam counts for 70%. This also applies to the resit. The grade on the exam (or resit) must be at least 5.0 in order to pass.
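Stated as arithmetic, the rule is: final = 0.3 x homework + 0.7 x exam, with the extra requirement that the exam grade itself be at least 5.0. A minimal sketch (function names are illustrative; any pass mark for the overall grade beyond the exam requirement is left to the examination regulations and not assumed here):

```python
def final_grade(homework, exam):
    # Weighted average per the rules above: homework 30%, exam 70%.
    return 0.3 * homework + 0.7 * exam

def exam_requirement_met(exam):
    # The exam (or resit) grade itself must be at least 5.0.
    return exam >= 5.0

print(final_grade(8.0, 6.0))      # approximately 6.6
print(exam_requirement_met(4.9))  # False
```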


Lecture notes/Literature
* Relevant research articles and other material will be provided during the course.
* Additional reading (not required): Tim Roughgarden (ed.), “Beyond the Worst-Case Analysis of Algorithms”, Cambridge University Press, 2020.