Prerequisites
The student should have a solid knowledge of linear algebra and multivariable calculus, as well as knowledge of linear programming (including linear programming duality) and convex analysis at the level of being able to follow the text and do the exercises from:
- Chapters 1 and 2 (including the exercises) of 'Linear Programming: A Concise Introduction' by Thomas S. Ferguson (https://www.math.ucla.edu/~tom/LP.pdf)
- Exercises 2.1, 2.2, 2.12, 3.1, 3.3, 3.5, and 3.7 of 'Convex Optimization' by Stephen Boyd and Lieven Vandenberghe (http://stanford.edu/~boyd/cvxbook)
Aim of the course
Continuous optimization is the branch of optimization where we optimize a (differentiable) function over continuous (as opposed to discrete) variables. Here the variables can be constrained by (differentiable) equality and inequality constraints as well as by convex cone constraints. Such optimization problems arise naturally and commonly in science and engineering, and they also arise as relaxations of discrete optimization problems. Differentiability of the defining functions allows multivariable calculus and linear algebra techniques to be used to study these problems and to design and analyze efficient algorithms.
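For concreteness, a generic problem of the kind described above can be written as follows (a sketch only; the notation, the number of constraints, and the cone K are illustrative, not the course's fixed conventions):

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{s.t.} \quad & g_i(x) \le 0, \quad i = 1, \dots, m, \\
& h_j(x) = 0, \quad j = 1, \dots, p, \\
& x \in K,
\end{aligned}
\]

where f, the g_i, and the h_j are differentiable functions and K is a convex cone.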
In this course we study the theory, algorithms, and some applications of continuous optimization. In the theory part we discuss convexity, Lagrangian duality, optimality conditions, conic programming, and an application in information theory. In the algorithmic part we discuss derivative-free methods, first-order optimization methods, neural networks/supervised learning, second-order optimization methods, interior-point methods, and support vector machines. For several of these algorithms we also discuss their convergence analysis.
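As a small illustration of the first-order methods mentioned above, the following is a minimal gradient descent sketch for a smooth function; the quadratic objective, the constant step size, and the stopping rule are illustrative choices, not the course's reference implementation.

```python
import numpy as np

def gradient_descent(grad, x0, step_size=0.1, tol=1e-8, max_iter=1000):
    """Minimal gradient descent: x_{k+1} = x_k - step_size * grad(x_k)."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is (nearly) zero
            break
        x = x - step_size * g
    return x

# Illustrative example: minimize f(x) = 1/2 x^T A x - b^T x, whose gradient is A x - b,
# so the minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star)  # approximately [0.2, 0.4]
```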
Lecturer
David de Laat (TU Delft)