Prerequisite knowledge:
The student should have a solid bachelor-level knowledge of linear algebra and multivariate analysis. The student should also have enough knowledge of linear optimization and convex analysis to follow the text and do the exercises from the following sources:
Linear Programming, A Concise Introduction, Thomas S. Ferguson:
Available at https://www.math.ucla.edu/~
Chapters 1 and 2, along with the accompanying exercises.
Convex Optimization, Stephen Boyd and Lieven Vandenberghe:
Available at http://stanford.edu/~boyd/
Sections 2.1, 2.2, and 3.1.
Exercises (from the book): 2.1, 2.2, 2.12, 3.1, 3.3, 3.5, and 3.7.
Aim of the course:
Continuous optimization is the branch of optimization in which we optimize a (differentiable) function over continuous (as opposed to discrete) variables. The variables can be constrained by (differentiable) equality and inequality constraints, as well as by convex cone constraints. Such problems arise naturally and frequently in science and engineering, and also as relaxations of discrete optimization problems. Differentiability of the functions defining a problem allows multivariable calculus and linear algebra techniques to be used to study the problem and to design and analyze efficient algorithms.
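For concreteness, a generic problem of this form can be written as follows (an illustrative sketch, not taken from the course materials):

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{subject to} \quad & g_i(x) \le 0, \quad i = 1, \dots, m, \\
& h_j(x) = 0, \quad j = 1, \dots, p, \\
& x \in K,
\end{aligned}
\]

where f, the g_i, and the h_j are (differentiable) functions and K is a convex cone; taking K = \mathbb{R}^n recovers ordinary constrained optimization.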
This course aims to provide a concise introduction to the basics of unconstrained, constrained, and conic continuous optimization.
Learning goals:
The student will be able to:
- Recognize convex optimization problems and prove results about them.
- Solve the KKT conditions for basic constrained optimization problems (the conditions are sketched after this list).
- Formulate the Lagrange dual, and understand/prove basic results on these problems.
- Give both sufficient and necessary optimality conditions for constrained continuous optimization problems.
- Formulate and recognize conic optimization problems, along with being able to construct and analyze their dual problems.
- Use a range of techniques to solve both unconstrained and constrained continuous optimization problems, and prove results about these techniques. These include the Frank-Wolfe method, gradient/subgradient descent, and interior point methods (a gradient descent sketch follows this list).
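To indicate what the KKT goal refers to, these are the standard Karush-Kuhn-Tucker conditions for the generic problem above with K = \mathbb{R}^n, in the textbook form of [BV]; they are shown here only as an illustration:

\[
\begin{aligned}
& \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) + \sum_{j=1}^{p} \mu_j \nabla h_j(x^*) = 0, \\
& g_i(x^*) \le 0, \quad \lambda_i \ge 0, \quad \lambda_i \, g_i(x^*) = 0, \qquad i = 1, \dots, m, \\
& h_j(x^*) = 0, \qquad j = 1, \dots, p.
\end{aligned}
\]

As a taste of the algorithmic techniques, below is a minimal sketch of gradient descent with a fixed step size on a small least-squares objective. The data (A, b), step size, and iteration count are illustrative assumptions, not course code:

    import numpy as np

    def gradient_descent(grad, x0, step=0.01, iters=500):
        # Iterate x_{k+1} = x_k - step * grad(x_k) for a fixed number of steps.
        x = x0
        for _ in range(iters):
            x = x - step * grad(x)
        return x

    # Example: minimize f(x) = ||Ax - b||^2, whose gradient is 2 A^T (Ax - b).
    A = np.array([[2.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 1.0])
    x_min = gradient_descent(lambda x: 2.0 * A.T @ (A @ x - b), x0=np.zeros(2))
    print(x_min)  # approaches the unique minimizer [0.5, 1.0]

For a smooth convex objective, a sufficiently small fixed step size (at most 1/L when the gradient is L-Lipschitz) guarantees convergence; the step size above is chosen conservatively for this example.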
Rules about homework/exam:
Homework exercises will be given weekly, together with solutions, but will not be graded.
The course will be assessed by a final exam. Students may bring two sheets of A4 paper with handwritten notes to the exam.
Lecture notes/literature:
Lecture slides will be provided online during the course.
The course will also rely on the following source material:
Boyd & Vandenberghe, Convex Optimization:
[BV] https://web.stanford.edu/~
- Lecturer: Daniel Dadush