Prerequisites

Familiarity with basic linear algebra, probability theory, discrete math, algorithms and complexity theory.

Aim of the course

Today's computers, both in theory (Turing machines) and in practice (PCs and smartphones), are based on classical physics. However, modern quantum physics tells us that the world behaves quite differently: a quantum system can be in a superposition of many different states at the same time, and can exhibit interference effects during the course of its evolution. Moreover, spatially separated quantum systems may be entangled with each other, and operations may have "non-local" effects because of this.

Quantum computation is the field that investigates the computational power and other properties of computers based on quantum-mechanical principles. Its main building block is the qubit, which, unlike a classical bit, can take the values 0 and 1 at the same time and hence affords a certain kind of parallelism. The laws of quantum mechanics constrain how we can perform computational operations on these qubits, and thus determine how efficiently we can solve a given computational problem. Quantum computers generalize classical ones and hence are at least as efficient; the real aim, however, is to find computational problems where a quantum computer is much more efficient than classical computers. For example, in 1994 Peter Shor found a quantum algorithm that can efficiently factor large integers into their prime factors. This problem is generally believed to take exponential time on even the best classical computers, and its assumed hardness forms the basis of much of modern cryptography (particularly the widespread RSA system); Shor's algorithm breaks all such cryptography. A second important quantum algorithm is Grover's search algorithm, which searches through an unordered search space quadratically faster than is possible classically. In addition to such algorithms, there is a plethora of other applications: quantum cryptography, quantum communication, simulation of physical systems, and many others.

The course is taught from a mathematical and theoretical computer science perspective, but should be accessible to physicists as well.
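As a small illustration of the superposition principle mentioned above (a standard textbook example, added here for concreteness): the state of a single qubit is a unit vector in a two-dimensional complex vector space,

\[ |\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle, \qquad \alpha,\beta \in \mathbb{C},\quad |\alpha|^2 + |\beta|^2 = 1. \]

Measuring this qubit yields outcome 0 with probability |\alpha|^2 and outcome 1 with probability |\beta|^2; before the measurement, both amplitudes take part in the computation, which is the source of the interference effects and parallelism described above.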

Rules about Homework / Exam

This is an 8 ECTS course over 15 weeks, plus a final exam, which comes to roughly 13 hours of work per week. There will be homework exercises for each lecture, to be handed in before the start of the next lecture, i.e., the next Monday at 14:00. You can either hand in a paper version right before the start of the lecture (if you come in late, you're too late to hand in homework), or email a readable file to Andras Gilyen: gilyen at cwi dot nl, with cc to rdewolf at cwi dot nl. Please start the email subject with [QCHW]. If you have questions about the homework exercises, email Ronald. The answers should be in English. Handwritten solutions (or emailed scans thereof) are fine, as long as they are clearly readable. Cooperation among the students is allowed, but everyone has to hand in their own solution set in their own words. Do NOT post the homework questions and/or answers online. If you use LaTeX and want to draw circuits, you could consider using qasm2circ, which is the package used for the Nielsen-Chuang book.

Each homework set will get a grade between 1 and 10; if you don't hand it in, you'll score a 1 for that week. When determining the average grade for the homework, we will ignore your two lowest scores. The final exam (June 12) will be open book, meaning you can bring the lecture notes, your own notes, homework, and any other papers you want, but no electronic devices (there is the possibility of a re-examination on July 3). The final grade is determined 40% by the homework grade and 60% by the final exam. In accordance with the Mastermath rules, the final grade will be rounded to the nearest integer (also for Master of Logic students).
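In formula form (a paraphrase of the rule above, with HW the homework average after dropping the two lowest weekly scores and E the exam grade):

\[ \text{final grade} \;=\; \mathrm{round}\bigl(0.4 \cdot \mathrm{HW} + 0.6 \cdot \mathrm{E}\bigr). \]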

Lecture Notes / Literature

Ronald's lecture notes

Those who want to read more (much more...) can consult the standard textbook in this area:  Michael A. Nielsen and Isaac L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, 2000.

Lecturers

Ronald de Wolf (CWI and ILLC)