The roots of ergodic theory go back to Boltzmann's ergodic hypothesis concerning the equality of the time mean and the space mean of molecules in a gas: the long-term time average along a single trajectory should equal the average over all trajectories. The hypothesis was quickly shown to be incorrect, and the concept of ergodicity (`weak average independence') was introduced to give necessary and sufficient conditions for the equality of these averages. Nowadays, ergodic theory is known as the probabilistic (or measurable) study of the average behavior of ergodic systems, i.e., systems evolving in time that are in equilibrium and ergodic. The evolution is represented by the repeated application of a single map (in the case of discrete time), and by repeated applications of two or more commuting maps in the case of `higher-dimensional discrete time'.

The first major contribution to ergodic theory is the generalization of the strong law of large numbers to stationary and ergodic processes (seen as sequences of measurements of the system). This is known as the Birkhoff ergodic theorem. The second contribution is the introduction of entropy into ergodic theory by Kolmogorov, a notion borrowed from Shannon's entropy in information theory. Roughly speaking, entropy is a measure of the randomness of the system, or of the average information acquired under a single application of the underlying map. Entropy can be used to decide whether two ergodic systems are not `the same' (not isomorphic).
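
The equality of time and space averages can be made concrete with a small numerical sketch (an illustration added here, not part of the course text): for the irrational rotation T(x) = x + alpha (mod 1), Lebesgue measure is ergodic, so Birkhoff's theorem says the time average of an observable along almost every orbit equals its integral.

```python
import math

def time_average(f, x0, alpha, n):
    """Average of f over the first n points of the orbit of x0 under
    the rotation T(x) = x + alpha (mod 1)."""
    total, x = 0.0, x0
    for _ in range(n):
        total += f(x)
        x = (x + alpha) % 1.0
    return total / n

# Observable: indicator of [0, 1/2); its space average (integral) is 0.5.
f = lambda x: 1.0 if x < 0.5 else 0.0

# alpha = sqrt(2) - 1 is irrational, so the rotation is ergodic.
estimate = time_average(f, x0=0.1, alpha=math.sqrt(2) - 1, n=100_000)
print(estimate)  # close to 0.5, the space average
```

The choice of observable, starting point, and rotation angle here is arbitrary; any integrable observable and any irrational angle would illustrate the same convergence.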

Starting from a basic knowledge of measure theory, the notions of measure preservation (stationarity), ergodicity, mixing, isomorphism, and entropy will be introduced. Applications to other fields, in particular number theory, will also be given.
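
One classical example of the kind of number-theoretic application meant here (an assumed illustration, not taken from the course text) is Benford's law for powers of 2: the fractional parts of n·log10(2) form the orbit of an ergodic irrational rotation, so the leading digit d of 2^n occurs with limiting frequency log10(1 + 1/d).

```python
import math

def leading_digit_frequencies(n_terms):
    """Empirical frequency of each leading digit among 2^1, ..., 2^n_terms,
    computed exactly with Python's arbitrary-precision integers."""
    counts = [0] * 10
    power = 1
    for _ in range(n_terms):
        power *= 2
        counts[int(str(power)[0])] += 1
    return [c / n_terms for c in counts]

freqs = leading_digit_frequencies(5000)
for d in range(1, 10):
    # Compare the observed frequency with Benford's prediction log10(1 + 1/d).
    print(d, round(freqs[d], 4), round(math.log10(1 + 1 / d), 4))
```

For instance, the leading digit 1 appears with frequency close to log10(2) ≈ 0.301, matching the ergodic-theoretic prediction.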

Prerequisites
Measure Theory is mandatory.