**Class Timings:** Wednesdays from 2:00 PM - 5:00 PM

**Venue:** Chern Lecture Hall

**First Meeting:** 09 August 2023

**Course Description:** Entropy is a fundamental quantity that quantifies the amount of chaos, information, and complexity in a given system. We will begin by formalising this idea, studying the variational principle (entropy maximisers exist under reasonable hypotheses) and the work of Dobrushin, Lanford, and Ruelle on Gibbs measures. We will then switch to the more modern era and look at Gibbs measures in several contexts: Gibbs measures for hyperbolic maps, and random fields such as the Ising and Potts models. Depending on interest, we will dip into more recent work on the dimer model and proper colourings of the $\mathbb{Z}^d$ lattice.
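As a minimal illustration of the entropy in question, the Shannon entropy of a finite probability vector, together with the simplest instance of a maximisation principle (the uniform distribution maximises entropy), can be written as:

```latex
% Shannon entropy of a probability vector p = (p_1, \dots, p_n)
H(p) = -\sum_{i=1}^{n} p_i \log p_i,
\qquad \sum_{i=1}^{n} p_i = 1, \quad p_i \ge 0.

% By concavity of x \mapsto -x \log x (Jensen's inequality),
% entropy is maximised exactly at the uniform distribution:
H(p) \le \log n,
\qquad H\!\left(\tfrac{1}{n}, \dots, \tfrac{1}{n}\right) = \log n.
```

The variational principle studied in the course is a far-reaching generalisation of this finite computation to measures on infinite systems.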

**References:**

1) *Thermodynamic Formalism* by David Ruelle

2) *Gibbs Measures and Phase Transitions* by Hans-Otto Georgii

3) *Lectures on Random Lozenge Tilings* by Vadim Gorin

**Course Evaluation:** Student lectures and active participation

**Prerequisites:**

1) Basic measure-theoretic probability (the material in Durrett's book *Probability: Theory and Examples*, 2019 edition, through the end of Chapter 5)

2) Basic information theory (Chapter 2 of *Elements of Information Theory* by Cover and Thomas)

3) The proof of the ergodic theorem and the ergodic decomposition for $\mathbb{Z}^d$ actions (e.g., Sections 2.1–2.3 of *Equilibrium States in Ergodic Theory* by Gerhard Keller)

**Teacher:** Nishant Chandgotia