Entropy and conservation of information

Statistical Mechanics (Spring, 2013)

April 1, 2013

Professor Susskind introduces statistical mechanics as one of the most universal subjects in modern physics in terms of its ability to explain and predict natural phenomena.  He begins with a brief introduction to probability theory and then draws the connection between laws of motion as rules for updating the state of a system, and the probability of being in a given state.  Proper laws of physics are reversible and therefore preserve the distinctions between states - i.e. information.  In this sense, the conservation of information is more fundamental than other physical quantities such as temperature or energy.

Professor Susskind then turns to continuous systems, phase space, and Liouville's theorem.  The lecture concludes with the presentation of formulas for computing entropy, and some examples.
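The entropy formula discussed in the lecture is the standard Gibbs/Shannon form, S = -Σᵢ pᵢ ln pᵢ, where pᵢ is the probability of occupying state i. A minimal sketch of computing it (the function name and example distributions are illustrative, not from the lecture):

```python
import math

def entropy(probs):
    """Gibbs/Shannon entropy S = -sum_i p_i ln p_i, in nats.

    States with p_i = 0 contribute nothing, since p ln p -> 0 as p -> 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over N states maximizes entropy: S = ln N.
print(entropy([0.25] * 4))       # ln 4, about 1.386

# A state known with certainty carries zero entropy.
print(entropy([1.0, 0.0, 0.0]))  # 0.0
```

The uniform case recovers the familiar S = ln N for N equally likely states, while a sharply known state has S = 0, illustrating entropy as a measure of our ignorance about which state the system occupies.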

  • Conservation of information