Stochastic Thermodynamics of Learning
Unravelling the physical limits of information processing is an important goal of non-equilibrium statistical physics. It is motivated by the search for fundamental limits of computation, such as Landauer's bound on the minimal work required to erase one bit of information. Further inspiration comes from biology, where we would like to understand what makes single cells or the human brain so (energy-)efficient at processing information.
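To make Landauer's bound concrete, the following sketch computes the minimal energy cost of erasing one bit, E >= k_B T ln 2, at room temperature. The choice of T = 300 K is an illustrative assumption, not taken from the talk.

```python
import math

# Boltzmann constant in joules per kelvin (CODATA exact value)
K_B = 1.380649e-23

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimal work (in joules) to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# Illustrative case: erasure at room temperature (T = 300 K, an assumed value)
energy = landauer_bound(300.0)
print(f"Landauer bound at 300 K: {energy:.3e} J per bit")  # ~2.87e-21 J
```

At around 3 zeptojoules per bit, this bound lies many orders of magnitude below the dissipation of present-day digital hardware, which is part of what motivates the comparison with biological information processing.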
In this talk, we analyse the thermodynamic efficiency of learning. We first discuss the interplay of information processing and dissipation from the perspective of stochastic thermodynamics. We then show that the dissipation of any physical system, e.g. a neural network, bounds the information that it can infer from data or learn from a teacher. We discuss a number of examples along the way and outline directions for future research.