Introduction:

The AM-GM inequality states that for positive reals \(\{x_i\}_{i=1}^n \subset \mathbb{R}_{+}\):

\begin{equation} \frac{1}{n} \sum_{i} x_i \geq \big(\prod_{i} x_i \big)^{\frac{1}{n}} \end{equation}
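As a quick sanity check, the \(n = 2\) case reduces to the nonnegativity of a square:

\begin{equation} \frac{x_1 + x_2}{2} \geq \sqrt{x_1 x_2} \iff \big(\sqrt{x_1} - \sqrt{x_2}\big)^2 \geq 0 \end{equation}

with equality precisely when \(x_1 = x_2\); in general, equality holds if and only if all the \(x_i\) coincide.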

Interestingly, this may be deduced from thermodynamic principles. What follows is a proof whose main arguments can be found in various places on the internet, but which raises interesting questions about the mathematical structure of the laws of thermodynamics.

These questions are beyond the scope of the present analysis, but I hope to explore them in more detail in the future from the angle of information geometry and algorithmic thermodynamics.

The emergence of the AM-GM inequality in a thermodynamic setting:

Let’s suppose we are in a thermally isolated room containing \(n\) identical beakers of water, the \(i\)-th at temperature \(T_i\). When all the beakers are mixed, the final temperature will equal the arithmetic mean:

\begin{equation} T = \frac{1}{n} \sum_{i=1}^n T_i \end{equation}

as a consequence of the first law of thermodynamics.
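To spell this out: the room exchanges no heat with its surroundings, so the heat flows between the beakers must sum to zero. If each beaker holds the same mass \(m\) of water with constant specific heat \(c\), then:

\begin{equation} \sum_{i=1}^n m c \, (T - T_i) = 0 \implies T = \frac{1}{n} \sum_{i=1}^n T_i \end{equation}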

Now, if we consider the entropy change that results from mixing the \(n\) beakers, we may model it as a two-step process:

  1. Changing the temperatures from \(T_i\) to \(T\).
  2. Mixing all the beakers.

where we use the fact that entropy is a state function: \(\Delta S_{net}\) depends only on the initial and final states, so we may compute it along any path connecting them. Since all the beakers are at the common temperature \(T\) after the first step, the second step does not change the entropy of the system.
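In symbols, the decomposition reads:

\begin{equation} \Delta S_{net} = \sum_{i=1}^n \Delta S_i + \underbrace{\Delta S_{mix}}_{=\,0} \end{equation}

where \(\Delta S_i\) is the entropy change of beaker \(i\) in the first step.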

The Second Law implies the AM-GM inequality:

If \(C\) is the heat capacity of a single beaker of water, then the entropy change of beaker \(i\) during the first step is:

\begin{equation} \Delta S_i = C \cdot \log \big(\frac{T}{T_i}\big) \end{equation}
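This is the standard result of integrating the reversible heat \(\delta Q_{rev} = C \, dT'\) over the temperature change, with \(C\) assumed constant:

\begin{equation} \Delta S_i = \int_{T_i}^{T} \frac{\delta Q_{rev}}{T'} = C \int_{T_i}^{T} \frac{dT'}{T'} = C \cdot \log \big(\frac{T}{T_i}\big) \end{equation}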

and the total entropy change is:

\begin{equation} \Delta S_{net} = C \cdot \sum_i \log \big(\frac{T}{T_i}\big) = C \cdot \log \big(\frac{T^n}{\prod_i T_i}\big) \end{equation}

and, because the room is thermally isolated, the Second Law demands that its entropy cannot decrease:

\begin{equation} \Delta S_{net} \geq 0 \implies \frac{1}{n} \sum_{i} T_i \geq \big(\prod_{i} T_i \big)^{\frac{1}{n}} \end{equation}
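Unpacking the implication: since \(C > 0\) and \(x \mapsto x^{\frac{1}{n}}\) is increasing on the positive reals,

\begin{equation} C \cdot \log \big(\frac{T^n}{\prod_i T_i}\big) \geq 0 \iff T^n \geq \prod_i T_i \iff T \geq \big(\prod_i T_i\big)^{\frac{1}{n}} \end{equation}

and substituting the first-law result \(T = \frac{1}{n} \sum_i T_i\) gives the stated inequality.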

which is exactly the AM-GM inequality for the temperatures, concluding our demonstration. Equality holds precisely when all the beakers start at the same temperature, in which case the mixing produces no entropy.
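For readers who like to see the numbers, here is a minimal Python sketch (not part of the original argument; the temperature range and the value of \(C\) are arbitrary illustrative choices) that checks \(\Delta S_{net} \geq 0\) and AM \(\geq\) GM on random inputs:

```python
import math
import random

# C is an illustrative heat capacity (roughly the specific heat of
# water in J/(g.K)); any positive value works for the inequality.
def entropy_change(temps, C=4.18):
    """Net entropy change C * sum(log(T / T_i)) from mixing
    identical beakers at absolute temperatures `temps`."""
    T = sum(temps) / len(temps)  # final temperature (first law)
    return C * sum(math.log(T / Ti) for Ti in temps)

random.seed(0)
for _ in range(5):
    temps = [random.uniform(273.0, 373.0) for _ in range(4)]
    am = sum(temps) / len(temps)               # arithmetic mean
    gm = math.prod(temps) ** (1 / len(temps))  # geometric mean
    dS = entropy_change(temps)
    assert dS >= 0 and am >= gm
    print(f"AM={am:.2f} K  GM={gm:.2f} K  dS={dS:.4f}")
```

The assertion never fires: the sign of \(\Delta S_{net}\) and the ordering of the two means are the same statement, as the proof above shows.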