generalized AM-GM inequality

As I was reading through my linear analysis notes today, I came across a small
passage where our lecturer Jim Wright gives a proof of a generalisation
of the inequality of arithmetic and geometric means:

Let A, B \geq 0 and 0 \leq \theta \leq 1. Then A^{\theta} B^{1-\theta} \leq \theta A + (1-\theta) B .
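
Taking \theta = \frac{1}{2} , for instance, recovers the familiar two-variable AM-GM inequality:

\sqrt{AB} = A^{\frac{1}{2}} B^{\frac{1}{2}} \leq \frac{A+B}{2}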

I didn’t bother to read the proof as I thought I could probably come up with a
good one myself. Indeed, I am sufficiently happy with the proof I found that
I think it’s worth sharing. Here it is:

Proof:

If AB=0 the inequality is trivial, so I shall proceed assuming A, B > 0 :

  1. Let’s define \beta := \frac{A}{B} \in \mathbb{R}_{+} and divide both sides of the desired inequality by B . It is then enough to show: \beta^{\theta} \leq \theta \beta + (1-\theta)
  2. Now, let’s define the following differentiable functions:\begin{cases} f(x) =\theta x + (1-\theta)  \\ g(x) =x^{\theta} \\ \end{cases}
  3. The derivatives of these functions are given by:\begin{cases} f'(x) =\theta  \\ g'(x) =\theta x^{\theta-1} \\ \end{cases}
  4. Now, comparing f and g , we can quickly reach the following conclusions (assuming 0 < \theta < 1 ; the cases \theta = 0 and \theta = 1 are immediate): a) For x \geq 1 we have x^{\theta - 1} \leq 1 , so f'(x) = \theta \geq \theta x^{\theta - 1} = g'(x) ; together with f(1) = g(1) = 1 this gives f(x) \geq g(x) for x \geq 1
    b) For x \in (0,1] we have x^{\theta - 1} \geq 1 , so g'(x) \geq f'(x) , while \begin{cases} f(0) = 1-\theta \geq g(0)=0  \\ f(1)=g(1)=1 \\ \end{cases}

so g - f is non-decreasing on (0,1] and vanishes at x = 1 , which forces g(x) - f(x) \leq 0 , i.e. f(x) \geq g(x) , on (0,1] .

It follows that f(x) \geq g(x) on all of [0,\infty) , and in particular \beta^{\theta} \leq \theta \beta + (1-\theta) .
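
For completeness, multiplying this through by B > 0 undoes the reduction from step 1 and recovers the original statement:

A^{\theta} B^{1-\theta} = B \beta^{\theta} \leq B (\theta \beta + (1-\theta)) = \theta A + (1-\theta) B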

I must say that this was an easy problem, but I liked the method I found for
solving it: reducing the number of variables and then replacing the variables
with functions that can be readily analysed. We can go a bit further
and show how Hölder’s inequality follows easily.

Hölder’s inequality:

For any two sequences of non-negative reals (a_i)_{i=1}^n,(b_i)_{i=1}^n \subset [0,\infty) , we have:

\sum_{j=1}^{n} a_j b_j \leq (\sum_{j=1}^{n} a_j^p)^{\frac{1}{p}}(\sum_{j=1}^{n} b_j^q)^{\frac{1}{q}}

for 1 < p < \infty , where \frac{1}{p} + \frac{1}{q}=1 .
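
In particular, taking p = q = 2 recovers the Cauchy-Schwarz inequality for sequences:

\sum_{j=1}^{n} a_j b_j \leq (\sum_{j=1}^{n} a_j^2)^{\frac{1}{2}}(\sum_{j=1}^{n} b_j^2)^{\frac{1}{2}}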

Proof:

If either a=(a_i)_{i=1}^n or b=(b_i)_{i=1}^n is the zero vector the inequality is trivial, so assume both are non-zero and normalize these vectors:

\begin{cases} \alpha = \frac{a}{||a||_p} \\ \beta = \frac{b}{||b||_q} \\ \end{cases}

Now, we may apply the generalised AM-GM inequality, with \theta = \frac{1}{p} , A = \alpha_j^p and B = \beta_j^q , to each term \alpha_j \beta_j = (\alpha_j^p)^{\frac{1}{p}}(\beta_j^q)^{\frac{1}{q}} and sum over j to deduce:

\sum_{j=1}^{n} \alpha_j \beta_j \leq \frac{1}{p}(\sum_{j=1}^{n} \alpha_j^p) +\frac{1}{q}(\sum_{j=1}^{n} \beta_j^q) = \frac{1}{p} + \frac{1}{q} = 1

since ||\alpha||_p = ||\beta||_q = 1 . Multiplying both sides by ||a||_p||b||_q then gives \sum_{j=1}^{n} a_j b_j \leq ||a||_p||b||_q , which is exactly Hölder’s inequality.
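
As a quick numerical sanity check, take p = q = 2 with a = (1,2) and b = (3,4) :

\sum_{j=1}^{2} a_j b_j = 1 \cdot 3 + 2 \cdot 4 = 11 \leq \sqrt{1^2+2^2}\,\sqrt{3^2+4^2} = \sqrt{5} \cdot 5 \approx 11.18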
