# Differentiating under the integral sign

Back when I was in high school I really enjoyed reading about Feynman’s entertaining numerical exploits. In particular, I remember integration challenges where he would use the Leibniz method of differentiating under the integral sign. I wasn’t taught this method at university, but it resurfaced in my mind recently when I tackled a problem in Hamiltonian dynamics that required differentiating under the integral sign. After using the method I decided to take a closer look at its mathematical justification.

## Leibniz method

Consider an integral of the form:

\begin{aligned} u(x) = \int_{y_0}^{y_1} f(x,y) dy \end{aligned}

For all $x$ in some open interval, the derivative of this integral is expressible as

\begin{aligned} u'(x) = \int_{y_0}^{y_1} f_x(x,y) dy \end{aligned}

provided that $f$ and its partial derivative $f_x = \partial f / \partial x$ are both continuous over the region $[x_0,x_1] \times [y_0,y_1]$.
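As a quick numerical sanity check of the rule (a sketch: the integrand $f(x,y) = e^{xy}$ and the Simpson-rule integrator are illustrative choices of mine, not part of the theorem), we can compare a finite-difference derivative of $u$ against the Leibniz-rule integral of $f_x$:

```python
import math

def integrate(g, a, b, n=1000):
    # Composite Simpson's rule on [a, b] with n (even) subintervals.
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

# f(x, y) = exp(x * y), so f_x(x, y) = y * exp(x * y).
f = lambda x, y: math.exp(x * y)
f_x = lambda x, y: y * math.exp(x * y)

y0, y1 = 0.0, 1.0
u = lambda x: integrate(lambda y: f(x, y), y0, y1)

x, h = 0.7, 1e-6
# Central finite difference of u versus the Leibniz-rule integral.
lhs = (u(x + h) - u(x - h)) / (2 * h)
rhs = integrate(lambda y: f_x(x, y), y0, y1)
print(lhs, rhs)
```

The two values agree to within the accuracy of the finite difference, as the theorem predicts.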

Proof:

Given

\begin{aligned} u(x) = \int_{y_0}^{y_1} f(x,y) dy \end{aligned}

\begin{aligned} u'(x) = \lim_{h\to 0} \frac{u(x+h)-u(x)}{h} \end{aligned}

we may deduce:

\begin{aligned} u'(x) = \lim_{h\to 0} \frac{\int_{y_0}^{y_1} f(x+h,y) dy-\int_{y_0}^{y_1} f(x,y) dy}{h} = \lim_{h\to 0} \int_{y_0}^{y_1} \frac{f(x+h,y)-f(x,y)}{h} dy \end{aligned}

Restricting $h$ to the sequence $\frac{1}{n}$ (the same argument applies to any sequence tending to $0$, which is all the limit requires), this becomes

\begin{aligned} u'(x) = \lim_{n\to \infty} \int_{y_0}^{y_1} \frac{f(x+\frac{1}{n},y)-f(x,y)}{\frac{1}{n}} dy \end{aligned}

Now, if we define:

\begin{aligned} f_n (y) = \frac{f(x+\frac{1}{n},y)-f(x,y)}{\frac{1}{n}} \end{aligned}

we may apply the bounded convergence theorem to interchange the limit and the integral:

\begin{aligned} \int_{y_0}^{y_1} \lim_{n\to \infty} f_n(y) dy = \lim_{n\to \infty} \int_{y_0}^{y_1} f_n(y) dy \end{aligned}

This is justified because $f_n(y) \to f_x(x,y)$ pointwise as $n \to \infty$, while the continuity of $f_x(x,y)$ on the compact rectangle $[x_0,x_1] \times [y_0,y_1]$ ensures that the sequence $f_n$ is uniformly bounded. The left-hand side of the interchange is then $\int_{y_0}^{y_1} f_x(x,y) dy$ and the right-hand side is $u'(x)$, which completes the proof.
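The uniform bound can be made explicit with the mean value theorem: for each $y$ and each $n$ there is some $\theta_{n,y} \in (0,1)$ such that

\begin{aligned} f_n(y) = \frac{f(x+\frac{1}{n},y)-f(x,y)}{\frac{1}{n}} = f_x\left(x+\frac{\theta_{n,y}}{n}, y\right) \end{aligned}

so $|f_n(y)| \leq \max_{[x_0,x_1] \times [y_0,y_1]} |f_x|$, which is finite since a continuous function attains its maximum on a compact set.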

Note 1: I plan to share the generalization of this result to higher dimensions in the near future.

Note 2: This blog post is almost identical to the Wikipedia entry, with some modifications and clarifications that I find useful.