## Motivation:

Physical sciences have often made progress by identifying useful coordinate transformations. Whether we are using Euclidean Geometry for Newton’s Calculus or Riemannian Geometry for General Relativity, the objective is to find a useful and parsimonious parametrization that allows us to disentangle the variables responsible for the data-generating process. In the past this required human ingenuity, but today, thanks to the invention of computers, we can run algorithms that discover useful parametrizations for a given dataset.

If our data happens to be organised into rows and columns, like a matrix, one very powerful algorithm is the Singular Value Decomposition. It is, in some sense, a data-driven Fourier Transform, and the objective of this article is to provide an existence proof that such a decomposition exists for any rectangular matrix.

## Derivation of the Singular Value Decomposition:

Given a matrix $X \in \mathbb{R}^{m \times n}$ we may form the correlation matrices $X \cdot X^T \in \mathbb{R}^{m \times m}$ and $X^T \cdot X \in \mathbb{R}^{n \times n}$. Now, since these correlation matrices are both square, real and symmetric, they both have eigendecompositions, and we may show that they share the same non-zero eigenvalues.
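As a quick numerical sanity check (a NumPy sketch, with a randomly generated $X$ standing in for real data), we can confirm that the two correlation matrices share their non-zero eigenvalues, the larger spectrum being padded with zeros:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))  # rectangular matrix with m = 3, n = 5

# Eigenvalues of the two correlation matrices, in ascending order
eig_small = np.linalg.eigvalsh(X @ X.T)  # m = 3 eigenvalues
eig_large = np.linalg.eigvalsh(X.T @ X)  # n = 5 eigenvalues

# The top m eigenvalues agree; the remaining n - m are (numerically) zero
assert np.allclose(eig_large[-3:], eig_small)
assert np.allclose(eig_large[:2], 0.0)
```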

Indeed, let’s suppose $\vec{u} \in \mathbb{R}^{m}$ is an eigenvector of $X \cdot X^T$:

$$X \cdot X^T \cdot \vec{u} = \lambda \vec{u}$$

Then, setting:

$$\vec{v} = X^T \vec{u} \in \mathbb{R}^n$$

we have:

$$X^T \cdot X \cdot \vec{v} = X^T \cdot (X \cdot X^T \cdot \vec{u}) = \lambda X^T \vec{u} = \lambda \vec{v}$$

Moreover, $\vec{v} \neq \vec{0}$ whenever $\lambda \neq 0$, since $\lVert \vec{v} \rVert^2 = \vec{u}^T \cdot X \cdot X^T \cdot \vec{u} = \lambda \lVert \vec{u} \rVert^2$.
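The same mapping can be checked numerically (again a NumPy sketch on a random matrix): starting from an eigenpair of $X \cdot X^T$, multiplying the eigenvector by $X^T$ lands on an eigenvector of $X^T \cdot X$ with the same eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 5))

# One eigenpair (lam, u) of the small correlation matrix X X^T
lam_all, U = np.linalg.eigh(X @ X.T)
lam, u = lam_all[-1], U[:, -1]  # largest eigenvalue and its eigenvector

# v = X^T u is an eigenvector of X^T X with the same eigenvalue
v = X.T @ u
assert np.allclose(X.T @ X @ v, lam * v)
```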

so each eigenvector of $X \cdot X^T$ with non-zero eigenvalue is mapped by $X^T$ to an eigenvector of $X^T \cdot X$ with the same eigenvalue (and multiplication by $X$ maps eigenvectors back the other way). Assuming, without loss of generality, that $m \leq n$, and for simplicity that $X$ has full rank $m$ so every $\lambda_i > 0$, we may collect these eigenvectors column-wise into the eigendecompositions:

$$\exists V \in \mathbb{R}^{n \times m}, X^T \cdot X \cdot V = V \cdot \text{diag}(\vec{\lambda})$$

$$\exists U \in \mathbb{R}^{m \times m}, X \cdot X^T \cdot U = U \cdot \text{diag}(\vec{\lambda})$$

where $\text{diag}(\vec{\lambda}) \in \mathbb{R}^{m \times m}$.

Furthermore, we may show that all the eigenvalues $\lambda_i$ are non-negative since $X^T \cdot X$ is positive-semidefinite:

$$\forall z \in \mathbb{R}^n, z^T (X^T \cdot X) z = (Xz)^T (Xz) = \lVert Xz \rVert^2 \geq 0$$
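This quadratic-form identity is easy to probe numerically: for any test vector $z$, the form equals $\lVert Xz \rVert^2$ and is therefore non-negative (a NumPy sketch over random test vectors):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 5))
G = X.T @ X  # the n x n correlation matrix

for _ in range(100):
    z = rng.standard_normal(5)
    q = z @ G @ z
    # z^T (X^T X) z equals ||Xz||^2, hence is non-negative
    assert np.isclose(q, np.linalg.norm(X @ z) ** 2)
    assert q >= 0.0
```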

and from this it follows that $\text{diag}(\vec{\lambda}) = \Sigma^2$ for some diagonal matrix $\Sigma \in \mathbb{R}^{m \times m}$ with non-negative entries $\sigma_i = \sqrt{\lambda_i}$.

Now, $X \cdot X^T$ is real and symmetric, so by the spectral theorem it admits an orthonormal basis of eigenvectors and we may choose $U$ to be orthogonal:

$$X \cdot X^T = U \Sigma^2 U^T, \qquad U^T = U^{-1}$$

Likewise, normalising $\vec{v_i} = X^T \cdot \vec{u_i} / \sigma_i$ (assuming each $\sigma_i > 0$, i.e. $X$ has full row rank) makes the columns of $V$ orthonormal:

$$\vec{v_i}^T \cdot \vec{v_j} = \frac{\vec{u_i}^T \cdot X \cdot X^T \cdot \vec{u_j}}{\sigma_i \sigma_j} = \frac{\lambda_j \, \vec{u_i}^T \cdot \vec{u_j}}{\sigma_i \sigma_j} = \delta_{ij}$$

and so $U^T \cdot U = I_m$ and $V^T \cdot V = I_m$.

Combining these results, we have:

$$X \cdot X^T = U \Sigma^2 U^T = (U \Sigma V^T) \cdot (V \Sigma U^T) = (U \Sigma V^T) \cdot (U \Sigma V^T)^T$$

Finally, we claim that $X = U \Sigma V^T$. Indeed, with the normalised eigenvectors $\vec{v_i} = X^T \cdot \vec{u_i} / \sigma_i$ we have:

$$X \cdot \vec{v_i} = \frac{X \cdot X^T \cdot \vec{u_i}}{\sigma_i} = \frac{\lambda_i}{\sigma_i} \vec{u_i} = \sigma_i \vec{u_i} \implies \lVert X \vec{v_i} \rVert^2 = \sigma_i^2 = \lambda_i$$

which, since the $\vec{u_i}$ are orthonormal, implies:

$$\vec{u_i}^T \cdot X \cdot \vec{v_j} = \sigma_j \, \delta_{ij}$$

so in matrix form we have:

$$U^T \cdot X \cdot V = \Sigma \implies X \cdot V = U \Sigma$$

It remains to note that $X$ vanishes on the orthogonal complement of the columns of $V$: since the $\vec{v_i}$ span every eigenspace of $X^T \cdot X$ with non-zero eigenvalue, any $z \in \mathbb{R}^n$ orthogonal to all of them lies in the null space of $X^T \cdot X$, so $\lVert X z \rVert^2 = z^T \cdot X^T \cdot X \cdot z = 0$. Therefore:

$$X = X \cdot V \cdot V^T = U \Sigma V^T$$

and this concludes our existence proof that a singular value decomposition exists for rectangular matrices.
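To close the loop, here is a short numerical sketch (NumPy, with a random test matrix assumed in place of real data) that builds $U$, $\Sigma$ and $V$ exactly as in the derivation and checks the reconstruction:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 5
X = rng.standard_normal((m, n))  # assume m <= n and full row rank, as above

# Orthogonal U and eigenvalues of X X^T, sorted in descending order
lam, U = np.linalg.eigh(X @ X.T)
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]

sigma = np.sqrt(lam)   # singular values sigma_i = sqrt(lambda_i)
V = X.T @ U / sigma    # normalised columns v_i = X^T u_i / sigma_i

# V has orthonormal columns and U Sigma V^T reconstructs X
assert np.allclose(V.T @ V, np.eye(m))
assert np.allclose(U @ np.diag(sigma) @ V.T, X)
```

The singular values found this way match `np.linalg.svd(X, compute_uv=False)` up to numerical precision, although the individual singular vectors are only determined up to sign.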