Motivation:

The physical sciences have often made progress by identifying useful coordinate transformations. Whether we are using Euclidean geometry for Newton's calculus or Riemannian geometry for General Relativity, the objective is to find a useful and parsimonious parametrization that allows us to disentangle the variables responsible for the data-generating process. In the past this required human ingenuity; today, thanks to computers, we can run algorithms that discover useful parametrizations for a given dataset.

If our data happens to be organised into rows and columns, like a matrix, one very powerful algorithm is the Singular Value Decomposition (SVD). It is, in some sense, a data-driven Fourier transform, and the objective of this article is to provide an existence proof of such decompositions for rectangular matrices.

Derivation of the Singular Value Decomposition:

Given a matrix $X \in \mathbb{R}^{m \times n}$ we may form the correlation matrices $X \cdot X^T \in \mathbb{R}^{m \times m}$ and $X^T \cdot X \in \mathbb{R}^{n \times n}$. Now, since these correlation matrices are both square, they both have eigendecompositions and we may show that they have the same nonzero eigenvalues.

Indeed, let’s suppose:

\begin{equation} \exists \vec{u} \in \mathbb{R}^{m},X \cdot X^T \cdot \vec{u} = \lambda \vec{u} \end{equation}

then, defining:

\begin{equation} \vec{v} = X^T \cdot \vec{u} \in \mathbb{R}^n \end{equation}

we find:

\begin{equation} X^T \cdot X \cdot \vec{v} = X^T \cdot (X \cdot X^T \cdot \vec{u}) = \lambda X^T \cdot \vec{u} = \lambda \vec{v} \end{equation}

so the eigenvectors of $X \cdot X^T$ and $X^T \cdot X$ are linear transformations of each other, and we may deduce that these matrices have the following eigendecompositions:

\begin{equation} \exists V \in \mathbb{R}^{n \times n}, X^T \cdot X \cdot V = V \cdot \text{diag}(\vec{\lambda}) \end{equation}

\begin{equation} \exists U \in \mathbb{R}^{m \times m}, X \cdot X^T \cdot U = U \cdot \text{diag}(\vec{\lambda}) \end{equation}

so $X \cdot X^T$ and $X^T \cdot X$ share the same nonzero eigenvalues $\vec{\lambda}$ (the larger of the two matrices simply has additional zero eigenvalues).
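
As a quick sanity check, here is a minimal NumPy sketch (the random matrix and all variable names are illustrative, not part of the derivation) verifying both the eigenvector transfer $\vec{v} = X^T \cdot \vec{u}$ and the shared nonzero eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
X = rng.standard_normal((m, n))

# Eigendecompositions of the two square correlation matrices.
lam_u, U = np.linalg.eigh(X @ X.T)   # m x m, eigenvalues in ascending order
lam_v, V = np.linalg.eigh(X.T @ X)   # n x n

# Map the top eigenvector of X X^T via u -> v = X^T u.
u, lam = U[:, -1], lam_u[-1]
v = X.T @ u

# v is an eigenvector of X^T X with the same eigenvalue.
print(np.allclose(X.T @ X @ v, lam * v))   # True

# The nonzero eigenvalues coincide; X X^T has m - n extra zeros here.
print(np.allclose(lam_u[-n:], lam_v))      # True
```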

Furthermore, we may show that all the eigenvalues are non-negative since $X^T \cdot X$ is positive semi-definite:

\begin{equation} \forall z \in \mathbb{R}^n, z^T (X^T \cdot X) z = (Xz)^T (Xz) = \lVert Xz \rVert^2 \geq 0 \end{equation}

and from this it follows that $\lambda_i = \sigma_i^2$ for some $\sigma_i \geq 0$, so that $\text{diag}(\vec{\lambda}) = \Sigma^2$ where $\Sigma = \text{diag}(\vec{\sigma})$.
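
The quadratic-form inequality is easy to probe numerically; a minimal sketch, again with an arbitrary random $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

# z^T (X^T X) z = ||X z||^2, which can never be negative.
for _ in range(1_000):
    z = rng.standard_normal(3)
    assert z @ (X.T @ X) @ z >= 0.0

# Hence every eigenvalue of X^T X is non-negative (up to rounding).
print(np.linalg.eigvalsh(X.T @ X).min() >= -1e-12)
```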

Now, we may note that $X \cdot X^T$ is real and symmetric so, by the spectral theorem, its eigenvectors may be chosen orthonormal and we have:

\begin{equation} X \cdot X^T = U \Sigma^2 U^{-1} = (U \Sigma^2 U^{-1})^T \implies U^T = U^{-1} \end{equation}

and so we may deduce that $U$ and $V$ are both orthogonal matrices.
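
Since NumPy's np.linalg.eigh returns an orthonormal eigenbasis for a real symmetric matrix, we can confirm this orthogonality directly (same illustrative setup as before):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

_, U = np.linalg.eigh(X @ X.T)
_, V = np.linalg.eigh(X.T @ X)

# U^T = U^{-1} and V^T = V^{-1}.
print(np.allclose(U.T @ U, np.eye(5)))   # True
print(np.allclose(V.T @ V, np.eye(3)))   # True
```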

Combining these results, and using the fact that $V^T \cdot V = I$, we have:

\begin{equation} X \cdot X^T = U \Sigma^2 U^T = (U \Sigma V^T) \cdot (V \Sigma U^T) = (U \Sigma V^T) \cdot (U \Sigma V^T)^T \end{equation}

Finally, we claim that $X = U \Sigma V^T$. If we normalise the vectors in (2) so that $\vec{v_i} = X^T \cdot \vec{u_i} / \sigma_i$ has unit norm, then by (1) we have:

\begin{equation} X \cdot \vec{v_i} = \frac{1}{\sigma_i} X \cdot X^T \cdot \vec{u_i} = \sigma_i \vec{u_i} \implies \lVert X \vec{v_i} \rVert^2 = \sigma_i^2 = \lambda_i \end{equation}

which implies:

\begin{equation} \vec{u_i}^T \cdot X \cdot \vec{v_i} = \sigma_i \end{equation}

so in matrix form we have:

\begin{equation} U^T \cdot X \cdot V = \Sigma \implies X = U \Sigma V^T \end{equation}

and this concludes our proof that a singular value decomposition exists for rectangular matrices.
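
To close, here is the whole construction run end to end in NumPy: a sketch assuming $m \geq n$ and full column rank (so that every $\sigma_i > 0$), with a random $X$ standing in for real data:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
X = rng.standard_normal((m, n))

# Eigendecomposition of X X^T, sorted by decreasing eigenvalue.
lam, U = np.linalg.eigh(X @ X.T)
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]

# Singular values: square roots of the n leading (non-negative) eigenvalues.
sigma = np.sqrt(np.clip(lam[:n], 0.0, None))

# Unit-norm right singular vectors: v_i = X^T u_i / sigma_i.
V = X.T @ U[:, :n] / sigma

# U^T X V = Sigma, hence X = U Sigma V^T.
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(sigma)
print(np.allclose(U.T @ X @ V, Sigma))   # True
print(np.allclose(U @ Sigma @ V.T, X))   # True
```

For distinct singular values, these factors agree with what np.linalg.svd would return, up to the signs of the singular vector pairs.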