## Introduction:

A classical result in the theory of random matrices states that an $n \times n$ matrix whose entries are sampled independently from a continuous distribution is nonsingular with probability one. This has many important consequences.

One fundamental consequence is that almost all linear models with square Jacobian matrices are invertible.
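This claim is easy to probe numerically. A minimal sketch with NumPy (the size $n$, bound $N$, and seed below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 5, 10.0

# Entries drawn i.i.d. from the continuous uniform distribution on [-N, N]
A = rng.uniform(-N, N, size=(n, n))

# A square matrix is nonsingular iff it has full rank (equivalently, det(A) != 0)
rank = np.linalg.matrix_rank(A)
print(rank == n)
```

Because the entries come from a continuous distribution, a sampled matrix is full-rank with probability one; numerically, its determinant is nonzero up to floating-point tolerance.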

## The spectral norm and the largest eigenvalue:

For a random matrix $A \sim \mathcal{U}([-N,N])^{n \times n}$, i.e. one whose entries are drawn i.i.d. from the uniform distribution on $[-N,N]$, let $\lambda$ be an eigenvalue of $A$ of largest absolute value and $\upsilon$ a corresponding unit eigenvector. Then:

$$\lVert A \rVert_{2} \geq \lVert A \upsilon \rVert_{2} = \lVert \lambda \upsilon \rVert_{2} = | \lambda | \lVert \upsilon \rVert_{2} = | \lambda |$$

where

$$\lVert A \rVert_{2} = \sqrt{\lambda_{max}(A^T A)} = \sigma_{max}(A)$$

so the absolute value of the largest eigenvalue of $A$ is bounded above by its spectral norm.
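This inequality is easy to verify numerically; a minimal sketch with NumPy (dimension, bound, and seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 6, 3.0
A = rng.uniform(-N, N, size=(n, n))

# Eigenvalues of a general (non-symmetric) matrix may be complex,
# so compare moduli: the spectral radius versus the spectral norm
spectral_radius = max(abs(np.linalg.eigvals(A)))
spectral_norm = np.linalg.norm(A, 2)  # largest singular value of A

print(spectral_radius <= spectral_norm)
```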

## Hölder’s inequality and the determinant:

Now, by Hölder’s inequality we have:

$$\lVert A \rVert_{2} \leq \sqrt{\lVert A \rVert_{1} \lVert A \rVert_{\infty}}$$

where $\lVert A \rVert_{1}$ is the maximum column sum and $\lVert A \rVert_{\infty}$ is the maximum row sum:

$$\lVert A \rVert_{1} = \max_{1 \leq j \leq n} \sum_{i=1}^n |a_{ij}|$$

$$\lVert A \rVert_{\infty} = \max_{1 \leq i \leq n} \sum_{j=1}^n |a_{ij}|$$
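A quick numerical check of this norm inequality. NumPy's `ord=1` and `ord=inf` matrix norms are exactly the maximum column and row sums above (the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.uniform(-4.0, 4.0, size=(5, 5))

norm_1 = np.linalg.norm(A, 1)         # maximum absolute column sum
norm_inf = np.linalg.norm(A, np.inf)  # maximum absolute row sum
norm_2 = np.linalg.norm(A, 2)         # spectral norm

print(norm_2 <= np.sqrt(norm_1 * norm_inf))
```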

Given that the determinant of a matrix is the product of its eigenvalues, we may combine the eigenvalue bound from the previous section with Hölder's inequality:

$$|\det(A)| = \prod_{i=1}^n |\lambda_i| \leq \lVert A \rVert_{2}^n \leq \left(\sqrt{\lVert A \rVert_{1} \lVert A \rVert_{\infty}}\right)^n \leq (nN)^n$$

where the last step uses $|a_{ij}| \leq N$, so every row and column sum of $A$ is at most $nN$.
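The whole chain of bounds can be checked on a sample (a sketch; $n$, $N$, and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 4, 2.0
A = rng.uniform(-N, N, size=(n, n))

det = abs(np.linalg.det(A))
norm_2 = np.linalg.norm(A, 2)
holder = np.sqrt(np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))

# |det(A)| <= ||A||_2^n <= (sqrt(||A||_1 ||A||_inf))^n <= (n N)^n
print(det <= norm_2**n <= holder**n <= (n * N) ** n)
```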

The same chain also bounds each eigenvalue: for every $A \in [-N,N]^{n \times n}$,

$$\det(A - \lambda I_n) = 0 \implies |\lambda| \leq \lVert A \rVert_{2} \leq nN$$

so the spectrum of $A$ is contained in the closed disk of radius $nN$.

In particular, the determinant restricts to a mapping

$$\det: [-N,N]^{n \times n} \longrightarrow [-(nN)^n,(nN)^n]$$

and we can also show that this mapping is analytic.

## The determinant maps sets of positive measure to sets of positive measure:

The determinant is a polynomial in the coordinates of $A \in [-N,N]^{n \times n}$:

$$\det(A) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i,\sigma(i)}$$
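As a sanity check, the Leibniz sum above can be implemented directly and compared with `np.linalg.det`. The sum has $n!$ terms, so this is for illustration on small matrices only (function name and sizes are my own choices):

```python
import numpy as np
from itertools import permutations

def leibniz_det(A):
    """Determinant via the Leibniz expansion over all permutations."""
    n = A.shape[0]
    total = 0.0
    for sigma in permutations(range(n)):
        # sgn(sigma) = (-1)^(number of inversions)
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if sigma[i] > sigma[j])
        sign = -1.0 if inversions % 2 else 1.0
        prod = 1.0
        for i in range(n):
            prod *= A[i, sigma[i]]
        total += sign * prod
    return total

rng = np.random.default_rng(4)
A = rng.uniform(-1.0, 1.0, size=(4, 4))
print(np.isclose(leibniz_det(A), np.linalg.det(A)))
```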

so if we define the set of singular matrices:

$$S = \{A \in [-N,N]^{n \times n}: \det(A)=0\}$$

we note that $\{0\}$ is a set of measure zero, while $\det$ is a non-constant analytic function (it is a polynomial, and e.g. $\det(N I_n) = N^n \neq 0$), and non-constant analytic functions map sets of positive measure to sets of positive measure. If $S$ had positive measure, its image $\det(S) = \{0\}$ would have positive measure, a contradiction. Hence $S$ is a set of measure zero, and a matrix drawn uniformly from $[-N,N]^{n \times n}$ is nonsingular with probability one.
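Finally, the measure-zero claim is reflected in simulation: over many draws from the uniform model, no singular matrix appears (a sketch; the trial count, $n$, $N$, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
n, N, trials = 3, 1.0, 10_000

# Count rank-deficient (singular) samples; almost surely none occur
singular = 0
for _ in range(trials):
    A = rng.uniform(-N, N, size=(n, n))
    if np.linalg.matrix_rank(A) < n:
        singular += 1

print(singular)
```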