Erdős’ proof of Euclid’s theorem

An information-theoretic adaptation of Erdős’ proof of Euclid’s theorem, which shows that the information content of finitely many primes is insufficient to generate all the integers.

Aidan Rocke https://github.com/AidanRocke
01-01-2022

Suppose that the set of primes \(\mathbb{P}\) is finite, so that \(\mathbb{P} = \{p_i\}_{i=1}^{\pi(N)}\) where \(\pi(N)\) is constant. Let \(Z \sim U(\{1, \dots, N\})\) be uniformly distributed. By unique factorization, \(Z\) may be written as a square-free part times a perfect square: there exist random variables \(X_i \in \{0,1\}\) and \(1 \leq Y \leq \sqrt{N}\) such that

\[\begin{equation} Z = \Big(\prod_{i=1}^{\pi(N)} p_i^{X_i}\Big) \cdot Y^2 \tag{1} \end{equation}\]

where \(\prod_{i=1}^{\pi(N)} p_i^{X_i}\) is square-free.
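
As a quick sanity check on the decomposition (1), here is a minimal Python sketch (not part of the original argument; the function name and the choice \(N = 10^4\) are illustrative) that writes every \(n \leq N\) as a square-free part times a perfect square and confirms that the map \(n \mapsto (m, y)\) is one-to-one with \(y \leq \sqrt{N}\):

```python
from math import isqrt

def squarefree_square_decomposition(n):
    """Write n = m * y**2 with m square-free, by dividing out square
    prime-power factors. Returns the pair (m, y)."""
    m, y, d = n, 1, 2
    while d * d <= m:
        while m % (d * d) == 0:
            m //= d * d
            y *= d
        d += 1
    return m, y

# Every integer 1 <= n <= N corresponds to a unique pair (m, y) with m
# square-free and y <= sqrt(N), which is the decomposition in equation (1).
N = 10**4  # illustrative choice
pairs = {n: squarefree_square_decomposition(n) for n in range(1, N + 1)}
assert all(m * y**2 == n for n, (m, y) in pairs.items())
assert len(set(pairs.values())) == N                  # the decomposition is unique
assert all(y <= isqrt(N) for _, y in pairs.values())  # and Y is at most sqrt(N)
```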

Now, since \(Y\) takes at most \(\sqrt{N}\) values, its Shannon entropy in base 2 is bounded by:

\[\begin{equation} H(Y) \leq \frac{1}{2}\log_2 N \tag{2} \end{equation}\]

we may use the fact that the expected Kolmogorov complexity of a random variable equals its Shannon entropy up to an additive constant [3], \(\mathbb{E}[K_U(Z)] = H(Z) + \mathcal{O}(1)\), together with the fact that \(Z\) and \((Y, X_1, \dots, X_{\pi(N)})\) determine one another via (1), to deduce:

\[\begin{equation} \mathbb{E}[K_U(Z)] \sim H(Z) = H(Y,X_1,...,X_{\pi(N)}) \tag{3} \end{equation}\]

Since \(Z\) is uniformly distributed on \(\{1, \dots, N\}\), \(H(Z) = \log_2 N\). Moreover, by the subadditivity of Shannon entropy, \(H(Y, X_1, \dots, X_{\pi(N)}) \leq H(Y) + \sum_{i=1}^{\pi(N)} H(X_i)\), and \(H(X_i) \leq 1\) since each \(X_i\) is binary. Combining these observations with (2) and (3), we obtain:

\[\begin{equation} \log_2 N \leq \frac{1}{2}\log_2 N + \pi(N) \tag{4} \end{equation}\]
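This chain of inequalities can also be checked numerically. The following sketch (again illustrative: it reuses the decomposition helper from the earlier snippet, a naive primality test, and an arbitrary choice of \(N\)) computes the exact marginal entropies of \(Y\) and of each \(X_i\) for \(Z\) uniform on \(\{1, \dots, N\}\) and verifies that \(\log_2 N \leq H(Y) + \sum_i H(X_i) \leq \frac{1}{2}\log_2 N + \pi(N)\):

```python
from math import log2, isqrt
from collections import Counter

def squarefree_square_decomposition(n):
    # same helper as in the earlier sketch
    m, y, d = n, 1, 2
    while d * d <= m:
        while m % (d * d) == 0:
            m //= d * d
            y *= d
        d += 1
    return m, y

def entropy(counts, total):
    """Shannon entropy (base 2) of the distribution given by `counts`."""
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

N = 10**3  # illustrative choice
primes = [p for p in range(2, N + 1)
          if all(p % q for q in range(2, isqrt(p) + 1))]

y_counts = Counter()
x_counts = {p: Counter() for p in primes}
for z in range(1, N + 1):                  # Z uniform on {1, ..., N}
    m, y = squarefree_square_decomposition(z)
    y_counts[y] += 1
    for p in primes:
        x_counts[p][int(m % p == 0)] += 1  # X_i = 1 iff p_i divides the square-free part

H_Y = entropy(y_counts, N)
sum_H_X = sum(entropy(c, N) for c in x_counts.values())

# log2(N) = H(Z) = H(Y, X_1, ..., X_pi(N)) <= H(Y) + sum_i H(X_i)
#         <= (1/2) * log2(N) + pi(N), matching equations (3) and (4).
assert log2(N) <= H_Y + sum_H_X <= 0.5 * log2(N) + len(primes)
```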

Thus, we may conclude:

\[\begin{equation} \pi(N) \geq \frac{1}{2}\log_2 N \tag{5} \end{equation}\]

Since \(N\) was arbitrary, this contradicts the assumption that \(\pi(N)\) is constant, and it also provides an effective lower bound on the prime counting function.
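
To compare the lower bound in (5) with the actual prime counting function, here is a short sketch (illustrative only; the sieve implementation and the chosen values of \(N\) are my own):

```python
from math import log2

def prime_count(n):
    """pi(n): number of primes <= n, via a simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

# Compare pi(N) with the lower bound (1/2) * log2(N) from equation (5).
for N in (10, 10**2, 10**4, 10**6):
    print(f"N = {N:>7}   pi(N) = {prime_count(N):>6}   (1/2) log2 N = {0.5 * log2(N):.2f}")
```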

While Euclid’s theorem is typically taken to mean that there are infinitely many primes, this information-theoretic derivation explains exactly why there must be infinitely many of them: the information content of finitely many primes does not suffice to generate all the integers.

Acknowledgements: This line of argument first occurred to I. Kontoyiannis [4,5].

References:

  1. Martin Aigner and Günter M. Ziegler. Proofs from THE BOOK. 1998.
  2. Julian Havil. Gamma: Exploring Euler’s Constant. Princeton University Press, 2003.
  3. Peter Grünwald and Paul Vitányi. Shannon Information and Kolmogorov Complexity. 2010.
  4. I. Kontoyiannis. Counting the primes using entropy. IEEE Information Theory Society Newsletter, 2008, pp. 6–9.
  5. I. Kontoyiannis. Some information-theoretic computations related to the distribution of prime numbers. In Festschrift in Honor of Jorma Rissanen (P. Grünwald, P. Myllymäki, I. Tabus, M. Weinberger, B. Yu, eds.), pp. 135–143. Tampere University Press, 2008.

Citation

For attribution, please cite this work as

Rocke (2022, Jan. 1). Kepler Lounge: Erdős' proof of Euclid's theorem. Retrieved from keplerlounge.com

BibTeX citation

@misc{rocke2022erdos,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: Erdős' proof of Euclid's theorem},
  url = {keplerlounge.com},
  year = {2022}
}