## Motivation:

How could a reasonable scientist make sense of the astonishing developments in number theory over the course of millennia (dating back to Euclid and Eratosthenes) without a precise application in mind? What's more, how did number theory singularly avoid developing into a baroque construct, the tragic fate of all pure sciences that aren't moulded by practical applications? Yet, if not for these developments, secure communications and modern cryptography in general would be non-existent. It is as if the best musicians in history developed the fundamentals of jazz music without a particular audience in mind, each generation of mathematicians passing down their craft in number theory to the next with a consistency and originality unmatched by any other branch of science.

## The distribution of primes and information-theoretic mechanisms of human cognition:

From the Prime Number Theorem, we may infer that the number of primes less than $$N$$ is given by:

$$\pi(N) \sim \frac{N}{\ln N}$$
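This approximation is easy to check numerically. The sketch below (not from the original text; the helper name `prime_pi` is chosen for illustration) counts primes with a sieve of Eratosthenes and compares the count against $$N/\ln N$$:

```python
import math

def prime_pi(n: int) -> int:
    """Count the primes <= n using a sieve of Eratosthenes."""
    if n < 2:
        return 0
    is_prime = bytearray([1]) * (n + 1)
    is_prime[0] = is_prime[1] = 0
    for p in range(2, math.isqrt(n) + 1):
        if is_prime[p]:
            # Mark every multiple of p starting from p*p as composite.
            is_prime[p * p :: p] = bytearray(len(is_prime[p * p :: p]))
    return sum(is_prime)

for N in (10**4, 10**5, 10**6):
    approx = N / math.log(N)
    print(N, prime_pi(N), round(approx), round(prime_pi(N) / approx, 3))
    # e.g. pi(10**6) = 78498 while N/ln N is roughly 72382; the ratio
    # drifts toward 1 only slowly, as the PNT is an asymptotic statement.
```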

What makes the density of primes interesting for an information theorist is that the typical frequency with which a large integer $$N$$ is prime is inversely proportional to the information gained from observing that integer, $$K_U(N)$$, multiplied by the constant $$\frac{\frac{d}{dN} 2^N}{2^N} = \ln 2$$:

$$K_U(N) \sim \log_2(N) \sim \frac{1}{\ln 2} \cdot \Big(\frac{\pi(N)}{N} \Big) ^{-1} \sim \frac{\ln N}{\ln 2}$$

where $$\ln N$$ corresponds to the average information, in nats, gained from identifying a unique object distributed uniformly among $$N$$ possible locations.
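The relation can also be checked numerically. The sketch below (with an assumed window of $$10^4$$ integers near $$N = 10^6$$; these parameters are illustrative, not from the original text) estimates the prime density near $$N$$ by trial division and compares $$K_U(N) \sim \log_2 N$$ against $$\frac{1}{\ln 2}$$ times the inverse of that density:

```python
import math

def is_prime(n: int) -> bool:
    """Trial division; adequate for the modest sizes used here."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

N, window = 10**6, 10**4
count = sum(is_prime(k) for k in range(N, N + window))
density = count / window       # empirical prime density near N
info_bits = math.log2(N)       # K_U(N) ~ log2 N: bits to specify N
print(f"density near N:    {density:.5f}")
print(f"1/ln N:            {1 / math.log(N):.5f}")
print(f"log2 N:            {info_bits:.2f}")
print(f"(1/ln 2)/density:  {1 / (math.log(2) * density):.2f}")
```

The two final quantities agree closely, illustrating that the information content of a large integer tracks the reciprocal of the local prime density up to the factor $$\ln 2$$.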

In addition, information theory and information-theoretic formulations of human cognition, such as the Free Energy Principle, provide us with an ideal vantage point from which to clarify the nature of Godfrey Hardy's remarkable insight:

> A science is said to be useful if its development tends to accentuate the existing inequalities in the distribution of wealth, or more directly promotes the destruction of human life. The theory of prime numbers satisfies no such criteria. Those who pursue it will, if they are wise, make no attempt to justify their interest in a subject so trivial and so remote, and will console themselves with the thought that the greatest mathematicians of all ages have found in it a mysterious attraction impossible to resist.
>
> (Godfrey Hardy)

The second half of this quote is crucial, as it appears that we may use information theory to address part of the mystery Hardy refers to. The information-theoretic arguments presented thus far suggest that, to a large extent, the human brain is an instrument for data compression besides being a movement co-processor. The implicit argument here is that good mathematicians maximise expected surprise (i.e. information gained).

Besides telling us something profound about the human mind, might this also reveal something important about the distribution of primes? It might, if we first make the reasonable conjecture that all of physics may be simulated by a Universal Quantum Computer. In light of this hypothesis, it is worth considering that a number of reasonable theories of quantum measurement are not observer-independent, including the Wigner-von Neumann formulation of the measurement problem as well as the Many-Worlds formulation.

Deeper investigations in these complementary directions may reveal a direct correspondence between information-theoretic mechanisms of human cognition and the distribution of prime numbers.

## References:

1. J. Hadamard, Sur la distribution des zéros de la fonction ζ(s) et ses conséquences arithmétiques, Bull. Soc. Math. France 24 (1896), 199–220; reprinted in Œuvres de Jacques Hadamard, C.N.R.S., Paris, 1968, vol. 1, 189–210.

2. Aidan Rocke (https://mathoverflow.net/users/56328/aidan-rocke), information-theoretic derivation of the prime number theorem, URL (version: 2021-02-20): https://mathoverflow.net/q/384109

3. Lance Fortnow. Kolmogorov Complexity. 2000.

4. John A. Wheeler, 1990, “Information, physics, quantum: The search for links” in W. Zurek (ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.

5. Karl Friston. The free-energy principle: a rough guide to the brain? Trends in Cognitive Sciences. 2009.

6. Hugh Everett. Theory of the Universal Wavefunction. Thesis, Princeton University (1956, 1973), pp. 1–140.