The Von Neumann Entropy and the Riemann Hypothesis

Assuming that all of physics may be derived from Peano Arithmetic and that all the Quantum Information in the Universe is conserved, we may conjecture that the integers must have been specified as part of the initial state of the Universe and that they have a natural encoding using the Von Neumann Entropy.

Aidan Rocke https://github.com/AidanRocke
03-08-2022

We construct the integers \(\mathbb{N}\) using a generalisation of the double-slit experiment to countably many slits. Within this thought experiment, we show that all the information in the integers may be expressed as a sum of Quantum Amplitudes. This information-theoretic approach may be understood as an alternative to the Von Neumann construction of the integers [6]. In this physical construction, it is naturally assumed that the entire construction takes place within the Planck time scale.

Constructing the integers using density matrices:

Given that the entropy of the distribution of primes is naturally expressed using the natural base (i.e. base-e) [1], we may encode compact sets of integers as follows:

\[\begin{equation} \forall X \sim U([1,N]), H(X) = \ln N \tag{1} \end{equation}\]

as all the information in the integers is in the primes. Now, we may encode the integer \(X\) in a lossless manner with a density matrix of dimension \(N \in \mathbb{N}\) provided that:

\[\begin{equation} S \circ \rho_N = -\sum_{i=1}^N \lvert \Psi_{i,N} \rvert^2 \cdot \ln \lvert \Psi_{i,N} \rvert^2 = \ln N \tag{2} \end{equation}\]

where \(\rho_N' = U \circ \rho_N \implies S \circ \rho_N' = S \circ \rho_N\) for any unitary transformation \(U\), and \(\Psi_{i,N}\) represents the Quantum Amplitude associated with the ith slit in the Nth experiment.
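As a quick numerical sketch (in Python; the function name is an illustrative choice, not part of the construction), the equiprobable-slit ensemble \(\lvert \Psi_{i,N} \rvert^2 = \frac{1}{N}\) saturates (2):

```python
import math

def von_neumann_entropy(probs):
    """Entropy (in nats) of a density matrix diagonal in the slit basis."""
    return -sum(p * math.log(p) for p in probs if p > 0)

N = 100
# Equiprobable slits: |Psi_{i,N}|^2 = 1/N for each of the N slits.
probs = [1.0 / N] * N
S = von_neumann_entropy(probs)
print(S, math.log(N))  # S equals ln N for the uniform ensemble
```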

A control-theoretic analysis:

In order to analyse the entropy-valued observables (2) for all \(N\) simultaneously, we may define the entropy distribution:

\[\begin{equation} f(t) = \sum_{N=1}^\infty \delta(t-S \circ \rho_N) = \sum_{N=1}^\infty \delta(t- \ln N) \tag{3} \end{equation}\]

where the Dirac Delta distribution is a natural choice because the Shannon Source Coding theorem tells us that we can’t guarantee lossless compression of the integers unless: \(\forall \epsilon > 0, \lvert S \circ \rho_N - \ln N \rvert < \epsilon\).

Moreover, we may note that the causal structure of \(f\) is encoded in its Laplace Transform:

\[\begin{equation} F(s) = \mathcal{L}\{f\}(s) = \sum_{n=1}^\infty \int_{0}^\infty e^{-st}\cdot \delta(t-\ln n)\,dt = \sum_{n=1}^\infty \frac{1}{n^s}= \zeta(s) \tag{4} \end{equation}\]

where \(\zeta\) is the Riemann Zeta function.
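For \(\text{Re}(s) > 1\), the identity (4) may be checked numerically: the partial sums of \(\sum_n e^{-s \ln n}\) approach \(\zeta(s)\), e.g. \(\zeta(2) = \frac{\pi^2}{6}\). A minimal sketch:

```python
import math

def zeta_partial(s, terms):
    """Partial sum of the Dirichlet series sum_n exp(-s * ln n) = sum_n n^{-s}."""
    return sum(math.exp(-s * math.log(n)) for n in range(1, terms + 1))

# For Re(s) > 1 the Laplace transform of the entropy comb converges to zeta(s);
# at s = 2 the limit is the Basel value pi^2 / 6.
approx = zeta_partial(2.0, 100_000)
print(approx, math.pi ** 2 / 6)
```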

As \(\zeta\) is holomorphic on \(\mathbb{C} \setminus \{1\}\) with a simple pole at \(s=1\), so that \((s-1) \cdot \zeta(s)\) is entire, by the Hadamard Factorisation theorem (a sharpening of the Weierstrass Factorisation theorem) we may reconstruct \(\zeta\) using its non-trivial zeros:

\[\begin{equation} \zeta(s) = \frac{\pi^{s/2} \prod_{\omega} \big(1-\frac{s}{\omega} \big)}{2(s-1)\Gamma(1+\frac{s}{2})} \tag{5} \end{equation}\]

where the \(\omega\) denote the non-trivial zeroes i.e. \(0 < \text{Re}(\omega) < 1\), and the product is taken over zeroes paired symmetrically about the critical line so that it converges. Thus, we may infer that all the predictive information in \(\zeta\) must be concentrated in its non-trivial zeros.
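As a sanity check on (5), the standard identity \(\Gamma(1+\frac{s}{2}) = \frac{s}{2} \cdot \Gamma(\frac{s}{2})\) lets us rewrite the denominator:

\[2(s-1)\Gamma\big(1+\frac{s}{2}\big) = s(s-1)\Gamma\big(\frac{s}{2}\big)\]

so (5) is equivalent to the Hadamard product \(\xi(s) = \xi(0) \cdot \prod_{\omega} \big(1-\frac{s}{\omega}\big)\) for the completed zeta function \(\xi(s) = \frac{1}{2}s(s-1)\pi^{-s/2}\Gamma(\frac{s}{2})\zeta(s)\), using \(\xi(0) = \frac{1}{2}\).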

The method of Quantum Amplitudes:

As the Dirac Delta may be viewed as a generalised function, we may reformulate (4) as an Expectation:

\[\begin{equation} \mathbb{E}[\Psi] = \int_{\mathbb{R}_{+}} \sum_{n=1}^\infty \Psi_n \cdot P_n(\xi) d\xi = \zeta(s) \tag{6} \end{equation}\]

where \(\Psi_n = e^{-s\xi}\), \(\xi\) denotes entropy and \(P_n(\xi)=\delta(\xi-\ln n)\).
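As a sketch of the sampling property used in (6), we may replace the Dirac Delta with a narrow Gaussian and integrate numerically; the integral should return \(\Psi_n\) evaluated at \(\xi = \ln n\), i.e. \(n^{-s}\). The choices \(s = 2\), \(n = 3\) and the Gaussian width below are purely illustrative:

```python
import math

def narrow_gaussian(x, centre, eps):
    """Gaussian approximation to the Dirac delta, with width eps."""
    return math.exp(-((x - centre) ** 2) / (2 * eps ** 2)) / (eps * math.sqrt(2 * math.pi))

def expectation(s, n, eps=1e-3, steps=20_000):
    """Trapezoid-rule estimate of: integral of exp(-s*xi) * delta(xi - ln n) d(xi)."""
    centre = math.log(n)
    a, b = centre - 10 * eps, centre + 10 * eps  # the Gaussian is negligible beyond 10 sigma
    h = (b - a) / steps
    total = 0.0
    for k in range(steps + 1):
        xi = a + k * h
        w = 0.5 if k in (0, steps) else 1.0
        total += w * math.exp(-s * xi) * narrow_gaussian(xi, centre, eps)
    return total * h

# Sampling property: the integral returns Psi_n at xi = ln n, i.e. n^{-s}.
print(expectation(2.0, 3), 3.0 ** -2)
```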

Applying the Law of the Unconscious Statistician:

\[\begin{equation} \mathbb{E}[\lvert \Psi_n \rvert] = \int_{\mathbb{R}_{+}} \lvert \Psi_n \rvert \cdot \delta(\ln n +\ln \lvert \Psi_n \rvert^2)d\Psi_n = \int_{\mathbb{R}_{+}} \lvert e^{-s \xi} \rvert \cdot \delta(\xi - \ln n)d\xi = \frac{1}{\sqrt{n}} \tag{7} \end{equation}\]

which, since \(\lvert \Psi_n \rvert = \lvert e^{-s\xi} \rvert = e^{-\text{Re}(s) \cdot \xi}\) evaluated at \(\xi = \ln n\) equals \(n^{-\text{Re}(s)}\), forces \(\text{Re}(s) = \frac{1}{2}\) and implies that:

\[\begin{equation} \forall n \in \mathbb{N}, \mathbb{E}[\lvert \Psi_n \rvert] = \frac{1}{\sqrt{n}} \implies \zeta(s=\frac{1}{2}+i\omega) = \sum_{n=1}^\infty \frac{1}{n^{\frac{1}{2}+i\omega}} \tag{8} \end{equation}\]
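The modulus computation in (7)-(8) may be verified directly: on the critical line \(s = \frac{1}{2} + it\), \(\lvert n^{-s} \rvert = n^{-1/2}\) regardless of \(t\). A minimal check:

```python
import cmath
import math

# On the critical line s = 1/2 + i*t, each amplitude has modulus n^{-1/2},
# independent of t, because |n^{-s}| = n^{-Re(s)}.
for t in (0.0, 14.134725, 100.0):
    s = 0.5 + 1j * t
    for n in (2, 10, 1000):
        amp = cmath.exp(-s * math.log(n))  # n^{-s}
        assert abs(abs(amp) - 1 / math.sqrt(n)) < 1e-12
print("checked |Psi_n| = n^{-1/2} on the critical line")
```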

Meanwhile, we’ll note that \(\forall n \in \mathbb{N}, \frac{d \Psi_n}{d \xi}=-s \cdot \Psi_n\) so we have:

\[\begin{equation} \mathbb{E}[\frac{d \Psi}{d \xi}] = -s\cdot \zeta(s) \tag{9} \end{equation}\]

where our uncertainty concerning the outcome of a Quantum Measurement vanishes if and only if an accurate prediction is made using the theory of Quantum Probability.
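Equation (9) may also be checked numerically at a real value of \(s\), e.g. \(s = 2\), where \(-s \cdot \zeta(s) = -\frac{\pi^2}{3}\). In the sketch below the derivative \(\frac{d\Psi_n}{d\xi}\) is estimated by finite differences rather than computed symbolically:

```python
import math

def psi(s, xi):
    """The amplitude Psi = exp(-s * xi) as a function of entropy xi."""
    return math.exp(-s * xi)

def d_psi(s, xi, h=1e-6):
    """Central finite-difference estimate of dPsi/dxi."""
    return (psi(s, xi + h) - psi(s, xi - h)) / (2 * h)

s, terms = 2.0, 100_000
# Sampling the derivative at xi = ln n and summing reproduces -s * zeta(s).
lhs = sum(d_psi(s, math.log(n)) for n in range(1, terms + 1))
rhs = -s * (math.pi ** 2 / 6)  # -s * zeta(2)
print(lhs, rhs)
```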

Furthermore, if we consider where the entropy density \(\delta(\xi - \ln n)\) vanishes:

\[\begin{equation} \forall \xi \neq \ln n, \frac{d \Psi_n}{d \xi} \cdot \delta(\xi -\ln n) = 0 \tag{10} \end{equation}\]

we may determine that the probability wave \(\Psi\) is stationary:

\[\begin{equation} \mathbb{E}[\frac{d \Psi}{d \xi}]=-s \cdot \zeta(s) = 0 \implies \zeta(s)= 0 \tag{11} \end{equation}\]

at precisely those locations where the probability mass of the Dirac Delta is concentrated.

In light of (7), Quantum Measurements where the Von Neumann Entropy collapses are determined by the zeros of the Boltzmann Clock:

\[\begin{equation} \zeta(\frac{1}{2}+it)= \sum_{n=1}^\infty \frac{1}{\sqrt{n}}\cdot e^{-i\ln n \cdot t} \tag{12} \end{equation}\]
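The series in (12) does not converge on the critical line, so any numerical check must go through the analytic continuation; a standard route is the alternating Dirichlet eta series \(\eta(s) = \sum_n (-1)^{n-1} n^{-s}\), which converges for \(\text{Re}(s) > 0\) and satisfies \(\zeta(s) = \eta(s)/(1-2^{1-s})\). The sketch below (pure Python, slow but dependency-free) shows \(\lvert \zeta(\frac{1}{2}+it) \rvert\) nearly vanishing at the first zero \(t_1 \approx 14.134725\) but not at \(t = 10\):

```python
import cmath
import math

def zeta_critical(t, terms=500_000):
    """Approximate zeta(1/2 + i*t) via the alternating Dirichlet eta series,
    which converges for Re(s) > 0: zeta(s) = eta(s) / (1 - 2^(1-s))."""
    s = 0.5 + 1j * t
    eta, sign = 0.0 + 0.0j, 1.0
    for n in range(1, terms + 1):
        eta += sign * cmath.exp(-s * math.log(n))  # (-1)^(n-1) * n^{-s}
        sign = -sign
    return eta / (1 - 2 ** (1 - s))

# t ~ 14.134725 is the imaginary part of the first non-trivial zero.
z_zero = abs(zeta_critical(14.134725))
z_off = abs(zeta_critical(10.0))
print(z_zero, z_off)  # the first value is close to 0, the second is not
```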

so all the non-trivial zeroes of the Riemann Zeta function must lie on the critical line.

Discussion:

The reader might want to know how entropy measurements are made.

We suppose that all \(N \geq 1\) experiments are observed simultaneously, where for the Nth experiment a choice is made (by the Universal Wave Function) as to which slit the particle will go through i.e. a Quantum Amplitude is chosen. Hence, Quantum Amplitudes and the Universal Wave Function are part of an objective reality that is not observable. Assuming that the experimenter is free to choose between different measurement settings, any slit is equiprobable so any slit will do. However, this is not what \(\zeta\) predicts.

As \(\zeta\) is computable, whenever the zeroes of \(\zeta\) are used we are implicitly using the Axiom of Computable Choice to choose global states \(s \in \mathbb{C}, \text{Re}(s) =\frac{1}{2}\) which control the outcome of countably many experiments simultaneously. There are countably many computable functions, and countably many zeroes of the Riemann Zeta function. On the other hand, if every possible experimental outcome occurred simultaneously as posited by Everettian Quantum Mechanics then \(\zeta(s)\) would have uncountably many zeroes, as there are uncountably many ways the entropy terms may vanish. But an uncountable set of zeroes must have a limit point, and as \(\zeta\) is holomorphic away from \(s=1\), the identity theorem would then force \(\zeta\) to vanish identically, so we have a definite contradiction.

Thus, we may conclude that Everettian Quantum Mechanics is incompatible with the Church-Turing thesis and that Quantum Measurements are super-deterministic.

References:

  1. Rocke (2022, Jan. 15). Kepler Lounge: Three master keys for Probabilistic Number Theory. Retrieved from keplerlounge.com

  2. Rocke (2022, March 4). Kepler Lounge: Elements of Quantum Probability theory. Retrieved from keplerlounge.com

  3. Rocke (2022, March 4). Kepler Lounge: Super-determinism via Solomonoff Induction. Retrieved from keplerlounge.com

  4. Rocke (2022, Jan. 3). Kepler Lounge: The Law of Conservation of Information. Retrieved from keplerlounge.com

  5. Rocke (2022, Jan. 5). Kepler Lounge: Revisiting the unreasonable effectiveness of mathematics. Retrieved from keplerlounge.com

  6. von Neumann, John. “Zur Einführung der transfiniten Zahlen”. Acta Scientiarum Mathematicarum (Szeged). 1923.

Citation

For attribution, please cite this work as

Rocke (2022, March 8). Kepler Lounge: The Von Neumann Entropy and the Riemann Hypothesis. Retrieved from keplerlounge.com

BibTeX citation

@misc{rocke2022the,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: The Von Neumann Entropy and the Riemann Hypothesis},
  url = {keplerlounge.com},
  year = {2022}
}