Random Walks and the Riemann Hypothesis

Starting from the observation that the statistical behaviour of the Mertens function is reminiscent of a one-dimensional random walk over the non-zero values of the Möbius function, we demonstrate that the Riemann Hypothesis implies an entropy bound on the Mertens function.

Aidan Rocke https://github.com/AidanRocke
03-23-2022

The statistical behaviour of a random walk:

Let’s suppose we define a random walk in terms of the sequence of independent random variables \(X_i\) where:

\[\begin{equation} P(X_i = 1) = P(X_i = -1) = \frac{1}{2} \tag{1} \end{equation}\]

so we may define the sum:

\[\begin{equation} S_n = \sum_{i=1}^n X_i \tag{2} \end{equation}\]

Due to symmetry, we have:

\[\begin{equation} \mathbb{E}[S_n] = \sum_{i=1}^n \mathbb{E}[X_i] = 0 \tag{3} \end{equation}\]

and if we measure the standard deviation of \(S_n\), we find:

\[\begin{equation} \sigma_{S_n} = \sqrt{\text{Var}[S_n]} = \sqrt{\mathbb{E}[S_n^2]-(\mathbb{E}[S_n])^2} = \sqrt{n} \tag{4} \end{equation}\]

Upon closer inspection, we may deduce that \(\lvert S_n \rvert = \mathcal{O}(\sqrt{n})\) almost surely, a statistical law that shall illuminate our analysis of the Riemann Hypothesis.
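As a quick numerical sanity check, not part of the argument itself, the sketch below estimates \(\mathbb{E}[S_n]\) and \(\sigma_{S_n}\) by Monte Carlo simulation; the function name `random_walk_stats` and the parameter values are our own illustrative choices:

```python
import math
import random

def random_walk_stats(n_steps, n_trials, seed=0):
    """Estimate the mean and standard deviation of S_n for a
    simple +/-1 random walk by Monte Carlo simulation."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_trials):
        finals.append(sum(rng.choice((-1, 1)) for _ in range(n_steps)))
    mean = sum(finals) / n_trials
    var = sum((s - mean) ** 2 for s in finals) / n_trials
    return mean, math.sqrt(var)

mean, std = random_walk_stats(n_steps=10_000, n_trials=400)
print(mean)  # close to E[S_n] = 0
print(std)   # close to sqrt(n) = 100
```

With \(n = 10{,}000\) steps, the empirical standard deviation should land close to \(\sqrt{n} = 100\).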

The expected distance travelled by a random walk:

We may demonstrate that \(\mathbb{E}[\lvert S_n \rvert] \sim \sqrt{\frac{2 n}{\pi}}\) using an elegant combinatorial argument.

Let’s suppose we have two buckets of paint; one red and one blue. Our task is to fill \(N\) cups with either red paint, blue paint, or a mixture of red and blue so that \(k\) cups will contain paint of one color while \(N-k\) will contain a mixture. If we define two indexing sets \(I\) and \(G\) such that \(\lvert I \rvert = k\) and \(\lvert G \rvert = N-k\), we’ll note that:

\[\begin{equation} \lvert \sum_{i \in I} X_i \rvert = k \tag{5} \end{equation}\]

\[\begin{equation} \lvert \sum_{i \in G} X_i \rvert \approx 0 \tag{6} \end{equation}\]

as the mixture corresponds to a cancellation of \(+1\) and \(-1\) steps in the random walk. In fact, we may infer that \(\lvert \sum_{i \in G} X_i \rvert \leq 1\), and by symmetry it suffices to calculate the middle term of the binomial expansion, which dominates all the other terms:

\[\begin{equation} P(S_n \approx 0) \sim \frac{1}{2^n} {n \choose \frac{n}{2}} \sim \frac{1}{2^n} \frac{\sqrt{2 \pi n}\big(\frac{n}{e}\big)^n}{\pi n \big(\frac{n}{2e}\big)^n} \sim \sqrt{\frac{2}{\pi n}} \tag{7} \end{equation}\]

and thus we may deduce the expectation:

\[\begin{equation} \mathbb{E}[\lvert S_n \rvert] \sim \frac{n}{2} \cdot P(S_n = 0) + \frac{n}{2} \cdot P(S_n = 0) \sim \sqrt{\frac{2 n}{\pi}} \tag{8} \end{equation}\]

where we implicitly used the fact that \(k=\frac{n}{2}\) may occur either when half the cups are pure red or pure blue.
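The asymptotic \(\mathbb{E}[\lvert S_n \rvert] \sim \sqrt{2n/\pi}\) derived above can likewise be checked by simulation; the sketch below is purely illustrative, and `mean_abs_distance` is our own naming:

```python
import math
import random

def mean_abs_distance(n_steps, n_trials, seed=0):
    """Monte Carlo estimate of E[|S_n|] for a simple +/-1 random walk."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        total += abs(sum(rng.choice((-1, 1)) for _ in range(n_steps)))
    return total / n_trials

n = 10_000
empirical = mean_abs_distance(n, n_trials=400)
predicted = math.sqrt(2 * n / math.pi)  # ~ 79.8 for n = 10,000
print(empirical, predicted)
```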

The statistics of the Mertens function:

We may define the Möbius function \(\mu\) as follows:

\[\begin{equation} \mu: \mathbb{N} \rightarrow \{-1,0,+1\} \tag{9} \end{equation}\]

where \(\mu(n) = (-1)^k\) if the integer \(n \in \mathbb{N}\) is square-free with \(k\) distinct prime factors, and \(\mu(n)=0\) if \(n\) is not square-free.

Given our definition of the Möbius function, we may define the Mertens function:

\[\begin{equation} \Phi(n) = \sum_{k=1}^n \mu(k) \tag{10} \end{equation}\]

whose statistical behaviour is curiously reminiscent of the random walk.

As the proportion of integers that are square-free is given by Euler’s product formula for the Riemann Zeta function:

\[\begin{equation} \frac{1}{\zeta(2)} = \prod_{p \in \mathbb{P}} \big(1-\frac{1}{p^2}\big) \tag{11} \end{equation}\]

and since Euler proved that:

\[\begin{equation} \frac{1}{\zeta(2)} = \frac{6}{\pi^2} \approx 61\% \tag{12} \end{equation}\]

we may infer that an integer \(x\) sampled uniformly from \([1,N]\) satisfies \(\mu(x)=0\) with probability tending to \(1-\frac{1}{\zeta(2)} \approx 39\%\) as \(N \to \infty\).
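The density of square-free integers, and hence the fraction of non-zero values of \(\mu\), can be verified empirically. The sketch below sieves \(\mu(n)\) up to \(10^5\); `mobius_sieve` is a small helper of our own, not a standard library function:

```python
import math

def mobius_sieve(N):
    """Return a list mu with mu[n] = Möbius function of n for 0 <= n <= N,
    computed with an Eratosthenes-style sieve."""
    mu = [1] * (N + 1)
    mu[0] = 0
    composite = [False] * (N + 1)
    for p in range(2, N + 1):
        if not composite[p]:                      # p is prime
            for m in range(p, N + 1, p):
                composite[m] = True
                mu[m] *= -1                       # one factor of -1 per distinct prime
            for m in range(p * p, N + 1, p * p):
                mu[m] = 0                         # divisible by p^2: not square-free
    return mu

N = 100_000
mu = mobius_sieve(N)
square_free_fraction = sum(1 for n in range(1, N + 1) if mu[n] != 0) / N
print(square_free_fraction)  # ~ 0.6079
print(6 / math.pi ** 2)      # ~ 0.6079
```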

Furthermore, assuming that the Mertens function’s statistical behavior is indistinguishable from that of a random walk:

\[\begin{equation} \mathbb{E}[\Phi(\zeta(2) \cdot n)] = 0 \tag{13} \end{equation}\]

\[\begin{equation} \sigma_{\Phi(\zeta(2) \cdot n)} = \sqrt{\mathbb{E}[\Phi(\zeta(2) \cdot n)^2]} = \sqrt{n} + \mathcal{O}(1) \tag{14} \end{equation}\]

where the validity of formulas (13) and (14) may be ascertained from the information-theoretic derivation of the Erdős-Kac distribution [2].

Thus, we may infer that:

\[\begin{equation} \lvert \Phi(n) \rvert = \mathcal{O}(\sqrt{n}) \tag{15} \end{equation}\]

almost surely.
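The claimed \(\mathcal{O}(\sqrt{n})\) growth can be inspected numerically. The sketch below computes \(\Phi(n)\) up to \(10^5\) and compares it with \(\sqrt{n}\); `mobius_sieve` is a small helper of our own:

```python
import itertools
import math

def mobius_sieve(N):
    """mu[n] = Möbius function of n for 0 <= n <= N (Eratosthenes-style sieve)."""
    mu = [1] * (N + 1)
    mu[0] = 0
    composite = [False] * (N + 1)
    for p in range(2, N + 1):
        if not composite[p]:
            for m in range(p, N + 1, p):
                composite[m] = True
                mu[m] *= -1
            for m in range(p * p, N + 1, p * p):
                mu[m] = 0
    return mu

N = 100_000
# Running sums of mu give the Mertens function: mertens[n] = Phi(n).
mertens = list(itertools.accumulate(mobius_sieve(N)))
for n in (10 ** 3, 10 ** 4, 10 ** 5):
    print(n, mertens[n], round(math.sqrt(n), 1))
```

In this range \(\lvert \Phi(n) \rvert\) stays well below \(\sqrt{n}\), e.g. \(\Phi(10^3) = 2\).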

An entropy bound on the Mertens function:

From an information-theoretic perspective, (15) implies that:

\[\begin{equation} H(\lvert \Phi(n) \rvert) = \mathbb{E}[\log_2 \lvert \Phi(n) \rvert] \sim \frac{\log_2 n}{2} \tag{16} \end{equation}\]

which means that \(\lvert \Phi(n) \rvert\) may be encoded using \(\sim \frac{\log_2 n}{2}\) coin tosses almost surely as \(n \to \infty\).

Now, given that the expected Kolmogorov complexity of a random variable \(X\), relative to a universal Turing machine \(U\), satisfies:

\[\begin{equation} \mathbb{E}[K_U(X)] = H(X) + \mathcal{O}(1) \tag{17} \end{equation}\]

we may deduce that the expected description length of \(\lvert \Phi(n) \rvert\) is on the order of: \[\begin{equation} \mathbb{E}[K_U(\lvert \Phi(n) \rvert)] \sim \frac{\log_2 n}{2} \tag{18} \end{equation}\]

which implies:

\[\begin{equation} \forall \epsilon > 0, \lim_{n \to \infty} P\big(\frac{\log_2 \lvert \Phi(n) \rvert}{\log_2 n} > \frac{1}{2} + \epsilon \big) = 0 \tag{19} \end{equation}\]

and if (19) is true, it follows that:

\[\begin{equation} \forall \epsilon > 0, \Phi(n) = \mathcal{O}(n^{\frac{1}{2} + \epsilon}) \tag{20} \end{equation}\]

almost surely, which is approximately equivalent to the Riemann Hypothesis.

Random Walks and the Riemann Hypothesis:

A direct connection between the growth-rate of the Mertens function and the Riemann Hypothesis may be established as follows.

Using the Euler product formula for the Riemann Zeta function, we have:

\[\begin{equation} \frac{1}{\zeta(s)} = \prod_{p} \big(1-p^{-s}\big) \tag{21} \end{equation}\]

and due to the Unique Prime Factorisation theorem,

\[\begin{equation} \frac{1}{\zeta(s)} = \sum_{n=1}^{\infty} \frac{\mu(n)}{n^s} \tag{22} \end{equation}\]

which converges for \(\text{Re}(s) > 1\).
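As a sanity check on (22), the partial sums of \(\sum_n \mu(n)/n^s\) at \(s = 2\) should approach \(1/\zeta(2) = 6/\pi^2\). A minimal sketch, with `mobius_sieve` a small helper of our own:

```python
import math

def mobius_sieve(N):
    """mu[n] = Möbius function of n for 0 <= n <= N (Eratosthenes-style sieve)."""
    mu = [1] * (N + 1)
    mu[0] = 0
    composite = [False] * (N + 1)
    for p in range(2, N + 1):
        if not composite[p]:
            for m in range(p, N + 1, p):
                composite[m] = True
                mu[m] *= -1
            for m in range(p * p, N + 1, p * p):
                mu[m] = 0
    return mu

N = 100_000
mu = mobius_sieve(N)
partial = sum(mu[n] / n ** 2 for n in range(1, N + 1))
print(partial)           # ~ 0.6079
print(6 / math.pi ** 2)  # 1/zeta(2) ~ 0.6079
```

The tail of the series is bounded by \(\sum_{n > N} n^{-2} \approx 1/N\), so the partial sum agrees with \(6/\pi^2\) to within about \(10^{-5}\) here.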

Expressing (22) as a Stieltjes integral:

\[\begin{equation} \frac{1}{\zeta(s)} = \int_{0}^{\infty} x^{-s} d \Phi(x) \tag{23} \end{equation}\]

and integrating by parts, (23) simplifies to:

\[\begin{equation} \frac{1}{\zeta(s)} = x^{-s} \Phi(x) \Big|_{0}^{\infty} + s \int_{0}^{\infty} \Phi(x) x^{-s-1} dx = s \int_{0}^{\infty} \Phi(x) x^{-s-1} dx \tag{24} \end{equation}\]

so the reciprocal of the Riemann Zeta function may be expressed as a Mellin transform:

\[\begin{equation} \frac{1}{s \zeta(s)} = \big\{M \Phi\big\}(-s) = \int_{0}^{\infty} x^{-s-1}\Phi(x) dx \tag{25} \end{equation}\]

and using the inverse Mellin transform, we have:

\[\begin{equation} \Phi(x) = \frac{1}{2\pi i} \int_{\sigma - i \infty}^{\sigma + i \infty} \frac{x^s}{s \zeta(s)} ds \tag{26} \end{equation}\]

which is well-defined for \(\frac{1}{2} < \sigma < 2\) assuming that the Riemann Hypothesis is true.

Finally, from (26) we may conclude that:

\[\begin{equation} \forall \epsilon > 0, \Phi(n) = \mathcal{O}(n^{\frac{1}{2}+\epsilon}) \tag{27} \end{equation}\]

which would imply that propositions (18) and (19) are true. Hence, the Riemann Hypothesis is consistent with, but not reducible to, a proposition in classical information theory.

References:

  1. Leonhard Euler. Remarques sur un beau rapport entre les séries des puissances tant directes que réciproques. Mémoires de l’académie des sciences de Berlin. 1749.

  2. Rocke (2022, Jan. 11). Kepler Lounge: An information-theoretic proof of the Erdős-Kac theorem. Retrieved from keplerlounge.com

  3. Rocke (2022, March 8). Kepler Lounge: The Von Neumann Entropy and the Riemann Hypothesis. Retrieved from keplerlounge.com

Citation

For attribution, please cite this work as

Rocke (2022, March 23). Kepler Lounge: Random Walks and the Riemann Hypothesis. Retrieved from keplerlounge.com

BibTeX citation

@misc{rocke2022random,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: Random Walks and the Riemann Hypothesis},
  url = {keplerlounge.com},
  year = {2022}
}