A proof that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.
In the following analysis, we postulate three self-evident axioms for Open-Ended Evolution. From these axioms we demonstrate that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.
The main line of reasoning is taken from Zipf’s law, unbounded complexity and open-ended evolution [1], though it is worth noting that the author has corrected a few arguments that were found to be erroneous.
We shall consider three reasonable constraints on a necessary and sufficient model for Open-Ended Evolution, concerning computability, algorithmic randomness and path-dependence (i.e. heredity):
Information results from the growth of genome complexity through a combination of gene duplication and interactions with the external world. This process of information growth must therefore be a path-dependent process.
Algorithmic Probability allows us to distinguish predictable from unpredictable sequences in a meaningful way.
The algorithmic definition based on the use of a program matches our intuition that evolution may be captured by a computational description.
We shall generally focus on dynamical systems whose state at each time step \(t\) over evolutionary time may be described by a finite binary string \(\sigma_t\). If \(\sigma_t\) is the description of the system at time \(t\), let the sequence:
\[\begin{equation} \Sigma(t) = \{\sigma_1,\sigma_2,...,\sigma_t \} \end{equation}\]
be the history of the system until time \(t\) in arbitrary time units.
Given the above constraints on Open-Ended Evolution we may formulate the following necessary and sufficient axioms in terms of Kolmogorov Complexity relative to a Universal Turing Machine \(U\):
We say that the process that generates \(\sigma_t\) is open-ended if:
\[\begin{equation} \frac{K_U(\Sigma(t))}{t} \leq \frac{K_U(\Sigma(t+1))}{t+1} \tag{1} \end{equation}\]
for all \(t \in \mathbb{N}\). Of all open-ended processes that obey (1) we are interested in those whose complexity is unbounded.
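Since \(K_U\) is uncomputable, condition (1) cannot be checked directly; in practice one can only monitor a computable upper bound, such as the compressed length of the history per time step. The following minimal sketch illustrates this idea, assuming zlib as the compressor and a placeholder random-bit dynamics that is purely illustrative and not part of the construction above.

```python
import random
import zlib

def compression_rate(history):
    """Computable upper bound on K_U(Sigma(t))/t: zlib-compressed bits per step."""
    blob = "".join(history).encode()
    return 8 * len(zlib.compress(blob, 9)) / len(history)

history, rates = [], []
for t in range(1, 201):
    sigma_t = "".join(random.choice("01") for _ in range(64))  # placeholder dynamics
    history.append(sigma_t)
    rates.append(compression_rate(history))

# For a process that is open-ended in the sense of (1), this rate should be
# non-decreasing in t, up to compressor overhead and finite-size noise.
print(rates[:3], rates[-3:])
```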
We say that the process generating \(\sigma_t\) has an unbounded complexity if for any \(N \in \mathbb{N}\) there is a time \(t\) such that:
\[\begin{equation} \frac{K_U(\Sigma(t))}{t} > N \tag{2} \end{equation}\]
These two axioms imply that, over the long term, information is always being added by the generative process: knowledge of the history up to time \(t\) is not enough to predict what will happen next.
Evolutionary processes attempt to minimize the action:
\[\begin{equation} S(\Sigma(t) \rightarrow \Sigma(t+1)) \equiv K_U(\Sigma(t) | \Sigma(t+1)) \tag{3} \end{equation}\]
This axiom defines an Algorithmic Least-Action principle which requires that the information carried between successive steps is maximized as far as the other constraints allow, turning the generative process into a path-dependent one. Moreover, combining this with the previous axioms we may deduce the following fundamental inequality:
\[\begin{equation} \frac{K_U(\Sigma(t))}{t} \leq K_U(\sigma_{t+1}|\sigma_{t}) \leq K_U(\sigma_{t+1}) \tag{*} \end{equation}\]
In addition to these axioms, we shall need Kolmogorov’s Lemma, which may be derived from them together with the hypothesis that a Universal Wave Function simulates the Observable Universe.
Before demonstrating that the Riemann Zeta function is a fundamental signature of Open-Ended Evolution, we will need the key lemma:
\[\begin{equation} \mathbb{E}[K_U(X)] = H(X) + \mathcal{O}(1) \tag{4} \end{equation}\]
which is demonstrated in the article: Lesser known miracles of Levin’s Universal Distribution.
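As an informal numerical reading of (4), one can compare the average compressed length of i.i.d. Bernoulli strings, a computable upper bound on \(K_U\), with the Shannon entropy \(n \cdot H(p)\). The string length, bias and number of trials below are arbitrary illustrative choices, not values taken from the cited article.

```python
import math
import random
import zlib

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) source, in bits per symbol."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p, trials = 4000, 0.1, 50
avg_bits = sum(
    8 * len(zlib.compress(bytes(int(random.random() < p) for _ in range(n))))
    for _ in range(trials)
) / trials

# The compressed length upper-bounds K_U up to an additive constant and, on
# average, tracks the Shannon entropy n * H(p) as the lemma (4) suggests.
print("average compressed bits:", avg_bits)
print("n * H(p)               :", n * binary_entropy(p))
```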
Given that the information-theoretic properties of the Shannon Entropy are invariant to the choice of base of the logarithm we may observe that:
\[\begin{equation} \forall \lambda \in \mathbb{R}^{*}_{+}, \quad \mathbb{E}[K_U(X)] \propto \lambda \cdot H(X) \tag{5} \end{equation}\]
which motivates our analysis of the typical information of an observable: \(\langle \ln n \rangle\).
Given a countable set of observable Combinatorial Objects (e.g. words, proteins, Lego bricks) generated by a Universal Grammar, we may define their un-normalised frequency counts using the positive integers \(\mathbb{N}\).
The maximum entropy approach to characterizing the appropriate frequency distribution estimates the probabilities \(p_n\) by maximizing the Shannon Entropy:
\[\begin{equation} S = -\sum_n p_n \ln p_n \tag{6} \end{equation}\]
subject to a number of constraints that represent epistemic limits on the underlying generative process.
If we impose two reasonable constraints, a Unitarity constraint on the Universal Wave Function and the Asymptotic Equipartition Property (\(\mathcal{\chi}\)):
\[\begin{equation} \sum_n p_n = 1 \tag{7} \end{equation}\]
\[\begin{equation} \langle \ln n \rangle = \sum_{n=1}^\infty p_n \ln n = \mathcal{\chi} \tag{8} \end{equation}\]
we may now maximize the Shannon Entropy subject to these constraints using the method of Lagrange Multipliers, which gives the Lagrangian:
\[\begin{equation} \hat{S} = -z\big(\sum_{n=1}^\infty p_n \ln n - \mathcal{\chi}\big) - \lambda \big(\sum_{n=1}^\infty p_n - 1\big) - \sum_{n=1}^\infty p_n \ln p_n \tag{9} \end{equation}\]
and if we apply the change of variables \(\lambda = \ln Z - 1\):
\[\begin{equation} \hat{S} = -z\big(\sum_{n=1}^\infty p_n \ln n - \mathcal{\chi}\big) - (\ln Z - 1) \big(\sum_{n=1}^\infty p_n - 1\big) - \sum_{n=1}^\infty p_n \ln p_n \tag{10} \end{equation}\]
Varying with respect to \(p_n\) yields the extremality condition:
\[\begin{equation} -z \ln n - \ln Z - \ln p_n = 0 \tag{11} \end{equation}\]
with explicit solution:
\[\begin{equation} \forall z > 1, p_n = \frac{n^{-z}}{\zeta(z)} \tag{12} \end{equation}\]
where \(Z = \zeta(z)\) is the normalisation factor: exponentiating (11) gives \(p_n = n^{-z}/Z\), and the constraint (7) then fixes \(Z = \sum_{n=1}^\infty n^{-z} = \zeta(z)\), which converges precisely when \(z > 1\).
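This derivation can be checked numerically: a minimal sketch, assuming a truncated support \(\{1,\dots,N\}\) with \(N = 50\) and an illustrative exponent \(z = 1.5\) (neither value comes from the text), maximizes the entropy (6) under the constraints (7) and (8) with scipy and compares the optimum to the truncated zeta weights \(n^{-z}\).

```python
import numpy as np
from scipy.optimize import minimize

N, z = 50, 1.5                       # illustrative truncation and exponent
n = np.arange(1, N + 1)

# Truncated zeta distribution and the value of <ln n> it induces (constraint (8)).
p_zeta = n ** (-z) / np.sum(n ** (-z))
chi = float(np.sum(p_zeta * np.log(n)))

def negative_entropy(p):
    p = np.clip(p, 1e-12, None)      # guard against log(0)
    return float(np.sum(p * np.log(p)))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},              # (7) normalisation
    {"type": "eq", "fun": lambda p: np.sum(p * np.log(n)) - chi},  # (8) fixed <ln n>
]
result = minimize(negative_entropy, x0=np.full(N, 1.0 / N),
                  bounds=[(0.0, 1.0)] * N, constraints=constraints,
                  method="SLSQP")

# The maximum entropy solution should coincide with the truncated p_n = n^{-z} / Z.
print("max deviation:", np.max(np.abs(result.x - p_zeta)))
```

For the untruncated case, scipy.stats.zipf implements the distribution (12) directly for any exponent \(z > 1\).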
The typical information of the observable \(\mathcal{\chi}\) is therefore given by:
\[\begin{equation} \mathcal{\chi}(z) = \langle \ln n \rangle = \frac{\sum_{n=1}^\infty n^{-z} \ln n}{\zeta(z)} = \frac{-d \zeta(z)/dz}{\zeta(z)} = \frac{-d \ln \zeta(z)}{dz} \tag{13} \end{equation}\]
At maximum entropy, by imposing the extremality condition we find:
\[\begin{equation} \hat{S}(z) = S(z) = -\sum_{n=1}^\infty p_n \ln p_n = \ln \zeta(z) + z \mathcal{\chi}(z) \tag{14} \end{equation}\]
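Equations (13) and (14) admit a direct numerical check; the sketch below, assuming the illustrative value \(z = 2\), compares the derivative-based expressions with direct sums over \(p_n = n^{-z}/\zeta(z)\) using mpmath.

```python
import mpmath as mp

z = mp.mpf(2)                                    # illustrative value, z > 1
p = lambda k: k ** (-z) / mp.zeta(z)             # equation (12)

# Equation (13): <ln n> = -zeta'(z) / zeta(z)
chi = -mp.zeta(z, derivative=1) / mp.zeta(z)
chi_direct = mp.nsum(lambda k: p(k) * mp.log(k), [1, mp.inf])

# Equation (14): S(z) = ln zeta(z) + z * chi(z)
S = mp.log(mp.zeta(z)) + z * chi
S_direct = mp.nsum(lambda k: -p(k) * mp.log(p(k)), [1, mp.inf])

print(chi, chi_direct)   # the two values should agree
print(S, S_direct)
```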
Finally, we may deduce the typical frequency using the typical information:
\[\begin{equation} \exp \langle \ln n \rangle = \prod_{n=1}^\infty n^{p_n} \tag{15} \end{equation}\]
which is the geometric mean of the integers with exponents weighted by the probabilities \(p_n\). To characterize \(\zeta(z)\) in terms of its unique singularity at \(z=1\), we may observe that:
\[\begin{equation} \forall z \in \mathbb{C} \setminus \{1\}, \quad \zeta(z) = \frac{1}{z-1} + \sum_{n=0}^\infty \frac{(-1)^n}{n!}\gamma_n (z-1)^n \tag{16} \end{equation}\]
where the Stieltjes constants satisfy:
\[\begin{equation} \gamma_n = \frac{(-1)^n n!}{2 \pi} \int_{0}^{2 \pi} e^{-nix} \zeta(e^{ix}+1) dx \tag{17} \end{equation}\]
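The integral representation (17) can likewise be verified numerically; the sketch below evaluates the contour integral for the first few \(n\) and compares the results with mpmath’s built-in Stieltjes constants.

```python
import mpmath as mp

def stieltjes_via_integral(n):
    """Evaluate equation (17): the n-th Stieltjes constant as a contour integral."""
    integrand = lambda x: mp.exp(-1j * n * x) * mp.zeta(mp.exp(1j * x) + 1)
    value = (-1) ** n * mp.factorial(n) / (2 * mp.pi) * mp.quad(integrand, [0, 2 * mp.pi])
    return value.real   # the imaginary part vanishes up to rounding error

for n in range(3):
    print(n, stieltjes_via_integral(n), mp.stieltjes(n))   # columns should agree
```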
“When I meet God, I’m going to ask him two questions: why relativity? And why turbulence? I really believe he’ll have an answer for the first.” (attributed to Heisenberg)
Does turbulence need God? That is a difficult question. What I can say is that non-equilibrium turbulence requires the Riemann Zeta function.
Corominas-Murtra, Bernat, Seoane, Luís F. and Solé, Ricard. Zipf’s Law, unbounded complexity and open-ended evolution. Journal of the Royal Society Interface. 2018.
Visser, Matt. Zipf’s law, power laws, and maximum entropy. arXiv. 2012.
Rocke (2023, April 19). Kepler Lounge: Lesser known miracles of Levin’s Universal Distribution. Retrieved from keplerlounge.com
For attribution, please cite this work as
Rocke (2023, June 2). Kepler Lounge: The Emergent Complexity of the Riemann Zeta function. Retrieved from keplerlounge.com
BibTeX citation
@misc{rocke2023the,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: The Emergent Complexity of the Riemann Zeta function},
  url = {keplerlounge.com},
  year = {2023}
}