A proof that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.
In the following analysis, we postulate three self-evident axioms for Open-Ended Evolution. From these axioms we demonstrate that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.
The main line of reasoning is taken from Zipf's law, unbounded complexity and open-ended evolution [1], though it is worth noting that arguments found to be erroneous have been corrected by the author.
We shall consider three reasonable constraints on a necessary and sufficient model for Open-Ended Evolution, concerning computability, algorithmic randomness and path-dependence (a.k.a. heredity):
1. Information results from the growth of genome complexity through a combination of gene duplication and interactions with the external world. This process of information growth must therefore be a path-dependent process.
2. Algorithmic Probability allows us to distinguish predictable from unpredictable sequences in a meaningful way.
3. The algorithmic definition, based on the use of a program, matches our intuition that evolution may be captured by a computational description.
We shall generally focus on dynamical systems whose description may be made in terms of finite binary strings $\sigma_t$ at each time step $t$ over evolutionary time. If $\sigma_t$ is the description of the system at time $t$, let the sequence:
$$\Sigma(t) = \{\sigma_1, \sigma_2, \ldots, \sigma_t\}$$
be the history of the system up to time $t$, in arbitrary time units.
Given the above constraints on Open-Ended Evolution we may formulate the following necessary and sufficient axioms in terms of Kolmogorov Complexity relative to a Universal Turing Machine U:
We say that the process that generates $\sigma_t$ is open-ended if:
$$\frac{K_U(\Sigma(t))}{t} \leq \frac{K_U(\Sigma(t+1))}{t+1} \tag{1}$$
for all $t \in \mathbb{N}$. Of all open-ended processes that obey (1) we are interested in those whose complexity is unbounded.
We say that the process generating $\sigma_t$ has unbounded complexity if for any $N \in \mathbb{N}$ there is a time $t$ such that:
$$\frac{K_U(\Sigma(t))}{t} > N$$
These two axioms imply that information is always being added by the generative process in the long term: knowledge of the history up to time $t$ is not enough to predict what will happen next.
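Kolmogorov Complexity is uncomputable, so any empirical check of these axioms has to rely on a computable proxy. The sketch below uses zlib-compressed length as a crude upper bound on $K_U$ and a purely illustrative random-bit process as a stand-in for the generative process; none of this is part of the original argument, it only shows how the normalised complexity $K_U(\Sigma(t))/t$ could be monitored in practice.

```python
import os
import zlib

def K_proxy(data: bytes) -> int:
    """Crude, computable upper bound on Kolmogorov complexity: zlib-compressed length."""
    return len(zlib.compress(data, 9))

def normalised_complexity(step):
    """Track the proxy for K_U(Sigma(t))/t for a process emitting sigma_t = step(t)."""
    history, ratios = b"", []
    for t in range(1, 201):
        history += step(t)              # Sigma(t) = sigma_1 sigma_2 ... sigma_t
        ratios.append(K_proxy(history) / t)
    return ratios

open_ended = normalised_complexity(lambda t: os.urandom(16))   # fresh information every step
repetitive = normalised_complexity(lambda t: b"\x00" * 16)     # no new information

# The open-ended proxy stays near the 16 bytes of fresh randomness added per step,
# while the repetitive process decays towards zero bytes per step.
print(open_ended[-1], repetitive[-1])
```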
Evolutionary processes attempt to minimize the action:
$$\mathcal{S}(\Sigma(t) \rightarrow \Sigma(t+1)) \equiv K_U(\Sigma(t) \mid \Sigma(t+1))$$
This axiom defines an Algorithmic Least-Action principle, which requires that the information carried between successive steps be as large as other constraints allow, turning the generative process into a path-dependent one. Moreover, taking the previous axioms into account, we may deduce the following fundamental inequality:
$$\frac{K(\Sigma(t))}{t} \leq K(\sigma_{t+1} \mid \sigma_t) \leq K(\sigma_{t+1})$$
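In the same hedged spirit, the conditional complexity in the action can be approximated with the standard compression trick $K(x \mid y) \approx C(y \cdot x) - C(y)$ used in the normalized compression distance. The snippet below is only a sketch: it checks that a continuation which actually contains its past carries that past at a much lower algorithmic cost than an unrelated continuation.

```python
import os
import zlib

def C(data: bytes) -> int:
    """Compressed length, a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def action_proxy(past: bytes, future: bytes) -> int:
    """Proxy for the action K_U(Sigma(t) | Sigma(t+1)) ~ C(future + past) - C(future)."""
    return C(future + past) - C(future)

sigma = [os.urandom(32) for _ in range(20)]
past = b"".join(sigma[:-1])                    # Sigma(t)
path_dependent = past + sigma[-1]              # Sigma(t+1) that extends Sigma(t)
unrelated = os.urandom(len(path_dependent))    # a continuation that forgets the past

print(action_proxy(past, path_dependent))      # small: the future already encodes the past
print(action_proxy(past, unrelated))           # large: the past must be specified from scratch
```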
Now, in addition to these axioms we need Kolmogorov's Lemma, which may be derived from these axioms together with the hypothesis that a Universal Wave Function simulates the Observable Universe.
Before demonstrating that the Riemann Zeta function is a fundamental signature of Open-Ended Evolution, we will need the key lemma:
$$\mathbb{E}[K_U(X)] = H(X) + \mathcal{O}(1)$$
which is demonstrated in the article Lesser known miracles of Levin's Universal Distribution [3].
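A hedged numerical illustration of this lemma: for an i.i.d. source, the expected compressed length per symbol should track the Shannon Entropy up to overhead. The sketch below again uses zlib as an upper-bound proxy for $K_U$, so the agreement is only rough; the exact statement and proof are in the cited article.

```python
import math
import random
import zlib

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits per symbol."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def compressed_bits_per_symbol(p: float, n: int = 400_000) -> float:
    """Upper-bound proxy for E[K_U(X)]/n: zlib-compressed size of n Bernoulli(p) bits."""
    rng = random.Random(0)
    bits = "".join("1" if rng.random() < p else "0" for _ in range(n))
    packed = int(bits, 2).to_bytes((n + 7) // 8, "big")   # pack the bit-string into bytes
    return 8 * len(zlib.compress(packed, 9)) / n

# The proxy roughly tracks H(p) from above; the gap is compressor overhead, not the lemma.
for p in (0.05, 0.2, 0.5):
    print(p, round(binary_entropy(p), 3), round(compressed_bits_per_symbol(p), 3))
```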
Given that the information-theoretic properties of the Shannon Entropy are invariant to the choice of base of the logarithm, we may observe that:
$$\forall \lambda \in \mathbb{R}_+^*, \quad \mathbb{E}[K_U(X)] \propto \lambda \cdot H(X)$$
which motivates our analysis of the typical information of an observable: $\langle \ln n \rangle$.
Given a countable set of observable Combinatorial Objects (e.g. words, proteins, Lego) generated by a Universal Grammar, we may define their un-normalised frequency counts using the integers $\mathbb{N}$.
The maximum entropy approach to characterizing the appropriate frequency distribution estimates the probabilities $p_n$ by maximizing the Shannon Entropy:
$$S = -\sum_n p_n \ln p_n$$
subject to a number of constraints that represent epistemic limits on the underlying generative process.
If we define two reasonable constraints, a Unitarity constraint on the Universal Wave Function and the Asymptotic Equipartition Property ($\chi$):
$$\sum_n p_n = 1$$
$$\langle \ln n \rangle = \sum_{n=1}^{\infty} p_n \ln n = \chi$$
we may now maximize the Shannon Entropy subject to these constraints using the method of Lagrange Multipliers, so we find:
$$\hat{S} = -z\left(\sum_{n=1}^{\infty} p_n \ln n - \chi\right) - \lambda\left(\sum_{n=1}^{\infty} p_n - 1\right) - \sum_{n=1}^{\infty} p_n \ln p_n$$
and if we apply the change of variables $\lambda = \ln Z - 1$:
$$\hat{S} = -z\left(\sum_{n=1}^{\infty} p_n \ln n - \chi\right) - (\ln Z - 1)\left(\sum_{n=1}^{\infty} p_n - 1\right) - \sum_{n=1}^{\infty} p_n \ln p_n$$
Varying with respect to $p_n$ yields the extremality condition:
$$-z\ln n - \ln Z - \ln p_n = 0$$
with explicit solution:
$$\forall z > 1, \quad p_n = \frac{n^{-z}}{\zeta(z)}$$
where $Z = \zeta(z)$ is the renormalisation factor.
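As a quick numerical sanity check (not part of the derivation), the closed-form solution can be verified with mpmath: the probabilities sum to one and satisfy the extremality condition above. The exponent $z = 2.5$ is an arbitrary illustrative choice.

```python
from mpmath import mp, zeta, nsum, inf, log, power

mp.dps = 30
z = mp.mpf("2.5")                    # any exponent z > 1; 2.5 is purely illustrative
Z = zeta(z)                          # renormalisation factor Z = zeta(z)
p = lambda n: power(n, -z) / Z       # p_n = n^{-z} / zeta(z)

# Unitarity constraint: sum_n p_n = 1.
print(nsum(p, [1, inf]))

# Extremality condition: -z*ln(n) - ln(Z) - ln(p_n) = 0 for every n.
print([-z * log(n) - log(Z) - log(p(n)) for n in (1, 2, 10, 100)])
```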
The typical information of the observable, $\chi$, is therefore given by:
$$\chi(z) = \langle \ln n \rangle = \frac{\sum_{n=1}^{\infty} n^{-z}\ln n}{\zeta(z)} = -\frac{d\zeta(z)/dz}{\zeta(z)} = -\frac{d\ln \zeta(z)}{dz}$$
At maximum entropy, by imposing the extremality condition we find:
$$\hat{S}(z) = S(z) = -\sum_{n=1}^{\infty} p_n \ln p_n = \ln\zeta(z) + z\,\chi(z)$$
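The two closed-form expressions above can likewise be checked numerically. The sketch below compares the direct sums for $\langle \ln n \rangle$ and the Shannon Entropy against $-\zeta'(z)/\zeta(z)$ and $\ln\zeta(z) + z\chi(z)$; mpmath's `zeta(z, 1, 1)` is used for the derivative $\zeta'(z)$, and $z = 2.5$ is again an arbitrary choice.

```python
from mpmath import mp, zeta, nsum, inf, log, power

mp.dps = 30
z = mp.mpf("2.5")                                # illustrative exponent, z > 1
Z = zeta(z)
p = lambda n: power(n, -z) / Z                   # p_n = n^{-z} / zeta(z)

# Typical information chi(z) = <ln n>: direct sum versus -zeta'(z)/zeta(z).
chi_direct = nsum(lambda n: p(n) * log(n), [1, inf])
chi_closed = -zeta(z, 1, 1) / Z                  # zeta(z, 1, 1) is the derivative zeta'(z)
print(chi_direct, chi_closed)

# Maximum entropy S(z): direct Shannon sum versus ln(zeta(z)) + z*chi(z).
S_direct = -nsum(lambda n: p(n) * log(p(n)), [1, inf])
S_closed = log(Z) + z * chi_closed
print(S_direct, S_closed)
```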
Finally, we may deduce the typical frequency using the typical information:
$$\exp\langle \ln n \rangle = \prod_{n=1}^{\infty} n^{p_n}$$
which is the geometric mean of the integers with exponents weighted by the probabilities $p_n$. To characterize $\zeta(z)$ in terms of its unique singularity at $z=1$, we may observe that:
$$\forall z \in \mathbb{C}\setminus\{1\}, \quad \zeta(z) = \frac{1}{z-1} + \sum_{n=0}^{\infty} \frac{(-1)^n}{n!}\,\gamma_n\,(z-1)^n$$
where the Stieltjes constants satisfy:
$$\gamma_n = \frac{(-1)^n\, n!}{2\pi} \int_0^{2\pi} e^{-nix}\, \zeta\!\left(e^{ix}+1\right)\, dx$$
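This integral representation can be checked directly against mpmath's built-in `stieltjes` function; the sketch below evaluates the contour integral numerically and takes the real part (the imaginary part vanishes up to numerical noise).

```python
from mpmath import mp, quad, exp, pi, zeta, stieltjes, factorial

mp.dps = 25

def stieltjes_via_contour(n: int):
    """gamma_n from the contour integral of zeta(e^{ix} + 1) around the pole at z = 1."""
    integrand = lambda x: exp(-1j * n * x) * zeta(exp(1j * x) + 1)
    return ((-1) ** n * factorial(n) / (2 * pi) * quad(integrand, [0, 2 * pi])).real

for n in range(4):
    print(n, stieltjes_via_contour(n), stieltjes(n))   # the two values should agree
```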
"When I meet God, I'm going to ask him two questions: why relativity? And why turbulence? I really believe he'll have an answer for the first." (Heisenberg)
Does turbulence need God? That is a difficult question. What I can say is that non-equilibrium turbulence requires the Riemann Zeta function.
[1] Corominas-Murtra, Bernat; Seoane, Luís F.; Solé, Ricard. Zipf's law, unbounded complexity and open-ended evolution. Journal of the Royal Society Interface. 2018.
[2] Visser, Matt. Zipf's law, power laws, and maximum entropy. arXiv. 2012.
[3] Rocke, Aidan (2023, April 19). Kepler Lounge: Lesser known miracles of Levin's Universal Distribution. Retrieved from keplerlounge.com
For attribution, please cite this work as
Rocke (2023, June 2). Kepler Lounge: The Emergent Complexity of the Riemann Zeta function. Retrieved from keplerlounge.com
BibTeX citation
@misc{rocke2023the,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: The Emergent Complexity of the Riemann Zeta function},
  url = {keplerlounge.com},
  year = {2023}
}