For the vast majority of human minds, attention span has been steadily declining since the year 2000. This makes the development of complex and original ideas increasingly improbable.

Starting from the observation that the statistical behaviour of the Mertens function is reminiscent of a one-dimensional random walk on the non-zero values of the Möbius function, we demonstrate that the Riemann Hypothesis implies an entropy bound on the Mertens function.

Ariel Caticha's vision of Information Geometry as a foundation for theoretical physics.

In the following synthesis, we present Kolmogorov's theory of Algorithmic Probability which defines the fundamental limits of Machine Learning.

Nature does not hurry, yet everything is accomplished in due course. - Lao Tzu

Kepler, the astronomer who saved his mother from being burned as a witch.

In the following analysis, it is shown that the Kolmogorov Structure Function builds a robust bridge between Occam's razor and the Universal Wave Function.

A proof that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.

Cauchy's Little Theorem for Stieltjes Constants.

How does a global code emerge from local codes derived via MaxEnt inference, given that a global encoding is necessary for the emergence of synchronization in large neural networks?

In the following analysis we demonstrate that information symmetrization within UMAP is designed to satisfy the Law of Conservation of Information, a universal principle that underlies Occam's razor.

An analysis of the evolution of the Riemann Gas and a study of its Information Geometry using tools from Manifold Learning.

Are the behavioural bugs of an AI, appropriately classified using Cognitive Science, just as important as information security bugs?

In the following analysis, we derive a strong form of Levin's Coding theorem via the Physical Church-Turing thesis which forms a natural bridge between Universal Data Compression and Universal Simulation via the Universal Wave Function.

In the following analysis, we derive the Hamming Bound under the Hamming Metric via the Shannon Coding theorem. Assuming the equivalence of norms, this has a natural interpretation as a Sphere Packing Bound on the metric space of all possible codewords.
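The sphere-packing interpretation mentioned above can be sketched in one line (a standard derivation, included here only as an illustration): for a binary code $C \subseteq \{0,1\}^n$ that corrects $t$ errors, the Hamming balls of radius $t$ around distinct codewords must be disjoint, so their total volume cannot exceed that of the whole space:

```latex
\[
|C| \cdot \sum_{i=0}^{t} \binom{n}{i} \;\le\; 2^{n}
\quad\Longrightarrow\quad
|C| \;\le\; \frac{2^{n}}{\sum_{i=0}^{t} \binom{n}{i}} .
\]
```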

A reader's guide to Govind Menon's research program on manifold learning.

An overview of fundamental notions from information theory that underlie the information geometry of UMAP.

Assuming that the evolution of the Quantum State of the Universe may be simulated by the Schrödinger equation, Kolmogorov's theory of Algorithmic Probability provides us with an elegant mathematical description of what a particular physicist observes during a Quantum Measurement.

In this treatise we develop the thesis of J.L. Kelly and T.M. Cover that the financial value of side information is equal to the mutual information between a wager and the side information. From this principle, we may deduce that a market is efficient to the degree that it reflects all available information via data compression.
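The Kelly–Cover identity stated above can be checked numerically. The sketch below (with a hypothetical two-horse race and fair 2-for-1 odds, numbers chosen purely for illustration) computes the doubling rate of proportional betting with and without side information $Y$, and verifies that the gain equals the mutual information $I(X;Y)$:

```python
import math

# Hypothetical joint distribution p(x, y): a 2-horse race (X = winner)
# with a binary side-information signal Y. Numbers are illustrative only.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_x = {x: sum(v for (xx, _), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

odds = 2  # fair 2-for-1 odds on each of the two horses

def log2(z):
    return math.log(z, 2)

# Doubling rate of Kelly betting b(x) = p(x) without side information:
# W = sum_x p(x) log2(odds * p(x))
W = sum(px * log2(odds * px) for px in p_x.values())

# With side information, bet b(x|y) = p(x|y) = p(x,y) / p(y):
W_side = sum(pxy * log2(odds * pxy / p_y[y]) for (_, y), pxy in p_xy.items())

# Mutual information I(X;Y) in bits
I = sum(pxy * log2(pxy / (p_x[x] * p_y[y])) for (x, y), pxy in p_xy.items())

print(W_side - W, I)  # the increase in doubling rate equals I(X;Y)
```

The equality is exact, not approximate: with fair odds, $W = \log_2 m - H(X)$ and $W_Y = \log_2 m - H(X \mid Y)$, so the difference is $H(X) - H(X \mid Y) = I(X;Y)$.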

From an information-theoretic approach to the St Petersburg paradox, we may conjecture that money has the same units as entropy, because it is used for price discovery via maximum entropy inference. This represents a theoretically sound approach to deriving Daniel Bernoulli's logarithmic utility function.
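A worked illustration of why the logarithm resolves the paradox (a standard computation, not specific to this article's derivation): in the St Petersburg lottery the payoff is $2^n$ with probability $2^{-n}$, so the expected payoff diverges while the expected log-payoff is finite:

```latex
\[
\mathbb{E}[X] = \sum_{n=1}^{\infty} 2^{-n}\, 2^{n} = \sum_{n=1}^{\infty} 1 = \infty,
\qquad
\mathbb{E}[\log_2 X] = \sum_{n=1}^{\infty} 2^{-n}\, n = 2 .
\]
```

Under logarithmic utility the lottery therefore has a finite certainty equivalent, and the units of the second sum are bits.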

In the following analysis, we trace the development of geometric algebra, which allows a unified treatment of tensors, quaternions, differential forms, spinors and Lie algebras.

Any dynamical system that has a variational formulation admits an optimal controller within the framework of Hamilton-Jacobi-Bellman theory.

By considering the general problem of multi-agent reinforcement learning, we show that money naturally emerges as an instrument for large-scale collaboration that inherits the scalar-field property of the value function in the Bellman equation.

A brief exposition on the central role of Spinors in modern Cosmology and modern physics; in effect, a proof of the importance of mathematical physics.

In the following analysis we demonstrate that Einstein's theories of Special and General Relativity were constrained by his far-reaching insights concerning the Aether.

Reflections on the unreasonable effectiveness of mathematics and its implications for Artificial Intelligence.

A proof of Euler's formula using the Leibniz product rule.
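Presumably the argument is the standard one-line proof; as a sketch, define $f(x) = e^{-ix}(\cos x + i\sin x)$ and differentiate using the Leibniz product rule:

```latex
\[
f'(x) = -i e^{-ix}(\cos x + i\sin x) + e^{-ix}(-\sin x + i\cos x)
      = e^{-ix}\big((\sin x - i\cos x) + (-\sin x + i\cos x)\big) = 0 .
\]
```

Hence $f$ is constant; since $f(0) = 1$, it follows that $e^{ix} = \cos x + i\sin x$.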

A medieval riddle that exposes the false axiom embedded in the 'Law' of Excluded Middle.

Using the theory of Algorithmic Probability and the Universal Distribution, we demonstrate that the definition of arbitrarily large integers is unsound. It follows that the Principle of Mathematical Induction, which depends upon the 'Law' of Excluded Middle, relies upon undefined terms.

An analysis of mathematical signatures of the Simulation Hypothesis.

A high-level summary of Cosmological Natural Selection as developed by Lee Smolin and Jeff Shainline which provides a naturalistic account for the Simulation Hypothesis.

By exploring the correspondence between the Buckingham-Pi theorem and Unique Factorization Domains using Koopman Operators, we find that the Prime Numbers have an emergent Spectral Geometry.

Using the theory of Algorithmic Probability, we demonstrate that Archimedes' Constant is absolutely normal.

A proof of Bell's theorem via counterfactuals, which clarifies the nature of Quantum Randomness and its relation to Quantum Probability.

Gödel numbers may be used to define a programming language in which sets of propositions can be studied using arithmetic functions.

We may demonstrate that the Prime Number Theorem emerges as a natural consequence of a thought experiment in Quantum Mechanics.

An introduction to Lev Landau's approach to Quantum Probability theory.

Building upon the work of Billingsley, an entropy formula for the distribution of prime factors is carefully developed. This allows us to compare the entropy of the normal order of prime factors, which is constant, with extreme values for which the entropy is unbounded.

In this article we demonstrate that Cramér's model is well-approximated by a Poisson Process, and that Cramér's conjecture may be deduced from an entropy bound on prime gaps.
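The Poisson approximation claimed above is easy to check empirically. The sketch below (a minimal simulation, with window length and sample size chosen arbitrarily for illustration) draws Cramér's random model, in which each integer $n \ge 3$ is independently declared "prime" with probability $1/\log n$, and compares the mean and variance of prime counts in windows of length $\lambda \log x$, which should both be close to $\lambda$ for a Poisson process:

```python
import math
import random

random.seed(42)

# Cramér's model: each integer n >= 3 is independently "prime"
# with probability 1 / log n.
N = 2_000_000
is_prime = [False] * 3 + [random.random() < 1 / math.log(n)
                          for n in range(3, N)]

# Near x, counts in windows of length lam * log(x) should be
# approximately Poisson(lam), so the sample mean and variance
# of the counts should be close to each other.
x = N // 2
lam = 3
window = round(lam * math.log(x))
counts = [sum(is_prime[s:s + window]) for s in range(x, N - window, window)]

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # mean and variance nearly equal: the Poisson signature
```

The variance comes out slightly below the mean because each window count is a sum of Bernoulli variables rather than exactly Poisson; the gap shrinks as $x \to \infty$, since each success probability $1/\log n \to 0$.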

Information-theoretic foundations for Probabilistic Number Theory.

An information-theoretic derivation of the Prime Number Theorem, using the Law of Conservation of Information.

An information-theoretic derivation of Chebyshev's theorem (1852), an important precursor of the Prime Number Theorem.

The Algorithmic Probability of a Prime Number, defined using Levin's Coding Theorem.

More than 60 years since Eugene Wigner's highly influential essay on the unreasonable effectiveness of mathematics in the natural sciences, it may be time for a re-appraisal.

An information-theoretic formulation of Occam's razor as the Universal A Priori Probability.

Given that all physical laws are time-reversible and computable, information must be conserved as we run a simulation of the Universe forward in time.

An information-theoretic adaptation of Erdős’ proof of Euclid’s theorem, which shows that the information content of finitely many primes is insufficient to generate all the integers.
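The counting at the heart of Erdős' argument can be sketched as follows (standard, with the information-theoretic reading made explicit): write each $n \le N$ uniquely as a square times a squarefree part. If there were only $k$ primes, the squarefree part would admit at most $2^k$ values, so

```latex
\[
n = m^{2} r, \quad r \text{ squarefree}, \quad m \le \sqrt{N}
\;\Longrightarrow\;
N \le 2^{k}\sqrt{N}
\;\Longrightarrow\;
\log_2 N \le 2k .
\]
```

In information-theoretic terms: specifying $n$ requires $\log_2 N$ bits, yet the pair $(r, m)$ can be encoded in at most $k + \tfrac{1}{2}\log_2 N$ bits, so $k$ primes cannot generate all the integers once $\log_2 N > 2k$.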