Kepler Lounge

On the Computational Complexity of Heisenberg Uncertainty

In the following analysis, we model integer factorisation as an asymmetric information game where the expected time complexity is determined via entropy coding.

The Asymptotic Equipartition Theorem

In the following analysis, we develop an intuition for the Asymptotic Equipartition Theorem, a cornerstone of information theory, before formally proving that it describes the compressibility of sequences emitted by a stochastic source.

Esquisse d'un programme pour la lecture profonde

For the great majority of human minds, attention span has been steadily declining since the year 2000. This makes the development of complex and original ideas increasingly improbable.

Random Walks and the Riemann Hypothesis

Starting from the observation that the statistical behaviour of the Mertens function is reminiscent of a one-dimensional random walk on the non-zero values of the Möbius function, we demonstrate that the Riemann Hypothesis implies an entropy bound on the Mertens function.

A physicist's approach to Information Geometry

Ariel Caticha's vision of Information Geometry as a foundation for theoretical physics.

Kolmogorov's theory of Algorithmic Probability

In the following synthesis, we present Kolmogorov's theory of Algorithmic Probability which defines the fundamental limits of Machine Learning.

The storm in the tea cup

"Nature does not hurry, yet everything is accomplished in due course." (Lao Tzu)

Peter's Denial and Kepler's Witch Trial

Kepler, the astronomer who saved his mother from being burned as a witch.

The Kolmogorov Structure Function and its role in Quantum Measurement

In the following analysis, it is shown that the Kolmogorov Structure Function builds a robust bridge between Occam's razor and the Universal Wave Function.

The Emergent Complexity of the Riemann Zeta function

A proof that the Riemann Zeta function is the statistical signature of Open-Ended Evolution.

Cauchy's Little Theorem

Cauchy's Little Theorem for Stieltjes Constants.

Efficient Coding and Signal Attenuation within UMAP

How does a global code emerge from the local codes derived via MaxEnt inference, given that a global encoding is necessary for synchronization to emerge in large neural networks?

The Law of Conservation of Information within UMAP

In the following analysis we demonstrate that information symmetrization within UMAP is designed to satisfy the Law of Conservation of Information, a universal principle that underlies Occam's razor.

The Evolution of the Riemann Gas

An analysis of the evolution of the Riemann Gas and a study of its Information Geometry using tools for Manifold Learning.

ChatGPT has common sense

Are the behavioural bugs of an AI, appropriately classified using Cognitive Science, just as important as information security bugs?

Lesser known miracles of Levin's Universal Distribution

In the following analysis, we derive a strong form of Levin's Coding theorem via the Physical Church-Turing thesis, which forms a natural bridge between Universal Data Compression and Universal Simulation via the Universal Wave Function.

Deducing the Sphere Packing Bound from the Shannon Coding theorem

In the following analysis, we derive the Hamming Bound under the Hamming Metric via the Shannon Coding theorem. Assuming the equivalence of norms, this has a natural interpretation as a Sphere Packing Bound on the metric space of all possible codewords.

A reader's guide to Manifold Learning

A reader's guide to Govind Menon's research program on manifold learning.

The information theory behind UMAP

An overview of fundamental notions from information theory that underlie the information geometry of UMAP.

Algorithmic Probability and Wave Function collapse

Assuming that the evolution of the Quantum State of the Universe may be simulated by the Schrödinger equation, Kolmogorov's theory of Algorithmic Probability provides us with an elegant mathematical description of what a particular physicist observes during a Quantum Measurement.

Gambling and Data Compression

In this treatise we develop the thesis of J.L. Kelly and T.M. Cover that the financial value of side information is equal to the mutual information between a wager and the side information. From this principle, we may deduce that a market is efficient to the degree that it reflects all available information via data compression.
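The Kelly-Cover result can be checked numerically in a toy case. Below is a minimal sketch, not taken from the treatise itself: an even-money bet on a coin with bias p, where perfect side information raises the optimal doubling rate by exactly H(p), the mutual information between the outcome and the side information.

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.75  # probability of heads in an even-money bet

# Kelly doubling rate with no side information: W = 1 - H(p) bits per bet,
# achieved by wagering the fraction f* = 2p - 1 of current wealth.
w_no_info = 1 - binary_entropy(p)

# With perfect side information the gambler bets everything on the known
# outcome, doubling wealth every round: W = 1 bit per bet.
w_perfect_info = 1.0

# The increase in doubling rate equals the mutual information I(X; Y),
# which here is H(p) because Y reveals the outcome X exactly.
value_of_side_info = w_perfect_info - w_no_info
assert abs(value_of_side_info - binary_entropy(p)) < 1e-12
```

In the general case the same identity holds with I(X; Y) in place of H(p), which is the precise sense in which side information has a financial value.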

The St Petersburg Paradox, or how to gamble on the first N bits of Chaitin's Constant

From an information-theoretic approach to the St Petersburg paradox we may conjecture that money has the same units as entropy as a result of it being used for price discovery via maximum entropy inference. This represents a theoretically-sound approach to deriving Daniel Bernoulli's logarithmic utility function.

What is an imaginary number?

In the following analysis, we trace the development of geometric algebra, which allows a unified approach to tensors, quaternions, differential forms, spinors and Lie algebras.

Bellman's Principle of Optimality and the Hamilton-Jacobi-Bellman equation

Any dynamical system that has a variational formulation admits an optimal controller within the framework of Hamilton-Jacobi-Bellman theory.

Money as a scalar field via Reinforcement Learning

By considering the general problem of multi-agent reinforcement learning, we show that money naturally emerges as an instrument for large-scale collaboration that inherits the scalar-field property of the value function in the Bellman equation.

Penrose and Weinstein on Spinors

A brief exposition on the central role of Spinors in modern Cosmology and modern physics, or a proof of the importance of mathematical physics.

Mach's Principle and Einstein's theory of the Aether

In the following analysis we demonstrate that Einstein's theories of Special and General Relativity were constrained by his far-reaching insights concerning the Aether.

Paraconsistency and Evolvability

Reflections on the unreasonable effectiveness of mathematics and its implications for Artificial Intelligence.

A sublime proof of Euler's formula

A proof of Euler's formula using the Leibniz product rule.
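The product-rule argument can be sketched in a few lines (a standard sketch, not necessarily the article's exact presentation): define f(θ) = e^{-iθ}(cos θ + i sin θ) and differentiate with the Leibniz rule.

```latex
\begin{align*}
f'(\theta) &= -i\,e^{-i\theta}(\cos\theta + i\sin\theta)
              + e^{-i\theta}(-\sin\theta + i\cos\theta) \\
           &= e^{-i\theta}\big[(\sin\theta - i\cos\theta)
              + (-\sin\theta + i\cos\theta)\big] = 0.
\end{align*}
```

Since f is constant and f(0) = 1, it follows that e^{iθ} = cos θ + i sin θ.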

A riddle on the 'Law' of Excluded Middle

A medieval riddle that exposes the false axiom embedded in the 'Law' of Excluded Middle.

The limits of mathematical induction

Using the theory of Algorithmic Probability and the Universal Distribution, we demonstrate that the definition of arbitrarily large integers is unsound. It follows that the Principle of Mathematical Induction, which depends upon the 'Law' of Excluded Middle, relies upon undefined terms.

Occam's razor within Tegmark's Mathematical Universe

An analysis of mathematical signatures of the Simulation Hypothesis.

The secret life of the Cosmos

A high-level summary of Cosmological Natural Selection as developed by Lee Smolin and Jeff Shainline which provides a naturalistic account for the Simulation Hypothesis.

The Spectral Geometry of the Prime Numbers

By exploring the correspondence between the Buckingham-Pi theorem and Unique Factorization Domains using Koopman Operators, we find that the Prime Numbers have an emergent Spectral Geometry.

Archimedes' Constant is absolutely normal

Using the theory of Algorithmic Probability, we demonstrate that Archimedes' Constant is absolutely normal.

Bell's theorem via counterfactuals

A proof of Bell's theorem via counterfactuals, which clarifies the nature of Quantum Randomness and its relation to Quantum Probability.

Gödel numbers and arithmetic functions

Gödel numbers may be used to define a programming language, where sets of propositions may be studied using arithmetic functions.

Quantum Probability and the Prime Number Theorem

We may demonstrate that the Prime Number Theorem emerges as a natural consequence of a thought experiment in Quantum Mechanics.

Elements of Quantum Probability theory

An introduction to Lev Landau's approach to Quantum Probability theory.

The differential entropy of the Erdős-Kac distribution

Building upon the work of Billingsley, an entropy formula for the distribution of prime factors is carefully developed. This allows us to compare the entropy of the normal order of prime factors, which is constant, with extreme values, where the entropy is unbounded.

Cramér's random model as a Poisson Process

In this article we demonstrate that Cramér's model is well-approximated by a Poisson Process, and that Cramér's conjecture may be deduced from an entropy bound on prime gaps.

Three master keys for Probabilistic Number Theory

Information-theoretic foundations for Probabilistic Number Theory.

An information-theoretic derivation of the Prime Number Theorem

An information-theoretic derivation of the Prime Number Theorem, using the Law of Conservation of Information.

Chebyshev's theorem via Occam's razor

An information-theoretic derivation of Chebyshev's theorem (1852), an important precursor of the Prime Number Theorem.

The Algorithmic Probability of a Prime Number

The Algorithmic Probability of a Prime Number, defined using Levin's Coding Theorem.

Revisiting the unreasonable effectiveness of mathematics

More than 60 years since Eugene Wigner's highly influential essay on the unreasonable effectiveness of mathematics in the natural sciences, it may be time for a re-appraisal.

Occam's razor

An information-theoretic formulation of Occam's razor as the Universal A Priori Probability.

The Law of Conservation of Information

Given that all physical laws are time-reversible and computable, information must be conserved as we run a simulation of the Universe forward in time.

Erdős' proof of Euclid's theorem

An information-theoretic adaptation of Erdős’ proof of Euclid’s theorem, which shows that the information content of finitely many primes is insufficient to generate all the integers.
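The counting step of Erdős' argument may be sketched as follows (a standard outline): if p_1, …, p_k were the only primes, every n ≤ N would factor as a square times a squarefree product of a subset of those k primes, so

```latex
N \;\le\; 2^{k}\sqrt{N}
\quad\Longrightarrow\quad
\log_2 N \;\le\; k + \tfrac{1}{2}\log_2 N
\quad\Longrightarrow\quad
\log_2 N \;\le\; 2k,
```

which fails once N > 4^k: k primes carry at most 2k bits, too little information to describe every integer up to N.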
