All models are approximate and some approximations are good enough to be useful.

1. The Planck-Church-Turing thesis

  • Year: 2021
  • Topics: physics, computation

    After reviewing proof-program equivalence in the natural sciences, I demonstrate that Planck presciently anticipated the Physical Church-Turing thesis. As Planck defined the physical limits of computer engineering (and engineering in general) and was the first scientist to seriously consider that nature may be quantised, the Physical Church-Turing thesis is more appropriately named the Planck-Church-Turing thesis.

2. An information-theoretic upper bound on prime gaps

  • Year: 2021
  • Topics: information theory, probabilistic number theory

    Within the setting of rare event modelling, the method of level sets allows us to define an equivalence relation over rare events with distinct rates of entropy production. As a measure of the efficacy of this method it is applied to Cramér’s conjecture, an open problem in probabilistic number theory.

    Furthermore, this analysis places strong epistemic limits on the application of machine learning to analyse the distribution of primes.
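
    For reference, Cramér’s conjecture, the open problem mentioned above, asserts that the gaps between consecutive primes satisfy

    \[ p_{n+1} - p_n = O\!\left( (\log p_n)^2 \right), \]

    where \(p_n\) denotes the \(n\)-th prime; the heuristic behind it models the primes as a random set in which an integer near \(x\) is prime with probability roughly \(1/\log x\).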

3. An ergodic approach to Occam’s razor

  • Year: 2021
  • Topics: machine learning, model selection, algorithmic information theory

    An algorithmic Occam’s razor may be derived using the Expected Kolmogorov Complexity of a discrete random variable. It is demonstrated that, contrary to mainstream Algorithmic Information Theory, a direct appeal to Solomonoff’s Universal Distribution is generally unsound.

    Instead, the derivation is based upon a combination of Bayesian and ergodic perspectives which is both necessary and sufficient in the context of the scientific method as it is applied in the natural sciences.
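
    For background, the quantity underlying this derivation is the Expected Kolmogorov Complexity \(\sum_x P(x)\,K(x)\) of a discrete random variable \(X\) with computable distribution \(P\). A standard bound, stated here only as context and not as the article’s result, ties it to the Shannon entropy:

    \[ H(X) \;\le\; \sum_x P(x)\,K(x) \;\le\; H(X) + K(P) + O(1). \]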

4. Turing’s fixed point theorem

  • Year: 2021
  • Topics: Epistemology, Turing Machines, Decidability, Chaos theory

    From a dynamical systems perspective, Turing’s solution to the halting problem has two interesting features. First, it clearly shows that there is an intrinsic directionality in the information-processing behaviour of Turing Machines. Second, its expression as a fixed point theorem presciently anticipates the development of Chaos theory for deterministic dynamical systems.

    By formulating Turing’s theorem in this manner, we also address the misplaced criticism from the computational biology community that biological systems are dynamical systems and not computers.
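
    For reference, a minimal Python sketch of the classical diagonal construction behind Turing’s theorem is given below; the halting oracle is hypothetical, and the resulting contradiction is what rules out any total computable halting test.

        # Diagonal argument sketch, assuming a hypothetical halting oracle.
        # No such total computable function can exist.
        def halts(program, argument) -> bool:
            """Hypothetical oracle: True iff program(argument) terminates."""
            raise NotImplementedError

        def diagonal(program):
            # Do the opposite of whatever the oracle predicts for program(program).
            if halts(program, program):
                while True:   # loop forever when the oracle says "halts"
                    pass
            return            # halt immediately when the oracle says "loops"

        # diagonal(diagonal) halts if and only if it does not halt,
        # so a consistent halts() cannot be defined.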

5. On the equivalence of incompressibility and incompleteness in machine learning

  • Year: 2021
  • Topics: algorithmic information theory, epistemology, Physical Church-Turing thesis

    Motivated by the question of which problems are suitably addressed using methods in machine learning, I formulate this general problem using Algorithmic Information Theory. My reason for this formulation is that the Kolmogorov Complexity appears to be a suitable measure of both incompressibility and epistemic uncertainty. In particular, I derive two observer dependence theorems that appear to have subtle epistemological implications.

    These theorems imply that randomness is not observer independent, that there is a fundamental relationship between incompressibility and inextricable epistemic uncertainty (i.e. incompleteness) in machine learning, and that in some settings the bounds on epistemic uncertainty may be more important than bounded rationality.
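
    For background, and not as a restatement of the theorems themselves, the observer dependence is naturally phrased in terms of the machine-relative definition of Kolmogorov Complexity: for a universal Turing machine \(U\),

    \[ K_U(x) \;=\; \min\{\, \ell(p) : U(p) = x \,\}, \]

    and the classical invariance theorem only guarantees agreement between two universal machines up to an additive constant, \(|K_U(x) - K_V(x)| \le c_{U,V}\).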

6. The Prime Number Theorem, or the incompressibility of the primes

  • Year: 2021
  • Topics: number theory, information theory, algorithmic information theory, decidability

    From an information-theoretic analysis of the prime number theorem, we may deduce that prime number sequences are incompressible. By pushing this analysis further, we may conclude that while a single counter-example may be used to prove that the Riemann Hypothesis is false, we can’t prove that the Riemann Hypothesis is true.
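
    For reference, the prime number theorem from which the analysis starts states that

    \[ \pi(N) \sim \frac{N}{\ln N}, \qquad \text{equivalently} \qquad p_n \sim n \ln n, \]

    where \(\pi(N)\) counts the primes up to \(N\) and \(p_n\) is the \(n\)-th prime.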

7. What exactly do applied mathematicians mean by random?

  • Year: 2021
  • Topics: applied mathematics, applied physics, engineering

    For applied mathematicians and applied physicists, a practical definition of randomness has so far proven elusive due to contradictions between theory and practice. In theory, mathematicians have a definition of algorithmic randomness which is uncomputable, whereas in practice engineers use pseudo-random number generators. Here I show that if we start from the international cryptographic standard for random number generators, a strong definition arises naturally from the notion of epistemic uncertainty combined with a combinatorial definition of the Shannon entropy.
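
    In its simplest binary form, the combinatorial definition of the Shannon entropy is a count of sequences with a given composition: for \(n\)-bit strings containing \(k\) ones, with \(k/n\) held fixed,

    \[ \frac{1}{n}\log_2 \binom{n}{k} \;\to\; H\!\left(\frac{k}{n}\right) = -\frac{k}{n}\log_2\frac{k}{n} - \left(1 - \frac{k}{n}\right)\log_2\left(1 - \frac{k}{n}\right) \quad \text{as } n \to \infty. \]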

8. Why are financial markets particularly vulnerable to the madness of crowds?

  • Year: 2021
  • Topics: financial markets, cryptocurrencies, anthropology

    Unlike most engineered systems that facilitate the operation of nation-states, the financial systems we use for resource allocation and the redistribution of economic power regularly experience failure like no other established technology. In this setting, some present decentralised finance (DeFi) as a cure-all, as it promises to reduce the influence of command-and-control economies and decentralise economic power. However, in this article I outline unique sociocultural vulnerabilities which may be amplified without careful regulation.

    If the history of human civilisation teaches us anything, it is that we should tread carefully around the romantic notions that develop around new technologies, as the first version of a technological system is often poorly regulated and frequently finds criminal use.

9. The emergence of money as a topological phase transition

  • Year: 2021
  • Topics: Economics, Monetary systems, Political science, statistical mechanics

    Through an open-ended thought experiment, I explore how money emerges through the minimisation of network costs, which grow quadratically in a barter network. Furthermore, I explore how a monetary system allows the complementary developments of commercial technology, which makes civilised life possible, and a Leviathan, without which civilisation would be impossible.

    These are the origins of the democratic nation-state, which lies in a fragile balance between anarchy and authoritarianism.
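
    To make the quadratic network cost concrete: a barter economy with \(n\) goods must maintain an exchange rate for every pair of goods, whereas a monetary economy only needs a price for each good,

    \[ \binom{n}{2} = \frac{n(n-1)}{2} \ \text{pairwise rates} \qquad \text{versus} \qquad n \ \text{prices}, \]

    so that for \(n = 100\) goods this amounts to 4,950 exchange rates against 100 prices.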

10. Revisiting the axiomatic foundations of game theory

  • Year: 2020
  • Topics: Psychology, Economics, policymaking

    Since game theory came into being in 1944 with the publication of von Neumann and Morgenstern’s ‘Theory of Games and Economic Behavior’, it has been applied to a wide range of disciplines with varying degrees of success, but it is worth noting its abysmal failure in its original domains of application: international relations and economics. The reasons for this failure are two-fold. First, game-theoretic analyses of human relations rest upon flawed assumptions of a psychological nature that aren’t constrained by psychological evidence. Second, any axiomatisation of human behaviour would need to summarise advanced knowledge of embodied cognition (i.e. complex quantitative relationships between psychology and physiology), which resists the axiomatisation process. For these reasons, the author recommends that scalable approaches to experimental psychology (rather than game theory) be used to inform domestic policy, not least because internal threats are often underestimated.

11. Probability in high dimensions

  • Year: 2019
  • Topics: high-dimensional data analysis, probability, randomness, experimental mathematics

    While running a computer simulation that involved the ratio of two random variables, whose denominator was a symmetric random variable centred at zero, I observed that, despite this, the simulation never crashed. This made me curious, and I discovered that this interesting numerical observation is in fact a theorem. While this is not a fundamental discovery, it is one of many experiences that convinced me of the increasingly important role computers will play in the discovery of mathematical theorems.

    Another interesting lesson from this experience, besides the growing importance of experimental mathematics, is that random structures, in this case a random walk, are useful for analysing high-dimensional objects. To what degree this is because our intuition for high-dimensional structures isn’t much better than random, I don’t yet know.
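
    The article’s original simulation is not reproduced here, but the following toy Python sketch, with assumed details (a standard normal denominator standing in for the symmetric variable centred at zero), illustrates the observation: an exact zero in the denominator essentially never occurs.

        import numpy as np

        # Toy illustration (assumed setup): the denominator is a continuous,
        # symmetric random variable centred at zero, so P(denominator == 0) = 0
        # and the ratio is well defined with probability one.
        rng = np.random.default_rng(seed=0)
        n_samples = 1_000_000
        numerator = rng.standard_normal(n_samples)
        denominator = rng.standard_normal(n_samples)   # symmetric about zero

        print("exact zeros in denominator:", np.count_nonzero(denominator == 0.0))
        ratio = numerator / denominator                # no crash in practice
        print("all ratios finite:", bool(np.isfinite(ratio).all()))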

12. Derivation of the isoperimetric inequality from the ideal gas equation

  • Year: 2019
  • Topics: Elasticity, Morphogenesis, Discrete Differential Geometry

    Why is it that whenever balloons are inflated they converge towards the shape of a sphere, regardless of their initial geometry? In this article I consider the contribution of the elastic material the balloons are made of by analysing the problem in two dimensions, and I demonstrate that a minimal surface may be entirely due to local mechanical instabilities.
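
    For reference, the two-dimensional isoperimetric inequality in question states that a closed curve of length \(L\) enclosing area \(A\) satisfies

    \[ L^2 \;\ge\; 4\pi A, \]

    with equality precisely for the circle, which is the two-dimensional analogue of the inflated balloon converging to a sphere.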

13. Mimesis as random graph coloring

  • Year: 2019
  • Topics: Human Anthropology, random graphs

    Inspired by René Girard’s thought-provoking masterpiece, Le Bouc Émissaire (The Scapegoat), in this article I propose a simple and tractable mechanism for mimetic desire. When we change our beliefs, we do so not because of their intrinsic value: our desire to switch from belief \(A\) to belief \(B\) is proportional to the number of adherents of belief \(B\) that we know.
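
    A minimal simulation sketch of this mechanism is given below, with assumed details: an Erdős-Rényi random graph, a hypothetical coupling constant, and one-way switches from \(A\) to \(B\).

        import numpy as np

        # Toy mimetic dynamics on a random graph (assumed details): a node holding
        # belief A switches to belief B with probability proportional to the number
        # of its neighbours that already hold belief B.
        rng = np.random.default_rng(seed=0)
        n_nodes, edge_prob, coupling = 200, 0.05, 0.1   # hypothetical parameters

        upper = np.triu(rng.random((n_nodes, n_nodes)) < edge_prob, k=1)
        adjacency = (upper | upper.T).astype(int)       # undirected random graph

        beliefs = np.zeros(n_nodes, dtype=int)          # 0 = belief A, 1 = belief B
        beliefs[rng.choice(n_nodes, size=5, replace=False)] = 1   # seed adherents of B

        for _ in range(50):
            neighbours_with_b = adjacency @ beliefs
            switch_prob = np.clip(coupling * neighbours_with_b, 0.0, 1.0)
            switching = (beliefs == 0) & (rng.random(n_nodes) < switch_prob)
            beliefs[switching] = 1

        print(f"adherents of belief B after 50 rounds: {beliefs.sum()} / {n_nodes}")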

14. Why don’t hexapods gallop?

  • Year: 2019
  • Topics: Biomechanics

    The main contribution of this paper is to explain, using simple mechanical arguments, why the overwhelming majority of hexapods don’t employ the rectilinear gallop.

15. On the equivalence of causal path entropy and empowerment

  • Year: 2018
  • Topics: AI, Uncertainty, Information Theory, Statistical Mechanics

    Motivated by discussions with reinforcement learning researchers, some of whom believed these two theories were equivalent while others did not know the relation between them, the main contribution of this paper is to show that the two intrinsic reward functions, Causal Path Entropy and Empowerment, are equivalent only in deterministic environments. In non-deterministic environments, it is shown that the Causal Path Entropy has fundamental weaknesses compared to Empowerment. Moreover, the author demonstrates that the difference between Causal Path Entropy and Empowerment can’t be increased without diminishing Empowerment.
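
    For reference, the two intrinsic reward functions compared here are commonly defined as follows (the article’s own notation may differ): Empowerment is the channel capacity from an agent’s action sequence to its resulting future state, and the Causal Path Entropy is the path-space entropy of feasible futures,

    \[ \mathfrak{E}(s_t) = \max_{p(a_t^n)} I\!\left(A_t^n;\, S_{t+n} \mid s_t\right), \qquad S_c(x_0, \tau) = -k_B \int P\!\left(x(\cdot) \mid x_0\right) \ln P\!\left(x(\cdot) \mid x_0\right) \, \mathcal{D}x(\cdot), \]

    where the path integral runs over all trajectories of duration \(\tau\) starting from \(x_0\).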