Revisiting the unreasonable effectiveness of mathematics

More than 60 years after Eugene Wigner’s highly influential essay on the unreasonable effectiveness of mathematics in the natural sciences, it may be time for a re-appraisal.

Aidan Rocke https://github.com/AidanRocke
01-05-2022

Introduction:

Sixty-two years after Eugene Wigner published his highly influential essay on the unreasonable effectiveness of mathematics in the natural sciences [9], it may be time for a re-appraisal. On balance, given important theoretical advances in algorithmic information theory and quantum computation, the remarkable effectiveness of mathematics in the natural sciences appears quite reasonable.

By effectiveness, we are explicitly referring to Wigner’s non-trivial observation that mathematical laws have remarkable generalisation power: laws inferred from a limited number of observations continue to hold far beyond the regime in which they were originally discovered.

An information-theoretic perspective:

An acute observer will note that the same mathematical laws with remarkable generalisation power in the natural sciences are also constrained by Occam’s razor. Given two computable theories that yield negligible experimental error, Einstein explicitly stated that a physicist ought to choose the simpler one:

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience. (Einstein, 1933)

In fact, from an information-theoretic perspective, the remarkable generalisation power of mathematical laws in the natural sciences is a direct consequence of the effectiveness of Occam’s razor: simpler theories have shorter descriptions, and shorter descriptions receive exponentially greater prior weight under algorithmic probability [11], so the simple laws that compress past observations are precisely the ones most likely to predict future ones.

The Law of Conservation of Information:

From an information-theoretic perspective, a Universe where Occam’s razor is generally applicable is one where information is generally conserved. This law of conservation of information, which dates back to von Neumann, essentially states that the von Neumann entropy is invariant under unitary transformations. This is meaningful within the framework of Everettian Quantum Mechanics, where a density matrix may be assigned to the state of the Universe as a whole. In this way, information is conserved as we run a simulation of the Universe forward in time.
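
To make this concrete, here is the standard one-line derivation, assuming the usual definition of the von Neumann entropy: for a density matrix \rho and any unitary U,

S(\rho) = -\mathrm{Tr}(\rho \log \rho)

S(U \rho U^\dagger) = -\mathrm{Tr}\big( U \rho U^\dagger \log(U \rho U^\dagger) \big) = -\mathrm{Tr}\big( U \rho (\log \rho) U^\dagger \big) = -\mathrm{Tr}(\rho \log \rho) = S(\rho)

where the second equality uses \log(U \rho U^\dagger) = U (\log \rho) U^\dagger and the third uses the cyclic property of the trace.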

Moreover, given that Occam’s razor has a precise formulation within algorithmic information theory as the Minimum Description Length principle [8], this information-theoretic perspective generally presumes that the Universe itself may be simulated by a Universal Turing Machine.
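
For reference, both formulations are standard [6, 8, 11]: the MDL principle selects the hypothesis H that minimises the combined description length of hypothesis and data D, while Solomonoff’s algorithmic probability assigns each observation string x a prior weight dominated by its shortest programs p on a Universal Turing Machine U:

H_{\mathrm{MDL}} = \arg\min_{H} \big( L(H) + L(D \mid H) \big)

P(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}

Under this prior, hypotheses with short descriptions receive exponentially more weight, which is Occam’s razor in algorithmic form.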

The Physical Church-Turing thesis:

The research of David Deutsch (and others) on the Physical Church-Turing thesis explains how a Universal Quantum Computer may simulate the laws of physics [1]. This is consistent with the general belief that Quantum Mechanics may be used to simulate all of physics, which is why the most important contributions to the Physical Church-Turing thesis have come via theories of quantum computation [10].
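
As a toy illustration of the computability of quantum dynamics (a minimal numpy sketch, not Deutsch’s construction), a classical program can represent a small quantum circuit as unitary matrices and simulate its action on a state vector:

import numpy as np

# Single-qubit Hadamard gate and the 2x2 identity.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)

# Two-qubit CNOT gate (control on qubit 0, target on qubit 1).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# A small circuit: Hadamard on qubit 0, followed by CNOT.
U = CNOT @ np.kron(H, I2)

# Unitarity guarantees that information (and entropy) is conserved.
assert np.allclose(U.conj().T @ U, np.eye(4))

# Evolve |00> into the Bell state (|00> + |11>) / sqrt(2).
psi = np.zeros(4)
psi[0] = 1.0
print(U @ psi)  # ~ [0.7071, 0, 0, 0.7071]

Simulating n qubits this way costs classical resources exponential in n; Deutsch’s point is that a universal quantum computer can perform such simulations efficiently.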

More importantly, the Physical Church-Turing thesis provides us with a credible explanation for the remarkable effectiveness of mathematics in the natural sciences: if every physical process may be simulated by a universal computer, then every physical process admits a finite mathematical description.

What is truly remarkable:

If we view the scientific method as an algorithmic search procedure, then there is no reason, a priori, to suspect that any particular inductive bias should be especially powerful. This much was established by David Wolpert and William Macready with their No Free Lunch theorems [13], as the sketch below illustrates.
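
Here is a minimal empirical sketch, under assumed toy conditions (a four-point search space, boolean objective functions, and two fixed non-repeating query orders): averaged over all 2^4 possible objectives, both search strategies require the same expected number of evaluations to locate the maximum.

from itertools import product

orders = {"forward":  [0, 1, 2, 3],
          "shuffled": [2, 0, 3, 1]}  # two fixed, non-repeating query orders

def queries_to_max(f, order):
    """Number of evaluations until the global maximum of f is observed."""
    best = max(f)
    for t, x in enumerate(order, start=1):
        if f[x] == best:
            return t

# Average each strategy over all 2^4 boolean objective functions on 4 points.
for name, order in orders.items():
    total = sum(queries_to_max(f, order) for f in product([0, 1], repeat=4))
    print(name, total / 16)  # both print 1.6875: no strategy has an edge

Any advantage one query order gains on some objectives is exactly cancelled on others, so a powerful inductive bias cannot be justified a priori.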

On the other hand, the history of the natural sciences indicates that Occam’s razor is remarkably effective. Given that no inductive bias is privileged in an arbitrary universe, we may view the remarkable effectiveness of Occam’s razor as strong evidence for the Simulation Hypothesis.

References:

  1. David Deutsch. Quantum theory, the Church–Turing principle and the universal quantum computer. Proceedings of the Royal Society of London A, 400:97–117, 1985.

  2. Jeffrey M. Shainline. Does cosmological natural selection select for technology? New Journal of Physics, 2020.

  3. Max Tegmark. The Mathematical Universe. Foundations of Physics, 2007.

  4. A. N. Kolmogorov. Three approaches to the quantitative definition of information. Problems of Information Transmission, 1(1):1–7, 1965.

  5. G. J. Chaitin. On the length of programs for computing finite binary sequences: Statistical considerations. Journal of the ACM, 16(1):145–159, 1969.

  6. R. J. Solomonoff. A formal theory of inductive inference: Parts 1 and 2. Information and Control, 7:1–22 and 224–254, 1964.

  7. C. P. Schnorr. A unified approach to the definition of a random sequence. Mathematical Systems Theory, 1971.

  8. Peter D. Grünwald. The Minimum Description Length Principle. MIT Press, 2007.

  9. Eugene Wigner. The Unreasonable Effectiveness of Mathematics in the Natural Sciences. Communications on Pure and Applied Mathematics, 13(1):1–14, 1960.

  10. Michael Nielsen. Interesting problems: The Church-Turing-Deutsch Principle. 2004. https://michaelnielsen.org/blog/interestingproblems-the-church-turing-deutsch-principle/

  11. Marcus Hutter et al. Algorithmic probability. Scholarpedia, 2(8):2572, 2007.

  12. Albert Einstein and Leopold Infeld. The Evolution of Physics. Edited by C. P. Snow. Cambridge University Press, 1938.

  13. D. H. Wolpert and W. G. Macready. No Free Lunch Theorems for Optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82, 1997.

Citation

For attribution, please cite this work as

Rocke (2022, Jan. 5). Kepler Lounge: Revisiting the unreasonable effectiveness of mathematics. Retrieved from keplerlounge.com

BibTeX citation

@misc{rocke2022revisiting,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: Revisiting the unreasonable effectiveness of mathematics},
  url = {keplerlounge.com},
  year = {2022}
}