The St Petersburg Paradox, or how to gamble on the first N bits of Chaitin’s Constant

An information-theoretic analysis of the St Petersburg paradox suggests that money has the same units as entropy, because it is used for price discovery via maximum entropy inference. This offers a theoretically sound route to Daniel Bernoulli’s logarithmic utility function.

Aidan Rocke https://github.com/AidanRocke
11-27-2022

Gambling on Universal truths:

In pursuit of Universal truths, the St Petersburg game is organised as follows:

Casino manager: Has a general interest in the halting problem and its applications to open problems in mathematics.

St Petersburg machine: An arcade game simulated by an Oracle that, in exchange for good money, can solve any decision problem, including the task of determining the first \(N\) bits of Chaitin’s Constant.

Gambler: An MIT undergrad who has read too many books on information theory and has a particular fascination with Chaitin’s Number of Wisdom, which gives the probability that a randomly generated program halts.

If the Gambler correctly guesses \(N\) consecutive bits of Chaitin’s Constant, his recompense is \(2^N\) dollars; the game ends as soon as he makes an incorrect guess.
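
To make the payoff rule concrete, here is a minimal sketch in Python. The function name and the `omega_bits` argument are placeholders of my own: since no program can compute the true prefix of Chaitin’s Constant, and its bits are algorithmically random, an oracle-free Gambler is modelled as guessing each bit by a fair coin flip.

```python
import random

def st_petersburg_payout(omega_bits):
    """One game: the payout doubles with each correctly guessed bit and
    the game ends at the first incorrect guess.

    `omega_bits` is a hypothetical stand-in for the (uncomputable) prefix
    of Chaitin's Constant; the Gambler's guesses are fair coin flips.
    """
    correct = 0
    for bit in omega_bits:
        guess = random.getrandbits(1)
        if guess != bit:
            break
        correct += 1
    return 2 ** correct  # dollars paid after `correct` consecutive hits
```

Averaging this payout over many simulated games against a prefix of length \(\lfloor \log_2 N \rfloor\) exhibits the logarithmic growth in \(N\) derived in the Analysis below.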

Analysis:

Assuming that the total resources of the Casino are \(N\) dollars, the expected value of the game becomes:

\[\begin{equation} E = \sum_{k=1}^{\lfloor \log_2 N \rfloor} \frac{1}{2^k} \cdot 2^k = \lfloor \log_2 N \rfloor \sim \log_2 N \tag{1} \end{equation}\]

so the expected value grows logarithmically in the resources of the Casino, since each term in the sum contributes exactly one dollar. In fact, it equals the Minimum Description Length of the first \(\sim \log_2 N\) bits of Chaitin’s Constant, as these bits are incompressible. Here, it is worth clarifying that the Gambler is being paid by the Casino manager to reveal the bits of Chaitin’s Constant.
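
For a numerical check (the function name and the sample capitals below are illustrative, not part of the original setup), equation (1) can be evaluated directly:

```python
import math

def truncated_expectation(casino_capital):
    """Expected payout of equation (1) when the Casino's total resources
    are `casino_capital` dollars; each term contributes exactly one dollar."""
    cutoff = math.floor(math.log2(casino_capital))
    return sum((1 / 2 ** k) * 2 ** k for k in range(1, cutoff + 1))

for capital in (10, 10 ** 3, 10 ** 6):
    print(capital, truncated_expectation(capital), math.log2(capital))
# the expected payout tracks log2 of the Casino's resources
```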

Thus, in order for the manager to profit with probability:

\[\begin{equation} P = 1-2^{-\log_2 N} = 1- \frac{1}{N} \tag{2} \end{equation}\]

the Gambler ought to pay \(\log_2 N\) dollars upfront.
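
As a concrete instance, with the Casino’s capital set to \(N = 2^{20}\) dollars (a value chosen purely for illustration), the Gambler pays

\[ \log_2 N = 20 \ \text{dollars upfront}, \qquad P = 1 - 2^{-20} \approx 0.999999, \]

so the manager’s profit is all but guaranteed while the Gambler’s expected payout remains on the order of \(20\) dollars.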

From this analysis, we may deduce that money has the same units as entropy, because it is used for price discovery via methods of Maximum Entropy inference.
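
On this reading, a one-line sketch of the route to Bernoulli’s utility function (under the assumption, implicit in the game above, that a bankroll is worth the bits of Chaitin’s Constant it can purchase) is

\[ U(W) \sim \log_2 W \ \text{bits}, \]

so the utility of a wealth of \(W\) dollars grows logarithmically, as Bernoulli proposed in 1738.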

References:

  1. A. N. Kolmogorov. Three approaches to the quantitative definition of information. Problems of Information Transmission, 1(1):1–7, 1965.

  2. A. N. Kolmogorov. Combinatorial foundations of information theory and the calculus of probabilities. Russian Math. Surveys, 1983.

  3. Peter Grünwald and Paul Vitányi. Shannon Information and Kolmogorov Complexity. arXiv, 2004.

  4. L. A. Levin. Laws of information conservation (non-growth) and aspects of the foundation of probability theory. Problems of Information Transmission, 10:206–210, 1974.

  5. Marcus Hutter et al. Algorithmic probability. Scholarpedia, 2(8):2572, 2007.

  6. Walter Kirchherr, Ming Li and Paul Vitányi. The Miraculous Universal Distribution. The Mathematical Intelligencer, 1997.

  7. Yuval Dagan, Yuval Filmus et al. Twenty (simple) questions. arXiv, 2017.

  8. Daniel Bernoulli. Exposition of a New Theory on the Measurement of Risk. 1738.

  9. J. L. Kelly. A New Interpretation of Information Rate. Bell System Technical Journal, 1956.

  10. C. S. Calude and M. J. Dinneen. Exact Approximations of Omega Numbers. Int. J. Bifur. Chaos, 17:1937–1954, 2007.

Citation

For attribution, please cite this work as

Rocke (2022, Nov. 27). Kepler Lounge: The St Petersburg Paradox, or how to gamble on the first N bits of Chaitin's Constant. Retrieved from keplerlounge.com

BibTeX citation

@misc{rocke2022the,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: The St Petersburg Paradox, or how to gamble on the first N bits of Chaitin's Constant},
  url = {keplerlounge.com},
  year = {2022}
}