# Archimedes’ Constant is absolutely normal

Using the theory of Algorithmic Probability, we demonstrate that Archimedes’ Constant is absolutely normal.

Aidan Rocke https://github.com/AidanRocke
04-11-2022

Upon closer examination, the analysis below implies that Archimedes’ Constant is a physical constant.

### Lemma: Prime encodings are algorithmically random sequences.

The prime encodings $$X_N = \{x_n\}_{n=1}^N \in \{0,1\}^N$$, where $$x_n = 1$$ if $$n$$ is prime and $$x_n=0$$ otherwise, are algorithmically random sequences.
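Before proceeding to the proof, the object in question can be made concrete. The following Python sketch (an illustration only, not part of the argument) constructs the prime encoding $$X_N$$ with a Sieve of Eratosthenes and checks that its bit-sum equals $$\pi(N)$$:

```python
def prime_encoding(N):
    """Sieve of Eratosthenes: return [x_1, ..., x_N] with x_n = 1 iff n is prime."""
    is_prime = [True] * (N + 1)
    is_prime[0] = is_prime[1] = False
    for i in range(2, int(N ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, N + 1, i):
                is_prime[j] = False
    return [1 if is_prime[n] else 0 for n in range(1, N + 1)]

X = prime_encoding(100)
print(sum(X))  # pi(100) = 25
```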

### Proof:

Based on an information-theoretic demonstration of the Erdős-Kac theorem [1] and an information-theoretic analysis of the Riemann Hypothesis [4], the Algorithmic Probability of observing a prime number of magnitude $$p \in \mathbb{N}$$ is on the order of:

$$$m(p) = \lim_{N \to \infty} P(X \bmod p = 0) = \frac{1}{p}, \quad X \sim U([1,N]) \tag{1}$$$

As a result, using Levin’s Coding theorem, the Kolmogorov Complexity of $$p \in \mathbb{N}$$ is on the order of:

$$$K_U(p) = -\log_2 m(p) = \log_2 p \tag{2}$$$

which implies that a prime $$p \in \mathbb{N}$$ might as well be generated by $$\log_2 p$$ coin flips.
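The estimates (1) and (2) admit a quick numerical sanity check (a sketch only, assuming nothing beyond exact counting): the frequency of multiples of $$p$$ in $$[1,N]$$ converges to $$\frac{1}{p}$$, so $$-\log_2 m(p) \approx \log_2 p$$:

```python
import math

def divisibility_frequency(p, N):
    """Exact P(X mod p == 0) for X uniform on {1, ..., N}: count of multiples of p."""
    return (N // p) / N

N = 10**6
for p in [2, 3, 5, 7, 101]:
    m_p = divisibility_frequency(p, N)
    # -log2 m(p) should be close to log2 p, matching (2)
    print(p, m_p, -math.log2(m_p), math.log2(p))
```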

### Theorem: Archimedes’ Constant is absolutely normal.

In the following analysis, we shall demonstrate that Archimedes’ Constant is finite-state incompressible using the Three Master Keys for Probabilistic Number Theory [2].

### Proof:

Given the Euler product,

$$$\frac{\pi}{4} = \big(\prod_{p \equiv 1 \pmod 4} \frac{p}{p-1}\big) \cdot \big(\prod_{p \equiv 3 \pmod 4} \frac{p}{p+1}\big) = \frac{3}{4} \cdot \frac{5}{4} \cdot \frac{7}{8} \cdot \frac{11}{12} \cdots \tag{3}$$$

we may reformulate this product as follows:

$$$\frac{\pi}{4} = \prod_{p \in \mathbb{P} \setminus \{2\}} \frac{p}{f(p)} \tag{4}$$$

where $$f(X) = 4 \cdot (\text{argmin}_{\lambda \in \mathbb{N}} \lvert X - 4 \cdot \lambda \rvert)$$.
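The reformulation (4) can be checked numerically. The sketch below (an illustration, truncating the product at $$10^5$$) implements $$f$$ as the nearest multiple of $$4$$ and compares the partial product with $$\frac{\pi}{4}$$:

```python
import math

def primes_up_to(N):
    """All primes <= N via a Sieve of Eratosthenes."""
    sieve = [True] * (N + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(N ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [n for n in range(2, N + 1) if sieve[n]]

def f(p):
    """Nearest multiple of 4 to p; unique since p is an odd prime."""
    return 4 * round(p / 4)

partial = 1.0
for p in primes_up_to(10**5):
    if p != 2:
        partial *= p / f(p)

print(partial, math.pi / 4)
```

Convergence of the conditionally convergent product is slow and depends on ordering the primes by magnitude, so only loose agreement is expected at this cutoff.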

Now, considering that $$K_U(f)= \mathcal{O}(1)$$ for any computable function $$f$$, we have:

$$$K_U(\frac{\pi}{4}) = \lim_{N \to \infty} K_U \big(\prod_{n=2}^N p_n \big) + \mathcal{O}(1) \tag{5}$$$

Furthermore, considering the information-theoretic derivation of the Prime Number Theorem [3], the statistical distribution of primes $$p_n$$ may be modeled by independent random variables $$\widehat{p_n} \sim U([1,N])$$. We then use the fact that if $$f$$ is invertible and $$X,Y$$ are independent random variables, the entropy satisfies:

$$$H(f(X,Y)) = H(X) + H(Y) \tag{6}$$$

so we have:

$$$\mathbb{E}[K_U \big(\prod_{n=2}^N \widehat{p_n} \big)] \sim \sum_{n=2}^N \mathbb{E}[K_U(\widehat{p_n})] \sim \sum_{n=2}^N H(\widehat{p_n}) \tag{7}$$$

where invertibility is guaranteed by the Unique Factorization Theorem.

It follows that we may derive the asymptotic relation:

$$$\sum_{n=2}^N H(\widehat{p_n}) \sim \pi(N) \cdot \big(\sum_{p \leq N} m(p) \cdot \ln p\big) \sim \pi(N) \cdot \big(\sum_{p \leq N} \frac{1}{p} \cdot \ln p\big) \tag{8}$$$

which may be deduced from the Lemma.

Using the Shannon Source Coding theorem, expression (8) may be identified with:

$$$\mathbb{E}[K(X_N)] \sim \pi(N) \cdot \big(\sum_{k =1 }^N \frac{1}{k} \big) \sim \pi(N) \cdot \ln N \sim N \tag{9}$$$

where $$X_N$$ is the prime encoding of length $$N$$.
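As a numerical sanity check on the asymptotics in (9) (an illustration only, assuming nothing beyond exact counting), the following sketch compares $$\pi(N) \cdot \sum_{k=1}^N \frac{1}{k}$$ with $$N$$; the ratio tends to $$1$$ only slowly, so a value somewhat above $$1$$ is expected at this scale:

```python
def pi_of(N):
    """pi(N): count of primes <= N via a sieve."""
    sieve = [True] * (N + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(N ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return sum(sieve)

N = 10**5
H_N = sum(1 / k for k in range(1, N + 1))  # harmonic number ~ ln N
lhs = pi_of(N) * H_N                       # expression (9)
print(lhs / N)
```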

Now, using the Asymptotic Equipartition Property we may note that the average information gained from observing a prime number of unknown magnitude in the interval $$[1,N]$$ is dominated by the typical probability $$\frac{1}{N}$$:

$$$-\frac{1}{N} \ln \prod_{k=1}^N P(x_k = 1) = \frac{-\ln \prod_{k=1}^N \frac{1}{k}}{N} = \frac{\ln N!}{N} \sim \frac{N \cdot \ln N - N}{N} \sim \ln N \tag{10}$$$
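The Stirling estimate used in (10) can be verified directly; `math.lgamma` computes $$\ln \Gamma(N+1) = \ln N!$$ stably without forming the factorial (a numerical aside, not part of the proof):

```python
import math

N = 10**6
avg_info = math.lgamma(N + 1) / N  # ln(N!) / N, computed stably
stirling = math.log(N) - 1         # (N ln N - N) / N
print(avg_info, stirling, math.log(N))
```

The two quantities agree to several decimal places at this scale, and both are asymptotic to $$\ln N$$.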

Hence, a representation of Archimedes’ Constant of length $$N$$ relative to any description language $$U$$ corresponds to $$\pi(N)$$ incompressible strings $$\widehat{x_p} \in \{0,1\}^*$$ of length $$\sim \ln N$$ where each string occurs with Algorithmic Probability:

$$$m(\widehat{x_p}) \sim e^{-\ln N} = \frac{1}{N} \tag{11}$$$

As the entropy of Archimedes’ Constant is dominated by the product of uniformly distributed random variables, its entropy is invariant to permutations of these variables. It follows that the Algorithmic Probability of observing the entire string of length $$N$$ is given by:

$$$P(\widehat{x}_{p_1},...,\widehat{x}_{p_{\pi(N)}}) = \prod_{p \leq N} m(\widehat{x_p}) \sim \big(\frac{1}{N}\big)^{\pi(N)} = e^{-\pi(N) \cdot \ln N} = e^{-N(1 + o(1))} \tag{12}$$$

which allows us to determine the expected information gained:

$$$-\ln P(\widehat{x}_{p_1},...,\widehat{x}_{p_{\pi(N)}}) \sim -\ln \big(\frac{1}{N}\big)^{\pi(N)} \sim \pi(N) \cdot \ln N \sim N \tag{13}$$$

Hence, we may conclude that the digits of Archimedes’ Constant follow a Maximum Entropy distribution, so that randomly sampled substrings of equal length have equal entropy. Therefore, Archimedes’ Constant is absolutely normal.

## References:

1. Rocke (2022, Jan. 11). Kepler Lounge: An information-theoretic proof of the Erdős-Kac theorem. Retrieved from keplerlounge.com

2. Rocke (2022, Jan. 15). Kepler Lounge: Three master keys for Probabilistic Number Theory. Retrieved from keplerlounge.com

3. Rocke (2022, Jan. 12). Kepler Lounge: An information-theoretic derivation of the Prime Number Theorem. Retrieved from keplerlounge.com

4. Rocke (2022, March 8). Kepler Lounge: The Von Neumann Entropy and the Riemann Hypothesis. Retrieved from keplerlounge.com

### Citation

Rocke (2022, April 11). Kepler Lounge: Archimedes' Constant is absolutely normal. Retrieved from keplerlounge.com
@misc{rocke2022archimedes,
  author = {Rocke, Aidan},
  title = {Kepler Lounge: Archimedes' Constant is absolutely normal},
  url = {keplerlounge.com},
  year = {2022}
}