## Introduction:

I stumbled upon this analysis from [1] due to my interest in the Planck scale. While I find the GUP intriguing, what is even more interesting is the natural correspondence that emerges between civilisations incapable of Planck-scale engineering and civilisations within the Turing limit.

For this reason, I think the Physical Church-Turing thesis is more appropriately named the Planck-Church-Turing thesis.

## Plausible arguments for a Generalised Uncertainty Principle:

Given that a photon has energy $$h \nu$$, and therefore an effective mass of:

$$M = \frac{h \nu}{c^2} = \frac{h}{c \lambda}$$

we should observe gravitational effects on this particle. This gravitational force should accelerate the particle, making its already uncertain position even more uncertain.
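For a sense of scale, here is a quick numeric check of the effective-mass formula above; the green-photon wavelength is an illustrative choice, not from the source:

```python
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
lam = 500e-9     # wavelength of a green photon, m (illustrative choice)

# effective photon mass M = h / (c * lambda)
M = h / (c * lam)
print(M)  # roughly 4.4e-36 kg
```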

Now, using Newtonian mechanics, we may estimate the variation in acceleration and position due to gravity to be roughly:

$$\Delta a_g \approx \frac{GM}{r^2} = \frac{G(h/\lambda c)}{r^2}$$

$$\Delta x_g \approx \Delta a_g \cdot t^2 \approx G(h/\lambda c) \cdot \frac{t^2}{r^2}$$

where $$r$$ and $$t$$ denote the average distance and time of interaction.

The only characteristic velocity of the system is the photon velocity $$c$$, so we have:

$$\frac{r}{t} \approx c$$

so we obtain the following uncertainty due to gravity:

$$\Delta x_g \approx \frac{Gh}{\lambda c^3} \approx \frac{G \hbar/ c^3}{\lambda} = \frac{l_p^2}{\lambda}$$

where $$l_p$$ is the Planck length.
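A minimal numeric sketch of this estimate, using approximate CODATA-style values for the constants (the gamma-ray wavelength is an illustrative choice):

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
hbar = 1.055e-34  # reduced Planck constant, J*s

# Planck length l_p = sqrt(G * hbar / c^3)
l_p = (G * hbar / c**3) ** 0.5
print(l_p)  # ~1.6e-35 m

# gravitational position uncertainty dx_g = l_p^2 / lambda
lam = 1e-15  # hard gamma-ray wavelength, m (illustrative)
dx_g = l_p**2 / lam
print(dx_g)  # ~2.6e-55 m, utterly negligible at accessible wavelengths
```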

Given that $$\Delta x \approx \frac{\hbar}{\Delta p}$$ due to the Heisenberg Uncertainty Principle, we may derive the GUP by summing the uncertainties:

$$\Delta x \approx \frac{\hbar}{\Delta p} + l_p^2 \cdot \frac{\Delta p}{\hbar}$$

If we set $$u = \frac{\hbar}{\Delta p}$$ and write the total uncertainty as a function $$\delta(u)$$, we find:

$$\delta (u) = u + \frac{l_p^2}{u}$$

$$\frac{d \delta (u)}{d u} = 1 - \frac{l_p^2}{u^2} = 0 \implies u = l_p$$

and as a result, we have:

$$\Delta x \geq 2 \cdot l_p$$
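The minimisation above can be checked numerically; a minimal sketch:

```python
l_p = 1.616e-35  # Planck length, m

def delta(u):
    """Total uncertainty delta(u) = u + l_p^2 / u."""
    return u + l_p**2 / u

# the stationary point u = l_p gives delta = 2 * l_p ...
min_value = delta(l_p)
# ... and nearby points give strictly larger uncertainty
left, right = delta(0.5 * l_p), delta(2.0 * l_p)
print(min_value / l_p, left / l_p, right / l_p)  # approximately 2.0, 2.5, 2.5
```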

## Discussion:

Planck first noted in 1899 the existence of a system of units based on the three fundamental constants:

$$G = 6.67 \cdot 10^{-11} m^3 \cdot \text{kg}^{-1} \cdot s^{-2}$$

$$c = 3.00 \cdot 10^8 m \cdot s^{-1}$$

$$h = 6.63 \cdot 10^{-34} \text{kg} \cdot m^2 \cdot s^{-1}$$

where $$\hbar = \frac{h}{2 \pi}$$.

In particular, Planck found that these constants are dimensionally independent, in the sense that no non-trivial combination of them is dimensionless and that length, time and mass may each be constructed from them. In fact, the reader may check that requiring a combination $$G^x c^y h^z$$ to be dimensionless, so that the exponents of metres, kilograms and seconds each vanish, yields the linear system:

$$3x+y+2z = 0$$

$$z-x = 0$$

$$2x+y+z = 0$$

where $$x, y, z \in \mathbb{Q}$$. The reader may check that the only solution is $$(x,y,z) = (0,0,0)$$, so no non-trivial dimensionless combination of $$G$$, $$c$$ and $$h$$ exists.
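One way to see that the trivial solution is the only one is to note that the coefficient matrix of the system has non-zero determinant; a minimal sketch:

```python
# coefficient matrix of the linear system above
A = [[3, 1, 2],   # 3x + y + 2z = 0
     [-1, 0, 1],  #  -x     +  z = 0
     [2, 1, 1]]   # 2x + y +  z = 0

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

print(det3(A))  # -2: non-zero, so (0, 0, 0) is the only solution
```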

Meanwhile, the energy associated with the Planck mass is given by:

$$E_p = M_p \cdot c^2 \approx 1.22 \cdot 10^{19} \text{GeV}$$
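This figure can be reproduced directly from the constants; a sketch using approximate CODATA-style values:

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
hbar = 1.055e-34  # reduced Planck constant, J*s
eV = 1.602e-19    # joules per electronvolt

# Planck mass M_p = sqrt(hbar * c / G), then Planck energy E_p = M_p * c^2
M_p = (hbar * c / G) ** 0.5
E_p_GeV = M_p * c**2 / eV / 1e9
print(E_p_GeV)  # ~1.22e19 GeV
```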

It’s worth noting that observational analysis of Planck-scale physics is highly problematic: modern high-energy particle experiments reach energies on the order of $$10^3$$ GeV, and even the highest-energy cosmic rays detected are on the order of $$\sim 10^{11}$$ GeV, far below the Planck scale. For this reason, the Planck scale effectively marks the limit of the Standard Model, and therefore the physical limits of computer engineering for human civilisations operating within the Standard Model.

Finally, given that a good understanding of physics at the Planck scale would potentially allow us to engineer black holes capable of computing any limit-computable function, there is a correspondence between civilisations within the Turing limit and civilisations that have not yet mastered the principles of Planck-scale engineering.