A stochastic sequence with symmetry hiding in plain sight

## Introduction:

A few days ago I decided to analyse the symmetries of the two-thirds power law [1], and this analysis naturally led to the following kinematic sequence:

$$\begin{pmatrix} \dot{x}_{n+1} \\ \ddot{x}_{n+1} \end{pmatrix} = \begin{pmatrix} a_n & b_n \\ c_n & d_n \end{pmatrix} \begin{pmatrix} \ddot{x}_n \\ \dot{x}_n \end{pmatrix} = M_n \begin{pmatrix} \ddot{x}_n \\ \dot{x}_n \end{pmatrix}$$

where $M_n \in SL(2, \mathbb{R})$ is a volume-preserving transformation (the same $M_n$ acts on $(\ddot{y}_n, \dot{y}_n)$) and the position is updated using:

$$x_{n+1} = x_n + \dot{x}_n\cdot \Delta t + \frac{1}{2} \ddot{x}_n \cdot \Delta t^2$$
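As a quick sanity check, the position update is the standard constant-acceleration step; a minimal sketch (the function name is mine):

```python
def update_position(x, x_dot, x_ddot, dt):
    """Constant-acceleration step: x + x_dot*dt + 0.5*x_ddot*dt**2."""
    return x + x_dot * dt + 0.5 * x_ddot * dt ** 2

print(update_position(0.0, 1.0, 2.0, 0.1))  # 1.0*0.1 + 0.5*2.0*0.01 ≈ 0.11
```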

Now, in order to make sure that $ad-bc=1$ I decided to use the trigonometric identity:

$$\cos^2(\theta) + \sin^2(\theta) = 1$$

so we only have to sample three random numbers $\alpha, \beta, \theta \in \mathbb{R}$ and set, for example:

$$M_n = \begin{pmatrix} \alpha \cos(\theta) & \beta \sin(\theta) \\ -\frac{1}{\beta} \sin(\theta) & \frac{1}{\alpha} \cos(\theta) \end{pmatrix}$$

For the rest of the discussion we shall assume that $\alpha,\beta \sim (-1)^{\operatorname{Bern}(0.5)} \cdot U(0.1,10)$ and $\theta \sim U(0,1)$ .
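This sampling scheme can be sketched in a few lines of NumPy. The particular arrangement of $\alpha$, $\beta$, $\theta$ inside $M_n$ is my assumption; any placement whose determinant reduces to $\cos^2(\theta) + \sin^2(\theta)$ works equally well:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_M(rng):
    """One draw of M_n in SL(2, R): det = cos^2(theta) + sin^2(theta) = 1."""
    alpha = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)  # (-1)^Bern(0.5) * U(0.1, 10)
    beta = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    theta = rng.uniform(0.0, 1.0)  # U(0, 1)
    return np.array([[alpha * np.cos(theta), beta * np.sin(theta)],
                     [-np.sin(theta) / beta, np.cos(theta) / alpha]])

print(np.linalg.det(sample_M(rng)))  # ≈ 1.0 up to floating-point error
```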

Now, the key question I have is whether:

$$\mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big] = \text{Cst}$$

i.e. whether the expected value of the rate of change is constant.

## Using a symmetry to simplify calculations:

### A tale of two branching processes:

The following diagram, derived from the first figure, is a particularly useful way of visualising the trajectory of our stochastic sequence:

*Figure: a tale of two branching processes.*

If we use $\Sigma_{1}^n$ and $\Sigma_{2}^n$ to denote random variables associated with the first and second kinds of branching processes, we may simplify the matrix recurrence so that we have:

$$\ddot{x_n} = \ddot{x_0} \cdot c_0 \cdot \Sigma_{2}^n + \dot{x_0} \cdot d_0 \cdot \Sigma_{2}^n = q_1 \Sigma_{2}^n$$

$$\dot{x_n} = \ddot{x_0} \cdot a_0 \cdot \Sigma_{1}^n + \dot{x_0} \cdot b_0 \cdot \Sigma_{1}^n = q_2 \Sigma_{1}^n$$

Similarly, we find that for $\ddot{y}_n$ and $\dot{y}_n$ we have:

$$\ddot{y_n} = \ddot{y_0} \cdot c_0 \cdot \Sigma_{2}^n + \dot{y_0} \cdot d_0 \cdot \Sigma_{2}^n = q_3 \Sigma_{2}^n$$

$$\dot{y_n} = \ddot{y_0} \cdot a_0 \cdot \Sigma_{1}^n + \dot{y_0} \cdot b_0 \cdot \Sigma_{1}^n = q_4 \Sigma_{1}^n$$

### Analysis of the rate of change:

Given the position update rule we may deduce that:

$$\frac{\Delta y_n}{\Delta x_n} = \frac{y_{n+1}-y_n}{x_{n+1}-x_n} = \frac{\dot{y_n} \Delta t + \frac{1}{2} \ddot{y_n} \Delta t^2}{\dot{x_n} \Delta t + \frac{1}{2} \ddot{x_n} \Delta t^2} = \frac{\dot{y_n} + h\ddot{y_n}}{\dot{x_n} + h \ddot{x_n}}$$

where $h = \frac{\Delta t}{2}$.

Now, using the four expressions for $\dot{x}_n$, $\ddot{x}_n$, $\dot{y}_n$ and $\ddot{y}_n$ above, we find that:

$$\frac{\Delta y_n}{\Delta x_n} = \frac{\dot{y_n} + h\ddot{y_n}}{\dot{x_n} + h \ddot{x_n}} = \frac{q_4 \Sigma_{1}^n + h \cdot q_3 \Sigma_{2}^n}{q_2 \Sigma_{1}^n + h \cdot q_1 \Sigma_{2}^n}$$
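The ratio can also be computed directly from the exact recurrence, without the $\Sigma$ factorization. In the sketch below the pair $(\dot{x}_n, \ddot{x}_n)$ is taken as the state vector, the same $M_n$ drives both coordinates, and the $M_n$ parametrization and initial conditions are illustrative assumptions of mine:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01
h = dt / 2.0

def sample_M(rng):
    """M_n in SL(2, R) built from alpha, beta, theta (assumed parametrization)."""
    alpha = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    beta = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    theta = rng.uniform(0.0, 1.0)
    return np.array([[alpha * np.cos(theta), beta * np.sin(theta)],
                     [-np.sin(theta) / beta, np.cos(theta) / alpha]])

vx = np.array([1.0, 0.5])   # (x_dot_0, x_ddot_0), illustrative
vy = np.array([-0.3, 2.0])  # (y_dot_0, y_ddot_0), illustrative

for _ in range(10):  # n = 10 steps; the same M_n acts on both coordinates
    M = sample_M(rng)
    vx, vy = M @ vx, M @ vy

# Delta y_n / Delta x_n = (y_dot_n + h * y_ddot_n) / (x_dot_n + h * x_ddot_n)
ratio = (vy[0] + h * vy[1]) / (vx[0] + h * vx[1])
print(ratio)
```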

## An experimental observation:

### Expected values of $\Sigma_{1}^n$ and $\Sigma_{2}^n$:

It’s useful to note that, given that the matrices $M_n$ are independent and:

$$\forall n \in \mathbb{N}, \mathbb{E}[M_n] = 0$$

we may deduce that:

$$\mathbb{E}[\Sigma_{1}^n] =\mathbb{E}[\Sigma_{2}^n]= 0$$
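The entrywise mean $\mathbb{E}[M_n] = 0$ follows from the symmetric random signs of $\alpha$ and $\beta$, and is easy to confirm by Monte Carlo (the parametrization of $M_n$ below is my assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_M(rng):
    """Assumed parametrization of M_n; every entry is odd in alpha or beta."""
    alpha = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    beta = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    theta = rng.uniform(0.0, 1.0)
    return np.array([[alpha * np.cos(theta), beta * np.sin(theta)],
                     [-np.sin(theta) / beta, np.cos(theta) / alpha]])

# Each entry flips sign with the sign of alpha or beta, so its mean is 0.
mean_M = np.mean([sample_M(rng) for _ in range(200_000)], axis=0)
print(mean_M)  # all four entries close to 0
```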

### Numerical experiments with $\frac{\Delta y_n}{\Delta x_n}$:

My intuition told me from the beginning that these vanishing expectations might be useful for analysing the expected value of $\frac{\Delta y_n}{\Delta x_n}$. In fact, numerical experiments suggest:

$$\frac{\Delta y_n}{\Delta x_n} \approx \frac{q_4}{q_2}$$

To be precise, numerical experiments show that the sign of $\frac{q_4}{q_2}$ agrees with the sign of $\frac{\Delta y_n}{\Delta x_n}$ in 100% of trials, and in more than 70% of trials the two quantities agree to within a factor of 1.5.
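A sketch of this experiment, under my assumed parametrization of $M_n$ and with $q_2$, $q_4$ taken to be the first-step velocities $\dot{x}_1$, $\dot{y}_1$ (the quantities singled out by the factorization, evaluated at $n = 1$):

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n_steps, n_trials = 0.01, 10, 10_000
h = dt / 2.0

def sample_M(rng):
    """Assumed parametrization of M_n in SL(2, R)."""
    alpha = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    beta = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    theta = rng.uniform(0.0, 1.0)
    return np.array([[alpha * np.cos(theta), beta * np.sin(theta)],
                     [-np.sin(theta) / beta, np.cos(theta) / alpha]])

sign_agree = within_factor = 0
for _ in range(n_trials):
    vx, vy = np.array([1.0, 0.5]), np.array([-0.3, 2.0])  # illustrative
    M0 = sample_M(rng)
    vx, vy = M0 @ vx, M0 @ vy
    q2, q4 = vx[0], vy[0]  # first-step velocities x_dot_1, y_dot_1
    for _ in range(n_steps - 1):
        M = sample_M(rng)
        vx, vy = M @ vx, M @ vy
    ratio = (vy[0] + h * vy[1]) / (vx[0] + h * vx[1])
    pred = q4 / q2
    sign_agree += int(np.sign(ratio) == np.sign(pred))
    within_factor += int(2 / 3 < abs(ratio / pred) < 3 / 2)

print(sign_agree / n_trials, within_factor / n_trials)
```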

## Analysis:

If we take the limit as $h \rightarrow 0$:

$$\lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \lim_{h \to 0} \frac{q_4 \Sigma_1^n+h \cdot q_3 \cdot \Sigma_2^n}{q_2 \Sigma_1^n+h \cdot q_1 \cdot \Sigma_2^n} = \frac{q_4}{q_2}$$
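This limit is easy to check numerically on a fixed realization: holding the sampled matrices fixed and shrinking $h$, the ratio converges to $\frac{\dot{y}_n}{\dot{x}_n}$, which is exactly the role $\frac{q_4}{q_2}$ plays in the factorized form (the $M_n$ parametrization below is my assumption):

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_M(rng):
    """Assumed parametrization of M_n in SL(2, R)."""
    alpha = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    beta = rng.choice([-1.0, 1.0]) * rng.uniform(0.1, 10.0)
    theta = rng.uniform(0.0, 1.0)
    return np.array([[alpha * np.cos(theta), beta * np.sin(theta)],
                     [-np.sin(theta) / beta, np.cos(theta) / alpha]])

vx, vy = np.array([1.0, 0.5]), np.array([-0.3, 2.0])  # illustrative
for _ in range(10):  # one fixed realization of n = 10 steps
    M = sample_M(rng)
    vx, vy = M @ vx, M @ vy

for H in [1e-1, 1e-3, 1e-6]:
    print(H, (vy[0] + H * vy[1]) / (vx[0] + H * vx[1]))
print("limit:", vy[0] / vx[0])  # the ratio approaches y_dot_n / x_dot_n
```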

so it appears that the quality of the approximation I observed numerically depends on $h$. It’s still not clear to me how to calculate $\mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big]$ directly, which was my original question.

## Conjecture:

While I’m still looking for a closed-form expression for $\mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big]$, my previous analysis leads me to conjecture that, for any random matrices $M_i$ sampled i.i.d.:

$$\lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \frac{q_4}{q_2}$$

which is a general result I didn’t expect in advance.

Now, given that there is strong numerical evidence for the approximation $\frac{\Delta y_n}{\Delta x_n} \approx \frac{q_4}{q_2}$ regardless of the magnitude of $\Delta t$, I wonder whether we can show:

$$\lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n} \big]$$

# References:

1. D. Huh & T. J. Sejnowski. Spectrum of power laws for curved hand movements. PNAS, 2015.