A stochastic sequence with symmetry hiding in plain sight


A few days ago I decided to analyse the symmetries of the two-thirds power law [1] and this analysis naturally led to the following kinematic sequence:

where $M_n$ is a volume-preserving transformation and the position is updated using:

\begin{equation} x_{n+1} = x_n + \dot{x}_n\cdot \Delta t + \frac{1}{2} \ddot{x}_n \cdot \Delta t^2 \end{equation}
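This second-order update is straightforward to express in code. A minimal sketch (the function name and test values are my own, not from the post):

```python
# Constant-acceleration (second-order Taylor) position update for one step.
def position_step(x, v, a, dt):
    """Advance position x given velocity v and acceleration a over a step dt."""
    return x + v * dt + 0.5 * a * dt ** 2

# Example: x=0, v=1, a=2, dt=0.1  ->  0 + 1*0.1 + 0.5*2*0.01 = 0.11
print(position_step(0.0, 1.0, 2.0, 0.1))
```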

Now, in order to make sure that $M_n$ is volume-preserving, I decided to use the trigonometric identity:

\begin{equation} \cos^2(\theta) + \sin^2(\theta) = 1 \end{equation}
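One way to read this identity: a $2 \times 2$ rotation matrix built from a single angle has determinant $\cos^2(\theta) + \sin^2(\theta) = 1$, so it preserves volume automatically. A minimal sketch in pure Python (helper names are my own):

```python
import math
import random

def random_rotation(rng):
    """Sample a 2x2 rotation matrix; cos^2 + sin^2 = 1 gives det = 1."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def det2(m):
    """Determinant of a 2x2 matrix given as nested lists."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

M = random_rotation(random.Random(0))
print(det2(M))  # 1.0 up to floating-point error
```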

so we only have to sample three random numbers:

For the rest of the discussion we shall assume that and .
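The full recursion isn't reproduced above, but one plausible minimal model consistent with the equations that follow is: apply a fresh random rotation $M_n$ to the (acceleration, velocity) pair at each step, then update the position with equation (1). A sketch under those assumptions (initial values arbitrary, all names my own):

```python
import math
import random

def simulate(steps, dt, rng):
    """Sketch: a fresh random rotation M_n acts on (accel, vel) each step,
    then the position is advanced with the second-order kinematic rule."""
    x, v, a = 0.0, 1.0, 0.5              # x_0, x-dot_0, x-ddot_0 (arbitrary)
    xs = [x]
    for _ in range(steps):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        c, s = math.cos(theta), math.sin(theta)
        a, v = c * a - s * v, s * a + c * v   # (a, v) <- M_n (a, v)
        x = x + v * dt + 0.5 * a * dt ** 2    # equation (1)
        xs.append(x)
    return xs

path = simulate(100, 0.01, random.Random(42))
```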

Now, the key question I have is whether:

\begin{equation} \mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big] = \text{Cst} \end{equation}

i.e. whether the expected value of the rate of change is constant.

Using a symmetry to simplify calculations:

A tale of two branching processes:

The following diagram, derived from the first figure, is particularly useful for visualising the trajectory of our stochastic sequence:

[Figure: A tale of two branching processes]

If we use $\Sigma_{1}^n$ and $\Sigma_{2}^n$ to denote random variables associated with the first and second kinds of branching processes, we may simplify (1) so we have:

\begin{equation} \ddot{x}_n = \ddot{x}_0 \cdot c_0 \cdot \Sigma_{2}^n + \dot{x}_0 \cdot d_0 \cdot \Sigma_{2}^n = q_1 \Sigma_{2}^n \end{equation}

\begin{equation} \dot{x}_n = \ddot{x}_0 \cdot a_0 \cdot \Sigma_{1}^n + \dot{x}_0 \cdot b_0 \cdot \Sigma_{1}^n = q_2 \Sigma_{1}^n \end{equation}

Similarly, we find that for $\ddot{y}_n$ and $\dot{y}_n$ we have:

\begin{equation} \ddot{y}_n = \ddot{y}_0 \cdot c_0 \cdot \Sigma_{2}^n + \dot{y}_0 \cdot d_0 \cdot \Sigma_{2}^n = q_3 \Sigma_{2}^n \end{equation}

\begin{equation} \dot{y}_n = \ddot{y}_0 \cdot a_0 \cdot \Sigma_{1}^n + \dot{y}_0 \cdot b_0 \cdot \Sigma_{1}^n = q_4 \Sigma_{1}^n \end{equation}

Analysis of the rate of change:

Given equation (3) we may deduce that:

\begin{equation} \frac{\Delta y_n}{\Delta x_n} = \frac{y_{n+1}-y_n}{x_{n+1}-x_n} = \frac{\dot{y}_n \Delta t + \frac{1}{2} \ddot{y}_n \Delta t^2}{\dot{x}_n \Delta t + \frac{1}{2} \ddot{x}_n \Delta t^2} = \frac{\dot{y}_n + h\ddot{y}_n}{\dot{x}_n + h \ddot{x}_n} \end{equation}

where $h = \frac{\Delta t}{2}$.
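The cancellation of the common factor $\Delta t$ in this ratio is easy to sanity-check numerically (the test values below are arbitrary):

```python
# Check that (v*dt + a*dt^2/2) / (w*dt + b*dt^2/2) == (v + h*a) / (w + h*b)
# with h = dt/2, i.e. the common factor dt cancels.
dt = 0.1
h = dt / 2
yd, ydd = 0.7, -0.3   # y-dot_n, y-ddot_n (arbitrary)
xd, xdd = 1.2, 0.4    # x-dot_n, x-ddot_n (arbitrary)

lhs = (yd * dt + 0.5 * ydd * dt ** 2) / (xd * dt + 0.5 * xdd * dt ** 2)
rhs = (yd + h * ydd) / (xd + h * xdd)
print(abs(lhs - rhs))  # 0 up to floating-point error
```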

Now, using equations (7), (8), (9) and (10) we find that:

\begin{equation} \frac{\Delta y_n}{\Delta x_n} = \frac{\dot{y}_n + h\ddot{y}_n}{\dot{x}_n + h \ddot{x}_n} = \frac{q_4 \Sigma_{1}^n + h \cdot q_3 \Sigma_{2}^n}{q_2 \Sigma_{1}^n + h \cdot q_1 \Sigma_{2}^n} \end{equation}

An experimental observation:

Expected values of $\Sigma_{1}^n$ and $\Sigma_{2}^n$:

It’s useful to note that, given that the matrices $M_n$ are independent and:

\begin{equation} \forall n \in \mathbb{N}, \mathbb{E}[M_n] = 0 \end{equation}

we may deduce that:

\begin{equation} \mathbb{E}[\Sigma_{1}^n] =\mathbb{E}[\Sigma_{2}^n]= 0 \end{equation}
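The mechanism here is that each term of $\Sigma_{i}^n$ is a product of entries drawn from independent mean-zero matrices, and the expectation of a product of independent mean-zero factors is zero. A small Monte Carlo illustration (the uniform distribution and sample count are my own choices, not from the post):

```python
import random

# Each sample mimics one term of Sigma_i^n: a product of three
# independent mean-zero factors, whose expectation is therefore 0.
rng = random.Random(0)
N = 20000
samples = [rng.uniform(-1, 1) * rng.uniform(-1, 1) * rng.uniform(-1, 1)
           for _ in range(N)]
mean = sum(samples) / N
print(abs(mean))  # close to 0
```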

Numerical experiments with $\frac{\Delta y_n}{\Delta x_n}$:

My intuition told me from the beginning that (12) might be useful for analysing the expected value of $\frac{\Delta y_n}{\Delta x_n}$. In fact, numerical experiments suggest:

\begin{equation} \frac{\Delta y_n}{\Delta x_n} \approx \frac{q_4}{q_2} \end{equation}

To be precise, numerical experiments show that 100% of the time the sign of $\frac{\Delta y_n}{\Delta x_n}$ agrees with the sign of $\frac{q_4}{q_2}$, and more than 70% of the time these two numbers differ by less than a factor of 1.5, i.e. roughly a 30% difference.


If we take the limit as $h \to 0$:

\begin{equation} \lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \lim_{h \to 0} \frac{q_4 \Sigma_1^n+h \cdot q_3 \cdot \Sigma_2^n}{q_2 \Sigma_1^n+h \cdot q_1 \cdot \Sigma_2^n} = \frac{q_4}{q_2} \end{equation}

so it appears that what I observed numerically depends on the ratio $\frac{q_4}{q_2}$, and it’s still not clear to me how to calculate $\mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big]$ directly, which was my original question.
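The limit is also easy to see numerically: since $\dot{y}_n$ and $\dot{x}_n$ share the common factor $\Sigma_1^n$, which cancels, shrinking $h$ drives the ratio toward $\frac{\dot{y}_n}{\dot{x}_n} = \frac{q_4}{q_2}$. A quick check with arbitrary test values (my own, not from the post):

```python
# As h -> 0, (yd + h*ydd) / (xd + h*xdd) tends to yd/xd.
yd, ydd = 0.9, -2.0   # y-dot_n, y-ddot_n (arbitrary)
xd, xdd = 0.6, 3.0    # x-dot_n, x-ddot_n (arbitrary)

def ratio(h):
    return (yd + h * ydd) / (xd + h * xdd)

for h in (0.1, 0.01, 0.001):
    print(h, ratio(h))
# ratio(h) approaches yd/xd = 1.5 as h -> 0
```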


While I’m still looking for a closed-form expression for $\mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n}\big]$, my previous analysis leads me to conclude that, for any random matrices $M_n$ sampled i.i.d., as $h \to 0$:

\begin{equation} \lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \frac{q_4}{q_2} \end{equation}

which is a general result I didn’t expect in advance.

Now, given that there is strong numerical evidence for (6) regardless of the magnitude of $h$, I wonder whether we can show:

\begin{equation} \lim_{h \to 0} \frac{\Delta y_n}{\Delta x_n} = \mathbb{E}\big[\frac{\Delta y_n}{\Delta x_n} \big] \end{equation}


  1. D. Huh & T. Sejnowski. Spectrum of power laws for curved hand movements. 2015.