Figure: Dendrite morphologies exhibit considerable variation (taken from [8]).

Introduction:

Due to the compositional structure of neural computation, if dendritic trees in single neurons couldn’t compute interesting functions, the brain wouldn’t be able to generate any complex behaviour. Now, in order to figure out what dendrites can compute, we need to characterise the expressive power, computational complexity and function spaces associated with dendritic computation. Eventually we would also need to take dynamics into account to determine when exactly these computations should happen.

As function-space identification is very hard, we may reduce the complexity of our task by identifying scaling laws for the time-complexity and expressive power of dendritic computation.

Dendrites as functions computable on binary trees:

To a first-order approximation, the typical dendrite has the morphology of a random binary tree [6]. However, as shown in the illustration, this still allows considerable diversity in dendrite morphologies. So how should we proceed with our analysis?

It’s worth noting that every random binary tree can be embedded in a complete binary tree, i.e. a binary tree in which every internal node has exactly two children. Furthermore, if functions computable on a binary tree are compositions of simpler functions computable at each node, then any function computable on a random binary tree is also computable on a complete binary tree.

Complete binary trees are also special in the sense that functions computable on complete binary trees have maximal expressive power. Assuming our demonstration is correct, to figure out what functions are computable by dendrites we may focus on the analysis of functions computable on complete binary trees.
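To make the embedding concrete, here is a minimal sketch (my own illustration, not code from the references): a tree function is represented as nested node functions, and a random binary tree is padded into a complete one with pass-through nodes that leave the computed function unchanged.

```python
from dataclasses import dataclass
from typing import Callable, Union

@dataclass
class Node:
    f: Callable                    # function computed at this node
    left: 'Tree' = None
    right: 'Tree' = None

Tree = Union[Node, int]            # an int leaf is the index of an input

def evaluate(tree: Tree, x):
    """Recursively evaluate a tree function on the input vector x."""
    if isinstance(tree, int):
        return x[tree]
    args = [evaluate(c, x) for c in (tree.left, tree.right) if c is not None]
    return tree.f(*args)

def pad(tree: Tree, depth: int) -> Tree:
    """Embed `tree` in a complete binary tree of the given depth, filling
    missing branches with pass-through nodes so the function is unchanged."""
    if depth == 0:
        return tree
    if isinstance(tree, int):      # extend a leaf downward by duplication
        return Node(lambda a, b: a, pad(tree, depth - 1), pad(tree, depth - 1))
    left = tree.left if tree.left is not None else tree.right
    right = tree.right if tree.right is not None else tree.left
    both = tree.left is not None and tree.right is not None
    g = tree.f if both else (lambda a, b, f=tree.f: f(a))
    return Node(g, pad(left, depth - 1), pad(right, depth - 1))

# A lopsided tree computing max(x0, x1) + x2, padded to a complete depth-2 tree:
t = Node(lambda a, b: a + b, Node(max, 0, 1), 2)
assert evaluate(t, [3, 5, 4]) == evaluate(pad(t, 2), [3, 5, 4]) == 9
```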

The expressiveness of functions computable on trees:

Let’s define a function computable on a $k$-ary tree as a function composed of simpler computable functions defined at each node, such that a function $F_k^N$ of this kind defined on a $k$-ary tree of depth $N$ receives $k^N$ inputs:

Figure: A tree function.

Intuitively, if there are $k^n$ functions at the $n$th level, the expressiveness of $F_k^N$ on a tree of depth $N$ should grow on the order of:

\begin{equation} \sim k^N \end{equation}
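A quick numerical illustration of this growth (my own sketch): a complete $k$-ary tree of depth $N$ has $(k^N - 1)/(k - 1)$ internal nodes, so if each node draws its function from a fixed basis of $b$ primitives, the number of distinct compositions is $b$ raised to that node count, and its logarithm grows on the order of $k^N$.

```python
def internal_nodes(k: int, N: int) -> int:
    """Number of internal (function) nodes in a complete k-ary tree of depth N."""
    return (k**N - 1) // (k - 1)

# With a basis of b primitive functions per node there are b ** nodes
# syntactically distinct compositions, so log-expressiveness grows ~ k^N.
for N in range(1, 7):
    nodes = internal_nodes(2, N)
    print(f"depth {N}: {nodes} nodes -> 2^{nodes} compositions from a 2-function basis")
```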

Probabilistic argument using Kolmogorov Complexity:

If $F_k^N$ is a composition of functions $f_i \in S$, where $S$ is a finite set and $K(\cdot)$ denotes Kolmogorov Complexity, then we may define:

\begin{equation} Q = \min_{f_i \in S} K(f_i) \end{equation}

and we may show that for almost all $F_k^N$ we must have:

\begin{equation} K(F_k^N) \geq \frac{Q}{2} \cdot k^{N-1} \end{equation}

Proof:

Let’s suppose each $f_i$ has an encoding as a binary string, so $K(f_i) \geq Q$. If we compress each $f_i$ then $F_k^N$, which is composed of at least $k^{N-1}$ node functions (the number of nodes in the deepest internal level of the tree), is reduced to a program of length greater than:

\begin{equation} n = Q \cdot k^{N-1} \end{equation}

Now, the number of programs of length less than or equal to $\frac{n}{2}$ is bounded by:

\begin{equation} \sum_{l=1}^{\frac{n}{2}} 2^l \leq 2^{\frac{n}{2}+1}-1 \end{equation}

and so, using the principle of maximum entropy (i.e. a uniform distribution over encodings) [4], we find that:

\begin{equation} \lim_{n \to \infty} P(K(F_k^N) \geq \frac{n}{2}) \geq \lim_{n \to \infty} 1 - \frac{2^{\frac{n}{2}+1}}{2^n} = 1 \end{equation}
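The counting behind this limit can be checked numerically (a sketch of mine, assuming the uniform distribution over $n$-bit encodings above): fewer than $2^{n/2+1}$ programs are shorter than $n/2$ bits, so the probability that a random $n$-bit encoding is compressible below $n/2$ vanishes as $n$ grows.

```python
# Upper bound on the probability that a uniformly random n-bit encoding
# has Kolmogorov complexity below n/2: at most 2^(n/2 + 1) programs are
# that short, out of 2^n equally likely encodings.
for n in (10, 20, 40, 80):
    p_compressible = 2 ** (n // 2 + 1) / 2 ** n
    print(f"n = {n:>2}: P(K < n/2) <= {p_compressible:.2e}")
```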

At this point the careful reader may remark that the typical dendritic tree doesn’t form a complete binary tree. I suspect this is because, besides computational considerations, dendritic trees must respect spatial, energetic and developmental constraints.

One reason for node sparsity may be that, for complete binary trees, computational cost scales exponentially with tree depth.

Energy as a robust proxy measure for total computational cost:

Consider that software running on given hardware must consume energy in order to perform computations, and that energy consumption scales with the number of operations performed, so energy is a robust proxy measure for computational cost. We may also observe that if functions $f_i$ are defined at the nodes of dendritic trees and the energy consumption per neural action potential at each node is bounded by $E$ joules, then we may model the computational cost of evaluating functions computable on dendritic trees.

In particular, it may easily be shown that the total computational cost of evaluating functions defined on complete binary trees of depth $N$ generally scales with:

\begin{equation} \sim 2^N \cdot E \ \text{Joules} \end{equation}
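A toy version of this cost model (the per-node bound $E$ and the 1 pJ figure below are placeholders of mine, not measurements from [1]):

```python
def evaluation_energy(N: int, E: float) -> float:
    """Energy for one evaluation of a complete binary tree of depth N:
    2^N - 1 internal nodes, each consuming at most E joules per spike."""
    return (2**N - 1) * E

for N in range(2, 11, 2):
    print(f"depth {N:>2}: <= {evaluation_energy(N, 1e-12):.1e} J at E = 1 pJ/node")
```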

The asymptotic time-complexity of functions computable on trees:

Given a function $F_2^N$ defined on a complete binary tree of depth $N$, we may ask what is the fastest way to evaluate this function. In other words, what is the fastest way to perform inference using $F_2^N$?

If we proceed in a sequential manner, starting at the base of the tree, the temporal cost, or time-complexity, will scale with the energy cost, so we will have:

\begin{equation} {Time}(F_2^N) \sim \mathcal{O}(2^N) \end{equation}

and if we note that the dimension of the input to $F_2^N$ equals the number of leaves at its base, $n = 2^N$, we can say that serial computation yields:

\begin{equation} {Time}(F_2^N(n)) \sim \mathcal{O}(n) \end{equation}
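A sequential, leaves-first evaluation (an illustrative sketch; the level-list representation is my own) makes the linear operation count explicit:

```python
def evaluate_serial(levels, x):
    """levels[d] lists the binary node functions at depth d, left to right;
    x holds the 2^len(levels) leaf inputs. Returns (value, operation count)."""
    values, ops = list(x), 0
    for fs in reversed(levels):                 # deepest level first
        values = [f(values[2*i], values[2*i+1]) for i, f in enumerate(fs)]
        ops += len(fs)
    return values[0], ops

add = lambda a, b: a + b
value, ops = evaluate_serial([[add], [add, add]], [1, 2, 3, 4])
print(value, ops)                               # 10, 3 = n - 1 ops for n = 4
```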

However, if we assume parallel execution of the functions at each level of $F_2^N$, then the time-complexity scales with tree depth:

\begin{equation} {Time}(F_2^N) \sim \mathcal{O}(N) \end{equation}

which yields a time-complexity that scales logarithmically with the dimension of the input:

\begin{equation} {Time}(F_2^N(n)) \sim \mathcal{O}(\ln n) \end{equation}
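Simulating level-synchronous execution (again my own sketch) shows the round count equalling the depth $N = \log_2 n$ rather than the node count:

```python
import math

def parallel_rounds(levels, x):
    """If all functions within a level fire simultaneously, each level
    costs one round, so rounds = depth = log2(input dimension)."""
    values, rounds = list(x), 0
    for fs in reversed(levels):
        values = [f(values[2*i], values[2*i+1]) for i, f in enumerate(fs)]
        rounds += 1                             # one synchronous round per level
    return values[0], rounds

add = lambda a, b: a + b
levels = [[add] * 2**d for d in range(4)]       # complete tree of depth 4
value, rounds = parallel_rounds(levels, list(range(16)))
print(value, rounds, math.log2(16))             # 120, 4 rounds, log2(16) = 4.0
```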

It turns out that such an algorithm for evaluating $F_2^N$ is not only highly efficient, it is also biologically plausible [1].

How algorithmic information scales with compute time:

Given equations (3) and (10), i.e. $K(F_2^N) \geq \frac{Q}{2} \cdot 2^{N-1}$ while ${Time}(F_2^N) \sim \mathcal{O}(N)$, we may substitute the running time for the tree depth and deduce a very interesting scaling law:

\begin{equation} K(F_2^N) \sim 2^{{Time}(F_2^N)} \end{equation}

which means that the complexity of the functions that are evaluated grows exponentially with the running time. This is comparable to running all programs of length $N$ within $\mathcal{O}(N)$ seconds.
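Read numerically (an illustration of mine, not a result from the references), each extra unit of parallel running time doubles the reachable algorithmic complexity:

```python
# With parallel evaluation Time ~ N, while K(F_2^N) grows like 2^N,
# so the complexity bound doubles with each additional time step.
for T in (5, 10, 20, 40):
    print(f"Time ~ {T:>2} -> K ~ 2^{T} = {2.0**T:.2e}")
```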

Meanwhile, by (7), energy consumption also grows exponentially with tree depth, so there may be an inherent tradeoff between evaluating functions with great expressive power and minimising energy consumption. From a biological perspective, node sparsity may be a practical necessity.

Discussion:

The above analysis raises a question that may be explored through computer simulations on carefully chosen benchmarks. To be precise: what functions defined on binary trees do we obtain if we simultaneously maximise expressive power and minimise energy consumption? I suspect such functions are both biologically plausible and take maximal advantage of dendritic tree structure.

My hunch is that dendritic trees are a form of parse tree where the nodes represent mathematical expressions. I also think that dendritic trees implement some kind of temporal logic, with the required timing information coming from neural dynamics. For these reasons I think we need to unify the dynamical systems and computer science perspectives on single-neuron computation.

Finally, the reader may note that I have said nothing about the time complexity of learning in dendritic trees. That is a problem for another day.

References:

  1. Michael London & Michael Häusser. Dendritic Computation. Annu. Rev. Neurosci. 2005. 28:503–532.
  2. M. Li & P. Vitányi. An Introduction to Kolmogorov Complexity and Its Applications. Graduate Texts in Computer Science. Springer, New York, second edition, 1997.
  3. Roozbeh Farhoodi, Khashayar Filom, Ilenna Simone Jones & Konrad Paul Kording. On functions computed on trees. arXiv. 2019.
  4. Edwin Jaynes. Information Theory and Statistical Mechanics. The Physical Review. Vol. 106, No. 4, pp. 620–630. May 15, 1957.
  5. Hermann Cuntz, Alexander Borst & Idan Segev. Optimization principles of dendritic structure. Theor. Biol. Med. Model. 2007 Jun 8; 4:21.
  6. Alexandra Vormberg, Felix Effenberger, Julia Muellerleile & Hermann Cuntz. Universal features of dendrites through centripetal branch ordering. PLOS Computational Biology. 2017.
  7. Hermann Cuntz, Friedrich Forstner, Alexander Borst & Michael Häusser. One Rule to Grow Them All: A General Theory of Neuronal Branching and Its Practical Application. PLOS Computational Biology. August 5, 2010.
  8. Duncan E. Donohue & Giorgio A. Ascoli. A Comparative Computer Simulation of Dendritic Morphology. PLOS Computational Biology. June 6, 2008.
  9. Warren S. McCulloch & Walter Pitts. A Logical Calculus of the Ideas Immanent in Nervous Activity. Bulletin of Mathematical Biology. Vol. 52, No. 1/2, pp. 99–115. 1990.
  10. Balázs B. Ujfalussy, Judit K. Makara, Máté Lengyel & Tiago Branco. Global and Multiplexed Dendritic Computations under In Vivo-like Conditions. Neuron. 2018 Nov 7.
  11. Lars Buesing & Wolfgang Maass. A Spiking Neuron as Information Bottleneck. Neural Comput. 2010 Aug.
  12. Anthony Zador & Barak A. Pearlmutter. VC Dimension of an Integrate-and-Fire Neuron Model. 1996.
  13. Vladimir I. Arnold. Representation of continuous functions of three variables by the superposition of continuous functions of two variables. Collected Works: Representations of Functions, Celestial Mechanics and KAM Theory, 1957–1965, pp. 47–133. 2009.