Dendritic computing: Branching deeper into machine learning

Jyotibdha Acharya*, Arindam Basu, Robert Legenstein, Thomas Limbacher, Panayiota Poirazi, Xundong Wu

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review


In this paper, we discuss the nonlinear computational power provided by dendrites in biological and artificial neurons. We start by briefly presenting biological evidence about the types of dendritic nonlinearities, their respective plasticity rules, and their effect on biological learning as assessed by computational models. Four major computational implications are identified: improved expressivity, more efficient use of resources, utilization of internal learning signals, and support for continual learning. We then discuss examples of how dendritic computations have been used to solve real-world classification problems, with performance reported on well-known machine learning datasets. The works are categorized according to the three primary methods of plasticity used: structural plasticity, weight plasticity, or plasticity of synaptic delays. Finally, we describe the recent confluence between concepts from deep learning and dendritic computation, and highlight some future research directions.
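The improved expressivity discussed in the abstract is commonly captured by modeling a dendritic neuron as a two-layer network within a single cell: each dendritic branch applies its own nonlinearity to a weighted sum of its synaptic inputs, and the soma sums the branch outputs. A minimal sketch of this idea (the function names, the tanh branch nonlinearity, and all parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def dendritic_neuron(x, branch_weights, g=np.tanh):
    """Toy two-layer dendritic neuron model (illustrative).

    Each dendritic branch computes a weighted sum of the input vector x
    and passes it through its own nonlinearity g; the soma linearly sums
    the branch outputs. A point neuron would instead apply a single
    nonlinearity to one global weighted sum.
    """
    return sum(g(w @ x) for w in branch_weights)

# Example: 8 synaptic inputs distributed over 4 dendritic branches.
rng = np.random.default_rng(0)
x = rng.normal(size=8)                              # presynaptic activity
branches = [rng.normal(size=8) for _ in range(4)]   # per-branch weights
soma_output = dendritic_neuron(x, branches)
```

Because the branch nonlinearity is applied before the somatic summation, such a unit can represent functions (e.g. certain XOR-like input combinations) that no single linear-nonlinear point neuron can, which is one formalization of the expressivity gain the review surveys.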
Original language: English
Pages (from-to): 275-289
Number of pages: 15
Publication status: Published - 1 May 2022


Keywords

  • deep neural networks
  • expressivity
  • machine learning
  • maxout networks
  • non-linear dendrites
  • plasticity
  • rewiring

ASJC Scopus subject areas

  • Neuroscience (all)
