Current Opinion in Neurobiology

Volume 55, April 2019, Pages 103-111

Towards the neural population doctrine

https://doi.org/10.1016/j.conb.2019.02.002

Highlights

  • New generations of recording and computing technologies have enabled neuroscience at the level of the neural population.

  • Landmark scientific findings suggest the arrival of the neural population doctrine, where the population, not the single neuron, is the central object of computation.

  • State space models offer a principled framework for population-level analyses.

  • Four subfields of neuroscience have benefitted particularly from population analyses; we highlight key findings.

Across neuroscience, large-scale data recording and population-level analysis methods have experienced explosive growth. While the underlying hardware and computational techniques have been well reviewed, we focus here on the novel science that these technologies have enabled. We detail four areas of the field where the joint analysis of neural populations has significantly furthered our understanding of computation in the brain: correlated variability, decoding, neural dynamics, and artificial neural networks. Together, these findings suggest an exciting trend towards a new era where neural populations are understood to be the essential unit of computation in many brain regions, a classic idea that has been given new life.

Introduction

The neuron has been heralded as the structural and functional unit of neural computation for more than a hundred years [1, 2, 3], and this doctrine has driven a vast array of our field's most important discoveries [4]. Increasingly prevalent, however, is the idea that populations of neurons may in fact be the essential unit of computation in many brain regions [4, 5, 6]. Certainly the idea of information being encoded in an ensemble of neurons is not a new one. In fact, the conflict between single neurons and populations of neurons as the perceptual, behavioral, and cognitive unit of computation dates back to the beginning of the 20th century [4,6], with the first concrete theories of networks of neurons introduced in the 1940s [7,8]. While novel, these ideas were not testable at the time, owing to the limitations of both recording techniques and computational resources. We now find ourselves in an ideal era for such discovery, given the astounding progress in both of these enabling technologies.

Electrophysiological recordings have been the hallmark of neuronal measurement over the last 80 years: extracellular recordings from one or a few electrodes, each capturing up to a few neurons. More recently, multi-electrode arrays and imaging techniques (optical and, more recently, voltage imaging) have been used to efficiently capture the simultaneous activity of hundreds to thousands of neurons, with this number steadily growing thanks to tools such as the Neuropixels probe [9]. In tandem, the increase in computational resources has led to the development of efficient and scalable statistical and machine learning methods; see the methodological reviews [10,11].

As our ability to simultaneously record from large populations of neurons grows exponentially [9,12], the analysis of covariation across these populations has provided scientific insight in many domains. Here, we highlight several recent findings in four domains of neuroscience where the joint analysis of a population of neurons has been central to discoveries that would not have been possible using single neurons alone. Firstly, trial-to-trial ‘noise’ correlations have been shown to influence the information-carrying capacity of a neural circuit. Secondly, decoding of behavior using correlated populations of neurons can yield levels of accuracy beyond what would be anticipated from single neurons alone. Thirdly, stimulus-driven population recordings can be projected into a lower-dimensional subspace and analyzed over time to reveal the computational strategies employed by different brain regions. Lastly, artificial neural networks (ANNs) can aid in simulations that reproduce population structure, as well as in directly modeling neuronal activity. We focus on the analysis of a population of N neurons in ‘state space’, where the joint activity of the population at any time point is represented as a point in either the N-dimensional observation space or in a lower-dimensional subspace.
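To make the state-space picture concrete, the following is a minimal, illustrative Python sketch (the simulated data and all parameter values are our own assumptions, not drawn from any study reviewed here): the joint activity of N neurons at each time point is treated as a point in an N-dimensional space, and PCA is used to project the resulting trajectory onto a low-dimensional subspace.

# Minimal sketch: population activity as a trajectory in state space,
# projected onto a low-dimensional subspace with PCA. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_timepoints, n_latents = 50, 200, 2

# Simulate a low-dimensional latent trajectory (two slowly varying signals)
t = np.linspace(0, 2 * np.pi, n_timepoints)
latents = np.stack([np.sin(t), np.cos(2 * t)], axis=1)          # (T, 2)

# Each neuron mixes the latents linearly, plus private noise
mixing = rng.normal(size=(n_latents, n_neurons))                # (2, N)
activity = latents @ mixing + 0.2 * rng.normal(size=(n_timepoints, n_neurons))

# State-space view: each row of `activity` is one point in R^N.
# PCA via SVD of the mean-centered data recovers a low-dimensional subspace.
centered = activity - activity.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
projection = centered @ vt[:2].T                                # (T, 2) trajectory

print(f"Variance explained by first 2 PCs: {explained[:2].sum():.2f}")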

We pinpoint one or two recent studies in each domain that stand out (indicated using * and **). Unlike previous reviews on population-level neuroscience [10,11], we focus here not on the data analysis methodologies, but rather the notable scientific findings that have resulted. These scientific findings are, first, reshaping the way the field thinks about computation, and, second, fundamentally population-based. Taken together, these two features point to a future where the central scientific theme is not the neuron doctrine, but the neural population doctrine. We conclude with topics that we think future studies may need to address.

Section snippets

Correlated trial-to-trial variability in populations of neurons is a key indicator for behavioral states

As we gain the ability to simultaneously record the activity of more and more neurons, we must ask how much information we can hope to gain by doing so; that is, what is the information gained per added neuron? Generally speaking, ‘signal’ correlations, or tuning curves, are useful for decoding as well as for understanding the dynamics of neural populations over time. However, in the pursuit of this specific question, the systematic study of covariation in the activity of pairs of neurons …
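As a concrete, if simplified, illustration of how correlated trial-to-trial variability limits the information gained per added neuron (in the spirit of the classic pooling argument of Zohary et al.), the Python sketch below uses simulated trial-to-trial fluctuations with a shared noise source; the pairwise noise correlation of 0.2 and all other numbers are illustrative assumptions.

# Minimal sketch: averaging N neurons with independent noise shrinks variance
# roughly as 1/N; shared (correlated) noise imposes a floor that no amount of
# pooling can remove. Illustrative simulation only.
import numpy as np

rng = np.random.default_rng(1)
n_trials, noise_corr = 5000, 0.2

for n_neurons in (1, 10, 100, 1000):
    shared = rng.normal(size=(n_trials, 1))                     # common noise source
    private = rng.normal(size=(n_trials, n_neurons))            # independent noise
    # Mix shared and private noise so that the pairwise noise correlation
    # between any two neurons is approximately `noise_corr`.
    noise = np.sqrt(noise_corr) * shared + np.sqrt(1 - noise_corr) * private
    pop_average = noise.mean(axis=1)
    print(f"N={n_neurons:5d}  variance of population average: "
          f"{pop_average.var():.3f}  (floor = {noise_corr})")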

Decoding accuracy is more than the sum of its parts

Here we focus on the case where population analysis is virtually essential: when neurons have so-called ‘mixed selectivity’ [5, 20]. As an increasing number of experiments involving multiple categories of stimuli are performed, neurons in several brain areas have been found to have mixed selectivity: they significantly modulate their activity in response to more than one stimulus category or type [21, 5]. For example, in Figure 2a, we see two neurons encoding two different stimuli; …
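To illustrate what linear decoding from a mixed-selectivity population can look like in practice, here is a hedged Python sketch with simulated data (the two binary task variables, the tuning weights, and the noise level are all invented for illustration): each simulated neuron responds to both variables and their interaction, yet simple linear readouts of the population recover either variable.

# Minimal sketch: linear decoding of two task variables from a simulated
# population of mixed-selectivity neurons. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_trials = 80, 1200

# Two binary task variables (e.g., stimulus identity and task context)
stim = rng.integers(0, 2, size=n_trials)
context = rng.integers(0, 2, size=n_trials)

# Mixed selectivity: each neuron has its own weights on both variables
# plus a nonlinear interaction term, and trial-to-trial noise.
w_stim = rng.normal(size=n_neurons)
w_ctx = rng.normal(size=n_neurons)
w_mix = rng.normal(size=n_neurons)
rates = (np.outer(stim, w_stim) + np.outer(context, w_ctx)
         + np.outer(stim * context, w_mix)
         + 0.5 * rng.normal(size=(n_trials, n_neurons)))

# Linear population decoders (least squares), one per variable
X = np.column_stack([rates, np.ones(n_trials)])                 # add intercept
train, test = slice(0, 800), slice(800, None)
for name, y in (("stimulus", stim), ("context", context)):
    coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
    acc = np.mean((X[test] @ coef > 0.5) == y[test])
    print(f"decoding {name}: test accuracy {acc:.2f}")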

Analysis of neural activity over time reveals computational strategies

Considering neural population activity over time as states in a dynamical system has a long history, for example in [26], where the authors examine potential mechanisms of memory and error correction using neuron-like components. This dynamical systems perspective is now prevalent in neuroscience, with the motor regions being the most natural testing ground, since we have direct access to the time-varying behavior as an output; indeed, significant work has shown the value of this …
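As a minimal illustration of the dynamical systems view (a sketch under simplifying assumptions, not a reproduction of any reviewed analysis), the Python snippet below simulates population activity driven by a low-dimensional rotation, fits a linear dynamics matrix by least squares, and inspects its eigenvalues; a complex-conjugate pair is the signature of rotational structure of the kind reported in motor cortex.

# Minimal sketch: fit linear dynamics to population activity and look for
# rotational structure in the eigenvalues. Illustrative simulation only.
import numpy as np

rng = np.random.default_rng(3)
n_neurons, n_timepoints = 30, 400

# Simulate population activity driven by a 2-D rotation embedded in N neurons
theta = 0.1                                                     # rotation per time step
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
latent = np.zeros((n_timepoints, 2))
latent[0] = [1.0, 0.0]
for t in range(1, n_timepoints):
    latent[t] = rotation @ latent[t - 1]
readout = rng.normal(size=(2, n_neurons))
activity = latent @ readout + 0.05 * rng.normal(size=(n_timepoints, n_neurons))

# Fit the dynamics matrix by least squares (row convention: x[t+1] = x[t] A)
X_now, X_next = activity[:-1], activity[1:]
A, *_ = np.linalg.lstsq(X_now, X_next, rcond=None)
eigvals = np.linalg.eigvals(A)
leading = eigvals[np.argsort(-np.abs(eigvals))[:2]]
print("leading eigenvalues of fitted dynamics:", np.round(leading, 3))
# A complex-conjugate pair near cos(theta) +/- i*sin(theta) reflects rotation.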

Nodes in Artificial Neural Networks emulate activity in certain regions of the brain

Although we have been modeling neural activity using artificial nodes for a long time [26], modeling with networks of neurons has historically been limited to carefully designed studies and intricate hand-tuning of parameters (see e.g. [44]). Massive advances in training deep neural networks have made it far more approachable to use artificial neural networks (ANNs) with large numbers of nodes and layers, and with a wide variety of structures. However, the potential of ANNs to accurately describe neuronal …
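One common way to quantify how well ANN units emulate recorded neurons is to regress neural responses onto a network layer's activations and measure predictivity on held-out stimuli. The sketch below illustrates the idea with simulated stand-ins for both the ANN features and the recorded responses; the ridge penalty and all dimensions are arbitrary choices for illustration.

# Minimal sketch: linear mapping from (stand-in) ANN layer activations to
# (stand-in) neural responses, scored by held-out correlation. Illustrative only.
import numpy as np

rng = np.random.default_rng(4)
n_stimuli, n_features, n_neurons = 500, 100, 20

# In a real analysis, `features` would come from a trained ANN layer and
# `responses` from recorded neurons; here both are simulated with shared structure.
features = rng.normal(size=(n_stimuli, n_features))
true_map = rng.normal(size=(n_features, n_neurons))
responses = features @ true_map + 2.0 * rng.normal(size=(n_stimuli, n_neurons))

train, test = slice(0, 400), slice(400, None)

# Ridge regression from ANN features to neural responses (closed form)
lam = 10.0
XtX = features[train].T @ features[train] + lam * np.eye(n_features)
weights = np.linalg.solve(XtX, features[train].T @ responses[train])
predicted = features[test] @ weights

# Per-neuron predictivity: correlation between predicted and held-out responses
r = [np.corrcoef(predicted[:, i], responses[test][:, i])[0, 1]
     for i in range(n_neurons)]
print(f"median held-out correlation across neurons: {np.median(r):.2f}")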

Looking ahead: when can we trust the results of population-level analyses?

Throughout this review we have highlighted exciting findings that have resulted from the joint analysis of neural populations, works that exemplify the broad and rapidly growing trend in the field towards the neural population doctrine. However, this progress has also ignited a serious and increasingly contentious debate about whether such analyses actually produce novel findings about the brain, or whether they simply recapitulate ‘old knowledge dressed up in new clothes’ [67].

Conclusions

As recording techniques and computing capabilities continue to improve, experimental and computational studies continue to demonstrate that neuronal populations may in fact be the relevant unit of computation in many brain regions. Throughout this review, we have pointed out various studies that support this scientific trend, in domains spanning correlated variability, decoding, dynamical activity, and artificial neural networks.

Going forward, there are still major concerns in these domains …

Conflict of interest statement

Nothing declared.

References and recommended reading

Papers of particular interest, published within the period of review, have been highlighted as:

  • * of special interest

  • ** of outstanding interest

Acknowledgements

This work was supported by the Swiss National Science Foundation (Research Award P2SKP2_178197), NIH R01NS100066, Simons Foundation 542963, NSF NeuroNex DBI-1707398, The Gatsby Charitable Foundation, the Sloan Foundation, and the McKnight Foundation.

References (77)

  • J.W. Pillow et al.: Spatio-temporal correlations and visual signalling in a complete neuronal population. Nature (2008)

  • E. Batty et al.: Multilayer recurrent network models of primate retinal ganglion cell responses (2016)

  • S. Ramón y Cajal: Histology of the Nervous System of Man and Vertebrates (1995)
  • C.S. Sherrington: Observations on the scratch-reflex in the spinal dog. J Physiol (1906)

  • H.B. Barlow: Summation and inhibition in the frog's retina. J Physiol (1953)

  • R. Yuste: From the neuron doctrine to neural networks. Nat Rev Neurosci (2015)

  • H. Eichenbaum: Barlow versus Hebb: when is it time to abandon the notion of feature detectors and adopt the cell assembly as the unit of cognition? Neurosci Lett (2017)

  • W.S. McCulloch et al.: A logical calculus of the ideas immanent in nervous activity. Bull Math Biophys (1943)

  • D.O. Hebb: The Organization of Behavior: A Neuropsychological Theory (2005)

  • J.P. Cunningham et al.: Dimensionality reduction for large-scale neural recordings. Nat Neurosci (2014)

  • I.H. Stevenson et al.: How advances in neural recording affect data analysis. Nat Neurosci (2011)

  • E. Zohary et al.: Correlated neuronal discharge rate and its implications for psychophysical performance. Nature (1994)

  • R. Moreno-Bote et al.: Information-limiting correlations. Nat Neurosci (2014)

  • L.F. Abbott et al.: The effect of correlated variability on the accuracy of a population code. Neural Comput (1999)

  • B.B. Averbeck et al.: Neural correlations, population coding and computation. Nat Rev Neurosci (2006)

  • A. Kohn et al.: Correlations and neuronal population information. Annu Rev Neurosci (2016)

  • A. Ni et al.: Learning and attention reveal a general relationship between population activity and behavior. Science (2018)

  • M. Rigotti et al.: The importance of mixed selectivity in complex cognitive tasks. Nature (2013)

  • N.C. Rust: Population-based representations: from implicit to explicit. Cognit Neurosci (2014)

  • D. Raposo et al.: A category-free neural population supports evolving demands during decision-making. Nat Neurosci (2014)

  • J.J. DiCarlo et al.: How does the brain solve visual object recognition? Neuron (2012)

  • A. Saez et al.: Abstract context representations in primate amygdala and prefrontal cortex. Neuron (2015)

  • M.M. Churchland et al.: Neural population dynamics during reaching. Nature (2012)

  • S. Saxena et al.: Performance limitations in sensorimotor control: tradeoffs between neural computing and accuracy in tracking fast movements. bioRxiv (2018)

  • M.M. Churchland et al.: A dynamical basis set for generating reaches. Cold Spring Harbor Symposia on Quantitative Biology (2014)

  • Y. Gao et al.: Linear dynamical neural population models through nonlinear embeddings. Advances in Neural Information Processing Systems (2016)

  • B. Sauerbrei et al.: Motor cortex is an input-driven dynamical system controlling dexterous movement. bioRxiv (2018)

  • M.M. Churchland et al.: Cortical preparatory activity: representation of movement or first cog in a dynamical machine? Neuron (2010)