Synapse-type-specific competitive Hebbian learning forms functional recurrent networks

Samuel Eckmann, Edward James Young, Julijana Gjorgjieva

Research output: Contribution to journal › Article › peer-review


Abstract

Cortical networks exhibit complex stimulus–response patterns that are based on specific recurrent interactions between neurons. For example, the balance between excitatory and inhibitory currents has been identified as a central component of cortical computations. However, it remains unclear how the required synaptic connectivity can emerge in developing circuits where synapses between excitatory and inhibitory neurons are simultaneously plastic. Using theory and modeling, we propose that a wide range of cortical response properties can arise from a single plasticity paradigm that acts simultaneously at all excitatory and inhibitory connections—Hebbian learning that is stabilized by the synapse-type-specific competition for a limited supply of synaptic resources. In plastic recurrent circuits, this competition enables the formation and decorrelation of inhibition-balanced receptive fields. Networks develop an assembly structure with stronger synaptic connections between similarly tuned excitatory and inhibitory neurons and exhibit response normalization and orientation-specific center-surround suppression, reflecting the stimulus statistics during training. These results demonstrate how neurons can self-organize into functional networks and suggest an essential role for synapse-type-specific competitive learning in the development of cortical circuits.
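To illustrate the core idea of the abstract, the following is a minimal toy sketch (not the authors' model) of Hebbian learning stabilized by synapse-type-specific competition: excitatory and inhibitory weights onto a single neuron are updated by the same Hebbian rule but are separately renormalized to fixed resource budgets. All names, dimensions, and parameter values (e.g., n_inputs, eta, the resource budgets) are illustrative assumptions, and the random input stands in for structured stimulus statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and parameters (assumptions, not from the paper)
n_inputs = 50          # presynaptic neurons per synapse type
eta = 0.01             # Hebbian learning rate
W_E_total = 1.0        # fixed budget of excitatory synaptic resources
W_I_total = 1.0        # fixed budget of inhibitory synaptic resources

# Excitatory and inhibitory weight vectors onto one postsynaptic neuron
w_E = rng.random(n_inputs); w_E *= W_E_total / w_E.sum()
w_I = rng.random(n_inputs); w_I *= W_I_total / w_I.sum()

for step in range(10_000):
    # Toy input pattern (a stand-in for stimulus-driven presynaptic rates)
    x = rng.random(n_inputs)

    # Postsynaptic rate: excitation minus inhibition, rectified
    y = max(w_E @ x - w_I @ x, 0.0)

    # Hebbian updates applied to both excitatory and inhibitory synapses
    w_E += eta * y * x
    w_I += eta * y * x

    # Synapse-type-specific competition: each synapse type is separately
    # renormalized to its own fixed resource budget (divisive normalization),
    # so excitatory and inhibitory weights compete only within their type
    w_E *= W_E_total / w_E.sum()
    w_I *= W_I_total / w_I.sum()
```

Because excitation and inhibition are normalized separately rather than against each other, both weight vectors can grow toward the same correlated input structure, giving a rough intuition for how co-tuned, inhibition-balanced receptive fields could emerge; the published model is a recurrent network and differs in detail.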

Original language: English
Article number: e2305326121
Pages (from-to): 1-12
Number of pages: 12
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 121
Issue number: 25
DOIs
State: Published - 1 Jun 2024

Keywords

  • excitation–inhibition balance
  • recurrent networks
  • surround suppression
  • synaptic plasticity
