CNS*2020 Online has ended
Monday, July 20, 2020 • 8:00pm - 9:00pm CEST
P15: Learning sequences of correlated patterns in recurrent networks

Subhadra Mokashe, Nicolas Brunel
What determines the format of memory representations in cortical networks is a subject of active research. During memory tasks, the retrieval of stored memories is characterized either by a persistent elevation in the firing rate of a set of neurons ('persistent activity') [1] or by the ordered transient activation of different sets of neurons ('sequential activity') [3]. Multiple theoretical studies have shown that temporally symmetric Hebbian learning rules give rise to fixed-point attractor representations of memory (e.g., [4] and references therein), while temporally asymmetric learning rules lead to a dynamic, sequential representation of memories (e.g., [2] and references therein). These studies assume that inputs to the network during learning have no temporal correlations.
The sensory information received by brain networks is likely to be temporally correlated. We study temporally asymmetric Hebbian learning rules in a recurrent network of rate-based neurons in the presence of temporal correlations in the inputs, and characterize how the inputs shape the network dynamics and memory representation using both numerical simulations and mean-field analysis. We show that the network dynamics depend on the temporal correlations in the input stream the network receives. For inputs with a short correlation timescale, the network exhibits sequential activity (Fig.1 A left), while for inputs with longer correlations, the network settles into a fixed-point attractor during retrieval (Fig.1 A right). At intermediate values of the correlation, the network partially traverses the input sequence before settling into an attractor state (Fig.1 A middle). We find that correlations increase the sequential memory capacity of the network. Non-linear learning rules increase the range of correlation timescales for which the network represents the memories as sequential activity (Fig.1 B). We also show that the network maintains a sequential representation, both in the case of sequences of discrete patterns and in the continuum limit (Fig.1 C). Our work thus suggests that the correlation timescales of inputs at the time of learning have a strong influence on the nature of network dynamics during retrieval.
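The core mechanism described above can be sketched in a few lines: a temporally asymmetric Hebbian connectivity built from a sequence of correlated binary patterns, driving rate dynamics that replay the sequence. This is an illustrative sketch, not the authors' code; the network size, pattern count, flip probability `flip` (which sets the correlation between neighbouring patterns), gain `g`, and time constants are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 1000, 10  # neurons, patterns in the stored sequence

# Correlated binary patterns: each pattern is a noisy copy of its
# predecessor. `flip` is the per-neuron flip probability, so neighbouring
# patterns have overlap ~ (1 - 2*flip); the value 0.2 is illustrative.
flip = 0.2
xi = np.empty((P, N))
xi[0] = rng.choice([-1.0, 1.0], size=N)
for mu in range(1, P):
    flips = rng.random(N) < flip
    xi[mu] = np.where(flips, -xi[mu - 1], xi[mu - 1])

# Temporally asymmetric Hebbian connectivity: pattern mu drives mu + 1.
W = sum(np.outer(xi[mu + 1], xi[mu]) for mu in range(P - 1)) / N

# Rate dynamics tau * dr/dt = -r + tanh(g * W r), initialised on pattern 0.
dt, tau, g, steps = 0.1, 1.0, 2.0, 300
r = xi[0].copy()
overlaps = []
for _ in range(steps):
    r = r + (dt / tau) * (-r + np.tanh(g * (W @ r)))
    overlaps.append(xi @ r / N)  # overlap of the state with each pattern

overlaps = np.array(overlaps)         # shape (steps, P)
peak_times = overlaps.argmax(axis=0)  # when each pattern's overlap peaks
print(peak_times)  # later patterns tend to peak later (sequential replay)
```

Tracking the overlaps (the normalized dot product of the network state with each stored pattern) is the standard way to read out whether retrieval is sequential or attractor-like: a travelling wave of overlap peaks indicates sequential replay, while a frozen overlap profile indicates a fixed point.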

References
[1] S. Funahashi, C. J. Bruce, and P. S. Goldman-Rakic. Mnemonic coding of visual space in the monkey's dorsolateral prefrontal cortex. Journal of Neurophysiology, 61(2):331-349, 1989.
[2] M. Gillett, U. Pereira, and N. Brunel. Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning. bioRxiv, 2019.
[3] C. D. Harvey, P. Coen, and D. W. Tank. Choice-specific sequences in parietal cortex during a virtual-navigation decision task. Nature, 484(7392):62-68, 2012.
[4] U. Pereira and N. Brunel. Unsupervised learning of persistent and sequential activity. Frontiers in Computational Neuroscience, 13:97, 2020.

Speakers

Subhadra Mokashe

Neurobiology, Duke Univ.



Slot 16