CNS*2020 Online
Sunday, July 19 • 8:00pm - 9:00pm
P106: Efficient communication in distributed simulations of spiking neuronal networks with gap junctions

Jakob Jordan, Moritz Helias, Markus Diesmann, Susanne Kunkel

Investigating the dynamics and function of large-scale spiking neuronal networks with realistic numbers of synapses is made possible today by state-of-the-art simulation code that scales to the largest contemporary supercomputers. These implementations exploit the delayed and point-event-like nature of the spike interaction between neurons. In a network with only chemical synapses, the dynamics of all neurons are decoupled for the duration of the minimal synaptic transmission delay: each neuron can be propagated independently over this interval without requiring information from other neurons. Hence, in distributed simulations of such networks, compute nodes need to communicate spike data only once per minimal delay [1].
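For illustration only, the following minimal Python sketch shows the idea of delay-driven spike exchange: every rank integrates its local neurons independently for one min-delay interval and spike data crosses process boundaries only at interval boundaries. The toy neuron model, the helper names (`LifNeuron`, `exchange_spikes`), and all parameter values are assumptions for this sketch and not part of the NEST implementation.

```python
# Minimal sketch of delay-driven spike communication (illustrative only).
import random

MIN_DELAY = 1.0   # ms, smallest synaptic transmission delay in the network
DT = 0.1          # ms, integration step
T_SIM = 10.0      # ms, total simulated time


class LifNeuron:
    """Toy leaky integrate-and-fire neuron, integrated with forward Euler."""
    def __init__(self, gid):
        self.gid, self.v = gid, 0.0

    def update(self, dt):
        self.v += dt * (-self.v / 10.0 + random.uniform(0.0, 2.0))
        if self.v > 15.0:          # threshold crossing emits a spike
            self.v = 0.0
            return True
        return False


def exchange_spikes(outgoing):
    """Stand-in for the all-to-all spike exchange between compute nodes."""
    return outgoing                # single-process stand-in: loop back


local_neurons = [LifNeuron(gid) for gid in range(100)]
t = 0.0
while t < T_SIM:
    outgoing = []
    # Within one min-delay interval the neurons are decoupled, so each one
    # can be propagated without information from other ranks.
    for step in range(int(round(MIN_DELAY / DT))):
        for nrn in local_neurons:
            if nrn.update(DT):
                outgoing.append((nrn.gid, t + step * DT))
    # Spike data is communicated only once per min-delay interval.
    incoming = exchange_spikes(outgoing)   # delayed delivery omitted here
    t += MIN_DELAY
```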

Electrical interactions, also called gap junctions, at first seem incompatible with such a communication scheme, as they couple the membrane potentials of pairs of neurons instantaneously. Hahne et al. [2], however, demonstrated that the communication of spikes and gap-junction data can be unified using waveform-relaxation methods [3]. Despite these advances, simulations involving gap junctions scale poorly due to a communication scheme that collects global data on each compute node. In comparison to chemical synapses, gap junctions are far less abundant. To improve scalability, we exploit this sparsity by integrating the existing framework for continuous interactions with a recently proposed directed communication scheme for spikes [4]. Using a reference implementation in the NEST simulator ([www.nest-simulator.org](http://www.nest-simulator.org), [5]), we demonstrate excellent scalability of the integrated framework, accelerating large-scale simulations with gap junctions by more than an order of magnitude. This allows, for the first time, the efficient exploration of the interplay of chemical and electrical coupling in large-scale neuronal network models with natural synapse density distributed across thousands of compute nodes.
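As a hedged usage sketch, combining chemical synapses and gap junctions in a PyNEST script looks roughly as follows. The kernel parameters controlling the waveform-relaxation iteration (`use_wfr`, `wfr_comm_interval`, `wfr_tol`), the model names (`hh_psc_alpha_gap`, `gap_junction`), and all numerical values are taken as assumptions from the public NEST documentation and may differ between NEST versions; this is an illustration, not the reference implementation described above.

```python
# Hedged PyNEST sketch: chemical synapses plus symmetric gap junctions.
import nest

nest.ResetKernel()
# Waveform-relaxation settings for the instantaneous gap-junction coupling
# (assumed parameter names and values; adjust to your NEST version).
nest.SetKernelStatus({
    'use_wfr': True,
    'wfr_comm_interval': 1.0,   # ms, communication interval for gap data
    'wfr_tol': 1e-4,            # convergence tolerance of the iteration
})

# Hodgkin-Huxley neurons that support gap-junction coupling.
neurons = nest.Create('hh_psc_alpha_gap', 100)

# Sparse chemical connectivity with a 1 ms transmission delay.
nest.Connect(neurons, neurons,
             {'rule': 'fixed_indegree', 'indegree': 10},
             {'synapse_model': 'static_synapse', 'weight': 20.0, 'delay': 1.0})

# A few symmetric electrical couplings (gap junctions) between neuron pairs.
nest.Connect(neurons[:10], neurons[10:20],
             {'rule': 'one_to_one', 'make_symmetric': True},
             {'synapse_model': 'gap_junction', 'weight': 0.5})

nest.Simulate(100.0)
```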

Acknowledgements

Partly supported by the Helmholtz young investigator group VH-NG-1028, the European Union's Horizon 2020 funding framework under grant agreements no. 785907 (Human Brain Project, HBP SGA2) and no. 754304 (DEEP-EST), and Helmholtz IVF no. SO-092 (Advanced Computing Architectures, ACA). Use of the JURECA supercomputer was granted through VSR grant JINB33.

References

[1] Morrison A, Mehring C, Geisel T, Aertsen A, Diesmann M (2005) Neural Comput 17:1776–1801
[2] Hahne J, Helias M, Kunkel S, Igarashi J, Bolten M, Frommer A, Diesmann M (2015) Front Neuroinform 9:22
[3] Lelarasmee E, Ruehli AE, Sangiovanni-Vincentelli A (1982) IEEE Trans CAD Integ Circ Syst 1:131–145
[4] Jordan J, Ippen T, Helias M, Kitayama I, Sato M, Igarashi J, Diesmann M, Kunkel S (2018) Front Neuroinform 12:2
[5] Gewaltig MO, Diesmann M (2007) Scholarpedia 2:1430

Speakers

Markus Diesmann

Professor, Institute of Neuroscience and Medicine (INM-6, INM-10) and Institute for Advanced Simulation (IAS-6), Jülich Research Centre
My main scientific interests include the correlation structure of neuronal networks, models of cortical networks, simulation technology, supercomputing, and neuromorphic computing. I am a co-founder of the NEST Initiative and a member of its steering committee.


Sunday July 19, 2020 8:00pm - 9:00pm CEST
Slot 05