GerstnerLab
@gerstnerlab.bsky.social
The Laboratory of Computational Neuroscience @EPFL studies models of neurons, networks of neurons, synaptic plasticity, and learning in the brain.
P4 52 “Coding Schemes in Non-Lazy Artificial Neural Networks” by @avm.bsky.social
September 30, 2025 at 9:29 AM
WEDNESDAY 14:00 – 15:30
P4 25 “Rarely categorical, always high-dimensional: how the neural code changes along the cortical hierarchy” by @shuqiw.bsky.social
P4 35 “Biologically plausible contrastive learning rules with top-down feedback for deep networks” by @zihan-wu.bsky.social
September 30, 2025 at 9:29 AM
WEDNESDAY 12:30 – 14:00
P3 4 “Toy Models of Identifiability for Neuroscience” by @flavioh.bsky.social
P3 55 “How many neurons is “infinitely many”? A dynamical systems perspective on the mean-field limit of structured recurrent neural networks” by Louis Pezon
September 30, 2025 at 9:29 AM
P2 65 “Rate-like dynamics of spiking neural networks” by Kasper Smeets
September 30, 2025 at 9:29 AM
TUESDAY 18:00 – 19:30
P2 2 “Biologically informed cortical models predict optogenetic perturbations” by @bellecguill.bsky.social
P2 12 “High-precision detection of monosynaptic connections from extra-cellular recordings” by @shuqiw.bsky.social
September 30, 2025 at 9:29 AM
Work led by Martin Barry with the supervision of Wulfram Gerstner and Guillaume Bellec @bellecguill.bsky.social
September 4, 2025 at 4:00 PM
In experiments (models & simulations), we showed how this approach supports stable retention of old tasks while learning new ones (split CIFAR-100, ASC…)
September 4, 2025 at 4:00 PM
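For context on the benchmark named above, here is a minimal sketch (our illustration, not the authors' pipeline) of how a split CIFAR-100 task sequence is typically built: the 100 classes are partitioned into disjoint groups, and each group defines one task that is learned in sequence.

```python
# Minimal sketch of a split CIFAR-100 task sequence (illustrative only;
# the experimental pipeline used in the paper may differ).
import numpy as np
from torchvision.datasets import CIFAR100

def split_cifar100_tasks(root="./data", n_tasks=10, seed=0):
    """Partition the 100 CIFAR-100 classes into `n_tasks` disjoint tasks."""
    train = CIFAR100(root=root, train=True, download=True)
    labels = np.array(train.targets)

    rng = np.random.default_rng(seed)
    classes = rng.permutation(100)              # shuffle class identities
    class_groups = np.split(classes, n_tasks)   # e.g. 10 classes per task

    tasks = []
    for group in class_groups:
        idx = np.where(np.isin(labels, group))[0]   # samples belonging to this task
        tasks.append({"class_ids": group, "sample_indices": idx})
    return tasks

tasks = split_cifar100_tasks()
print(len(tasks), "tasks;", len(tasks[0]["sample_indices"]), "training samples in task 1")
```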
We designed a bio-inspired, context-specific gating of plasticity and neuronal activity that allows for a drastic reduction in catastrophic forgetting.
We also show that our model is capable of both forward and backward transfer! All of this thanks to the neuronal activity shared across tasks.
September 4, 2025 at 4:00 PM
We designed a Gating/Availability model that detects task-selective neurons (the neurons most useful for the current task) during learning, shunts the activity of the other neurons (Gating), and decreases the learning rate of the task-selective neurons (Availability).
September 4, 2025 at 4:00 PM
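To make the mechanism above concrete, here is a minimal NumPy sketch of the Gating/Availability idea: selective units keep their activity but learn more slowly, while non-selective units are shunted off. The selection rule (top-k most active units), the availability value, and all names are our own illustrative assumptions, not the published implementation.

```python
# Illustrative sketch of context-specific Gating/Availability (not the
# published implementation).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, base_lr = 20, 100, 0.1
W = rng.normal(0.0, 0.1, size=(n_hidden, n_in))

def forward(x, gate):
    """Hidden activity with non-selective units gated (shunted) to zero."""
    h = np.maximum(W @ x, 0.0)       # ReLU hidden layer
    return h * gate

def select_units(x_batch, frac=0.2):
    """Pick the most task-useful units; here simply the most active ones."""
    mean_act = np.maximum(W @ x_batch.T, 0.0).mean(axis=1)
    k = int(frac * n_hidden)
    selective = np.argsort(mean_act)[-k:]          # top-k active units
    gate = np.zeros(n_hidden)
    gate[selective] = 1.0                          # Gating: shunt the others
    availability = np.ones(n_hidden)
    availability[selective] = 0.1                  # Availability: slow plasticity
    return gate, availability

# One toy Hebbian-style update with a per-neuron, availability-scaled learning rate.
x_batch = rng.normal(size=(32, n_in))
gate, availability = select_units(x_batch)
for x in x_batch:
    h = forward(x, gate)
    W += (base_lr * availability)[:, None] * np.outer(h, x)
```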
Work led by Valentin Schmutz (@bio-emergent.bsky.social), in collaboration with Johanni Brea and Wulfram Gerstner.
Emergent Rate-Based Dynamics in Duplicate-Free Populations of Spiking Neurons
Can spiking neural networks (SNNs) approximate the dynamics of recurrent neural networks? Arguments in classical mean-field theory based on laws of large numbers provide a positive answer when each ne...
journals.aps.org
August 8, 2025 at 3:25 PM
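As a loose illustration of the law-of-large-numbers argument mentioned in the abstract (a toy example of ours, not the model analyzed in the paper): the population-averaged activity of N Poisson spiking units with a shared recurrent drive approaches the corresponding rate dynamics as N grows.

```python
# Toy illustration: population-averaged spiking activity converges to
# rate dynamics as the population size N grows (law of large numbers).
import numpy as np

def phi(x):
    return 1.0 / (1.0 + np.exp(-x))    # transfer function

def simulate(N, T=200.0, dt=0.1, tau=10.0, w=2.0, I=-0.5, seed=1):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    r = 0.1                            # rate-model variable
    a = 0.1                            # filtered population activity (spiking model)
    r_trace, a_trace = [], []
    for _ in range(steps):
        # rate model: tau dr/dt = -r + phi(w r + I)
        r += dt / tau * (-r + phi(w * r + I))
        # spiking model: each neuron spikes with probability phi(w a + I) * dt
        p = phi(w * a + I) * dt
        spikes = rng.random(N) < p
        a += dt / tau * (-a + spikes.mean() / dt)   # low-pass filtered population rate
        r_trace.append(r)
        a_trace.append(a)
    return np.array(r_trace), np.array(a_trace)

for N in (10, 100, 10_000):
    r, a = simulate(N)
    print(f"N={N:6d}  mean |rate - population activity| = {np.abs(r - a).mean():.4f}")
```

The printed mismatch shrinks roughly like 1/sqrt(N), which is the intuition behind the rate-like behaviour of large spiking populations; the paper itself treats the much harder structured, duplicate-free case.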