Grateful to my co-authors, Claus Hilgetag, @kayson.bsky.social, and Moein Khajehnejad, for their invaluable contributions.
🧠 Neuroscience — functional rationale for the evolutionary suppression of strong reciprocal motifs.
🤖 NeuroAI — reciprocity as a tunable design parameter in recurrent & neuromorphic networks.
Because reciprocity systematically hurts computation.
Suppressing strong loops preserves:
- working memory
- representational diversity
- stable but flexible dynamics
Result: the same pattern.
Strong reciprocity consistently undermines memory and representational richness.
- Higher reciprocity → larger spectral radius (instability risk).
- Narrower spectral gap → less dynamical diversity.
- Lower non-normality → weaker transient amplification.
Together → compressed dynamical range (diagnostics sketched below).
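For concreteness, here is a minimal NumPy sketch (not the paper's code) of the three diagnostics above, computed on a placeholder random weight matrix W; the non-normality measure used here is Henrici's departure from normality, one common choice.

```python
import numpy as np

# Placeholder recurrent weight matrix (illustrative, not from the paper).
rng = np.random.default_rng(0)
n = 128
W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))

eigvals = np.linalg.eigvals(W)
mags = np.sort(np.abs(eigvals))[::-1]

spectral_radius = mags[0]          # > 1 signals instability risk
spectral_gap = mags[0] - mags[1]   # narrow gap -> less dynamical diversity

# Henrici's departure from normality: 0 for normal matrices,
# larger values allow stronger transient amplification.
non_normality = np.sqrt(max(np.linalg.norm(W, "fro") ** 2 - np.sum(np.abs(eigvals) ** 2), 0.0))

print(spectral_radius, spectral_gap, non_normality)
```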
- Increasing reciprocity (both link and strength reciprocity; see the sketch below) reduces memory capacity.
- Representations become less diverse (lower kernel rank).
- Effects are strongest in ultra-sparse networks.
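A hedged sketch of the two reciprocity measures mentioned above, assuming the standard conventions: link reciprocity as the fraction of reciprocated directed links, and strength reciprocity as a common min-based weighted generalization. The paper's exact definitions may differ.

```python
import numpy as np

def link_reciprocity(W: np.ndarray) -> float:
    """Fraction of directed links i->j whose reverse link j->i also exists."""
    A = (W != 0).astype(float)
    np.fill_diagonal(A, 0)
    n_links = A.sum()
    return (A * A.T).sum() / n_links if n_links else 0.0

def strength_reciprocity(W: np.ndarray) -> float:
    """Reciprocated weight, sum of min(|w_ij|, |w_ji|), over the total weight."""
    W = np.abs(W).astype(float)
    np.fill_diagonal(W, 0)
    total = W.sum()
    return np.minimum(W, W.T).sum() / total if total else 0.0
```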
- Reservoir computing to isolate structure from learning.
- Networks of 64–256 nodes, in both ultra-sparse and sparse regimes.
- Topologies: small-world, hierarchical modular, core–periphery, hybrid, and null models.
- Metrics: memory capacity, kernel rank, spectral analysis (minimal sketch below).
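To make the pipeline concrete, a minimal echo-state-network sketch (illustrative assumptions, not the paper's actual setup): a fixed sparse random reservoir driven by white-noise input, kernel rank taken as the numerical rank of the state matrix, and memory capacity as the sum over delays of squared correlations between the delayed input and its ridge-regression reconstruction. All sizes and hyperparameters here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, max_delay, washout = 128, 4000, 40, 200

# Sparse random reservoir, rescaled to spectral radius 0.9 (illustrative values).
W = rng.normal(size=(n, n)) * (rng.random((n, n)) < 0.05)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, size=n)

# Drive the reservoir with i.i.d. uniform input and collect states.
u = rng.uniform(-1, 1, size=T)
X = np.zeros((T, n))
x = np.zeros(n)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x
X, u = X[washout:], u[washout:]

# Kernel rank: numerical rank of the state matrix (representational richness).
kernel_rank = np.linalg.matrix_rank(X)

# Memory capacity: squared correlation between u(t-k) and its linear
# reconstruction from x(t), summed over delays k (in-sample, for brevity).
mc = 0.0
for k in range(1, max_delay + 1):
    states, target = X[k:], u[:-k]
    w_out = np.linalg.solve(states.T @ states + 1e-6 * np.eye(n), states.T @ target)
    mc += np.corrcoef(states @ w_out, target)[0, 1] ** 2

print(kernel_rank, mc)
```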
In this study, we apply NRC to systematically test how reciprocity shapes computation in recurrent networks.
Across species (macaque, marmoset, mouse), strong reciprocal (symmetric) connections are rare.
This asymmetry is well known anatomically.
But what are its computational consequences?