Fatemeh_Hadaeghi
@fatemehhadaeghi.bsky.social
NeuroAI Researcher @ICNS_Hamburg, PhD in Biomedical Engineering
I would also like to thank prominent figures in the field: Sara Solla, Petra Vertes, @kenmiller.bsky.social, @bendfulcher.bsky.social, @jlizier.bsky.social, @danakarca.bsky.social, Marcus Kaiser, Gorka Zamora-López, and Patrick Desrosiers, who provided feedback during lab visits and conferences.
September 27, 2025 at 10:04 PM
This work has been in progress for 3+ years.
Grateful to my co-authors, Claus Hilgetag, @kayson.bsky.social, and Moein Khajehnejad, for their invaluable contributions.
September 27, 2025 at 9:48 PM
Implications:
🧠 Neuroscience — functional rationale for the evolutionary suppression of strong reciprocal motifs.
🤖 NeuroAI — reciprocity as a tunable design parameter in recurrent & neuromorphic networks.
September 27, 2025 at 9:44 PM
So why does the brain avoid strong loops?
Because reciprocity systematically hurts computation.

Suppressing strong loops preserves:
- working memory
- representational diversity
- stable but flexible dynamics
September 27, 2025 at 9:43 PM
We validated this on empirical connectomes (macaque long-distance, macaque visual cortex, marmoset).

Result: the same pattern.
Strong reciprocity consistently undermines memory and representational richness.
September 27, 2025 at 9:42 PM
Spectral analysis explains why:

- Higher reciprocity → larger spectral radius (instability risk).
- Narrower spectral gap → less dynamical diversity.
- Lower non-normality → weaker transient amplification.

Together → compressed dynamical range. (A rough sketch of these spectral metrics follows below.)
September 27, 2025 at 9:41 PM
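For readers who want to reproduce these diagnostics, here is a minimal sketch (not the paper's code) of the three spectral quantities named above, assuming the network is given as a dense NumPy weight matrix W. "Spectral gap" is taken here as the difference between the two leading eigenvalue magnitudes, and the non-normality index uses Henrici's departure from normality; both are common choices and may differ from the study's exact definitions.

```python
import numpy as np

def spectral_metrics(W: np.ndarray) -> dict:
    """Spectral radius, spectral gap, and a normalized non-normality index of W."""
    eigvals = np.linalg.eigvals(W)
    mags = np.sort(np.abs(eigvals))[::-1]      # eigenvalue magnitudes, descending
    rho = mags[0]                              # spectral radius: larger -> closer to instability
    gap = mags[0] - mags[1]                    # gap between the two leading magnitudes
    # Henrici's departure from normality, normalized by the Frobenius norm:
    # zero for normal (e.g., symmetric) matrices, larger for strongly non-normal ones.
    fro2 = np.linalg.norm(W, "fro") ** 2
    non_normality = np.sqrt(max(fro2 - np.sum(np.abs(eigvals) ** 2), 0.0) / fro2)
    return {"spectral_radius": rho, "spectral_gap": gap, "non_normality": non_normality}

# Symmetrizing a random matrix (i.e., maximal reciprocity) raises its spectral radius
# and drives the non-normality index to zero:
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(100, 100))
print(spectral_metrics(W))
print(spectral_metrics(0.5 * (W + W.T)))
```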
Interestingly, hierarchical modular networks consistently outperformed random counterparts, but only when reciprocity was low. However, the comparative advantage of different topologies shifts with reciprocity, sparsity, and the weight distribution.
September 27, 2025 at 9:41 PM
Findings (robust across sizes, densities, architectures):
- Increasing reciprocity (both link and strength reciprocity; example definitions below) reduces memory capacity.
- Representations become less diverse (lower kernel rank).
- Effects are strongest in ultra-sparse networks.
September 27, 2025 at 9:38 PM
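The two notions of reciprocity referenced above can be quantified in several ways; a standard pair of definitions (an assumption here, not necessarily the exact estimators used in the study) is the fraction of directed links that are reciprocated, and the fraction of total weight that is matched by the reverse connection:

```python
import numpy as np

def link_reciprocity(W: np.ndarray) -> float:
    """Fraction of directed links whose opposite link also exists."""
    A = (W != 0).astype(float)
    np.fill_diagonal(A, 0)
    n_links = A.sum()
    n_reciprocated = (A * A.T).sum()          # links whose reverse link also exists
    return n_reciprocated / n_links if n_links else 0.0

def strength_reciprocity(W: np.ndarray) -> float:
    """Fraction of total weight matched by the reverse connection (min of each pair)."""
    W = np.abs(W).copy()
    np.fill_diagonal(W, 0)
    total = W.sum()
    matched = np.minimum(W, W.T).sum()        # weight mirrored by the opposite direction
    return matched / total if total else 0.0
```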
Methods:
- Reservoir computing to isolate structure from learning.
- Networks of 64–256 nodes, in both ultra-sparse and sparse regimes.
- Topologies: small-world, hierarchical modular, core–periphery, hybrid, and nulls.
- Metrics: memory capacity, kernel rank, spectral analysis (a minimal example follows below).
September 27, 2025 at 9:36 PM
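To make the metrics concrete, here is a minimal echo-state-style sketch of memory capacity (sum over delays of the squared correlation between a delayed input and its linear reconstruction from the reservoir state) and kernel rank (approximated by the numerical rank of the collected state matrix). All parameter values are illustrative placeholders, not the settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, washout, max_delay = 100, 4000, 200, 40

# Sparse random recurrent weights, rescaled to spectral radius 0.9 (illustrative choice).
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, size=N)

# Drive the reservoir with an i.i.d. input stream and record its states.
u = rng.uniform(-1, 1, size=T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x
X = states[washout:]                                   # discard the initial transient

# Kernel rank: numerical rank of the state matrix (richness of the representation).
kernel_rank = np.linalg.matrix_rank(X, tol=1e-6)

# Memory capacity: for each delay k, fit a linear readout reconstructing u(t-k)
# from x(t) and accumulate the squared correlation of the reconstruction.
memory_capacity = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
    memory_capacity += np.corrcoef(X @ w_out, target)[0, 1] ** 2

print(f"kernel rank = {kernel_rank}, memory capacity ≈ {memory_capacity:.2f}")
```

Re-running the same evaluation after raising the network's reciprocity (e.g., with the toy knob sketched further down) gives a simplified version of the comparison the study performs systematically.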
In earlier work (www.biorxiv.org/content/10.1...), we developed Network Reciprocity Control (NRC): algorithms that adjust reciprocity (link + strength) while preserving network structure.

In this study, we apply NRC to systematically test how reciprocity shapes computation in recurrent networks (a toy illustration of the idea is sketched below).
September 27, 2025 at 9:35 PM
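The NRC algorithms themselves are described in the linked preprint and are not reproduced here. As a toy illustration of what "reciprocity as a knob" means, the sketch below interpolates every existing reciprocal weight pair toward its mean, which raises strength reciprocity (for non-negative weights) while leaving the binary topology and each pair's total weight unchanged; the function name and the interpolation scheme are illustrative assumptions, not the paper's method.

```python
import numpy as np

def nudge_strength_reciprocity(W: np.ndarray, alpha: float) -> np.ndarray:
    """alpha = 0 leaves W unchanged; alpha = 1 makes every reciprocal pair symmetric."""
    W = W.copy()
    reciprocal = (W != 0) & (W.T != 0)        # entries whose reverse connection exists
    pair_mean = 0.5 * (W + W.T)
    W[reciprocal] = (1 - alpha) * W[reciprocal] + alpha * pair_mean[reciprocal]
    return W
```

Sweeping alpha from 0 to 1 and re-measuring memory capacity and kernel rank on each resulting network is one simple way to trace out the reciprocity-versus-computation curves described in this thread.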
The no-strong-loops principle:
Across species (macaque, marmoset, mouse), strong reciprocal (symmetric) connections are rare.

This asymmetry is well known anatomically.
But what are its computational consequences?
September 27, 2025 at 9:34 PM