David G. Clark
@david-g-clark.bsky.social
Theoretical neuroscientist
Research fellow @ Kempner Institute, Harvard
dclark.io
(15/26) We show that, at large N, adding a rank-one term αψψ† to J pulls eigenvalue η to λ = η + α, generating complex-conjugate outliers that can drive oscillations. Furthermore, A(0) ≈ αψψ† can be generated using oscillatory inputs whose single-neuron phases are chosen based on the eigenvector ψ.
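A minimal numpy check of this shift (assuming ψ is unit-normalized; the perturbed matrix here is complex because ψψ† is formed directly, whereas a plasticity matrix would combine the conjugate pair):

import numpy as np

rng = np.random.default_rng(0)
N = 500
J = rng.normal(0, 1/np.sqrt(N), size=(N, N))   # bulk spectral radius ≈ 1

evals, evecs = np.linalg.eig(J)
k = np.argmax(evals.imag)                  # pick a complex eigenvalue η of J
eta, psi = evals[k], evecs[:, k]
psi = psi / np.linalg.norm(psi)            # ψ†ψ = 1

alpha = 1.0
Jp = J + alpha * np.outer(psi, psi.conj())        # rank-one term αψψ†

# (J + αψψ†)ψ = ηψ + αψ, so λ = η + α is exactly an eigenvalue of Jp
print(np.min(np.abs(np.linalg.eigvals(Jp) - (eta + alpha))))   # ≈ 0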
August 25, 2025 at 5:17 PM
(13/26) Thus, to create complex outliers, A(t) must become correlated with J. This can happen only if activity reflects both inputs and recurrence, explaining the significance of being in the "intermediate" regime. Indeed, A(t) aligns to eigenvector subspaces of J associated w/ complex eigenvalues.
August 25, 2025 at 5:17 PM
(11/26) We now turn to mechanistic understanding. Persistent oscillations occur when J + A(t) has complex-conjugate outlier eigenvalues at stimulation offset (t=0). In particular, while networks in the persistent-oscillation regime develop complex outliers, other regimes develop only real outliers.
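One way to test this numerically (a hedged sketch: g and the tolerance are assumptions; the bulk of a Gaussian J with variance g²/N has spectral radius ≈ g by the circular law):

import numpy as np

def complex_outliers(J, A, g=1.0, tol=0.05):
    # eigenvalues of J + A outside the bulk disk with nonzero imaginary part
    lam = np.linalg.eigvals(J + A)
    outside = np.abs(lam) > g + tol
    return lam[outside & (np.abs(lam.imag) > tol)]   # empty when only real outliers exist

Networks in the persistent-oscillation regime would return a conjugate pair here; the other regimes would not.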
August 25, 2025 at 5:17 PM
(10/26) The frequency of persistent oscillations tracks the input frequency (within a preferred band). That is, faster inputs generally produce faster persistent oscillations. Thus, the autonomous dynamics reflect the temporal structure of previously experienced stimuli, indicating a dynamic memory!
August 25, 2025 at 5:17 PM
(9/26) Persistent oscillations occur in an "intermediate" dynamic regime where activity expresses features of both external inputs and recurrent connectivity. In particular, too-strong inputs lead to neuronal activity being dominated by the input alone, preventing persistent oscillations.
August 25, 2025 at 5:17 PM
(7/26) The most surprising phenomenology happens after input cessation at t≥0: neurons continue oscillating long after stimulation ends, with lifetimes exceeding any intrinsic system timescale, including the plasticity forgetting timescale, by an order of magnitude (panel iii).
August 25, 2025 at 5:17 PM
(5/26) Following Rajan, Abbott, and Sompolinsky (PRE, 2010), we drive these plastic networks with oscillatory inputs, with random phases across neurons, for t<0. At t=0, we halt stimulation. The setup is schematized here:
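A sketch of this protocol in numpy (the rate dynamics and the Hebbian-with-forgetting plasticity rule are illustrative assumptions; only the oscillatory input with random phases and the halt at t = 0 come from the setup):

import numpy as np

rng = np.random.default_rng(1)
N, dt, g = 200, 0.01, 1.5
J = rng.normal(0, g/np.sqrt(N), size=(N, N))
theta = rng.uniform(0, 2*np.pi, N)             # random phases across neurons
omega, I0 = 2*np.pi, 1.0
tauA, beta = 10.0, 0.1                         # assumed plasticity parameters

x, A = np.zeros(N), np.zeros((N, N))
for step in range(4000):                       # t runs from -20 to +20
    t = -20.0 + step*dt
    r = np.tanh(x)
    I = I0*np.cos(omega*t + theta) if t < 0 else 0.0   # halt stimulation at t = 0
    x += dt*(-x + (J + A) @ r + I)
    A += (dt/tauA)*(beta*np.outer(r, r) - A)   # assumed: Hebbian learning with forgetting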
August 25, 2025 at 5:17 PM
(23/30) This framework extends to MEC grid cells, which show a continuous spectrum of "gridness" at odds with classical models. We model this w/ a 2D generative process. We optimize networks that function as attractors and derive the DMFT, resembling classical continuous-attractor models for grid cells.
January 29, 2025 at 6:26 PM
(22/30) The DMFT predicts both static and traveling bump solutions, accurately tracking head direction through angular-velocity integration. These emerge from pattern formation via the Mexican-hat connectivity.
January 29, 2025 at 6:26 PM
(21/30) The effective DMFT nonlinearity involves an activity-dependent gain-control mechanism, absent in the original description (and classical ring models). It is essential for maintaining stable bump solutions in the DMFT.
January 29, 2025 at 6:26 PM
(20/30) There are effective Mexican-hat interactions in the DMFT whose shape depends on the tuning-curve statistics and network nonlinearity, with explicit rotation invariance.
January 29, 2025 at 6:26 PM
(18/30) What about the full nonlinear dynamics? Using DMFT, we show that as N→∞, the high-D dynamics become equivalent to CLASSICAL ring attractors with all 3 key ingredients (nonlinearity, Mexican-hat connectivity, continuous symmetry). Bump states arise by spontaneously breaking the continuous symmetry.
January 29, 2025 at 6:26 PM
(16/30) Spectral embedding techniques, often applied to network connectivity, fail to reveal this structure because the eigenvectors are random. This contrasts with an alternative circulant-plus-noise connectivity, where such techniques succeed.
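The contrast can be reproduced in a few lines (the random-orthogonal conjugation below is a stand-in for the disordered embedding, an assumption for illustration):

import numpy as np

rng = np.random.default_rng(2)
N = 128
phi = 2*np.pi*np.arange(N)/N
C = np.array([np.roll(np.cos(phi), i) for i in range(N)])/N   # circulant ring connectivity

Q = np.linalg.qr(rng.normal(size=(N, N)))[0]   # random orthogonal change of basis
D = Q.T @ C @ Q                                # identical spectrum, random eigenvectors

for M in (C, D):
    V = np.linalg.eigh(M)[1][:, -2:]           # embed neurons via the top eigenvector pair
    print(np.std(np.hypot(V[:, 0], V[:, 1])))  # ≈ 0 for C (a ring); broad spread for D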
January 29, 2025 at 6:26 PM
(15/30) This circular geometry manifests in the connectivity's eigenvalue spectrum in the form of a doublet degeneracy, the same as a classical model's circulant matrix. The disordered embedding appears in the eigenvectors, which are random rather than Fourier-like as in classical models.
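The doublet is easy to see for any circulant matrix (the specific kernel below is an assumed illustration):

import numpy as np

N = 128
phi = 2*np.pi*np.arange(N)/N
kernel = np.cos(phi) - 0.3                     # assumed ring kernel
C = np.array([np.roll(kernel, i) for i in range(N)])/N

lam = np.sort(np.linalg.eigvalsh(C))[::-1]
print(lam[:4])   # leading eigenvalues come as a doublet (λ_k = λ_{N−k})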
January 29, 2025 at 6:26 PM
(13/30) Amazingly, this simple, 3-parameter model captures detailed quantitative features of actual head-direction cells across all mice: distributions of peak counts, peak values, tuning-curve asymmetries, and more.
January 29, 2025 at 6:26 PM
(12/30) A key inductive bias in the generative process is distributional circular symmetry (corresponding to a translation-invariant Gaussian process). We find evidence for this in the data through the lack of angular biases in tuning-curve features.
January 29, 2025 at 6:26 PM
(11/30) To extend these models to large-N (the scale of mammalian circuits) and study them analytically, we developed a 3-parameter generative process for sampling synthetic responses. It samples tuning curves from a Gaussian process, followed by a nonlinearity. See also Mainali et al. (bioRxiv, 2024).
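A minimal sketch of such a process (the circular GP kernel, length scale, and threshold nonlinearity below are assumptions standing in for the model's three parameters):

import numpy as np

rng = np.random.default_rng(3)
N, M = 200, 256                                # neurons, angle bins
phi = 2*np.pi*np.arange(M)/M
ell = 0.5                                      # assumed GP length scale
K = np.exp((np.cos(phi[:, None] - phi[None, :]) - 1)/ell**2)   # translation-invariant kernel

L = np.linalg.cholesky(K + 1e-9*np.eye(M))
z = (L @ rng.normal(size=(M, N))).T            # N Gaussian tuning curves over angle

b = -0.5                                       # assumed threshold
tuning = np.maximum(z + b, 0.0)                # nonlinearity -> heterogeneous curves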
January 29, 2025 at 6:26 PM
(10/30) Even with intelligent neuron ordering, the connectivity appears highly disordered. Yet these networks exhibit quasi-continuous-attractor dynamics and integrate angular velocity—like classical models, but with heterogeneous responses matching data. What connectivity and dynamics enable this?
January 29, 2025 at 6:26 PM
(9/30) We construct RNNs directly from data. We define a "target manifold" of actual tuning curves and optimize weights to make dxᵢ/dt vanish on this manifold. This procedure, not guaranteed a priori to work, yields networks that behave like ring attractors while exhibiting the data's heterogeneity.
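One plausible instantiation of that optimization, assuming dynamics dx/dt = -x + Jφ(x) (the paper's exact objective may differ):

import numpy as np

def fit_manifold_weights(X, phi=np.tanh, reg=1e-3):
    # X: (N, P) matrix whose columns are target-manifold points (data tuning curves).
    # Ridge-regularized least squares for J with -x + J phi(x) ≈ 0 at each column,
    # i.e. solve J phi(X) ≈ X.
    F = phi(X)
    return X @ F.T @ np.linalg.inv(F @ F.T + reg*np.eye(X.shape[0]))

Whether the resulting fixed points are attracting is then checked numerically, consistent with the caveat that success is not guaranteed a priori.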
January 29, 2025 at 6:26 PM
(8/30) We analyze recordings from mouse head direction cells (Duszkiewicz et al. Nat Neurosci 2024). Instead of focusing on classical-looking cells, we embrace tuning diversity to understand how heterogeneity coexists with continuous-attractor dynamics.
January 29, 2025 at 6:26 PM
(4/30) Classical continuous-attractor models rely on three key ingredients:
1. Nonlinear neurons
2. Pattern-forming connectivity (local excitation, longer-range inhibition)
3. A continuous symmetry matching the encoded variable

Combined, these generate a manifold of stable states tracking the variable.
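A minimal ring-attractor sketch combining the three ingredients (parameter values are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(4)
N = 256
phi = 2*np.pi*np.arange(N)/N

J0, J1 = -0.5, 3.0
J = (J0 + J1*np.cos(phi[:, None] - phi[None, :]))/N   # ingredients 2+3: Mexican hat on a ring

x = 1/(1 - J0) + 0.01*rng.normal(size=N)       # uniform fixed point + tiny noise
for _ in range(2000):
    r = np.maximum(x, 0.0)                     # ingredient 1: threshold-linear neurons
    x += 0.05*(-x + J @ r + 1.0)               # constant drive; symmetry breaks into a bump

print(phi[np.argmax(x)])   # bump center: any angle on the ring is equally possible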
January 29, 2025 at 6:26 PM
Here’s how the reduction to the dynamics of overlaps for low-rank RNNs works for the two possible placements of the nonlinearity. Does this answer your question?
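For reference, here is the reduction written out (a sketch assuming the standard convention $J=\frac{1}{N}\sum_{\mu=1}^{R} m^{\mu}(n^{\mu})^{\top}$ and, for the first placement, $x(0)\in\mathrm{span}\{m^{\mu}\}$):

1. Nonlinearity applied before the weights, $\dot{x}=-x+J\,\phi(x)$: $x(t)$ stays in $\mathrm{span}\{m^{\mu}\}$, so writing $x=\sum_{\mu}\kappa_{\mu}m^{\mu}$ gives
$$\dot{\kappa}_{\mu}=-\kappa_{\mu}+\frac{1}{N}\sum_{i} n_{i}^{\mu}\,\phi\Big(\sum_{\nu}\kappa_{\nu}m_{i}^{\nu}\Big).$$

2. Nonlinearity applied after the weights, $\dot{x}=-x+\phi(Jx)$: $x$ need not stay in $\mathrm{span}\{m^{\mu}\}$, but the overlaps $\kappa_{\mu}=\frac{1}{N}\,n^{\mu}\cdot x$ obey the same closed equation as above.

In either placement, as $N\to\infty$ the population average becomes an expectation over the joint loading distribution, leaving $R$-dimensional deterministic dynamics.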
December 12, 2024 at 12:51 AM
(4/5) These seemingly different problems share a common structure: a bipartite system in which feature variables interact with datum variables through quenched random couplings. This enables a unified analysis.
December 3, 2024 at 7:34 PM