Victor Geadah
@vgeadah.bsky.social
PhD candidate in statistical neuroscience at Princeton. https://victorgeadah.github.io
In all, CLDS bridges classic LDS and modern nonlinear models:
- Interpretable: linear dynamics conditioned on task variables
- Expressive: parameters vary nonlinearly over conditions
- Efficient: closed-form, fast inference that shares statistical power across conditions. [6/6]
December 3, 2025 at 5:44 PM
We demonstrated CLDS on a range of synthetic tasks and datasets, showing how to link dynamical structure to behaviorally relevant variables in a transparent way. [5/6]
Because CLDS is linear in x given u, with GP priors over the condition-dependent parameters, we get:
✅ Exact latent-state inference via Kalman filtering/smoothing;
✅ Tractable Bayesian learning via closed-form EM updates, using a “conditionally linear regression” trick in a basis-function space. [4/6]
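A minimal sketch of the first point (my own illustration, not the authors' code): once the condition u fixes the parameters, the model is a plain linear-Gaussian LDS, so the latent posterior is exact via the standard Kalman filter. All variable names here are assumptions for illustration.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, mu0, V0):
    """Exact filtered means/covariances of x_t given y_{1:t}
    for a linear-Gaussian LDS with dynamics A, emissions C."""
    T, D = y.shape[0], A.shape[0]
    mus, Vs = np.zeros((T, D)), np.zeros((T, D, D))
    mu, V = mu0, V0
    for t in range(T):
        if t > 0:                         # predict through linear dynamics
            mu, V = A @ mu, A @ V @ A.T + Q
        S = C @ V @ C.T + R               # innovation covariance
        K = V @ C.T @ np.linalg.inv(S)    # Kalman gain
        mu = mu + K @ (y[t] - C @ mu)     # measurement update
        V = V - K @ C @ V
        mus[t], Vs[t] = mu, V
    return mus, Vs
```

In a conditionally linear model, one would first evaluate (A, C, ...) at the trial's condition u, then run this filter unchanged; the smoothing pass is the usual backward recursion on the same quantities.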
CLDS = a linear dynamical system in the latent state x, whose coefficients depend nonlinearly on the task condition u through Gaussian processes (GPs).

CLDS leverages task conditions to approximate fully nonlinear dynamics with locally linear systems, combining the benefits of linear and nonlinear models. [3/6]
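To make the idea concrete, here is a toy simulation (my own sketch, not the paper's model): dynamics that are linear in x_t, with matrices A(u) and b(u) that vary smoothly with the condition u. A fixed RBF feature map of u stands in for draws from a GP prior; all names and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, T = 2, 100            # latent dimension, trial length
u = 0.7                  # scalar task condition for this trial

# Condition-dependent parameters as smooth functions of u,
# built from RBF basis features (a stand-in for GP samples).
centers = np.linspace(0.0, 1.0, 5)
phi = np.exp(-(u - centers) ** 2 / 0.1)       # basis features of u
W_A = rng.normal(scale=0.1, size=(D, D, 5))   # basis weights for A(u)
W_b = rng.normal(scale=0.1, size=(D, 5))      # basis weights for b(u)

A = 0.95 * np.eye(D) + W_A @ phi  # stable base dynamics + u-dependent part
b = W_b @ phi                      # u-dependent bias

# Given u, the latent trajectory follows an ordinary linear recursion.
x = np.zeros((T, D))
for t in range(1, T):
    x[t] = A @ x[t - 1] + b + rng.normal(scale=0.05, size=D)
```

Nearby conditions u produce nearby (A, b), which is how a GP prior lets the model share statistical power across conditions.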
This is joint work with amazing collaborators: Amin Nejatbakhsh (@aminejat.bsky.social), David Lipshutz (@lipshutz.bsky.social), Jonathan Pillow (@jpillowtime.bsky.social), and Alex Williams (@itsneuronal.bsky.social).

🔗 OpenReview: openreview.net/forum?id=xgm...
🖥️ Code: github.com/neurostatsla...
Modeling Neural Activity with Conditionally Linear Dynamical Systems
Neural population activity exhibits complex, nonlinear dynamics, varying in time, over trials, and across experimental conditions. Here, we develop *Conditionally Linear Dynamical System* (CLDS)...