kabirdabholkar.bsky.social
@kabirdabholkar.bsky.social
PhD student at Barak Lab, Technion
https://kabirdabholkar.github.io/
Reposted
At #NeurIPS? Curious about how RNNs learn differently in closed-loop (RL) vs. open-loop (supervised) settings?
Come by Poster #2107 on Thursday at 4:30 PM!

neurips.cc/virtual/2025...
December 3, 2025 at 1:12 AM
Reposted
0/10 Thanks for the interest in our preprint. Some takes say it negates or fully supports the "manifold hypothesis"; neither is quite right. Our results show that if you focus only on the manifold capturing most of the task-related variance, you can miss important dynamics that actually drive behavior.
“Our findings challenge the conventional focus on low-dimensional coding subspaces as a sufficient framework for understanding neural computations, demonstrating that dimensions previously considered task-irrelevant and accounting for little variance can have a critical role in driving behavior.”
Neural dynamics outside task-coding dimensions drive decision trajectories through transient amplification
Most behaviors involve neural dynamics in high-dimensional activity spaces. A common approach is to extract dimensions that capture task-related variability, such as those separating stimuli or choice...
www.biorxiv.org
December 2, 2025 at 7:48 AM
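The mechanism the preprint highlights, transient amplification, can be illustrated with a toy non-normal linear system (a minimal sketch; the matrix and numbers below are illustrative choices, not taken from the paper):

```python
import numpy as np

# A stable but non-normal linear system dx/dt = A x: both eigenvalues are -1,
# yet the large feedforward weight (50, an illustrative value) lets a tiny
# input along the second, "low-variance" axis transiently drive a large
# excursion along the first, "task-coding" axis before everything decays.
A = np.array([[-1.0, 50.0],
              [ 0.0, -1.0]])

def simulate(x0, t_max=5.0, dt=1e-3):
    # Forward-Euler integration, recording the norm of the state over time.
    x = np.array(x0, dtype=float)
    norms = []
    for _ in range(int(t_max / dt)):
        x = x + dt * (A @ x)
        norms.append(np.linalg.norm(x))
    return np.array(norms)

norms = simulate([0.0, 0.1])   # small kick along the "ignored" dimension
print(norms.max())             # transient growth far above the initial 0.1
```

The point mirrors the quote above: a dimension that carries almost no variance itself (the second coordinate decays monotonically) can still be the input that drives a large, behaviorally relevant transient along the dominant axis.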
#NeurIPS2025
Wanna find decision boundaries in your RNN? Or learn about Koopman Eigenfunctions? Come to my poster.

#2002
Wednesday, Dec 3, 11 am-2 pm.
Exhibit Halls C, D, E
San Diego

neurips.cc/virtual/2025...
NeurIPS Poster: Finding separatrices of dynamical flows with Deep Koopman Eigenfunctions | NeurIPS 2025
neurips.cc
December 2, 2025 at 5:47 PM
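The poster's core idea can be seen in the simplest possible case: for the bistable flow dx/dt = x - x^3, the separatrix between the basins of the attractors at ±1 is the point x = 0, and a Koopman eigenfunction with eigenvalue 1 vanishes exactly there. Below is a minimal hand-derived sketch of that fact (not the poster's deep-learning method, which learns such eigenfunctions from data):

```python
import numpy as np

def f(x):
    # Bistable 1D flow dx/dt = x - x**3: attractors at +-1, separatrix at x = 0.
    return x - x**3

def phi(x):
    # Closed-form Koopman eigenfunction with eigenvalue 1 for this flow
    # (valid for |x| < 1): d/dt phi(x(t)) = phi(x(t)), so phi grows like e^t
    # along trajectories and its zero level set {phi = 0} is the separatrix.
    return x / np.sqrt(1.0 - x**2)

def simulate(x0, t, dt=1e-4):
    # RK4 integration of the flow for time t.
    x = x0
    for _ in range(int(t / dt)):
        k1 = f(x)
        k2 = f(x + 0.5 * dt * k1)
        k3 = f(x + 0.5 * dt * k2)
        k4 = f(x + dt * k3)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

x0, t = 0.1, 1.0
xt = simulate(x0, t)
# Along the trajectory the eigenfunction evolves as e^{lambda t} with lambda = 1,
# so phi(x(t)) / phi(x(0)) should match e^t:
print(phi(xt) / phi(x0), np.exp(t))
```

The design point: level sets of Koopman eigenfunctions are invariant under the flow, so the zero level set of an eigenfunction tied to the unstable direction pins down the separatrix, which is what makes learning such eigenfunctions (e.g. with a network) a route to finding decision boundaries in an RNN's dynamics.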