David Lipshutz
lipshutz.bsky.social
Studying neural computation • Assistant Professor of Neuroscience at Baylor College of Medicine • lipshutzlab.com
Image representations are often compared in terms of their global geometry. I'm excited to present work at #cosyne2025 that proposes a method for comparing image representations based on their *local* geometry.

Poster: 8p tonight #1-112
March 27, 2025 at 3:34 PM
When the inputs are the responses of local (Gabor-like) filters applied to natural images, we find that our nonlinear circuit substantially reduces the redundancy of the responses (orders of magnitude more than linear transformations like ZCA whitening).
December 9, 2024 at 7:06 PM
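For context on the linear baseline mentioned above, here is a minimal numpy sketch of ZCA whitening. The data is a synthetic correlated stand-in, not the Gabor filter responses from the paper; ZCA decorrelates by multiplying with the inverse square root of the sample covariance.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for filter responses: correlated data (n_samples x n_dims)
X = rng.standard_normal((1000, 4)) @ rng.standard_normal((4, 4))
X -= X.mean(axis=0)

# ZCA whitening matrix: C^{-1/2}, built from the eigendecomposition of the covariance
C = X.T @ X / len(X)
evals, evecs = np.linalg.eigh(C)
W = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
Xw = X @ W

# The whitened responses are decorrelated: covariance is the identity
print(np.allclose(Xw.T @ Xw / len(Xw), np.eye(4)))  # → True
```

Because whitening is linear, it can only remove second-order (pairwise linear) redundancy, which is the gap the nonlinear circuit above is addressing.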
Our model is derived starting from an optimal transport objective and is closely related to sliced or max-sliced Wasserstein distances.
December 9, 2024 at 7:06 PM
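As background on that family of distances, here is a minimal Monte Carlo sketch of the sliced 2-Wasserstein distance between two equal-sized sample sets: project onto random directions, then use the closed-form 1-D Wasserstein distance (sorting). The function name and data are illustrative, not the paper's implementation.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    """Monte Carlo estimate of the sliced 2-Wasserstein distance between
    empirical distributions X and Y (each n_samples x n_dims, equal sizes)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_projections):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)          # random unit direction
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)        # 1-D W2^2 via sorted samples
    return np.sqrt(total / n_projections)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
Y = rng.standard_normal((500, 3)) + 2.0  # shifted copy of the same distribution

print(sliced_wasserstein(X, X))  # → 0.0 (identical samples)
print(sliced_wasserstein(X, Y))  # positive, grows with the shift
```

The max-sliced variant mentioned above replaces the average over random directions with a maximization over the single worst-case direction.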
Many efficient coding theories (e.g., redundancy reduction) can be interpreted as transforming a sensory distribution into an efficient target distribution. We derive a recurrent circuit model with adaptive interneurons that learns to implement such a transformation from data.
December 9, 2024 at 7:06 PM
How do interneurons reshape neural responses? I'm excited to present work with @eerosim.bsky.social at #NeurIPS2024 that proposes a nonlinear recurrent circuit model motivated by efficient coding theory.

Poster: 4:30p on Fri, Dec 13
Paper: openreview.net/forum?id=ojL...
December 9, 2024 at 7:06 PM