Trinity Chung
trinityjchung.com
@trinityjchung.com
MS @ Carnegie Mellon University Robotics Institute
BS @ UC Berkeley Computer Science
9/ We thank Albert Gu and Roberta “Bobby” Klatzky for helpful discussions!

Preprint here: arxiv.org/abs/2505.18361
PyTorchTNN library coming very soon – stay tuned!
Task-Optimized Convolutional Recurrent Networks Align with Tactile Processing in the Rodent Brain
Tactile sensing remains far less understood in neuroscience and less effective in artificial systems compared to more mature modalities such as vision and language. We bridge these gaps by introducing...
May 27, 2025 at 9:47 PM
8/ Our conclusions:
a) ConvRNNs outperform feedforward/SSMs on realistic tactile recognition
b) ConvRNNs best match neural responses in the mouse brain
c) Contrastive SSL matches supervised neural alignment, suggesting a general-purpose representation in the somatosensory cortex
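The contrastive SSL result in (c) can be made concrete with a minimal InfoNCE-style loss over paired embeddings of the same tactile clip under two augmentations. This is an illustrative sketch, not the paper's exact objective; the function name and temperature value are assumptions.

```python
import numpy as np

def info_nce(z1, z2, temp=0.1):
    """Minimal InfoNCE-style contrastive loss (illustrative sketch).
    z1[i] and z2[i] are embeddings of two augmented views of the
    same tactile clip; other rows serve as negatives."""
    # L2-normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temp                      # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Matched pairs sit on the diagonal; maximize their log-probability.
    return -np.mean(np.diag(log_probs))
```

Training with a loss like this pulls the two views of each clip together and pushes other clips apart, yielding task-agnostic features without any labels.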
7/ This work is just the beginning of understanding how touch works in both animals and machines. To continue to improve embodied AI, we'll need richer neural datasets and smarter ways to fuse touch with other senses like vision and proprioception.
6/ We found that ConvRNNs (esp. IntersectionRNNs of @jascha.sohldickstein.com @sussillodavid.bsky.social ) beat ResNets, Transformers, and SSMs when it comes to aligning with real mouse somatosensory cortex data. Plus, they pass the NeuroAI Turing Test! (bsky.app/profile/anay...)
5/ Crucially, we found that when doing tactile self-supervised learning, we couldn't apply the usual image augmentations out-of-the-box (in fact, models trained with them failed to learn at all!), so we designed transformations explicitly for tactile force & torque inputs:
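To make the idea of tactile-specific augmentations concrete, here is a hypothetical sketch of transforms operating on a (time, channels) force/torque array. The function name, parameter ranges, and choice of transforms (gain jitter, sensor noise, temporal shift) are assumptions for illustration, not the paper's exact recipe.

```python
import numpy as np

def augment_tactile(signal, rng):
    """Hypothetical tactile augmentation for a (time, channels)
    force/torque array. Illustrative only."""
    out = signal.copy()
    # Channel-wise gain jitter: contact strength varies across sweeps.
    out *= rng.uniform(0.8, 1.2, size=(1, out.shape[1]))
    # Additive sensor noise on every reading.
    out += rng.normal(0.0, 0.01, size=out.shape)
    # Circular temporal shift: move contact onset in time.
    shift = int(rng.integers(0, out.shape[0] // 10 + 1))
    out = np.roll(out, shift, axis=0)
    return out
```

Unlike image crops or color jitter, these transforms respect the physical structure of the signal: time order and per-channel force/torque semantics are preserved.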
4/ We trained our models using data from simulated mouse whiskers from @MitraHartmann's group brushing various objects. Whiskers realistically output detailed force-torque signals, unlike current robot sensors, which remain difficult to scale and have reduced sensitivity in sim.
3/ We systematically explored temporal architectures (unifying ConvRNNs including from @Chengxu & @dyamins.bsky.social et al. 2017's prior tactile work, SSMs, Transformers) via our Encoder-Attender-Decoder (EAD) framework, built using a custom "PyTorchTNN" library we developed.
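An Encoder-Attender-Decoder model can be sketched as three swappable stages. This skeleton is an assumption-laden illustration (class name, layer sizes, and the GRU attender are placeholders); in the actual framework the attender slot could hold a ConvRNN, attention block, or SSM.

```python
import torch
import torch.nn as nn

class EAD(nn.Module):
    """Illustrative Encoder-Attender-Decoder skeleton (not the
    paper's exact architecture)."""
    def __init__(self, in_ch=6, hid=64, n_classes=10):
        super().__init__()
        # Encoder: per-timestep feature extractor over sensor channels.
        self.encoder = nn.Sequential(nn.Linear(in_ch, hid), nn.ReLU())
        # Attender: temporal module; a GRU here, but the slot could
        # hold a ConvRNN, Transformer block, or SSM instead.
        self.attender = nn.GRU(hid, hid, batch_first=True)
        # Decoder: readout from the final temporal state.
        self.decoder = nn.Linear(hid, n_classes)

    def forward(self, x):                 # x: (batch, time, channels)
        h = self.encoder(x)
        h, _ = self.attender(h)
        return self.decoder(h[:, -1])
```

Keeping the three stages modular is what lets one framework compare temporal architectures head-to-head: only the attender changes between model families.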
2/ Tactile perception is still considerably under-explored in both neuroscience and embodied AI. Our goal is to (1) provide a model of tactile processing that we can quantitatively compare against the brain and (2) inspire better models of tactile processing in robots.