Tommy Botch
@tommybotch.bsky.social
How does sensory context facilitate language acquisition? We hypothesized that training LLMs with richer sensory context would partially "rescue" performance relative to humans.
LLMs trained with richer sensory context performed better with much less data (~20%)!
May 5, 2025 at 2:49 PM
However, LLMs can still be useful models of behavior: LLM prediction distributions better reflected human predictions when humans were reading written language.
May 5, 2025 at 2:49 PM
Human predictions were more accurate than LLMs, even when just reading! This performance gap scaled with the amount of sensory context provided to humans.
May 5, 2025 at 2:49 PM
We collected human predictions of natural language under varying sensory conditions (written, audio-only, or audiovisual language) and compared these with LLM predictions for the same stories.
May 5, 2025 at 2:49 PM
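The comparison described above can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the human guesses and LLM probabilities below are made-up placeholders, and `top1_agreement` is a hypothetical helper, one of many possible ways to score agreement between human and model next-word predictions.

```python
# Hypothetical sketch: scoring agreement between human next-word guesses
# and an LLM's next-word distribution at one story position.
# All data here are invented placeholders.
from collections import Counter

def top1_agreement(human_guesses, llm_probs):
    """Fraction of human guesses that match the LLM's most probable word."""
    llm_top = max(llm_probs, key=llm_probs.get)   # LLM's top prediction
    counts = Counter(human_guesses)
    return counts[llm_top] / len(human_guesses)

# Toy example: 10 participants guess the next word in a story.
human_guesses = ["cat"] * 6 + ["ball"] * 3 + ["car"]
llm_probs = {"cat": 0.55, "ball": 0.30, "car": 0.05, "stick": 0.10}

print(top1_agreement(human_guesses, llm_probs))  # 0.6
```

In practice one would compute a score like this at every predictable position in a story, separately for each sensory condition.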
While NLP models generally align with how humans represent language, we found that this is not always the case.
Neural representations of concrete concepts were more similar than those of abstract concepts, despite spanning larger distances in semantic space!
7/9
September 15, 2023 at 3:57 PM
This result was surprising to us! Why do concrete concepts show more reliable representations than abstract concepts?
We used NLP tools to cluster concrete & abstract concepts in semantic space and compared neural representations of these “concept clusters” across stories.
6/9
September 15, 2023 at 3:57 PM
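The clustering step described in this post can be sketched roughly as below. This is an assumption-laden illustration, not the authors' method: the word embeddings and concreteness labels are random placeholders, and the tiny k-means here stands in for whatever NLP tooling was actually used.

```python
# Hypothetical sketch of forming "concept clusters": a minimal k-means over
# word-embedding vectors, run separately for concrete and abstract words.
# Embeddings and concreteness labels are random stand-ins, not real data.
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(1)
embeddings = rng.normal(size=(40, 8))      # stand-in word vectors
is_concrete = rng.random(40) > 0.5         # stand-in concreteness split

concrete_clusters = kmeans(embeddings[is_concrete], k=3)
abstract_clusters = kmeans(embeddings[~is_concrete], k=3)
print(sorted(set(concrete_clusters.tolist())))
```

Neural responses to the words in each cluster could then be averaged and compared across stories.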
Are concrete and abstract concepts contributing equally to this reliability?
By dichotomizing the continuous concrete-abstract axis, we show that concrete concepts exhibit more stable patterns of representation than abstract concepts.
5/9
September 15, 2023 at 3:57 PM
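One way to picture the dichotomization and stability comparison is the sketch below. Everything here is simulated under an assumed setup: concreteness ratings are random, "neural patterns" are synthetic vectors where only concrete words carry a shared signal across stories, and across-story correlation stands in for whatever stability measure was actually used.

```python
# Hypothetical sketch: median-split words on a continuous concreteness
# rating, then compare across-story stability of each group's response
# pattern via correlation. All values are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_words = 50
concreteness = rng.random(n_words)                 # stand-in ratings in [0, 1]
concrete = concreteness > np.median(concreteness)  # dichotomize the axis

# simulated responses to the same words in two different stories; only
# concrete words carry a shared signal, so their patterns should replicate
signal = rng.normal(size=n_words) * concrete
story1 = signal + 0.3 * rng.normal(size=n_words)
story2 = signal + 0.3 * rng.normal(size=n_words)

def stability(mask):
    """Across-story correlation of responses restricted to one word group."""
    return np.corrcoef(story1[mask], story2[mask])[0, 1]

print(stability(concrete), stability(~concrete))
```

In this toy setup the concrete group's correlation comes out higher, mirroring the qualitative result in the post.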
These patterns of representations were not only reliable, but also unique to individual subjects.
Using a fingerprinting approach, we demonstrate the ability to identify subjects solely based on their representations of concrete and abstract concepts!
4/9
September 15, 2023 at 3:56 PM
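The fingerprinting idea can be sketched like this. It is a toy illustration under assumed data, not the paper's analysis: each subject gets one synthetic concept-representation vector per story, and a subject counts as "identified" when their story-2 vector correlates most strongly with their own story-1 vector.

```python
# Hypothetical fingerprinting sketch: identify subjects by matching each
# subject's story-2 representation to the most-correlated story-1
# representation. Vectors are random stand-ins with a per-subject component.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, dim = 10, 60
identity = rng.normal(size=(n_subjects, dim))          # subject-specific part
story1 = identity + 0.5 * rng.normal(size=(n_subjects, dim))
story2 = identity + 0.5 * rng.normal(size=(n_subjects, dim))

def fingerprint_accuracy(a, b):
    """Fraction of subjects whose vector in b best matches their own in a."""
    sims = np.corrcoef(a, b)[:len(a), len(a):]   # cross-block of correlations
    return np.mean(np.argmax(sims, axis=1) == np.arange(len(a)))

print(fingerprint_accuracy(story1, story2))
```

With a strong subject-specific component, accuracy in this simulation is near perfect; with real neural data the interesting question is how far above chance (1/n_subjects) it lands.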
Across stories, we found that an axis of concrete and abstract concepts evoked highly reliable neural representations — more reliable than other semantic properties.
3/9
September 15, 2023 at 3:56 PM
We compared the neural representations of semantic properties across diverse stories to identify properties of language that 1) varied across the population and 2) were reliable within individuals across experiences.
2/9
September 15, 2023 at 3:55 PM