Rolando Bonandrini
@rolandobonandrini.bsky.social
Cognitive Neuroscientist. 💬🧠 Post-doctoral fellow @unimib. I love brains, words, and words that don't exist. I also write songs, sometimes.
Reposted by Rolando Bonandrini
How does the brain handle semantic composition?

Our new Cerebral Cortex paper shows that the left inferior frontal gyrus (BA45) performs it automatically, even when it is task-irrelevant. We used fMRI + computational models.

Congrats Marco Ciapparelli, Marco Marelli & team!

doi.org/10.1093/cerc...
Compositionality in the semantic network: a model-driven representational similarity analysis
Abstract. Semantic composition allows us to construct complex meanings (e.g., “dog house”, “house dog”) from simpler constituents (“dog”, “house”). Neuroim…
academic.oup.com
October 31, 2025 at 6:19 AM
Reposted by Rolando Bonandrini
For those who couldn't attend, the recording of Prof. Harald Baayen's seminar on morphological productivity and the Discriminative Lexicon Model is now available on our YouTube channel.

Watch the full presentation here:
www.youtube.com/watch?v=zN7G...
The Computational Approach to Morphological Productivity | Harald Baayen at Bicocca
YouTube video by Mbs Vector Space Lab
www.youtube.com
September 9, 2025 at 10:45 AM
Reposted by Rolando Bonandrini
📢 Upcoming Seminar!

A computational approach to morphological productivity using the Discriminative Lexicon Model
Professor Harald Baayen (University of Tübingen, Germany)

🗓️ September 8, 2025
2:00 PM - 3:30 PM
📍 UniMiB, Room U6-01C, Milan
🔗 Join remotely: meet.google.com/dkj-kzmw-vzt
August 25, 2025 at 12:52 PM
Reposted by Rolando Bonandrini
I'm sharing a Colab notebook on using large language models for cognitive science! GitHub repo: github.com/MarcoCiappar...

It's geared toward psychologists & linguists and covers extracting embeddings, computing predictability measures, and comparing models across languages & modalities (vision). See examples 🧵
July 18, 2025 at 1:40 PM
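The notebook itself is the place to go for the full workflow; as a rough illustration of what a "predictability measure" looks like, here is a minimal pure-Python sketch of word surprisal computed from next-word logits (the vocabulary and logit values below are invented for the example, not taken from the notebook):

```python
import math

def softmax(logits):
    """Convert raw model logits to a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def surprisal(logits, target_index):
    """Surprisal (in bits) of the target word under the model's next-word distribution."""
    p = softmax(logits)[target_index]
    return -math.log2(p)

# Toy vocabulary with hypothetical next-word logits from a language model
vocab = ["dog", "house", "runs", "the"]
logits = [2.0, 0.5, 1.0, -1.0]

# A word the model expects more strongly gets lower surprisal
print(surprisal(logits, vocab.index("dog")) < surprisal(logits, vocab.index("the")))
```

In a real setting, the logits would come from a language model's output layer at a given position; surprisal is a standard predictor of human reading times in psycholinguistics.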
New chapter! How can we estimate the meaning of words we have never seen before? Hint: take the words you know, shatter them into little pieces and use these pieces to get the meaning of something new.
March 24, 2025 at 6:44 PM
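The "shatter words into little pieces" idea echoes subword-based embedding models in the fastText family: a never-seen word inherits a vector from the character n-grams it shares with known words. A toy sketch of that mechanism (the n-gram vectors here are randomly invented stand-ins, not learned embeddings):

```python
import random

def char_ngrams(word, n=3):
    """Break a word into overlapping character n-grams, with boundary markers."""
    padded = f"<{word}>"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

def embed(word, ngram_vectors, dim=4):
    """Estimate a word's vector by averaging the vectors of its known n-grams."""
    pieces = [ngram_vectors[g] for g in char_ngrams(word) if g in ngram_vectors]
    if not pieces:
        return [0.0] * dim  # no overlap with anything known
    return [sum(vals) / len(pieces) for vals in zip(*pieces)]

# Build hypothetical n-gram vectors from a few "known" words
random.seed(0)
known = ["dog", "house", "doghouse"]
ngram_vectors = {}
for w in known:
    for g in char_ngrams(w):
        ngram_vectors.setdefault(g, [random.uniform(-1, 1) for _ in range(4)])

# A novel word like "dogs" shares pieces with "dog", so it gets a nonzero estimate
print(embed("dogs", ngram_vectors))
```

Real models learn the n-gram vectors jointly with word vectors on large corpora; the point of the sketch is only the composition step, where pieces of familiar words yield an estimate for an unfamiliar one.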