James Trujillo
jamestrujillo.bsky.social
Assistant prof at University of Amsterdam using motion tracking and virtual agents to study multimodal language and social interaction.
Autistic and interested in autistic language and interaction.
Dad/papa, artist, runner, gamer.
It was very inspiring to join the other two shortlisted candidates for the Leo Waaijers award and represent the team in Groningen today. @wimpouw.bsky.social @sarkadava.bsky.social @babajideowoyele.bsky.social Big congrats to winner Chris Hartgerink! #OpenScience #NOSF2025
October 24, 2025 at 4:57 PM
On my way back from my last conference/event of the academic year! It's been so fun to meet so many great people doing cool stuff. I certainly have inspiration to last the summer! Thanks for organizing this @adw.bsky.social , and for letting me pick your brain about affordances for 3 days! #EWEP16
July 18, 2025 at 6:19 PM
A couple of days late to post this, but this is one of the great things about in-person conferences: getting to hang out with people in very different, non-work settings! Had a great time on Sunday canoeing in Utrecht with @ashleydemarchena.bsky.social , whom I normally only see on Zoom!
July 15, 2025 at 12:11 PM
What a great time at @isgs2025.bsky.social ! So great to reconnect and meet so many people doing really cool work. And always nice to be back in Nijmegen 😁 Already excited for ISGS11 in Hong Kong!
July 12, 2025 at 7:38 AM
We also see a reduction in activity in key areas (MTG, TPJ, IFG) when the preceding context is more informative to the current sentence. However, this effect is less prominent when there are more gestures present.
April 4, 2025 at 10:40 AM
I won't go into all the details of brain activations here, but a couple of interesting highlights: mPFC was only activated by contextualized information content (updating based on longer time-scale predictions?), while IFG and OT were activated by both. But only when there were few gestures!
April 4, 2025 at 10:40 AM
We used these measures to disentangle the role of context in reducing surprisal, and calculated them at a sentence level, rather than word level. By also taking into account the number of spontaneous gestures in each sentence, we could look at the interaction of these factors on brain response.
April 4, 2025 at 10:40 AM
As expected, we found that linguistic information (both surprisal and entropy) was higher in the first half of the utterance. And we found the same for the number of visual signals! So this suggests that there's heavy multimodal front-loading of information in spoken Dutch.
February 3, 2025 at 10:05 AM
We analyzed a corpus of face-to-face conversations in Dutch, measuring linguistic information using surprisal and next-word entropy, as well as the number and duration of facial signals and manual gestures. We split each utterance in half, and looked at linguistic info and visual signals per half.
February 3, 2025 at 10:05 AM
The Dutch government wants to cut €1 billion from higher education. 20,000 people marched in The Hague today to say NO to this. Very proud to see so many UvA colleagues joining in!
November 25, 2024 at 3:36 PM
It is such an honor to be included in this fantastic new Handbook of Gesture Studies! And really wild to see my name listed among so many amazing people whose work has inspired me
April 24, 2024 at 10:08 AM