Marlijn ter Bekke
@marlijnterbekke.bsky.social
language/neuroscientist (she/her) studying multimodal communication during face-to-face conversation @dondersinst.bsky.social | https://marlijnterbekke.nl
Pinned
Hora est! 🎓 On Friday I successfully defended my PhD thesis "On how gestures facilitate prediction and fast responding during conversation", which can be found here (lnkd.in/eewcJYyt). It was an absolutely fantastic day! 😍
Tips for resources would be really appreciated! 😊
Supervising a Psychology student creating an online #mousetracking experiment with #jsPsych - showing videos and asking participants to choose left or right.

If you’ve worked on something similar and are happy to share a code template or know of good resources, we’d really appreciate it! 🙂
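For anyone searching for the same thing later, here is a minimal, untested sketch of how such a trial might be set up in jsPsych 7+, combining the official video-button-response plugin with the mouse-tracking extension. The imports are the standard jsPsych packages; the stimulus path, output file name, and settings are placeholder assumptions, not a vetted template:

```typescript
// Sketch only: a single video trial with Left/Right buttons and cursor tracking.
import { initJsPsych } from "jspsych";
import videoButtonResponse from "@jspsych/plugin-video-button-response";
import mouseTracking from "@jspsych/extension-mouse-tracking";

const jsPsych = initJsPsych({
  // Register the extension once; it is enabled per trial below.
  extensions: [{ type: mouseTracking }],
  // Placeholder: dump all data to a local CSV when the session ends.
  on_finish: () => jsPsych.data.get().localSave("csv", "mousetracking_data.csv"),
});

const videoChoiceTrial = {
  type: videoButtonResponse,
  stimulus: ["stimuli/example_clip.mp4"], // placeholder video file
  choices: ["Left", "Right"],             // two on-screen response buttons
  response_allowed_while_playing: false,  // respond only after the clip ends
  // Enabling the extension here appends cursor samples to this trial's data.
  extensions: [{ type: mouseTracking }],
};

jsPsych.run([videoChoiceTrial]);
```

In a full experiment the trial would usually be generated from a stimulus list with timeline variables, and the cursor samples recorded per trial can then be exported for trajectory analysis.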
October 31, 2025 at 11:32 AM
Reposted by Marlijn ter Bekke
Delighted to share our new paper, now out in PNAS! www.pnas.org/doi/10.1073/...

"Hierarchical dynamic coding coordinates speech comprehension in the brain"

with dream team @alecmarantz.bsky.social, @davidpoeppel.bsky.social, @jeanremiking.bsky.social

Summary 👇

1/8
October 22, 2025 at 5:21 AM
Happy with this popular science article about our work (with @lindadrijvers.bsky.social and @judithholler.bsky.social), showing that listeners use co-speech hand gestures to predict upcoming meaning ✋ www.medscape.com/viewarticle/...
What Using Our Hands While Speaking Reveals About Our Brains
Researchers now have fascinating insights into how our brains function while we communicate — and our hand gestures hold the key.
www.medscape.com
October 15, 2025 at 8:12 AM
Reposted by Marlijn ter Bekke
🚨NEW PAPER🚨 Prior expectations guide multisensory integration during face-to-face communication

Out now in @plos.org Computational Biology
journals.plos.org/ploscompbiol...
w/ @giuliamz.bsky.social et al. @cimecunitrento.bsky.social

🧠🧪 #psychscisky #neuroskyence
TL;DR 🧵👇
September 24, 2025 at 7:39 AM
Reposted by Marlijn ter Bekke
👀 👂 How does the brain merge what we hear & see? @lindadrijvers.bsky.social got an ERC Starting Grant (≈ €1.5M) for HANDWAVE, studying how we integrate audiovisual signals.

Vital for understanding language disorders & improving diagnostics.👇

www.ru.nl/en/research/...
ERC Starting Grants for research into language, money circulation and medieval songs | Radboud University
Three researchers at Radboud University will receive a Starting Grant from the European Research Council (ERC). They will receive a grant of roughly 1.5 million euros.
www.ru.nl
September 4, 2025 at 2:55 PM
Reposted by Marlijn ter Bekke
New issue of ROLSI out online: a Special Issue of articles exploring the use of coding in conversation analysis

www.tandfonline.com/toc/hrls20/c...
Research on Language and Social Interaction
Counting practices and actions: Coding social interaction. Volume 58, Issue 3 of Research on Language and Social Interaction
www.tandfonline.com
September 9, 2025 at 11:17 AM
Reposted by Marlijn ter Bekke
The articles in the Special Issue on Coding, now out online:
September 9, 2025 at 11:19 AM
Reposted by Marlijn ter Bekke
How do our brains and bodies support social learning in real time?

We take an ecological and multimodal neuroscience approach to study mutual prediction and social coordination when learning with others.

It took 5 full years for this one! Full open-access pre-print: osf.io/preprints/ps...
July 23, 2025 at 10:31 AM
I had a lot of fun last week organising @isgs2025.bsky.social in my role as chair of the local support team! Thanks to all attendees and organizers for making the conference such a great success 🫶
July 16, 2025 at 8:49 AM
Reposted by Marlijn ter Bekke
🌟The closing session of #ISGS2025 has officially concluded—thank you to all our speakers, presenters and attendees!

See you at ISGS11 in Hong Kong 🌟
July 11, 2025 at 4:29 PM
Reposted by Marlijn ter Bekke
📢 We will announce the winner of the Student Poster Prize at the closing session of #ISGS2025 on Friday 🏆

Don’t miss it — join us as we celebrate outstanding student research! ✨

Keep an eye on our website to stay in the loop and catch all the updates www.isgs10.nl/home
ISGS10
10th Conference of the ISGS 9-11 July 2025 Nijmegen, The Netherlands
www.isgs10.nl
July 7, 2025 at 10:10 AM
Reposted by Marlijn ter Bekke
We're hiring! Join us to work at the intersection of social interaction and language technology. Postdoc and PhD positions in my Futures of Language research group, based at Radboud University in Nijmegen, NL

Read more: markdingemanse.net/futures/news...

#linguistics #interaction #sts #emca #hci
July 7, 2025 at 11:44 AM
Reposted by Marlijn ter Bekke
📢 We are thrilled to announce that the full program for #ISGS2025 is now available on our website: isgs10.nl

✨Explore the detailed schedule featuring the latest advancements in multimodal language research ✨
isgs10.nl
June 13, 2025 at 3:47 PM
🚨 Great PhD opportunity to work with James McQueen and Orhun Ulusahin on the NWO-funded project ‘Plugging talkers in: A new solution to the variability problem in human speech recognition’! Deadline 13th of June 📅

www.ru.nl/en/working-a...
PhD Position at the Donders Centre for Cognition: Talker Variability in Speech Recognition | Radboud University
Do you want to work as a PhD candidate at the Faculty of Social Sciences? Check our vacancy!
www.ru.nl
June 5, 2025 at 1:23 PM
Reposted by Marlijn ter Bekke
NEW PREPRINT
Iconic gestures are produced before the associated words and support prediction, but how does this work in children learning words?
Check out our new paper led by @marinewang.bsky.social @eddonnellan.bsky.social: osf.io/preprints/ps...
May 15, 2025 at 9:11 AM
Reposted by Marlijn ter Bekke
Co-speech hand gestures are used to predict upcoming meaning. Final version by @marlijnterbekke.bsky.social , @lindadrijvers.bsky.social & @judithholler.bsky.social
doi.org/10.1177/09567976251331041
May 8, 2025 at 10:19 AM
Reposted by Marlijn ter Bekke
You do not need to be Dutch to sign. If you simply care to preserve programs that have been leaders in rigor and reform in psychology, then signal your support for them to continue to thrive.
An open letter supporting the international bachelor’s psychology programs threatened for cuts. Proceeding with these cuts would damage some of the most important and impactful psychology departments globally. #supportdutchpsychology

openletter.earth/against-lang...
Against Language Barriers: A Call to Protect International Education in Dutch Academia
openletter.earth
April 29, 2025 at 11:51 AM
Reposted by Marlijn ter Bekke
Interested in the evolution of human language and in understanding the pressures that shape languages today? Come do a PhD with me! Fully funded PhD position available in my group @mpi-nl.bsky.social, application deadline June 2nd:
www.mpi.nl/career-educa...
Fully funded 4-year PhD position in Language Evolution | Max Planck Institute for Psycholinguistics
www.mpi.nl
April 30, 2025 at 8:29 AM
Reposted by Marlijn ter Bekke
Dyadic differences in empathy scores are associated with kinematic similarity during conversational question-answer pairs. Final version by @jamestrujillo.bsky.social , Rebecca M. K. Dyer & @judithholler.bsky.social
doi.org/10.1080/0163853X.2025.2467605
Dyadic differences in empathy scores are associated with kinematic similarity during conversational question–answer pairs
During conversation, speakers coordinate and synergize their behaviors at multiple levels, and in different ways. The extent to which individuals converge or diverge in their behaviors during inter...
doi.org
April 28, 2025 at 1:05 PM
Reposted by Marlijn ter Bekke
Multimodal information density is highest in question beginnings, and early entropy is associated with fewer but longer visual signals. Final version by @jamestrujillo.bsky.social & @judithholler.bsky.social
doi.org/10.1080/0163853X.2024.2413314
Multimodal information density is highest in question beginnings, and early entropy is associated with fewer but longer visual signals
When engaged in spoken conversation, speakers convey meaning using both speech and visual signals, such as facial expressions and manual gestures. An important question is how information is distri...
doi.org
April 28, 2025 at 1:19 PM
Reposted by Marlijn ter Bekke
If anyone is interested in using WhisperX to transcribe speech, this tutorial is for you!! It provides an easy-to-use pipeline that produces a time-aligned transcript as a Praat TextGrid file, a TSV file, and a subtitle file🙌
github.com/ShoAkamine/w...
GitHub - ShoAkamine/whisperx_tutorial
github.com
April 25, 2025 at 9:05 AM
Reposted by Marlijn ter Bekke
In face-to-face conversations, speakers use hand movements to signal meaning. But do listeners actually use these gestures to predict what someone might say next? A study using virtual avatars showed that listeners used the avatar’s gestures to predict upcoming speech. www.mpi.nl/news/listene...
April 25, 2025 at 3:25 PM
Reposted by Marlijn ter Bekke
Co-speech hand gestures are used to predict upcoming meaning. New paper by @marlijnterbekke.bsky.social , @lindadrijvers.bsky.social & @judithholler.bsky.social
doi.org/10.1177/09567976251331041
April 24, 2025 at 11:48 AM
Reposted by Marlijn ter Bekke
Co-Speech Hand Gestures Are Used to Predict Upcoming Meaning journals.sagepub.com/doi/10.1177/... A Cloze experiment showed that gestures improved explicit predictions of upcoming target words. An EEG experiment showed that gestures reduced alpha & beta power - indicating anticipation
April 22, 2025 at 10:50 PM
🎉 New paper out today! We used virtual avatars and EEG to show that listeners use meaningful gestures to predict what someone might say next 🧠👋

Read it here in Psychological Science 👉 doi.org/10.1177/0956..., with @lindadrijvers.bsky.social and @judithholler.bsky.social
Co-Speech Hand Gestures Are Used to Predict Upcoming Meaning - Marlijn ter Bekke, Linda Drijvers, Judith Holler, 2025
In face-to-face conversation, people use speech and gesture to convey meaning. Seeing gestures alongside speech facilitates comprehenders’ language processing, ...
doi.org
April 23, 2025 at 9:39 AM