Félix Bigand
@felixbigand.bsky.social
Postdoc @IITalk | Neuroscience, movement, music and dance
Neuroscience of Perception & Action Lab, Rome
Inspired by all the brilliant work on auditory and visual processing from groups at @cnspworkshop.bsky.social
May 22, 2025 at 11:29 AM
This thread is also a perfect time to share this recent spot-on piece by @escross.bsky.social — and I'm grateful to see our dancing brain featured! Couldn’t agree more: thrilled to see dance stepping into the neuroscience spotlight!

More to come…!
💃🧠 Psyched to share my latest paper in Neuron on the past, present and future of dance neuroscience research! Crazy to think where this field will be in 20 more years 🚀

authors.elsevier.com/a/1kkh3_KOmx...

#PsychSciSky #neuroskyence #Dance #CognitiveNeuroscience #Neuroaesthetics #BrainScience
May 22, 2025 at 11:24 AM
My dear partners in crime: @robertabianco.bsky.social, @saraabalde, @trinhnguyen.bsky.social and @giacomonovembre.bsky.social :)
May 22, 2025 at 10:30 AM
These findings show how computational neuroscience can reveal the brain mechanisms behind real-world movement and social behavior — helping us understand how the brain supports dynamic, interactive moments.
May 22, 2025 at 10:30 AM
This brain result echoes our earlier behavioral finding: that bounce might play a key role in how we naturally sync up with others when we dance!

(see our Current Biology paper: www.cell.com/current-biol...)
May 22, 2025 at 10:30 AM
Finally, we found one dance move that stands out: vertical bounce!

Using PCA, we found that bounce explains over 80% of the brain responses tied to both moving and watching a partner. This is striking, given that bounce accounts for less than 1% of the kinematic variance!
May 22, 2025 at 10:30 AM
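For the curious, here is a toy sketch of that logic (not our actual analysis; the data, the number of markers, and the "bounce" column are all made up): a single movement component can account for a tiny share of the kinematic variance yet dominate the variance of the predicted brain responses.

```python
# Illustrative only: toy data standing in for full-body kinematics and
# TRF-predicted EEG. Column 0 plays the role of vertical bounce.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_markers, n_channels = 6000, 60, 32

kinematics = rng.standard_normal((n_samples, n_markers))
bounce = kinematics[:, 0]

def variance_explained(component, data):
    """Share of total variance in `data` captured by one standardized component."""
    comp = (component - component.mean()) / component.std()
    coeffs = comp @ data / len(comp)            # per-column regression weights
    return np.sum(coeffs ** 2) / data.var(axis=0).sum()

# Bounce is just 1 of 60 kinematic dimensions here (~1-2% of the variance)...
print(f"kinematic variance explained: {variance_explained(bounce, kinematics):.1%}")

# ...but if the predicted brain responses are driven mostly by bounce,
# the very same component explains most of the neural-response variance.
predicted_eeg = np.outer(bounce, rng.standard_normal(n_channels))
predicted_eeg += 0.3 * rng.standard_normal((n_samples, n_channels))
print(f"neural variance explained: {variance_explained(bounce, predicted_eeg):.1%}")
```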
We also uncovered a new brain signal for social coordination! 🧠👯

This signal tracks how well dancers move in sync, beyond just reacting to one's own or one's partner's movements. It shows up when partners make eye contact, originates in visual areas, and is driven by watching (not initiating) movement.
May 22, 2025 at 10:30 AM
We linked the mTRF results to well-known brain signals using classic ERP analysis. The first three processes reflect known ERPs:

(I) frontotemporal P50-N100-P200 for sound,
(II) central-lateralized motor potentials for movement initiation, and
(III) occipital N170 for movement observation.
May 22, 2025 at 10:30 AM
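If you're curious what "classic ERP analysis" looks like in practice, here is a minimal, self-contained sketch (toy EEG and hypothetical event onsets, not our pipeline): cut the continuous signal into epochs around each event, baseline-correct, and average.

```python
# Toy ERP sketch: epoch continuous EEG around event onsets and average.
# The EEG and the onsets are random placeholders, not real data.
import numpy as np

fs = 100                                           # Hz, assumed sampling rate
rng = np.random.default_rng(2)
eeg = rng.standard_normal((60 * fs, 64))           # 60 s of 64-channel toy EEG
onsets = rng.integers(fs, 59 * fs, size=200)       # hypothetical event samples

pre, post = int(0.1 * fs), int(0.5 * fs)           # -100 ms to +500 ms window
epochs = np.stack([eeg[t - pre:t + post] for t in onsets])

baseline = epochs[:, :pre].mean(axis=1, keepdims=True)   # mean of pre-event window
erp = (epochs - baseline).mean(axis=0)                   # average over events
print(erp.shape)                                         # (60, 64): time x channels
```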
mTRFs teased apart four key processes:

(I) auditory tracking of music,
(II) control of self-generated movements,
(III) visual monitoring of partner movements, and
(IV) visual tracking of social coordination accuracy.

Importantly, these are all independent of eye, face and neck muscle activity!
May 22, 2025 at 10:30 AM
We recorded EEG, 3D full-body kinematics, EOG, and EMG signals from 80 participants freely dancing in pairs to music. We then used advanced denoising techniques and multivariate temporal response functions (mTRFs) to tease apart neural signals related to music, movement, and social partners.
May 22, 2025 at 10:30 AM
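For the curious, here is a minimal sketch of the kind of model an mTRF analysis fits: ridge regression from time-lagged features to each EEG channel. This is not our actual pipeline; the toy data, lag range, and regularization below are placeholder assumptions, and in practice tools like the mTRF-Toolbox or MNE-Python's ReceptiveField handle this.

```python
# Minimal mTRF sketch (placeholder data, not the paper's analysis):
# ridge regression mapping time-lagged features to each EEG channel.
import numpy as np
from sklearn.linear_model import Ridge

fs = 100
rng = np.random.default_rng(0)
n_samples = 60 * fs
lags = np.arange(-10, 40)          # -100 ms to +390 ms, in samples

# Hypothetical regressors: music envelope, own bounce, partner bounce,
# and a moment-to-moment coordination-accuracy signal.
features = rng.standard_normal((n_samples, 4))
eeg = rng.standard_normal((n_samples, 64))

def lagged_design(X, lags):
    """Stack time-shifted copies of every feature column."""
    n, d = X.shape
    out = np.zeros((n, len(lags) * d))
    for j, lag in enumerate(lags):
        shifted = np.roll(X, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0      # zero out samples that wrapped around
        elif lag < 0:
            shifted[lag:] = 0
        out[:, j * d:(j + 1) * d] = shifted
    return out

model = Ridge(alpha=1.0).fit(lagged_design(features, lags), eeg)
# Weights reshaped to (n_lags, n_features, n_channels) are the mTRFs:
# one response function per feature and channel, resolved over time lags.
trf = model.coef_.T.reshape(len(lags), features.shape[1], eeg.shape[1])
print(trf.shape)                   # (50, 4, 64)
```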
Thanks :)
December 3, 2024 at 2:03 PM
Hi! Thanks for this, I'd be happy to be in :) I'm working on hyperscanning and dance
November 28, 2024 at 12:21 PM
Hi Marta, thanks for putting this together! I'd be happy to be added.

interested in (and working on) dance, music and movement, combining neuroimaging and 3D kinematics methods!
November 25, 2024 at 9:02 PM
Thanks for this! :) I'd be happy to be on the list (I'm studying dance and music)
November 18, 2024 at 1:19 PM
Thanks!
November 17, 2024 at 2:27 PM
Thanks so much for this, and hi everyone! Would love to be added too!
November 17, 2024 at 11:51 AM