Mattia Rosso
@mattiarosso.bsky.social
Postdoctoral researcher @ Center for Music in the Brain (MIB) | Brain network analysis, neurophysiology, interpersonal coordination | Signal processing, multivariate data analysis, scientific programming.
8/n - These results offer a window into how our brains support social interaction by integrating the other into one's own sensorimotor schema via frequency-specific dynamics.

Looking forward to presenting this work at #JointActionMeeting2025 in my hometown (Turin, IT)!
May 7, 2025 at 2:04 PM
7/n - This suggests a functional dissociation:

Neural entrainment tracks a partner’s rhythm more generally

Beta modulation underlies deeper self–other integration, supporting a shared body schema for action
May 7, 2025 at 2:02 PM
6/n - But here’s the key result: only beta modulation was selectively enhanced in the 1P condition! That is, when the participant experienced their partner’s hand as their own.
May 7, 2025 at 2:01 PM
4/n - What did we find? In visually coupled conditions (1P and 2P), we observed both:

- Neural entrainment (low-frequency convergence across brains)

- Beta modulation (~20 Hz signals tied to perceiving partner movements)
May 7, 2025 at 2:00 PM
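The thread does not spell out how the low-frequency convergence across brains in 4/n was quantified; one common measure is the inter-brain phase-locking value (PLV) between band-limited signals. A minimal Python sketch, with hypothetical band limits and inputs rather than the study's actual pipeline:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def interbrain_plv(x_a, x_b, fs, band=(1.0, 3.0)):
    """Phase-locking value between two participants' signals in a low-frequency band.

    x_a, x_b : 1-D time courses, one per participant (e.g., an EEG channel or component)
    fs       : sampling rate in Hz
    band     : hypothetical band around the tapping rate, in Hz
    """
    sos = butter(2, band, btype="band", fs=fs, output="sos")
    phase_a = np.angle(hilbert(sosfiltfilt(sos, x_a)))
    phase_b = np.angle(hilbert(sosfiltfilt(sos, x_b)))
    # PLV = length of the mean unit vector of the phase difference (0 = none, 1 = perfect locking)
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b))))
```

A PLV near 1 means the two phase series stay locked across the trial; beta modulation, by contrast, would typically be read out from ~20 Hz power changes rather than phase alignment.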
3/n - In our study, we used immersive virtual reality to create a body-swap illusion. Participants saw either their own or their partner’s hand from a 1P or 2P (natural) perspective while tapping rhythmically in pairs.
May 7, 2025 at 1:58 PM
2/n - Previous studies show that adopting a first-person (1P) perspective of your partner, namely seeing from their point of view, improves coordination. But what does this mean for the brain? Can we measure the neural dynamics that support this integration?
May 7, 2025 at 1:57 PM
1/n - Temporal coordination is crucial to everything from conversation to music-making — yet the brain mechanisms behind it remain surprisingly unclear. One key idea: we synchronize by integrating motor information from the other person into our own sensorimotor framework.
May 7, 2025 at 1:56 PM
5/
🛠️ FREQ-NESS has been a long time in the making.
If you’re interested in exploring frequency-resolved brain networks in your own data — check out the toolbox and documentation here:
👉 shorturl.at/mOVKF

Feel free to reach out for clarifications or collaborations. #OpenScience #Toolbox
GitHub - mattiaRosso92/Frequency-resolved_brain_network_estimation_via_source_separation_FREQ-NESS
April 23, 2025 at 10:11 AM
4/
Having separated the networks, we also tracked cross-frequency coupling (CFC) between them.
During passive listening to the metronome, the phase of the low-frequency (2.4 Hz) auditory networks selectively modulates gamma-band amplitude in more distributed medial temporal networks.
April 23, 2025 at 10:08 AM
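A minimal sketch of the kind of cross-frequency coupling described in 4/: the phase of a slow (2.4 Hz) component modulating the gamma-band amplitude envelope of another component, summarized here with the mean-vector-length modulation index (Canolty et al., 2006). Function name, bands, and filter settings are illustrative assumptions, not the FREQ-NESS implementation:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def pac_modulation_index(slow, fast, fs, f_phase=(1.9, 2.9), f_amp=(60.0, 90.0)):
    """Mean-vector-length phase-amplitude coupling (Canolty et al., 2006).

    slow : time course of the low-frequency (e.g., 2.4 Hz auditory) network
    fast : time course of the network whose gamma amplitude may be modulated
    fs   : sampling rate in Hz
    """
    def bandpass(x, band):
        sos = butter(2, band, btype="band", fs=fs, output="sos")
        return sosfiltfilt(sos, x)

    # Phase of the slow network around the stimulation frequency
    phase = np.angle(hilbert(bandpass(slow, f_phase)))
    # Gamma-band amplitude envelope of the other network
    amp = np.abs(hilbert(bandpass(fast, f_amp)))
    # How strongly gamma amplitude clusters around a preferred slow phase
    return np.abs(np.mean(amp * np.exp(1j * phase)))
```

In practice such an index is usually tested against phase-shuffled surrogates to rule out spurious coupling.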
3/n
🎧 Auditory stimulation reshapes the entire frequency-resolved network landscape:

• EMERGENCE: Attunement to the 2.4 Hz stimulation

• RE-ARRANGEMENT: Spatial shift of alpha from occipital to sensorimotor, spectral shift to high alpha activity

• INVARIANCE: Beta networks remain unchanged
April 23, 2025 at 10:05 AM
2/n
🧘‍♂️ During rest, FREQ-NESS reliably separates well-known resting-state brain networks: the Default Mode Network, the alpha-band parieto-occipital network, and beta-band sensorimotor topographies. Textbook configurations emerge directly from the data, organized by frequency, without predefining regions of interest.
April 23, 2025 at 10:02 AM
1/n 🔍 At the core of FREQ-NESS is Generalized Eigendecomposition (GED), a powerful linear decomposition technique that allows us to separate overlapping neural processes based on their dominant frequency, by contrasting narrowband vs broadband activity. #Neuroscience #FREQNESS
April 23, 2025 at 10:00 AM
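For the gist of that GED step without opening the toolbox, here is a minimal Python sketch of the narrowband-vs-broadband contrast, assuming sensor-level data and a single target frequency. The function name, filter settings, and shrinkage are illustrative choices, not the actual FREQ-NESS code (see the GitHub repository linked in this feed):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, sosfiltfilt

def ged_narrowband(data, fs, f0, half_bw=0.5):
    """Narrowband-vs-broadband GED: extract the component dominated by frequency f0.

    data    : array (n_channels, n_samples) of sensor-level signals
    fs      : sampling rate in Hz
    f0      : target frequency in Hz (e.g., 2.4 for the stimulation rate)
    half_bw : half-bandwidth of the narrowband filter in Hz
    """
    # Narrowband-filter the data around the target frequency
    sos = butter(2, (f0 - half_bw, f0 + half_bw), btype="band", fs=fs, output="sos")
    narrow = sosfiltfilt(sos, data, axis=1)

    # Covariance of narrowband (signal) vs broadband (reference) activity
    S = np.cov(narrow)
    R = np.cov(data)
    # Light shrinkage so the reference covariance stays well-conditioned
    R += 1e-6 * (np.trace(R) / R.shape[0]) * np.eye(R.shape[0])

    # Generalized eigendecomposition: maximize the narrowband / broadband variance ratio
    evals, evecs = eigh(S, R)
    order = np.argsort(evals)[::-1]
    evals, evecs = evals[order], evecs[:, order]

    w = evecs[:, [0]]                  # spatial filter of the top component
    pattern = S @ w                    # its activation pattern (topography)
    timecourse = (w.T @ data).ravel()  # its broadband time course
    return timecourse, pattern, evals
```

Sweeping f0 over a grid of frequencies would give one spatial filter and topography per frequency, i.e. a frequency-resolved set of networks.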
This is the first application to #psychedelics research of our FREQ-NESS framework for brain network analysis.

The method was co-developed by me and Leonardo Bonetti at @musicinthebrain.bsky.social.

🔧 The FREQ-NESS Toolbox is #open-source on GitHub:
shorturl.at/JT6iO
Frequency-resolved_brain_network_estimation_via_source_separation_FREQ-NESS/FREQNESS_Toolbox at main · mattiaRosso92/Frequency-resolved_brain_network_estimation_via_source_separation_FREQ-NESS
March 31, 2025 at 8:54 AM
Hi David! New here :)

I would love to be added to the list.
Thanks in advance.
November 27, 2024 at 11:31 AM