-Assistant Professor @URochester studying naturalistic, interactive human communication and speech/music perception 🧠
-PI of the SoNIC Lab (piazzalab.com)
-BA @Williams | PhD @UCBerkeley | Postdoc @Princeton
-Halfling bard irl 🎶
-she/her 🌈
Led jointly by postdoc Sarah Izen and PhD student Riesa Cassano-Coleman (rcassanocoleman.bsky.social). Come by Riesa's #Psychonomics poster (Saturday 7:45-9:15, #7059), which mainly discusses Expt 3.
You can hear our wacky scrambled music here: osf.io/mej7a/
This is a beast of a study! E.g., Experiment 3 alone uncovers new fundamentals of musical event segmentation (a relatively understudied topic). One takeaway here: musicians are more likely than non-musicians to perceive long-timescale (multi-phrase) events.
November 21, 2024 at 6:57 PM
In general, we show that non-musicians use context quite effectively, which is surprising because this is relatively high-level tonal context (not driven by dynamics/timbre/tempo/pitch proximity). But musicians do perform better overall across tasks (including identifying the degree of scrambling).
November 21, 2024 at 6:57 PM