Andrew Chang
@candrew123.bsky.social
Postdoctoral researcher at NYU, working on computational cognitive neuroscience, audition (music and speech), and real-world communication. 🇹🇼🇨🇦🇺🇸
While spectrogram-based audio DNNs excel at many audio tasks, they’re often bulky, compute-heavy, hard to interpret, and data-hungry.
We explored an alternative: training a DNN on spectrotemporal modulation (#STM) features—an approach inspired by how the human auditory cortex processes sound.
June 2, 2025 at 7:00 PM
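For readers curious what an STM front end can look like in practice, here is a minimal sketch that treats STM features as the 2D modulation spectrum of a log-mel spectrogram, a common formulation, using librosa and NumPy. The file name and parameter choices are illustrative; this is not the paper's exact pipeline.

```python
# Minimal sketch (not the paper's pipeline): spectrotemporal modulation
# features computed as the 2D modulation spectrum of a log-mel spectrogram.
import numpy as np
import librosa

def stm_features(path, sr=16000, n_mels=64):
    """Rough STM features: magnitude of the 2D FFT of a log-mel spectrogram."""
    y, sr = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)  # (mel, time)
    log_mel = np.log(mel + 1e-8)
    # 2D FFT over (frequency, time): the two axes correspond to spectral
    # modulation (scale) and temporal modulation (rate); keep the magnitude.
    mod = np.abs(np.fft.fftshift(np.fft.fft2(log_mel - log_mel.mean())))
    return mod  # could be pooled/cropped before feeding a compact DNN

# features = stm_features("example.wav")  # hypothetical file name
```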
One surprising insight: awkward silences—those long gaps in turn-taking—were more detrimental to conversational fluidity and enjoyment than chaotic overlaps or interruptions.
5/n
March 10, 2025 at 7:24 PM
We used multimodal ML on 100+ person-hours of videoconferences, modeling voice, facial expressions, and body movements. Key result: an ROC-AUC of 0.87 for predicting unfluid and unenjoyable moments and for classifying disruptive events such as gaps and interruptions.
4/n
March 10, 2025 at 7:24 PM
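As a rough illustration of the kind of evaluation reported here (not the study's actual code or data), the sketch below fuses synthetic per-moment features from three modalities, trains a simple classifier, and scores it with ROC-AUC using scikit-learn. All names, feature dimensions, and labels are made up, so the printed score will sit near chance.

```python
# Minimal sketch: early fusion of per-moment multimodal features and
# ROC-AUC scoring of a binary "unfluid moment" classifier. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
voice = rng.normal(size=(n, 20))    # e.g., prosodic features per moment
face = rng.normal(size=(n, 16))     # e.g., facial expression features
body = rng.normal(size=(n, 12))     # e.g., movement energy features
X = np.hstack([voice, face, body])  # simple early fusion
y = rng.integers(0, 2, size=n)      # 1 = unfluid/unenjoyable moment (random here)

probs = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=5, method="predict_proba")[:, 1]
print("ROC-AUC:", roc_auc_score(y, probs))  # ~0.5 on random labels
```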
Videoconferencing has become essential in our professional and personal lives, especially post-pandemic. Yet we've all experienced “derailed” moments, such as awkward pauses and uncoordinated turn-taking, that can make virtual meetings less effective and enjoyable.
3/n
March 10, 2025 at 7:24 PM
The helix model reflects the idea that pitches separated by an octave (e.g., the repeating piano keys) are perceived as inherently similar. This concept was first explored in the early 1900s by Géza Révész, laying the groundwork for modern music cognition! 🧠🎹 6/n
February 19, 2025 at 8:19 PM
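For readers who want the geometry spelled out, one common parameterization of the pitch helix (the thread itself doesn't give equations, so this is just an illustrative formulation) maps a pitch p semitones above a reference to a point whose angle encodes chroma and whose height encodes pitch height:

```latex
% Illustrative pitch-helix parameterization: angle = chroma, height = pitch height.
\[
  \mathbf{x}(p) \;=\; \Bigl( r\cos\tfrac{2\pi p}{12},\;
                             r\sin\tfrac{2\pi p}{12},\;
                             h\,p \Bigr)
\]
% Pitches an octave apart (p and p + 12) share the same angle, i.e., the
% same chroma, and differ only in height h*p, so octave-related pitches
% sit directly above one another on the helix.
```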
The brain doesn’t process pitch in an unstructured way. Typically, it represents pitches in a mostly linear structure—think piano keyboard layout. BUT—just 0.3 seconds after hearing a sound, something wild happens: the brain briefly represents pitch in a helix-like structure! 5/n
February 19, 2025 at 8:19 PM
This animation shows the reconstruction of how the brain dynamically represents musical pitches. Pitches that are closer together in this space are perceived as more similar at a given moment. 4/n
February 19, 2025 at 8:19 PM
We used machine learning to decode how the brain represents musical pitches during an #MEG scan. Our model reconstructed how the brain represents the similarity between different pitches and how this representation changes over time. 3/n
February 19, 2025 at 8:19 PM
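To make the "similarity over time" idea concrete, here is a minimal sketch of a time-resolved representational analysis: at each time point it computes a pitch-by-pitch dissimilarity matrix from synthetic MEG sensor patterns, which could then be projected to low dimensions for the kind of animation shown above. Shapes, the distance metric, and variable names are assumptions, not the study's pipeline.

```python
# Minimal sketch: time-resolved pitch dissimilarity from MEG-like data.
# Synthetic data and illustrative shapes, not the study's code.
import numpy as np
from scipy.spatial.distance import pdist, squareform

n_pitches, n_sensors, n_times = 12, 64, 200
# meg: trial-averaged sensor patterns, shape (pitch, sensor, time)
meg = np.random.default_rng(0).normal(size=(n_pitches, n_sensors, n_times))

rdm_over_time = np.empty((n_times, n_pitches, n_pitches))
for t in range(n_times):
    patterns = meg[:, :, t]  # (pitch, sensor) pattern at time t
    rdm_over_time[t] = squareform(pdist(patterns, metric="correlation"))

# rdm_over_time[t] is the pitch-by-pitch dissimilarity matrix at time t;
# projecting each one to 2D/3D (e.g., with MDS) yields the evolving geometry.
```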
Why does pitch matter? It’s essential not just for music, but for speech perception & sound segregation too! Understanding how our brain dynamically encodes pitch is a major research topic in auditory cognitive neuroscience. 2/n
February 19, 2025 at 8:19 PM
This season of gratitude, I’m especially thankful to be featured as one of four NYU researchers on The Academic Minute and highlighted in the Postdoc Spotlight in the New York University Office of Postdoctoral Affairs’ weekly newsletter.
December 2, 2024 at 11:15 PM
Hello, blue sky!
October 20, 2023 at 11:55 PM