Sam Cheyette
@samcheyette.bsky.social
I study thinking. Postdoc in the CoCoSci lab at MIT.
Dream team: Tracey Mills (@traceym.bsky.social), Nicole Coates, Alessandra Silva (@alessandra-silva.bsky.social), Kaylee Ji, Steve Ferrigno (@sferrigno.bsky.social), Laura Schulz, Josh Tenenbaum.
October 14, 2025 at 4:28 PM
Our results highlight program learning as a powerful, potentially distinctive, and early-emerging ability that humans deploy to learn structure. Our comparative/developmental results also raise many exciting questions—check out our paper for those + some speculations :). osf.io/preprints/ps...
October 14, 2025 at 4:28 PM
We compared several Bayesian learning models to each population's predictions. The main takeaway: children as young as 4 years old showed adult-like program induction on our task, while monkeys and 3-year-olds mostly used local extrapolation.
October 14, 2025 at 4:28 PM
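(A toy illustration of what such a model comparison can look like, not the paper's actual analysis: each candidate model assigns a probability to the continuation a participant actually predicted on each trial, and models are then scored by total log-likelihood. All model names and numbers below are made up.)

```python
import numpy as np

def log_likelihood(model_probs, choices, eps=1e-6):
    """Sum of log-probabilities a model assigns to the continuations
    a participant actually predicted (one row of probabilities per trial)."""
    p = np.clip([model_probs[t][c] for t, c in enumerate(choices)], eps, 1.0)
    return float(np.sum(np.log(p)))

# Hypothetical predictive distributions of two models over 3 candidate
# continuations on 4 trials, plus the continuation chosen on each trial.
program_model = [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.8, 0.1, 0.1], [0.5, 0.4, 0.1]]
linear_model  = [[0.2, 0.7, 0.1], [0.3, 0.6, 0.1], [0.1, 0.8, 0.1], [0.4, 0.5, 0.1]]
choices = [0, 0, 0, 1]  # index of the chosen continuation per trial

print("program induction:", log_likelihood(program_model, choices))
print("local extrapolation:", log_likelihood(linear_model, choices))
```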
In contrast, despite extensive training on algorithmic patterns, the monkeys relied on simpler local linear extrapolation to make their predictions. Interestingly, 3-year-olds mostly used this same strategy, and their accuracy across patterns correlated much more strongly with the monkeys' than with the adults'.
October 14, 2025 at 4:28 PM
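(A rough sketch of the "local linear extrapolation" strategy described above, not the actual model fit to the monkeys: take the most recent displacement(s) in the 2D trajectory and keep going in that direction, ignoring any longer-range algorithmic structure.)

```python
import numpy as np

def local_linear_extrapolation(points, n_future=3, window=2):
    """Continue a 2D trajectory by repeating the average of the last few
    displacements; any longer-range structure in the pattern is ignored."""
    pts = np.asarray(points, dtype=float)
    step = np.mean(np.diff(pts[-window:], axis=0), axis=0)
    return [tuple((pts[-1] + step * (i + 1)).tolist()) for i in range(n_future)]

# On a staircase pattern, the extrapolator just continues the last step (up)
# and misses the turn that the underlying algorithm would make next.
observed = [(0, 0), (1, 0), (2, 0), (2, 1)]
print(local_linear_extrapolation(observed))  # (2,2), (2,3), (2,4)
```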
Consistent with a program-learning account, older children's and adults' initial predictions typically show multimodal uncertainty but converge on the true pattern after only a handful of observations.
October 14, 2025 at 4:28 PM
We found striking differences across both development and species in our task. Below are some examples of predicted continuations on various sequences in each population.
October 14, 2025 at 4:28 PM
If participants are learning structured programs in an expressive “Language of Thought”, they should (1) be able to learn the sequences we tested by the final timepoint, but (2) show patterns of multimodal uncertainty along the way, reflecting the possible algorithms consistent with the sequence so far.
October 14, 2025 at 4:28 PM
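(To make the multimodal-uncertainty prediction concrete, here is a toy Bayesian program-induction sketch over 1D sequences, not the paper's actual grammar or model: enumerate a few candidate rules, keep the ones consistent with the observations so far, and weight them by a simplicity prior. With only two observations, several rules survive and the predictive distribution is multimodal; one more observation typically singles out the true rule.)

```python
# Toy hypothesis space: each rule maps the sequence so far to its next value,
# paired with a prior weight standing in for a simplicity preference.
RULES = {
    "add 2":  (lambda seq: seq[-1] + 2, 1.0),
    "double": (lambda seq: seq[-1] * 2, 1.0),
    "repeat": (lambda seq: seq[-1],     1.0),
    "add 4":  (lambda seq: seq[-1] + 4, 0.5),
}

def consistent(rule_fn, observed):
    """A rule survives if it reproduces every observed transition so far."""
    return all(rule_fn(observed[:i]) == observed[i] for i in range(1, len(observed)))

def posterior_predictive(observed):
    """Renormalized prior over surviving rules, with each rule's next-step prediction."""
    weights = {name: prior for name, (fn, prior) in RULES.items() if consistent(fn, observed)}
    total = sum(weights.values())
    return {name: (weights[name] / total, RULES[name][0](observed)) for name in weights}

print(posterior_predictive([2, 4]))     # "add 2" and "double" both survive: multimodal
print(posterior_predictive([2, 4, 8]))  # only "double" survives: converged
```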
One possibility is that we can “learn by programming”: rapidly inferring structured algorithms to model our observations.

Our paper tests this ability in adults, 3-7yo children, and two rhesus macaques. Participants predicted how 2D sequences would unfold starting from the first few timepoints.
October 14, 2025 at 4:28 PM
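(Purely to give a flavor of "learning by programming", and not the representation used in the paper: the idea is that a 2D pattern is treated as the output of a compact program, so a learner who infers the program from the first few timepoints can unfold the rest of the sequence. Below is a made-up miniature move language.)

```python
# A minimal made-up move language for 2D patterns: a program is a short list
# of unit moves that repeats; running it unfolds a trajectory on a grid.
MOVES = {"R": (1, 0), "L": (-1, 0), "U": (0, 1), "D": (0, -1)}

def run_program(program, start=(0, 0), n_steps=8):
    """Unfold a repeating move program into a sequence of 2D grid positions."""
    x, y = start
    trajectory = [start]
    for t in range(n_steps):
        dx, dy = MOVES[program[t % len(program)]]
        x, y = x + dx, y + dy
        trajectory.append((x, y))
    return trajectory

# "Right, right, up" repeated compactly encodes a staircase; its first few
# timepoints are all a program learner would need to recover the rule.
print(run_program(["R", "R", "U"]))
```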
People seem wired to uncover hidden structure: we pick up the rules of games after a few turns, see figures in clouds and constellations, and riff on songs. What are the computational mechanisms that make this rapid structure learning possible?
October 14, 2025 at 4:28 PM
hopefully my first and last ever mildly political post. from here on out it's memes and papers
December 7, 2024 at 5:17 AM
"this rendition emphasizes her eyes, carefully positioned to create the illusion of a gaze that follows you"
November 19, 2024 at 4:28 AM
November 19, 2024 at 4:24 AM
and ascii mona lisa
November 19, 2024 at 4:23 AM