Steve Rathje
@steverathje.bsky.social
Incoming Assistant Professor of HCI at Carnegie Mellon studying the psychology of technology. NSF postdoc at NYU, PhD from Cambridge, BA from Stanford. stevenrathje.com
📚 Read the full issue here: cell.com/trends/cogni...
📖 Our article here: doi.org/10.1016/j.ti...
📝 And the pre-print here: osf.io/preprints/ps...
October 7, 2025 at 6:28 PM
Thanks! Excited to read your book.
October 2, 2025 at 4:25 AM
Cool! Thank you for sharing!
October 1, 2025 at 6:24 PM
Thank you!!
October 1, 2025 at 4:56 PM
This is still a working paper, so please let us know if you have any feedback!
October 1, 2025 at 3:16 PM
We hope this research informs the creation of AI systems that broaden users’ perspectives instead of reinforcing their biases.
October 1, 2025 at 3:16 PM
The very qualities that are thought to make AI persuasive, such as its ability to provide targeted facts and evidence, may also make it an effective tool for creating elaborate rationalizations of one’s beliefs.
October 1, 2025 at 3:16 PM
While AI chatbots have been lauded for their ability to encourage more accurate viewpoints and debunk misinformation, our work suggests that people may prefer to use AI to marshal evidence in support of their pre-existing beliefs.
October 1, 2025 at 3:16 PM
AI companies may face a tradeoff between creating engaging AI systems that foster echo chambers and creating less engaging AI systems that are healthier for users and public discourse.
October 1, 2025 at 3:16 PM
Altogether, these results suggest that people’s preference for and blindness to sycophantic AI may risk creating AI "echo chambers" that increase polarization and overconfidence.
October 1, 2025 at 3:16 PM
Different dimensions of sycophancy had different effects:
- The one-sided presentation of facts primarily impacted extremity & certainty
- Validation primarily impacted enjoyment & perceptions of bias
October 1, 2025 at 3:16 PM