David Ouyang, MD
@davidouyang.bsky.social
Cardiologist, Data Scientist, AI researcher
EchoPrime is a contrastive vision-language model. Trained on over 12M video-report pairs, it learns the relationship between what cardiologists write and what the ultrasound images show.

If you feed it a video or an echo report, it can retrieve similar historical studies that match those concepts.
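
A minimal sketch of the retrieval idea, assuming embeddings from trained video and text encoders that share one embedding space (the function names and shapes here are placeholders, not EchoPrime's actual API):

    import numpy as np

    def cosine_similarity(a, b):
        # Normalize both sets of embeddings, then take dot products.
        a = a / np.linalg.norm(a, axis=-1, keepdims=True)
        b = b / np.linalg.norm(b, axis=-1, keepdims=True)
        return a @ b.T

    def retrieve_similar_studies(query_embedding, library_embeddings, top_k=5):
        # Rank historical studies by cosine similarity to the query embedding,
        # which can come from either a video or a report snippet.
        sims = cosine_similarity(query_embedding[None, :], library_embeddings)[0]
        return np.argsort(-sims)[:top_k]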
November 17, 2025 at 2:41 AM
We are excited to announce EchoPrime is published in Nature. EchoPrime is the first echocardiography AI model capable of evaluating a full transthoracic echocardiogram study, identifying the most relevant videos, and producing a comprehensive interpretation!
November 17, 2025 at 2:38 AM
Congratulations to Sarnoff Fellow Victoria Yuan on her new preprint 'Automated Deep Learning Pipeline for Characterizing Left Ventricular Diastolic Function'!
May 4, 2025 at 4:12 AM
5/n Preprint: medrxiv.org/content/10.1...
GitHub: github.com/echonet/meas...
Demo: on GitHub.
March 20, 2025 at 10:35 PM
1/n We are thrilled to present EchoNet-Measurements, an open-source, comprehensive AI platform for automated #echofirst measurements.

Trained on 1,414,709 annotations from 155,215 studies of 78,037 patients, this is the most comprehensive #echofirst segmentation model.
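
A minimal sketch of how a per-frame measurement could be derived from a segmentation mask, assuming a hypothetical segment_frame stand-in for the trained network (the real measurement definitions follow the EchoNet-Measurements repo, not this toy geometry):

    import numpy as np

    def measure_structure(video_frames, segment_frame, pixel_spacing_cm=0.05):
        # segment_frame is a hypothetical stand-in for the segmentation network;
        # it should return a boolean H x W mask for one frame. Here the
        # "measurement" is simply the vertical extent of the mask in cm.
        measurements = []
        for frame in video_frames:
            mask = segment_frame(frame)
            rows = np.flatnonzero(mask.any(axis=1))
            if rows.size:
                measurements.append((rows.max() - rows.min() + 1) * pixel_spacing_cm)
        return measurements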
March 20, 2025 at 10:33 PM
Consistency - one of the most useful heuristics for judging whether a medical AI model is good.

I've noticed that AI models trained on small datasets tend to jitter - jumping a lot from frame to frame - while robust models tend to have consistent measurements across the entire video.

Coming soon...
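
As a rough illustration of that heuristic, a simple jitter score over per-frame measurements might look like this (a sketch, not the metric from our papers):

    import numpy as np

    def jitter_score(per_frame_measurements):
        # Mean absolute frame-to-frame change, normalized by the median value.
        # Lower is better: robust models trace smooth, consistent values across
        # a video instead of jumping between frames.
        values = np.asarray(per_frame_measurements, dtype=float)
        if values.size < 2:
            return 0.0
        return float(np.mean(np.abs(np.diff(values))) / (np.median(values) + 1e-8))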
March 5, 2025 at 12:04 AM