David Ouyang, MD
davidouyang.bsky.social
Cardiologist, Data Scientist, AI researcher
EchoPrime is a contrastive vision-language model. Trained on over 12M video-report pairs, it learns the relationship between what cardiologists write and what the ultrasound images show.

If you feed it a video or an echo report, it can retrieve similar historical studies that match those concepts.
November 17, 2025 at 2:41 AM
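The retrieval step described above amounts to nearest-neighbor search in a shared embedding space: videos and report text are embedded, and candidates are ranked by cosine similarity. A minimal sketch with hypothetical pre-computed embeddings (the vectors, dimensions, and study labels are illustrative, not EchoPrime's actual representations):

```python
import numpy as np

def cosine_sim(query, matrix):
    # Cosine similarity between one query vector and each row of a matrix.
    query = query / np.linalg.norm(query)
    matrix = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    return matrix @ query

# Hypothetical embeddings: 4 historical studies in a 3-dim shared space.
study_embeddings = np.array([
    [0.9, 0.1, 0.0],   # study 0: e.g. "severe AR"
    [0.0, 1.0, 0.0],   # study 1: e.g. "normal LVEF"
    [0.8, 0.2, 0.1],   # study 2: e.g. "moderate-severe AR"
    [0.1, 0.0, 1.0],   # study 3: e.g. "HCM with obstruction"
])
query = np.array([0.85, 0.15, 0.05])  # embedding of a new video

scores = cosine_sim(query, study_embeddings)
ranked = np.argsort(scores)[::-1]
print(ranked[:2])  # indices of the two most similar historical studies
```

In practice the gallery holds millions of studies, so approximate nearest-neighbor indexes replace the brute-force matrix product, but the ranking principle is the same.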
EchoPrime is the culmination of a long roadmap for AI echo.

Substantial time and effort went into curating the largest existing echocardiography database.

EchoPrime is trained on 1000x the data of EchoNet-Dynamic, our first model for AI-LVEF, and 10x the data of existing AI echo models.
November 17, 2025 at 2:39 AM
The deep learning pipeline was more reproducible and had better concordance with guidelines. And it’s open source!

Preprint: medrxiv.org/content/10.1...
Code: github.com/echonet/dias...
Automated Deep Learning Pipeline for Characterizing Left Ventricular Diastolic Function
Introduction: Left ventricular diastolic dysfunction (LVDD) is most commonly evaluated by echocardiography. However, without a sole identifying metric, LVDD is assessed by a diagnostic algorithm relyi...
medrxiv.org
May 4, 2025 at 4:13 AM
Victoria developed a fully automated pipeline that processes an entire #echofirst study: it classifies views, rejects low-quality images, derives measurements from each image, and applies the @ase360.bsky.social algorithm to assess diastolic function.
May 4, 2025 at 4:13 AM
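The staged pipeline above can be sketched as a filter-then-measure loop followed by guideline-based grading. A toy illustration only: the view names, quality threshold, measurement names, and the simplified two-of-three grading rule are assumptions for this sketch, not the released pipeline or the full ASE algorithm:

```python
# Toy sketch of a view-filtered measurement pipeline (illustrative only).
RELEVANT_VIEWS = {"A4C", "mitral_inflow", "tissue_doppler"}

def run_pipeline(videos):
    measurements = {}
    for v in videos:
        if v["view"] not in RELEVANT_VIEWS:   # view classification step
            continue
        if v["quality"] < 0.5:                # reject low-quality clips
            continue
        measurements.update(v["measures"])     # per-view measurements
    # Simplified grading: count abnormal criteria (toy thresholds).
    abnormal = 0
    if measurements.get("e_prime_septal", 10) < 7:
        abnormal += 1
    if measurements.get("E_over_e_prime", 0) > 14:
        abnormal += 1
    if measurements.get("TR_velocity", 0) > 2.8:
        abnormal += 1
    return "dysfunction" if abnormal >= 2 else "normal"

study = [
    {"view": "A4C", "quality": 0.9, "measures": {"E_over_e_prime": 16}},
    {"view": "tissue_doppler", "quality": 0.8, "measures": {"e_prime_septal": 5}},
    {"view": "PLAX", "quality": 0.9, "measures": {"TR_velocity": 3.0}},  # filtered out
]
print(run_pipeline(study))  # → "dysfunction" (2 of 3 criteria abnormal)
```

The key design point is that each stage narrows the input for the next, so missing or unusable views degrade the output gracefully instead of breaking it.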
Diastolic function underpins many cardiac conditions, including HFpEF, but #echofirst assessment is complex.
@ase360.bsky.social expert guidelines integrate many elements for a comprehensive assessment, but that complexity leads to missingness and inconsistency in clinical practice.
May 4, 2025 at 4:12 AM
3/n Without additional prompting, our #echofirst model naturally gravitates towards the vena contracta in color Doppler videos to assess AR severity.
March 25, 2025 at 10:08 PM
2/n Accurate assessment of AR severity and sequelae is critical for surveillance and timing of surgery. Using over 40k studies from @smidtheart.bsky.social and validated on 1.5k studies from @stanforddeptmed.bsky.social, we show that AI can accurately assess AR severity across a range of patients.
March 25, 2025 at 10:08 PM
5/n Preprint: medrxiv.org/content/10.1...
GitHub: github.com/echonet/meas...
Demo: on GitHub.
March 20, 2025 at 10:35 PM
4/n Combining EchoPrime, which enables automated structured reporting, with EchoNet-Measurements, which enables automated measurements, we envision truly automated comprehensive echocardiography.

Access to #POCUS and #echofirst improves accuracy and access to care.

x.com/David_Ouyang...
1/n We are excited to announce EchoPrime – the first echocardiography AI model capable of evaluating a full transthoracic echocardiogram study, identifying the most relevant videos, and producing a comprehensive interpretation! Great work led by @milos_ai, EchoPrime is the largest https://t.co/dPTK9cXO5Y
March 20, 2025 at 10:34 PM
3/n Across a wide range of image quality, patient characteristics, and study types, EchoNet-Measurements performs well, improving upon sonographers' and cardiologists' precision.
March 20, 2025 at 10:34 PM
2/n Led by Yuki Sahashi, EchoNet-Measurements provides automated annotations for the 18 most common #echofirst measurements (both B-mode and Doppler).

Excellent performance in test cohorts (overall R2 of 0.967 in the held-out CSMC dataset and 0.987 in the SHC dataset).
March 20, 2025 at 10:33 PM
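The R2 figures quoted above compare automated measurements against reference values; the coefficient of determination is computed as 1 minus the ratio of residual to total sum of squares. A self-contained sketch with made-up synthetic numbers (not study data):

```python
import numpy as np

def r2_score(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic example: reference vs. automated measurements (cm).
reference = [4.2, 5.1, 3.8, 6.0, 4.9]
automated = [4.3, 5.0, 3.9, 5.8, 5.0]
print(round(r2_score(reference, automated), 3))  # → 0.972
```

An R2 near 1 means the automated values track the reference measurements almost exactly, which is why 0.967 and 0.987 across two hospital cohorts are strong results.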
My favorite feature is that if you have multiple calendars, they're displayed side by side.
March 18, 2025 at 3:36 AM
This model is trained on cases and controls matched by wall thickness, and is able to accurately identify who has LVOT obstruction. Strong performance validated at two sites.

Talk to us at #acc25 if any questions!

Preprint: medrxiv.org/content/10.1...
Github: github.com/echonet/obst...
Detection of Left Ventricular Outflow Obstruction from Standard B-Mode Echocardiogram Videos using Deep Learning
Abstract Introduction Hypertrophic cardiomyopathy (HCM) affects 20 million individuals globally, with increased risk of sudden death and heart failure. While cardiac myosin inhibitors show great promi...
medrxiv.org
March 7, 2025 at 9:04 PM
Reposted by David Ouyang, MD
Original Article: Development and Evaluation of a Model to Manage Patient Portal Messages nejm.ai/3XmIx8m

Original Article: Opportunistic Screening of Chronic Liver Disease with Deep-Learning–Enhanced Echocardiography nejm.ai/3XJuPNl
February 28, 2025 at 5:46 PM