Cyril Zakka, MD
@cyrilzakka.bsky.social
Health AI x @Huggingface | Medical Doctor & ML Researcher. Prev: @Stanford

cyrilzakka.github.io
Sure thing! If you’d like, we can move things over to email: firstName.lastName@huggingface.co
January 8, 2025 at 8:41 PM
Would love to host this data on HuggingFace!
January 7, 2025 at 11:31 PM
I want to repost this so badly
January 6, 2025 at 10:55 PM
Reposted by Cyril Zakka, MD
my wishlist:
- fewer low-hanging proof-of-concept studies (yes, it works for your specialty/organ/classification, too)
- fewer head-to-head model comparisons (yes, llama 3.1 was better than llama 2 for your case, but peer review took 8 months, and now there’s llama 4)
- instead: RCTs and relevant outcomes
December 26, 2024 at 10:42 PM
I can’t repost this enough.
December 26, 2024 at 10:45 PM
My spiciest prediction is that we’ll see a couple of medical subspecialties whose core workflows will be mostly automated/augmented by AI. Radiology won’t be one of them.
December 26, 2024 at 10:18 PM
On my wishlist: let’s eliminate QA-style evaluations in medical benchmarks, especially ones that are obviously contaminated. They do nothing but generate headlines and hype.
December 26, 2024 at 10:16 PM
I’ve found that simply walking a model through how I would approach a problem often yields far better outputs than open-ended prompts (sketch below).
December 26, 2024 at 10:10 PM
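As a purely hypothetical illustration of the step-by-step prompting described above, here is the same task phrased open-ended versus walked through. The clinical-summary task and the individual steps are invented for the example, not taken from the post.

```python
# Open-ended prompt: the model has to guess how you want the problem approached.
open_ended = "Summarize this discharge note for the patient."

# Walked-through prompt: spell out the steps you would take yourself.
walked_through = """Summarize this discharge note for the patient.
Work through it the way I would:
1. List the admitting diagnosis and the key events of the stay.
2. Extract medication changes, flagging anything started or stopped.
3. Note follow-up appointments and red-flag symptoms to watch for.
4. Rewrite the result in plain language at a sixth-grade reading level."""

# Either string can be sent as a single user message to any chat model.
messages = [{"role": "user", "content": walked_through}]
```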
We’re still in beta but have achieved feature parity (except for voice). Will focus on UI/UX in the next update.
December 9, 2024 at 7:23 PM
what features from the Mac app do you find indispensable? We just released a huge update for the HF macOS app: bsky.app/profile/cyri...
Christmas came early! 🎅🏻 Today marks the newest release of the HuggingChat 🤗 update with some really exciting capabilities! First up, automatic context injection!

1) Open a file in a supported app, summon HFChat, and it pre-populates the context window. No more copy-pasting. /cc @hf.co
December 9, 2024 at 7:13 PM
5) On-device LLM generation now powered by MLX and available with a quick keyboard shortcut. Let us know what you think!
December 9, 2024 at 7:12 PM
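The post above only names MLX, so as a rough illustration of what on-device generation looks like, here is a minimal sketch using the open-source mlx-lm package. The model checkpoint and prompt are placeholders; this is not the app’s actual implementation.

```python
# Minimal on-device text generation with Apple's MLX via mlx-lm (pip install mlx-lm).
# Runs locally on Apple silicon; no server round-trip.
from mlx_lm import load, generate

# Example 4-bit community model; any MLX-converted checkpoint on the Hub works.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Explain what on-device inference means in one sentence."
response = generate(model, tokenizer, prompt=prompt, max_tokens=128, verbose=True)
print(response)
```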
4) Web browsing now comes with rich link previews so you can verify LLM outputs and avoid hallucinations 🦟🔎
December 9, 2024 at 7:12 PM
3) A lot of folks asked for the ability to generate images so we added tool use 🔧 with support for image generation, web browsing and more!
December 9, 2024 at 7:12 PM
2) Next up we have system-wide on-device transcription 🎤🔥 Compatible with any app!
December 9, 2024 at 7:12 PM
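For readers curious what on-device transcription can look like in practice, here is a hedged sketch using the open-source mlx-whisper package; the audio file and Whisper checkpoint are example placeholders, and the app’s own pipeline may differ.

```python
# Local speech-to-text with Whisper running on MLX (pip install mlx-whisper).
# Everything stays on the machine; nothing is uploaded.
import mlx_whisper

# "recording.wav" and the checkpoint below are placeholders for this example.
result = mlx_whisper.transcribe(
    "recording.wav",
    path_or_hf_repo="mlx-community/whisper-large-v3-mlx",
)
print(result["text"])
```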
Yes. I’m from Lebanon, and the fall was celebrated across the country today (the Assad regime had tried to invade Lebanon several times before).
December 8, 2024 at 11:26 PM