Prince Canuma
@prince-canuma.bsky.social
@hugg Idefics 3 and SmolVLM now on MLX 🎉🚀

You can now run inference and fine-tune locally on your Mac.

pip install -U mlx-vlm
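After installing, a minimal inference sketch using the mlx-vlm CLI might look like this (the exact model repo name and flags are assumptions based on the mlx-community conventions, not stated in the post; requires an Apple Silicon Mac):

```shell
# Generate a caption for an image with SmolVLM via mlx-vlm
# (model ID below is an assumed mlx-community repo name)
python -m mlx_vlm.generate \
  --model mlx-community/SmolVLM-Instruct-bf16 \
  --max-tokens 100 \
  --prompt "Describe this image." \
  --image path/to/image.jpg
```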

I’m getting ~140 tok/s on M3 Max 96GB 🔥

Thanks to @pcuenq.hf.co for the PR!

Model Cards 👇🏽
November 26, 2024 at 11:04 PM