FCLC (St. Louis)
@fclc.bsky.social
HPC, BLAS, I make things FAST

Standing on the shoulders of giants
TLDR; 🇨🇦🐧🧑🏼‍💻🚴🏎️🧗🏼 💩posting.
Haver of opinions that are all my own.
I mainly do #HPC #BLAS #AI #RVV and #clusters

Proud French Canadian, you’ll hear about it

(I help with HPC.social)
Glenn posted his ISC recap and, as always, it's excellent. The excerpt on mixed precision and Ozaki did have me giggling whilst reading.

I think it's relevant to add some high-level context on *why* we vendors decrease FP64; it's not just chasing AI, it's the sheer *cost* of FP64.

The short ...
June 24, 2025 at 2:51 PM
shorter version:
May 2, 2025 at 2:09 PM
April 15, 2025 at 12:32 AM
Presented without comment
April 14, 2025 at 5:19 PM
Losing it to this… DO NOT OVERLOAD OPCODES YOU {redacted}
April 1, 2025 at 1:47 AM
Happy place @ the computer history museum!
March 30, 2025 at 6:33 PM
March 11, 2025 at 9:56 AM
March 10, 2025 at 1:06 AM
"All lessons in tech will be relearned every 25 years"
-Me, looking at us relearning the same thing again
February 24, 2025 at 2:14 PM
Sus:
February 24, 2025 at 1:18 PM
Currently at Veracruz having a margarita!
February 16, 2025 at 12:22 AM
Hey folks! Will be talking about the wonderful, wacky world of floating point on Sunday afternoon in the HPC, Big Data & Data Science dev room at 15:30!

#fosdem
February 1, 2025 at 5:10 PM
Nope, I’m definitely not still working on my slides 😅
January 20, 2025 at 7:25 AM
Until next time UK friends!
January 19, 2025 at 12:53 PM
Really not sure how to square these two images.

One implies V length min is 256, the other implies it’s 128.

Spec lawyer in me says you go by CPUID
January 17, 2025 at 9:22 AM
January 11, 2025 at 7:28 PM
trying not to dunk, but LOL
January 10, 2025 at 3:26 PM
Like, really?!?!?
January 7, 2025 at 3:48 PM
One of the coolest things ever: one of my favorite people in the world, roboticist extraordinaire, and best mate Austin got me a pair of mugs with the incredible artwork from @mcy.gay!
December 25, 2024 at 5:59 PM
A refreshing part of RV hacker boards is that when something isn't working, they'll say as much 🤭

CC @jeffgeerling.com
December 11, 2024 at 6:40 PM
“You have to choose between optimizing a chip for AI training or AI inference”

No, you need to optimize for the workloads you will see, and the ones you expect to see emerge.

CNNs have massively different inference requirements than LLMs, which in turn are different from diffusion models.
December 10, 2024 at 11:20 PM
December 4, 2024 at 4:29 PM
With the next product launch season fast approaching, a reminder of the first question you should *always* be asking when a figure of merit is presented without context:

#HPC
November 29, 2024 at 11:36 PM
November 21, 2024 at 8:48 PM
Super fun time earlier today at the "Democratizing AI Accelerators for HPC Applications" BoF!
November 21, 2024 at 1:38 AM