Stéphane Thiell
@sthiell.bsky.social
I do HPC storage at Stanford and always monitor channel 16 ⛵
Thrilled to host Lustre Developer Day at @stanford-rc.bsky.social post-LUG 2025! 🌟 With 14+ top organizations like DDN, LANL, LLNL, HPE, CEA, AMD, ORNL, AWS, Google, NVIDIA, Sandia and Jefferson Lab represented, we discussed HSM, Trash Can, and upstreaming Lustre in Linux.
April 25, 2025 at 6:39 PM
@stanford-rc.bsky.social was proud to host the Lustre User Group 2025 organized with OpenSFS! Thanks to everyone who participated and our sponsors! Slides are already available at srcc.stanford.edu/lug2025/agenda 🤘Lustre! #HPC #AI
April 4, 2025 at 5:05 PM
Getting things ready for next week's Lustre User Group 2025 at Stanford University!
March 28, 2025 at 7:07 PM
SAS 24Gb/s (12 x 4 x 24Gb/s) switch from SpectraLogic on display at #SC24. Built by Astek Corporation.
January 29, 2025 at 4:45 AM
We started it! blocksandfiles.com/2025/01/28/s...

Check out my LAD'24 presentation:
www.eofs.eu/wp-content/u...
January 29, 2025 at 4:34 AM
Just another day for Sherlock's home-built scratch Lustre filesystem at Stanford: crushing it with 136+ GB/s aggregate read on a real research workload! 🚀 #Lustre #HPC #Stanford
January 11, 2025 at 7:59 PM
A great show of friendly open source competition and collaboration: the lead developers of Environment Modules and Lmod (Xavier of CEA and Robert of @taccutexas.bsky.social) at #SC24. They often exchange ideas and push each other to improve their tools!
November 21, 2024 at 3:26 PM
Newly announced at the #SC24 Lustre BoF! Lustre User Group 2025, organized by OpenSFS, will be hosted at Stanford University on April 1-2, 2025. Save the date!
November 20, 2024 at 2:40 PM
Fun fact: the Georgia Aquarium (a nonprofit), next to the Georgia World Congress Center, is the largest aquarium in the U.S. and the only one that houses whale sharks. I went on Sunday and it was worth it. Just in case you need a break from SC24 this week… 🦈
November 18, 2024 at 5:39 PM
I always enjoy an update from JD Maloney (NCSA), and even more so when it's about using S3 for archival storage, something we are deploying at scale at Stanford this year (using MinIO server and Lustre/HSM!)
November 18, 2024 at 5:11 PM
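
For context, the Lustre/HSM side of that archival pattern can be sketched with the stock `lfs hsm_*` commands: a copytool moves file data to the archive backend (here, an S3 bucket served by MinIO), after which the Lustre copy can be released to a stub. A minimal sketch, assuming a Lustre client whose MDT has the HSM coordinator enabled and a copytool already running; the path is a placeholder:

```python
#!/usr/bin/env python3
"""Sketch: archive a file via Lustre/HSM, then release its local copy.

Assumes an HSM coordinator enabled on the MDT and a copytool running
against the archive backend (e.g. an S3 bucket served by MinIO).
"""
import subprocess
import time

def archive_and_release(path: str) -> None:
    # Ask the coordinator to archive; the copytool performs the copy
    # to the backend (the S3 upload, in a MinIO-backed setup).
    subprocess.run(["lfs", "hsm_archive", path], check=True)
    # Poll until the file reports the 'archived' state.
    while True:
        state = subprocess.run(["lfs", "hsm_state", path],
                               capture_output=True, text=True,
                               check=True).stdout
        if "archived" in state:
            break
        time.sleep(5)
    # Free the Lustre blocks; the file stays visible in the namespace
    # and is restored transparently from the archive on the next read.
    subprocess.run(["lfs", "hsm_release", path], check=True)

if __name__ == "__main__":
    archive_and_release("/lustre/scratch/dataset.tar")  # placeholder path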
Our branch of robinhood-lustre-3.1.7 on Rocky Linux 9.3 with our own branch of Lustre 2.15.4 and MariaDB 10.11 can ingest more than 35K Lustre changelogs/sec. Those gauges seem appropriate for Pi Day, no?
github.com/stanford-rc/...
github.com/stanford-rc/...
March 14, 2024 at 10:54 PM
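
For the curious, a changelog consumer in the spirit of what Robinhood does can be sketched with the stock `lfs changelog` and `lfs changelog_clear` commands: read records from the MDT, process them, then acknowledge them so the MDT can purge them. A minimal sketch; the MDT name and reader id below are placeholders (a real reader id comes from `lctl changelog_register`), and Robinhood itself reads the changelog API directly rather than shelling out:

```python
#!/usr/bin/env python3
"""Sketch: consume and acknowledge Lustre changelog records."""
import subprocess

MDT = "fsname-MDT0000"   # placeholder filesystem/MDT name
READER = "cl1"           # reader id from `lctl changelog_register`

def consume_changelogs() -> None:
    # Dump pending changelog records for this MDT.
    out = subprocess.run(["lfs", "changelog", MDT],
                         capture_output=True, text=True,
                         check=True).stdout
    last_rec = 0
    for line in out.splitlines():
        # Each record starts with its record number, followed by the
        # operation type, timestamp, flags, FIDs, and name.
        rec = int(line.split()[0])
        # ... parse the record and insert it into the database here ...
        last_rec = max(last_rec, rec)
    if last_rec:
        # Acknowledge processed records so the MDT can purge them.
        subprocess.run(["lfs", "changelog_clear", MDT, READER,
                        str(last_rec)], check=True)

if __name__ == "__main__":
    consume_changelogs()
```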
Filesystem upgrade complete! Stanford cares about HPC I/O! The Sherlock cluster now has ~10 PB of all-flash Lustre scratch storage at 400 GB/s to support a wide range of research jobs on large datasets! Fully built in-house!
January 25, 2024 at 1:27 AM
My header image is a crop of this photo, taken at The Last Bookstore in Los Angeles, a really cool place.
October 28, 2023 at 1:03 AM