Stéphane Thiell
@sthiell.bsky.social
I do HPC storage at Stanford and always monitor channel 16 ⛵
Thrilled to host Lustre Developer Day at @stanford-rc.bsky.social post-LUG 2025! 🌟 With 14+ organizations represented, including DDN, LANL, LLNL, HPE, CEA, AMD, ORNL, AWS, Google, NVIDIA, Sandia and Jefferson Lab, we discussed HSM, Trash Can, and upstreaming Lustre into the Linux kernel.
April 25, 2025 at 6:39 PM
@stanford-rc.bsky.social was proud to host the Lustre User Group 2025 organized with OpenSFS! Thanks to everyone who participated and our sponsors! Slides are already available at srcc.stanford.edu/lug2025/agenda 🤘Lustre! #HPC #AI
April 4, 2025 at 5:05 PM
Getting things ready for next week's Lustre User Group 2025 at Stanford University!
March 28, 2025 at 7:07 PM
Join us for the Lustre User Group 2025 hosted by @stanford-rc.bsky.social in collaboration with OpenSFS.
Check out the exciting agenda! 👉 srcc.stanford.edu/lug2025/agenda
March 8, 2025 at 7:57 AM
ClusterShell 1.9.3 is now available in EPEL and Debian. Not using clustershell groups on your #HPC cluster yet?! Check out the new bash completion feature! Demo recorded on Sherlock at @stanford-rc.bsky.social with ~1,900 compute nodes and many group sources!

asciinema.org/a/699526
clustershell bash completion (v1.9.3)
This short recording demonstrates the bash completion feature available in ClusterShell 1.9.3, showcasing its benefits when using the clush and cluset command-line tools.
asciinema.org
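If you prefer scripting to the CLI, here is a minimal sketch of the same fold/expand logic using ClusterShell's Python API (the node names below are made-up examples, not Sherlock's actual group sources):

```python
from ClusterShell.NodeSet import NodeSet

# Fold an explicit host list into a compact nodeset
# (equivalent to: cluset -f sh03-01n01 sh03-01n02 sh03-01n03)
nodes = NodeSet.fromlist(["sh03-01n01", "sh03-01n02", "sh03-01n03"])
print(nodes)              # sh03-01n[01-03]

# Expand a folded nodeset back into individual host names
nodes = NodeSet("sh03-01n[01-18],sh03-02n[01-18]")
print(len(nodes))         # 36
print(list(nodes)[:2])    # ['sh03-01n01', 'sh03-01n02']
```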
February 3, 2025 at 5:49 AM
We started it! blocksandfiles.com/2025/01/28/s...

Check out my LAD'24 presentation:
www.eofs.eu/wp-content/u...
January 29, 2025 at 4:34 AM
Just another day for Sherlock's home-built scratch Lustre filesystem at Stanford: Crushing it with 136+ GB/s aggregate read on a real research workload! 🚀 #Lustre #HPC #Stanford
January 11, 2025 at 7:59 PM
A great show of friendly open source competition and collaboration: the lead developers of Environment Modules and Lmod (Xavier of CEA and Robert of @taccutexas.bsky.social) at #SC24. They often exchange ideas and push each other to improve their tools!
November 21, 2024 at 3:26 PM
Newly announced at the #SC24 Lustre BoF! Lustre User Group 2025, organized by OpenSFS, will be hosted at Stanford University on April 1-2, 2025. Save the date!
November 20, 2024 at 2:40 PM
Fun fact: the Georgia Aquarium (a nonprofit), next to the Congress Center, is the largest aquarium in the U.S. and the only one that houses whale sharks. I went on Sunday and it was worth it. Just in case you need a break from SC24 this week… 🦈
November 18, 2024 at 5:39 PM
I always enjoy an update from JD Maloney (NCSA), but even more when it is about using S3 for Archival Storage, something we are deploying at scale at Stanford this year (using MinIO server and Lustre/HSM!)
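For the curious, the Lustre/HSM side of that setup boils down to the standard lfs hsm_* commands, with a copytool moving the data to an S3 backend (MinIO in our case) behind the scenes. A minimal sketch, with a placeholder file path and assuming a copytool is already running:

```python
import subprocess

# Placeholder path: a file on a Lustre filesystem with an HSM copytool configured
path = "/scratch/users/someone/dataset.tar"

# Ask the HSM coordinator to archive the file to the backend (e.g. an S3 bucket)
subprocess.run(["lfs", "hsm_archive", path], check=True)

# Check the file's HSM state; once the copytool finishes it should report 'archived'
state = subprocess.run(["lfs", "hsm_state", path],
                       capture_output=True, text=True, check=True)
print(state.stdout.strip())
```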
November 18, 2024 at 5:11 PM
Our branch of robinhood-lustre-3.1.7 on Rocky Linux 9.3 with our own branch of Lustre 2.15.4 and MariaDB 10.11 can ingest more than 35K Lustre changelogs/sec. Those gauges seem appropriate for Pi Day, no?
github.com/stanford-rc/...
github.com/stanford-rc/...
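For context, robinhood consumes the MDT changelog stream. A rough sketch of how one might peek at that stream and estimate the record rate, with a placeholder MDT name and assuming a changelog reader is already registered on it:

```python
import subprocess
import time

# Placeholder MDT device name; use your own (see 'lfs mdts')
MDT = "fir-MDT0000"

start = time.time()
# 'lfs changelog <mdtname>' dumps pending changelog records, one per line
out = subprocess.run(["lfs", "changelog", MDT],
                     capture_output=True, text=True, check=True)
records = out.stdout.splitlines()
elapsed = time.time() - start
print(f"read {len(records)} changelog records in {elapsed:.1f}s "
      f"(~{len(records) / max(elapsed, 1e-6):.0f} records/s)")
```

This only measures how fast records can be pulled off the MDT, of course; robinhood still has to parse each record and update its MariaDB backend to reach that 35K/sec figure.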
March 14, 2024 at 10:54 PM
Reposted by Stéphane Thiell
Sherlock goes full flash!
The scratch file system of Sherlock, Stanford's HPC cluster, has been revamped to provide 10 PB of fast flash storage on Lustre
news.sherlock.stanford.edu/publications...
Sherlock goes full flash - Sherlock
What could be more frustrating than anxiously waiting for your computing job to finish? Slow I/O that makes it take even longer is certainly high on the list. But not anymore! Fir, Sherlock’s scratch file system, has just undergone a major…
news.sherlock.stanford.edu
February 8, 2024 at 12:09 AM
Filesystem upgrade complete! Stanford cares about HPC I/O! The Sherlock cluster has now ~10PB of full flash Lustre scratch storage at 400 GB/s, to support a wide range of research jobs on large datasets! Fully built in-house!
January 25, 2024 at 1:27 AM
My header image is an extract from this photo taken at The Last Bookstore in Los Angeles, a really cool place.
October 28, 2023 at 1:03 AM
👋
October 28, 2023 at 12:55 AM