Nate Corley
@ncorley.bsky.social
PhD Candidate @ the University of Washington's Institute for Protein Design | Baker Lab | Machine Learning for Protein Design | Enzymes
Reposted by Nate Corley
`atomworks.ml`, on the other hand, offers advanced dataset featurization and sampling for deep learning workflows, all operating on the canonical AtomArray object from @biotite_python so that all transforms are traceable and generalizable between models. 4/6
August 15, 2025 at 5:46 PM
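As a rough sketch of what featurization on a Biotite AtomArray can look like (hypothetical code for illustration, not `atomworks.ml`'s actual API), a transform might map an AtomArray to per-atom model inputs:

```python
# Hypothetical featurization sketch; not atomworks.ml's real interface.
# It only illustrates deriving model inputs from a Biotite AtomArray.
import numpy as np
import biotite.structure as struc

ELEMENTS = ["C", "N", "O", "S", "P"]  # assumed element vocabulary for illustration

def featurize(atoms: struc.AtomArray) -> dict:
    """Turn an AtomArray into coordinates plus a one-hot element encoding."""
    one_hot = np.zeros((atoms.array_length(), len(ELEMENTS)), dtype=np.float32)
    for i, element in enumerate(ELEMENTS):
        one_hot[atoms.element == element, i] = 1.0
    return {
        "coords": atoms.coord.astype(np.float32),  # shape (n_atoms, 3)
        "element_one_hot": one_hot,                # shape (n_atoms, len(ELEMENTS))
    }
```

Because the input and output types stay fixed (AtomArray in, features out), a step like this can be logged, reused, or swapped without touching the rest of the pipeline.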
Reposted by Nate Corley
AtomWorks has two main components: atomworks.io takes a file (cif, sdf, ...) and does parsing, cleaning and more. You can also look at your structures in a notebook or via PyMOL thanks to pymol-remote, so you can directly inspect whether your code does what you want! 3/6
August 15, 2025 at 5:45 PM
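For context, here is a minimal sketch of the underlying step using Biotite's documented CIF reader: loading a structure into the AtomArray that AtomWorks builds on and inspecting it in a notebook. The file name is a placeholder, and this is plain Biotite, not `atomworks.io`'s own API.

```python
# Minimal sketch using Biotite directly (the library AtomWorks builds on);
# the file name is a placeholder and this is not atomworks.io's own API.
import biotite.structure as struc
import biotite.structure.io.pdbx as pdbx

cif = pdbx.CIFFile.read("example.cif")      # placeholder path
atoms = pdbx.get_structure(cif, model=1)    # -> biotite.structure.AtomArray

# Quick notebook-style inspection before trusting the pipeline
print(atoms.array_length(), "atoms")
print(struc.get_chains(atoms))              # chain IDs present in the file
print(set(atoms.res_name))                  # residue / ligand names
```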
(7/7)
We also want to give a shout-out to [Biotite](www.biotite-python.org/latest/), which is the bedrock of our approach. Biotite has made our framework vastly more performant and flexible. We're excited for what's next!
August 15, 2025 at 5:17 PM
(6/7)
Frank DiMaio both directed and carried the team - he deserves the most recognition
August 15, 2025 at 5:17 PM
(5/7)
It's been the pleasure of my life to work alongside @simonmathis.bsky.social, @rkrishna3.bsky.social, @kinasekid.bsky.social, and so many other unbelievably talented individuals on this project. And extra credit to @kdidi.bsky.social for jumping into the frenzy to bring this work across the line.
August 15, 2025 at 5:17 PM
(4/7)
There's no reason for every researcher in the BioML space to reinvent the wheel every time they train a structure-based model. The complexity of loading and annotating biomolecular data for machine learning applications should be done once, and done right — that was our goal with AtomWorks
August 15, 2025 at 5:17 PM
(3/7)
We're also thrilled to release AtomWorks, which we used as the foundation for not only RF3, but also RF2AA, LigandMPNN, ProteinMPNN, and a design model — all by just swapping out a handful of modular components we call Transforms (just like Torchdata, for those familiar)
August 15, 2025 at 5:17 PM
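To illustrate the swappable-Transform idea in a torchdata-like style (the function names below are invented for illustration and are not AtomWorks' real classes), two model pipelines can share most steps and differ only in the final, model-specific transform:

```python
# Hypothetical sketch of composable, swappable transforms on an AtomArray;
# the function names are invented for illustration, not AtomWorks' real API.
import numpy as np
import biotite.structure as struc

def remove_solvent(atoms: struc.AtomArray) -> struc.AtomArray:
    """Shared cleanup step: drop water molecules."""
    return atoms[atoms.res_name != "HOH"]

def ca_coordinates(atoms: struc.AtomArray) -> np.ndarray:
    """Residue-level featurization, e.g. for an MPNN-style sequence design model."""
    return atoms.coord[atoms.atom_name == "CA"]

def all_atom_coordinates(atoms: struc.AtomArray) -> np.ndarray:
    """Atom-level featurization, e.g. for an all-atom structure model."""
    return atoms.coord

def compose(*transforms):
    """Chain transforms left to right, torchdata-style."""
    def pipeline(x):
        for transform in transforms:
            x = transform(x)
        return x
    return pipeline

# Same shared step, different final transform per model
mpnn_style_pipeline = compose(remove_solvent, ca_coordinates)
all_atom_pipeline   = compose(remove_solvent, all_atom_coordinates)
```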
(2/7)
We're thrilled to share RF3 fully open-source with the community — it handles chirality better than any other model, it supports arbitrary atomic templating (which we included at train-time), and it narrows the open-source gap to AF3
August 15, 2025 at 5:17 PM