Grant Rotskoff
grant.rotskoff.cc
Grant Rotskoff
@grant.rotskoff.cc
Statistical mechanic working on generative models for biophysics and beyond. Assistant professor at Stanford. https://statmech.stanford.edu
Reposted by Grant Rotskoff
Small angel X-ray scattering
June 19, 2025 at 8:49 PM
Reposted by Grant Rotskoff
Big fan of this perspective:
May 7, 2025 at 6:46 PM
Reposted by Grant Rotskoff
The plan at FutureHouse has been to build scientific agents for discovery. We’ve spent the last year researching the best way to make agents. We’ve made a ton of progress and now we’ve engineered them to be used at scale, by anyone. Free and available via API.
May 1, 2025 at 4:16 PM
What an incredibly cool paper! While knot theory strictly applies to closed curves, Tommy, @smnlssn.bsky.social, and @paulrobustelli.bsky.social show that writhe, a knot "non-invariant" that changes with smooth deformations, provides a meaningful descriptor for flexible conformations.
Presenting one of my favorite manuscripts I've ever worked on:

"Characterizing structural and kinetic ensembles of intrinsically disordered proteins using writhe"

www.biorxiv.org/content/10.1...

by Tommy Sisk, with a generative modeling component done in collaboration with @smnlssn.bsky.social
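
For readers who want to see the central quantity concretely, here is a minimal Python sketch of the writhe of an open, discretized chain via the standard segment-pair (Gauss double-integral) formula. The function names and the helix example are my own illustration, not the authors' implementation.

import numpy as np

def segment_pair_solid_angle(r1, r2, r3, r4):
    """Signed solid-angle contribution of segments (r1->r2) and (r3->r4)."""
    r12, r13, r14 = r2 - r1, r3 - r1, r4 - r1
    r23, r24, r34 = r3 - r2, r4 - r2, r4 - r3
    n = [np.cross(r13, r14), np.cross(r14, r24),
         np.cross(r24, r23), np.cross(r23, r13)]
    norms = [np.linalg.norm(v) for v in n]
    if min(norms) < 1e-12:          # nearly collinear segment pairs contribute ~0
        return 0.0
    n = [v / s for v, s in zip(n, norms)]
    omega = sum(np.arcsin(np.clip(np.dot(n[k], n[(k + 1) % 4]), -1.0, 1.0))
                for k in range(4))
    sign = np.sign(np.dot(np.cross(r34, r12), r13))
    return omega * sign

def writhe(coords):
    """Writhe of an open polyline with vertices `coords`, shape (N, 3)."""
    wr = 0.0
    n_seg = len(coords) - 1
    for i in range(n_seg):
        for j in range(i + 2, n_seg):   # skip adjacent (shared-vertex) segments
            wr += segment_pair_solid_angle(coords[i], coords[i + 1],
                                           coords[j], coords[j + 1])
    return wr / (2.0 * np.pi)

if __name__ == "__main__":
    # A helical chain winds around its axis and accumulates nonzero writhe.
    t = np.linspace(0, 6 * np.pi, 200)
    helix = np.stack([np.cos(t), np.sin(t), 0.15 * t], axis=1)
    print(f"writhe of a 3-turn helix: {writhe(helix):.3f}")

The double loop is O(N^2) in the number of segments; how the manuscript discretizes chains and turns writhe into structural and kinetic descriptors is described in the preprint itself.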
May 1, 2025 at 5:49 AM
Reposted by Grant Rotskoff
Our review on machine learning methods to study sequence–ensemble–function relationships in disordered proteins is now out in COSB

authors.elsevier.com/sd/article/S...
Led by @sobuelow.bsky.social and Giulio Tesei
March 12, 2025 at 9:35 PM
Excited to see our paper “Computing Nonequilibrium Responses with Score-Shifted Stochastic Differential Equations” in Physical Review Letters this morning as an Editor’s Suggestion! We use ideas from generative modeling to unravel a rather technical problem. 🧵 journals.aps.org/prl/abstract...
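
For flavor only, here is a toy Euler-Maruyama sketch of an overdamped Langevin equation whose drift carries an extra score-dependent shift. The stand-in score below uses a known equilibrium density; the paper's whole point is constructing and estimating the appropriate shift in genuinely nonequilibrium settings, so none of this reproduces the published method.

import numpy as np

rng = np.random.default_rng(0)

def potential_force(x):
    """Force -dU/dx for a 1D double well U(x) = (x^2 - 1)^2."""
    return -4.0 * x * (x**2 - 1.0)

def score_estimate(x, beta=1.0):
    """Stand-in score: for an equilibrium density rho ~ exp(-beta U) the exact
    score is -beta dU/dx. In the nonequilibrium setting of the paper this
    quantity is not known in closed form and must be learned."""
    return beta * potential_force(x)

def euler_maruyama(x0, n_steps, dt=1e-3, beta=1.0, eps=0.1):
    """Integrate dx = [F(x) + eps * s(x)] dt + sqrt(2/beta) dW."""
    x = np.array(x0, dtype=float)
    traj = np.empty((n_steps, x.size))
    noise_amp = np.sqrt(2.0 * dt / beta)
    for t in range(n_steps):
        drift = potential_force(x) + eps * score_estimate(x, beta)
        x = x + drift * dt + noise_amp * rng.standard_normal(x.size)
        traj[t] = x
    return traj

if __name__ == "__main__":
    traj = euler_maruyama(x0=[1.0], n_steps=100_000)
    print("mean and variance of x:", traj.mean(), traj.var())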
March 4, 2025 at 6:45 PM
Reposted by Grant Rotskoff
Applications for the FutureHouse Independent Postdoctoral Fellowship are due in two weeks! $125k annual stipend, full access to our resources, be co-advised by world-class professors and apply our AI science agents to make new discoveries. Apply!

Details here: www.futurehouse.org/fellowship
January 31, 2025 at 3:08 PM
Reposted by Grant Rotskoff
Ten simple rules for developing good reading habits during graduate school and beyond

To me, the most important are:
Read often, read broadly (incl. older papers and outside your field), and learn to read some papers in detail and others more superficially (and quickly)
January 26, 2025 at 10:15 AM
Reposted by Grant Rotskoff
Excited to share this beast of a review on the potential of protein-based degraders from trainees who are all not on the Sky! Herein we cover choice of binders, selection strategies, E3 ligases, and DELIVERY! pubs.acs.org/doi/10.1021/...
Protein-Based Degraders: From Chemical Biology Tools to Neo-Therapeutics
The nascent field of targeted protein degradation (TPD) could revolutionize biomedicine due to the ability of degrader molecules to selectively modulate disease-relevant proteins. A key limitation to ...
pubs.acs.org
January 18, 2025 at 1:07 AM
I am hiring a postdoctoral scholar with a start date in summer or fall 2025. Projects will be focused on thermodynamically consistent generative models, broadly defined. If you’re interested, please send a CV and one paragraph about why you think you’d be a good fit to rotskoff@stanford.edu
December 23, 2024 at 5:31 PM
Lots of cool stuff in here. Consistent with my working hypothesis that the main scientific utility of LLMs at the moment is plain old NLP
🎅🏼 A small early Christmas present from our team.

To celebrate the publication of our data extraction tutorial in Chem Soc Rev, we made it easy to run it — without any installation — on a JupyterHub of the Base4NFDI.

🎥 Video intro to the JupyterHub deployment: youtu.be/l-5QNUo1fcU
From text to insight: large language models for chemical data extraction
The vast majority of chemical knowledge exists in unstructured natural language, yet structured data is crucial for innovative and systematic materials design. Traditionally, the field has relied on m...
pubs.rsc.org
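
In that spirit, here is a minimal sketch of the extraction pattern: ask a model for structured JSON and validate it against a small schema. The call_llm(prompt) -> str client and the four-field schema are invented placeholders for illustration; the tutorial itself covers the real workflow and prompts.

import json
from dataclasses import dataclass

@dataclass
class SynthesisRecord:
    compound: str
    solvent: str
    temperature_c: float
    yield_percent: float

PROMPT_TEMPLATE = (
    "Extract the compound name, solvent, reaction temperature in Celsius, and "
    "percent yield from the paragraph below. Reply with JSON only, using the "
    'keys "compound", "solvent", "temperature_c", "yield_percent".\n\n{text}'
)

def extract_record(paragraph: str, call_llm) -> SynthesisRecord:
    """Run one extraction. `call_llm(prompt) -> str` is supplied by the caller."""
    raw = call_llm(PROMPT_TEMPLATE.format(text=paragraph))
    data = json.loads(raw)                     # fails loudly on malformed JSON
    return SynthesisRecord(
        compound=str(data["compound"]),
        solvent=str(data["solvent"]),
        temperature_c=float(data["temperature_c"]),
        yield_percent=float(data["yield_percent"]),
    )

if __name__ == "__main__":
    # Fake "LLM" so the sketch runs end to end without any API key.
    fake = lambda prompt: json.dumps({"compound": "aspirin", "solvent": "EtOH",
                                      "temperature_c": 78, "yield_percent": 85})
    text = "Aspirin was recrystallized from EtOH at 78 C in 85% yield."
    print(extract_record(text, fake))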
December 22, 2024 at 5:30 PM
Really cool opportunity via futurehouse. Come work with them and collaborate with us at Stanford!
FutureHouse is launching an independent postdoctoral fellowship program for exceptional researchers who want to apply our automated science tools to specific problems in biology and biochemistry, in collaboration with world-leading academic labs. 1/
December 19, 2024 at 5:55 PM
If you didn't see our poster at NeurIPS on how to make diffusion model inference fast, you can always read the paper here: arxiv.org/abs/2405.15986
December 13, 2024 at 4:22 PM
Reposted by Grant Rotskoff
also I must say often when I read new methods being pre-printed, while I appreciate the eagerness to make a splash, many folks seem unaware of the long history of this field & its assessments - to their detriment

If in CADD, pls read through D3R's last paper
pubmed.ncbi.nlm.nih.gov/31974851/
D3R grand challenge 4: blind prediction of protein-ligand poses, affinity rankings, and relative binding free energies - PubMed
The Drug Design Data Resource (D3R) aims to identify best practice methods for computer aided drug design through blinded ligand pose prediction and affinity challenges. Herein, we report on the resul...
pubmed.ncbi.nlm.nih.gov
December 9, 2024 at 3:46 AM
@franknoe.bsky.social presented this very impressive work at a fantastic @cecamevents.bsky.social workshop this week. I’m very excited to take a deep dive into the details this weekend!
Super excited to preprint our work on developing a Biomolecular Emulator (BioEmu): Scalable emulation of protein equilibrium ensembles with generative deep learning from @msftresearch.bsky.social AI for Science.

www.biorxiv.org/content/10.1...
December 6, 2024 at 4:34 PM
If you're at NeurIPS next week, come see our spotlight poster led by Yinuo Ren and Haoxuan Chen! We use the parallel sampling technique to rigorously establish a sub-linear-time acceleration of diffusion model inference! neurips.cc/virtual/2024...
NeurIPS Poster Accelerating Diffusion Models with Parallel Sampling: Inference at Sub-Linear Time ComplexityNeurIPS 2024
neurips.cc
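
To illustrate only the parallel-in-time pattern, not the paper's sampler or its sub-linear analysis, here is a toy Picard fixed-point iteration that refines an entire ODE trajectory at once; the linear drift and grid are invented for the example.

import numpy as np

def picard_solve(drift, x0, t_grid, n_sweeps):
    """Solve dx/dt = drift(x, t) on t_grid by Picard iteration.

    Each sweep evaluates the drift at *all* grid points at once (the step that
    can be parallelized), then rebuilds the trajectory with a cumulative sum,
    instead of stepping through time sequentially."""
    dt = np.diff(t_grid)
    x = np.full(t_grid.shape, x0, dtype=float)   # initial guess: constant path
    for _ in range(n_sweeps):
        f = drift(x[:-1], t_grid[:-1])           # all drift evaluations at once
        x = np.concatenate(([x0], x0 + np.cumsum(f * dt)))
    return x

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 1001)
    drift = lambda x, t: -x                      # toy drift with solution e^{-t}
    traj = picard_solve(drift, x0=1.0, t_grid=t, n_sweeps=20)
    print("max error vs exp(-t):", np.max(np.abs(traj - np.exp(-t))))

A handful of sweeps, each fully parallel across timesteps, replaces a thousand sequential steps here; the paper makes the analogous statement rigorous for diffusion-model samplers.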
December 3, 2024 at 9:55 PM
Chemists use NMR spectroscopy to identify molecules, but interpreting spectra is laborious and error prone. We show the process can be automated end-to-end using a well-designed Molecular GPT. Importantly, we also make predictions of substructures for interpretability. pubs.acs.org/doi/10.1021/...
Accurate and Efficient Structure Elucidation from Routine One-Dimensional NMR Spectra Using Multitask Machine Learning
Rapid determination of molecular structures can greatly accelerate workflows across many chemical disciplines. However, elucidating structure using only one-dimensional (1D) NMR spectra, the most readily accessible data, remains an extremely challenging problem because of the combinatorial explosion of the number of possible molecules as the number of constituent atoms is increased. Here, we introduce a multitask machine learning framework that predicts the molecular structure (formula and connectivity) of an unknown compound solely based on its 1D 1H and/or 13C NMR spectra. First, we show how a transformer architecture can be constructed to efficiently solve the task, traditionally performed by chemists, of assembling large numbers of molecular fragments into molecular structures. Integrating this capability with a convolutional neural network, we build an end-to-end model for predicting structure from spectra that is fast and accurate. We demonstrate the effectiveness of this framework on molecules with up to 19 heavy (non-hydrogen) atoms, a size for which there are trillions of possible structures. Without relying on any prior chemical knowledge such as the molecular formula, we show that our approach predicts the exact molecule 69.6% of the time within the first 15 predictions, reducing the search space by up to 11 orders of magnitude.
pubs.acs.org
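
As a rough architectural sketch only: a 1D-CNN encodes a binned spectrum, a transformer decoder emits structure tokens, and a parallel head predicts substructure presence for the multitask objective. Layer sizes, vocabulary, and tensor shapes below are illustrative guesses in PyTorch, not the published model.

import torch
import torch.nn as nn

class SpectrumToStructure(nn.Module):
    def __init__(self, n_bins=4096, vocab_size=64, n_substructures=100, d_model=128):
        super().__init__()
        # Encoder: stacked 1D convolutions over the binned spectrum.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=9, stride=4, padding=4), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=9, stride=4, padding=4), nn.ReLU(),
            nn.Conv1d(64, d_model, kernel_size=9, stride=4, padding=4), nn.ReLU(),
        )
        # Decoder: autoregressive transformer over structure (e.g. SMILES) tokens.
        self.token_emb = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.token_head = nn.Linear(d_model, vocab_size)
        # Multitask head: which substructures are present in the molecule.
        self.substructure_head = nn.Linear(d_model, n_substructures)

    def forward(self, spectrum, tokens):
        # spectrum: (batch, n_bins); tokens: (batch, seq_len) of int64 ids
        memory = self.encoder(spectrum.unsqueeze(1)).transpose(1, 2)  # (B, L, d)
        seq_len = tokens.size(1)
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        hidden = self.decoder(self.token_emb(tokens), memory, tgt_mask=causal)
        token_logits = self.token_head(hidden)                # next-token logits
        substructure_logits = self.substructure_head(memory.mean(dim=1))
        return token_logits, substructure_logits

if __name__ == "__main__":
    model = SpectrumToStructure()
    spectra = torch.randn(2, 4096)
    tokens = torch.randint(0, 64, (2, 20))
    tok_logits, sub_logits = model(spectra, tokens)
    print(tok_logits.shape, sub_logits.shape)   # (2, 20, 64) and (2, 100)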
November 13, 2024 at 6:22 PM