https://keyonvafa.com
What would that even mean?
Our new ICML paper (poster tomorrow!) formalizes these questions.
One result tells the story: A transformer trained on 10M solar systems nails planetary orbits. But it botches gravitational laws 🧵
Starting with orbits: we encode solar systems as sequences and train a transformer on 10M solar systems (20B tokens).
The model makes accurate predictions many timesteps ahead. Predictions for our solar system:
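For intuition, here's a minimal sketch of what "encoding solar systems as sequences" could look like: planetary position trajectories quantized into discrete tokens for next-token training. The bin count, coordinate ranges, and choice of state variables below are illustrative assumptions, not the paper's actual tokenizer.

```python
import numpy as np

# Hypothetical sketch: serialize a solar system's trajectory into a
# token sequence for next-token prediction. The paper's real encoding
# (state variables, precision, vocabulary) may differ.

N_BINS = 1024  # quantization bins per coordinate (assumed)

def tokenize_trajectory(states, lo=-50.0, hi=50.0):
    """Map a (timesteps, n_planets, 2) array of xy positions
    to a flat token sequence by uniform binning."""
    clipped = np.clip(states, lo, hi)
    bins = ((clipped - lo) / (hi - lo) * (N_BINS - 1)).astype(np.int64)
    return bins.reshape(-1)  # flatten into a time-major token stream

# Example: 100 timesteps of a 9-planet system
rng = np.random.default_rng(0)
states = rng.uniform(-30.0, 30.0, size=(100, 9, 2))
tokens = tokenize_trajectory(states)
print(tokens.shape)  # (1800,) integer tokens, ready for a transformer
```

Under this kind of scheme, predicting orbits many timesteps ahead reduces to ordinary autoregressive sequence modeling over the token stream.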