For more details, please refer to:
🔗 Website: jc-bao.github.io/spider-proj...
📄 Paper: arxiv.org/abs/2511.09484
Huge thanks to the team at FAIR + CMU!
We hope SPIDER pushes scalable, physics-grounded retargeting forward 🕷️🤖
SPIDER is also a data engine:
We vary object sizes, object states, and scenes to produce massive amounts of physics-valid training data, as sketched below.
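A minimal sketch of that data-engine loop, assuming a domain-randomization-style pipeline. The function names (retarget, passes_physics_checks) and the randomization ranges are illustrative placeholders, not SPIDER's actual interface:

```python
# Sketch of the "data engine" idea: randomize object sizes, states, and scenes,
# retarget each variant, and keep only physics-valid results.
# retarget() and passes_physics_checks() are hypothetical stand-ins.
import random

def retarget(demo, object_scale, object_pose, scene):
    """Stand-in for physics-based retargeting of one human demo."""
    return {"demo": demo, "scale": object_scale, "pose": object_pose, "scene": scene}

def passes_physics_checks(traj):
    """Stand-in for validity filters (e.g. penetration, contact forces, joint limits)."""
    return True

def generate_variations(demo, scenes, n_variants=100):
    dataset = []
    for _ in range(n_variants):
        scale = random.uniform(0.8, 1.2)                      # vary object size
        pose = [random.uniform(-0.1, 0.1) for _ in range(3)]  # perturb object state
        scene = random.choice(scenes)                         # vary scene
        traj = retarget(demo, scale, pose, scene)
        if passes_physics_checks(traj):                       # keep physics-valid data only
            dataset.append(traj)
    return dataset

print(len(generate_variations(demo={"name": "pick_mug"}, scenes=["table", "shelf"])))
```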
SPIDER supports diverse embodiments: 🤖 humanoid + ✋ dexterous hand
📦 2.4M robot-feasible frames
🤖 9 embodiments
🧊 103 objects
⏱️ 800 hrs of data
Because SPIDER is grounded with physics, the generated motions are directly executable on real hardware—no tuning.
Zero robot demos. No RL training. Just human → physics → real robot.
Retargeting human motion is hard: embodiment gap, missing forces, noisy data. IK fails ❌, RL is costly ❌.
SPIDER converts kinematic human demos into dynamically feasible robot trajectories using physics-informed sampling + virtual contact guidance.
👇 Pipeline
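For intuition, here is a self-contained toy sketch of what "physics-informed sampling + virtual contact guidance" could look like in an MPPI-style loop. The dynamics, cost terms, dimensions, and names below are assumptions for illustration, not SPIDER's actual implementation:

```python
# Toy sketch: sample trajectory perturbations, roll them out through (stand-in)
# dynamics, and score them with a tracking cost plus a "virtual contact" term
# that pulls toy end-effector coordinates toward demonstrated contact points.
import numpy as np

rng = np.random.default_rng(0)
HORIZON, N_SAMPLES, ACT_DIM = 20, 64, 7   # hypothetical sizes
LAMBDA = 1.0                              # MPPI-style temperature

def rollout(actions, state):
    """Stand-in for a physics-simulator rollout; returns visited states."""
    states = []
    for a in actions:
        state = state + 0.05 * a          # toy single-integrator dynamics
        states.append(state)
    return np.stack(states)

def tracking_cost(states, human_ref):
    """Penalize deviation from the kinematic human reference."""
    return np.sum((states - human_ref) ** 2)

def virtual_contact_cost(states, contact_targets):
    """Virtual contact guidance: pull (toy) end-effector coordinates toward
    the contact points observed in the human demonstration."""
    return np.sum((states[:, :3] - contact_targets) ** 2)

def plan_step(state, human_ref, contact_targets, nominal):
    """One sampling-based planning step with exponentially weighted averaging."""
    noise = rng.normal(0.0, 0.1, size=(N_SAMPLES, HORIZON, ACT_DIM))
    candidates = nominal[None] + noise
    costs = []
    for c in candidates:
        states = rollout(c, state)
        costs.append(tracking_cost(states, human_ref)
                     + virtual_contact_cost(states, contact_targets))
    costs = np.array(costs)
    weights = np.exp(-(costs - costs.min()) / LAMBDA)
    weights /= weights.sum()
    return np.einsum("s,sha->ha", weights, candidates)  # updated nominal plan

# Toy usage: track a synthetic reference with synthetic contact targets.
state = np.zeros(ACT_DIM)
human_ref = np.linspace(0, 1, HORIZON)[:, None] * np.ones(ACT_DIM)
contact_targets = human_ref[:, :3]
nominal = plan_step(state, human_ref, contact_targets, np.zeros((HORIZON, ACT_DIM)))
print("planned first action:", nominal[0])
```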