Chaoyi Pan
chaoyipan.bsky.social
Ph.D. @ CMU | Exploring the intersections of Learning & Control
🧵 6/n
For more details, please refer to:
🔗 Website: jc-bao.github.io/spider-proj...
📄 Paper: arxiv.org/abs/2511.09484

Huge thanks to the team at FAIR + CMU!
We hope SPIDER pushes scalable, physics-grounded retargeting forward 🕷️🤖
SPIDER: Scalable Physics-Informed Dexterous Retargeting
November 14, 2025 at 3:30 PM
🧵 5/n
SPIDER is also a data engine:

We vary object sizes, initial states, and scenes to produce massive, physics-valid training data.
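The data-engine idea above can be sketched as a small randomization loop. This is an illustrative toy, not SPIDER's actual code: the scene fields and the downstream `retarget`-and-check step are assumptions.

```python
import random

def randomize_scene(base_scene):
    """Perturb object scale and initial position to diversify one demo.

    The scene dict layout here is hypothetical; a real pipeline would
    randomize whatever parameters its simulator exposes.
    """
    scene = dict(base_scene)
    scene["object_scale"] = base_scene["object_scale"] * random.uniform(0.8, 1.2)
    scene["object_xy"] = [c + random.uniform(-0.05, 0.05)
                          for c in base_scene["object_xy"]]
    return scene

base = {"object_scale": 1.0, "object_xy": [0.3, 0.0]}
variants = [randomize_scene(base) for _ in range(1000)]
# Each variant would then be re-retargeted and physics-checked,
# keeping only dynamically feasible trajectories as training data.
```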
🧵 4/n
SPIDER supports diverse embodiments: 🤖 humanoid + ✋ dexterous hand

📦 2.4M robot-feasible frames
🤖 9 embodiments
🧊 103 objects
⏱️ 800 hrs of data
🧵 3/n
Because SPIDER is grounded in physics, the generated motions are directly executable on real hardware, with no per-robot tuning.

Zero robot demos. No RL training. Just human → physics → real robot.
🧵 2/n
Retargeting human motion is hard: the embodiment gap, missing contact forces, and noisy data. Pure IK fails ❌; RL is costly ❌.

SPIDER converts kinematic human demos into dynamically feasible robot trajectories using physics-informed sampling + virtual contact guidance.

👇 Pipeline
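For intuition, physics-informed sampling can be sketched in the style of MPPI: roll out noisy action sequences through a dynamics model and weight them by how well the resulting states track the human reference. Everything below (the 1-D single-integrator dynamics, function names, parameters) is an illustrative assumption, not SPIDER's implementation, and the virtual contact guidance term is omitted.

```python
import numpy as np

def sample_tracking_actions(x0, ref, n_samples=256, sigma=0.1, beta=10.0):
    """MPPI-style sketch: sample action sequences, score by tracking cost,
    and return the softmax-weighted average sequence."""
    horizon = len(ref)
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, sigma, size=(n_samples, horizon))
    costs = np.zeros(n_samples)
    for i in range(n_samples):
        x = x0
        for t in range(horizon):
            x = x + noise[i, t]            # toy single-integrator dynamics
            costs[i] += (x - ref[t]) ** 2  # kinematic tracking cost
    w = np.exp(-beta * (costs - costs.min()))
    w /= w.sum()
    return (w[:, None] * noise).sum(axis=0)  # weighted action sequence

ref = np.linspace(0.0, 1.0, 10)  # a 1-D "human reference" to track
u = sample_tracking_actions(0.0, ref)
```

Because every candidate is rolled out through the dynamics, the weighted result is biased toward trajectories the physics can actually realize, which is the core idea behind converting kinematic demos into feasible robot motion.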