Uksang Yoo
@uksang.bsky.social
PhD Student at CMU RI
uksangyoo.github.io
🚀 We've already extended MOE to other dexterous manipulation projects, including dynamic pen spinning (arxiv.org/abs/2411.12734) and learning in-hand manipulation from demonstration (arxiv.org/abs/2503.01078). 🧵6/7
March 17, 2025 at 4:02 PM
👥 In our study, 12 participants significantly preferred MOE with force estimation-based feedback across all tasks! Feedback included: "It felt really similar to human fingers" and "like a head massage." 🧵5/7
March 17, 2025 at 4:02 PM
🔍 The force estimation module combines visual deformation data 👁️ with tendon tensions 📈 to precisely track applied forces, reducing sensing errors by up to 60% compared to tension-only approaches. 🧵4/7
March 17, 2025 at 4:02 PM
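The thread doesn't spell out how the two signals are fused, so here is a minimal sketch of the general idea, assuming a simple ridge regression over concatenated visual-deformation features and tendon tensions; the feature dimensions, the regressor, and the synthetic data are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 500 samples of 8 visual deformation features
# (e.g., marker displacements on the soft finger) and 4 tendon tension
# readings, with a 3-axis contact force label from a hypothetical F/T sensor.
X_vis = rng.normal(size=(500, 8))
X_ten = rng.normal(size=(500, 4))
F = rng.normal(size=(500, 3))

def fit_ridge(X, Y, lam=1e-2):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Fused estimator uses both modalities concatenated into one feature vector.
W_fused = fit_ridge(np.hstack([X_vis, X_ten]), F)

def estimate_force(vis_feats, tensions):
    """3-axis force estimate from concatenated visual + tension features."""
    return np.hstack([vis_feats, tensions]) @ W_fused

print(estimate_force(X_vis[:1], X_ten[:1]))  # one fused force estimate
```

A tension-only estimator can be fit the same way on X_ten alone, which is the kind of baseline the up-to-60% error-reduction claim is compared against.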
Not surprisingly, MOE passively applies 74% less force than rigid grippers while grasping comparable amounts of hair. This suggests MOE can provide more comfortable care without sacrificing effectiveness. 🧵3/7
March 17, 2025 at 4:02 PM
💡 Rigid robots feel "rough" on hair and struggle with safety when hair obscures the scalp. 🐙🤖 We introduce MOE: a tendon-driven soft robot hand whose inherent compliance provides both comfort and safety during interaction. 🧵2/7
March 17, 2025 at 4:02 PM
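As a rough intuition for why a tendon-driven soft hand is safer than a rigid position-controlled gripper, here is a toy model in which the tendon acts as a soft series spring and the controller caps tension; the constants TENDON_STIFFNESS and MAX_TENSION are assumed for illustration, not taken from the paper:

```python
import numpy as np

# Illustrative constants, not values from the paper.
TENDON_STIFFNESS = 50.0  # N/m: soft series elasticity of the tendon path
MAX_TENSION = 2.0        # N: tension cap enforced by the controller

def tendon_tension(motor_disp, finger_disp):
    """Tension from stretch of the elastic tendon; tendons only pull."""
    stretch = max(motor_disp - finger_disp, 0.0)
    return min(TENDON_STIFFNESS * stretch, MAX_TENSION)

# If contact blocks the finger (finger_disp stuck at 0.02 m), the applied
# force grows only with the soft stiffness and saturates at MAX_TENSION,
# unlike a stiff position-controlled gripper whose force keeps climbing.
for motor_disp in np.linspace(0.0, 0.1, 5):
    print(f"motor {motor_disp:.3f} m -> tension "
          f"{tendon_tension(motor_disp, 0.02):.2f} N")
```

Because contact force saturates at the tension cap instead of growing with position error, the hand yields around the scalp rather than pressing into it.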
🎉 Excited to share that our paper was a finalist for Best Paper at #HRI2025! We introduce MOE-Hair, a soft robot system for hair care 💇🏻💆🏼 that uses mechanical compliance and visual force sensing for safe, comfortable interaction. Check out our work: moehair.github.io @cmurobotics.bsky.social 🧵1/7
March 17, 2025 at 4:02 PM