timschneider94.bsky.social
franky supports position & velocity control in both joint and task space — plus gripper control, contact reactions, and more! 🤖
With franky, you get real-time control both in C++ & Python: commands are fully preemptible, and Ruckig replans smooth trajectories on the fly.
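A minimal sketch of the pattern described here: a motion command that executes incrementally and can be preempted by a new target at any control tick, with the trajectory replanned on the fly. This is not franky's actual API, and simple velocity-limited stepping stands in for Ruckig's jerk-limited online trajectory generation; all names are illustrative.

```python
# Hypothetical illustration of preemptible commands with on-the-fly
# replanning (NOT franky's API; a velocity-limited 1-D stand-in for
# Ruckig's online trajectory generation).

class PreemptibleMotion:
    def __init__(self, position=0.0, max_step=0.1):
        self.position = position   # current 1-D joint position
        self.max_step = max_step   # per-tick velocity limit
        self.target = position

    def command(self, target):
        """Preempt any ongoing motion: replace the target; the next
        tick replans toward it from the current state."""
        self.target = target

    def tick(self):
        """One control cycle: step toward the target, bounded by max_step."""
        error = self.target - self.position
        self.position += max(-self.max_step, min(self.max_step, error))
        return self.position

motion = PreemptibleMotion()
motion.command(1.0)
for _ in range(5):       # partway to the first target...
    motion.tick()
motion.command(-0.5)     # ...preempt mid-motion with a new target
for _ in range(20):
    motion.tick()
```

Because replanning starts from the current state each tick, the switch to the new target happens without a jump in position, which is the essence of preemptible real-time control.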
July 31, 2025 at 5:09 PM
4️⃣ We also find that even for wider holes, the resulting vision-only policy is significantly less robust to changes in the environment (different hole sizes or angles) when tested in a zero-shot setting. In contrast, the vision-tactile policy remains robust even under unseen conditions.
June 13, 2025 at 10:54 AM
6️⃣ We observe that TAP learns reasonable strategies. E.g., when estimating the pose of a wrench, TAP first scans the surface to find the handle and then moves toward one of the ends to determine the position and orientation.
June 12, 2025 at 12:33 PM
5️⃣ Key Experiments:
We tested TAP on a variety of ap_gym (github.com/TimSchneider...) tasks from the TactileMNIST benchmark (sites.google.com/robot-learni...).
In all cases, TAP learns to actively explore & infer object properties efficiently.
June 12, 2025 at 12:33 PM
3️⃣ Introducing TAP:
We propose TAP (Task-agnostic Active Perception) — a novel method that combines RL and transformer models for tactile exploration. Unlike previous methods, TAP is completely task-agnostic, i.e., it can learn to solve a variety of active perception problems.
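The loop TAP operates in can be sketched as follows. This is a hypothetical illustration, not TAP's actual code: the names (env, policy, predictor) are assumptions, and the transformer over the touch history is replaced by a simple average so the sketch stays self-contained.

```python
# Hypothetical active-perception loop: the policy chooses where to sense
# next, observations accumulate into a history, and a predictor infers
# the hidden object property from that history. A running mean stands in
# for TAP's transformer; a scalar stands in for the object property.

def active_perception_episode(env, policy, predictor, steps=8):
    """Alternate between choosing the next sensing action and collecting
    observations, then estimate the hidden property from the history."""
    history = []
    obs = env.reset()
    for _ in range(steps):
        action = policy(history, obs)    # where to touch next
        obs = env.step(action)
        history.append((action, obs))
    return predictor(history)            # estimate from the full history

class ToyEnv:
    """Toy stand-in: each 'touch' returns the hidden scalar property."""
    def __init__(self, hidden=3.0):
        self.hidden = hidden
    def reset(self):
        return 0.0
    def step(self, action):
        return self.hidden

policy = lambda history, obs: len(history)                       # scan in order
predictor = lambda history: sum(o for _, o in history) / len(history)

estimate = active_perception_episode(ToyEnv(), policy, predictor)
```

Because nothing in the loop is tied to one inference target, swapping the predictor head is enough to reuse it for a different perception task, which is what "task-agnostic" refers to here.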
June 12, 2025 at 12:33 PM