But please don't tell Franka Robotics 🤫, because using their API like that is probably illegal.
franky exposes most libfranka functionality in its Python API:
🔧 Redefine end-effector properties
⚖️ Tune joint impedance
🛑 Set force/torque thresholds
…and much more!
No ROS nodes. No launch files. Just five lines of Python.
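Along the lines of the README's minimal example (the IP address is a placeholder for your robot's control unit):

```python
from franky import Affine, CartesianMotion, Robot, ReferenceType

robot = Robot("172.16.0.2")            # placeholder IP
robot.relative_dynamics_factor = 0.05  # run at 5% of the maximum dynamics

# Move the end effector 20 cm along the negative x-axis of its own frame
motion = CartesianMotion(Affine([-0.2, 0.0, 0.0]), ReferenceType.Relative)
robot.move(motion)
```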
With franky, you get real-time control both in C++ & Python: commands are fully preemptible, and Ruckig replans smooth trajectories on the fly.
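Preemption then looks roughly like this (a sketch: the asynchronous flag and the join_motion helper are assumptions about franky's non-blocking API; consult the docs before relying on them):

```python
from franky import Affine, CartesianMotion, Robot, ReferenceType

robot = Robot("172.16.0.2")  # placeholder IP

# Start a motion without blocking the Python process ...
robot.move(CartesianMotion(Affine([0.0, 0.2, 0.0]), ReferenceType.Relative),
           asynchronous=True)

# ... then preempt it mid-flight: Ruckig blends smoothly into the new target.
robot.move(CartesianMotion(Affine([0.0, 0.0, 0.1]), ReferenceType.Relative),
           asynchronous=True)
robot.join_motion()  # wait for the final motion to finish (assumed helper)
```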
Turns out, as long as the hole is wide enough, a vision-only agent learns to solve the task just as well as a vision-tactile agent. However, once we make the hole tighter, the vision-only agent fails to solve the task and gets stuck in a local minimum.
We built a real, fully autonomous, and self-resetting tactile insertion setup and trained model-based RL directly in the real world. Using this setup, we ran extensive experiments to understand the role of vision and touch in this task.
#Robotics #TactileSensing #RL #DexterousManipulation @ias_tudarmstadt
🧵
TAP outperformed both random and prior state-of-the-art (HAM) baselines, highlighting the value of attention-based models and off-policy RL for tactile exploration.
We tested TAP on a variety of ap_gym (github.com/TimSchneider...) tasks from the TactileMNIST benchmark (sites.google.com/robot-learni...).
In all cases, TAP learns to actively explore & infer object properties efficiently.
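For a feel of the interface, a sketch assuming ap_gym registers Gymnasium-style environments (the environment ID below is made up; see the ap_gym and TactileMNIST docs for the real ones):

```python
import gymnasium as gym
import ap_gym  # assumed to register the TactileMNIST environments on import

env = gym.make("TactileMNIST-v0")  # placeholder ID
obs, info = env.reset(seed=0)
for _ in range(100):
    action = env.action_space.sample()  # TAP would instead choose where to touch next
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```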
TAP jointly learns action and prediction with a shared transformer encoder, using a combination of RL and supervised learning. We show that TAP's formulation arises naturally when optimizing a supervised learning objective w.r.t. action and prediction.
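Conceptually, the shared-trunk idea looks like this (an illustrative sketch only: all sizes, the two-head layout, and the last-token pooling are assumptions, not the paper's architecture):

```python
import torch
import torch.nn as nn

class TAPSketch(nn.Module):
    """Shared transformer encoder feeding an action (policy) head trained
    with RL and a prediction head trained with supervised learning."""

    def __init__(self, obs_dim=64, act_dim=4, pred_dim=10, d_model=128):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # shared trunk
        self.policy_head = nn.Linear(d_model, act_dim)  # optimized by the RL loss
        self.pred_head = nn.Linear(d_model, pred_dim)   # optimized by the supervised loss

    def forward(self, obs_seq):
        # obs_seq: (batch, time, obs_dim) history of tactile observations
        h = self.encoder(self.embed(obs_seq))
        h_last = h[:, -1]  # summary of everything sensed so far
        return self.policy_head(h_last), self.pred_head(h_last)

# Both losses backpropagate through the shared encoder:
# loss = rl_loss(policy_out, ...) + supervised_loss(pred_out, labels)
```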
We propose TAP (Task-agnostic Active Perception) — a novel method that combines RL and transformer models for tactile exploration. Unlike previous methods, TAP is completely task-agnostic, i.e., it can learn to solve a variety of active perception problems.
🧵
#Robotics #TactileSensing #ReinforcementLearning #Transformers #ActivePerception @ias-tudarmstadt.bsky.social