Bart Duisterhof
@bardienus.bsky.social
PhD Student @cmurobotics.bsky.social with @jeff-ichnowski.bsky.social || DUSt3R Research Intern @naverlabseurope || 4D Vision for Robot Manipulation 📷

He/Him - https://bart-ai.com
Pinned
Imagine if robots could fill in the blanks in cluttered scenes.

✨ Enter RaySt3R: a single masked RGB-D image in, complete 3D out.
It infers depth, object masks, and confidence for novel views, and merges the predictions into a single point cloud. rayst3r.github.io
RaySt3R was accepted to NeurIPS! Check out the Hugging Face demo for image-to-3D in cluttered scenes: huggingface.co/spaces/bartd...
September 19, 2025 at 5:28 PM
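A minimal sketch of the merging step described in the pinned post above, for readers who want a concrete picture. This is not RaySt3R's code: the backprojection, thresholds, and per-view bookkeeping below are illustrative assumptions showing how per-view depth, mask, and confidence predictions could be fused into a single point cloud.

import numpy as np

def backproject(depth, K):
    # Lift a depth map (H, W) into camera-frame 3D points using pinhole intrinsics K.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - K[0, 2]) * depth / K[0, 0]
    y = (v - K[1, 2]) * depth / K[1, 1]
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

def merge_views(views, K, conf_thresh=0.5):
    # Each view is (depth, object_mask, confidence, cam_to_world 4x4).
    # Keep points that fall on a predicted object and are sufficiently confident,
    # move them into a shared world frame, and concatenate across views.
    clouds = []
    for depth, mask, conf, cam_to_world in views:
        pts = backproject(depth, K)
        keep = (mask & (conf > conf_thresh)).reshape(-1)
        pts_h = np.concatenate([pts[keep], np.ones((keep.sum(), 1))], axis=1)
        clouds.append((pts_h @ cam_to_world.T)[:, :3])
    return np.concatenate(clouds, axis=0)

In the actual system the novel-view predictions come from the network; the sketch only covers the filter-and-merge bookkeeping that follows.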
In "hearing the slide"👂 (led by @yuemin-mao.bsky.social ) we estimate *loss* of contact with a contact microphone, and use it to learn dynamic constraints.⚡ It allows moving multiple intricate objects🍷 efficiently, even objects that would otherwise be hard to grasp. fast-non-prehensile.github.io
June 12, 2025 at 3:32 PM
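As a rough stand-in for the sensing idea in the post above (an assumed energy-based detector, not necessarily the paper's method): loss of sliding contact could be flagged when the contact microphone's short-time energy drops below a threshold. The frame length, threshold, and function name are hypothetical.

import numpy as np

def contact_lost(audio, frame_len=512, rms_thresh=0.01):
    # Split the contact-microphone signal into frames and compute RMS energy per frame.
    n_frames = len(audio) // frame_len
    frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    # A sliding object radiates vibration into the mic; near-silence suggests contact was lost.
    return rms < rms_thresh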
Reposted by Bart Duisterhof
For which the code is also available: github.com/naver/pow3r
June 12, 2025 at 1:41 PM
Reposted by Bart Duisterhof
The Best Student Paper Award goes to MASt3R-SfM! #3DV2025
March 26, 2025 at 1:25 AM
Reposted by Bart Duisterhof
🎉 Excited to share that our paper was a finalist for best paper at #HRI2025! We introduce MOE-Hair, a soft robot system for hair care 💇🏻💆🏼 that uses mechanical compliance and visual force sensing for safe, comfortable interaction. Check out our work: moehair.github.io @cmurobotics.bsky.social 🧵1/7
March 17, 2025 at 4:02 PM
Reposted by Bart Duisterhof
MUSt3R: Multi-view Network for Stereo 3D Reconstruction

Yohann Cabon, Lucas Stoffl, Leonid Antsfeld, Gabriela Csurka, Boris Chidlovskii, Jerome Revaud, @vincentleroy.bsky.social

tl;dr: make DUSt3R symmetric and iterative, and add a multi-layer memory mechanism -> multi-view DUSt3R

arxiv.org/abs/2503.01661
March 4, 2025 at 8:26 AM
Great news, CMU's Center for Machine Learning and Health (CMLH) decided to fund another year of our research! If you're a PhD student at CMU, consider applying for future iterations of the fellowship - the funding is generous and relatively unconstrained :)
January 31, 2025 at 8:28 PM
Reposted by Bart Duisterhof
Watch Professor Jeff Ichnowski's RI seminar talk: "Learning for Dynamic Robot Manipulation of Deformable and Transparent Objects" 🦾🤖

@jeff-ichnowski.bsky.social closed out our Fall seminar series. Keep an eye out for the Spring schedule in the new year!

www.youtube.com/watch?v=DvvF...
November 26, 2024 at 6:45 PM
Reposted by Bart Duisterhof
Intro Post
Hello World!
I'm a second-year Robotics PhD student at CMU, working on distributed dexterous manipulation, accessible soft robots and sensors, sample-efficient robot learning, and causal inference.

Here are my cute robots:
PS: Videos are old and sped up. They move slower in the real world :3
November 23, 2024 at 6:49 PM
Reposted by Bart Duisterhof
My growing list of #computervision researchers on Bsky.

Missed you? Let me know.

go.bsky.app/M7HGC3Y
November 19, 2024 at 11:00 PM
Reposted by Bart Duisterhof
Welcome to all new arrivals here on Bluesky! :) Here's a starter pack of people working on computer vision.
go.bsky.app/PkAKJu5
November 17, 2024 at 8:05 AM
Reposted by Bart Duisterhof
Now that my general computer vision starter pack is full (150/150 entries), here is one specific to 3D vision: go.bsky.app/Cfm9XFe
November 21, 2024 at 8:15 AM
Check out this work by my labmates: learning dynamic tasks with a soft robotic hand!
Can soft robots rapidly spin pens like humans? 🤔 We've shown that soft robot hands can master the dynamic task of pen spinning, with no hours of GPU training or complex sim-to-real needed! Check out soft-spin.github.io 🤖✍️ @cmurobotics.bsky.social
1/5 🧵
November 21, 2024 at 8:20 AM
Hello world! I'm a robotics PhD student at CMU, working at the intersection of vision and robot manipulation.
November 18, 2024 at 10:48 AM