@linyijin.bsky.social
jinlinyi.github.io
PhD student at University of Michigan; Student Researcher @Google DeepMind; ex-intern @Adobe
See more scenes & details of how it works on our website: stereo4d.github.io
Paper link: arxiv.org/abs/2412.09621
Thanks to the great team! Richard Tucker, @zhengqili.bsky.social, David Fouhey, @snavely.bsky.social, @holynski.bsky.social

Please stay tuned for updates on data & code.
Stereo4D: Learning How Things Move in 3D from Internet Stereo Videos
We use stereo videos from the internet to create a dataset of over 100,000 real-world 4D scenes with metric scale and long-term 3D motion trajectories.
stereo4d.github.io
December 13, 2024 at 3:13 AM
This type of data is ideal for learning the structure and dynamics of the real world.

We gave this a shot by extending DUSt3R to model 3D motion and training on our dataset. Given a pair of frames, our model predicts a 3D point cloud and the corresponding 3D motion trajectories.
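As a rough sketch of that interface, the model maps two frames to a per-pixel pointmap plus a per-point 3D trajectory over time. Everything below (function names, shapes, the placeholder geometry) is an illustrative assumption, not the released Stereo4D code:

```python
import numpy as np

def predict_points_and_motion(frame_t0, frame_t1, num_steps=8):
    """Stand-in for the learned model (hypothetical interface): given two
    RGB frames, return a per-pixel 3D pointmap of shape (H, W, 3) and
    per-pixel 3D trajectories of shape (H, W, num_steps, 3)."""
    h, w, _ = frame_t0.shape
    # Placeholder geometry: a fronto-parallel plane at unit depth.
    ys, xs = np.mgrid[0:h, 0:w]
    points = np.stack(
        [xs - w / 2, ys - h / 2, np.ones((h, w))], axis=-1
    ).astype(np.float32)
    # Placeholder motion: each point drifts linearly along x, standing in
    # for the long-term 3D trajectories a trained model would predict.
    offsets = np.linspace(0.0, 1.0, num_steps, dtype=np.float32)
    drift = np.array([0.1, 0.0, 0.0], dtype=np.float32)
    trajectories = points[:, :, None, :] + offsets[None, None, :, None] * drift
    return points, trajectories

# Two dummy RGB frames just to exercise the shapes.
frame_a = np.zeros((64, 96, 3), dtype=np.float32)
frame_b = np.zeros((64, 96, 3), dtype=np.float32)
pts, traj = predict_points_and_motion(frame_a, frame_b)
print(pts.shape, traj.shape)  # (64, 96, 3) (64, 96, 8, 3)
```

The point of the sketch is the output contract: one 3D point per pixel, plus a short 3D track per point, which is exactly the supervision the Stereo4D dataset provides.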