I'm interested in 3D Reconstruction, Radiance Fields, Gaussian splatting, 3D Scene Rendering, 3D Scene Understanding, etc.
Webpage: https://anttwo.github.io/
Check out our new work: MILo: Mesh-In-the-Loop Gaussian Splatting!
🎉Accepted to SIGGRAPH Asia 2025 (TOG)
MILo is a novel differentiable framework that extracts meshes directly from Gaussian parameters during training.
🧵👇
- 19x faster convergence ⚡
- 370x fewer FLOPs than FLUX-dev 📉
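Here is a toy sketch of the mesh-in-the-loop idea (my simplification, NOT the official MILo code): mesh vertices are derived differentiably from the Gaussian parameters, so a mesh-based loss backpropagates into the Gaussians themselves. The extraction and losses below are placeholder stand-ins.

```python
# Toy sketch of mesh-in-the-loop training (assumed structure, not MILo's code).
import torch

means  = torch.nn.Parameter(torch.randn(1000, 3))
scales = torch.nn.Parameter(torch.rand(1000, 3) * 0.01)
opt = torch.optim.Adam([means, scales], lr=1e-2)

def extract_vertices(means, scales):
    # Stand-in for MILo's differentiable mesh extraction: here we simply
    # place candidate vertices on each Gaussian's axis-aligned extent.
    return torch.cat([means + scales, means - scales], dim=0)

for step in range(100):
    verts = extract_vertices(means, scales)              # differentiable "mesh"
    photometric = torch.zeros(())                        # placeholder for the usual splatting loss
    mesh_reg = (verts.norm(dim=-1) - 1.0).abs().mean()   # toy loss: pull the mesh onto a unit sphere
    loss = photometric + 0.1 * mesh_reg
    opt.zero_grad()
    loss.backward()                                      # gradients flow into the Gaussian parameters
    opt.step()
```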
Antoine Guédon @antoine-guedon.bsky.social
Sinisa Stekovic
Renaud Marlet
👏
@iccv.bsky.social
iccv.thecvf.com/Conferences/...
If you're in Nashville and want to discuss detailed 3D mesh reconstruction from sparse or dense RGB images, let's connect!
@kyotovision.bsky.social
🍵 MAtCha reconstructs sharp, accurate and scalable meshes of both foreground AND background from just a few unposed images (e.g., 3 to 10 images)...
...While also working with dense-view datasets (hundreds of images)!
cvpr.thecvf.com/Conferences/...
🍵 MAtCha Gaussians: Atlas of Charts for High-Quality Geometry and Photorealism From Sparse Views
@antoine-guedon.bsky.social @kyotovision.bsky.social
📄 pdf: arxiv.org/abs/2412.06767
🌐 webpage: anttwo.github.io/matcha/
Registration is open (it's free) with priority given to authors of accepted papers: cvprinparis.github.io/CVPR2025InPa...
Big 🧵👇 with details!
@gbourmaud.bsky.social @vincentlepetit.bsky.social
Introducing AnySat: one model for any resolution (0.2m–250m), scale (0.3–2600 hectares), and modalities (choose from 11 sensors & time series)!
Try it with just a few lines of code:
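A hedged sketch of what "a few lines" could look like: the hub path, entry-point name, batch keys, and argument names below are my assumptions based on typical torch.hub usage; check the official AnySat repository for the exact API.

```python
# Assumed AnySat usage via torch.hub (names and shapes are illustrative).
import torch

model = torch.hub.load("gastruc/anysat", "anysat", pretrained=True)  # assumed entry point

# One-modality example: a Sentinel-2 time series tile.
batch = {
    "s2": torch.randn(1, 12, 10, 6, 6),          # (batch, dates, bands, H, W) - assumed layout
    "s2_dates": torch.randint(0, 365, (1, 12)),  # acquisition day-of-year - assumed key
}
features = model(batch, patch_size=10, output="tile")  # one embedding per tile (assumed args)
print(features.shape)
```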
☑️ With MAtCha, we leverage a pretrained depth model to recover sharp meshes from sparse views, including both foreground and background, within minutes! 🧵
🌐Webpage: anttwo.github.io/matcha/
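To illustrate the "pretrained depth prior" step, here is a sketch using MiDaS as a stand-in monocular depth model (MAtCha's actual depth backbone may differ); the torch.hub calls below follow MiDaS's documented usage.

```python
# Per-view relative depth as a geometric prior for mesh fitting.
import torch

midas = torch.hub.load("intel-isl/MiDaS", "DPT_Large")
midas.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").dpt_transform

def depth_prior(rgb):
    """rgb: (H, W, 3) uint8 numpy array for one input view."""
    with torch.no_grad():
        pred = midas(transform(rgb))   # (1, h, w) relative depth map
    return pred.squeeze(0)             # prior to initialize/constrain the mesh
```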
@antoine-guedon.bsky.social, Tomoki Ichikawa, Kohei Yamashita, Ko Nishino
tl;dr: underlying scene geometry mesh → an Atlas of Charts → rendered with 2D Gaussian surfels
arxiv.org/abs/2412.06767
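Unpacking the tl;dr with a toy sketch (my simplification, not the paper's code): each chart is a learnable map from a 2D UV grid into 3D, and 2D Gaussian surfels can be anchored on the chart surfaces for rendering.

```python
# Toy "atlas of charts": deformable 2D sheets embedded in 3D.
import torch

class Chart(torch.nn.Module):
    def __init__(self, res=64):
        super().__init__()
        u, v = torch.meshgrid(torch.linspace(0, 1, res),
                              torch.linspace(0, 1, res), indexing="ij")
        # Learnable 3D position for every UV sample: a 2D sheet living in 3D.
        self.points = torch.nn.Parameter(torch.stack([u, v, torch.zeros_like(u)], -1))

    def surfels(self):
        # Finite-difference tangents give each surfel's orientation and extent
        # (2D Gaussians spanned by the local tangent vectors of the sheet).
        du = self.points[1:, :-1] - self.points[:-1, :-1]
        dv = self.points[:-1, 1:] - self.points[:-1, :-1]
        centers = self.points[:-1, :-1]
        return centers.reshape(-1, 3), du.reshape(-1, 3), dv.reshape(-1, 3)

atlas = torch.nn.ModuleList([Chart() for _ in range(8)])  # one chart per scene region
```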