Aditya Chetan
justachetan.bsky.social
PhD Student at Cornell. Working in Vision and Graphics. justachetan.github.io
We also show improved performance in downstream applications like rendering, collision simulation, and PDE solving.
(n/n)
June 10, 2025 at 2:11 PM
To mitigate this noise, we propose a two-pronged solution. First, we leverage the classical technique of polynomial fitting: we fit low-order polynomials to the learned signal and apply autodiff to the fitted polynomial.
(4/n)
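The polynomial-fitting idea can be sketched in a few lines of NumPy. This is a minimal 1D illustration, not the paper's implementation: sample the noisy signal in a small window, least-squares fit a low-order polynomial, and differentiate the fitted polynomial analytically. The window size, degree, and sample count here are illustrative choices.

```python
import numpy as np

def poly_derivative(field, x0, h=0.05, degree=2, n_samples=16):
    # Sample the (noisy) field in a small window around x0.
    xs = np.linspace(x0 - h, x0 + h, n_samples)
    ys = field(xs)
    # Least-squares fit of a low-order polynomial through the samples.
    coeffs = np.polyfit(xs, ys, degree)
    # Differentiate the fitted polynomial analytically and evaluate at x0.
    return np.polyval(np.polyder(coeffs), x0)

# Toy "learned" signal: smooth ground truth plus high-frequency noise.
rng = np.random.default_rng(0)
noisy = lambda x: np.sin(x) + 1e-3 * rng.standard_normal(np.shape(x))
print(poly_derivative(noisy, 1.0))  # close to the true derivative cos(1.0)
```

Because the fitted polynomial smooths over many samples, the high-frequency noise largely cancels instead of being amplified by differentiation.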
What causes these artifacts? We note that signals learned by hybrid neural fields exhibit high-frequency noise (see FFT of a 1D slice of a 2D SDF), which gets amplified when we take derivatives using standard tools like autodiff.
(3/n)
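The amplification effect is easy to reproduce on a toy 1D signal (a hypothetical stand-in, not the paper's data): a faint high-frequency component of angular frequency k is scaled by k under differentiation, so 0.1% noise in the signal becomes 20% noise in the derivative.

```python
import numpy as np

# Toy 1D "learned" signal: smooth base plus faint high-frequency noise,
# mimicking the spectra observed in hybrid neural fields.
n = 1024
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
signal = np.sin(x) + 1e-3 * np.sin(200 * x)

# Spectral differentiation: each Fourier mode of angular frequency k
# is multiplied by i*k, so the k=200 noise mode is amplified 200x
# relative to the k=1 base signal.
k = np.fft.fftfreq(n, d=x[1] - x[0]) * 2 * np.pi
deriv = np.fft.ifft(1j * k * np.fft.fft(signal)).real

print(np.max(np.abs(signal - np.sin(x))))  # noise in signal: ~0.001
print(np.max(np.abs(deriv - np.cos(x))))   # noise in derivative: ~0.2
```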
Hybrid neural fields like Instant NGP have made training neural fields extremely efficient. However, we find that they fall short of being "faithful" representations, exhibiting noisy artifacts when we compute their spatial derivatives with autodiff.
(2/n)
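For context, the standard autodiff pattern for spatial derivatives looks like this (a sketch with an analytic SDF standing in for the neural field; the real setting would differentiate the network itself):

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in for a trained field: analytic SDF of a unit sphere.
def sdf(p):
    return jnp.linalg.norm(p) - 1.0

# Spatial derivative via autodiff: the SDF gradient, used e.g. as the
# surface normal in rendering.
normal = jax.grad(sdf)

p = jnp.array([0.0, 0.0, 2.0])
print(normal(p))  # gradient points radially outward: [0, 0, 1]
```

With an analytic SDF this gradient is exact; with a hybrid neural field, the same `jax.grad` call differentiates the learned signal, noise included, which is where the artifacts come from.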
Check out our poster at #CVPR2025 on accurate differential operators for hybrid neural fields (like Instant NGP)!

🗓️ Fri, June 13, 10:30 AM–12:30 PM
📍 ExHall D, Poster #34
🔗 justachetan.github.io/hnf-derivati...
👉 cvpr.thecvf.com/virtual/2025...

Details ⬇️ (1/n)