NYU's Center for Data Science
Rate-In's approach: We dynamically adjust dropout rates by measuring information loss in each layer. Where features are critical, we preserve more; where they're redundant, we drop more. Like adaptive noise, guided by information theory!
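The idea above can be sketched in a few lines. This is an illustrative toy, not the Rate-In algorithm itself: it uses a simplified information-loss proxy (relative L2 change of the layer output under dropout) and a hypothetical step-based update rule to nudge each layer's dropout rate toward a target loss.

```python
import numpy as np

def adjust_dropout_rate(activations, rate, target_loss=0.1, step=0.05, seed=0):
    """One adaptive step: drop units, measure a simple information-loss
    proxy, and nudge the dropout rate toward a target loss.
    Sketch only -- the proxy and update rule are simplifying assumptions,
    not the paper's method."""
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= rate
    dropped = activations * mask / (1.0 - rate)  # inverted-dropout scaling
    loss = np.linalg.norm(activations - dropped) / np.linalg.norm(activations)
    # Critical features (loss too high) -> preserve more (lower rate);
    # redundant features (loss low) -> drop more (raise rate).
    if loss > target_loss:
        rate = max(0.0, rate - step)
    else:
        rate = min(0.9, rate + step)
    return rate, loss
```

Run per layer during inference, this kind of feedback loop keeps the injected noise adapted to how much information each layer actually carries.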
A) Says "There's a tumor" with blind confidence.
B) Points out exactly which areas it's uncertain about, helping focus your expertise.
We present "Rate-In", a technique that helps neural networks better express their uncertainty during inference, which is especially crucial for medical applications!
with Tal Zeevi, @yann-lecun.bsky.social, Lawrence H. Staib, and John Onofrey
Thank you all my collaborators! 🎉
At 5K, I will share my secret to amazing paper titles 😎