We hope this work inspires future research in this area!
And please share your favorite images from our gallery!!
6/6
Please see our gallery to browse all our images from many datasets (incl. ImageNet, Stanford Dogs, CUB 200, Flowers-102, Food-101): linear-gradient-matching.github.io/gallery/
5/6
The learned images seem to contain more discriminative features than any single real image, leading to a better classifier.
4/6
Our meta loss is simply the distance between these gradients!
Critically, we also parameterize our images as pyramids as a form of implicit regularization.
3/6
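As a rough sketch of how this can look in practice (not the paper's actual code; `backbone`, `linear_head`, and the pyramid `levels` below are placeholder names, and the squared-L2 gradient distance is just one reasonable choice), the synthetic images are built by summing upsampled learnable tensors, and the meta loss compares the linear head's gradients on real vs. synthetic batches:

```python
import torch
import torch.nn.functional as F

def pyramid_to_image(levels, size=224):
    # Hypothetical pyramid parameterization: the learned image is the sum of
    # learnable tensors at several resolutions, upsampled to the full size.
    return sum(F.interpolate(lvl, size=(size, size), mode="bilinear",
                             align_corners=False) for lvl in levels)

def gradient_matching_loss(backbone, linear_head, syn_images, syn_labels,
                           real_images, real_labels):
    # Gradients of the linear head's loss on a real batch (the target).
    real_feats = backbone(real_images).detach()
    real_loss = F.cross_entropy(linear_head(real_feats), real_labels)
    real_grads = torch.autograd.grad(real_loss, tuple(linear_head.parameters()))

    # Gradients of the same head's loss on the learned synthetic images;
    # create_graph=True keeps the graph so this loss can update the images.
    syn_feats = backbone(syn_images)
    syn_loss = F.cross_entropy(linear_head(syn_feats), syn_labels)
    syn_grads = torch.autograd.grad(syn_loss, tuple(linear_head.parameters()),
                                    create_graph=True)

    # Meta loss: a simple (squared L2) distance between the two gradient sets.
    return sum(((g_s - g_r) ** 2).sum()
               for g_s, g_r in zip(syn_grads, real_grads))
```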
Instead, we focus on learning images to train *linear classifiers* on top of pre-trained models, a more relevant task in the era of foundation models.
2/6
We're looking forward to a great workshop and hope to see all of you there! 🌴
3/
The deadline for long papers to be published in the ICCV workshop proceedings is July 7; all other submissions have until August 29.
Please reply or DM me with any questions!
2/