Roy
@roymiles.bsky.social
Research Scientist @Noah's Ark Lab, Huawei
PhD, Imperial College London
roymiles.github.io
Working on multi-modality and efficient ML
Hey! I will be at NeurIPS @neuripsconf.bsky.social this week (Friday 13th) presenting our paper VeLoRA.

tldr; a simple method for compressing gradients via a coarse reconstruction during the backward pass. Significant memory reductions while being complementary to LoRA!

github.com/roymiles/VeL...
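
For intuition, here's a minimal PyTorch sketch of the general idea: cache a rank-1 compression of the activations instead of the activations themselves, then reconstruct them coarsely in the backward pass. The names and the projection choice are illustrative, not the paper's exact implementation.

```python
import torch

class CompressedLinear(torch.autograd.Function):
    """Toy linear layer that caches a rank-1 compression of its input
    instead of the full activation, then reconstructs it coarsely for
    the backward pass. Illustrative sketch, not the VeLoRA code."""

    @staticmethod
    def forward(ctx, x, weight, v):
        # x: (batch, d_in), weight: (d_out, d_in), v: (d_in,) fixed unit vector
        coeffs = x @ v                   # (batch,) -- all we keep for backward
        ctx.save_for_backward(coeffs, weight, v)
        return x @ weight.t()

    @staticmethod
    def backward(ctx, grad_out):
        coeffs, weight, v = ctx.saved_tensors
        x_hat = coeffs.unsqueeze(1) * v  # coarse rank-1 reconstruction of x
        grad_x = grad_out @ weight       # exact; does not need x
        grad_w = grad_out.t() @ x_hat    # approximate weight gradient
        return grad_x, grad_w, None

# Usage: store one scalar per row instead of the full d_in activation.
x = torch.randn(8, 64, requires_grad=True)
w = torch.randn(32, 64, requires_grad=True)
v = torch.nn.functional.normalize(torch.randn(64), dim=0)
CompressedLinear.apply(x, w, v).sum().backward()
```

The memory saving comes from caching one coefficient per row rather than the full activation; the cost is an approximate weight gradient.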
December 9, 2024 at 2:13 PM
I made a starter pack for people working or interested in multi-modality learning.

It would be good to add lots more people, so do comment and I'll add you!

go.bsky.app/97fAH2N
November 27, 2024 at 1:15 PM
Reposted by Roy
Our BMVC poster is today!

Knowledge distillation can work in cross-task settings and even with a randomly initialised teacher! Our inverted projection decomposes into knowledge transfer and spectral regularisation, enabling teacher-free distillation with improvements on ImageNet-1K at no extra cost.
https://arxiv.org/abs/2403.14494
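
For intuition only, a minimal sketch of what feature distillation with an inverted projection might look like, assuming "inverted" means mapping teacher features down into the student's space (the reverse of the usual student-to-teacher projector). All names and dimensions are illustrative, not the paper's code; see the paper for the actual decomposition into knowledge transfer and spectral regularisation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical dimensions for illustration only.
d_teacher, d_student = 1024, 384
proj = nn.Linear(d_teacher, d_student, bias=False)  # the "inverted" projection

def inverted_distill_loss(f_student, f_teacher):
    # f_student: (batch, d_student), f_teacher: (batch, d_teacher)
    target = proj(f_teacher)  # map teacher features DOWN into student space
    return F.mse_loss(f_student, target)
```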
November 27, 2024 at 10:18 AM
Some related work worth checking out if you find this cool.

mbaradad.github.io/shaders21k/ - learning good visual features from procedurally generated images.
arxiv.org/abs/2403.14494 - distillation from randomly weighted teachers.
What Makes a Good Dataset for Knowledge Distillation?
Logan Frank, Jim Davis

tl;dr: you can distill models on anything but random noise.
arxiv.org/abs/2411.12817
November 22, 2024 at 6:17 PM
Hi Bluesky!

I'm currently working on multi-modality learning and efficient ML (mobile devices).

Hoping to regularly post about topics I find cool (AI, maths, physics).

Looking forward to seeing how this platform grows and meeting you all!
November 21, 2024 at 8:21 AM