Eric W. Tramel
@fujikanaeda.bsky.social
Research Scientist, Engineer, & Builder in ML & AI (learning flavors: generative, privacy-preserving, federated, unsupervised, Bayesian). Ex: Unlearn.ai, Amazon Alexa, Owkin, INRIA, ENS.
FL was a field that caught its own tail in terms of its hype cycle. Now it is just something entirely detached from reality.

Thankfully, while incredibly hyped, the LLM/transformer field has actually delivered, or at least delivered enough to outpace its own hype.
November 25, 2024 at 10:14 PM
But federated learning is synchronous training by design, owing to its need for data privacy; the SMPC (secure multi-party computation) step induces huge synchronization overhead. Not sure FL is the right paradigm to look through.

Why not go back to EASGD, Hogwild!, etc., and build back up from there? Async breaks the synchronization bottleneck, and gradients can be quantized.
November 25, 2024 at 12:47 PM
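To make the second post concrete, here is a minimal sketch of Hogwild!-style asynchronous SGD with a toy gradient-quantization step, written in PyTorch. The model, random data, worker count, hyperparameters, and the quantize_dequantize helper are all illustrative assumptions, not anything from the posts: workers update shared parameters lock-free (the Hogwild! idea), and each gradient is run through a lossy int8 quantize/dequantize round-trip standing in for the compression that would be applied before communicating gradients.

```python
# Toy sketch: lock-free (Hogwild!-style) async SGD with int8 gradient
# quantization. Everything here is illustrative, not a production recipe.
import torch
import torch.multiprocessing as mp


def quantize_dequantize(grad, bits=8):
    """Lossy round-trip: uniform quantization of a gradient tensor.

    Stands in for the compression a worker would apply before sending
    gradients; we quantize and immediately dequantize to keep the
    example self-contained.
    """
    levels = 2 ** (bits - 1) - 1
    scale = grad.abs().max().clamp(min=1e-12) / levels
    return torch.round(grad / scale) * scale


def worker(model, steps=100, lr=0.01):
    # Hogwild!: each process updates the *shared* parameters without
    # any locking; occasional stale/overlapping writes are tolerated.
    for _ in range(steps):
        x = torch.randn(32, 10)          # hypothetical random batch
        y = torch.randn(32, 1)
        loss = torch.nn.functional.mse_loss(model(x), y)
        model.zero_grad()
        loss.backward()
        with torch.no_grad():
            for p in model.parameters():
                if p.grad is not None:
                    p.add_(quantize_dequantize(p.grad), alpha=-lr)


if __name__ == "__main__":
    model = torch.nn.Linear(10, 1)
    model.share_memory()  # parameters live in shared memory, no locks
    procs = [mp.Process(target=worker, args=(model,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

The contrast with FL's synchronous rounds is the point: nothing here waits on a secure-aggregation barrier, so stragglers never stall the other workers, at the cost of stale updates.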