Andrew Silva
@andrewsilva9.bsky.social
Research Scientist @  | Previously @ Toyota Research Institute and Google | PhD from Georgia Tech.
Reposted by Andrew Silva
LLMs are currently one big parameter block that stores all sorts of facts. In our new preprint, we add context-specific memory parameters to the model and pretrain it along with a big bank of memories.

📑 arxiv.org/abs/2510.02375

[1/10]🧵
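For intuition, here is a minimal sketch of what a jointly pretrained memory bank could look like: a learned bank of key/value memory parameters with a context-conditioned soft lookup mixed back into a transformer layer's hidden states. All names and architectural choices here are my own illustration under that assumption, not the paper's actual design.

```python
# Minimal sketch (hypothetical, not the paper's exact architecture): a learned
# bank of memory parameters with a context-conditioned soft lookup whose
# result is mixed back into a transformer layer's hidden states.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryBankLayer(nn.Module):
    def __init__(self, d_model: int, n_memories: int = 4096):
        super().__init__()
        # Bank of memory parameters, trained jointly with the base model.
        self.keys = nn.Parameter(torch.randn(n_memories, d_model) * 0.02)
        self.values = nn.Parameter(torch.randn(n_memories, d_model) * 0.02)
        self.query_proj = nn.Linear(d_model, d_model)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, d_model). Summarize the context into one query.
        query = self.query_proj(hidden.mean(dim=1))            # (batch, d_model)
        scores = query @ self.keys.t() / hidden.size(-1) ** 0.5
        weights = F.softmax(scores, dim=-1)                    # (batch, n_memories)
        retrieved = weights @ self.values                      # (batch, d_model)
        # Add the retrieved, context-specific memory back into every position.
        return hidden + retrieved.unsqueeze(1)

# Usage: insert between existing transformer blocks and pretrain the memory
# bank together with the rest of the parameters.
x = torch.randn(2, 16, 512)
layer = MemoryBankLayer(d_model=512)
print(layer(x).shape)  # torch.Size([2, 16, 512])
```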
October 6, 2025 at 4:06 PM
Reposted by Andrew Silva
Our two phenomenal interns, Alireza Mousavi-Hosseini and Stephen Zhang @syz.bsky.social, have been cooking up some really cool work with Michal Klein and me over the summer.

Relying on optimal transport couplings (to pick noise and data pairs) should, in principle, help guide flow matching.

🧵
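To make the idea concrete, here is a minimal sketch of minibatch OT-coupled flow matching; this is my own illustration of the general technique, not necessarily the method in the thread. Noise and data samples are paired by an optimal assignment on squared distances, and the velocity target is the straight-line displacement between each matched pair.

```python
# Minimal sketch of minibatch OT-coupled flow matching (illustrative only):
# pair noise and data samples with an optimal assignment on squared distances,
# then build interpolation points and straight-line velocity targets.
import numpy as np
from scipy.optimize import linear_sum_assignment

def ot_coupled_pairs(noise: np.ndarray, data: np.ndarray):
    """Reorder samples so that (noise[i], data[i]) is an OT-matched pair."""
    # Squared Euclidean cost between every noise sample and every data sample.
    cost = ((noise[:, None, :] - data[None, :, :]) ** 2).sum(-1)
    row, col = linear_sum_assignment(cost)      # exact minibatch OT matching
    return noise[row], data[col]

def flow_matching_targets(noise: np.ndarray, data: np.ndarray, t: np.ndarray):
    """Interpolation points x_t and target velocities for matched pairs."""
    x_t = (1.0 - t[:, None]) * noise + t[:, None] * data
    velocity = data - noise                     # constant along the straight path
    return x_t, velocity

# Usage: with OT-matched pairs the straight paths cross less, which is the
# intuition for why the coupling should help guide flow matching.
rng = np.random.default_rng(0)
x0 = rng.normal(size=(64, 2))
x1 = rng.normal(loc=3.0, size=(64, 2))
x0m, x1m = ot_coupled_pairs(x0, x1)
xt, v = flow_matching_targets(x0m, x1m, rng.uniform(size=64))
```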
October 3, 2025 at 8:50 PM
Reposted by Andrew Silva
Join us at @emnlpmeeting.bsky.social for:

"Tailoring AI: Exploring Active and Passive LLM Personalization" 🎯🧠

We aim to answer: when should LLMs personalize? And what role do users play in LLM personalization?

📅 Deadline Aug. 1
📝 Details in thread 🧵👇
#EMNLP2025 #LLM #AI #personalization

1/5
June 10, 2025 at 4:21 PM