Rajiv Sambharya
rajivsambharya.bsky.social
Postdoc at Penn Engineering | researching optimization, control, and machine learning | Princeton and Berkeley alumnus
I will be presenting this work at @informs.bsky.social
in Atlanta on Tuesday, October 28, 4:15-5:30 pm, in the Machine Learning and Optimization session (Building B, Level 2, Room B208)!
We learned acceleration algorithms for fast parametric convex optimization. Only 10 training instances are needed for each example, and robustness is guaranteed via the performance estimation problem (PEP)! Joint work w/ Jinho Bok, Nik Matni, and George Pappas!
October 27, 2025 at 6:19 PM
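To give a flavor of the PEP-style guarantee mentioned above: a performance estimation problem certifies an algorithm by bounding its worst case over a whole function class, not just the training instances. Below is a toy sketch of that idea (not the paper's method), restricted to one-dimensional quadratics, where the worst-case contraction of gradient descent with a fixed step schedule can be computed exactly; the step values and the class bounds `mu`, `L` are hypothetical.

```python
import numpy as np

# Toy worst-case certificate in the spirit of PEP, restricted to
# quadratics f(x) = 0.5 * lam * x^2 with curvature lam in [mu, L].
# Gradient descent with steps h_t maps x -> prod_t (1 - h_t * lam) * x,
# so the worst-case contraction over the class is the max of
# |prod_t (1 - h_t * lam)| over lam in [mu, L].
mu, L = 0.1, 1.0
steps = [1.0, 2.5, 1.0]  # hypothetical learned schedule with one "long" step

lams = np.linspace(mu, L, 10001)
contraction = np.max(np.abs(np.prod(1.0 - np.outer(lams, steps), axis=1)))
print(contraction)  # a value < 1 certifies worst-case contraction on this class
```

Note that the middle step 2.5 exceeds the classical 2/L stability limit, yet the schedule as a whole still contracts in the worst case; this is the sense in which learned "long" steps can be robustly verified.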
Reposted by Rajiv Sambharya
📢 New in JMLR (w @rajivsambharya.bsky.social)! 🎉 Data-driven guarantees for classical & learned optimizers via sample bounds + PAC-Bayes theory.

📄 jmlr.org/papers/v26/2...
💻 github.com/stellatogrp/...
September 8, 2025 at 1:10 PM
We learned the hyperparameters to accelerate algorithms over a family of problems. It turns out that we only need 10 training instances in each example, and we learn long steps for (prox) GD! Check out this work with @stellato.io

paper: arxiv.org/pdf/2411.15717
code: github.com/stellatogrp/...
December 2, 2024 at 2:13 AM
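The idea in the post above (tuning a step-size schedule on a handful of instances from a problem family) can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: the quadratic family, the 10 random instances, and the crude coordinate search are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch: learn a shared step-size schedule for gradient
# descent over a small family of random convex quadratics
# f(x) = 0.5 * x^T A x - b^T x, using only 10 training instances.
rng = np.random.default_rng(0)
n, T, n_train = 5, 8, 10  # dimension, GD iterations, training instances

def make_instance():
    M = rng.standard_normal((n, n))
    A = M @ M.T / n + np.eye(n)  # positive definite, moderate conditioning
    b = rng.standard_normal(n)
    return A, b

train = [make_instance() for _ in range(n_train)]

def avg_final_loss(steps):
    """Mean suboptimality after running GD with the given schedule."""
    total = 0.0
    for A, b in train:
        x_star = np.linalg.solve(A, b)
        f_star = 0.5 * x_star @ A @ x_star - b @ x_star
        x = np.zeros(n)
        for t in range(T):
            x = x - steps[t] * (A @ x - b)  # gradient step with learned size
        total += (0.5 * x @ A @ x - b @ x) - f_star
    return total / n_train

# Crude coordinate search over the schedule (a stand-in for the
# paper's learning procedure); only accepts improving moves.
steps = np.full(T, 0.1)
for _ in range(50):
    for t in range(T):
        for delta in (0.05, -0.05):
            trial = steps.copy()
            trial[t] = max(trial[t] + delta, 0.0)
            if avg_final_loss(trial) < avg_final_loss(steps):
                steps = trial

print(steps)                  # learned schedule; some steps may grow "long"
print(avg_final_loss(steps))  # no worse than the constant 0.1 schedule
```

Because the search starts from the constant schedule and only accepts improvements, the learned schedule is guaranteed to do at least as well as the baseline on the training family; the interesting empirical finding in the post is that such schedules tend to include unusually long steps.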