Richard C. Suwandi
richardcsuwandi.bsky.social
PhD-ing at CUHK-Shenzhen. Building evolutionary coding agents at Dria. #AI4Science community leader at alphaXiv

richardcsuwandi.github.io
Pinned
Fresh out of the oven: CAKE is accepted at #NeurIPS2025! 🎉

TL;DR: We introduce Context-Aware Kernel Evolution (CAKE) 🍰, an adaptive kernel design method that leverages LLMs as genetic operators to dynamically evolve Gaussian process (GP) kernels during Bayesian optimization (BO)
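The idea can be sketched in a few lines. This is a minimal toy illustration, not the authors' implementation: kernels are represented as symbolic expressions, a stub stands in for the LLM genetic operator (a real system would prompt an LLM with the BO context and parse its proposed kernel), and a toy score stands in for the GP log marginal likelihood. All names here (`llm_propose`, `fitness`, `evolve_kernels`) are assumptions for illustration only.

```python
import random

BASE_KERNELS = ["RBF", "Matern", "Periodic", "Linear"]

def llm_propose(parents, context):
    """Stand-in for the LLM genetic operator: combines two parent kernels
    into a composite expression. A real implementation would condition an
    LLM on the optimization context (observed data, current best kernel)."""
    op = random.choice(["+", "*"])
    return f"({parents[0]} {op} {parents[1]})"

def fitness(kernel_expr, data):
    """Stand-in for GP model evidence (log marginal likelihood).
    Toy score: rewards composite structure, penalizes complexity."""
    return 20 * kernel_expr.count("(") - len(kernel_expr)

def evolve_kernels(data, generations=3, pop_size=4, seed=0):
    """Evolve a population of kernel expressions across BO iterations."""
    random.seed(seed)
    population = random.sample(BASE_KERNELS, pop_size)
    for _ in range(generations):
        # Select the two fittest kernels as parents.
        parents = sorted(population, key=lambda k: fitness(k, data),
                         reverse=True)[:2]
        # The "LLM" proposes a child; keep parents and inject diversity.
        child = llm_propose(parents, context=data)
        population = parents + [child, random.choice(BASE_KERNELS)]
    return max(population, key=lambda k: fitness(k, data))

best = evolve_kernels(data=None)
print(best)
```

The design choice mirrored here is that the genetic operators (crossover/mutation) are not hand-coded rules but delegated to a model that can see the optimization context, so kernel structure can adapt as BO progresses.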
September 27, 2025 at 2:30 PM
We’re training AI on everything that we know, but what about things that we don’t know?

At #ICML2025, the "Exploration in AI Today (EXAIT)" Workshop sparked a crucial conversation: as AI systems grow more powerful, they're relying less on genuine exploration and more on curated human data.
July 23, 2025 at 7:17 PM
Most AI systems today follow the same predictable pattern: they're built for specific tasks and optimized for objectives rather than exploration.

Meanwhile, humans are an open-ended species—driven by curiosity and constantly questioning the unknown.
June 27, 2025 at 4:15 PM
2 years ago, Ilya Sutskever made a bold prediction that large neural networks are learning world models through text 🌏

Recently, a new paper by Google DeepMind provided compelling insight into this idea.
June 11, 2025 at 5:31 PM
Reposted by Richard C. Suwandi
AI that can improve itself: A deep dive into self-improving AI and the Darwin-Gödel Machine.

richardcsuwandi.github.io/blog/2025/dgm/

Excellent blog post by Richard Suwandi reviewing the Darwin Gödel Machine (DGM) and future implications.
June 4, 2025 at 10:03 AM
Reposted by Richard C. Suwandi
Most AI systems today are stuck in a "cage" designed by humans.

They rely on fixed architectures crafted by engineers and lack the ability to evolve autonomously over time.
June 3, 2025 at 4:59 PM
Excited to share that our latest work on the grid spectral mixture product (GSMP) kernel has been featured in Prof. Sergios Theodoridis' latest book, "Machine Learning: From the Classics to Deep Networks, Transformers and Diffusion Models"! 🎉
May 19, 2025 at 1:49 PM
What an unforgettable experience at #ICLR2025! 🎉

Still soaking in everything I learned, the inspiring conversations I had, and the amazing connections I made over the past few days.
May 7, 2025 at 3:05 AM
Honored to attend a talk by Prof. Stephen Boyd this morning on "Embedded Convex Optimization for Control". I have been a huge fan of his work for years and his book, Convex Optimization, was the very first book I picked up on #optimization.
November 28, 2024 at 5:54 AM
Reposted by Richard C. Suwandi
I made a starter pack for machine learning researchers working on Bayesian optimization and Gaussian processes!

It's *very* sparse ATM since the migration is still in progress.

Please reply to make me aware of people - potentially yourself - who should be added to it!

go.bsky.app/QYMEQ52
November 21, 2024 at 8:00 PM