Richard C. Suwandi
@richardcsuwandi.bsky.social
PhD-ing at CUHK-Shenzhen. Building evolutionary coding agents at Dria. #AI4Science community leader at alphaXiv

richardcsuwandi.github.io
Our analysis also revealed that LLM-guided evolution consistently improves population fitness, significantly outperforming random recombination and traditional genetic algorithms.
September 27, 2025 at 2:30 PM
CAKE also excelled in the multi-objective setting:

- Achieved the highest overall score and hypervolume for photonic chip design
- Demonstrated a tenfold speedup in finding high-quality solutions
September 27, 2025 at 2:30 PM
On 60 HPOBench tasks, CAKE demonstrated superior performance:

- Consistently achieved the highest average test accuracy across all ML models
- Showed rapid early progress, achieving 67.5% of total improvement within 25% of the budget
September 27, 2025 at 2:30 PM
1️⃣ How well the kernel explains the observed data (as measured by model fit)
2️⃣ How promising the kernel’s proposed next query point is (as measured by acquisition value)
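The two criteria above can be blended into a single fitness score. A minimal sketch below, not CAKE's exact formulation: the RBF kernel stands in for an evolved kernel expression, the equal weighting `w=0.5` is illustrative, and model fit / acquisition value are instantiated as GP log marginal likelihood and expected improvement (minimization convention).

```python
import numpy as np
from math import erf, exp, pi, sqrt

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel; a stand-in for an evolved kernel expression
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * d**2 / lengthscale**2)

def gp_fit(x, y, kernel, noise=1e-4):
    # Cholesky-based GP fit: returns log marginal likelihood plus factors for prediction
    K = kernel(x, x) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    lml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
           - 0.5 * len(x) * np.log(2 * np.pi))
    return lml, L, alpha

def expected_improvement(x, y, x_star, kernel, L, alpha):
    # Predictive mean/std at a scalar candidate, then closed-form EI (minimization)
    k_s = kernel(x, np.array([x_star]))
    mu = float(k_s.T @ alpha)
    v = np.linalg.solve(L, k_s)
    prior_var = float(kernel(np.array([x_star]), np.array([x_star]))[0, 0])
    std = sqrt(max(prior_var - float((v**2).sum()), 1e-12))
    z = (y.min() - mu) / std
    return (y.min() - mu) * 0.5 * (1 + erf(z / sqrt(2))) + std * exp(-0.5 * z * z) / sqrt(2 * pi)

def fitness(x, y, x_star, kernel, w=0.5):
    # Criterion 1 (model fit) blended with criterion 2 (acquisition value)
    lml, L, alpha = gp_fit(x, y, kernel)
    return w * lml + (1 - w) * expected_improvement(x, y, x_star, kernel, L, alpha)

x = np.array([0.0, 0.5, 1.0])
y = np.sin(2 * x)
score = fitness(x, y, x_star=0.75, kernel=rbf_kernel)
```

A kernel that explains the data well (high marginal likelihood) but proposes an uninformative next query (low EI) scores lower than one that balances both.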
September 27, 2025 at 2:30 PM
CAKE works via an evolutionary process:

1️⃣ Initialize a population of base kernels
2️⃣ Score each kernel using a fitness function
3️⃣ Evolve kernels via LLM-driven crossover and mutation to generate new candidates
4️⃣ Select top-performing kernels for the next generation
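The four steps above can be sketched as a toy loop. Everything here is a stand-in: `llm_propose` is a hypothetical placeholder for the LLM genetic operator (the real method prompts an LLM with the parent kernels and optimization context), and `toy_fitness` is not CAKE's actual score (which combines model fit and acquisition value).

```python
import random

BASE_KERNELS = ["RBF", "PER", "LIN", "MAT32"]

def llm_propose(parent_a, parent_b, rng):
    # Placeholder for the LLM-driven operator: symbolic crossover of
    # sub-expressions plus occasional mutation swapping in a base kernel
    left = rng.choice(parent_a.split(" + "))
    right = rng.choice(parent_b.split(" + "))
    child = f"{left} + {right}"
    if rng.random() < 0.3:
        child = child.replace(left, rng.choice(BASE_KERNELS), 1)
    return child

def toy_fitness(kernel_expr, rng):
    # Placeholder score; rewards structural diversity plus noise
    return len(set(kernel_expr.split(" + "))) + rng.random()

def evolve(pop_size=8, generations=5, seed=0):
    rng = random.Random(seed)
    pop = [rng.choice(BASE_KERNELS) for _ in range(pop_size)]      # 1) initialize
    for _ in range(generations):
        scored = sorted(((toy_fitness(k, rng), k) for k in pop),
                        reverse=True)                              # 2) score
        parents = [k for _, k in scored[: pop_size // 2]]
        children = [llm_propose(rng.choice(parents), rng.choice(parents), rng)
                    for _ in range(pop_size)]                      # 3) crossover/mutation
        pop = [k for _, k in sorted(
            ((toy_fitness(k, rng), k) for k in parents + children),
            reverse=True)][:pop_size]                              # 4) selection
    return pop

final_population = evolve()
```

Kernels are represented as expression strings (e.g. "RBF + PER"), mirroring how composite kernels are built from base kernels via sum and product.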
September 27, 2025 at 2:30 PM
Fresh out of the oven: CAKE is accepted at #NeurIPS2025! 🎉

TL;DR: We introduce Context-Aware Kernel Evolution (CAKE) 🍰, an adaptive kernel design method that leverages LLMs as genetic operators to dynamically evolve Gaussian process (GP) kernels during Bayesian optimization (BO)
September 27, 2025 at 2:30 PM
We’re training AI on everything that we know, but what about things that we don’t know?

At #ICML2025, the "Exploration in AI Today (EXAIT)" Workshop sparked a crucial conversation: as AI systems grow more powerful, they're relying less on genuine exploration and more on curated human data.
July 23, 2025 at 7:17 PM
From inventing new musical genres to imagining life beyond our universe, we continuously push the boundaries of what’s possible.

What if AI could be as endlessly creative as humans or even nature itself?
June 27, 2025 at 4:15 PM
2 years ago, Ilya Sutskever made a bold prediction that large neural networks are learning world models through text 🌏

Recently, a new paper from Google DeepMind provided compelling insight into this idea.
June 11, 2025 at 5:31 PM
TL;DR: We present a novel Gaussian process (GP) kernel and a distributed learning framework to address key challenges in scaling GP regression for multi-dimensional and large-scale data.
May 19, 2025 at 1:49 PM
Excited to share that our latest work on the grid spectral mixture product (GSMP) kernel has been featured in Prof. Sergios Theodoridis' latest book "Machine Learning: From the Classics to Deep Networks, Transformers and Diffusion Models"! 🎉
May 19, 2025 at 1:49 PM
TL;DR: a query-efficient prompt-tuning method for black-box vision-language models through low-rank reparameterization and intrinsic dimensional clipping.
May 7, 2025 at 3:28 PM
TL;DR: a game-theoretical approach to combinatorial BO which enables efficient and tractable acquisition function optimization in large combinatorial domains.

@melisilaydabal.bsky.social @arkrause.bsky.social
May 7, 2025 at 3:28 PM
TL;DR: fine-tunes LLMs with a squared error loss, making them better suited for regression tasks and outperforming standard decoding and predictive-head approaches.
May 7, 2025 at 3:28 PM
TL;DR: combines the principles of evolution and diffusion to solve optimization problems, achieving competitive performance on quality-diversity metrics compared to traditional methods like CMA-ES and MAP-Elites.
May 7, 2025 at 3:28 PM
TL;DR: extends GPs to non-Euclidean spaces using residual layers and manifold-aware techniques, achieving better performance on regression and BO tasks than shallow GPs and baseline deep GPs.

@arkrause.bsky.social
May 7, 2025 at 3:28 PM
It was such a privilege to meet and exchange ideas with so many brilliant researchers, practitioners, and AI enthusiasts who are pushing the boundaries of machine learning every day ✨
May 7, 2025 at 3:05 AM
What an unforgettable experience at #ICLR2025! 🎉

Still soaking in everything I learned, the inspiring conversations I had, and the amazing connections I made over the past few days.
May 7, 2025 at 3:05 AM
Honored to attend a talk by Prof. Stephen Boyd this morning on "Embedded Convex Optimization for Control". I have been a huge fan of his work for years and his book, Convex Optimization, was the very first book I picked up on #optimization.
November 28, 2024 at 5:54 AM