Kimon Fountoulakis
@kfountou.bsky.social
Associate Professor at CS UWaterloo
Machine Learning
Lab: opallab.ca
We construct a large randomized family of shuffling machines. Each machine shuffles its input using only transpositions. By flipping a coin for each transposition to decide whether to apply or skip it, we obtain a randomized family with the desired properties.
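A minimal sketch of the construction as I read it: sample one machine by keeping or discarding each candidate transposition with an independent coin flip, then apply the kept ones in order. The candidate list and the fair-coin bias here are illustrative assumptions, not details from the paper.

```python
import random

def make_shuffling_machine(transpositions, seed=None):
    """Sample one machine from the randomized family: each candidate
    transposition is independently kept or skipped by a coin flip."""
    rng = random.Random(seed)
    kept = [t for t in transpositions if rng.random() < 0.5]

    def machine(x):
        out = list(x)
        for i, j in kept:            # apply the kept transpositions in order
            out[i], out[j] = out[j], out[i]
        return out

    return machine

# Hypothetical candidate transpositions on 4 positions.
candidates = [(0, 1), (1, 2), (2, 3), (0, 3)]
m = make_shuffling_machine(candidates, seed=0)
print(m(["a", "b", "c", "d"]))       # some permutation of the input
```

Once sampled, each machine is a fixed permutation, so the randomness lives entirely in which transpositions were kept.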
October 18, 2025 at 9:38 PM
On the Statistical Query Complexity of Learning Semiautomata: a Random Walk Approach

Link to the paper: arxiv.org/abs/2510.04115
October 18, 2025 at 9:38 PM
I wrote a blog post about it.

Link: medium.com/@kimon.fount...
May 27, 2025 at 6:36 PM
Learning to execute arithmetic exactly, with high probability, can be quite expensive. In the plot, 'ensemble complexity' refers to the number of independently trained models required to achieve exact learning with high probability. ℓ is the number of bits per number in the input.
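One way to read "ensemble complexity" as a back-of-the-envelope: if a single trained model learns the task exactly with probability p, then k independent models fail simultaneously with probability (1 - p)^k, so k ≈ log(1/δ)/log(1/(1-p)) models suffice for success probability 1 - δ. The numbers below are made up for illustration; the paper's actual bounds depend on ℓ and the architecture.

```python
import math

def ensemble_size(p_single, delta):
    """Smallest k with 1 - (1 - p_single)**k >= 1 - delta,
    i.e. at least one of k independent models learns exactly."""
    return math.ceil(math.log(delta) / math.log(1.0 - p_single))

# Illustrative numbers (not from the paper):
print(ensemble_size(0.1, 0.01))   # -> 44
print(ensemble_size(0.5, 0.01))   # -> 7
```

The point the plot makes is that this count can grow quickly with input size, so "expensive" here is about how many independent training runs you need, not just one model's size.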
May 26, 2025 at 3:21 AM
New paper: Learning to Add, Multiply, and Execute Algorithmic Instructions Exactly with Neural Networks
May 26, 2025 at 3:21 AM
The SIAM Conference on Optimization 2026 will be in Edinburgh! I don’t really work on optimization anymore (at least not directly), but it’s cool to see a major optimization conference taking place where I did my PhD.
May 15, 2025 at 1:32 PM
Got a pin this morning

Einstein problem: en.wikipedia.org/wiki/Einstei...
May 7, 2025 at 3:23 AM
Hey, I definitely predicted this correctly.
May 1, 2025 at 8:42 PM
NeurIPS 2026 in the Cyclades. Just saying.
April 29, 2025 at 6:12 PM
I enjoyed reading the paper "A Generalized Neural Tangent Kernel for Surrogate Gradient Learning" (Spotlight, NeurIPS 2024).

They extend the NTK framework to activation functions that have finitely many jumps.
April 27, 2025 at 12:03 AM
Shenghao's Ph.D. thesis, "Perspectives of Graph Diffusion: Computation, Local Partitioning, Statistical Recovery, and Applications", is now available.

Link: dspacemainprd01.lib.uwaterloo.ca/server/api/c...

Relevant papers:
1) Local Graph Clustering with Noisy Labels (ICLR 2024)
April 12, 2025 at 10:10 PM
Computational Capability and Efficiency of Neural Networks: A Repository of Papers

I compiled a list of theoretical papers related to the computational capabilities of Transformers, recurrent networks, feedforward networks, and graph neural networks.

Link: github.com/opallab/neur...
March 9, 2025 at 7:14 PM
Given the recent trend in RL, and the Turing Award recognizing work on RL, you might also be interested in reading "Intelligent Machinery" by A. M. Turing from 1948, where he planted the seeds of RL and discussed the ability of computers to learn from their environment.

Link: weightagnostic.github.io/papers/turin...
March 9, 2025 at 6:20 AM
Can neural networks learn to copy or permute an input exactly with high probability? We study this basic and fundamental question in "Exact Learning of Permutations for Nonzero Binary Inputs with Logarithmic Training Size and Quadratic Ensemble Complexity"

Link: arxiv.org/abs/2502.16763
February 25, 2025 at 2:42 AM
Cool paper: Theoretical limitations of multi-layer Transformer, spotted by Shenghao Yang.

link: arxiv.org/abs/2412.02975
February 12, 2025 at 5:11 PM
Out-of-distribution performance on mixed-input types (alphanumeric).
February 4, 2025 at 4:59 AM
Out-of-distribution performance.
February 4, 2025 at 4:59 AM
In-distribution performance.
February 4, 2025 at 4:59 AM
Positional Attention: Expressivity and Learnability of Algorithmic Computation (v2)

We study the effect of using only fixed positional encodings in the Transformer architecture for computational tasks. These positional encodings remain the same across layers.
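A minimal numpy sketch of the idea as I understand it: the attention pattern is computed from fixed positional encodings P alone (the same P at every layer), and the input enters only through the values. The exact parameterization in the paper may differ.

```python
import numpy as np

def positional_attention_layer(X, P, Wq, Wk, Wv):
    """One attention layer where queries and keys come only from the fixed
    positional encodings P (shared across layers); values come from X."""
    Q, K = P @ Wq, P @ Wk                       # pattern depends on P only
    scores = Q @ K.T / np.sqrt(K.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)           # row-wise softmax
    return A @ (X @ Wv)                         # input enters via values only

rng = np.random.default_rng(0)
n, d = 5, 8
X, P = rng.standard_normal((n, d)), rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = positional_attention_layer(X, P, Wq, Wk, Wv)
print(out.shape)   # (5, 8)
```

Because the attention matrix A is independent of X, each layer acts linearly on the input, which is what makes the expressivity and learnability analysis tractable for algorithmic tasks.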
February 4, 2025 at 4:59 AM
I miss those summer European conferences! I used to attend them as a PhD student, it was always fun!
January 11, 2025 at 11:42 PM
Shenghao Yang passed his PhD defence today. Shenghao is the second PhD student to graduate from our group. I am very happy for Shenghao and the work that he has done!
January 10, 2025 at 9:01 PM
Robert Wang is presenting his work at #NeurIPS2024 "Analysis of Corrected Graph Convolutions". Date/time: Fri 13 Dec, 11 a.m. to 2 p.m. PST.

link: openreview.net/pdf?id=MSsQD...
December 10, 2024 at 5:47 PM
Aseem's thesis, "Statistical Foundations for Learning on Graphs", is available on UWSpace.

link: uwspace.uwaterloo.ca/items/291d10...
December 7, 2024 at 4:22 PM
Results on popular real datasets, FSC-147 and TallyQA, and a difficult synthetic dataset (Emoji-Count).
December 3, 2024 at 4:03 AM