Reyhaneh Aghaei Saem
@reyhanehaghaeisaem.bsky.social
PhD student at EPFL
Hence, some proposals that were hoped to mitigate barren plateaus (BPs), such as forms of quantum natural gradient, CVaR, classical NN-assisted initialization, and rescaled gradient optimization, may still suffer from them. We provide numerical simulations of these optimization methods under different shot budgets.
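For concreteness, here is a minimal sketch of what a shot-budgeted CVaR loss estimate looks like (illustrative code of mine; the function name and the choice α = 0.1 are not from the paper):

```python
import numpy as np

def cvar_loss(energy_samples, alpha=0.1):
    """CVaR_alpha: mean of the lowest-energy alpha-fraction of the shots."""
    samples = np.sort(np.asarray(energy_samples))
    k = max(1, int(np.ceil(alpha * len(samples))))  # shots kept in the tail
    return samples[:k].mean()

# Illustrative usage with a 100-shot budget from a toy energy distribution.
rng = np.random.default_rng(0)
shots = rng.normal(loc=0.0, scale=1.0, size=100)
print(cvar_loss(shots, alpha=0.1))
```

The point of the paper is precisely that such estimators can still inherit exponential concentration, so this sketch shows the mechanics of the estimator, not a fix.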
August 1, 2025 at 10:16 AM
Finally, we present a practical step-by-step guideline for determining whether a training or encoding procedure is vulnerable to scalability limitations caused by exponential concentration.
August 1, 2025 at 10:14 AM
To illustrate its practicality, we apply it to the training of a standard VQA loss exhibiting a BP landscape with a traditional gradient-based method. We show that, with high probability, the optimization then follows a random walk in parameter space.
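A toy numerical picture of why (all magnitudes here are illustrative, not the paper's settings): when the true gradient is exponentially small in the qubit count, a fixed shot budget makes the gradient estimate noise-dominated, and the updates wander:

```python
import numpy as np

rng = np.random.default_rng(1)
n_qubits = 20
true_grad = 2.0 ** (-n_qubits)    # exponentially concentrated gradient signal
shots = 1000                      # per-step measurement budget
noise_std = 1.0 / np.sqrt(shots)  # shot-noise scale of the gradient estimator
lr = 0.1

theta = 0.0
for _ in range(100):
    grad_estimate = true_grad + noise_std * rng.standard_normal()
    theta -= lr * grad_estimate   # |true_grad| << noise_std: noise dominates

# Deterministic drift after 100 steps: ~100*lr*true_grad ≈ 1e-5, versus a
# random-walk spread of ~lr*noise_std*sqrt(100) ≈ 0.03: the walk dominates.
print(theta)
```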
August 1, 2025 at 10:13 AM
Here is a very nice schematic figure of this key result.
August 1, 2025 at 10:11 AM
Using this definition, together with tools from hypothesis testing, we pin down the practical consequences of the concentration established in Theorem 1.

A direct consequence is that no amount of classical post-processing can remove this indistinguishability.
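Schematically, the underlying hypothesis-testing fact (stated here in generic form, not as the paper's exact theorem): for two equally likely settings $\theta_1, \theta_2$ whose single-shot outcome distributions satisfy $\mathrm{TV}(p_{\theta_1}, p_{\theta_2}) \le \varepsilon$, any decision rule applied to $N$ shots succeeds with probability

$$\Pr[\text{success}] \le \frac{1}{2}\Big(1 + \mathrm{TV}\big(p_{\theta_1}^{\otimes N}, p_{\theta_2}^{\otimes N}\big)\Big) \le \frac{1}{2}\big(1 + N\varepsilon\big),$$

by subadditivity of the total variation distance over product distributions. If $\varepsilon$ is exponentially small in the qubit count, $N$ must be exponentially large to beat random guessing, no matter what classical post-processing is applied.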
August 1, 2025 at 10:10 AM
The scalability of such procedures depends on whether the measurement outcomes carry information about the underlying parameter-dependent quantities. This motivates a shift in focus: rather than analyzing concentration at the level of the loss function, we study it at the level of POVM outcome probabilities for the individual quantities.
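As background, a common definition of exponential concentration from the BP literature (paraphrased by me, not quoted from the paper): a quantity $X(\theta)$, such as a POVM outcome probability, is exponentially concentrated around a value $\mu$ if

$$\mathrm{Var}_{\theta}[X(\theta)] \in O(b^{-n}) \ \text{for some } b > 1, \quad \text{so that} \quad \Pr_{\theta}\big[\,|X(\theta) - \mu| \ge \delta\,\big] \le \frac{\mathrm{Var}_{\theta}[X(\theta)]}{\delta^{2}}$$

is exponentially small in the number of qubits $n$, by Chebyshev's inequality.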
August 1, 2025 at 10:08 AM
To formalize this, we consider a general procedure that covers a wide range of parameterized quantum models. In particular, many procedures used in variational quantum computing involve processing sets of parameter-dependent quantities.
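One illustrative way to picture such a procedure (hypothetical notation of mine, not the paper's): it processes quantities of the form

$$q_i(\theta) = f_i\big(p_i(\theta)\big), \qquad p_i(\theta) = \mathrm{Tr}\big[E_i\,\rho(\theta)\big],$$

where $\rho(\theta)$ is a parameterized state, $\{E_i\}$ are POVM elements, and the $f_i$ are classical post-processing maps; each $p_i(\theta)$ must be estimated from finitely many measurement shots.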
August 1, 2025 at 10:01 AM
New preprint on arXiv 🚀

Link: scirate.com/arxiv/2507.2...

We present a practical step-by-step guideline to determine whether a procedure that claims to circumvent exponential concentration actually works in practice.

See the following 🧵 for more details.
August 1, 2025 at 9:53 AM