Please send in your abstract, register, and join us. We look forward to seeing you there!
Whether your focus is quantum algorithms, quantum machine learning, error correction, or beyond, we welcome your contributions. Don't miss out on coming to Thailand to present your work, seek new collaborations, and get inspired, all while enjoying a relaxed, beachside scientific environment.
Zoë Holmes (EPFL)
Maria Schuld
Hakan Tureci (Princeton)
Zoltán Zimborás (University of Helsinki)
Francesco Tacchino (IBM Research - Zurich)
Mio Murao & Ryuji Takagi (The University of Tokyo)
Martin Larocca (LANL)
Kavan Modi (SUTD)
... and many more leading minds from the global quantum community!
🚨 URGENT REMINDER: The deadline for extended abstract submission for contributed talks (and abstract submission for posters) is this Friday, Feb 27! [BUT realistically Supanut will only check on Monday]
We are excited to feature an incredible lineup of invited speakers, including:
Last call for abstracts! Join us for Quantum Information by the beach in Thailand ☀️ 🏖️🇹🇭
The 2nd International Conference on Siam Quantum Science and Technology (SQST 2026), happening May 18–21, 2026, in beautiful Jomtien, Chonburi, Thailand!
Abstract Submission & Further Info: www.sqst2026.org
If you ever wonder during the night whether you have forgotten the effect of shot noise in your BP-free strategy analysis... maybe this could help. Also, congrats to @reyhanehaghaeisaem.bsky.social on her first work 🥳
Congrats Kasidit on his first arXiv preprint 🥳 Such a talented and hard-working master's student. He's surely going to do amazing things in the quantum world ⚛️
Thanks so much to my co-authors Weijie Xiong @qzoeholmes.bsky.social @aangrisani.bsky.social Yudai Suzuki @thipchotibut.bsky.social. It was real fun working with you all!
Also, special thanks to @mvscerezo.bsky.social and Martin Larocca for their valuable insights on correlated Haar random unitaries 😮
So yes, the big question for future QRP design: how do you pick your circuit depth or interaction time so that the model stays powerful without going fully random?
You want that "just right" level of chaos: enough to get expressive states, not so much that it all washes out.
Episode 4: A New Hope
Not everything is gloom and doom. We found that for moderate scrambling (like shallow random circuits or chaotic Ising with short evolution), you don't get lethal exponential concentration.
Episode 3: Noise erases memo...
We also studied QRP under local unital or non-unital noise. While there are works arguing that dissipation can be a resource for QRP, we prove that noise also forces your reservoir to forget states from the distant past exponentially quickly.
Episode 2: Oh what! I forgot now
We prove that in extreme-scrambling QRPs, old inputs or initial states get forgotten exponentially fast (in both time steps and system size!). Too much scrambling -> you effectively "MIB"-zap each past input.
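A toy numerical sketch of this forgetting effect (my own illustration under simplifying assumptions, not the paper's construction): two reservoir branches start with different inputs on one qubit, undergo the *same* Haar-random scrambling at each step, and have the input qubit reset (as if accepting the next input). The trace distance between the branches, i.e. how well the reservoir still remembers which input it saw, shrinks step by step. Helper names like `reset_input_qubit` are mine.

```python
import numpy as np

def haar_unitary(d, rng):
    # Haar-random unitary via QR decomposition of a complex Ginibre matrix
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix for true Haar measure

def reset_input_qubit(rho, d):
    # Trace out qubit 0 and re-prepare it in |0><0| (modeling a fresh input slot)
    reduced = np.einsum('iaib->ab', rho.reshape(2, d // 2, 2, d // 2))
    fresh = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
    return np.kron(fresh, reduced)

def trace_distance(a, b):
    return 0.5 * np.abs(np.linalg.eigvalsh(a - b)).sum()

n, d = 4, 2 ** 4
rng = np.random.default_rng(1)
rest = np.zeros(d // 2); rest[0] = 1.0
# Two initial inputs on qubit 0: |0> vs |+>
psi_a = np.kron(np.array([1.0, 0.0]), rest)
psi_b = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), rest)
rho_a = np.outer(psi_a, psi_a.conj())
rho_b = np.outer(psi_b, psi_b.conj())
dists = [trace_distance(rho_a, rho_b)]
for _ in range(5):
    U = haar_unitary(d, rng)  # identical scrambling step on both branches
    rho_a = reset_input_qubit(U @ rho_a @ U.conj().T, d)
    rho_b = reset_input_qubit(U @ rho_b @ U.conj().T, d)
    dists.append(trace_distance(rho_a, rho_b))
print([round(x, 3) for x in dists])
```

The only difference between the branches is the forgotten input; scrambling spreads that difference across the whole register, and each reset leaks part of it away, so the distinguishability decays with the number of steps.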
Hence our new results show that, while chaotic (extreme-scrambling) reservoirs are fine for processing information in small setups, as people have studied, they suffer from a scalability barrier at larger model sizes, doomed by their own chaotic dynamics.
Episode 1: Scalability barrier
Based on the unrolled form, we prove exponential concentration of the QRP output. In the large-scale setting, the trained QRP model becomes input-insensitive, leading to poor generalization despite trainability guarantees.
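A rough illustration of the concentration phenomenon (a toy sketch under my own assumptions, not the paper's unrolled tensor-diagram proof): even for a single scrambling step, the spread of a local expectation value over Haar-random reservoir states shrinks roughly as 2^{-n/2} with qubit number n, so outputs of different inputs bunch together.

```python
import numpy as np

def haar_unitary(d, rng):
    # Haar-random unitary via QR decomposition of a complex Ginibre matrix
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix for true Haar measure

def output_std(n_qubits, samples, rng):
    # Spread of <Z_0> over Haar-random "reservoir" states U|0...0>
    d = 2 ** n_qubits
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(d // 2))  # Pauli Z on first qubit
    vals = []
    for _ in range(samples):
        psi = haar_unitary(d, rng)[:, 0]  # scrambled state, first column = U|0...0>
        vals.append(np.real(np.vdot(psi, z0 @ psi)))
    return np.std(vals)

rng = np.random.default_rng(0)
for n in (2, 4, 6):
    print(n, round(output_std(n, 300, rng), 3))
```

The printed spread drops steadily with n, matching the known Haar variance of a local observable scaling as 1/(2^n + 1).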
To address this challenge, we apply tensor-diagram techniques to unroll the multi-step QRP into a single high-moment Haar integral over a larger-dimensional space, amenable to scalability and memory analysis.
Episode 0: Temporal correlation hinders standard analytical techniques.
While related techniques already establish scalability barriers for other quantum models, the QRP protocol is much more demanding: a fixed reservoir is repeatedly interleaved with a stream of time-series inputs.
Our key messages can be summarized as:
🎯 Big scrambling in quantum reservoirs helps at small sizes but kills input sensitivity at large scale
🎯 Memory of older states decays exponentially (in both time steps and system size!)
🎯 Noise can make us forget even faster
The QRP model processes input time series of quantum states. Here we model the extreme-scrambling reservoir as an instance drawn from a high-order unitary design ensemble.
Once upon a time, a myth in Quantum Reservoir Processing (QRP) went: "more chaos = richer feature map = better".
Doomed by its own chaotic dynamics, QRP may not scale in the extreme-scrambling limit.
Check out our new Star Wa… I mean paper on arXiv: scirate.com/arxiv/2505.1...