Clearly a must read for anyone even remotely interested in numerical integration (be it stochastic or deterministic).
Toni has been (hyper)active in studying these methods both theoretically and practically.
arxiv.org/abs/2602.16218
@ryzhang
PhD student in Computational Statistics and Machine Learning at STOR-i CDT, Lancaster University, UK. Research Interests: Bayesian Experimental Designs, Gaussian Processes, Sampling Algorithms. https://shusheng3927.github.io/
Congrats!
The UCL IMSS Annual Lecture will take place on the 27th April with a keynote from @lestermackey.bsky.social.
The theme is 'Computational Statistics and Machine Learning' and we'll have talks from Alessandro Barp, Paula Cordero Encinar & Po-Ling Loh.
imss2026.github.io
@statisticsucl.bsky.social
Very excited to announce the ProbAI Theory of Scaling Laws Workshop (warwick.ac.uk/fac/sci/stat...) at @warwickstats.bsky.social, 22-24 June! (1/4)
Accurate and thorough representation of prior and related work is one of the cornerstones of good research.
It is shocking to me that so many published NeurIPS papers, even from top institutions, have fabricated references.
I recommend reading the original report: gptzero.me/news/neurips/
Mathematical Colloquium (at King's College London): A duality in the foundations of probability and statistics through history by Vladimir Vovk
www.kcl.ac.uk/events/mathe...
How do large language models interpret words relating to probability like "unlikely", "probably", or "almost certain"?
The figure below shows what happens when we compare judgments from different models against a benchmark dataset of human judgments (data from: github.com/zonination/p...).
Usual MCMC algorithms are typically guaranteed to work well when used to sample from target distributions for which
i) mass is reasonably well-concentrated in the centre of the state space, and
ii) the log-density is smooth and of moderate growth.
Outside of this setting, things can go poorly.
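To make the two conditions concrete, here is a minimal random-walk Metropolis sketch (my own illustration, not from the post) targeting a standard normal, which satisfies both: mass concentrated near the origin, and a smooth quadratic log-density.

```python
import numpy as np

def rwm(logpi, x0, n_steps, step_size, rng):
    """Random-walk Metropolis: a canonical MCMC algorithm that behaves
    well under conditions (i) and (ii) above."""
    x = x0
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step_size * rng.standard_normal()
        # Accept with probability min(1, pi(prop) / pi(x)).
        if np.log(rng.random()) < logpi(prop) - logpi(x):
            x = prop
        chain[i] = x
    return chain

# Well-behaved target: standard normal, smooth log-density of moderate growth.
rng = np.random.default_rng(1)
chain = rwm(lambda x: -0.5 * x**2, x0=0.0, n_steps=50_000, step_size=2.4, rng=rng)
```

On a heavy-tailed target (mass not well-concentrated) or one whose log-density grows too fast, the same sampler can mix arbitrarily slowly, which is the failure mode the post alludes to.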
The recording of my talk on 'Multilevel neural simulation-based inference' at the 'One World Approximate Bayesian Inference' seminar series is now available on YouTube.
Link: www.youtube.com/watch?v=hBWd...
Preferential Sampling refers to scenarios where the observation locations are confounded with the field of interest that those same observations are used to infer. This recent arXiv preprint (arxiv.org/abs/2511.03158) looked at how harmful ignoring preferential sampling would be: not much, according to the paper.
I'll be giving a talk on a recently accepted NeurIPS paper at the next OWABI seminar on Thursday. The talk will cover simulation-based inference and how you can enhance accuracy when you have cheap approximate simulators at hand.
"The Principles of Diffusion Models" by Chieh-Hsin Lai, Yang Song, Dongjun Kim, Yuki Mitsufuji, Stefano Ermon. arxiv.org/abs/2510.21890
It might not be the easiest intro to diffusion models, but this monograph is an amazing deep dive into the math behind them and all the nuances
Let me advertise a bit our Online Monte Carlo seminar:
This coming Tuesday, we have Giorgos Vasdekis speaking on some very interesting recent work.
Moreover, we have confirmed our speaker line-up through until December - very exciting!
See sites.google.com/view/monte-c... for further details.
The first talk of the season will be this coming Tuesday (23 September), given by Alexandre Bouchard-Côté from UBC. Alex is a great speaker, so do join if you have the chance!
See sites.google.com/view/monte-c... for details, links, and so on.
Returning soon - stay tuned!
sites.google.com/view/monte-c...
Join us online for a discussion on
"Statistical exploration of the Manifold Hypothesis" and an opportunity to explore the intersection of geometry, statistics and machine learning.
Wed 08 Oct | 4–6pm UK
Register + download the paper: rss.org.uk/training-eve...
"Everyone knows" what an autoencoder is… but there's an important complementary picture missing from most introductory material.
In short: we emphasize how autoencoders are implemented, but not always what they represent (and some of the implications of that representation). 🧵
Gearing up for this workshop next week, with the finalised schedule attached!
For those unable to attend in person but interested in watching, the talks will be streamed live on MS Teams. Please do get in touch with me if you'd like to stay informed about the stream.
An announcement, which might be of some interest:
In the period 2022–2024, a number of other postdocs and I on the "CoSInES" and "Bayes4Health" EPSRC grants organised a series of internal tutorial workshops on topics relevant to researchers in computational statistics.
Very cool!
New paper on arXiv! And I think it's a good'un.
Meet the new Lattice Random Walk (LRW) discretisation for SDEs. It's radically different from traditional methods like Euler-Maruyama (EM) in that each iteration can only move in discrete steps {-δₙ, 0, δₙ}.
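A sketch of the idea under my own assumptions: I match the first two moments of the Euler-Maruyama increment with a three-point increment in {-δ, 0, +δ}. This is an illustrative guess at the mechanism the post describes, not the paper's actual LRW construction; the function name and the moment-matching rule are mine.

```python
import numpy as np

def lattice_step(x, b, sigma, dt, delta, rng):
    """One update of a hypothetical lattice random walk for the SDE
    dX_t = b(X_t) dt + sigma(X_t) dW_t: the state moves by -delta, 0,
    or +delta, with probabilities chosen so that the increment's mean
    and variance match those of the Euler-Maruyama increment."""
    m = b(x) * dt                   # EM increment mean
    v = sigma(x) ** 2 * dt          # EM increment variance
    s = (v + m ** 2) / delta ** 2   # P(up) + P(down): matches second moment
    d = m / delta                   # P(up) - P(down): matches first moment
    p_up, p_down = (s + d) / 2, (s - d) / 2
    # Valid probabilities require delta large enough relative to sqrt(dt).
    assert 0.0 <= p_down and 0.0 <= p_up and p_up + p_down <= 1.0
    u = rng.random()
    if u < p_up:
        return x + delta
    if u < p_up + p_down:
        return x - delta
    return x
```

With an Ornstein-Uhlenbeck-style drift b(x) = -x and sigma = 1, repeated calls reproduce the EM increment moments while only ever visiting a lattice around the starting point, which is the qualitative contrast with EM's continuous Gaussian increments.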
Just finished delivering a course on 'Robust and scalable simulation-based inference (SBI)' at Greek Stochastics. This covered an introduction to SBI, open challenges, and some recent contributions from my own group.
The slides are now available here: fxbriol.github.io/pdfs/slides-....
Please share: We invite submissions to the 29th International Conference on Artificial Intelligence and Statistics (#AISTATS 2026) and welcome paper submissions at the intersection of AI, machine learning, statistics, and related areas. [1/3]
Liwen Xue, Axel Finke, Adam M. Johansen: Online Rolling Controlled Sequential Monte Carlo https://arxiv.org/abs/2508.00696 https://arxiv.org/pdf/2508.00696 https://arxiv.org/html/2508.00696
Really enjoyed listening to this interview with Mike Giles. Only knew him from his multilevel Monte Carlo work, and it was quite a nice surprise to learn about his contributions to CFD and experiences with industrial collaborations!
we're out here simulating, visualising, thriving
Congrats!!!
We've written a monograph on Gaussian processes and reproducing kernel methods (with @philipphennig.bsky.social, @sejdino.bsky.social and Bharath Sriperumbudur).
arxiv.org/abs/2506.17366
Line chart titled βWeekly Runs of RStudio IDEβ showing usage data from 2023 to 2025. The y-axis ranges from 2,000,000 to 6,000,000 weekly runs. The chart displays a cyclical pattern with regular peaks around 5,000,000-6,000,000 runs and dramatic drops to approximately 2,000,000 runs that occur periodically during holiday periods.
Is #rstats dead? I don't think so.